CN115509405A - Control method and device of operation icon, electronic equipment and storage medium - Google Patents

Control method and device of operation icon, electronic equipment and storage medium

Info

Publication number
CN115509405A
Authority
CN
China
Prior art keywords
movement parameter, movement, parameter, determining, operation icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211203891.1A
Other languages
Chinese (zh)
Inventor
李念龙
张晓平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN202211203891.1A
Publication of CN115509405A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Abstract

The application provides a control method and apparatus for an operation icon, an electronic device, and a computer-readable storage medium. The method includes: receiving operation data, where the operation data includes a first movement parameter of the operation icon on a first plane, the first plane being a plane in a virtual space and the virtual space being a stereoscopic space that a viewer perceives through an image; determining, according to the first movement parameter, a second movement parameter of the operation icon in a depth direction, the depth direction being perpendicular to the first plane; and controlling the operation icon to move in the virtual space according to the second movement parameter. The application can thereby improve the control precision of the stereoscopic space.

Description

Control method and device of operation icon, electronic equipment and storage medium
Technical Field
The present disclosure relates to computer technologies, and in particular, to a method and an apparatus for controlling an operation icon, an electronic device, and a storage medium.
Background
Stereoscopic display can present a scene intuitively and improves user immersion, but it also raises the challenge of input precision during interaction. Existing input methods for stereoscopic space, such as mid-air gestures, offer quite limited precision and, lacking the support of a physical plane, easily cause fatigue after prolonged use.
Disclosure of Invention
The embodiment of the application provides a control method and device for an operation icon, an electronic device and a computer-readable storage medium, which can improve the control precision of a three-dimensional space.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a control method of an operation icon, which comprises the following steps: receiving operation data, wherein the operation data comprises a first movement parameter of the operation icon on a first plane, the first plane is a plane in a virtual space, and the virtual space is a three-dimensional space which is perceived by a viewer through an image; determining a second movement parameter of the operation icon in a depth direction according to the first movement parameter, wherein the depth direction is a direction perpendicular to the first plane; and controlling the operation icon to move in the virtual space according to the second movement parameter.
In the foregoing solution, the determining, according to the first movement parameter, a second movement parameter of the operation icon in the depth direction includes: determining a second moving speed of the operation icon in the depth direction according to the first moving speed, and determining a second moving direction of the operation icon in the depth direction; and determining a second movement parameter of the operation icon in the depth direction according to the second movement speed and the second movement direction.
In the foregoing solution, the determining a second movement rate of the operation icon in the depth direction according to the first movement rate includes: obtaining a magnitude relationship between the first movement rate and the second movement rate; and determining the second movement rate of the operation icon in the depth direction based on the first movement rate and the magnitude relationship.
In the foregoing solution, the determining a second moving direction of the operation icon in the depth direction includes: determining a target interaction object which satisfies a first relation with the operation icon in the virtual space; and determining a second moving direction of the operation icon in the depth direction according to the depth information of the target interactive object.
In the foregoing solution, the determining a second moving direction of the operation icon in the depth direction includes: receiving a direction control parameter for a depth direction; and determining a second moving direction of the operation icon in the depth direction according to the direction control parameter.
In the above solution, the controlling the operation icon to move in the virtual space according to the second movement parameter includes: determining three-dimensional movement parameters of the operation icon in the virtual space according to the first movement parameter and the second movement parameter; and controlling the operation icon to move in the virtual space according to the three-dimensional movement parameters.
In some embodiments, the determining a three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter includes: taking the first movement parameter as a first plane component of the three-dimensional movement parameter on the first plane, and taking the second movement parameter as a first depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the first plane component and the first depth component.
In the above solution, the determining a three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter includes: re-determining the first movement parameter of the operation icon on the first plane according to the first movement parameter and the second movement parameter; taking the re-determined first movement parameter as a second planar component of the three-dimensional movement parameter in the first plane, and taking the second movement parameter as a second depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the second plane component and the second depth component.
In the foregoing solution, the re-determining the first movement parameter of the operation icon on the first plane according to the first movement parameter and the second movement parameter includes: obtaining a first movement rate corresponding to the first movement parameter and a second movement rate corresponding to the second movement parameter; calculating the difference between the first movement rate and the second movement rate, and taking the difference as a new first movement rate; and obtaining a first movement direction corresponding to the first movement parameter, and re-determining the first movement parameter of the operation icon on the first plane based on the new first movement rate and the first movement direction.
An embodiment of the present application provides a control device for an operation icon, including:
a receiving module, configured to receive operation data, where the operation data includes a first moving parameter of the operation icon on a first plane, where the first plane is a plane in a virtual space, and the virtual space is a stereoscopic space perceived by a viewer through an image;
the determining module is used for determining a second movement parameter of the operation icon in a depth direction according to the first movement parameter, wherein the depth direction is a direction perpendicular to the first plane;
and the control module is used for controlling the operation icon to move in the virtual space according to the second movement parameter.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the control method of the operation icon provided by the embodiment of the application when the executable instruction stored in the memory is executed.
The embodiment of the application provides a computer-readable storage medium storing executable instructions that, when executed by a processor, implement the control method for the operation icon provided by the embodiment of the application.
According to the method and the device, an operation icon conventionally used for planar display control is used to control a stereoscopic space: the second movement parameter of the operation icon in the depth direction is determined based on the first movement parameter of the icon's movement on the plane, and the icon is then controlled to move according to the determined second movement parameter. The operation icon can thus adapt well to the stereoscopic space, and the control precision of the stereoscopic space is improved.
Drawings
Fig. 1 is an alternative schematic structural diagram of an electronic device 100 provided in an embodiment of the present application;
fig. 2 is an alternative flow chart of a control method for operating icons according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating an alternative detailed flow of step 202 provided by an embodiment of the present application;
FIG. 4 is an alternative diagram of an operation icon moving process provided by an embodiment of the application;
fig. 5 is a schematic diagram of an alternative detailed flow of step 203 provided in this embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first/second/third" are used only to distinguish similar objects and do not denote a particular order or importance; where permissible, "first/second/third" may be interchanged in a particular order or sequence so that the embodiments of the present application described herein can be practiced in an order other than that shown or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
When an operation icon designed for planar display control is used to control a stereoscopic space, its displayed movement still follows a two-dimensional pixel-plane velocity. When the user perceives the display as a stereoscopic picture, this produces a discontinuous sense of movement, which interferes with user input and results in poor control precision.
Based on this, embodiments of the present application provide a method and an apparatus for controlling an operation icon, an electronic device, and a computer-readable storage medium, which can improve control accuracy of a stereoscopic space.
First, an electronic device for implementing the control method for the operation icon provided in the embodiment of the present application will be described. Referring to fig. 1, fig. 1 is an optional schematic structural diagram of an electronic device 100 provided in the embodiment of the present application; in practical applications, the electronic device 100 may be implemented as a terminal or a server. The terminal may be, but is not limited to, a laptop, a tablet computer, a desktop computer, a smartphone, a dedicated messaging device, a portable game device, a smart speaker, a smart watch, and the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, Content Delivery Network (CDN) services, big data, and artificial intelligence platforms. The electronic device 100 shown in fig. 1 includes: at least one processor 101, a memory 105, at least one network interface 102, and a user interface 103. The various components in the electronic device 100 are coupled together by a bus system 104. It is understood that the bus system 104 is used to enable communications among these components; in addition to a data bus, it includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as bus system 104 in fig. 1.
The processor 101 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 103 includes one or more output devices 1031 that enable presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 103 also includes one or more input devices 1032 including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display screen, camera, other input buttons, and controls.
The memory 105 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 105 may optionally include one or more storage devices physically located remote from processor 101.
Memory 105 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 105 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, the memory 105 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or a subset or superset thereof. In the embodiment of the present application, the memory 105 stores an operating system 1051, a network communication module 1052, a presentation module 1053, an input processing module 1054, and a control device 1055 for the operation icon; specifically:
an operating system 1051, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 1052 for reaching other computing devices via one or more (wired or wireless) network interfaces 102, exemplary network interfaces 102 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like;
a presentation module 1053 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 1031 (e.g., display screen, speakers, etc.) associated with user interface 103;
an input processing module 1054 for detecting one or more user inputs or interactions from one of the one or more input devices 1032 and translating the detected inputs or interactions.
In some embodiments, the control device for the operation icon provided in the embodiments of the present application may be implemented in software, and fig. 1 illustrates the control device 1055 for the operation icon stored in the memory 105, which may be software in the form of programs and plug-ins, and includes the following software modules: a receiving module 10551, a determining module 10552 and a control module 10553, which are logical and thus can be arbitrarily combined or further split depending on the functions implemented. The functions of the respective modules will be explained below.
In other embodiments, the control device for the operation icon provided in this embodiment may be implemented in hardware. As an example, it may be a processor in the form of a hardware decoding processor programmed to execute the control method for the operation icon provided in this embodiment; for example, the processor in the form of a hardware decoding processor may be one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
The control method of the operation icon provided by the embodiment of the present application will be described below in conjunction with an exemplary application and implementation of the terminal provided by the embodiment of the present application.
Referring to fig. 2, fig. 2 is an alternative flow chart of a control method for operating icons according to an embodiment of the present application, which will be described with reference to the steps shown in fig. 2.
Step 201, receiving operation data, where the operation data includes a first moving parameter of the operation icon on a first plane, where the first plane is a plane in a virtual space, and the virtual space is a stereoscopic space perceived by a viewer through an image;
step 202, determining a second movement parameter of the operation icon in a depth direction according to the first movement parameter, wherein the depth direction is a direction perpendicular to the first plane;
step 203, controlling the operation icon to move in the virtual space according to the second movement parameter.
In the embodiment of the application, the virtual space is presented through the display screen. The display screen related to the embodiment of the application can be a flat display screen, can also be a non-flat special-shaped display screen, such as a curved display screen, and can also be a group of display screens formed by a plurality of flat display screens. In the embodiment of the present application, the first plane may be any plane in the virtual space, which can be viewed by the viewer as a part or all of the image. For example, if the display screen is a flat display screen, the first plane may be a plane parallel to a display interface of the flat display screen.
In practical implementation, after receiving the operation data, obtaining a first movement parameter of the operation icon on the first plane, and determining a second movement parameter of the operation icon in the depth direction according to the first movement parameter. And then controlling the operation icon to move in the virtual space according to the second movement parameter. Specifically, the operation icon may be controlled by a mouse, a keyboard, a touch screen, a gesture sensing device, an eye tracking device, and the like.
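The receive / derive-depth / move flow described above can be sketched minimally in Python. The names (`PlaneMove`, `control_icon`), the unit-vector encoding of the first movement direction, and the fixed preset ratio of 0.5 are illustrative assumptions, not details fixed by the application:

```python
from dataclasses import dataclass

@dataclass
class PlaneMove:
    """Operation data: the first movement parameter on the first plane."""
    rate: float       # first movement rate
    direction: tuple  # first movement direction, a unit vector (x, y)

def second_movement_rate(plane_rate: float, preset_ratio: float = 0.5) -> float:
    # One realisation of the magnitude relationship: the second (depth)
    # rate is a preset ratio of the first (in-plane) rate.
    return plane_rate * preset_ratio

def control_icon(op_data: PlaneMove, depth_sign: int = 1) -> tuple:
    """Steps 201-203: receive operation data, determine the depth-direction
    movement parameter, and return the icon's 3D velocity (vx, vy, vz)."""
    vz = depth_sign * second_movement_rate(op_data.rate)  # step 202
    vx = op_data.rate * op_data.direction[0]              # step 203:
    vy = op_data.rate * op_data.direction[1]              # compose 3D motion
    return (vx, vy, vz)
```

For example, `control_icon(PlaneMove(2.0, (1.0, 0.0)))` yields an icon velocity of `(2.0, 0.0, 1.0)` under these assumptions.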
In some embodiments, the first movement parameter includes a first movement rate and a first movement direction. Referring to fig. 3, fig. 3 is an optional detailed flowchart of step 202 provided in this embodiment of the present application, and step 202 may also be implemented by the following steps:
Step 2021, determining a second movement rate of the operation icon in the depth direction according to the first movement rate, and determining a second movement direction of the operation icon in the depth direction;
Step 2022, determining the second movement parameter of the operation icon in the depth direction according to the second movement rate and the second movement direction.
In practical implementation, the second movement parameter includes a second movement rate and a second movement direction in the depth direction. Specifically, according to the first movement rate of the first movement parameter, a second movement rate of the operation icon in the depth direction is determined, and a second movement direction of the operation icon in the depth direction is determined.
In some embodiments, the determining the second movement rate of the operation icon in the depth direction according to the first movement rate may be implemented as follows: obtaining a magnitude relationship between the first movement rate and the second movement rate; and determining the second movement rate of the operation icon in the depth direction based on the first movement rate and the magnitude relationship.
Here, the magnitude relationship between the first movement rate and the second movement rate may be predetermined. Specifically, it may be set manually in advance, or it may be determined from the three-dimensional coordinate system of the virtual space. For example, consider a reference point at a first position in the display interface: if the viewer perceives the image of the display interface as a planar image, the coordinates of the reference point in the two-dimensional coordinate system of that planar image are recorded as first coordinates; if the viewer perceives the image as a stereoscopic image, the coordinates of the reference point in the three-dimensional coordinate system of that stereoscopic image are recorded as second coordinates. The magnitude relationship between the first movement rate and the second movement rate is then determined from the relationship between the first coordinates and the second coordinates.
In practical implementation, the magnitude relationship between the first movement rate and the second movement rate is as follows: of the first component and the second component, the one with the larger value is taken as the larger component, and the second movement rate is no greater than this larger component. Denote the three coordinate axes of the coordinate system in the stereoscopic space as the x-axis, y-axis, and z-axis, where the plane formed by the x-axis and y-axis corresponds to the first plane and the z-axis corresponds to the depth direction. Denote the first movement rate as P_xy, which includes a first component P_x on the x-axis and a second component P_y on the y-axis, and denote the second movement rate as P_z. The magnitude relationship then includes: P_z <= max(P_x, P_y). The terminal may determine the second movement rate within the range satisfying this relationship. In some embodiments, the magnitude relationship further includes a determined numerical relationship, which may include, for example: the ratio of the second movement rate to the first movement rate is a preset ratio. Here, the preset ratio may be determined from the relationship between the three-dimensional coordinate system and the two-dimensional coordinate system described above.
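The constraint P_z <= max(P_x, P_y), combined with a preset ratio, can be expressed compactly. The function name and the ratio value 0.8 below are assumptions for illustration, not values given in the application:

```python
def depth_rate(p_x: float, p_y: float, preset_ratio: float = 0.8) -> float:
    """Pick a second movement rate P_z satisfying P_z <= max(P_x, P_y),
    here via an assumed preset ratio applied to the larger component."""
    larger = max(abs(p_x), abs(p_y))
    p_z = preset_ratio * larger
    assert p_z <= larger  # the magnitude relationship from the text
    return p_z
```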
In some embodiments, the determining the second moving direction of the operation icon in the depth direction may be implemented by: determining a target interaction object which satisfies a first relation with the operation icon in the virtual space; and determining a second moving direction of the operation icon in the depth direction according to the depth information of the target interactive object.
In actual implementation, the first relationship is that the operation icon passes through the target interaction object in the moving process. Specifically, when the operation icon passes through the target interaction object, the position of the operation icon staying on the target interaction object is used as the position of the operation icon, the depth change information of the operation icon is determined according to the depth information of the target interaction object and the position change information of the operation icon on the target interaction object, and the second moving direction of the operation icon in the depth direction is determined according to the depth change information.
Exemplarily, referring to fig. 4, fig. 4 is an optional schematic diagram of the operation icon moving process provided by an embodiment of the present application. The operation icon 401 passes through the target interaction object 402 during movement; the position change information of the operation icon 401 on the target interaction object 402 during this movement is obtained, and the second moving direction of the operation icon 401 is determined according to the depth information of the target interaction object 402 and this position change information. Here, the depth change information of the operation icon 401 indicates that its depth changes from small to large, so the second moving direction is the depth-increasing direction.
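One possible realization of this inference, assumed here for illustration, samples the icon's depth while it stays on the target interaction object and takes the sign of the overall change:

```python
def second_moving_direction(depth_samples: list) -> int:
    """Return +1 for the depth-increasing direction, -1 for depth-decreasing,
    and 0 when the sampled depth of the icon on the target object is flat."""
    delta = depth_samples[-1] - depth_samples[0]
    return (delta > 0) - (delta < 0)
```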
In some embodiments, determining the second moving direction of the operation icon in the depth direction may be further implemented by: receiving a direction control parameter for a depth direction; and determining a second moving direction of the operation icon in the depth direction according to the direction control parameter.
In practical implementation, the user can also control the depth direction, the terminal receives a direction control parameter for the depth direction, and determines a second moving direction of the operation icon in the depth direction according to the direction control parameter. Illustratively, the user may perform direction control in the depth direction by sliding a wheel on a mouse, and the terminal acquires a direction control parameter in the depth direction by receiving a control signal for the wheel.
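A sketch of this wheel-based control follows; the sign convention (positive delta means depth-increasing) and the typical ±120-per-notch delta are common mouse-wheel conventions assumed here, not fixed by the patent text:

```python
def direction_from_wheel(wheel_delta: int) -> int:
    """Map a scroll-wheel control signal to the second moving direction:
    positive delta -> depth-increasing (+1), negative -> decreasing (-1)."""
    return (wheel_delta > 0) - (wheel_delta < 0)
```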
In some embodiments, referring to fig. 5, fig. 5 is an optional detailed flowchart of step 203 provided in this embodiment, and step 203 may also be implemented by:
step 2031, determining a three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter;
step 2032, controlling the operation icon to move in the virtual space according to the three-dimensional movement parameter.
During actual implementation, the terminal determines the three-dimensional movement parameters of the operation icon in the virtual space according to the first movement parameter and the second movement parameter, and controls the operation icon to move in the virtual space according to the three-dimensional movement parameters. Illustratively, the first movement parameter includes a first movement speed and the second movement parameter includes a second movement speed. Denote the three coordinate axes of the coordinate system in the stereoscopic space as the x-axis, y-axis, and z-axis, where the plane formed by the x-axis and y-axis corresponds to the first plane and the z-axis corresponds to the depth direction. Denote the first movement speed as V_xy and the second movement speed as V_z; the three-dimensional movement speed is then V_xyz = sqrt(V_xy^2 + V_z^2).
In some embodiments, step 2031 may also be implemented as follows: taking the first movement parameter as a first plane component of the three-dimensional movement parameter on the first plane, and taking the second movement parameter as a first depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the first plane component and the first depth component.
Illustratively, the first movement speed V_xy is taken as the first plane component of the three-dimensional movement speed V_xyz on the first plane, and the second movement speed V_z is taken as the first depth component of V_xyz in the depth direction; the three-dimensional movement speed is then calculated by the formula V_xyz = sqrt(V_xy^2 + V_z^2).
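The composition formula transcribes directly to code; `math.hypot` computes exactly sqrt(V_xy^2 + V_z^2) (the function name is an illustrative choice):

```python
import math

def three_dimensional_speed(v_xy: float, v_z: float) -> float:
    """V_xyz = sqrt(V_xy^2 + V_z^2): compose the in-plane component and
    the depth component into the icon's three-dimensional movement speed."""
    return math.hypot(v_xy, v_z)
```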
In some embodiments, step 2031 may also be implemented as follows: re-determining the first movement parameter of the operation icon on the first plane according to the first movement parameter and the second movement parameter; taking the re-determined first movement parameter as a second planar component of the three-dimensional movement parameter in the first plane, and taking the second movement parameter as a second depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the second plane component and the second depth component.
The re-determining of the first movement parameter of the operation icon on the first plane according to the first movement parameter and the second movement parameter may be implemented as follows: obtaining a first movement rate corresponding to the first movement parameter and a second movement rate corresponding to the second movement parameter; calculating the difference between the first movement rate and the second movement rate, and taking the difference as the new first movement rate; and obtaining the first movement direction corresponding to the first movement parameter, and re-determining the first movement parameter of the operation icon on the first plane based on the new first movement rate and the first movement direction.
In the embodiment of the present application, when converting the planar first movement parameter into the stereoscopic three-dimensional movement parameter, the portion of the first movement parameter consumed by the depth movement is also taken into account, so that the transition from the plane to the stereoscopic space is smoother. Illustratively, denote the original first movement rate as P_xy, the re-determined first movement rate as P'_xy, and the second movement rate in the depth direction as P_z. The difference between P_xy and P_z is taken as the new first movement rate; that is, the second movement rate is treated as the loss of the first movement rate, so that P_xy - P'_xy = P_z. The movement direction of the re-determined first movement rate is then set to the movement direction of the original first movement rate, and the first movement parameter re-determined from P'_xy and this direction is taken as the re-determined first movement parameter.
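The second variant of step 2031 can be sketched as follows: the depth rate is deducted from the original plane rate, the original plane direction is kept, and the result is combined with the depth component. All names are illustrative assumptions; clamping the new rate at zero is an added safeguard not stated in the patent.

```python
import math

def redetermine_plane_component(v_plane, p_z):
    """Second variant of step 2031: treat the depth rate P_z as the rate
    'lost' to the depth movement, keep the original plane direction, and
    combine the re-determined plane component with the depth component."""
    vx, vy = v_plane
    p_xy = math.hypot(vx, vy)            # original plane rate P_xy
    new_rate = max(p_xy - p_z, 0.0)      # P'_xy such that P_xy - P'_xy = P_z
    if p_xy > 0.0:
        ux, uy = vx / p_xy, vy / p_xy    # original first movement direction
    else:
        ux, uy = 0.0, 0.0
    # second plane component and second depth component of the 3D parameter
    return (ux * new_rate, uy * new_rate, p_z)
```

For example, a plane velocity of (3, 4) (rate 5) with a depth rate of 2 is re-determined to the plane rate 3 along the same direction, giving (1.8, 2.4, 2.0).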
In the embodiments of the present application, the second movement parameter in the depth direction is determined according to the first movement parameter of the operation icon on the first plane, and the movement of the operation icon is controlled according to the second movement parameter, so that an operation icon designed for precise operation in planar display scenarios can be adapted to a stereoscopic space, improving the operation precision in the stereoscopic space.
Continuing with the exemplary structure in which the control device 1055 of the operation icon provided in the embodiment of the present application is implemented as software modules, in some embodiments, as shown in fig. 1, the software modules stored in the control device 1055 of the operation icon in the memory 105 may include:
a receiving module 10551, configured to receive operation data, where the operation data includes a first movement parameter of the operation icon on a first plane, the first plane is a plane in a virtual space, and the virtual space is a stereoscopic space perceived by a viewer through an image;
a determining module 10552, configured to determine, according to the first movement parameter, a second movement parameter of the operation icon in a depth direction, where the depth direction is a direction perpendicular to the first plane;
a control module 10553, configured to control the operation icon to move in the virtual space according to the second movement parameter.
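The receive → determine → control flow of the three modules above can be sketched end to end. The three callables stand in for the concrete strategies described in this document; all names are illustrative assumptions.

```python
def control_operation_icon(first_param, determine_second, combine, move):
    """Sketch of the flow implemented by modules 10551-10553: receive the
    plane movement parameter, derive the depth-direction parameter, combine
    both into a 3D movement parameter, and move the icon accordingly."""
    second_param = determine_second(first_param)      # module 10552
    movement_3d = combine(first_param, second_param)  # 3D movement parameter
    move(movement_3d)                                 # module 10553
    return movement_3d
```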
In some embodiments, the first movement parameter includes a first movement rate and a first movement direction, and the determining module 10552 is further configured to determine a second movement rate of the operation icon in the depth direction according to the first movement rate, and to determine a second movement direction of the operation icon in the depth direction; and to determine the second movement parameter of the operation icon in the depth direction according to the second movement rate and the second movement direction.
In some embodiments, the determining module 10552 is further configured to obtain a magnitude relationship between the first movement rate and the second movement rate; and to determine the second movement rate of the operation icon in the depth direction based on the first movement rate and the magnitude relationship.
In some embodiments, the determining module 10552 is further configured to determine a target interaction object in the virtual space that satisfies a first relationship with the operation icon; and to determine a second movement direction of the operation icon in the depth direction according to depth information of the target interaction object.
In some embodiments, the determining module 10552 is further configured to receive a direction control parameter for a depth direction; and determining a second moving direction of the operation icon in the depth direction according to the direction control parameter.
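The two ways of picking the second movement direction described above (from a direction control parameter, or from a target interaction object's depth) can be sketched as follows. The +1/-1 sign convention and all parameter names are illustrative assumptions, not from the patent.

```python
def second_movement_direction(icon_depth, target_depth=None, control=None):
    """Sketch of the depth-direction choice: an explicit direction control
    parameter takes precedence; otherwise the icon moves toward the depth
    of the target interaction object."""
    if control is not None:
        # direction control parameter received for the depth direction
        return 1 if control > 0 else -1
    # move toward the target interaction object's depth
    return 1 if target_depth >= icon_depth else -1
```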
In some embodiments, the control module 10553 is further configured to determine a three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter, and to control the operation icon to move in the virtual space according to the three-dimensional movement parameter. In some embodiments, the determining of the three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter includes: taking the first movement parameter as a first plane component of the three-dimensional movement parameter on the first plane, and taking the second movement parameter as a first depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the first plane component and the first depth component.
In some embodiments, the control module 10553 is further configured to determine the first movement parameter of the operation icon on the first plane again according to the first movement parameter and the second movement parameter; taking the re-determined first movement parameter as a second planar component of the three-dimensional movement parameter in the first plane, and taking the second movement parameter as a second depth component of the three-dimensional movement parameter in the depth direction; and determining the three-dimensional movement parameter according to the second plane component and the second depth component.
In some embodiments, the control module 10553 is further configured to obtain a first movement rate corresponding to the first movement parameter and a second movement rate corresponding to the second movement parameter; to calculate the difference between the first movement rate and the second movement rate and take the difference as the new first movement rate; and to obtain the first movement direction corresponding to the first movement parameter and re-determine the first movement parameter of the operation icon on the first plane based on the new first movement rate and the first movement direction.
It should be noted that the description of the apparatus in the embodiment of the present application is similar to the description of the method embodiment, and has similar beneficial effects to the method embodiment, and therefore, the description is not repeated.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the control method of the operation icon described above in the embodiment of the present application.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to execute the control method of the operation icon provided by the embodiments of the present application.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of a program, software module, script, or code written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts stored in a hypertext markup language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In conclusion, the control precision of the three-dimensional space can be improved through the embodiment of the application.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (10)

1. A control method of an operation icon is characterized by comprising the following steps:
receiving operation data, wherein the operation data comprises a first movement parameter of the operation icon on a first plane, the first plane is a plane in a virtual space, and the virtual space is a three-dimensional space which is perceived by a viewer through an image;
determining a second movement parameter of the operation icon in a depth direction according to the first movement parameter, wherein the depth direction is a direction perpendicular to the first plane;
and controlling the operation icon to move in the virtual space according to the second movement parameter.
2. The method of claim 1, wherein the first movement parameter comprises a first movement rate and a first movement direction, and the determining a second movement parameter of the operation icon in a depth direction from the first movement parameter comprises:
determining a second movement rate of the operation icon in the depth direction according to the first movement rate, and determining a second movement direction of the operation icon in the depth direction;
and determining the second movement parameter of the operation icon in the depth direction according to the second movement rate and the second movement direction.
3. The method of claim 2, wherein the determining a second movement rate of the operation icon in the depth direction from the first movement rate comprises:
obtaining a magnitude relationship between the first movement rate and the second movement rate;
determining the second movement rate of the operation icon in the depth direction based on the first movement rate and the magnitude relationship.
4. The method of claim 2, wherein the determining a second moving direction of the operation icon in a depth direction comprises:
determining a target interaction object which meets a first relation with the operation icon in the virtual space;
and determining a second moving direction of the operation icon in the depth direction according to the depth information of the target interactive object.
5. The method of claim 2, wherein the determining a second moving direction of the operation icon in a depth direction comprises:
receiving a direction control parameter for a depth direction;
and determining a second moving direction of the operation icon in the depth direction according to the direction control parameter.
6. The method according to any one of claims 1-5, wherein the controlling the operation icon to move in the virtual space according to the second movement parameter comprises:
determining a three-dimensional movement parameter of the operation icon in the virtual space according to the first movement parameter and the second movement parameter;
and controlling the operation icon to move in the virtual space according to the three-dimensional movement parameters.
7. The method of claim 6, wherein the determining a three-dimensional movement parameter of the operation icon within the virtual space according to the first movement parameter and the second movement parameter comprises:
taking the first movement parameter as a first plane component of the three-dimensional movement parameter on the first plane, and taking the second movement parameter as a first depth component of the three-dimensional movement parameter in the depth direction;
and determining the three-dimensional movement parameter according to the first plane component and the first depth component.
8. The method of claim 6, wherein the determining a three-dimensional movement parameter of the operation icon within the virtual space according to the first movement parameter and the second movement parameter comprises:
re-determining the first movement parameter of the operation icon on the first plane according to the first movement parameter and the second movement parameter;
taking the re-determined first movement parameter as a second planar component of the three-dimensional movement parameter in the first plane, and taking the second movement parameter as a second depth component of the three-dimensional movement parameter in the depth direction;
and determining the three-dimensional movement parameter according to the second plane component and the second depth component.
9. The method of claim 8, wherein the re-determining a first movement parameter of the operation icon on a first plane according to the first movement parameter and the second movement parameter comprises:
obtaining a first movement rate corresponding to the first movement parameter and a second movement rate corresponding to the second movement parameter;
calculating a difference between the first movement rate and the second movement rate, and taking the difference as a new first movement rate;
and obtaining a first movement direction corresponding to the first movement parameter, and re-determining the first movement parameter of the operation icon on the first plane based on the new first movement rate and the first movement direction.
10. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the control method of an operation icon according to any one of claims 1 to 9 when executing the executable instructions stored in the memory.
CN202211203891.1A 2022-09-29 2022-09-29 Control method and device of operation icon, electronic equipment and storage medium Pending CN115509405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211203891.1A CN115509405A (en) 2022-09-29 2022-09-29 Control method and device of operation icon, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115509405A true CN115509405A (en) 2022-12-23

Family

ID=84507184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211203891.1A Pending CN115509405A (en) 2022-09-29 2022-09-29 Control method and device of operation icon, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115509405A (en)

Similar Documents

Publication Publication Date Title
US10691284B2 (en) Canvas layout algorithm
US10379819B2 (en) Generic editor layout using intrinsic persistence metadata
EP4002107A1 (en) Data binding method, apparatus, and device of mini program, and storage medium
US10298587B2 (en) Peer-to-peer augmented reality handlers
EP3690604A1 (en) Display control method and system, and virtual reality device
US11954464B2 (en) Mini program production method and apparatus, terminal, and storage medium
CN114648615B (en) Method, device and equipment for controlling interactive reproduction of target object and storage medium
CN109471580B (en) Visual 3D courseware editor and courseware editing method
US20170185422A1 (en) Method and system for generating and controlling composite user interface control
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN115509405A (en) Control method and device of operation icon, electronic equipment and storage medium
CN115658063A (en) Page information generation method, device, equipment and storage medium
CN110888787A (en) Data monitoring method, device and system
CN112789830A (en) A robotic platform for multi-mode channel-agnostic rendering of channel responses
US20240111398A1 (en) Data processing method and device, electronic device and computer-readable storage medium
CN112230906B (en) Method, device and equipment for creating list control and readable storage medium
US11908088B2 (en) Controlling virtual resources from within an augmented reality environment
CN112492381B (en) Information display method and device and electronic equipment
CN117351177A (en) Virtual object display method, device and storage medium in three-dimensional scene
CN114416654A (en) File display method and device, electronic equipment and storage medium
CN117075770A (en) Interaction control method and device based on augmented reality, electronic equipment and storage medium
CN116808589A (en) Motion control method and device, readable medium and electronic equipment
CN117148966A (en) Control method, control device, head-mounted display device and medium
CN116612261A (en) Information processing method, device, terminal and storage medium
CN115688702A (en) Information processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination