CN112486321B - Three-dimensional model operation control method and device and terminal equipment - Google Patents


Info

Publication number
CN112486321B
Authority
CN
China
Prior art keywords
target
gear
operation data
target control
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011380120.0A
Other languages
Chinese (zh)
Other versions
CN112486321A (en)
Inventor
张二阳
李志帅
敖亚磊
侯晓龙
郑旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhengzhou J&T Hi Tech Co Ltd
Original Assignee
Zhengzhou J&T Hi Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhengzhou J&T Hi Tech Co Ltd filed Critical Zhengzhou J&T Hi Tech Co Ltd
Priority to CN202011380120.0A priority Critical patent/CN112486321B/en
Publication of CN112486321A publication Critical patent/CN112486321A/en
Application granted granted Critical
Publication of CN112486321B publication Critical patent/CN112486321B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • G06F3/03544Mice or pucks having dual sensing arrangement, e.g. two balls or two coils used to track rotation of the pointing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a three-dimensional model operation control method and apparatus and a terminal device, wherein the method comprises the following steps: when a user operates a target control using a mouse of the terminal device, displaying an operator on an interface of the terminal device, wherein the operator comprises an operating rod and a mouse indicating ball; acquiring operation data generated when the user operates the target control with the mouse; performing first mapping processing on the operation data based on a first mapping model to obtain a target position at which the operation data is mapped on the operating rod, and displaying the mouse indicating ball at that target position; and performing second mapping processing on the operation data based on a second mapping model to obtain a target gear at which the operation data is mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear. The method can avoid the problem of no response or multiple responses, and can simulate the damping feel and detent feel of a real scene.

Description

Three-dimensional model operation control method and device and terminal equipment
Technical Field
The application relates to the technical field of computer three-dimensional simulation, in particular to a three-dimensional model operation control method and device and terminal equipment.
Background
With the growing application of three-dimensional simulation on the personal computer (PC), users' requirements for the three-dimensional interactive experience on the PC side keep rising, and this experience has become a key factor in evaluating the quality of PC-side three-dimensional simulation.
At present, three-dimensional interaction on the PC side is mainly realized by converting the user's keyboard and mouse operations and then presenting the operation effect on a three-dimensional control of the interface. However, owing to the influence of the viewing-angle position during a specific operation, an operation may receive no response or multiple responses, and because of the hardware limitations of the keyboard and mouse, the damping feel and detent feel of the operation cannot be simulated.
Disclosure of Invention
An object of the present application is to provide a three-dimensional model operation control method and apparatus and a terminal device, so as to solve the prior-art problems that an operation receives no response or multiple responses, and that the damping feel and detent feel cannot be simulated during operation.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a three-dimensional model operation control method, where the method is applied to a terminal device, a three-dimensional model to be operated is displayed on an interface of the terminal device, and the three-dimensional model includes at least one target control with an adjustable gear to be operated, and the method includes:
when a user uses a mouse of the terminal device to operate a target control, displaying an operator on an interface of the terminal device, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
Obtaining operation data when the user uses the mouse to operate the target control, wherein the operation data comprises: the drag direction and the drag distance.
And performing first mapping processing on the operation data based on a first mapping model to obtain a target position at which the operation data is mapped on the operating rod, and displaying the mouse indicating ball at the target position.
And performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the performing, based on the first mapping model, the first mapping process on the operation data to obtain a target position where the operation data is mapped on the operation rod, and displaying the mouse pointer ball at the target position includes:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an optional implementation manner, the first mapping model is configured to divide the operating rod into equal lengths according to the number of gears to obtain a plurality of candidate positions equal in number to the gears, and to select one candidate position from the plurality of candidate positions as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the performing, on the basis of the second mapping model, the second mapping process on the operation data to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear includes:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and use a previous gear or a next gear of a current gear on the target control as the target gear when determining to perform gear shifting.
As an optional implementation manner, the operator further includes: a selection box and a prompt box, wherein the prompt box is used for displaying information of the selected target control.
The displaying the operator on the interface of the terminal equipment comprises the following steps:
and determining the display position of the selected middle frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator according to the rotation angle, displaying the selection box at its display position at the rotated angle, and displaying the operating rod, the mouse indicating ball and the prompt box according to the preset positional relationship between the selection box and the operating rod, the mouse indicating ball and the prompt box.
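The rotation-angle determination in the step above can be sketched as follows. This is an illustrative reading of the step, not the patent's actual implementation: the atan2-based formula, the degree convention, and all names are assumptions.

```python
import math

def operator_rotation(ctrl_start, ctrl_end):
    """Angle (in degrees) by which to rotate the operator so that the
    operating rod aligns with the line through the target control's two ends."""
    dx = ctrl_end[0] - ctrl_start[0]
    dy = ctrl_end[1] - ctrl_start[1]
    return math.degrees(math.atan2(dy, dx))

# A control lying along the screen diagonal rotates the operator by 45 degrees
print(operator_rotation((0, 0), (10, 10)))  # 45.0
```

Once this angle is known, the selection box, operating rod, mouse indicating ball and prompt box can all be placed by applying the same rotation to their preset offsets from the selection box.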
As an optional implementation manner, after the displaying the operator on the interface of the terminal device, the method further includes:
and hiding a system mouse icon displayed on the interface of the terminal equipment.
In a second aspect, an embodiment of the present application provides a three-dimensional model operation control apparatus, where the apparatus is applied to a terminal device, a three-dimensional model to be operated is displayed on an interface of the terminal device, and the three-dimensional model includes at least one target control with an adjustable gear to be operated, and the apparatus includes:
the display module is used for displaying an operator on an interface of the terminal equipment when a user operates the target control by using a mouse of the terminal equipment, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
An obtaining module, configured to obtain operation data when the user operates the target control using the mouse, where the operation data includes: the drag direction and the drag distance.
And the first mapping module is used for performing first mapping processing on the operation data based on a first mapping model to obtain a target position at which the operation data is mapped on the operating rod, and displaying the mouse indicating ball at the target position.
And the second mapping module is used for performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the first mapping module is specifically configured to:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an optional implementation manner, the first mapping model is configured to divide the operating rod into equal lengths according to the number of gears to obtain a plurality of candidate positions equal in number to the gears, and to select one candidate position from the plurality of candidate positions as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the second mapping module is specifically configured to:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and use a previous gear or a next gear of a current gear of the target control as the target gear when determining to perform gear shifting.
As an optional implementation manner, the operator further includes: a selection box and a prompt box, wherein the prompt box is used for displaying information of the selected target control.
The display module is specifically configured to:
and determining the display position of the selected middle frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator according to the rotation angle, displaying the selection box at its display position at the rotated angle, and displaying the operating rod, the mouse indicating ball and the prompt box according to the preset positional relationship between the selection box and the operating rod, the mouse indicating ball and the prompt box.
As an optional implementation manner, the apparatus further includes:
and the hiding module is used for hiding the system mouse icon displayed on the interface of the terminal equipment.
In a third aspect, an embodiment of the present application provides a terminal device, including: the system comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when a terminal device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to execute the steps of the three-dimensional model operation control method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program is executed by a processor to perform the steps of the three-dimensional model operation control method according to the first aspect.
The beneficial effect of this application is:
according to the three-dimensional model operation control method, the three-dimensional model operation control device and the terminal device, when a user operates a target control of a three-dimensional model by using a mouse, the terminal device displays an operator on an interface, the terminal device further obtains a target position of the user operation mapped on an operating rod of the operator by using the first mapping model, obtains a target gear of the user operation mapped on the target control by using the second mapping model, and presents the mapped effect on the operator and the target control. Because the terminal equipment uses the two mapping models to respectively map the user operation to the operator and the target control, more accurate mapping can be carried out according to the characteristics of the operator and the target control, the accuracy of the obtained target position is higher, the problem of no response or multiple responses can be avoided, and the damping sense and the stuck sense in a real scene can be simulated.
In addition, the system mouse icon is hidden, so that interference on the user can be avoided, and the user experience is improved.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic flowchart of a three-dimensional model operation control method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of a three-dimensional model and an operator interface;
fig. 3 is a schematic flowchart of a mapping process performed by a terminal device based on a first mapping model;
fig. 4 is a schematic flowchart of the terminal device performing mapping processing based on the second mapping model;
FIG. 5 is a schematic flow chart of a display operator of the terminal device;
fig. 6 is a block diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application;
fig. 7 is another block configuration diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device 80 according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are only for illustration and description purposes and are not used to limit the protection scope of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. In addition, one skilled in the art, under the guidance of the present disclosure, may add one or more other operations to the flowchart, or may remove one or more operations from the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
At present, the operation effect is presented on a three-dimensional control of the interface by converting keyboard and mouse operations at the PC end. However, the following two problems may occur:
Firstly, three-dimensional interaction on the PC side is mainly of the clicking type and the dragging type, and because of the influence of the viewing-angle position during specific use, an operation may frequently receive no response or repeated responses. In high-risk simulation fields, or in scenarios where an operation must be completed within a short time, such problems may lead to serious consequences.
Secondly, three-dimensional interaction on the PC side is generally completed only through a mouse and a keyboard; owing to their hardware limitations, the current interaction process can only show a click or drag effect and cannot simulate the damping feel and detent feel of a real scene.
Based on the above problems, the embodiments of the present application provide a three-dimensional model operation control method which, after the user operates the mouse, uses two mapping models to obtain the target positions mapped on an operator and on a three-dimensional control, and correspondingly moves the icon to those target positions, so that the obtained target positions are more accurate, the problem of no response or multiple responses can be avoided, and the damping feel and detent feel of a real scene can be simulated.
The method of the embodiments of the present application can be applied to a three-dimensional simulation scene in which a three-dimensional model is displayed on the display screen of a terminal device, the three-dimensional model comprises at least one operation control, and a user can manipulate the operation control with a mouse; while the user operates, the terminal device maps the user operation, based on the method of the embodiments of the present application, to target positions on the operator and on the operation control, and moves the icon to those target positions accordingly.
Fig. 1 is a schematic flowchart of a three-dimensional model operation control method provided in an embodiment of the present application, where an execution subject of the method is a terminal device including a mouse, for example, a PC. As shown in fig. 1, the method includes:
s101, when a user uses a mouse of the terminal device to operate a target control, displaying an operator on an interface of the terminal device, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball.
Before executing the step, a three-dimensional model to be operated is displayed on an interface of the terminal device, for example, the three-dimensional model may be a three-dimensional model of a train console, and the like.
The three-dimensional model comprises at least one target control to be operated and capable of adjusting gears, and a user can adjust the gears of the target controls by dragging a mouse.
It should be noted that the operation controls in the three-dimensional model may be of various types, such as knob type and button type. Whatever its type, as long as a control is adjustable, it may serve as a gear-adjustable target control as described herein. For example, a press-type control that is opened on the first press, closed on the second press, and so on, may be treated as a gear-adjustable target control according to the present application and displayed according to the method of the present application.
When the user operates the target control by using the mouse, the terminal equipment can display the operator on the interface. In the embodiment of the application, the operator is an interface element displayed by the terminal device, is independent of the three-dimensional model, and is used for timely positioning a certain target control when a user operates the target control and timely presenting the effect of the user on operating the target control. Therefore, in the application, after the user operates the target control, the operation effect can be presented on the operator and the three-dimensional model of the target control at the same time.
Fig. 2 is an exemplary diagram of a three-dimensional model and an interface of an operator, as shown in fig. 2, the three-dimensional model displayed by the terminal device includes a target control a with adjustable gears, and when a user operates the target control a through a mouse, the terminal device may display the operator in a semi-transparent manner above the target control a.
Alternatively, the operator displayed by the terminal device may include an operating rod and a mouse indicating ball. The operating rod may represent the range over which the target control can be operated, and the position of the mouse indicating ball on the operating rod represents the effect of the user's operation on the target control. Illustratively, when the user has not dragged the mouse on the target control, the mouse indicating ball is located at the starting end of the operating rod; after the user drags the mouse, the mouse indicating ball is displayed at the corresponding position on the operating rod according to the distance dragged by the user.
S102, obtaining operation data when a user operates a target control by using a mouse, wherein the operation data comprises: the drag direction and the drag distance.
Optionally, the terminal device may determine whether the user starts to drag the mouse by monitoring a click event, a movement event, and a release event of the mouse, and calculate a drag direction and a drag distance when the user drags the mouse based on a position coordinate of a mouse pointer on a display screen of the terminal device in a process that the user drags the mouse.
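The event-based monitoring just described might compute the dragging direction and distance from the press coordinate and the current pointer coordinate roughly as follows. The function name and the unit-vector convention are illustrative assumptions, not taken from the patent:

```python
import math

def drag_data(press_pos, current_pos):
    """Return (unit direction vector, distance) for a drag from the
    mouse-press coordinate to the current pointer coordinate."""
    dx = current_pos[0] - press_pos[0]
    dy = current_pos[1] - press_pos[1]
    distance = math.hypot(dx, dy)
    if distance == 0:
        # No movement yet: no meaningful direction
        return (0.0, 0.0), 0.0
    return (dx / distance, dy / distance), distance
```

For instance, a drag from (0, 0) to (3, 4) yields direction (0.6, 0.8) and distance 5, which then serve as the operation data fed to the two mapping models.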
And S103, performing first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operating rod, and displaying the mouse indicating ball at the target position.
The first mapping model is used for mapping the operation of the user to the operator so as to present the effect of the user after the operation on the operator.
By using the first mapping model, the terminal device can calculate the target position at which the mouse indicating ball should be located on the operating rod after the user moves the mouse by the dragging distance in the dragging direction for the target control, and the terminal device then moves the mouse indicating ball to the target position, that is, displays the mouse indicating ball at the target position.
Alternatively, in the embodiment of the present application, the position of an element may be represented by its coordinates on the display screen. Accordingly, the target position of the mouse indicating ball on the operating rod may refer to coordinates on the display screen that fall within the range of the operating rod.
And S104, performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
The second mapping model is used for mapping the user operation to the target control so as to present the effect of the user after the user operation on the target control.
By using the second mapping model, the terminal device can calculate whether the gear of the target control needs to be switched to other gears except the current gear after the user moves the mouse by the dragging distance according to the dragging direction for the target control, and if so, the gear of the target control is switched to other corresponding gears.
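As a minimal sketch of the decision just described — assuming, purely for illustration, that a shift fires once the drag distance covers the spacing between adjacent gears, and that gears are numbered from 0 — the second mapping might look like:

```python
def second_mapping(current_gear, gear_count, drag_distance, gear_spacing, forward):
    """Return the target gear: the next or previous gear when the drag is long
    enough to warrant a shift, otherwise the current gear. The result is
    clamped to the control's valid gear range [0, gear_count - 1]."""
    if drag_distance < gear_spacing:   # threshold is an assumed criterion
        return current_gear            # too short a drag: no gear change
    step = 1 if forward else -1
    return min(max(current_gear + step, 0), gear_count - 1)
```

With a 20-pixel spacing, a 30-pixel forward drag shifts gear 2 to gear 3, while a 10-pixel drag leaves the control at gear 2 — which is how a distance threshold of this kind can produce the detent-like, one-gear-at-a-time behavior described above.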
It should be noted that, in a specific implementation, steps S103 and S104 may be executed in either order.
In this embodiment, when the user operates the target control of the three-dimensional model using the mouse, the terminal device displays the operator on the interface; the terminal device further obtains, using the first mapping model, the target position at which the user operation is mapped on the operating rod of the operator, obtains, using the second mapping model, the target gear at which the user operation is mapped on the target control, and presents the mapped effect on the operator and the target control. Because the terminal device uses the two mapping models to map the user operation to the operator and the target control respectively, a more accurate mapping can be performed according to the characteristics of each, the obtained target position is more accurate, the problem of no response or multiple responses can be avoided, and the damping feel and detent feel of a real scene can be simulated.
The following describes the process of the terminal device performing the mapping process based on the first mapping model and the second mapping model, respectively.
Fig. 3 is a schematic flowchart of a terminal device performing mapping processing based on a first mapping model, and as shown in fig. 3, an optional manner of the step S103 includes:
and S301, acquiring the gear number of the target control and the distance between adjacent gears.
Optionally, the number of the gears of the target control is an inherent parameter of the three-dimensional model, and the number of the gears can be obtained by reading parameter information of the three-dimensional model. The distance between adjacent gears can be obtained by calculating the distance between the coordinates of the center points of the adjacent gears.
S302, inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
Optionally, the number of gears, the distance between adjacent gears, and the operation data are used as input parameters of a first mapping model, and the first mapping model performs a first mapping process based on the input parameters, so that a target position of the operation data mapped on the operation lever can be obtained.
Optionally, the first mapping model may be a machine learning model obtained by training in advance using a training sample, or may also be a mapping model composed of several calculation formulas.
Regardless of the form of the first mapping model, the first mapping model may perform the mapping as follows.
As an alternative embodiment, the first mapping model may be configured to divide the length of the operating rod equally according to the number of gears to obtain a number of candidate positions equal to the number of gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
In one example, assuming that the number of shift positions is 2, after the length of the operating lever is averaged, two candidate positions are obtained, one candidate position is a start position of the operating lever, and the other candidate position is an end position of the operating lever.
In another example, assuming that the number of shift positions is 3, after the length of the operating lever is averaged, three candidate positions are obtained, where a first candidate position is a start position of the operating lever, a second candidate position is a position of a half length of the operating lever, and a third candidate position is an end position of the operating lever.
After the above processing, each gear of the target control corresponds to one candidate position on the operating rod. After the user drags the mouse, the mouse indication ball can only be located at one of these candidate positions.
Optionally, the terminal device may determine, based on the dragging distance and the distance between adjacent gears, whether the distance dragged by the user covers the distance to another gear. If so, the terminal device determines, in combination with the dragging direction, the candidate position on the operating rod corresponding to the gear before or after the current one, and moves the mouse indication ball to that candidate position. If not, the terminal device keeps the position of the mouse indication ball unchanged.
Exemplarily, assuming that the number of gears is 3, the terminal device determines, based on the dragging distance and the adjacent-gear distance, that the user's drag covers the distance to the gear adjacent to the current gear. The terminal device then determines from the dragging direction that the user dragged the mouse toward the next gear after the current one, takes the candidate position on the operating rod corresponding to that next gear as the target position, and moves the mouse indication ball to the target position.
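The division-and-snapping behavior described above can be sketched in code. The patent gives no formulas, so the following is only an illustrative sketch under assumed conventions: the operating rod is a straight 2D segment on the screen, `candidate_positions` divides it equally so that each gear receives one candidate point, and `map_to_target_position` moves the mouse indication ball to the adjacent candidate only once the drag covers the adjacent-gear distance. All function and parameter names are hypothetical.

```python
def candidate_positions(rod_start, rod_end, num_gears):
    """Divide the rod into equal segments; each gear gets one candidate point.

    rod_start / rod_end are (x, y) screen coordinates; num_gears >= 2.
    The first candidate is the rod's start point, the last its end point.
    """
    return [
        (
            rod_start[0] + (rod_end[0] - rod_start[0]) * i / (num_gears - 1),
            rod_start[1] + (rod_end[1] - rod_start[1]) * i / (num_gears - 1),
        )
        for i in range(num_gears)
    ]


def map_to_target_position(current_index, drag_distance, drag_direction,
                           gear_spacing, candidates):
    """Return the index of the candidate the indication ball should snap to.

    drag_direction is +1 (toward the next gear) or -1 (toward the previous
    one). The ball only moves once the drag covers the adjacent-gear
    distance; otherwise it stays at its current candidate.
    """
    if drag_distance < gear_spacing:
        return current_index          # keep the ball where it is
    new_index = current_index + drag_direction
    return max(0, min(len(candidates) - 1, new_index))
```

For a 3-gear control on a horizontal rod from (0, 0) to (100, 0), the candidates are the start point, the midpoint, and the end point, matching the two examples in the text.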
Fig. 4 is a schematic flowchart of a process of performing mapping processing by a terminal device based on a second mapping model, and as shown in fig. 4, an optional manner of the step S104 includes:
and S401, acquiring the distance between adjacent gears of the target control.
The manner of obtaining the distance between adjacent gears of the target control in this step is the same as that in step S301, and reference may be made to the description of step S301, which is not described herein again.
S402, inputting the distance between the adjacent gears and the operation data into a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
Optionally, the distance between adjacent gears and the operation data are used as input parameters of the second mapping model, and the second mapping model performs the second mapping processing based on these input parameters, so that the target gear of the operation data mapped on the target control can be obtained.
Optionally, the second mapping model may be a machine learning model obtained by training in advance using training samples, or may also be a mapping model composed of several calculation formulas.
Regardless of which form the second mapping model takes, the second mapping model may accomplish the mapping as follows.
As an optional implementation manner, the second mapping model may be configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and when determining to perform gear shifting, use a previous gear or a next gear of a current gear of the target control as the target gear.
Alternatively, the terminal device may perform linear or nonlinear processing based on the operation data and the distance between adjacent gears. For example, when the dragging distance is greater than or equal to half of the distance between adjacent gears, the terminal device may determine to perform gear switching, and determine, based on the dragging direction, whether the target gear is the previous or the next gear of the current gear.
It is worth noting that the user may drag the mouse continuously over a long distance, so that the total dragging distance exceeds the distance between adjacent gears of the target control. In this situation, the terminal device may monitor the dragging distance in real time; whenever the accumulated distance is sufficient to switch to the previous or next gear, it completes the gear switching and display in the manner described above, then takes the gear after switching as the new current gear and continues the judgment and processing in the same manner, and so on. The specific execution process is not repeated here.
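The continuous-drag behavior above can be sketched as a loop. This is only an illustrative sketch: the half-spacing threshold comes from the example in the text, the boundary handling (stopping at the first or last gear) is an assumption, and the function and parameter names are hypothetical.

```python
def shift_gears(current_gear, total_drag, direction, gear_spacing,
                num_gears, threshold_ratio=0.5):
    """Return the gear reached after a (possibly long) continuous drag.

    A switch fires each time the remaining drag distance covers
    threshold_ratio * gear_spacing (half the spacing, per the example in
    the text); the new gear then becomes the reference for the next check.
    direction is +1 or -1; gears are indexed 0 .. num_gears - 1.
    """
    gear = current_gear
    remaining = total_drag
    while remaining >= threshold_ratio * gear_spacing:
        nxt = gear + direction
        if not 0 <= nxt < num_gears:
            break                      # already at the first/last gear
        gear = nxt
        remaining -= gear_spacing      # consume one gear's worth of drag
    return gear
```

With a spacing of 100 pixels, a 160-pixel drag from gear 0 therefore passes through gear 1 and ends at gear 2, while a 40-pixel drag does not switch at all.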
In this embodiment, the second mapping model determines whether to perform gear switching based on the operation data and the distance between adjacent gears, and does not switch gears when the operation data does not satisfy the specified condition, thereby better simulating the damping and detent feel of a real operation.
The following describes a processing procedure when the terminal device displays the above-described operator on the interface.
As mentioned above, the operator may include the operating rod and the mouse indication ball. As an alternative, the operator may further include a selection box and a prompt box. An example interface with the selection box and the prompt box can be found in the aforementioned FIG. 2. The selection box may be displayed in a semi-transparent manner over the target control to indicate that the target control is currently being operated by the user. The prompt box is used to display information about the selected target control, which may include, for example, the name, state value, and operation mode of the target control.
Fig. 5 is a schematic flowchart of a process for displaying an operator on a terminal device, and as shown in fig. 5, an alternative way for displaying the operator on the interface of the terminal device in step S101 includes:
and S501, determining the display position of the selected middle frame according to the position of the target control on the interface.
Optionally, the position of the target control on the interface may refer to a two-dimensional coordinate of a center point of the target control on the interface. The terminal device may first obtain a three-dimensional position coordinate of the center point of the target control in the three-dimensional model, and map the three-dimensional position coordinate to the two-dimensional interface, so as to obtain a two-dimensional coordinate of the center point of the target control on the interface.
After the two-dimensional coordinates of the center point of the target control on the interface are obtained, they may be used as the display position of the selection box; specifically, the two-dimensional coordinates give the display position of the center point of the selection box.
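The mapping from the control's 3D center point to 2D interface coordinates can be illustrated with a minimal pinhole projection. This is only a sketch under assumed conventions (camera at the origin looking down +z, y up); a real implementation would use the engine's full view-projection matrix, and the function and parameter names here are hypothetical.

```python
def world_to_screen(point3d, focal_length, screen_w, screen_h):
    """Project a 3D point onto 2D interface coordinates (pinhole model).

    Returns (sx, sy) in pixels, with the screen origin at the top-left,
    so the y axis is flipped relative to world space.
    """
    x, y, z = point3d
    sx = screen_w / 2 + focal_length * x / z
    sy = screen_h / 2 - focal_length * y / z
    return (sx, sy)
```

A point on the camera axis, for example (0, 0, 10), lands at the center of an 800x600 interface, which would then serve as the display position of the selection box's center.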
And S502, determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
Optionally, the direction of each target control in the three-dimensional model may be consistent with the direction of the screen, or may differ from it by a certain angle. The operator, however, must always be kept consistent with the direction of the target control currently being operated, so the direction of the operator can be adjusted in real time according to the direction of the target control. Specifically, in this step, the rotation angle of the operator may be determined according to the positions of the two ends of the target control and the initial position of the operating rod. The initial position of the operating rod is its position at the moment before the user operates the target control; it can be represented by the positions of the two end points of the operating rod, through which a ray can be obtained.
It should be noted that if the current target control is the first target control to be operated, that is, no operator was displayed on the interface before the current target control is operated, the initial position of the operating rod may be a default position parallel to the bottom edge of the screen. The positions of the two ends of the target control represent the positions of its two end points, through which another ray can be obtained. The angle formed between the two rays can then be used as the angle difference between the operating rod and the target control, and this angle difference serves as the rotation angle of the operator.
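The angle between the two rays can be computed with a quadrant-aware arctangent. This is only an illustrative sketch; the function and parameter names are hypothetical, and each ray is assumed to be given by two (x, y) endpoints as described above.

```python
import math


def operator_rotation_angle(rod_p1, rod_p2, ctrl_p1, ctrl_p2):
    """Signed angle (degrees) from the rod's ray to the control's ray.

    Each ray is defined by two endpoints; rotating the operator by the
    returned angle aligns its rod with the target control.
    """
    rod_angle = math.atan2(rod_p2[1] - rod_p1[1], rod_p2[0] - rod_p1[0])
    ctrl_angle = math.atan2(ctrl_p2[1] - ctrl_p1[1], ctrl_p2[0] - ctrl_p1[0])
    return math.degrees(ctrl_angle - rod_angle)
```

For a default rod parallel to the bottom of the screen and a vertical control, the operator would be rotated by 90 degrees.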
And S503, rotating the operator according to the rotation angle, displaying the selection box at its determined display position at the rotated angle, and displaying the operating rod, the mouse indication ball, and the prompt box according to the preset positional relationship between the selection box and the operating rod, the mouse indication ball, and the prompt box.
Optionally, the selection box, the operating rod, the mouse indication ball, and the prompt box are the constituent elements of the operator, and their positional relationship within the operator is preset and fixed. Therefore, while the selection box is displayed at the determined display position at the rotation angle of the operator, the operating rod, the mouse indication ball, and the prompt box can be displayed simultaneously according to this fixed positional relationship, thereby displaying the operator above the target control.
Optionally, when the prompt box is displayed, the name, the state, the operation mode, and the like of the currently operated target control may be displayed in the prompt box. The state of the target control can be the current gear of the target control, and the state can be updated in real time along with the operation change of a user. The operation mode of the target control may be, for example: rotate, click, etc.
As an alternative implementation, after the terminal device displays the operator on the interface, the system mouse icon displayed on the interface of the terminal device may be hidden.
After the processing of the foregoing embodiments, when the user operates the target control, the terminal device displays the operator on the interface and adjusts the position of the mouse indication ball on the operating rod according to the user's operation data on the system mouse. The user can learn the state change of the target control from the position change of the mouse indication ball, and the position of the system mouse icon is of little use in this scenario, so the terminal device can hide the system mouse icon, thereby avoiding interference with the user and improving the user experience.
Based on the same inventive concept, a three-dimensional model operation control device corresponding to the three-dimensional model operation control method is also provided in the embodiments of the present application, and as the principle of solving the problem of the device in the embodiments of the present application is similar to that of the three-dimensional model operation control method in the embodiments of the present application, the implementation of the device may refer to the implementation of the method, and the repeated parts are not described again.
Fig. 6 is a block diagram of a three-dimensional model operation control apparatus according to an embodiment of the present disclosure, where the apparatus may be applied to a terminal device, and a three-dimensional model to be operated is displayed on an interface of the terminal device, where the three-dimensional model includes at least one target control to be operated and with an adjustable gear. As shown in fig. 6, the apparatus includes:
the display module 601 is configured to display an operator on an interface of the terminal device when a user operates the target control using a mouse of the terminal device, where the operator includes the following display elements: an operating rod and a mouse indicating ball.
An obtaining module 602, configured to obtain operation data when the user operates the target control using the mouse, where the operation data includes: the drag direction and the drag distance.
A first mapping module 603, configured to perform a first mapping process on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operating rod, and display the mouse indication ball at the target position.
A second mapping module 604, configured to perform a second mapping process on the operation data based on a second mapping model, to obtain a target gear of the operation data mapped on the target control, and switch the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the first mapping module 603 is specifically configured to:
and acquiring the number of gears of the target control and the distance between adjacent gears.
And inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
As an alternative embodiment, the first mapping model is configured to divide the length of the operating rod equally according to the number of gears to obtain a number of candidate positions equal to the number of gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
As an optional implementation manner, the second mapping module 604 is specifically configured to:
and acquiring the distance between adjacent gears of the target control.
And inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
As an optional implementation manner, the second mapping model is configured to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and when it is determined to perform gear shifting, use a previous gear or a next gear of a current gear of the target control as the target gear.
As an optional implementation, the operator further includes: a selection box and a prompt box, where the prompt box is used for displaying information of the selected target control.
The display module 601 is specifically configured to:
and determining the display position of the selected frame according to the position of the target control on the interface.
And determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod.
And rotating the operator according to the rotation angle, displaying the selection box at its determined display position at the rotated angle, and displaying the operating rod, the mouse indication ball, and the prompt box according to the preset positional relationship between the selection box and the operating rod, the mouse indication ball, and the prompt box.
Fig. 7 is another block configuration diagram of a three-dimensional model operation control apparatus according to an embodiment of the present application, and as shown in fig. 7, the apparatus further includes:
a hiding module 605, configured to hide a system mouse icon displayed on the interface of the terminal device.
An embodiment of the present application further provides a terminal device 80, as shown in fig. 8, which is a schematic structural diagram of the terminal device 80 provided in the embodiment of the present application, and includes: a processor 81, a memory 82, and a bus 83. The memory 82 stores machine-readable instructions (e.g., execution instructions corresponding to the display module, the obtaining module, the first mapping module, the second mapping module, and the hiding module in the apparatuses in fig. 6 and 7, etc.) executable by the processor 81, when the terminal device 80 is running, the processor 81 communicates with the memory 82 through the bus 83, and the machine-readable instructions are executed by the processor 81 to perform the method steps in the above method embodiments.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program performs the steps of the three-dimensional model operation control method.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working process of the system and the apparatus described above may refer to the corresponding process in the method embodiment, and is not described in detail in this application. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be through some communication interfaces, indirect coupling or communication connection between devices or modules, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (10)

1. A three-dimensional model operation control method is applied to terminal equipment, a three-dimensional model to be operated is displayed on an interface of the terminal equipment, the three-dimensional model comprises at least one target control with adjustable gears to be operated, and the method comprises the following steps:
when a user uses a mouse of the terminal device to operate the target control, displaying an operator on an interface of the terminal device, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball;
obtaining operation data when the user uses the mouse to operate the target control, wherein the operation data comprises: a dragging direction and a dragging distance;
performing first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse pointer ball at the target position;
and performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
2. The method according to claim 1, wherein the performing a first mapping process on the operation data based on a first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse pointer ball at the target position comprises:
acquiring the number of gears of the target control and the distance between adjacent gears;
and inputting the gear number, the distance between the adjacent gears and the operation data into the first mapping model to obtain a target position of the operation data mapped on the operation rod, and displaying the mouse indicating ball at the target position.
3. The method according to claim 2, wherein the first mapping model is configured to divide the length of the operating rod equally according to the number of gears to obtain a number of candidate positions equal to the number of gears, and to select one of the candidate positions as the target position according to the distance between adjacent gears and the operation data.
4. The method according to claim 1, wherein the second mapping processing is performed on the operation data based on a second mapping model, so as to obtain a target gear of the operation data mapped on the target control, and the gear of the target control in the three-dimensional model is switched to the target gear, and the second mapping processing includes:
acquiring the distance between adjacent gears of the target control;
and inputting the distance between the adjacent gears and the operation data into the second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
5. The method according to claim 4, wherein the second mapping model is used to determine whether to perform gear shifting according to the operation data and the distance between adjacent gears, and when determining to perform gear shifting, the previous gear or the next gear of the current gear on the target control is taken as the target gear.
6. The method of any of claims 1-5, wherein the operator further comprises: a selection box and a prompt box, wherein the prompt box is used for displaying information of the selected target control;
the displaying of the operator on the interface of the terminal device includes:
determining the display position of the selection box according to the position of the target control on the interface;
determining the rotation angle of the operator according to the positions of the two ends of the target control and the initial position of the operating rod;
and rotating the operator according to the rotation angle, displaying the selection box at its display position at the rotated angle, and displaying the operating rod, the mouse indication ball, and the prompt box according to the preset positional relationship between the selection box and the operating rod, the mouse indication ball, and the prompt box.
7. The method according to any one of claims 1-5, wherein after displaying the operator on the interface of the terminal device, further comprising:
and hiding a system mouse icon displayed on the interface of the terminal equipment.
8. The three-dimensional model operation control device is applied to terminal equipment, a three-dimensional model to be operated is displayed on an interface of the terminal equipment, the three-dimensional model comprises at least one target control with adjustable gears to be operated, and the device comprises:
the display module is used for displaying an operator on an interface of the terminal equipment when a user operates the target control by using a mouse of the terminal equipment, wherein the operator comprises the following display elements: an operating rod and a mouse indicating ball;
an obtaining module, configured to obtain operation data when the user operates the target control using the mouse, where the operation data includes: a drag direction and a drag distance;
the first mapping module is used for carrying out first mapping processing on the operation data based on a first mapping model to obtain a target position of the operation data on the operation rod in a mapping mode, and displaying the mouse indicating ball at the target position;
and the second mapping module is used for performing second mapping processing on the operation data based on a second mapping model to obtain a target gear of the operation data mapped on the target control, and switching the gear of the target control in the three-dimensional model to the target gear.
9. A terminal device, comprising: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when a terminal device runs, the processor and the storage medium communicate with each other through the bus, and the processor executes the machine-readable instructions to execute the steps of the three-dimensional model operation control method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, performs the steps of the three-dimensional model operation control method according to any one of claims 1 to 7.
CN202011380120.0A 2020-11-30 2020-11-30 Three-dimensional model operation control method and device and terminal equipment Active CN112486321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011380120.0A CN112486321B (en) 2020-11-30 2020-11-30 Three-dimensional model operation control method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN112486321A CN112486321A (en) 2021-03-12
CN112486321B true CN112486321B (en) 2022-12-13

Family

ID=74937948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011380120.0A Active CN112486321B (en) 2020-11-30 2020-11-30 Three-dimensional model operation control method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN112486321B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114237431A (en) * 2021-12-09 2022-03-25 郑州捷安高科股份有限公司 Interactive equipment control method, device and equipment of simulation scene and readable storage medium
CN114706490A (en) 2022-02-28 2022-07-05 北京所思信息科技有限责任公司 Mouse model mapping method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1804774A (en) * 2004-12-22 2006-07-19 微软公司 Improving touch screen accuracy
WO2017133572A1 (en) * 2016-02-02 2017-08-10 上海逗屋网络科技有限公司 Method and device for moving target object based on touch control
CN107066173A (en) * 2017-03-28 2017-08-18 腾讯科技(深圳)有限公司 Method of controlling operation thereof and device
CN109976650A (en) * 2019-01-25 2019-07-05 网易(杭州)网络有限公司 Man-machine interaction method, device and electronic equipment
CN110115838A (en) * 2019-05-30 2019-08-13 腾讯科技(深圳)有限公司 Method, apparatus, equipment and the storage medium of mark information are generated in virtual environment
CN111773705A (en) * 2020-08-06 2020-10-16 网易(杭州)网络有限公司 Interaction method and device in game scene

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111701226A (en) * 2020-06-17 2020-09-25 网易(杭州)网络有限公司 Control method, device and equipment for control in graphical user interface and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of interactive selection function for STL models; Huang Changbiao et al.; Computer Engineering and Applications; 2005-05-01 (Issue 17); 121-123 *

Also Published As

Publication number Publication date
CN112486321A (en) 2021-03-12

Similar Documents

Publication Publication Date Title
CN112486321B (en) Three-dimensional model operation control method and device and terminal equipment
De Haan et al. IntenSelect: Using Dynamic Object Rating for Assisting 3D Object Selection.
Hix et al. Usability engineering of virtual environments
WO2007130539A2 (en) Geographic information system (gis) for displaying 3d geospatial images with reference markers and related methods
WO2015053266A1 (en) Plant operation training apparatus, control method, program, and plant operation training system
JP2013214275A (en) Three-dimensional position specification method
Tadeja et al. Exploring gestural input for engineering surveys of real-life structures in virtual reality using photogrammetric 3D models
Cabric et al. A predictive performance model for immersive interactions in mixed reality
Muender et al. Comparison of mouse and multi-touch for protein structure manipulation in a citizen science game interface
US20120011457A1 (en) Visualizing a view of a scene
EP2665042A1 (en) Visual processing based on interactive rendering
JP2017224170A (en) Image processing system, image processing method, and program
Raynal et al. Towards unification for pointing task evaluation in 3D desktop virtual environment
US9495124B1 (en) Device for displaying a remote display according to a monitor geometry
US11574113B2 (en) Electronic apparatus, information processing method, and recording medium
Trindade et al. Improving 3D navigation techniques in multiscale environments: a cubemap-based approach
JP3917690B2 (en) 3D image construction device
Gallagher et al. Comparison with self vs comparison with others: The influence of learning analytics dashboard design on learner dashboard use
EP1162527B1 (en) A method in a process control system and a process control system
KR102162136B1 (en) Abacus education apparatus and method using virtual reality
CN111413889A (en) Motion simulation control method and device of four-bar linkage
US9275484B2 (en) Goodness of fit based on error calculation and fit type
CN110531906A (en) A kind of the forms management method and digital oscilloscope of display interface
Scarr Understanding and Exploiting Spatial Memory in the Design of Efficient Command Selection Interfaces
CN114237431A (en) Interactive equipment control method, device and equipment of simulation scene and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Zhang Eryang

Inventor after: Li Zhishuai

Inventor after: Ao Yalei

Inventor after: Hou Xiaolong

Inventor after: Zheng Xu

Inventor before: Zhang Eryang

Inventor before: Li Zhishuai

Inventor before: Ao Yalei

Inventor before: Hou Xiaolong

Inventor before: Zheng Xu

GR01 Patent grant