CN117193619A - Control method, medium and equipment for display proportion of virtual scene - Google Patents

Control method, medium and equipment for display proportion of virtual scene

Info

Publication number
CN117193619A
CN117193619A (application CN202311077029.5A)
Authority
CN
China
Prior art keywords
virtual scene
touch
virtual
touch operation
touch screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311077029.5A
Other languages
Chinese (zh)
Inventor
周文超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311077029.5A
Publication of CN117193619A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/30 Creation or generation of source code
    • G06F8/38 Creation or generation of source code for implementing user interfaces
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method for controlling the display scale of a virtual scene, comprising the following steps: outputting, on a touch screen, a computer graphical interface for presenting and editing the virtual scene; activating and maintaining a drag state of a virtual component in the computer graphical interface according to a first touch operation on the touch screen; when a second touch operation on the touch screen is detected, judging whether the touch event triggered by the first touch operation has terminated; and if not, changing the display scale of the virtual scene according to a gesture event, where the gesture event is triggered by the second touch operation, or by the first touch operation and the second touch operation together. Correspondingly, the invention also provides a medium and a device related to this control method.

Description

Control method, medium and equipment for display proportion of virtual scene
Technical Field
The invention belongs to the technical field of human-computer interaction, and in particular relates to a method, medium, and device for controlling the display scale of a virtual scene.
Background
In existing landscape design, industrial design, and game design, a virtual scene editor is often needed. A virtual scene editor provides functions for customizing a virtual scene together with a corresponding human-computer interaction graphical interface. By operating this interface, a user can carry out the following editing scenario: select a virtual component, place it into the virtual scene, and then adjust its position by dragging until it reaches the desired location in the virtual scene.
In this scenario, because the position of the virtual component must be adjusted, and factors such as the screen display boundary and operating efficiency must be considered, the view scaling function of the virtual scene editor often needs to be triggered repeatedly: for example, a local part of the virtual scene is magnified to allow fine adjustment of the component's placement, or the display scale of the whole virtual scene is reduced so the component can be moved quickly across the entire scene. When the virtual scene editor runs on a terminal equipped with a touch screen, the touch screen is both the display device and the human-computer interaction device of the editor, meaning the user must use the touch screen both to carry out the editing scenario and to trigger the view scaling function.
In the prior art, if the user is dragging a selected virtual component and needs to trigger the view scaling function, the drag operation on the component is usually forced to stop; the user must then issue touch operations such as a two-finger pinch, a two-finger spread, a slider control, or a button tap to perform the scaling, and afterwards re-activate the drag operation on the virtual component to continue editing the virtual scene. Because the drag operation is forcibly interrupted, the efficiency of placing virtual components drops greatly, and when many draggable virtual components are present the number of erroneous operations increases, giving a poor user experience. If the view scaling function is to be triggered without suspending the drag operation, additional peripherals connected to the terminal, such as a mouse or a drawing tablet, must be specially provided, which raises the hardware cost of running the virtual scene editor.
Disclosure of Invention
In order to overcome the above drawbacks of the prior art, the present invention provides a method for controlling the display scale of a virtual scene, the method comprising:
outputting, on a touch screen, a computer graphical interface for presenting and editing the virtual scene;
activating and maintaining a drag state of a virtual component in the computer graphical interface according to a first touch operation on the touch screen;
when a second touch operation on the touch screen is detected, judging whether the touch event triggered by the first touch operation has terminated;
and if not, changing the display scale of the virtual scene according to a gesture event triggered jointly by the first touch operation and the second touch operation.
According to one aspect of the present invention, the step of changing the display scale of the virtual scene according to the gesture event comprises: obtaining the coordinate position of the contact point currently generated on the touch screen by the first touch operation; obtaining, from the touch track generated on the touch screen by the second touch operation, the moving direction and moving distance from the start point to the end point of the track; determining the displacement direction of the viewfinder lens of the virtual scene according to the coordinate position and the moving direction, and determining the displacement value of the viewfinder lens according to the moving distance; and updating the position parameter of the viewfinder lens according to the displacement direction and the displacement value so as to change the display scale.
According to another aspect of the invention, the displacement direction includes a height-raising direction and a height-lowering direction of the viewfinder lens relative to the ground plane of the virtual scene, and the position parameter includes the height value of the viewfinder lens relative to the ground plane.
According to another aspect of the invention, the ratio of the height value to the displacement value is constant.
According to another aspect of the invention, the first touch operation comprises a long-press operation and/or a drag operation applied by a first finger of a user on the touch screen.
According to another aspect of the invention, the second touch operation comprises a sliding operation applied by a second finger of the user on the touch screen.
According to another aspect of the invention, the touch events include a single-finger long-press event and a single-finger drag event.
According to another aspect of the invention, the gesture events include a single-finger swipe event, a two-finger pinch event, and a two-finger spread event.
According to another aspect of the invention, the virtual scene is a three-dimensional virtual scene, and the virtual component is a three-dimensional virtual component.
According to another aspect of the present invention, after changing the display scale of the virtual scene according to the gesture event, the method further comprises: judging whether the distance between the visual center of the virtual component and the contact point currently generated on the touch screen by the first touch operation is greater than a preset threshold; and if so, refreshing the display of the virtual scene so that the distance between the visual center and the contact point is less than or equal to the preset threshold.
Furthermore, the present invention provides one or more computer-readable media storing computer-executable instructions that, when used by one or more computer devices, cause the one or more computer devices to perform the method for controlling the display scale of a virtual scene described above.
The invention also provides a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor, when executing the computer program, implements the method for controlling the display scale of a virtual scene described above.
When the method for controlling the display scale of a virtual scene provided by the invention is applied to a virtual scene editor, a change in the display scale of the virtual scene can be triggered while the drag state of a virtual component is activated and maintained. This effectively improves the efficiency of editing the virtual scene, reduces erroneous operations in the virtual scene editor, makes full use of the multi-touch capability of the touch screen, and significantly improves the user experience.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of one embodiment of a method for controlling the display scale of a virtual scene according to the present invention;
FIG. 2 is a flow chart of a preferred embodiment of step S400 shown in FIG. 1, in which the display scale of the virtual scene is changed according to a gesture event;
FIGS. 3 to 9 are schematic views of scenes for explaining the principle of the embodiment shown in FIG. 1;
FIG. 10 is a schematic diagram for explaining the display scale changing principle of the virtual scene in the embodiment shown in FIG. 1;
FIG. 11 is a schematic diagram for explaining the principle of the preferred embodiment shown in FIG. 2;
FIG. 12 is a flow chart of an alternative embodiment of a method for controlling the display scale of a virtual scene according to the present invention;
FIGS. 13 to 15 are schematic views of scenes for explaining the principles of the alternative embodiment shown in FIG. 12;
FIG. 16 is a schematic diagram of an exemplary intelligent terminal for implementing the method for controlling the display scale of a virtual scene of the present invention.
The same or similar reference numbers in the drawings refer to the same or similar parts.
Detailed Description
For a better understanding and explanation of the present invention, reference will be made to the following detailed description taken in conjunction with the accompanying drawings. The invention is not limited to these specific embodiments; on the contrary, it is intended to cover modifications and equivalent arrangements included within the scope of the appended claims.
It should be noted that numerous specific details are set forth in the following detailed description. It will be understood by those skilled in the art that the present invention may be practiced without some of these specific details. In the following description of various embodiments, well-known principles, structures and components are not described in detail so as not to obscure the salient points of the invention.
Referring to FIG. 1, FIG. 1 is a schematic flow chart of a specific embodiment of the method for controlling the display scale of a virtual scene according to the present invention. The method includes:
step S100, outputting, on a touch screen, a computer graphical interface for presenting and editing the virtual scene;
step S200, activating and maintaining a drag state of a virtual component in the computer graphical interface according to a first touch operation on the touch screen;
step S300, when it is detected that a second touch operation is applied to the touch screen, judging whether the touch event triggered by the first touch operation has terminated, and if it has not terminated, executing step S400;
step S400, changing the display scale of the virtual scene according to a gesture event, where the gesture event is triggered by the second touch operation, or by the first touch operation and the second touch operation together (a brief code sketch of this flow follows the list).
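For illustration only, the four steps can be rendered as touch-event handling logic. The sketch below is a minimal TypeScript illustration using standard DOM touch events; the `SceneEditor` interface and its methods (`pickComponent`, `setDragState`, `zoomBy`) are assumptions made for the sketch, not part of the disclosure.

```typescript
// Minimal sketch of steps S200-S400 under the assumptions stated above;
// a demonstration, not the patented implementation.
interface SceneEditor {
  pickComponent(x: number, y: number): string | null; // virtual component id, if hit
  setDragState(id: string, active: boolean): void;    // activate/release drag state
  zoomBy(factor: number): void;                       // change the display scale
}

function attachHandlers(screen: HTMLElement, editor: SceneEditor): void {
  let dragTouchId: number | null = null;   // identifier of the first touch operation
  let dragComponent: string | null = null; // component whose drag state is active
  let lastSpread = 0;                      // last distance between the two contacts

  screen.addEventListener("touchstart", (e: TouchEvent) => {
    if (dragTouchId === null) {
      // S200: the first touch operation activates and maintains the drag state.
      const t = e.changedTouches[0];
      dragComponent = editor.pickComponent(t.clientX, t.clientY);
      if (dragComponent !== null) {
        dragTouchId = t.identifier;
        editor.setDragState(dragComponent, true);
      }
    } else if (e.touches.length >= 2) {
      // S300: a second touch operation is detected while the touch event of the
      // first operation has not terminated; the drag state stays active.
      lastSpread = spread(e.touches);
    }
  });

  screen.addEventListener("touchmove", (e: TouchEvent) => {
    if (dragTouchId !== null && e.touches.length >= 2) {
      // S400: the gesture event changes the display scale; moving the dragged
      // component itself is omitted here for brevity.
      const s = spread(e.touches);
      if (lastSpread > 0) editor.zoomBy(s / lastSpread);
      lastSpread = s;
    }
  });

  screen.addEventListener("touchend", (e: TouchEvent) => {
    for (const t of Array.from(e.changedTouches)) {
      if (t.identifier === dragTouchId && dragComponent !== null) {
        editor.setDragState(dragComponent, false); // first touch event terminated
        dragTouchId = null;
        dragComponent = null;
      }
    }
    lastSpread = 0;
  });
}

// Distance between the first two contact points on the screen.
function spread(touches: TouchList): number {
  const a = touches[0], b = touches[1];
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}
```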
To explain the technical principle of this embodiment more clearly, the following explanation of the steps shown in FIG. 1 may be read in conjunction with the appropriate ones of FIGS. 3 to 9, which are schematic views of scenes illustrating the principle of the embodiment shown in FIG. 1.
Specifically, the method for controlling the display scale of a virtual scene is suited to scenarios in which a virtual scene editor runs on a terminal device equipped with a touch screen. Typically, such terminal devices include, but are not limited to: portable computers, desktop workstations, smartphones, tablet computers, smart televisions, and wearable smart devices. To support normal operation of the virtual scene editor, the operating system running on the terminal device is, for example, any one or a combination of Windows, iOS, Android, Symbian, Windows Mobile, Maemo, WebOS, Palm OS, and BlackBerry OS.
In step S100, the computer graphical interface output on the touch screen serves to present and edit a virtual scene; that is, at least a portion of the virtual scene should be included in the computer graphical interface. The term "virtual scene" refers to a scene generated by a software program running on a terminal device and rendered using computer graphics techniques. It may simulate a real-world landscape, combine simulated and fictitious landscapes, or consist entirely of fictitious landscapes, and it may be two-dimensional or three-dimensional. Typically, in this embodiment, the virtual scene is a three-dimensional virtual scene. Referring to FIG. 3, the terminal device 100 has a touch screen 110 on which a computer graphical interface 111 is output. The computer graphical interface 111 presents a virtual scene as shown in FIG. 3, which includes virtual landscape elements such as ground, roads, buildings, vehicles, and trees, and the virtual scene can be edited.
In step S200, a drag state of a virtual component in the computer graphical interface is activated and maintained according to a first touch operation on the touch screen. The first touch operation generally refers to a single contact action, or several related contact actions, applied by a user's finger to the touch screen; its purpose is to trigger an instruction for controlling the computer graphical interface through the touch screen, thereby enabling human-computer interaction with the interface. The term "virtual component" refers to a virtual graphic element, or combination of virtual graphic elements, existing in the virtual scene, from which virtual landscape elements such as the ground, roads, buildings, vehicles, and trees shown in FIG. 3 may be constructed. A virtual scene may include a plurality of virtual components, each having a preset shape and volume/area and occupying a portion of the space of the virtual scene. A virtual component may be two-dimensional or three-dimensional, matching the type of the virtual scene; typically, in this embodiment, the virtual component is a three-dimensional virtual component. Referring to FIG. 4, the virtual scene presented by the computer graphical interface 111 includes a virtual component 201. When the drag state of the virtual component 201 is not activated, the virtual component 201 remains at a fixed position in the virtual scene. Since the virtual scene is editable, in one embodiment the user may touch the display position of the virtual component 201 on the touch screen 110 with an index finger 301 to issue the first touch operation, which activates and maintains the drag state of the virtual component 201. While the virtual component 201 is in the activated drag state, the user may continue to issue instructions via the touch screen 110 to drag the virtual component 201 freely within the virtual scene. As shown in FIG. 4, for example, the user slides the index finger 301 across the touch screen 110 to drag the virtual component 201 from its initial position to the area adjacent to another virtual component 202. Referring to FIG. 5, after the user finishes this drag operation, the virtual scene presented in the computer graphical interface 111 takes on the layout shown in FIG. 5, i.e., the virtual component 201 has been dragged to a new position. Typically, the first touch operation includes a long-press operation and/or a drag operation applied by a first finger of the user, such as the index finger 301 in FIG. 4, on the touch screen.
Although the virtual component 201 has now been dragged near the virtual component 202, the user most likely still wants to fine-tune its final placement. At the display scale of the virtual scene shown in FIG. 5, such fine-tuning is difficult, so the user naturally needs to change the display scale, i.e., to zoom the virtual scene (zoom in, for the scene shown in FIG. 5). The user also needs the drag state of the virtual component 201 to remain activated and maintained while the display scale changes, which avoids the extra step of re-activating the drag state after the change is complete and thus improves the efficiency of editing the position of the virtual component 201 in the virtual scene.
In step S300, when it is detected that a second touch operation is applied to the touch screen, it is judged whether the touch event triggered by the first touch operation has terminated, and if not, step S400 is executed. Typically, the touch events include a single-finger long-press event and a single-finger drag event. From the working principle of a conventional touch screen, if the touch event has not terminated, the first touch operation has not ended either, and correspondingly the drag state of the virtual component is still active. Typically, the second touch operation comprises a sliding operation applied by a second finger of the user on the touch screen; this type of operation matches the gesture habits of users zooming with a touch screen. When the touch event triggered by the first touch operation is judged not to have terminated, step S400 is performed: the display scale of the virtual scene is changed according to a gesture event triggered by the second touch operation, or by the first and second touch operations together. Typically, the gesture events include a single-finger swipe event, a two-finger pinch event, and a two-finger spread event.
To better illustrate the various embodiments of steps S300 and S400, refer to FIG. 6. The index finger 301 of the user remains in contact with the touch screen 110, so the touch event triggered by the first touch operation has not terminated. Meanwhile, the middle finger 302 of the user slides on the touch screen 110 along direction A shown in FIG. 6; that is, the middle finger 302 issues the second touch operation, so that the distance between the index finger 301 and the middle finger 302 increases. The gesture event triggered on the touch screen 110 may then be recognized either as a single-finger swipe event triggered by the second touch operation alone, or as a two-finger spread event triggered by the first and second touch operations together. Under either gesture event, the display scale of the virtual scene presented in the computer graphical interface 111 is enlarged accordingly, for example from the view in FIG. 6 to the view in FIG. 7: the local area where the virtual component 201 is located is magnified and occupies more display area in the computer graphical interface 111 in FIG. 7 than in FIG. 6. From the view shown in FIG. 7, if the user wishes to reduce the display scale of the virtual scene so as to observe its overall layout, the middle finger 302 may slide on the touch screen 110 roughly along direction C shown in FIG. 7, again issuing the second touch operation, so that the distance between the middle finger 302 and the index finger 301 decreases. In this case the gesture event may be recognized either as a single-finger swipe event triggered by the second touch operation alone, or as a two-finger pinch event triggered by the first and second touch operations together, and the display scale of the virtual scene is reduced accordingly, for example from the view in FIG. 7 back to the view in FIG. 6.
In the scene diagrams of FIGS. 6 and 7 the second touch operation is issued with the middle finger 302, i.e., the aforementioned second finger is the middle finger 302, but this does not mean the second touch operation must, or can only, be issued with the middle finger 302. It may be issued with any finger, even a finger of the other hand or another limb, to match different users' operating habits. Referring to FIG. 8, with the index finger 301 still in contact with the touch screen 110 and the first touch operation therefore not yet finished, the user may slide the thumb 303 on the touch screen 110 along direction B shown in FIG. 8; that is, the thumb 303 issues the second touch operation, so that the distance between the index finger 301 and the thumb 303 increases. The gesture event may then be recognized either as a single-finger swipe event triggered by the second touch operation alone, or as a two-finger spread event triggered by the first and second touch operations together, and the display scale of the virtual scene presented in the computer graphical interface 111 is enlarged accordingly, for example from the view in FIG. 8 to the view in FIG. 9; as in FIG. 7, the local area where the virtual component 201 is located is magnified. Similarly, from the view shown in FIG. 9, if the user wishes to reduce the display scale so as to observe the overall layout of the virtual scene, the thumb 303 may slide along direction D shown in FIG. 9 while the index finger 301 remains in contact with the touch screen 110, so that the distance between the index finger 301 and the thumb 303 decreases. The gesture event may then be recognized either as a single-finger swipe event triggered by the second touch operation alone, or as a two-finger pinch event triggered by both touch operations together, and the display scale of the virtual scene is reduced accordingly, for example from the view in FIG. 9 back to the view in FIG. 8. The logic that distinguishes these cases can be sketched as follows.
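As a loose sketch (the event names below are descriptive labels chosen for the sketch, not identifiers from the disclosure), the recognition described around FIGS. 6 to 9 reduces to comparing the distance between the two contact points across samples:

```typescript
// Sketch of the gesture classification implied by FIGS. 6 to 9: while the touch
// event of the first operation is alive, a growing spread means zoom in and a
// shrinking spread means zoom out.
type GestureEvent = "two-finger-spread" | "two-finger-pinch" | "none";

function classifyGesture(
  firstTouchAlive: boolean, // first touch event not yet terminated
  prevSpread: number,       // previous distance between the two contact points
  currSpread: number,       // current distance between the two contact points
): GestureEvent {
  if (!firstTouchAlive || prevSpread <= 0) return "none";
  if (currSpread > prevSpread) return "two-finger-spread"; // enlarge display scale
  if (currSpread < prevSpread) return "two-finger-pinch";  // reduce display scale
  return "none";
}
```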
The change of display scale of a virtual scene is generally implemented by changing the overhead view or the height of the viewfinder lens of the virtual scene. Referring to FIG. 10, a schematic diagram explaining the display scale changing principle in the embodiment of FIG. 1, the ground plane 112 is the plane of the preset virtual ground in the virtual scene, and the height of the viewfinder lens 400 of the virtual scene generally means its height relative to the ground plane 112. It will be appreciated that, with the field of view of the viewfinder lens 400 fixed, when the viewfinder lens 400 is farther from the ground plane 112, the projection area of its field of view on the ground plane 112 is larger, so the display scale of the virtual scene is smaller; when the viewfinder lens 400 is closer to the ground plane 112, the projection area is smaller, so the display scale is larger. FIG. 10 thus helps in understanding the lens principle behind, for example, the change of the display scale from the view in FIG. 6 to that in FIG. 7, and from FIG. 8 to FIG. 9. As the viewfinder lens 400 moves from the contour plane at height E0 to the contour plane at height E1, the display scale of the virtual scene gradually increases; if the viewfinder lens 400 moves from the contour plane at height E1 back to that at height E0, the display scale decreases.
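As an illustrative aside (an assumption for the sketch, since the disclosure does not fix the lens model): if the viewfinder lens 400 points straight down with a fixed vertical field of view 2θ, the visible ground width grows linearly with the height h, so the display scale falls off as 1/h:

```latex
% Sketch of the lens geometry under the stated top-down, fixed-field-of-view
% assumption; h is the height of the viewfinder lens above the ground plane 112.
\[
  W(h) = 2\,h\tan\theta,
  \qquad
  \text{display scale} \;\propto\; \frac{1}{W(h)} \;\propto\; \frac{1}{h},
\]
% so lowering the lens from the contour plane at height E0 to a lower contour
% plane E1 shrinks the ground footprint W(h) and enlarges the scene on screen,
% matching the behaviour around FIG. 10.
```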
Based on the above explanation, please refer to FIG. 2, a flowchart of a preferred embodiment of changing the display scale of the virtual scene according to the gesture event in step S400 of FIG. 1. Preferably, this step includes:
step S410, obtaining the coordinate position of the contact point currently generated on the touch screen by the first touch operation;
step S420, obtaining, from the touch track generated on the touch screen by the second touch operation, the moving direction and the moving distance from the start point to the end point of the track;
step S430, determining the displacement direction of the viewfinder lens of the virtual scene according to the coordinate position and the moving direction, and determining the displacement value of the viewfinder lens according to the moving distance;
step S440, updating the position parameter of the viewfinder lens according to the displacement direction and the displacement value so as to change the display scale.
Steps S410 to S440 are described with reference to FIG. 11, a schematic diagram explaining the principle of the preferred embodiment shown in FIG. 2. Specifically, after the gesture event is generated, step S410 is triggered. In step S410, the first touch operation generates a contact point 310 on the touch screen 110, and the coordinate position of the contact point 310 is obtained. In step S420, a touch track is generated on the touch screen 110 by the second touch operation, and the moving direction and moving distance from the start point 322 to the end point 321 of the track are obtained; both can be calculated from the coordinate values of the start point 322 and the end point 321. In step S430, the displacement direction of the viewfinder lens of the virtual scene is determined from the coordinate position and the moving direction, where the displacement direction includes a height-raising direction and a height-lowering direction of the viewfinder lens relative to the ground plane of the virtual scene. For example, if the moving direction is judged to approach the contact point 310, the second finger issuing the second touch operation and the first finger issuing the first touch operation are considered to be closing on each other, and the displacement direction of the viewfinder lens is accordingly determined as the height-raising direction relative to the ground plane; otherwise it is determined as the height-lowering direction. Since the displacement direction alone is not sufficient to define the position change of the viewfinder lens, the displacement value of the viewfinder lens is also determined from the moving distance in step S430; for example, a conversion function from moving distance to displacement value may be set in advance, so that the displacement value can be calculated once the moving distance is determined. In step S440, the position parameter of the viewfinder lens is updated according to the displacement direction and the displacement value, thereby changing the display scale. Typically, the position parameter comprises the height value of the viewfinder lens relative to the ground plane. For a smoother, more intuitive operating experience, the ratio of the height value to the displacement value may preferably be kept constant.
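The sub-steps can be sketched as follows, continuing the TypeScript illustration; the `Camera` shape, the constant `K`, and the proximity test for the moving direction are assumptions made for the sketch, not details fixed by the disclosure.

```typescript
// Sketch of steps S410-S440, assuming the only position parameter of interest
// is the height of the viewfinder lens above the ground plane.
interface Camera { height: number } // height value relative to the ground plane

const K = 0.004; // hypothetical per-pixel conversion factor for the displacement

function applyScaleGesture(
  cam: Camera,
  firstContact: { x: number; y: number }, // S410: contact point of the first touch
  trackStart: { x: number; y: number },   // S420: start point of the touch track
  trackEnd: { x: number; y: number },     // S420: end point of the touch track
): void {
  // S420: moving distance from the start point to the end point of the track.
  const moveDist = Math.hypot(trackEnd.x - trackStart.x, trackEnd.y - trackStart.y);

  // S430: if the track moves toward the first contact point, the two fingers
  // are closing, so the lens rises (height-raising direction, scene shrinks);
  // otherwise it descends (height-lowering direction, scene enlarges).
  const dStart = Math.hypot(firstContact.x - trackStart.x, firstContact.y - trackStart.y);
  const dEnd = Math.hypot(firstContact.x - trackEnd.x, firstContact.y - trackEnd.y);
  const direction = dEnd < dStart ? +1 : -1;

  // S430/S440: a displacement proportional to the current height keeps the
  // ratio of height value to displacement value constant; then the position
  // parameter of the viewfinder lens is updated.
  const displacement = K * moveDist * cam.height;
  cam.height = Math.max(0.001, cam.height + direction * displacement);
}
```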
With continued reference to FIG. 1, in step S200, when the drag state of the virtual component in the computer graphical interface is activated and maintained by the first touch operation, the user typically applies the first touch operation to the activation area of the virtual component on the touch screen; more typically, the contact position of the first touch operation on the touch screen is the display position of the virtual component. As the first touch operation continues and the display scale of the virtual scene is changed in step S400, for example from the view in FIG. 7 to the view in FIG. 6, the display scale of the virtual component changes with it, since the virtual component is part of the virtual scene, and its display position on the touch screen may also change. The contact position of the first touch operation, however, usually does not shift much, so the visual center of the virtual component and the contact point may become misaligned as the display scale changes. To make the visual feedback of the computer graphical interface better match the user's operating habits in this situation, a compensation and correction mechanism for the virtual scene may be introduced after step S400. Referring to FIG. 12, a schematic flow chart of an alternative embodiment of the method for controlling the display scale of a virtual scene according to the present invention, the alternative embodiment differs from the embodiment of FIG. 1 in that, after step S400, it further includes:
step S500, judging whether the distance between the visual center of the virtual component and the contact point currently generated on the touch screen by the first touch operation is greater than a preset threshold, and if so, executing step S600;
step S600, refreshing the display of the virtual scene so that the distance between the visual center and the contact point is less than or equal to the preset threshold.
Specifically, in step S500, the visual center of the virtual component may, for example, correspond to the geometric center point determined by the shape of the virtual component itself, and the specific position of the contact point may be determined by sampling the touch screen with a background program. Since both the geometric center point and the contact point have corresponding coordinate values, the distance between them can be calculated from those values. The preset threshold lies, for example, in the range of 100 to 300 pixels. When the distance is judged to be greater than the preset threshold, the visual center of the virtual component is considered to be significantly misaligned with the contact point currently generated on the touch screen by the first touch operation, and step S600 is therefore executed.
In step S600, refreshing the display of the virtual scene may generally be implemented by re-rendering the virtual scene and displaying it in the computer graphical interface, with the aim of calibrating the visual center so that the distance between the visual center and the contact point is less than or equal to the preset threshold, and preferably so that the visual center substantially coincides with the contact point.
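A loose TypeScript sketch of steps S500 and S600 follows; the `panSceneBy` callback and the threshold value are assumptions made for the sketch.

```typescript
// Sketch of the compensation mechanism (S500/S600): if the visual center of the
// dragged component has drifted from the finger's contact point after a zoom,
// refresh the scene so the two roughly coincide again.
const THRESHOLD_PX = 200; // example preset threshold, within the 100 to 300 px range

function recenterAfterZoom(
  visualCenter: { x: number; y: number }, // screen position of the component's center
  contact: { x: number; y: number },      // current contact point of the first touch
  panSceneBy: (dx: number, dy: number) => void, // hypothetical re-render/pan call
): void {
  const dx = contact.x - visualCenter.x;
  const dy = contact.y - visualCenter.y;
  // S500: judge whether the center-to-contact distance exceeds the threshold.
  if (Math.hypot(dx, dy) > THRESHOLD_PX) {
    // S600: refresh the display so the visual center moves toward the contact
    // point (the distance becomes <= the preset threshold, ideally ~0).
    panSceneBy(dx, dy);
  }
}
```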
For a clearer description of the embodiment shown in FIG. 12, refer to FIGS. 13 to 15, schematic views of scenes explaining its principle. In the scene diagram of FIG. 13, the display scale of the virtual scene presented in the computer graphical interface 111 on the touch screen 110 has been changed, for example from the view shown in FIG. 5 to the view shown in FIG. 13. As shown in FIG. 5, before zooming, the user's index finger 301 was pressed on the virtual component 201; after zooming, the virtual component 201 and its surrounding graphic elements are scaled synchronously, which changes the display position of the visual center 2011 of the virtual component 201 on the touch screen 110, while the index finger 301 barely moves, so the contact point 3011 between the index finger 301 and the touch screen 110 remains at its original position. Referring to the scene diagram of FIG. 14, there is a clear distance between the visual center 2011 and the contact point 3011. When this distance is judged greater than the preset threshold, the virtual scene containing the virtual component 201 is refreshed accordingly, i.e., rendered again and presented in the computer graphical interface 111, so that the distance between the visual center 2011 and the contact point 3011 becomes less than or equal to the preset threshold; that is, after the refresh, the visual center 2011 of the virtual component 201 is closer to the contact point 3011. Preferably, the view shown in FIG. 15 may be reached, in which the visual center 2011 finally substantially coincides with the contact point 3011. FIGS. 14 and 15 are each illustrated with a static frame of the computer graphical interface 111 presenting the virtual scene, but this does not mean the switch from the view of FIG. 14 to that of FIG. 15 is instantaneous; a suitable animation effect may be added to the switching process, for example triggering an automatic continuous rightward translation of the entire virtual scene in the computer graphical interface 111 that stops once the visual center 2011 substantially coincides with the contact point 3011.
By executing steps S500 and S600, the misalignment between the visual center of the virtual component and the contact point generated on the touch screen by the first touch operation is corrected, making the visual feedback of the computer graphical interface better match the user's operating habits.
It should be noted that although the operations of the method of the present invention are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that order, or that all illustrated operations must be performed, to achieve desirable results. Rather, the steps depicted in the flowcharts may be executed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one, and/or one step may be decomposed into multiple steps.
Referring to FIG. 16, FIG. 16 is a schematic structural diagram of a typical intelligent terminal for implementing the method for controlling the display scale of a virtual scene according to the present invention. FIG. 16 schematically shows the common structure of an intelligent terminal; its internal components, software, and protocol structure are described below with reference to the figure.
The intelligent terminal has a processor 510 responsible for its overall operation, which may be implemented with any commercially available central processing unit (CPU), digital signal processor (DSP), or other electronically programmable logic device (PLD). The processor 510 has an associated memory 520, which includes, but is not limited to, RAM, ROM, EEPROM, flash memory, or a combination thereof. The memory 520 is controlled by the processor 510 for various purposes, one of which is storing program instructions and data for the various software in the intelligent terminal.
The software layer of the intelligent terminal includes a real-time operating system 540, drivers for a human-machine interface 560, an application handler 550, and various applications, such as a text editor 551, a handwriting recognition application 552, and other multimedia applications 553. The latter typically include applications for voice calls, video calls, sending and receiving Short Message Service (SMS) messages, Multimedia Message Service (MMS) or email, web browsing, instant messaging, a phonebook, a calendar, a control panel, a camera, one or more video games, a notepad, and the like. It should be noted that two or more of the above applications may be executed as the same application.
The intelligent terminal also includes one or more hardware controllers that, together with the drivers for the human-machine interface 560, cooperate with the display device 561, physical keys 562, microphone 563, and various I/O devices (such as speakers, vibrators, ringer generators, LED indicators, etc.) to enable human-machine interaction. Those skilled in the art will appreciate that a user may operate the terminal through the human-machine interface 560 thus formed.
The software layer of the intelligent terminal may further include various communication-related modules, protocol stacks, drivers, etc., summarized as the communication interface 570 in FIG. 16, which provides communication services (such as transport, networking, and connectivity) through a wireless radio frequency interface 571 and, optionally, a Bluetooth interface 572 and/or an infrared interface 573, for the interconnection and interworking of intelligent terminals. The wireless radio frequency interface 571 includes an internal or external antenna and the circuitry needed to establish and maintain a wireless link to a base station. As is well known to those skilled in the art, this circuitry comprises a series of analog and digital electronic components that together form a radio receiver and transmitter, including, for example, band-pass filters, amplifiers, mixers, local oscillators, low-pass filters, and AD/DA converters.
The intelligent terminal may further comprise a card reading device 530, configured to read information from a SIM card and, in cooperation with the wireless radio frequency interface 571, access the network provided by the operator.
Accordingly, the present invention also discloses one or more computer-readable media storing computer-executable instructions that, when used by one or more computer devices, cause the one or more computer devices to perform the embodiments of the method for controlling the display scale of a virtual scene described above, such as the method shown in FIG. 1. Computer-readable media can be any available media that can be accessed by the computer device, and include both volatile and non-volatile media, and removable and non-removable media, implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. Computer-readable media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computing device. Combinations of any of the above should also be included within the scope of computer-readable media.
Correspondingly, the invention also discloses a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor, when executing the computer program, implements the embodiments of the method for controlling the display scale of a virtual scene described above, such as the method shown in FIG. 1.
The software-logic portion of the method for controlling the display scale of a virtual scene provided by the present invention may be implemented using a programmable logic device, or as a computer program product that causes a computer to execute the exemplified method. The computer program product comprises a computer-readable storage medium having computer program logic or code portions embodied therein for carrying out the steps of the software-logic portion described above. The computer-readable storage medium may be a built-in medium installed in the computer or a removable medium detachable from the computer main body (e.g., a hot-pluggable storage device). The built-in medium includes, but is not limited to, rewritable non-volatile memory such as RAM, ROM, and hard disks. The removable medium includes, but is not limited to: optical storage media (e.g., CD-ROM and DVD), magneto-optical storage media (e.g., MO), magnetic storage media (e.g., magnetic tape or removable hard disks), media with built-in rewritable non-volatile memory (e.g., memory cards), and media with built-in ROM (e.g., ROM cartridges).
It will be appreciated by those skilled in the art that any computer system having suitable programming means is capable of executing the steps of the method of the present invention embodied in a computer program product. Although most of the specific embodiments described in this specification focus on software programs, alternative embodiments that implement the methods provided by the present invention in hardware are also within the scope of the invention as claimed.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is evident that the word "comprising" does not exclude other elements, units or steps, and that the singular does not exclude a plurality. A plurality of components, units or means recited in the claims can also be implemented by means of one component, unit or means in software or hardware.
When the method for controlling the display scale of a virtual scene provided by the invention is applied to a virtual scene editor, a change in the display scale of the virtual scene can be triggered while the drag state of a virtual component is activated and maintained. This effectively improves the efficiency of editing the virtual scene, reduces erroneous operations in the virtual scene editor, makes full use of the multi-touch capability of the touch screen, and significantly improves the user experience.
The above disclosure is intended to be illustrative only and not limiting; the scope of the invention is to be given the full breadth of the appended claims and any and all equivalents thereof.

Claims (12)

1. A display control method for a virtual scene, the method comprising:
outputting, on a touch screen, a computer graphical interface for presenting the virtual scene;
activating and maintaining a drag state of a virtual component in the computer graphical interface according to a first touch operation on the touch screen;
when it is detected that a second touch operation is applied to the touch screen, judging whether a touch event triggered by the first touch operation has terminated;
and if it has not terminated, changing a display scale of the virtual scene according to a change in the distance between a contact point of the first touch operation and a contact point of the second touch operation.
2. The control method of a display scale of a virtual scene according to claim 1, wherein the step of changing the display scale of the virtual scene comprises:
obtaining a coordinate position of a contact point currently generated on the touch screen by the first touch operation;
obtaining, according to a touch track generated on the touch screen by the second touch operation, a moving direction and a moving distance from a start point to an end point of the touch track;
determining a displacement direction of a viewfinder lens of the virtual scene according to the coordinate position and the moving direction, and determining a displacement value of the viewfinder lens according to the moving distance;
and updating a position parameter of the viewfinder lens according to the displacement direction and the displacement value so as to change the display scale.
3. The control method of display scale of virtual scene according to claim 2, wherein:
the displacement direction includes a height-raising direction and a height-lowering direction of the viewfinder lens relative to a ground plane of the virtual scene;
the position parameter includes a height value of the viewfinder lens relative to the ground plane.
4. The control method of display scale of virtual scene according to claim 3, wherein:
the ratio of the height value to the displacement value is constant.
5. The control method of a display scale of a virtual scene according to claim 1 or 2, wherein:
the first touch operation includes a long-press operation and/or a drag operation applied by a first finger of a user on the touch screen.
6. The control method of a display scale of a virtual scene according to claim 1 or 2, wherein:
the second touch operation includes a sliding operation applied by a second finger of the user on the touch screen.
7. The control method of display scale of virtual scene according to claim 1, wherein:
the touch events include a single-finger long press event and a single-finger drag event.
8. The control method of a display scale of a virtual scene according to claim 1 or 2, wherein:
the gesture events include a single-finger swipe event, a two-finger pinch event, and a two-finger spread event.
9. The control method of a display scale of a virtual scene according to claim 1 or 2, wherein:
the virtual scene is a three-dimensional virtual scene;
the virtual component is a three-dimensional virtual component.
10. The control method of the display scale of the virtual scene according to claim 1, further comprising, after the changing of the display scale of the virtual scene according to the gesture event:
judging whether the distance between a visual center of the virtual component and the contact point currently generated on the touch screen by the first touch operation is greater than a preset threshold;
and if it is, refreshing the display of the virtual scene so that the distance between the visual center and the contact point is less than or equal to the preset threshold.
11. One or more computer-readable media storing computer-executable instructions that, when used by one or more computer devices, cause the one or more computer devices to perform the method of controlling the display scale of a virtual scene as claimed in any one of claims 1 to 10.
12. A computer device comprising a memory and a processor, wherein:
the memory stores a computer program, and the processor executes the computer program to implement the method for controlling the display scale of the virtual scene according to any one of claims 1 to 10.
CN202311077029.5A 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene Pending CN117193619A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311077029.5A CN117193619A (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110982744.8A CN113721911B (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene
CN202311077029.5A CN117193619A (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110982744.8A Division CN113721911B (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene

Publications (1)

Publication Number Publication Date
CN117193619A 2023-12-08

Family

ID=78677963

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311077029.5A Pending CN117193619A (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene
CN202110982744.8A Active CN113721911B (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202110982744.8A Active CN113721911B (en) 2021-08-25 2021-08-25 Control method, medium and equipment for display proportion of virtual scene

Country Status (1)

Country Link
CN (2) CN117193619A (en)

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7844915B2 (en) * 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US20110304584A1 (en) * 2009-02-23 2011-12-15 Sung Jae Hwang Touch screen control method and touch screen device using the same
KR101567785B1 (en) * 2009-05-28 2015-11-11 삼성전자주식회사 Apparatus and method for controlling zoom function of a portable terminal
CN102541319A (en) * 2010-12-20 2012-07-04 联想(北京)有限公司 Electronic equipment and display processing method thereof
US9317196B2 (en) * 2011-08-10 2016-04-19 Microsoft Technology Licensing, Llc Automatic zooming for text selection/cursor placement
CN102566908A (en) * 2011-12-13 2012-07-11 鸿富锦精密工业(深圳)有限公司 Electronic equipment and page zooming method for same
CN102968273B (en) * 2012-11-20 2015-08-05 鸿富锦精密工业(深圳)有限公司 Electronic equipment and page zoom-in and zoom-out method thereof
US9027153B2 (en) * 2013-03-15 2015-05-05 Google Technology Holdings LLC Operating a computer with a touchscreen
CN103677721B (en) * 2013-12-27 2016-01-06 腾讯科技(深圳)有限公司 Terminal demonstration interface is carried out to method and the terminal device of convergent-divergent
CN105159579A (en) * 2015-08-27 2015-12-16 努比亚技术有限公司 multi-touch apparatus and method based on back sensor
CN105573658A (en) * 2016-02-01 2016-05-11 西安睿尚信息科技有限公司 Method and system for switching input, page moving and page scaling on basis of handwriting
US10559117B2 (en) * 2016-02-08 2020-02-11 Google Llc Interactions and scaling in virtual reality
CN107168636B (en) * 2017-05-18 2020-05-12 广州视源电子科技股份有限公司 Multi-touch gesture recognition method and device, touch screen terminal and storage medium
CN108920069B (en) * 2018-06-13 2020-10-23 网易(杭州)网络有限公司 Touch operation method and device, mobile terminal and storage medium
US11004270B2 (en) * 2018-09-11 2021-05-11 Houzz, Inc. Virtual item placement system
CN109621415A (en) * 2018-12-26 2019-04-16 网易(杭州)网络有限公司 Display control method and device in 3D game, computer storage medium
CN110032326A (en) * 2019-03-29 2019-07-19 网易(杭州)网络有限公司 Mobile terminal shows control method, device, equipment and the storage medium of picture
CN110575671B (en) * 2019-10-08 2023-08-11 网易(杭州)网络有限公司 Method and device for controlling viewing angle in game and electronic equipment

Also Published As

Publication number Publication date
CN113721911B (en) 2023-09-26
CN113721911A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
JP7033152B2 (en) User interface camera effect
JP4752921B2 (en) Information processing apparatus, animation adding method, and program
KR101720849B1 (en) Touch screen hover input handling
CN110069204B (en) Graph processing method, device and equipment based on writing track and storage medium
TWI522889B (en) Managing workspaces in a user interface
CN110045909B (en) Ellipse processing method, device and equipment based on writing track and storage medium
US8780134B2 (en) Access to control of multiple editing effects
JP2020531985A (en) Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
CN111443863A (en) Page control method and device, storage medium and terminal
CN110716680A (en) Control method and device of intelligent interactive panel
CN103914258A (en) Mobile terminal and method for operating same
CN104238877A (en) Icon control method and terminal
CN110928614B (en) Interface display method, device, equipment and storage medium
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
CN104142794A (en) Information processing method and electronic device
JP2023552659A (en) Interface display state adjustment method, apparatus, device, storage medium
CN112486377B (en) Text editing method and device and electronic equipment
CN104969263A (en) Method and system for generating motion sequence of animation, and computer-readable recording medium
CN106569708B (en) Method and terminal for realizing pressing simulation display
KR20150039552A (en) Display manipulating method of electronic apparatus and electronic apparatus thereof
CN111258698A (en) Object display method and device
CN113926190A (en) Method and device for controlling three-dimensional model in game editor and storage medium
US20080143673A1 (en) Method and Apparatus For Moving Cursor Using Numerical Keys
CN106990900A (en) One kind writing implementation method and device
CN104765542A (en) Control method and terminal of display interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination