CN114780009A - Three-dimensional object rotation method, device, equipment, storage medium and program product - Google Patents

Three-dimensional object rotation method, device, equipment, storage medium and program product

Info

Publication number
CN114780009A
Authority
CN
China
Prior art keywords
dimensional object
target
touch points
touch
interactive interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210575895.6A
Other languages
Chinese (zh)
Inventor
邹帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210575895.6A
Publication of CN114780009A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

The embodiments of the application disclose a three-dimensional object rotation method, device, equipment, storage medium and program product, belonging to the technical field of interface interaction. The method includes: displaying an interactive interface in which at least one three-dimensional object is displayed; detecting touch points in the interactive interface, where the initial number of touch points is 2 and the current number of touch points is 1 or 2; and when a touch point in the interactive interface moves, rotating a target three-dimensional object among the at least one three-dimensional object. The rotation angle of the target three-dimensional object is related to the movement of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of touch points. This shortens the user's operation time and improves human-computer interaction efficiency when rotating a three-dimensional object on a touch screen.

Description

Three-dimensional object rotation method, device, equipment, storage medium and program product
Technical Field
The embodiment of the application relates to the technical field of interface interaction, in particular to a three-dimensional object rotation method, device, equipment, storage medium and program product.
Background
A three-dimensional object is displayed on a display screen in the form of a two-dimensional image; when a user wants to view the three-dimensional object from other angles, the three-dimensional object needs to be rotated.
When a three-dimensional object is displayed on a touch screen terminal, the user can control its rotation through touch sliding. In the related art, when a user rotates a three-dimensional object on a touch screen, a rotation axis is selected first, and the three-dimensional object is then controlled to rotate about that axis by sliding on the touch screen.
However, in the above scheme, the user first needs to perform an operation to select the rotation axis, which makes the operation steps complicated, wastes the user's operation time, and reduces human-computer interaction efficiency when rotating the three-dimensional object.
Disclosure of Invention
The embodiments of the application provide a three-dimensional object rotation method, device, equipment, storage medium and program product. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a three-dimensional object rotation method, where the method includes:
displaying an interactive interface, wherein at least one three-dimensional object is displayed in the interactive interface;
detecting touch points in the interactive interface, wherein the initial number of the touch points is 2, and the current number of the touch points is 1 or 2;
when the touch point in the interactive interface moves, rotating a target three-dimensional object in the at least one three-dimensional object; the rotation angle of the target three-dimensional object is related to the movement condition of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of the touch points.
In one aspect, an embodiment of the present application provides an apparatus for rotating a three-dimensional object, where the apparatus includes:
the interface display module is used for displaying an interactive interface, and at least one three-dimensional object is displayed in the interactive interface;
the touch point detection module is used for detecting touch points in the interactive interface, the initial number of the touch points is 2, and the current number of the touch points is 1 or 2;
the rotation module is used for rotating a target three-dimensional object in the at least one three-dimensional object when the touch point in the interactive interface moves; the rotation angle of the target three-dimensional object is related to the movement condition of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of the touch points.
In another aspect, an embodiment of the present application provides a computer device, where the computer device includes a processor and a memory, and the processor and the memory are connected through a bus; the processor executes computer instructions stored in the memory to cause the computer device to implement the three-dimensional object rotation method as described above.
In still another aspect, the present application further provides a computer-readable storage medium, where computer instructions are stored, and the computer instructions are used for being executed by a processor to implement the above three-dimensional object rotation method.
In yet another aspect, the present application provides a computer program product comprising computer instructions stored in a computer-readable storage medium. The computer program product is used to implement the above three-dimensional object rotation method.
In yet another aspect, the present application provides a computer program to be executed by a processor of a computer device to implement the above-mentioned three-dimensional object rotation method.
The technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
When a user rotates a target three-dimensional object on a touch screen by sliding touch, the rotation axis can be selected through single-point/two-point touch while the rotation angle is controlled by the touch sliding. Two fingers can therefore simultaneously select the rotation axis and control the rotation angle, which simplifies the user's operation steps when rotating a three-dimensional object through the touch screen, shortens the user's operation time, and improves human-computer interaction efficiency when rotating a three-dimensional object on the touch screen.
Drawings
FIG. 1 is an architecture diagram of a computer device provided by an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a method for rotating a three-dimensional object according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method for rotating a three-dimensional object provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic view of an interactive interface involved in the embodiment shown in FIG. 3;
FIG. 5 is a schematic view of a rotation angle determination according to the embodiment shown in FIG. 3;
FIG. 6 is a flow chart of a two-finger operation involved in the embodiment shown in FIG. 3;
FIG. 7 is a schematic view of a two-finger operation screen display according to the embodiment shown in FIG. 3;
FIG. 8 is a schematic view of a rotation angle determination according to the embodiment shown in FIG. 3;
FIG. 9 is a flowchart of a single finger operation involved in the embodiment shown in FIG. 3;
FIG. 10 is a schematic view of a single finger operation screen display according to the embodiment shown in FIG. 3;
FIG. 11 is a flow chart of a target three-dimensional object rotation control according to the embodiment shown in FIG. 3;
FIG. 12 is a schematic structural diagram of a three-dimensional object rotation apparatus according to an exemplary embodiment of the present application;
fig. 13 is a schematic structural diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, the following detailed description of the embodiments of the present application will be made with reference to the accompanying drawings.
Reference herein to "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, that both A and B exist, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
It should be understood that "indication" mentioned in the embodiments of the present application may be a direct indication, an indirect indication, or an indication of an association relationship. For example, "A indicates B" may mean that A directly indicates B, e.g., B can be obtained from A; it may mean that A indirectly indicates B, e.g., A indicates C and B can be obtained from C; or it may mean that there is an association between A and B.
In the description of the embodiments of the present application, the term "correspond" may indicate a direct or indirect correspondence between two items, an association between them, or a relationship of indicating and being indicated, configuring and being configured, and the like.
In the embodiments of the present application, "predefining" may be implemented by pre-saving corresponding code, tables, or other information usable to indicate related information in a device (for example, a terminal device or a network device); the specific implementation is not limited in this application. For example, "predefined" may refer to what is defined in a protocol.
Unlike a two-dimensional image, a three-dimensional object existing in the digital world typically needs to be scaled, rotated, and so on before all of its details can be viewed.
The three-dimensional gesture rotation refers to an operation of rotating a three-dimensional object in a display screen so as to view images of various angles of the three-dimensional object.
Embodiments of the present application relate generally to a three-dimensional gesture rotation scheme that uses gestures to rotate a three-dimensional object, rather than using a software UI or other hardware input tool.
Referring to FIG. 1, a block diagram of a computer device provided in an exemplary embodiment of the present application is shown. As shown in fig. 1, the computer device 10 includes a touch screen 110, and an application 120 installed and running in the computer device.
The touch screen 110 is a display screen supporting a touch operation function.
Generally, the touch screen 110 can be implemented by a display panel and a touch panel under the display panel. With the touch screen 110, a user can interact with the computer device by directly touching the screen.
The application 120 refers to a computer program that displays a three-dimensional virtual world by a two-dimensional view camera on the touch screen 110.
The computer device 10 may be any device that has a touch screen and can run application programs; for example, the computer device 10 may be a smartphone, a tablet computer, an in-vehicle computer, an electronic book reader, a desktop computer with a touch screen, a notebook computer with a touch screen, and the like.
Referring to fig. 2, a flowchart of a three-dimensional object rotation method according to an exemplary embodiment of the present application is shown. The method may be performed by a computer device; wherein the computer device may be the computer device 10 shown in fig. 1 described above. The method may include the following steps.
Step 201, displaying an interactive interface, wherein at least one three-dimensional object is displayed in the interactive interface.
In the embodiment of the application, the computer device may display the interactive interface on the touch screen through the installed and running application program.
The three-dimensional object refers to a three-dimensional virtual object existing in a three-dimensional virtual environment; for example, the three-dimensional object may be a three-dimensional item or a three-dimensional character.
Step 202, detecting touch points in the interactive interface, where the initial number of the touch points is 2, and the current number of the touch points is 1 or 2.
In this embodiment of the application, the computer device may detect, in real time, a touch operation performed by a user in the interactive interface. The initial number of touch points corresponding to the touch operation is 2; that is, when rotating a three-dimensional object, the user first performs the touch operation on the touch screen in a two-point touch manner.
Step 203, when a touch point in the interactive interface moves, rotating a target three-dimensional object in at least one three-dimensional object; the rotation angle of the target three-dimensional object is related to the movement of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of touch points.
In the embodiment of the present application, during the subsequent rotation process the user may keep the two-point touch or change from the two-point touch to a single-point touch. The two-point/single-point touch is used to determine the rotation axis of the target three-dimensional object. The touch operation may be a touch sliding operation, and the sliding condition of the touch sliding operation (such as the sliding direction, sliding distance and sliding speed) is used to determine the rotation angle of the target three-dimensional object.
In the embodiment of the present application, the rotation axis of the target three-dimensional object may be parallel to the touch screen or perpendicular to the touch screen, depending on the current number of touch points.
To sum up, according to the scheme shown in the embodiment of the application, when a user rotates a target three-dimensional object on a touch screen by sliding touch, the rotation axis can be selected through single-point/two-point touch while the rotation angle is controlled by the touch sliding. Two fingers can therefore simultaneously select the rotation axis and control the rotation angle, which simplifies the user's operation steps when rotating a three-dimensional object through the touch screen, shortens the user's operation time, and improves human-computer interaction efficiency when rotating a three-dimensional object on the touch screen.
Referring to fig. 3, a flowchart of a three-dimensional object rotation method according to an exemplary embodiment of the present application is shown. The method may be performed by a computer device, where the computer device may be the computer device 10 shown in fig. 1 described above. The method may include the following steps.
Step 301, displaying an interactive interface, wherein at least one three-dimensional object is displayed in the interactive interface.
In an embodiment of the application, a computer device may display an interactive interface and may display at least one three-dimensional object in the interactive interface.
Wherein, the interactive interface can be a two-dimensional view interface, and the two-dimensional view interface can be an interactive interface displayed through a two-dimensional view screen. The computer device may display an interactive interface on a two-dimensional view screen and at least one three-dimensional object on the two-dimensional view interface.
In one possible implementation, the computer device may be a terminal device, which may be a smartphone.
For example, please refer to fig. 4, which shows a schematic diagram of an interactive interface according to an embodiment of the present application. As shown in fig. 4, the computer device (smartphone) displays a UI interface, which may be the above-mentioned interactive interface. A virtual world 40 (virtual world) and at least one three-dimensional object 41 (3D object) are displayed on the UI interface, and the UI interface may be displayed on a display screen, which may be a multi-touch screen 42 (multi-touch screen). The multi-touch screen 42 enables the computer device to receive multi-point touch signals when it is touched at multiple points. The user can view the virtual world 40 and the three-dimensional object 41 therein through the UI interface.
Step 302, detecting touch points in the interactive interface, where the initial number of the touch points is 2, and the current number of the touch points is 1 or 2.
In this embodiment of the application, the computer device may detect the touch points in the interactive interface. Since the initial number of touch points in the interactive interface is 2, the computer device may determine the current number of touch points detected in the interactive interface, which falls into one of two cases: either the initial number is maintained, i.e., the current number of detected touch points is still 2, or a touch point is removed, i.e., the current number of detected touch points is 1.
In a possible implementation manner, the computer device periodically detects the touch point in the interactive interface, or the computer device detects the touch point in the interactive interface in real time.
That is, when the computer device displays the interactive interface and the interactive interface includes at least one three-dimensional object supporting rotation, the computer device may periodically or in real time detect whether there are touch points in the interactive interface and determine the current number of touch points.
Since the computer device displays the interactive interface through the multi-touch screen, the computer device may detect a plurality of touch points acting on the multi-touch screen.
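As a non-limiting illustration (not part of the original disclosure), tracking the current number of touch points on a multi-touch screen could be sketched roughly as follows in Kotlin; the Android MotionEvent API is assumed, and the class, property and state names are hypothetical.

    import android.view.MotionEvent

    // Minimal sketch: keep the current touch-point count up to date and only treat
    // the gesture as active if it started with exactly 2 touch points.
    class TouchPointTracker {
        var currentTouchCount = 0
            private set
        var gestureActive = false   // true only for gestures whose initial number of touch points is 2
            private set

        fun onTouchEvent(event: MotionEvent) {
            when (event.actionMasked) {
                MotionEvent.ACTION_DOWN,
                MotionEvent.ACTION_POINTER_DOWN -> {
                    currentTouchCount = event.pointerCount
                    if (currentTouchCount == 2) gestureActive = true
                }
                MotionEvent.ACTION_POINTER_UP -> {
                    // pointerCount still includes the finger being lifted here,
                    // so the remaining number of touch points is pointerCount - 1.
                    currentTouchCount = event.pointerCount - 1
                }
                MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> {
                    currentTouchCount = 0
                    gestureActive = false
                }
            }
        }
    }

A handler of this kind would feed the current touch count into the rotation-axis selection described in step 303 below.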
Step 303, determining a rotation axis of the target three-dimensional object according to the current number of the touch points.
In an embodiment of the application, the computer device may determine a rotation axis currently directed to the target three-dimensional object according to the current number of detected touch points.
Wherein the target three-dimensional object may be one of at least one three-dimensional object displayed on the interactive interface, and the target three-dimensional object may be a three-dimensional object currently supported for rotation.
In one possible implementation manner, when one three-dimensional object is displayed on the interactive interface, the computer device may directly determine that three-dimensional object as the target three-dimensional object, i.e., the computer device automatically locks onto it as the target. When at least two three-dimensional objects are displayed on the interactive interface, the computer device may randomly select one of the at least two three-dimensional objects as the target three-dimensional object, or may select one of them as the target three-dimensional object according to the position of a currently detected touch point.
In one possible implementation manner, when the number of the at least one three-dimensional object is n (n being an integer greater than or equal to 2), the current number of touch points is 2, and a continuous click operation occurs at one of the 2 touch points, the target three-dimensional object is switched among the n three-dimensional objects.
That is to say, if at least two three-dimensional objects currently exist in the interactive interface and the computer device detects that at least two touch points currently exist, the target three-dimensional object among the at least two three-dimensional objects can be determined according to the positions of the two detected touch points. When the computer device receives a continuous click operation at the position of any one of the at least two touch points, the current target three-dimensional object is switched; that is, one of the remaining three-dimensional objects replaces it as the target three-dimensional object.
For example, the current target three-dimensional object may be displayed in the central region of the interactive interface, the remaining three-dimensional objects may be displayed in the edge region of the interactive interface, and the target three-dimensional object in the central region may be displayed at a larger size than the remaining three-dimensional objects in the edge region. When the computer device detects that the current number of touch points is 2 and receives a continuous click operation at the position of either of the 2 touch points, one of the three-dimensional objects in the edge region can be displayed in the central region, and the three-dimensional object previously in the central region is displayed in the edge region, thereby switching the current target three-dimensional object. If the newly switched-in target three-dimensional object still needs to be replaced, continuous click operations continue to be received at the position of either of the 2 touch points until the three-dimensional object displayed in the central region is the finally determined target three-dimensional object.
In addition, when the computer device detects that the current number of touch points is 2 and receives a continuous click operation at the position of either of the 2 touch points, it may preferentially determine whether a three-dimensional object exists in the direction of the touch point at which the continuous click operation occurs; if so, the three-dimensional object in the edge region in that direction is preferentially swapped with the three-dimensional object in the central region. Through this convenient operation, the target three-dimensional object can be quickly switched when multiple three-dimensional objects are present in the interactive interface, which greatly improves the efficiency of determining the target three-dimensional object while requiring no additional touch operation, thereby improving convenience.
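As an illustrative sketch only (the type and function names below are assumptions, not taken from the patent), cycling the target three-dimensional object among n objects on a continuous click could look like this in Kotlin:

    // Object3D is a placeholder for the application's own 3D object type.
    class Object3D(val name: String)

    class TargetSelector(private val objects: List<Object3D>) {
        private var targetIndex = 0                     // the object currently shown in the central region
        val target: Object3D get() = objects[targetIndex]

        // Called when, with 2 touch points present, a continuous click (e.g. a
        // double tap) is detected at the position of one of the 2 touch points.
        fun onContinuousClick() {
            if (objects.size >= 2) {                    // switching only applies for n >= 2 objects
                targetIndex = (targetIndex + 1) % objects.size
            }
        }
    }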
In one possible implementation, when the computer device detects that the current number of touch points is the initial number, i.e., two touch points, it may determine that the rotation axis currently applied to the target three-dimensional object is an axis perpendicular to the screen. When the computer device detects that the current number of touch points has become one, it may determine that the rotation axis currently applied to the target three-dimensional object is an axis parallel to the screen.
Specifically, when the current number of detected touch points is 2, the computer device may determine that the rotation axis of the target three-dimensional object is perpendicular to the screen, and the axis may pass through the centroid of the target three-dimensional object or through the midpoint of the line connecting the two currently detected touch points. When the current number of detected touch points changes to 1, the computer device may re-determine the rotation axis currently applied to the target three-dimensional object; the re-determined rotation axis may be parallel to the screen, and may be the perpendicular bisector of the target connection line or an axis perpendicular to the target connection line and passing through the centroid of the target three-dimensional object. The target connection line is the line connecting the 2 touch points at the moment before one of the 2 touch points disappeared.
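A minimal Kotlin sketch of this axis selection, under the assumption of a screen coordinate frame whose x-y plane is the screen and whose +z axis points out of the screen (Vec3 and the parameter names are illustrative, not from the patent):

    import kotlin.math.sqrt

    data class Vec3(val x: Float, val y: Float, val z: Float)

    fun rotationAxis(currentTouchCount: Int,
                     ax: Float, ay: Float,   // touch point A of the target connection line
                     bx: Float, by: Float    // touch point B of the target connection line
    ): Vec3 {
        return if (currentTouchCount == 2) {
            // Two touch points: rotate about an axis perpendicular to the screen
            // (e.g. through the midpoint of the line connecting the two points).
            Vec3(0f, 0f, 1f)
        } else {
            // One touch point left: rotate about an axis parallel to the screen and
            // perpendicular to the target connection line A-B recorded just before
            // one of the two touch points disappeared.
            val dx = bx - ax
            val dy = by - ay
            val len = sqrt(dx * dx + dy * dy)
            if (len == 0f) Vec3(1f, 0f, 0f)     // degenerate case: fall back to an arbitrary in-plane axis
            else Vec3(-dy / len, dx / len, 0f)  // in-plane vector perpendicular to (dx, dy)
        }
    }

With two touch points the returned axis is the screen normal; with one remaining touch point it is an in-plane axis perpendicular to the recorded target connection line, matching the two cases described above.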
Step 304: when the touch point moves, control the target three-dimensional object to rotate about the rotation axis.
In this embodiment of the present application, when the touch point moves, the computer device may control the target three-dimensional object to rotate according to the determined rotation axis.
That is, the computer device determines the rotation axis currently applied to the target three-dimensional object according to the current number of detected touch points, and when a touch point moves, controls the target three-dimensional object to rotate about the determined rotation axis.
In one possible implementation, when the touch points move and the current number of touch points is 2, the target three-dimensional object is controlled to rotate about a rotation axis perpendicular to the screen.
That is, when the current number of touch points is 2, the rotation axis may be determined to be an axis perpendicular to the screen, and when one or both of the two touch points move, the computer device may control the target three-dimensional object to rotate about the determined rotation axis.
The rotation angle of the target three-dimensional object may be positively correlated with the rotation angle of the line connecting the 2 touch points.
That is to say, the rotation angle of the target three-dimensional object may equal the rotation angle of the line connecting the 2 touch points, or may be the product of that rotation angle and a fixed ratio; in other words, the computer device may scale the rotation angle of the connecting line by the fixed ratio to obtain the rotation angle of the target three-dimensional object.
Alternatively, the above ratio may change dynamically, and its value may be related to the rotation speed of the line connecting the 2 touch points: when the connecting line rotates relatively fast, the corresponding ratio is relatively large, and when it rotates relatively slowly, the corresponding ratio is relatively small.
For example, fig. 5 is a schematic view of determining the rotation angle according to an embodiment of the present application. As shown in fig. 5, A and B are the current two touch points, A' is the touch point after A moves, and B' is the touch point after B moves. In the first case 51a, the rotation angle may be the acute angle AOA'; in the second case 51b, the rotation angle may be the obtuse angle AOA'; in the third case 51c, the rotation angle may be the difference between 360 degrees and the acute angle AOA'; in the fourth case 51d, the rotation angle may be the difference between 360 degrees and the obtuse angle AOA'.
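A hedged Kotlin sketch of the angle computation described above (the function and parameter names are assumptions; A/B are the previous positions of the two touch points, A'/B' their current positions):

    import kotlin.math.atan2

    // Signed angle (in degrees) swept by the line connecting the two touch points
    // between two successive move events.
    fun lineRotationDegrees(ax: Float, ay: Float, bx: Float, by: Float,      // previous positions of A and B
                            apx: Float, apy: Float, bpx: Float, bpy: Float   // current positions A' and B'
    ): Float {
        val before = atan2((by - ay).toDouble(), (bx - ax).toDouble())    // orientation of line A-B
        val after = atan2((bpy - apy).toDouble(), (bpx - apx).toDouble()) // orientation of line A'-B'
        var delta = Math.toDegrees(after - before)
        // Wrap into (-180, 180] so that a small finger motion yields a small angle.
        while (delta > 180.0) delta -= 360.0
        while (delta <= -180.0) delta += 360.0
        return delta.toFloat()
    }

Accumulating these per-move deltas over successive move events reproduces the cases of FIG. 5, including totals beyond 180 degrees, and the result can then be multiplied by the fixed or speed-dependent ratio mentioned above to obtain the rotation angle applied to the target three-dimensional object.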
Illustratively, fig. 6 is a flowchart of a two-finger operation according to an embodiment of the present application. As shown in fig. 6, if the target three-dimensional object needs to be rotated 90 degrees around the z-axis perpendicular to the screen, the gesture suggested to the user is rotation around the axis perpendicular to the screen. The user places two fingers on the screen to form touch points (S61); at this time, it may be determined that the rotation axis passes through the center of the line connecting the two fingers' touch points and is also perpendicular to that line (S62). The rotation angle of the target three-dimensional object around the z-axis may then be changed by rotating the two fingers clockwise or counterclockwise (S63). When the rotation angle reaches the expected 90 degrees, i.e., the rotation is sufficient, the rotation movement of the two fingers may be stopped and the contact between the two fingers and the screen released (S64), completing this rotation of the target three-dimensional object. Fig. 7 is a schematic view of a two-finger operation screen display according to an embodiment of the present application. As shown in fig. 7, in the first interactive interface 71, the touch points of the two fingers with respect to the target three-dimensional object are determined; in the second interactive interface 72, the two fingers are rotated clockwise around the z-axis so that the target three-dimensional object is controlled to rotate clockwise around the z-axis; and in the third interactive interface 73, when the target three-dimensional object has rotated 90 degrees, the two fingers can be released, ending the two-finger rotation of the target three-dimensional object.
In another possible implementation manner, when the touch points move and the current number of touch points is 1, the target three-dimensional object is controlled to rotate about a rotation axis that is parallel to the screen and perpendicular to the target connection line. The target connection line is the line connecting the 2 touch points at the moment before one of the 2 touch points disappeared. The rotation angle of the target three-dimensional object may be positively correlated with the moving distance of the touch point on the current screen.
The rotation angle of the target three-dimensional object may be the product of 180 degrees and the ratio of the distance the current touch point has moved toward the disappeared touch point to the length of the target connection line. Alternatively, the rotation angle of the target three-dimensional object may be the product of that ratio and 360 degrees or 270 degrees.
In one possible case, if the current touch point moves toward the disappeared touch point beyond the range of the target connection line, the rotation angle may be determined to be 0 degrees; or, if the current touch point moves in the direction opposite to the disappeared touch point beyond the original position of the current touch point, the rotation angle may also be determined to be 0 degrees.
For example, fig. 8 is a schematic view of the rotation angle determination involved in the embodiment of the present application. As shown in fig. 8, in the first case 81a, if point A is the disappeared touch point, point B is the current touch point, and point B' is the current touch point after moving, the rotation angle of the target three-dimensional object can be calculated as follows:
rotation angle = (|BB'| / |AB|) × 180°
In the second case 81b, B' has moved in the direction of A beyond the range of BA, so the rotation angle is 0 degrees. In the third case 81c, B' has moved away from A beyond the original position of B, exceeding the range AB, so the rotation angle is again 0 degrees.
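The single-finger rule of FIG. 8 (the 180-degree variant, including the two zero-angle cases) can be sketched in Kotlin as follows; all names are illustrative assumptions:

    import kotlin.math.sqrt

    // A = the disappeared touch point, B = the remaining touch point's original
    // position, B' (bpx, bpy) = its current position. Returns the rotation angle
    // in degrees about the axis parallel to the screen.
    fun singleFingerRotationDegrees(ax: Float, ay: Float,
                                    bx: Float, by: Float,
                                    bpx: Float, bpy: Float): Float {
        val abx = ax - bx
        val aby = ay - by                                 // direction from B toward A
        val abLen = sqrt(abx * abx + aby * aby)
        if (abLen == 0f) return 0f
        // Signed distance B has moved toward A: projection of B->B' onto B->A.
        val moved = ((bpx - bx) * abx + (bpy - by) * aby) / abLen
        // Beyond A (moved > |AB|) or behind B's original position (moved < 0): 0 degrees.
        if (moved < 0f || moved > abLen) return 0f
        return moved / abLen * 180f
    }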
Illustratively, fig. 9 is a flowchart of a single-finger operation involved in an embodiment of the present application. As shown in fig. 9, if the target three-dimensional object needs to be rotated by a certain angle around a rotation axis parallel to the screen, the gesture suggested to the user is rotation around an axis parallel to the screen. The user places two fingers on the screen to form touch points (S91); at this time, the rotation axis passes through the center of the line connecting the two fingers' touch points and is perpendicular to that line (S92). The user then releases one finger, i.e., lifts it off the screen, and moves the finger still in contact with the screen toward the other finger's former position (S93), which changes the rotation angle of the target three-dimensional object around the rotation axis parallel to the screen. When the rotation angle reaches the desired angle, i.e., the rotation is sufficient, the movement of the finger may be stopped and the contact between the finger and the screen released (S94), finishing this rotation of the target three-dimensional object. Fig. 10 is a schematic view of a single-finger operation screen display according to an embodiment of the present application. As shown in fig. 10, in the first interactive interface 1001, the touch points of the two fingers with respect to the target three-dimensional object are determined; in the second interactive interface 1002, one finger is released from the screen and the other finger is moved toward the lifted finger's position, the rotation axis at this time being determined as the perpendicular bisector of the line connecting the two fingers before the finger was released, and the rotation angle of the target three-dimensional object being controlled by the distance the single finger moves; and in the third interactive interface 1003, when the target three-dimensional object has rotated to 90 degrees, the finger can be released, ending the single-finger rotation of the target three-dimensional object.
In one possible implementation manner, since the initial number of touch points is 2, the computer device first receives a two-finger operation, through which the target three-dimensional object is controlled to rotate about the rotation axis perpendicular to the screen. When the rotation angle in that direction meets the requirement, the target three-dimensional object is then controlled, through a single-finger operation, to rotate about the rotation axis parallel to the screen until the rotation angle in that direction also meets the requirement. If the rotation about the axis perpendicular to the screen still needs further adjustment at this point, the user can change back to the two-finger operation to continue controlling the rotation angle of the target three-dimensional object.
Illustratively, fig. 11 is a flowchart of target three-dimensional object rotation control according to an embodiment of the present application. As shown in fig. 11, the computer device may first display a virtual world on the screen and display three-dimensional objects in the virtual world, which the user may view (S1101). A focusing operation on one of the three-dimensional objects may be performed when the selected three-dimensional object is the target three-dimensional object (S1102). It is then determined whether the user needs to view the target three-dimensional object from another direction (S1103). If so, a two-finger operation is performed first, with the user advised to use the gesture of rotating around the axis perpendicular to the screen (S1104), and a single-finger operation is then performed, with the user advised to use the gesture of rotating around an axis parallel to the screen (S1105); the operation can end once the viewing direction of the target three-dimensional object has been adjusted as needed. Through the combination of the two-finger operation and the single-finger operation, the target three-dimensional object can be rotated to any orientation.
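A rough Kotlin sketch of this combined flow as a small state machine (illustrative only; the enum, class and function names are assumptions):

    enum class RotationMode { IDLE, TWO_FINGER, ONE_FINGER }

    class RotationGestureFlow {
        var mode = RotationMode.IDLE
            private set

        // Called whenever the current number of touch points changes.
        fun onTouchCountChanged(count: Int) {
            mode = when {
                count >= 2 -> RotationMode.TWO_FINGER                 // rotate about the axis perpendicular to the screen
                count == 1 && mode == RotationMode.TWO_FINGER ->
                    RotationMode.ONE_FINGER                           // rotate about an axis parallel to the screen
                count == 0 -> RotationMode.IDLE                       // gesture finished
                else -> mode                                          // a single initial finger does not start rotation
            }
        }
    }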
In summary, according to the scheme shown in the embodiment of the application, when a user rotates a target three-dimensional object on a touch screen by sliding touch, the rotation axis can be selected through single-point/two-point touch while the rotation angle is controlled by the touch sliding. Two fingers can therefore simultaneously select the rotation axis and control the rotation angle, which simplifies the user's operation steps when rotating a three-dimensional object through the touch screen, shortens the user's operation time, and improves human-computer interaction efficiency when rotating a three-dimensional object on the touch screen.
Referring to fig. 12, a schematic structural diagram of a three-dimensional object rotating apparatus according to an exemplary embodiment of the present application is shown. The three-dimensional object rotating apparatus may be used to perform the steps performed by a computer device as in the embodiments shown in fig. 2 or fig. 3 above; the device comprises:
an interface display module 1210, configured to display an interactive interface, where at least one three-dimensional object is displayed in the interactive interface;
a touch point detection module 1220, configured to detect touch points in the interactive interface, where an initial number of the touch points is 2, and a current number of the touch points is 1 or 2;
a rotating module 1230, configured to rotate a target three-dimensional object in the at least one three-dimensional object when the touch point in the interactive interface moves; the rotation angle of the target three-dimensional object is related to the movement condition of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of the touch points.
In one possible implementation, the rotating module 1230 includes:
a first rotation submodule, configured to control the target three-dimensional object to rotate about a rotation axis perpendicular to the screen when the touch points move and the current number of touch points is 2.
In one possible implementation manner, the rotation angle of the target three-dimensional object is positively correlated with the rotation angle of the line connecting the 2 touch points.
In one possible implementation, the rotating module 1230 includes:
a second rotation submodule, configured to control the target three-dimensional object to rotate about a rotation axis that is parallel to the screen and perpendicular to the target connection line when the touch points move and the current number of touch points is 1;
the target connection line being the line connecting the 2 touch points at the moment before one of the 2 touch points disappears.
In one possible implementation manner, the rotation angle of the target three-dimensional object is positively correlated with the moving distance of the touch point on the current screen.
In a possible implementation manner, the number of at least one three-dimensional object is n, and n is an integer greater than or equal to 2;
the device further comprises:
and the target determining module is used for switching the target three-dimensional object among the n three-dimensional objects when the current number of the touch points is 2 and continuous clicking operation occurs at one of the 2 touch points before the target three-dimensional object in the at least one three-dimensional object is rotated when the touch points in the interactive interface move.
In summary, according to the scheme shown in the embodiment of the application, when a user rotates a target three-dimensional object on a touch screen by sliding touch, the rotation axis can be selected through single-point/two-point touch while the rotation angle is controlled by the touch sliding. Two fingers can therefore simultaneously select the rotation axis and control the rotation angle, which simplifies the user's operation steps when rotating a three-dimensional object through the touch screen, shortens the user's operation time, and improves human-computer interaction efficiency when rotating a three-dimensional object on the touch screen.
It should be noted that when the apparatus provided in the foregoing embodiments implements its functions, the division into the above functional modules is only used as an example for illustration; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 13 is a block diagram illustrating the structure of a computer device according to an exemplary embodiment of the present application. The computer device may be an electronic device such as a smartphone, a tablet computer, an electronic book reader, a portable personal computer, or a smart wearable device. A terminal in the present application may include one or more of the following components: a processor 1310, a memory 1320, and a screen 1330.
Processor 1310 may include one or more processing cores. The processor 1310 connects various parts of the terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1320 and by calling data stored in the memory 1320. Optionally, the processor 1310 may be implemented in hardware using at least one of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 1310 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing the content to be displayed by the screen 1330; and the modem is used to handle wireless communications. It is understood that the modem may also be implemented by a separate communication chip without being integrated into the processor 1310.
The memory 1320 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1320 includes a non-transitory computer-readable medium. The memory 1320 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 1320 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, and the like), instructions for implementing the various method embodiments described above, and the like. The operating system may be an Android system (including systems deeply developed on the basis of Android), an iOS system developed by Apple Inc. (including systems deeply developed on the basis of iOS), or another system. The data storage area may also store data created by the terminal during use (such as a phone book, audio and video data, and chat records), and the like.
The screen 1330 may be a capacitive touch display screen for receiving a user's touch operation on or near it using a finger, a stylus, or any other suitable object, and for displaying the user interfaces of various applications. The touch display screen is generally provided on the front panel of the terminal. The touch display screen may be designed as a full screen, a curved screen, or an irregularly shaped screen. It may also be designed as a combination of a full screen and a curved screen, or a combination of an irregularly shaped screen and a curved screen, which is not limited in the embodiments of the present application.
In addition, those skilled in the art will appreciate that the configurations of the terminals illustrated in the above-described figures are not meant to be limiting, and that the terminals may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. For example, the terminal further includes a radio frequency circuit, a shooting component, a sensor, an audio circuit, a Wireless Fidelity (WiFi) component, a power supply, a bluetooth component, and other components, which are not described herein again.
The embodiment of the present application further provides a computer-readable storage medium, in which at least one computer instruction is stored, and the at least one computer instruction is loaded and executed by a processor to implement the three-dimensional object rotation method according to the above embodiments.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the terminal performs the three-dimensional object rotation method provided in the various alternative implementations of the above-mentioned aspect.
An embodiment of the present application further provides a computer device, where the computer device includes a processor, a memory, and a transceiver, where the memory stores a computer program, and the processor executes the computer program, so that the computer device implements the three-dimensional object rotation method according to the foregoing embodiments.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium. Computer-readable storage media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of rotating a three-dimensional object, the method comprising:
displaying an interactive interface, wherein at least one three-dimensional object is displayed in the interactive interface;
detecting touch points in the interactive interface, wherein the initial number of the touch points is 2, and the current number of the touch points is 1 or 2;
when the touch point in the interactive interface moves, rotating a target three-dimensional object in the at least one three-dimensional object; the rotation angle of the target three-dimensional object is related to the movement condition of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of the touch points.
2. The method of claim 1, wherein rotating a target three-dimensional object of the at least one three-dimensional object when the touch point in the interactive interface moves comprises:
when the touch points move and the current number of the touch points is 2, controlling the target three-dimensional object to rotate about a rotation axis perpendicular to the screen.
3. The method of claim 2, wherein the rotation angle of the target three-dimensional object is positively correlated to the rotation angle of the connection line of the 2 touch points.
4. The method according to claim 1, wherein the rotating a target three-dimensional object of the at least one three-dimensional object when the touch point in the interactive interface moves comprises:
when the touch points move and the current number of the touch points is 1, controlling the target three-dimensional object to rotate about a rotation axis that is parallel to the screen and perpendicular to the target connecting line;
the target connecting line is a connecting line between 2 touch points at the moment before one touch point in the 2 touch points disappears.
5. The method of claim 4, wherein the rotation angle of the target three-dimensional object is positively correlated to the moving distance of the touch point in the current screen.
6. The method according to any one of claims 1 to 5, wherein the number of at least one of said three-dimensional objects is n, and n is an integer greater than or equal to 2;
before rotating a target three-dimensional object in the at least one three-dimensional object when the touch point in the interactive interface moves, the method further includes:
and when the current number of the touch points is 2 and continuous clicking operation occurs at one touch point of the 2 touch points, switching the target three-dimensional object among the n three-dimensional objects.
7. A three-dimensional object rotation apparatus, characterized in that the apparatus comprises:
the interface display module is used for displaying an interactive interface, and at least one three-dimensional object is displayed in the interactive interface;
the touch point detection module is used for detecting touch points in the interactive interface, the initial number of the touch points is 2, and the current number of the touch points is 1 or 2;
the rotation module is used for rotating a target three-dimensional object in the at least one three-dimensional object when the touch point in the interactive interface moves; the rotation angle of the target three-dimensional object is related to the movement condition of the touch points, and the rotation axis of the target three-dimensional object is related to the current number of the touch points.
8. A computer device, wherein the computer device comprises a processor, a memory, and a transceiver;
the memory has stored therein a computer program that is executed by the processor to cause the computer device to implement the three-dimensional object rotation method according to any one of claims 1 to 6.
9. A computer-readable storage medium having stored therein computer instructions for execution by a processor to implement the three-dimensional object rotation method of any one of claims 1 to 6.
10. A computer program product, characterized in that the computer program product comprises computer instructions, the computer instructions being stored in a computer readable storage medium; the computer program product for implementing the three-dimensional object rotation method as claimed in any one of claims 1 to 6.
CN202210575895.6A 2022-05-24 2022-05-24 Three-dimensional object rotation method, device, equipment, storage medium and program product Pending CN114780009A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210575895.6A CN114780009A (en) 2022-05-24 2022-05-24 Three-dimensional object rotation method, device, equipment, storage medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210575895.6A CN114780009A (en) 2022-05-24 2022-05-24 Three-dimensional object rotation method, device, equipment, storage medium and program product

Publications (1)

Publication Number Publication Date
CN114780009A true CN114780009A (en) 2022-07-22

Family

ID=82408243

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210575895.6A Pending CN114780009A (en) 2022-05-24 2022-05-24 Three-dimensional object rotation method, device, equipment, storage medium and program product

Country Status (1)

Country Link
CN (1) CN114780009A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102736771A (en) * 2011-03-31 2012-10-17 比亚迪股份有限公司 Method and device for identifying multi-point rotation motion
CN103257811A (en) * 2012-02-20 2013-08-21 腾讯科技(深圳)有限公司 Picture display system and method based on touch screen
US20140282073A1 (en) * 2013-03-15 2014-09-18 Micro Industries Corporation Interactive display device
US20160261768A1 (en) * 2015-03-06 2016-09-08 Kyocera Document Solutions Inc. Display input device and image forming apparatus including same, and method for controlling display input device
KR20190036061A (en) * 2017-09-27 2019-04-04 삼성중공업 주식회사 Touch type rotation control device and method of 3D object
US20200319776A1 (en) * 2019-04-02 2020-10-08 Adobe Inc. Visual Manipulation of a Digital Object
CN113138670A (en) * 2021-05-07 2021-07-20 郑州捷安高科股份有限公司 Touch screen interaction gesture control method and device, touch screen and storage medium

Similar Documents

Publication Publication Date Title
US11106246B2 (en) Adaptive enclosure for a mobile computing device
US9804761B2 (en) Gesture-based touch screen magnification
RU2601173C2 (en) Split-screen display method and device, and electronic apparatus thereof
EP2407869B1 (en) Mobile terminal and controlling method thereof
KR101799270B1 (en) Mobile terminal and Method for recognizing touch thereof
EP2533146B1 (en) Apparatus and method for providing web browser interface using gesture in device
US10627990B2 (en) Map information display device, map information display method, and map information display program
KR101384857B1 (en) User interface methods providing continuous zoom functionality
EP2500797B1 (en) Information processing apparatus, information processing method and program
EP2669788A1 (en) Mobile terminal and controlling method thereof
KR102304178B1 (en) User terminal device and method for displaying thereof
EP2823387A1 (en) Systems and methods for modifying virtual keyboards on a user interface
US20150268743A1 (en) Device and method for controlling a display panel
US10133454B2 (en) Stackable pagination indicators
EP2778880B1 (en) Method for controlling display function and an electronic device thereof
JP2011081440A (en) Information processing apparatus, information processing method, and information processing program
US20140062925A1 (en) Method for changing object position and electronic device thereof
WO2013095677A1 (en) Computing system utilizing three-dimensional manipulation command gestures
US9377944B2 (en) Information processing device, information processing method, and information processing program
US20210208740A1 (en) Method, Mobile Terminal, and Non-Transitory Computer-Readable Storage Medium for Controlling Displaying Direction
CN112218134A (en) Input method and related equipment
EP2998838A1 (en) Display apparatus and method for controlling the same
CN113138670B (en) Touch screen interaction gesture control method and device, touch screen and storage medium
US10599326B2 (en) Eye motion and touchscreen gestures
CN114780009A (en) Three-dimensional object rotation method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination