CN109542322B - Information processing method and device, storage medium and electronic device


Info

Publication number
CN109542322B
CN109542322B
Authority
CN
China
Prior art keywords
target
time
behavior
operation instruction
information
Prior art date
Legal status
Active
Application number
CN201811393907.3A
Other languages
Chinese (zh)
Other versions
CN109542322A (en)
Inventor
余嘉欣
黄乙迦
王超然
赵鹏
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201811393907.3A
Publication of CN109542322A
Application granted
Publication of CN109542322B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an information processing method, an information processing device, a storage medium and an electronic device. The method comprises the following steps: acquiring a first target operation instruction based on a first target time on a target time axis, wherein the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene; and in response to the first target operation instruction, displaying a plurality of target behavior information of the target object corresponding to the first target time, wherein each target behavior information is used for indicating one target behavior executed by the target object, the starting times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis. The invention achieves the effect of increasing the amount of information displayed on the time axis.

Description

Information processing method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of data processing, and in particular, to a method and an apparatus for processing information, a storage medium, and an electronic apparatus.
Background
At present, on a mobile terminal, behavior information to be played can be displayed through a time axis, and usually only the behavior information corresponding to the displayed time is shown. In addition, because the screen size of a mobile terminal is limited, the touch range is limited: if the time axis uses 0.1 second as its unit (a common editing interval) and each unit on the screen must occupy more than 44 px, the duration that can be displayed on a single screen is reduced, for example, the time axis can only display 1.5 seconds of content on a single screen. As a result, the amount of information displayed by the time axis is small, which is also inconvenient for the player to edit.
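As a rough illustration of why the displayable duration shrinks (a sketch with assumed values, not figures taken from this disclosure: a per-tick width of roughly 88 px is assumed here because it reproduces the 1.5-second figure, e.g. a 44 pt touch target at 2x pixel density):

```python
# Rough arithmetic behind the single-screen limit described above.
# Assumed values: 1334 px screen width, 0.1 s per editing tick, and roughly
# 88 px needed per tick (e.g. a 44 pt touch target at 2x pixel density).
screen_width_px = 1334
tick_seconds = 0.1
px_per_tick = 88

ticks_per_screen = screen_width_px // px_per_tick   # 15 ticks fit on one screen
print(round(ticks_per_screen * tick_seconds, 1))    # about 1.5 seconds visible
```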
For the problem in the prior art that the amount of information displayed by a time axis is small, no effective solution has been provided at present.
Disclosure of Invention
The invention mainly aims to provide an information processing method, an information processing device, a storage medium and an electronic device, which at least solve the technical problem that the amount of information displayed on a time axis is small.
In order to achieve the above object, according to an aspect of the present invention, there is provided a method of processing information. The method comprises the following steps: acquiring a first target operation instruction based on a first target time on a target time axis, wherein the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene; and in response to the first target operation instruction, displaying a plurality of target behavior information of the target object corresponding to the first target time, wherein each target behavior information is used for indicating one target behavior executed by the target object, the starting times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis.
Optionally, before the first target operation instruction is acquired based on a first target time on the target time axis, the method further includes: and determining any time in the target time period as a first target time.
Optionally, displaying, in response to the first target operation instruction, the plurality of target behavior information of the target object corresponding to the first target time includes: and responding to the first target operation instruction, and displaying at least one piece of target behavior information of each target object in the plurality of target objects.
Optionally, when, in response to the first target operation instruction, displaying a plurality of target behavior information of the target object corresponding to the first target time, the method further includes: and processing the target behavior information to obtain a target result, wherein the target result is used for indicating the variation trend of the target behavior executed by the target object.
Optionally, after displaying a plurality of target behavior information of the target object corresponding to the first target time, the method further includes: determining a first target behavior from the plurality of target behaviors; and adjusting, on the target time axis, the starting time of the first target behavior from a second target time to a third target time.
Optionally, adjusting the starting time of the first target behavior from the second target time to the third target time includes: on the target time axis, acquiring a second target operation instruction based on a second target moment; responding to a second target operation instruction, and displaying a target identifier at a first position on a target time axis, wherein the first position corresponds to a second target moment, and the target identifier is used for indicating the adjustment of the starting moment of the first target behavior; acquiring a third target operation instruction based on the target identifier; responding to a third target operation instruction, and adjusting the display position of the target identifier from the first position to the second position; and determining the time corresponding to the second position on the target time axis as a third target time.
Optionally, the displaying the target identifier at the first position on the target time axis in response to the second target operation instruction includes: responding to a second target operation instruction, and activating the target identifier in the hidden state; and displaying the activated target identification on the first position.
Optionally, after adjusting the starting time of the first target behavior from the second target time to the third target time, the method further includes: in the process of playing the target picture of the target scene, a picture including the first target behavior is played at the third target time.
Optionally, the plurality of target behavior information is displayed by a target list, and the target behavior information displayed by the target list includes at least one of: the name of the target behavior; the starting time of the target behavior; a type of target behavior; the number of target objects that perform the target behavior.
In order to achieve the above object, according to another aspect of the present invention, there is also provided an information processing apparatus. The device includes: an acquisition unit, configured to acquire a first target operation instruction based on a first target time on a target time axis, wherein the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene; and a display unit, configured to display, in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time, wherein each target behavior information is used to indicate one target behavior executed by the target object, the start times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis.
In order to achieve the above object, according to another aspect of the present invention, there is also provided a storage medium. The storage medium has stored therein a computer program, wherein the computer program is arranged to perform the method of an embodiment of the invention when executed.
In order to achieve the above object, according to another aspect of the present invention, there is also provided an electronic device. The electronic device comprises a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method of the embodiment of the invention.
According to the method and the device, a first target operation instruction is acquired based on a first target time on a target time axis, wherein the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene; and in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time are displayed, wherein each target behavior information is used for indicating one target behavior executed by the target object, the starting times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis. Because a plurality of target behavior information of the target object with different starting times within the target time period are displayed at a target time on the target time axis, the purpose of displaying the target behavior information is achieved, the problem of the small amount of information displayed by the time axis is solved, and the technical effect of increasing the amount of information displayed by the time axis is further achieved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a mobile terminal of an information processing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of processing information according to an embodiment of the invention;
FIG. 3 is a schematic view of a timeline interface according to the related art;
FIG. 4 is a schematic view of a timeline interface according to another related art;
FIG. 5 is a schematic illustration of a timeline according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of adjusting the start time of an action by a time identification icon according to an embodiment of the invention; and
fig. 7 is a schematic diagram of an information processing apparatus according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the drawings of this application are used to distinguish between similar elements and are not necessarily used to describe a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, system, article, or apparatus.
The method provided by the embodiment of the application can be executed in a mobile terminal, a computer terminal or a similar operation device. Taking an example of the present invention running on a mobile terminal, fig. 1 is a block diagram of a hardware structure of the mobile terminal of an information processing method according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration, and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 can be used for storing computer programs, for example, software programs and modules of application software, such as a computer program corresponding to a data processing method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
In the embodiment, a method for processing information running in the mobile terminal is provided. Fig. 2 is a flow chart of a method of processing information according to an embodiment of the present invention. As shown in fig. 2, the process includes the following steps:
step S202, a first target operation instruction is obtained based on a first target time on a target time axis.
In the technical solution provided by step S202 above in the present invention, the first target operation instruction is obtained based on a first target time on a target time axis, where the target time axis is used for indicating a starting time of a behavior executed by a target object in a target scene.
In this embodiment, the behavior information of the target object in the target scene may be presented in time order, so as to record the behavior of the target object. The target time axis of this embodiment may be a time axis displayed on a screen of the mobile terminal, on which times are displayed at intervals of a time unit; for example, with 0.5 second as the time unit, the times 2 seconds, 2.5 seconds, 3 seconds, 3.5 seconds, and so on are displayed in sequence. The target time axis indicates the starting time of a behavior performed by a target object in a target scene, that is, a time displayed on the target time axis corresponds to the starting time of an action performed by the target object in the target scene. The target object may be an object that needs to show an action in the target scene, for example, an actor that shows the action, and the target scene may be a game scene, which is not limited here.
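A trivial sketch of generating such tick labels, assuming a 0.5-second time unit and a hypothetical axis_ticks helper (an illustration only, not the disclosed implementation):

```python
def axis_ticks(start, end, unit=0.5):
    """Tick times shown on the target time axis, e.g. 2.0, 2.5, 3.0, 3.5 seconds."""
    ticks, t = [], start
    while t <= end:
        ticks.append(round(t, 1))
        t += unit
    return ticks

print(axis_ticks(2.0, 3.5))  # [2.0, 2.5, 3.0, 3.5]
```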
The first target time in this embodiment may be any time that can be displayed on the target time axis, and the first target operation instruction is acquired based on the first target time. The first target operation instruction may be an operation instruction acquired at the position on the time axis corresponding to the first target time, for example, an operation instruction triggered by a single-click operation, an operation instruction triggered by a double-click operation, or an operation instruction triggered by staying at the position on the time axis corresponding to the first target time for a target duration, which is not limited here.
Step S204, responding to the first target operation instruction, and displaying a plurality of target behavior information of the target object corresponding to the first target time.
In the technical solution provided by step S204 of the present invention, in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time are displayed, where each target behavior information is used to indicate one target behavior executed by the target object, start times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis.
In this embodiment, the target behavior information is used to indicate a target behavior executed by the target object, which may be an action performed by the target object in the target scene, such as walking, jumping, or waving. The target behavior has a starting time, that is, the time at which the target object starts to execute the target behavior in the target scene.
The first target time of this embodiment corresponds to a plurality of target behavior information of the target object, and the plurality of target behavior information may correspond to the first target time in the form of a set, for example, the plurality of target behavior information is an action set corresponding to the first target time. The starting time of each target behavior is within the target time period associated with the first target time, so all the target behavior information whose starting times fall in the target time period can be displayed based on the first target operation instruction acquired at the first target time. The duration of the target time period may be one time unit on the target time axis. In this way, all the target behavior information whose starting times fall in the target time period can be displayed simultaneously based on an operation at the first target time, which increases the display amount of information on the time axis.
For example, when the first target time is the 2nd second on the target time axis, the target time period is 2.0 seconds to 2.4 seconds, and in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the 2nd second, whose starting times fall within 2.0 seconds to 2.4 seconds, are displayed. When the first target time is the 3rd second on the target time axis, the target time period is 3.0 seconds to 3.4 seconds, and in response to the first target operation instruction, a plurality of target behavior information of the target object whose starting times fall within this target time period are displayed. When the first target time is the 4th second on the target time axis, the target time period is 4.0 seconds to 4.4 seconds, and in response to the first target operation instruction, a plurality of target behavior information of the target object whose starting times fall within 4.0 seconds to 4.4 seconds are displayed, and so on. In this way, a single screen of the mobile terminal can display a 12-second time axis. Optionally, the 12 seconds is calculated for a screen width of 1334 px; if the screen width of the mobile terminal is greater than 1334 px, the displayable duration is greater than 12 seconds. This improves the display amount of information on the time axis.
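The grouping described above can be pictured as bucketing behaviors by the start of their 0.5-second period. The following is a minimal sketch under that assumption; the helper names (bucket_start, group_by_bucket, behaviors_for_tap) are illustrative and not taken from the disclosure:

```python
from collections import defaultdict

TIME_UNIT = 0.5  # assumed width of the target time period, in seconds

def bucket_start(t):
    """Map any time inside a target time period to that period's start time."""
    return round((t // TIME_UNIT) * TIME_UNIT, 1)

def group_by_bucket(behaviors):
    """behaviors: iterable of (name, start_time) pairs.
    Returns a mapping from period start time to the behaviors starting in it."""
    buckets = defaultdict(list)
    for name, start in behaviors:
        buckets[bucket_start(start)].append((name, start))
    return buckets

def behaviors_for_tap(buckets, tapped_time):
    """All behavior info whose start time falls in the period containing the tapped time."""
    return buckets.get(bucket_start(tapped_time), [])
```

Tapping any time inside a period then looks up the single bucket whose start time identifies that period.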
Through the steps S202 to S204, a first target operation instruction is acquired based on a first target time on a target time axis, where the target time axis is used to indicate a starting time of a behavior executed by a target object in a target scene; and responding to the first target operation instruction, displaying a plurality of target behavior information of the target object corresponding to the first target time, wherein each target behavior information is used for indicating one target behavior executed by the target object, the starting time of the plurality of target behaviors corresponding to the plurality of target behavior information is within a target time period, and the target time period is a time period which is related to the first target time on the target time axis. Because a plurality of target behavior information of the target object at different starting moments in the target time period is displayed at the target moment on the target time axis, the purpose of displaying the target behavior information is achieved, the problem of small information amount displayed by the time axis is solved, and the technical effect of improving the information display amount of the time axis is further achieved.
As an optional implementation manner, before acquiring the first target operation instruction based on the first target time on the target time axis in step S202, the method further includes: and determining any time in the target time period as a first target time.
In this embodiment, before the first target operation instruction is acquired based on the first target time on the target time axis, the first target time associated with the plurality of target behavior information to be displayed needs to be determined on the target time axis. Any time in the target time period may be determined as the first target time; that is, the first target operation instruction may be obtained based on any time in the target time period. For example, a click operation performed at any time in the target time period may trigger the first target operation instruction, and in response to the first target operation instruction, the plurality of target behavior information of the target object corresponding to the first target time are displayed. In this way, all the target behavior information whose starting times fall in the target time period can be displayed simultaneously based on an operation at the first target time, which increases the display amount of information on the time axis.
As an alternative implementation manner, in step S204, in response to the first target operation instruction, displaying a plurality of target behavior information of the target object corresponding to the first target time includes: and responding to the first target operation instruction, and displaying at least one piece of target behavior information of each target object in the plurality of target objects.
In this embodiment, there may be a plurality of target objects in the target scene, for example, a plurality of actors, and each target object may have a plurality of target behavior information whose starting times are within the target time period associated with the first target time. At least one piece of target behavior information of each of the plurality of target objects can be displayed in response to the first target operation instruction acquired based on the first target time. For example, when actions of multiple actors start within the same 0.5-second target time period, the actions of the same actor can be gathered and displayed together. In this way, all the target behavior information of the plurality of target objects whose starting times fall in the target time period can be displayed simultaneously based on an operation at the first target time, which increases the display amount of information on the time axis.
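Within one such period, entries can additionally be grouped per actor so that the same actor's actions are gathered together. A minimal sketch under the same assumptions as the earlier bucketing example (the tuple layout is illustrative, not taken from the disclosure):

```python
from collections import defaultdict

def group_by_actor(bucket_entries):
    """bucket_entries: iterable of (actor, action_name, start_time) inside one period.
    Returns a mapping actor -> list of (action_name, start_time), so that each
    actor's actions can be gathered and displayed together."""
    per_actor = defaultdict(list)
    for actor, action, start in bucket_entries:
        per_actor[actor].append((action, start))
    return per_actor
```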
As an optional implementation manner, in step S204, when, in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time are displayed, the method further includes: and processing the target behavior information to obtain a target result, wherein the target result is used for indicating the variation trend of the target behavior executed by the target object.
In this embodiment, when the plurality of target behavior information of the target object corresponding to the first target time are displayed, the plurality of target behavior information may be further processed, for example, organized into a table, a histogram, or a graph for analysis, which may be used to show the variation trend of the target behaviors executed by the target object. For example, a player may observe the change of the current actions through the time axis to learn the variation trend of the overall actions, thereby increasing the display amount of information on the time axis.
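One simple derived result of this kind could be the count of actions starting in each period, which already gives a coarse trend that a table or histogram could present. A sketch, assuming the bucketed structure from the earlier example; it is an illustration, not the disclosed processing:

```python
def actions_per_period(buckets):
    """Count how many actions start in each target time period, as a crude
    trend indicator that could back a table or histogram."""
    return {start: len(entries) for start, entries in sorted(buckets.items())}
```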
As an optional implementation manner, after displaying a plurality of target behavior information of the target object corresponding to the first target time in step S204, the method further includes: determining a first target behavior from a plurality of target behaviors; and on the target time axis, the starting time of the first target behavior is adjusted from the second target time to the third target time.
In this embodiment, the plurality of target behavior information may respectively indicate a plurality of target behaviors. After the plurality of target behavior information of the target object corresponding to the first target time are displayed, the first target behavior may be determined from the plurality of target behaviors according to actual needs, where the first target behavior is the target behavior whose starting time is to be adjusted, that is, the target behavior whose starting time is to be edited according to actual needs, and the starting time of the first target behavior may be the second target time. After the first target behavior is determined, its starting time can be adjusted directly on the target time axis from the second target time to the third target time, so that the starting time of the first target behavior is controlled. This avoids the problem that the starting time of the target behavior is inconvenient to modify because the currently dragged time cannot be observed intuitively, and improves the user's operation experience of time editing.
As an optional implementation, adjusting the starting time of the first target behavior from the second target time to the third target time includes: on the target time axis, acquiring a second target operation instruction based on a second target moment; responding to a second target operation instruction, and displaying a target identifier at a first position on a target time axis, wherein the first position corresponds to a second target moment, and the target identifier is used for indicating the adjustment of the starting moment of the first target behavior; acquiring a third target operation instruction based on the target identifier; responding to a third target operation instruction, and adjusting the display position of the target identifier from the first position to the second position; and determining the time corresponding to the second position on the target time axis as a third target time.
In this embodiment, when the starting time of the first target behavior is adjusted from the second target time to the third target time, the second target operation instruction may be acquired based on the second target time on the target time axis; for example, the second target operation instruction is triggered by performing a long-press operation, a click operation, a double-click operation, or the like on the first position corresponding to the second target time on the time axis, which is not limited here.
After the second target operation instruction is acquired, a target identifier is displayed, in response to the second target operation instruction, at the first position on the target time axis. The target identifier may be a time identification icon used to indicate that the starting time of the first target behavior can be adjusted. A third target operation instruction is acquired based on the target identifier; for example, the third target operation instruction is a drag operation instruction triggered by a drag operation. In response to the third target operation instruction, the display position of the target identifier is adjusted from the first position to the second position; for example, the target identifier is long-pressed and dragged to trigger the third target operation instruction, the target identifier is dragged from the first position in response to the third target operation instruction, and when the long press on the target identifier stops, the third target operation instruction disappears and the position where the target identifier stays on the target time axis is the second position. The time corresponding to the second position on the target time axis is determined as the third target time, so that the starting time of the first target behavior is accurately controlled and directly adjusted from the second target time to the third target time. This achieves the purpose of controlling the starting time of the first target behavior and avoids the problem that the starting time of the target behavior is inconvenient to modify because the currently dragged time cannot be observed intuitively, thereby improving the user's operation experience of time editing.
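A minimal sketch of how the dropped marker position might be translated back into a time on the axis; the pixel-to-second scale, the 0.1-second snapping, and the function names are assumptions for illustration, not the disclosed implementation:

```python
def position_to_time(x_px, axis_origin_px, px_per_second):
    """Convert a horizontal position of the dragged target identifier into a
    time on the target time axis, snapped to 0.1 s."""
    return round((x_px - axis_origin_px) / px_per_second, 1)

def finish_drag(behavior, drop_x_px, axis_origin_px, px_per_second):
    """When the drag ends, the time under the marker becomes the behavior's
    new start time (the 'third target time' in the text above)."""
    behavior["start_time"] = position_to_time(drop_x_px, axis_origin_px, px_per_second)
    return behavior
```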
Optionally, in this embodiment, the newly created target behaviors may all correspond to one target identifier, and the dragging feedback is added for the target identifier, so as to accurately control the start time of the first target behavior, and implement modification of the start time of the target behavior.
Optionally, there may be only one first target behavior whose starting time is the second target time, in which case the starting time of that first target behavior is adjusted from the second target time to the third target time. If there are a plurality of first target behaviors whose starting times are the second target time, the starting times of the plurality of first target behaviors may be uniformly adjusted from the second target time to the third target time; alternatively, the first target behavior whose starting time needs to be adjusted may be further determined among the plurality of first target behaviors, and after it is determined, its starting time is adjusted from the second target time to the third target time based on its target identifier.
As an alternative embodiment, displaying the target identifier at the first position on the target time axis in response to the second target operation instruction includes: responding to a second target operation instruction, and activating the target identifier in the hidden state; and displaying the activated target identification on the first position.
In this embodiment, the target identifier may be in a hidden state by default when the starting time of the first target behavior does not need to be adjusted, so as not to block the player's view of the target time axis. When the target identifier is displayed at the first position on the target time axis in response to the second target operation instruction, the target identifier in the hidden state may first be activated in response to the second target operation instruction; for example, the second target operation instruction is triggered by performing a long-press operation at the first position corresponding to the second target time, which activates the hidden target identifier, and then the activated target identifier is displayed at the first position, so that the starting time of the first target behavior can be adjusted directly.
As an optional implementation manner, after adjusting the starting time of the first target behavior from the second target time to the third target time, the method further includes: in the process of playing the target picture of the target scene, a picture including the first target behavior is played at the third target time.
In this embodiment, after the start time of the first target behavior is adjusted from the second target time to the third target time, that is, after the editing of the start time of the first target behavior is completed, in the process of playing the target picture of the target scene, when the third target time is reached, the picture including the first target behavior may be played, that is, the first target behavior starts to be executed at the third target time of the target picture, so that the operation experience of the user time editing is further improved.
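A sketch of how playback could pick up the edited start time, assuming behaviors are stored as simple dictionaries with a start_time field (an illustration only, not the disclosed player logic):

```python
def behaviors_starting_at(behaviors, playback_time, tolerance=1e-6):
    """During playback of the target scene, return the behaviors whose (possibly
    edited) start time equals the current playback time, so that the pictures
    including those behaviors can be played at that moment."""
    return [b for b in behaviors if abs(b["start_time"] - playback_time) < tolerance]
```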
As an alternative example, this embodiment can adjust the starting time of the target behavior corresponding to an arbitrary time on the target time axis. Optionally, the player may not want any target behavior information at a fourth target time on the target time axis, that is, no target behavior needs to start at the fourth target time. In this case the player does not care what the second target behavior specifically is, and may directly adjust the starting time of the second target behavior corresponding to the fourth target time, where the second target behavior may include one target behavior starting at the fourth target time or a plurality of target behaviors starting at the fourth target time, which is not limited here.
Optionally, a fourth target time is determined on the target time axis, and a second target operation instruction is obtained based on the fourth target time, for example, the second target operation instruction is triggered by performing a long press operation, a click operation, a double click operation, and the like on a third position on the time axis corresponding to the fourth target time, which is not limited herein.
After the second target operation instruction is acquired, a target identifier is displayed, in response to the second target operation instruction, at a third position on the target time axis. The target identifier may be a time identification icon used to indicate that the starting time of the second target behavior corresponding to the fourth target time can be adjusted. A third target operation instruction is acquired based on the target identifier; for example, the third target operation instruction is a drag operation instruction triggered by a drag operation. In response to the third target operation instruction, the display position of the target identifier is adjusted from the third position to a fourth position; for example, the target identifier is long-pressed and dragged to trigger the third target operation instruction, the target identifier is dragged from the third position in response to the third target operation instruction, and when the long press on the target identifier stops, the third target operation instruction disappears and the position where the target identifier stays on the target time axis is the fourth position. The time corresponding to the fourth position on the target time axis is determined as a fifth target time, where the fourth position may be a predetermined position or a randomly determined position. In this way, the starting time of the second target behavior is adjusted directly from the fourth target time to the fifth target time, so that no target behavior starts at the fourth target time. This achieves the user's purpose of not starting any target behavior at the fourth target time and improves the user's operation experience of editing times on the time axis.
As an alternative embodiment, a plurality of target behavior information is displayed by a target list, and the target behavior information displayed by the target list includes at least one of the following: the name of the target behavior; the starting time of the target behavior; a type of target behavior; the number of target objects that perform the target behavior.
In this embodiment, the first target operation instruction is acquired based on the first target time on the target time axis, and in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time are displayed. The plurality of target behavior information may be displayed through a target list, and the target behavior information displayed in the target list is used to indicate the target behaviors executed by the target object, and may include, for example, the name of a target behavior, the starting time of the target behavior, the type of the target behavior, the number of target objects executing the target behavior, and the like.
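One possible shape for an entry in such a target list, with illustrative field names (the disclosure only enumerates the kinds of information, not a concrete structure):

```python
from dataclasses import dataclass

@dataclass
class TargetBehaviorInfo:
    name: str            # name of the target behavior, e.g. a jump or a wave
    start_time: float    # starting time of the behavior on the target time axis
    behavior_type: str   # type of the target behavior
    object_count: int    # number of target objects that perform the behavior
```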
It should be noted that the name of the target behavior, the starting time of the target behavior, the type of the target behavior, and the number of target objects executing the target behavior included in the target behavior information are merely examples of the embodiment of the present invention, and the target behavior information does not represent that the embodiment of the present invention includes only the above information, and any information that can be used to indicate the target behavior is within the scope of the embodiment of the present invention, and is not illustrated here.
According to this embodiment, a plurality of target behavior information of the target object at different starting times within the target time period are displayed at a target time on the target time axis, which achieves the purpose of displaying the target behavior information and avoids the problem that the time axis cannot be completely displayed on the screen of a mobile terminal. Meanwhile, when the time axis is operated, drag feedback is added through the target identifier, so that the time can be accurately controlled and the starting time of a behavior can be edited directly on the time axis. This avoids the problem that the starting time of a target behavior is inconvenient to modify because the currently dragged time cannot be observed intuitively, thereby improving the user's operation experience of time editing.
The technical solution of the present invention will be described below with reference to a preferred embodiment; specifically, a target object being an actor and a target behavior being an action are taken as an example.
Fig. 3 is a schematic diagram of a timeline interface according to the related art. As shown in fig. 3, a time axis displayed on a computer can show 15 seconds by adjusting the interval. In addition, when a target time on the time axis is operated, only the information of the action corresponding to that target time is displayed. If the pixels are mapped to the screen of a mobile phone at the same proportion, the touch range on the mobile phone must be guaranteed to exceed 44 px, so only 1.5 seconds can be displayed, and the number of seconds that can be displayed on a single screen is therefore reduced.
Fig. 4 is a schematic diagram of a timeline interface according to another related art. As shown in fig. 4, the end time and the start time are represented by continuous blocks in units of seconds, for example, 4.2 seconds as the start time and 10.2 seconds as the end time. However, after this interface is moved to a mobile phone, the amount of information that can be displayed on a single screen is too small; moreover, because the range of a finger's click response is large, the scale on the axis is blocked when the time axis is dragged, the feedback of the time axis is weak, and the time on the time axis cannot be accurately controlled.
This embodiment makes further improvements with respect to the drawbacks of the above solutions.
Fig. 5 is a schematic diagram of a timeline according to an embodiment of the present invention. As shown in fig. 5, compared with the time axes shown in fig. 3 and fig. 4, by aggregating the actor's motions, a single screen can display the motion information of the actor whose start times fall within 12 seconds. Optionally, a single screen of this embodiment may also display motion information for other durations, which is mainly limited by the width of the mobile phone screen. Optionally, if the width of the mobile phone screen is greater than 1334 px, the actor's motion information starting at times beyond 12 seconds may also be displayed.
The time axis shown in fig. 5 is exemplified below.
This embodiment aggregates the actors' actions on the time axis. Optionally, the target time of 2 seconds on the time axis may be operated, for example by a click operation at the 2-second mark, to display the information of the actor's actions whose start times fall within the target time period of 2.0 seconds to 2.4 seconds; for example, the information of action A1 with a start time of 2.1 seconds, the information of action B1 with a start time of 2.3 seconds, and the information of action C1 with a start time of 2.4 seconds are displayed at the same time, and may be displayed through a first list. By operating the target time of 2.5 seconds on the time axis, the information of the actor's actions whose start times fall within the target time period of 2.5 seconds to 2.9 seconds can be displayed; for example, the information of action A2 with a start time of 2.5 seconds, the information of action B2 with a start time of 2.6 seconds, and the information of action C2 with a start time of 2.7 seconds are displayed at the same time, and may be displayed through a second list. By operating the target time of 3 seconds on the time axis, the information of the actor's actions whose start times fall within the target time period of 3.0 seconds to 3.4 seconds can be displayed; for example, the information of action A3 with a start time of 3.2 seconds, the information of action B3 with a start time of 3.3 seconds, and the information of action C3 with a start time of 3.4 seconds are displayed at the same time, and may be displayed through a third list, and so on. In this way, the action information within each 0.5-second period can be displayed in one list.
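In terms of the earlier bucketing sketch (and assuming its group_by_bucket and behaviors_for_tap helpers are in scope), this example could be reproduced as follows; it is an illustration only:

```python
actions = [
    ("A1", 2.1), ("B1", 2.3), ("C1", 2.4),   # shown together when 2 s is tapped
    ("A2", 2.5), ("B2", 2.6), ("C2", 2.7),   # shown together when 2.5 s is tapped
    ("A3", 3.2), ("B3", 3.3), ("C3", 3.4),   # shown together when 3 s is tapped
]
buckets = group_by_bucket(actions)
print(behaviors_for_tap(buckets, 2.0))   # [('A1', 2.1), ('B1', 2.3), ('C1', 2.4)]
print(behaviors_for_tap(buckets, 2.5))   # [('A2', 2.5), ('B2', 2.6), ('C2', 2.7)]
print(behaviors_for_tap(buckets, 3.0))   # [('A3', 3.2), ('B3', 3.3), ('C3', 3.4)]
```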
Because this embodiment can display the information of a plurality of actions with different start times within the target time period by merely operating the target time on the time axis, compared with the solution shown in fig. 3 it avoids displaying only the information of the action corresponding to the operated time. This avoids the problem that the number of seconds displayable on a single screen is reduced because the terminal must guarantee that the touch range exceeds 44 px, thereby improving the amount of information displayed on the time axis and allowing the player to observe the change of the current actions through the time axis.
Fig. 6 is a schematic diagram of adjusting the start time of an action by a time identification icon according to an embodiment of the present invention. As shown in fig. 6, the newly created actions all correspond to a time identification icon on the time axis, for example, an action with a start time of 3.0 seconds corresponds to a time identification icon on the time axis. Optionally, the time identification icon is generally hidden on the time axis by default, and is displayed after being operated at the position of the corresponding time, so that the time axis view of the player is not blocked.
Optionally, a user triggers an operation instruction at the 3-second position on the time axis through an operation such as a click, a double click, or a long press. In response to the operation instruction, the time identification icon is displayed at the 3-second position on the time axis and the 3.0 mark is highlighted. The player can then drag the time identification icon, and after the player releases it, the time corresponding to the position where the time identification icon stays on the time axis is determined, so that the actor's action whose start time was 3.0 is modified to start at that time. This avoids the problem that, after moving to a mobile phone, the scale on the time axis is blocked when the time axis is dragged because the range of a finger's click response is large, the feedback of the time axis is weak, and the time on the time axis cannot be accurately controlled.
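In terms of the earlier drag sketch (and assuming its finish_drag helper is in scope, with an assumed scale of 100 px per second), moving the 3.0-second action could look like this:

```python
action = {"name": "actor action", "start_time": 3.0}
# Marker dropped 350 px from the axis origin at the assumed 100 px per second scale:
finish_drag(action, drop_x_px=350, axis_origin_px=0, px_per_second=100)
print(action["start_time"])  # 3.5, i.e. the action now starts at 3.5 seconds
```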
Compared with fig. 4, this embodiment adds drag feedback when the player operates the time axis, which avoids the problems that the amount of information displayable on a single screen is too small, that the scale on the time axis is blocked during dragging because the range of a finger's click response is large, that the feedback of the time axis is weak, and that the time on the time axis cannot be accurately controlled. The player can therefore control the time accurately and edit the start time of an action directly on the time axis, which improves the operation experience of time editing.
This embodiment can solve the problem that a long time axis, on which a plurality of actors perform a plurality of different actions, cannot be completely displayed on the terminal screen, by using a scheme of aggregated display in units of time on the time axis. Because a plurality of pieces of action information of the actors with different start times within a unit time are displayed at a time on the time axis, the purpose of displaying the action information is achieved and the problem that the time axis cannot be completely displayed on the screen of a mobile terminal is avoided. Meanwhile, when the time axis is operated, drag feedback is added through the time identification icon, so that the time can be accurately controlled and the start time of an action can be edited directly on the time axis. This avoids the problem that the start time of an action is inconvenient to modify because the currently dragged time cannot be observed intuitively, thereby improving the user's operation experience of time editing.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
The embodiment of the invention also provides an information processing device. It should be noted that the information processing apparatus of this embodiment can be used to execute the information processing method of the embodiment of the present invention.
Fig. 7 is a schematic diagram of an information processing apparatus according to an embodiment of the present invention. As shown in fig. 7, the apparatus includes: an acquisition unit 10 and a display unit 20.
The acquiring unit 10 is configured to acquire a first target operation instruction based on a first target time on a target time axis, where the target time axis is used to indicate a start time of a behavior executed by a target object in a target scene.
And a display unit 20, configured to display, in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time, where each target behavior information is used to indicate one target behavior executed by the target object, start times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis.
Optionally, the apparatus further comprises: the first determining unit is used for determining any time in the target time period as a first target time before acquiring the first target operation instruction based on the first target time on the target time axis.
Alternatively, the display unit 20 includes: the first display module is used for responding to the first target operation instruction and displaying at least one piece of target behavior information of each target object in the plurality of target objects.
Optionally, the apparatus further comprises: and the processing unit is used for processing the plurality of target behavior information to obtain a target result when responding to the first target operation instruction and displaying the plurality of target behavior information of the target object corresponding to the first target time, wherein the target result is used for indicating the variation trend of the target behavior executed by the target object.
Optionally, the apparatus further comprises: a second determination unit configured to determine a first target behavior from among a plurality of target behaviors after displaying a plurality of target behavior information of a target object corresponding to the first target time; and the first adjusting unit is used for adjusting the starting time of the first target behavior from the second target time to the third target time on the target time axis.
Optionally, the first adjusting unit includes: the first acquisition module is used for acquiring a second target operation instruction based on a second target moment on a target time axis; the second display module is used for responding to a second target operation instruction and displaying a target identifier at a first position on a target time axis, wherein the first position corresponds to a second target moment, and the target identifier is used for indicating the adjustment of the starting moment of the first target behavior; the second acquisition module is used for acquiring a third target operation instruction based on the target identifier; the adjusting module is used for responding to a third target operation instruction and adjusting the display position of the target identifier from the first position to the second position; and the determining module is used for determining the time corresponding to the second position on the target time axis as a third target time.
Optionally, the second display module comprises: the activation submodule is used for responding to a second target operation instruction and activating the target identifier in the hidden state; and the display submodule is used for displaying the activated target identification on the first position.
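The adjustment flow of the first adjusting unit and its submodules can be sketched as follows, assuming a simple linear mapping between positions on the time axis and times (TimeAxis, DragHandle and the concrete pixel values are illustrative assumptions, not taken from the embodiment): the target identifier stays hidden until the second target operation instruction activates it at the position of the second target time, and dragging it to a new position yields the third target time.

from dataclasses import dataclass

@dataclass
class TimeAxis:
    duration: float   # total length of the axis in seconds
    width_px: int     # on-screen width of the axis in pixels

    def time_to_x(self, t: float) -> float:
        return t / self.duration * self.width_px

    def x_to_time(self, x: float) -> float:
        return x / self.width_px * self.duration

class DragHandle:
    """The 'target identifier': hidden until a press on the axis activates it."""
    def __init__(self, axis: TimeAxis):
        self.axis = axis
        self.visible = False
        self.x = 0.0

    def activate(self, second_target_time: float):
        # Second target operation instruction: show the handle at the
        # position corresponding to the behavior's current start time.
        self.visible = True
        self.x = self.axis.time_to_x(second_target_time)

    def drag_to(self, new_x: float) -> float:
        # Third target operation instruction: move the handle and read back
        # the third target time from its new position.
        self.x = max(0.0, min(new_x, float(self.axis.width_px)))
        return self.axis.x_to_time(self.x)

axis = TimeAxis(duration=10.0, width_px=500)
handle = DragHandle(axis)
handle.activate(second_target_time=2.0)   # handle appears at x = 100 px
third_target_time = handle.drag_to(350)   # dragged to x = 350 px
print(third_target_time)                  # 7.0 s becomes the new start time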
Optionally, the apparatus further comprises: and the playing unit is used for playing the picture comprising the first target behavior at the third target time in the process of playing the target picture of the target scene after the starting time of the first target behavior is adjusted from the second target time to the third target time.
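As an illustrative sketch only (play_scene and the step size are invented for this example), a playback loop over the target scene would trigger each behavior once the playhead reaches its possibly adjusted starting time, so the first target behavior now appears at the third target time:

def play_scene(behaviors, duration, step=0.5):
    """Very small playback loop: as the playhead advances, every behavior is
    'played' once its (possibly adjusted) start time is reached."""
    played = set()
    t = 0.0
    while t <= duration:
        for name, start in behaviors.items():
            if name not in played and start <= t:
                print(f"t={t:.1f}s  play frame containing behavior '{name}'")
                played.add(name)
        t += step
    return played

# After the adjustment, the first target behavior starts at 7.0 s instead of 2.0 s.
play_scene({"first_target_behavior": 7.0, "other_behavior": 1.0}, duration=8.0)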
Optionally, the plurality of target behavior information of this embodiment is displayed by a target list, and the target behavior information displayed by the target list includes at least one of: the name of the target behavior; the starting time of the target behavior; a type of target behavior; the number of target objects that perform the target behavior.
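A target list row of this kind could be modeled, purely as an assumption for illustration, with a small record holding the four optional fields named above (TargetListRow is a hypothetical name):

from dataclasses import dataclass

@dataclass
class TargetListRow:
    behavior_name: str   # name of the target behavior
    start_time: float    # starting time on the target time axis (seconds)
    behavior_type: str   # type of the target behavior
    object_count: int    # number of target objects executing it

rows = [
    TargetListRow("gather", 2.0, "movement", 3),
    TargetListRow("cheer", 2.3, "emote", 5),
]
for r in rows:
    print(f"{r.start_time:>5.1f}s  {r.behavior_name:<8} {r.behavior_type:<10} x{r.object_count}")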
In this embodiment, a first target operation instruction is acquired by the acquisition unit 10 based on a first target time on a target time axis, where the target time axis is used to indicate a starting time of a behavior executed by a target object in a target scene, and a plurality of target behavior information of the target object corresponding to the first target time are displayed by the display unit 20 in response to the first target operation instruction, where each target behavior information is used to indicate one target behavior executed by the target object, and the starting times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, where the target time period is a time period associated with the first target time on the target time axis. Because a plurality of target behavior information of the target object at different starting moments in the target time period is displayed at the target moment on the target time axis, the purpose of displaying the target behavior information is achieved, the problem of small information amount displayed by the time axis is solved, and the technical effect of improving the information display amount of the time axis is further achieved.
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices, and optionally they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device, and in some cases the steps shown or described may be performed in an order different from that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (11)

1. A method for processing information, comprising:
acquiring a first target operation instruction based on a first target time on a target time axis, wherein the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene;
responding to the first target operation instruction, displaying a plurality of target behavior information corresponding to the first target time, wherein each target behavior information is used for indicating one target behavior executed by the target object, wherein the starting time of the target behaviors corresponding to the target behavior information is within a target time period, and the target time period is a time period which is associated with the first target time on the target time axis;
determining a first target behavior from a plurality of the target behaviors;
and adjusting the starting time of the first target behavior from a second target time to a third target time on the target time axis.
2. The method according to claim 1, wherein before the first target operation instruction is acquired based on the first target time on the target time axis, the method further comprises:
and determining any time in the target time period as the first target time.
3. The method of claim 1, wherein displaying, in response to the first target operation instruction, a plurality of target behavior information of the target object corresponding to the first target time comprises:
and responding to the first target operation instruction, and displaying at least one piece of target behavior information of each target object in a plurality of target objects.
4. The method according to claim 1, wherein when displaying a plurality of target behavior information of the target object corresponding to the first target time in response to the first target operation instruction, the method further comprises:
and processing the target behavior information to obtain a target result, wherein the target result is used for indicating the variation trend of the target behavior executed by the target object.
5. The method of claim 1, wherein adjusting the starting time of the first target behavior from the second target time to the third target time comprises:
on the target time axis, acquiring a second target operation instruction based on the second target time;
responding to the second target operation instruction, and displaying a target identifier at a first position on the target time axis, wherein the first position corresponds to the second target time, and the target identifier is used for indicating to adjust the starting time of the first target behavior;
acquiring a third target operation instruction based on the target identifier;
responding to the third target operation instruction, and adjusting the display position of the target identifier from the first position to a second position;
and determining the time corresponding to the second position on the target time axis as the third target time.
6. The method of claim 5, wherein displaying the target identifier at the first position on the target timeline in response to the second target operation instruction comprises:
responding to the second target operation instruction, and activating the target identifier in a hidden state;
and displaying the activated target identification on the first position.
7. The method of claim 1, wherein after adjusting the starting time of the first target behavior from the second target time to the third target time, the method further comprises:
and in the process of playing a target picture of the target scene, playing a picture comprising the first target behavior at the third target moment.
8. The method according to any one of claims 1 to 7, wherein the plurality of target behavior information is displayed by a target list, and the target behavior information displayed by the target list comprises at least one of:
a name of the target behavior;
a starting time of the target behavior;
a type of the target behavior;
a number of target objects to execute the target behavior.
9. An apparatus for processing information, comprising:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring a first target operation instruction based on a first target time on a target time axis, and the target time axis is used for indicating the starting time of a behavior executed by a target object in a target scene;
a display unit, configured to display, in response to the first target operation instruction, a plurality of target behavior information corresponding to the first target time, where each target behavior information is used to indicate one target behavior executed by the target object, where start times of the plurality of target behaviors corresponding to the plurality of target behavior information are within a target time period, and the target time period is a time period associated with the first target time on the target time axis;
wherein the apparatus is further configured to determine a first target behavior from a plurality of the target behaviors; and adjusting the starting time of the first target behavior from a second target time to a third target time on the target time axis.
10. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 8 when executed.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 8.
CN201811393907.3A 2018-11-21 2018-11-21 Information processing method and device, storage medium and electronic device Active CN109542322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811393907.3A CN109542322B (en) 2018-11-21 2018-11-21 Information processing method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811393907.3A CN109542322B (en) 2018-11-21 2018-11-21 Information processing method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN109542322A CN109542322A (en) 2019-03-29
CN109542322B true CN109542322B (en) 2021-03-23

Family

ID=65849216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811393907.3A Active CN109542322B (en) 2018-11-21 2018-11-21 Information processing method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN109542322B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379823B (en) * 2020-11-25 2022-01-21 武汉市人机科技有限公司 Method for switching four axes in time axis software

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100505858C (en) * 2004-05-13 2009-06-24 索尼株式会社 Image data processing apparatus and image data processing method
JP4765732B2 (en) * 2006-04-06 2011-09-07 オムロン株式会社 Movie editing device
CN101247481A (en) * 2007-02-16 2008-08-20 李西峙 System and method for producing and playing real-time three-dimensional movie/game based on role play
JP2016526343A (en) * 2013-05-29 2016-09-01 トムソン ライセンシングThomson Licensing Apparatus and method for navigating media content
CN106021496A (en) * 2016-05-19 2016-10-12 海信集团有限公司 Video search method and video search device

Also Published As

Publication number Publication date
CN109542322A (en) 2019-03-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant