WO2022194097A1 - Object control method and device - Google Patents

Object control method and device

Info

Publication number
WO2022194097A1
Authority
WO
WIPO (PCT)
Prior art keywords
special effect
audio
page
target object
played
Prior art date
Application number
PCT/CN2022/080685
Other languages
English (en)
French (fr)
Inventor
赵双琳
Original Assignee
北京字跳网络技术有限公司
Priority date
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司
Publication of WO2022194097A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the embodiments of the present disclosure relate to the field of computer technologies, and in particular, to an object control method and device.
  • In an existing special effect game, the game plays corresponding audio, and multiple first objects (for example, balls) appear randomly.
  • The user slides a finger on the screen to control the second object (e.g., an animal) to move, that is, to perform a game interaction, so that the second object can touch the first objects.
  • However, this control method is inconvenient: when it is inconvenient for the user to use his or her hands, or when the screen is too large, the user cannot use a finger to control the second object to move, and therefore cannot use movement-control special effect games.
  • Moreover, because the audio played in such special effect games is only background music and has no relationship with the other objects in the game, users cannot interact with the game based on the audio.
  • Embodiments of the present disclosure provide an object control method and device, so as to solve the technical problems in the prior art that the control method is inconvenient and the game interaction cannot be performed according to audio.
  • In a first aspect, an embodiment of the present disclosure provides an object control method, including:
  • displaying a first page, wherein a target object image of a user is displayed in real time on the first page;
  • dynamically displaying a first special effect on the first page, wherein a dynamic display attribute of the first special effect is determined according to rhythm information of audio played on the first page;
  • in response to movement of a first target object in the target object image of the first page, controlling a second special effect displayed on the first page to move in real time; and
  • displaying a third special effect according to the relationship between the real-time position of the second special effect and the dynamic position corresponding to the first special effect.
  • An embodiment of the present disclosure further provides an object control device, including:
  • a display module configured to display a first page, wherein the user's target object image is displayed in real time on the first page;
  • the display module is further configured to dynamically display a first special effect on the first page, wherein the dynamic display attribute of the first special effect is determined according to the rhythm information of the audio played on the first page;
  • a processing module configured to control the second special effect displayed on the first page to move in real time in response to the movement of the first target object in the target object image of the first page;
  • the processing module is further configured to display a third special effect according to the relationship between the real-time position of the second special effect and the dynamic position corresponding to the first special effect.
  • embodiments of the present disclosure provide an electronic device, including: at least one processor and a memory.
  • the memory stores computer-executable instructions.
  • the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the object control method described in the first aspect and various possible designs of the first aspect above.
  • Embodiments of the present disclosure provide a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the object control method described in the first aspect and various possible designs of the first aspect is implemented.
  • embodiments of the present disclosure provide a computer program product, including a computer program that, when executed by a processor, implements the object control method described in the first aspect and various possible designs of the first aspect.
  • embodiments of the present disclosure provide a computer program that, when executed by a processor, implements the object control method described in the first aspect and various possible designs of the first aspect.
  • Embodiments of the present disclosure provide an object control method and device.
  • The method dynamically displays a first special effect on a first page using a dynamic display attribute determined according to the rhythm information of the audio played on the first page; that is, the first special effect is controlled to move according to the dynamic display attribute. When the first target object in the target object image on the first page is detected to move, this indicates that the user has moved the first target object, i.e., has input a control operation, and the second special effect is controlled to move accordingly, realizing movement control of the second special effect. The movement of the second special effect can thus be controlled without sliding a finger on the screen, which improves the convenience of control and allows the user to play movement-control special effect games.
  • the third special effect is displayed to inform the user of the game effect brought by the input control operation;
  • The dynamic display attribute is determined according to the real rhythm information of the audio; that is, the audio played on the first page is not merely background music but is associated with the first special effect, and the user can input corresponding control operations according to the rhythm of the audio to make the second special effect touch the first special effect. This enables users to perform non-contact interaction following the rhythm of the music, which increases the fun of the interaction, enhances the user's sense of rhythm and sense of realism during use, and improves user satisfaction.
  • FIG. 1 is a schematic diagram of a finger control special effect provided by an embodiment of the present disclosure
  • FIG. 2 is a first schematic flowchart of an object control method provided by an embodiment of the present disclosure.
  • FIG. 3 is a first schematic diagram of object control provided by an embodiment of the present disclosure.
  • FIG. 4 is a second schematic diagram of object control provided by an embodiment of the present disclosure.
  • FIG. 5 is a second schematic flowchart of an object control method provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of an enlarged target object according to an embodiment of the present disclosure.
  • FIG. 7 is a structural block diagram of an object control device provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
  • the special effect game plays the corresponding audio, and a plurality of first objects (such as the ball as shown in FIG. 1 ) appear randomly.
  • The user slides a finger on the screen to control the second object (as shown in FIG. 1) to move, that is, to perform a game interaction, so that the second object can touch the first object; for example, when the finger swipes from left to right, the cat also moves from left to right.
  • However, this control method is inconvenient. In particular, when it is inconvenient for the user to use his or her hands, or when the user needs to hold the mobile terminal with both hands because the screen is too large, the user cannot use a finger to control the second object to move, so the user cannot use movement-control special effect games.
  • In addition, since the audio played by the special effect game is only background music and is not related to the other objects of the special effect game, the user cannot interact with the game according to the audio.
  • The technical idea of the present disclosure is as follows: when the user is playing the game, the electronic device captures the user so that the user's target object image is displayed on the first page, and the user can blink, nod, or perform another trigger action to indicate that a start operation has been input, that is, that the dynamic display of the first special effect should start. After starting, the user can move the head left and right (for example, moving the nose left and right); in response to this movement, the second special effect is controlled to move left and right so as to accurately collide with the first special effect that is dynamically displayed on the page, which improves the convenience of control.
  • The dynamic display attribute of the first special effect is determined according to the real rhythm information of the audio; that is, the audio played on the first page is associated with the first special effect, and the user can input corresponding control operations according to the rhythm of the audio to make the second special effect touch the first special effect. This realizes non-contact interaction following the rhythm of the music, enhances the user's sense of rhythm and sense of realism during use, and improves user satisfaction.
  • FIG. 2 is a schematic flowchart 1 of an object control method provided by an embodiment of the present disclosure.
  • The method of this embodiment can be applied to electronic devices, for example, mobile terminals such as smartphones, handheld computers and tablet computers, and computer equipment such as desktop computers, notebook computers and all-in-one computers.
  • The object control method includes:
  • S201: Display a first page, where an image of a user's target object is displayed in real time on the first page.
  • In this embodiment, a related application program on the electronic device can be opened, and the application program displays a first page on which the target object image of the user collected by the electronic device is displayed in real time; that is, after the first page is displayed, the camera corresponding to the electronic device captures the user's image in real time and displays it on the first page in real time.
  • the first page may also display special effects related to the special effect game (eg, a first special effect, a second special effect, a third special effect, etc.).
  • The target object image may be a face image of the user collected in real time by the camera, an image of another body part, a pet image, or the like.
  • the camera corresponding to the electronic device may be a camera integrated on the electronic device (for example, a front-facing camera), or a camera external to the electronic device, that is, the camera is not integrated on the electronic device, but is connected to the electronic device. This disclosure does not limit it.
  • S202: Dynamically display the first special effect on the first page, wherein the dynamic display attribute of the first special effect is determined according to the rhythm information of the audio played on the first page.
  • In this embodiment, the audio to be played (for example, a certain song) is played on the first page, and the first special effect corresponding to the audio is dynamically displayed on the first page according to the dynamic display attribute determined from the rhythm information of the audio, i.e., the display style of the first special effect; that is, each first special effect displayed on the first page is continuously moved on the first page.
  • When the first special effect is dynamically displayed, it may be dynamically displayed in a preset display area on the first page according to a first preset movement direction, wherein the first preset movement direction includes at least one of the following: a top-to-bottom direction, a bottom-to-top direction, a left-to-right direction, and a right-to-left direction.
  • For example, when the first preset movement direction is from top to bottom, new first special effects continuously appear in the preset display area on the first page, and each first special effect moves from top to bottom within the preset display area.
  • the preset display area can be set according to the actual situation, for example, it is the center area of the first page.
  • the dynamic display attribute of the first special effect includes determining the distance between the first special effects according to the rhythm information and/or determining the moving speed of the first special effect according to the rhythm information.
  • The distance between the first special effects represents the spacing between two adjacent first special effects, for example, the spacing between the two first special effects shown in (b) of FIG. 3.
  • the rhythm information includes one or more of the following: rhythm type, beat information and drum beat information.
  • the rhythm type includes fast rhythm type and slow rhythm type.
  • When the rhythm type of the audio is the fast rhythm type, it indicates that the rhythm of the audio is faster and there are more drum beats; when the rhythm type of the audio is the slow rhythm type, it indicates that the rhythm of the audio is slower and there are fewer drum beats.
  • The drum beat information includes the number of drum beats and/or the drum beat positions.
  • The number of drum beats represents the number of drum beats included in the audio.
  • The drum beat position indicates the time position of a drum beat in the audio. For example, if the drum beat positions include 3 seconds and 10 seconds, there is a drum beat at the 3rd second of the audio and another drum beat at the 10th second.
  • The beat information indicates a beat type included in the audio; for example, the beat type includes a four-two beat (2/4 time) type.
  • The drum points are in one-to-one correspondence with the first special effects, and the distance between the first special effects is related to the drum point positions, that is, to the interval between the positions of two adjacent drum points.
  • For example, if the position of drum point 1 is 3 seconds, the position of drum point 2 is 5 seconds, and drum point 1 is adjacent to drum point 2, the interval between drum point 1 and drum point 2 is 2 seconds; the distance corresponding to 2 seconds is then looked up, and the spacing between the first special effect corresponding to drum point 1 and the first special effect corresponding to drum point 2 is determined to be that distance.
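  • The following is a minimal, illustrative sketch (not taken from the patent) of how the interval between adjacent drum points could be converted into on-screen spacing between first special effects, assuming the effects fall at a constant speed; the function and parameter names are hypothetical.

```python
from typing import List

def effect_spacings(drum_positions_s: List[float], fall_speed_px_per_s: float) -> List[float]:
    """Map the interval between adjacent drum points to the on-screen spacing
    between the corresponding first special effects.

    drum_positions_s: drum point positions in seconds, sorted ascending.
    fall_speed_px_per_s: assumed constant falling speed of the effects.
    """
    spacings = []
    for earlier, later in zip(drum_positions_s, drum_positions_s[1:]):
        interval_s = later - earlier          # e.g. 5 s - 3 s = 2 s
        spacings.append(interval_s * fall_speed_px_per_s)
    return spacings

# Example: drum points at 3 s and 5 s, effects falling at 120 px/s
# -> one spacing of 240 px between the two first special effects.
print(effect_spacings([3.0, 5.0], 120.0))  # [240.0]
```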
  • A drum point can be used to represent the sound intensity feature of the audio; a drum point may also be called an accented beat, and is located at an audio frame with a large amplitude in the audio. Therefore, the drum point positions of the audio can be determined according to the audio waveform.
  • the process includes:
  • a waveform corresponding to the audio to be played is acquired, and a target waveform segment is extracted from the waveform corresponding to the audio to be played, where the target waveform segment includes a peak value of the waveform corresponding to the audio to be played. Match the target waveform segment with the preset drum waveform segment to determine the drum location of the audio to be played.
  • Specifically, target waveform segments that each include a peak value are extracted from the waveform corresponding to the audio to be played, and each target waveform segment is matched with a preset drum beat waveform segment; that is, for each target waveform segment, the similarity between the target waveform segment and the preset drum beat waveform segment is computed, and when the similarity is greater than a preset similarity, the time corresponding to the peak in that target waveform segment is determined to be a drum point position.
  • The preset drum beat waveform segment is a pre-collected drum beat audio waveform segment.
  • the determination may also be performed in other manners, for example, by using the amplitude in the waveform of the audio, which is not limited in the present disclosure.
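  • As an illustration of the matching step described above, the following sketch (an assumption, not the patent's implementation) extracts peak-centred segments from a mono waveform held in a NumPy array and compares each segment with a pre-collected drum beat segment, using normalized cross-correlation as the similarity measure.

```python
import numpy as np
from typing import List

def detect_drum_points(waveform: np.ndarray, sample_rate: int,
                       drum_template: np.ndarray,
                       sim_threshold: float = 0.8,
                       min_peak_amp: float = 0.5) -> List[float]:
    """Return candidate drum point positions (in seconds).

    A target segment centred on each prominent peak is compared with a
    pre-collected drum beat segment; peaks whose similarity exceeds
    sim_threshold are taken as drum points.
    """
    n = len(drum_template)
    half = n // 2
    positions = []
    for i in range(half, len(waveform) - n + half):
        amp = abs(waveform[i])
        if amp < min_peak_amp:
            continue
        segment = waveform[i - half:i - half + n]   # target waveform segment around the peak
        if amp < np.abs(segment).max():             # keep only the local maximum of the segment
            continue
        denom = np.linalg.norm(segment) * np.linalg.norm(drum_template)
        if denom == 0.0:
            continue
        similarity = float(np.dot(segment, drum_template)) / denom
        if similarity > sim_threshold:
            positions.append(i / sample_rate)
    return positions
```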
  • the rhythm type of the audio can be determined according to the number of drums in the audio.
  • the specific process is: determining the number of drums corresponding to the audio to be played, that is, acquiring the number of drums included in the audio to be played on the first page.
  • The rhythm type is determined according to the number of drum beats; that is, when the number of drum beats is greater than a preset number, it indicates that the audio contains many drum beats and its rhythm is fast, and the rhythm type of the audio is determined to be the fast rhythm type.
  • When the number of drum beats is less than or equal to the preset number, it indicates that the audio contains few drum beats and its rhythm is slow, and the rhythm type of the audio is determined to be the slow rhythm type.
  • the moving speed of the first special effect can also be determined according to the rhythm type of the audio.
  • When the rhythm type is the fast rhythm type, it indicates that the audio contains many drum beats, so many first special effects need to be dropped; the first special effects can therefore be moved at a faster speed, that is, at a first preset moving speed. When the rhythm type is the slow rhythm type, it indicates that the audio contains few drum beats and few first special effects need to be dropped, so the first special effects can be moved at a slower speed, that is, at a second preset moving speed.
  • the moving speeds corresponding to the fast rhythm type and the slow rhythm type may also be the same, that is, the first special effects corresponding to all audios move according to the preset movement speed.
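  • A hedged sketch of the two decisions above (rhythm type from the drum beat count, moving speed from the rhythm type); the threshold of 30 drum beats and the two speeds are illustrative placeholders, not values given in the patent.

```python
FAST, SLOW = "fast_rhythm", "slow_rhythm"

def rhythm_type(drum_count: int, preset_count: int = 30) -> str:
    """Fast rhythm when the audio contains more drum beats than the preset number."""
    return FAST if drum_count > preset_count else SLOW

def moving_speed(rhythm: str,
                 first_preset_speed: float = 180.0,        # px/s, faster movement
                 second_preset_speed: float = 90.0) -> float:  # px/s, slower movement
    """Pick the first preset moving speed for fast-rhythm audio, the second otherwise."""
    return first_preset_speed if rhythm == FAST else second_preset_speed

# e.g. 42 drum beats -> fast rhythm -> effects fall at 180 px/s
print(moving_speed(rhythm_type(42)))
```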
  • The rhythm type of the audio may also be determined by using the beat information of the audio; the process is similar to the above process of determining the rhythm type according to the number of drum beats, and is not repeated here.
  • In this embodiment, when it is detected that the first target object in the target object image of the first page moves, it indicates that the user has input a corresponding control operation, i.e., that the movement of the second special effect needs to be controlled, and the second special effect is controlled to move in real time in response to this control operation. This realizes non-contact control of the movement of the second special effect; that is, the user does not need to slide a finger on the screen to control the movement of the second special effect.
  • When the second special effect is controlled to move, it moves according to the moving direction of the first target object; for example, if the first target object moves from left to right, the second special effect also moves from left to right.
  • The first target object includes parts such as the user's nose and hands, and may also be other parts, such as the eyes or head, which is not limited in the present disclosure.
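  • As a rough illustration (assuming a face or landmark tracker that reports the nose position normalized to [0, 1]; the names are hypothetical), the horizontal position of the first target object could be mapped to the on-screen position of the second special effect as follows.

```python
def update_second_effect_x(nose_x_norm: float, screen_width: int) -> int:
    """Map the horizontal position of the tracked first target object (e.g. the
    user's nose, normalized to [0, 1] by the tracker) to the on-screen x
    coordinate of the second special effect, so the effect follows the nose
    without any touch input."""
    nose_x_norm = min(max(nose_x_norm, 0.0), 1.0)   # clamp to the valid range
    return int(nose_x_norm * (screen_width - 1))

# Nose detected at 70 % of the frame width on a 1080-px-wide screen
print(update_second_effect_x(0.70, 1080))  # 755
```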
  • As the first target object moves, the position of the second special effect changes in real time; whether the second special effect touches the first special effect, i.e., whether the first special effect is hit, is determined, and the corresponding prompt special effect, i.e., the third special effect, is displayed according to the result.
  • The first special effect includes a core area (the circular area of the first special effect shown in (a) or (b) of FIG. 3) and an edge area (the area of the first special effect other than the circle, as shown in (a) or (b) of FIG. 3).
  • When a drum point is reached, the first special effect corresponding to that drum point moves to the designated area.
  • The core area of the first special effect corresponds to the precise drum point position; that is, when the user controls the second special effect to hit the core area, it indicates that the user hit the drum point accurately, i.e., caught the rhythm precisely.
  • The edge area of the first special effect corresponds to a position close to the drum point; that is, when the user controls the second special effect to hit the edge area, it indicates that the user did not hit the drum point accurately, i.e., did not catch the rhythm precisely.
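  • One simple way to realize the core/edge distinction, shown here only as an assumed sketch, is to compare the distance between the second special effect and the centre of the first special effect with a core radius and an outer (edge) radius.

```python
import math

def classify_hit(effect_x: float, effect_y: float,
                 second_x: float, second_y: float,
                 core_radius: float, edge_radius: float) -> str:
    """Classify where the second special effect is relative to a first special
    effect: inside the core area, inside the edge area, or a miss."""
    d = math.hypot(second_x - effect_x, second_y - effect_y)
    if d <= core_radius:
        return "core"        # drum beat hit precisely
    if d <= edge_radius:
        return "edge"        # hit, but not exactly on the drum beat
    return "miss"

print(classify_hit(540, 800, 548, 805, core_radius=30, edge_radius=80))  # core
```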
  • The first special effect shown in (a) or (b) of FIG. 3 is only an example, and the first special effect may also be a special effect of another style, for example, an arrow or a circle, which is not limited by the present disclosure.
  • The second special effect can be a ball with eyes, such as the second special effect shown in FIG. 4; the second special effect can also be a special effect of another style, for example, an animal such as a dog, which is not limited by the present disclosure.
  • the third special effect includes prompt information and/or prompt animation.
  • the prompt information includes prompt text, prompt pictures and other information.
  • In this embodiment, the spacing between the first special effects is determined according to the real drum point positions of the audio; that is, the first special effects are arranged according to the real rhythm, so that when the audio is played on the first page, the first special effects arranged according to the real rhythm can be dynamically displayed following the real drum point positions of the audio. The user can then collide with the first special effects precisely in time with the real rhythm, which enhances the user's sense of rhythm and the authenticity of the experience, bringing the user an immersive experience.
  • When the user moves the first target object, the first target object in the target object image on the first page also moves in real time; the electronic device tracks the first target object and determines its moving direction, so that the second special effect bound to the first target object is controlled to move according to that moving direction.
  • the user only needs to move the first target object to move the second special effect.
  • the control does not require the user to touch the screen of the electronic device, thereby realizing the non-contact control of the second special effect and improving the convenience of the control.
  • In this embodiment, the first special effect is dynamically displayed on the first page using the dynamic display attribute determined according to the rhythm information of the audio played on the first page; that is, the first special effect is controlled to move according to the dynamic display attribute. When it is detected that the first target object in the target object image on the first page moves, this indicates that the user has moved the first target object, i.e., has input a control operation, and the second special effect is controlled to move accordingly, realizing movement control of the second special effect. The movement of the second special effect can thus be controlled without sliding a finger on the screen, which improves the convenience of control and allows the user to play movement-control special effect games.
  • the third special effect is displayed to inform the user of the game effect brought by the input control operation;
  • The dynamic display attribute is determined according to the real rhythm information of the audio; that is, the audio played on the first page is not merely background music but is associated with the first special effect, and the user can input corresponding control operations according to the rhythm of the audio to make the second special effect touch the first special effect. This enables users to perform non-contact interaction following the rhythm of the music, which increases the fun of the interaction, enhances the user's sense of rhythm and sense of realism during use, and improves user satisfaction.
  • FIG. 5 is a second schematic flowchart of an object control method provided by an embodiment of the present disclosure.
  • the user can also start the game without contacting the screen to achieve touchless triggering.
  • The following describes in detail, with reference to a specific embodiment, the process of triggering the game to start without touching the screen. As shown in FIG. 5, the method includes:
  • In this embodiment, the second target object in the target object image on the first page is enlarged, that is, a certain part of the user's image on the first page is enlarged. Enlarging this part allows the user to observe the movement of the first target object more intuitively, improves the control accuracy, and increases the interest.
  • For example, the second target object is the user's head.
  • When the user's head is not enlarged, it is a normal size (as shown in (a) of FIG. 6); after enlargement, the head becomes larger (as shown in (b) of FIG. 6), so as to realize the big-head special effect.
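  • A possible way to realize the big-head special effect, sketched here with Pillow under the assumption that a face detector supplies a head bounding box (the detector itself is not shown and the names are illustrative), is to crop the head region, scale it up, and paste it back centred on the original head.

```python
from PIL import Image

def enlarge_head(frame: Image.Image, box: tuple, scale: float = 1.6) -> Image.Image:
    """Enlarge the second target object (the user's head) inside the frame.

    `box` is (left, top, right, bottom) as reported by a face detector
    (assumed to be available elsewhere)."""
    left, top, right, bottom = box
    head = frame.crop(box)
    new_w, new_h = int(head.width * scale), int(head.height * scale)
    head = head.resize((new_w, new_h))
    # paste the enlarged head back, centred on the original head centre
    cx, cy = (left + right) // 2, (top + bottom) // 2
    out = frame.copy()
    out.paste(head, (cx - new_w // 2, cy - new_h // 2))
    return out
```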
  • When a trigger action of the target object in the target object image is detected, the audio to be played on the first page is determined, that is, the audio to be played corresponding to the first page is determined.
  • the trigger action may include head up and down movement, left and right movement, hand left and right movement, up and down movement, eye blinking, nose up and down, left and right movement, and the like.
  • When the nose moves, the user is actually moving the head; for example, when the user's head moves left and right, the nose also moves left and right.
  • the determination may be performed in the following manners.
  • One way is to randomly select audio from a preset audio list as the audio to be played; that is, the electronic device randomly selects an audio from the preset audio list (i.e., a preset audio library) and determines it as the audio to be played corresponding to the first page.
  • Another way is to acquire the audio selected by the user from the preset audio list as the audio to be played. That is, the user can select an audio from the provided preset audio list, and the electronic device determines the audio selected by the user as the audio to be played corresponding to the first page.
  • Another way is to acquire the audio uploaded by the user as the audio to be played. That is, when the audio that the user wants to play does not exist in the preset audio list, the user can also upload the audio he wants, and the electronic device determines the audio uploaded by the user as the audio to be played corresponding to the first page.
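  • The three ways of determining the audio to be played could be combined as in the following illustrative sketch (names and priority order are assumptions, not specified by the patent): an uploaded file wins, then an explicit selection, otherwise a random pick from the preset list.

```python
import random
from typing import List, Optional

def choose_audio(preset_audio_list: List[str],
                 user_choice: Optional[str] = None,
                 user_upload: Optional[str] = None) -> str:
    """Determine the audio to be played on the first page."""
    if user_upload is not None:                 # audio uploaded by the user
        return user_upload
    if user_choice is not None and user_choice in preset_audio_list:
        return user_choice                      # audio selected from the preset list
    return random.choice(preset_audio_list)     # random preset audio

print(choose_audio(["song_a.mp3", "song_b.mp3"]))            # random preset
print(choose_audio(["song_a.mp3"], user_upload="mine.mp3"))  # "mine.mp3"
```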
  • the rhythm information corresponding to the audio in the preset audio list has been predetermined, that is, the dynamic display attribute of the first special effect has been predetermined.
  • Therefore, the first special effect can be moved directly using the predetermined dynamic display attribute.
  • The first special effect whose dynamic display attribute matches the rhythm information of the audio to be played corresponding to the first page is continuously displayed on the first page; that is, every time a drum beat of the audio is reached, a first special effect moves to the designated area on the first page.
  • The specific process is as follows: during the dynamic movement of the first special effect in the preset display area of the first page, if the real-time position of the second special effect is in the core area, the first prompt information and/or the first prompt animation is displayed; if the real-time position of the second special effect is in the edge area, the second prompt information and/or the second prompt animation is displayed; and if there is no overlap between the real-time position of the second special effect and the dynamic position of the first special effect, the third prompt information and/or the third prompt animation is displayed.
  • When the first special effect moves dynamically in the preset display area on the first page, the user can move the first target object in a second preset moving direction (for example, from left to right).
  • the electronic device controls the second special effect to also move according to the second preset movement direction in response to the movement.
  • the real-time position of the second special effect changes continuously.
  • When the real-time position of the second special effect is in the core area of the first special effect, it indicates that the movement trajectory of the second special effect overlaps with the core area.
  • In this case, the first prompt information and/or the first prompt animation is output to inform the user that the second special effect has hit the first special effect precisely, that is, the drum beat of the audio has been accurately stepped on.
  • When the real-time position of the second special effect is in the edge area of the first special effect, it indicates that the movement trajectory of the second special effect overlaps only with the edge area but not with the core area; the second prompt information and/or the second prompt animation is then output to inform the user that the first special effect was hit, but not exactly on the drum beat of the audio.
  • When the real-time position of the second special effect never overlaps with the dynamic position of the first special effect, it indicates that the movement trajectory of the second special effect does not overlap with the first special effect, that is, the second special effect remains outside the first special effect; the third prompt information and/or the third prompt animation is then displayed to inform the user that the first special effect was not hit, that is, the drum beat of the audio was missed.
  • The number of times that the third prompt information or the third prompt animation is displayed can also be counted; when this count is greater than a preset threshold, it indicates that the user has missed the drum beats of the audio too many times, and the end information is displayed to inform the user that the game is over and that the special effect game needs to be started again.
  • The first prompt information, first prompt animation, second prompt information, second prompt animation, third prompt information and third prompt animation can be set according to user needs; for example, the first prompt information is "perfect", the second prompt information is "very good", and the third prompt information is "failure".
  • The preset threshold can also be set according to user requirements. For example, if the preset threshold is 2, the end information is displayed when the user misses the drum beat of the audio for the third time.
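  • The prompt selection and the end-of-game condition described above can be sketched as follows; the prompt strings and the threshold of 2 follow the examples in this embodiment, while the class and method names are illustrative assumptions.

```python
PROMPTS = {"core": "perfect", "edge": "very good", "miss": "failure"}

class PromptTracker:
    """Show the prompt matching each hit result and end the game once the
    third prompt (a miss) would be shown more times than the preset threshold."""

    def __init__(self, miss_threshold: int = 2):
        self.miss_threshold = miss_threshold
        self.miss_count = 0
        self.game_over = False

    def on_result(self, result: str) -> str:
        prompt = PROMPTS[result]
        if result == "miss":
            self.miss_count += 1
            if self.miss_count > self.miss_threshold:
                self.game_over = True
                return "game over"        # end information
        return prompt

tracker = PromptTracker(miss_threshold=2)
for r in ["core", "miss", "edge", "miss", "miss"]:
    print(tracker.on_result(r))   # perfect, failure, very good, failure, game over
```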
  • When the first special effect is in the designated area within the preset display area, it indicates that a drum beat of the audio has been reached.
  • At this time, it is determined whether the second special effect hits the edge area or the core area of the first special effect; that is, when the second special effect hits the first special effect in the designated area of the first page, it indicates that the user has stepped on a drum beat of the audio, i.e., the second special effect was controlled to hit the first special effect corresponding to that drum beat. Whenever a drum beat of the audio played on the first page is reached, a first special effect falls onto the designated area.
  • For example, if the position of the second special effect is in the core area of the first special effect, the "perfect" prompt information is output. Alternatively, when the second special effect hits the first special effect anywhere in the preset display area, it may also be considered that the second special effect has hit the first special effect.
  • When the real-time position of the second special effect is in the core area or the edge area, it means that the second special effect touches the first special effect corresponding to the drum beat, i.e., that the user has successfully stepped on the drum beat; in this case, a first preset audio is played, or the sound effect of the audio currently playing on the first page is enhanced, i.e., the sound of the drum beat of the currently playing audio is increased, so as to increase the user's interest and improve user satisfaction.
  • the first preset audio is a preset audio, for example, the first preset audio is a voice including "success".
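  • As an assumed sketch of the "enhance the drum beat sound" branch, the amplitude of the currently playing waveform could be boosted in a short window around the hit drum point; the window length and gain below are illustrative placeholders.

```python
import numpy as np

def enhance_drum_beat(waveform: np.ndarray, sample_rate: int,
                      drum_time_s: float, gain: float = 1.8,
                      window_s: float = 0.12) -> np.ndarray:
    """Boost the amplitude of the currently playing audio around one drum beat,
    as audible feedback that the user stepped on the drum point."""
    start = max(0, int((drum_time_s - window_s / 2) * sample_rate))
    end = min(len(waveform), int((drum_time_s + window_s / 2) * sample_rate))
    out = waveform.astype(np.float32)
    out[start:end] *= gain
    return np.clip(out, -1.0, 1.0)       # avoid clipping beyond full scale
```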
  • The electronic device may also, in response to the moving direction of the first target object in the target object image on the first page, control the third target object in the second special effect to be displayed correspondingly in different directions; that is, during the movement of the first target object on the first page, the third target object in the second special effect is controlled to move correspondingly according to the moving direction of the first target object.
  • the moving direction of the third target object may be the same as the moving direction of the first target object.
  • For example, the first target object is the user's nose, and the third target object is the eyes of the second special effect.
  • The style of the first special effect may also be changed; that is, the remaining playing duration of the audio on the first page is obtained, and the first special effect is displayed in different preset styles depending on the remaining playing duration.
  • Specifically, the remaining playback duration of the audio played on the first page is obtained, i.e., the time remaining until the audio finishes playing; the preset style corresponding to this remaining playback duration is looked up, and the first special effect is displayed in that preset style to remind the user of the progress of the audio playback, that is, to inform the user of the remaining game time and improve user satisfaction.
  • the preset style includes appearance styles such as color and shape of the first special effect.
  • For example, if the preset style includes the color of the first special effect, the remaining playback duration is 10 seconds, and the preset style corresponding to 10 seconds specifies that the color of the first special effect is purple, then the first special effects displayed during the remaining playing time turn purple.
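  • A minimal sketch of the style lookup by remaining playback duration; only the 10-second/purple pairing comes from the example above, and the other thresholds and colors are placeholders.

```python
def effect_style(remaining_s: float) -> dict:
    """Look up the preset appearance style of the first special effect from the
    remaining playback duration of the audio (thresholds/colors illustrative)."""
    if remaining_s <= 10:
        return {"color": "purple", "shape": "circle"}   # last 10 seconds turn purple
    if remaining_s <= 30:
        return {"color": "orange", "shape": "circle"}
    return {"color": "blue", "shape": "circle"}

print(effect_style(8))   # {'color': 'purple', 'shape': 'circle'}
```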
  • When the user moves the first target object, it indicates that the user has input a corresponding control operation; not only is the second special effect controlled to move according to the moving direction of the first target object, but the third target object in the second special effect is also controlled to move in that direction, realizing a double response to the control operation input by the user, making the input control operation clearer to the user, and increasing the interest.
  • When the user moves the first target object, the second special effect also moves correspondingly.
  • If the second special effect hits the first special effect in the designated area, the corresponding prompt information and/or animation is output according to the area that was hit, to inform the user whether the drum beat of the audio was stepped on accurately, that is, whether the rhythm was caught.
  • If the second special effect does not hit the first special effect in the designated area, that is, the second special effect misses the first special effect, corresponding prompt information and/or animation is also output to inform the user that the rhythm was not caught, realizing interaction with the user, promoting the user's willingness to operate, and increasing the fun.
  • In this embodiment, when a trigger action of the target object is detected, the audio to be played on the first page is determined, the audio is played on the first page, and the first special effect whose dynamic display attribute is related to the audio is displayed, so that the user can experience the game without sliding a finger on the screen to trigger the start of the game, achieving touchless triggering and improving the convenience of starting the game.
  • FIG. 7 is a structural block diagram of an object control device provided by an embodiment of the present disclosure.
  • the device includes: a display module 701 and a processing module 702 .
  • the display module 701 is configured to display a first page, wherein the user's target object image is displayed in real time on the first page;
  • the display module 701 is further configured to dynamically display a first special effect on the first page, wherein the dynamic display attribute of the first special effect is determined according to the rhythm information of the audio played on the first page;
  • a processing module 702 configured to control the second special effect displayed on the first page to move in real time in response to the movement of the first target object in the target object image of the first page;
  • the processing module 702 is further configured to instruct the display module 701 to display the third special effect according to the relationship between the real-time position of the second special effect and the dynamic position corresponding to the first special effect.
  • the dynamic display attribute of the first special effect includes determining a distance between the first special effects according to the rhythm information and/or determining a moving speed of the first special effect according to the rhythm information.
  • the rhythm information includes one or more of the following: rhythm type, beat information and drum beat information.
  • the display module 701 is further configured to:
  • the display module 701 is further configured to:
  • audio to be played is determined, and the audio to be played is played on the first page.
  • the display module 701 is further configured to:
  • the audio uploaded by the user is acquired as the audio to be played.
  • the rhythm types include fast rhythm types and slow rhythm types
  • the processing module 702 is also used for:
  • the rhythm type is determined according to the number of drum beats.
  • the drum beat information includes a drum beat position
  • the processing module 702 is also used for:
  • the target waveform segment includes the peak value of the waveform
  • the first special effect includes a core area and an edge area;
  • the third special effect includes prompt information and/or prompt animation;
  • the processing module 702 is also used for:
  • if the real-time position of the second special effect is in the core area, the display module 701 is instructed to display the first prompt information and/or the first prompt animation; if the real-time position of the second special effect is in the edge area, the display module 701 is instructed to display the second prompt information and/or the second prompt animation;
  • if there is no overlap between the real-time position of the second special effect and the dynamic position of the first special effect, the display module 701 is instructed to display the third prompt information and/or the third prompt animation; and/or, the number of times the third prompt information or the third prompt animation is displayed is counted, and when the number of times is greater than the preset threshold, the display module 701 is instructed to display the end information.
  • processing module 702 is further configured to:
  • processing module 702 is further configured to:
  • processing module 702 is further configured to:
  • the third target object in the second special effect is controlled to be displayed in corresponding different directions.
  • processing module 702 is further configured to:
  • the first special effect is displayed according to different preset styles according to the difference of the remaining playing time.
  • the device provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects thereof are similar, and details are not described herein again in this embodiment.
  • the electronic device 800 may be a terminal device or a server.
  • The terminal equipment may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDA), tablet computers (PAD), portable multimedia players (PMP) and in-vehicle terminals (such as in-vehicle navigation terminals), as well as fixed terminals such as digital TVs and desktop computers.
  • PDA Personal Digital Assistant
  • PAD Portable Android Device
  • PMP Portable Multimedia Player
  • the electronic device shown in FIG. 8 is only an example, and should not impose any limitation on the function and scope of use of the embodiments of the present disclosure.
  • The electronic device 800 may include a processing device (such as a central processing unit, a graphics processor, etc.) 801, which may execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage device 806 into a random access memory (RAM) 803.
  • ROM Read Only Memory
  • RAM Random Access Memory
  • In the RAM 803, various programs and data required for the operation of the electronic device 800 are also stored.
  • the processing device 801, the ROM 802, and the RAM 803 are connected to each other through a bus 804.
  • An Input/Output (I/O for short) interface 805 is also connected to the bus 804 .
  • Generally, the following devices can be connected to the I/O interface 805: an input device 806 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output device 807 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; a storage device 806 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 806.
  • Communication means 806 may allow electronic device 800 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 8 shows an electronic device 800 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from the network via the communication device 806, or from the storage device 806, or from the ROM 802.
  • the processing device 801 the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
  • the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Computer readable storage media may include, but are not limited to, electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable Programmable Read Only Memory (Erasable Programmable ROM, EPROM or Flash Memory), Optical Fiber, Portable Compact Disk ROM (CD-ROM), Optical Storage Device, Magnetic Storage Device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
  • the program code embodied on the computer-readable medium may be transmitted by any suitable medium, including but not limited to: electric wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the above.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or may exist alone without being assembled into the electronic device.
  • the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, causes the electronic device to execute the methods shown in the foregoing embodiments.
  • Computer program code for carrying out the operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • The remote computer can be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or can be connected to an external computer (for example, using an Internet service provider to connect via the Internet).
  • LAN Local Area Network
  • WAN Wide Area Network
  • Embodiments of the present disclosure also provide a computer program product, including a computer program, which, when executed, is used to implement the object control method provided by the embodiments of the present disclosure.
  • Each block in the flowchart or block diagrams may represent a module, a program segment, or a portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations can be implemented in dedicated hardware-based systems that perform the specified functions or operations , or can be implemented in a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments of the present disclosure may be implemented in a software manner, and may also be implemented in a hardware manner.
  • the name of the unit does not constitute a limitation of the unit itself under certain circumstances, for example, the first obtaining unit may also be described as "a unit that obtains at least two Internet Protocol addresses".
  • Exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), and the like.
  • FPGAs Field Programmable Gate Arrays
  • ASICs Application Specific Integrated Circuits
  • ASSP Application Specific Standard Product
  • SOC System on Chip
  • CPLD Complex Programmable Logic Device
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or devices, or any suitable combination of the foregoing.
  • machine-readable storage media would include one or more wire-based electrical connections, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), fiber optics, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • RAM random access memory
  • ROM read only memory
  • EPROM or flash memory erasable programmable read only memory
  • CD-ROM compact disk read only memory
  • In a first aspect, an object control method is provided, including:
  • displaying a first page, wherein a target object image of a user is displayed in real time on the first page;
  • dynamically displaying a first special effect on the first page, wherein the dynamic display attribute of the first special effect is determined according to the rhythm information of the audio played on the first page;
  • in response to movement of a first target object in the target object image of the first page, controlling a second special effect displayed on the first page to move in real time; and
  • displaying a third special effect according to the relationship between the real-time position of the second special effect and the dynamic position corresponding to the first special effect.
  • the dynamic display attribute of the first special effect includes determining a distance between the first special effects according to the rhythm information and/or determining a moving speed of the first special effect according to the rhythm information.
  • the rhythm information includes one or more of the following: rhythm type, beat information, and drum beat information.
  • the dynamically displaying the first special effect on the first page includes:
  • the method further includes:
  • audio to be played is determined, and the audio to be played is played on the first page.
  • the determining the audio to be played includes:
  • the audio uploaded by the user is acquired as the audio to be played.
  • the rhythm type includes a fast rhythm type and a slow rhythm type
  • the method also includes:
  • the rhythm type is determined according to the number of drum beats.
  • the drum beat information includes a drum beat position
  • the method also includes:
  • the target waveform segment includes the peak value of the waveform
  • the first special effect includes a core area and an edge area;
  • the third special effect includes prompt information and/or prompt animation;
  • the first prompt information and/or the first prompt animation is displayed;
  • the third prompt information and/or the third prompt animation is displayed; and/or, the number of times the third prompt information or the third prompt animation has been displayed is counted, and when the number of times is greater than a preset threshold, an end message is displayed.
  • the method further includes:
  • the method further includes:
  • the method further includes:
  • the third target object in the second special effect is controlled to be displayed in corresponding different directions.
  • the method further includes:
  • the first special effect is displayed in different preset styles according to the remaining playing time.
  • an object control device comprising:
  • a display module configured to display a first page, wherein the user's target object image is displayed in real time on the first page;
  • the display module is further configured to dynamically display a first special effect on the first page, wherein the dynamic display attribute of the first special effect is determined according to the rhythm information of the audio played on the first page;
  • a processing module configured to control the second special effect displayed on the first page to move in real time in response to the movement of the first target object in the target object image of the first page;
  • the processing module is further configured to instruct the display module to display the third special effect according to the relationship between the real-time position of the second special effect and the dynamic position corresponding to the first special effect.
  • the dynamic display attribute of the first special effect includes determining a distance between the first special effects according to the rhythm information and/or determining a moving speed of the first special effect according to the rhythm information.
  • the rhythm information includes one or more of the following: rhythm type, beat information, and drum beat information.
  • the display module is further used for:
  • the display module is further used for:
  • audio to be played is determined, and the audio to be played is played on the first page.
  • the display module is further used for:
  • the audio uploaded by the user is acquired as the audio to be played.
  • the rhythm type includes a fast rhythm type and a slow rhythm type
  • the processing module is also used for:
  • the rhythm type is determined according to the number of drum beats.
  • the drum beat information further includes a drum beat position
  • the processing module is also used for:
  • the target waveform segment includes the peak value of the waveform
  • the first special effect includes a core area and an edge area;
  • the third special effect includes prompt information and/or prompt animation;
  • the processing module is also used for:
  • the display module is instructed to display the first prompt information and/or the first prompt animation;
  • the processing module is further configured to:
  • the processing module is further configured to:
  • the processing module is further configured to:
  • the third target object in the second special effect is controlled to be displayed in corresponding different directions.
  • the processing module is further configured to:
  • the first special effect is displayed in different preset styles according to the remaining playing time.
  • an electronic device comprising: at least one processor and a memory;
  • the memory stores computer-executable instructions
  • the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the object control method described in the first aspect and various possible designs of the first aspect above.
  • a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the object control method described in the first aspect and the various possible designs of the first aspect above is implemented.
  • a computer program product including a computer program which, when executed by a processor, implements the object control method described in the first aspect and the various possible designs of the first aspect above.
  • a computer program that, when executed by a processor, implements the object control method described in the first aspect and the various possible designs of the first aspect above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure provide an object control method and device. The method includes: displaying a first page, where a target object image of a user is displayed on the first page in real time; dynamically displaying a first special effect on the first page, where a dynamic display attribute of the first special effect is determined according to rhythm information of audio played on the first page; in response to movement of a first target object in the target object image of the first page, controlling a second special effect displayed on the first page to move in real time; and displaying a third special effect according to a relationship between a real-time position of the second special effect and a dynamic position corresponding to the first special effect. This allows the user to interact contact-free, following the rhythm of the music: the second special effect can be controlled without sliding a finger on the screen, which improves the convenience of control and makes the interaction more engaging.

Description

对象控制方法及设备
相关申请的交叉引用
本申请要求于2021年03月15日提交的申请号为202110277940.5、名称为“对象控制方法及设备”的中国专利申请的优先权,此申请的内容通过引用并入本文。
技术领域
本公开实施例涉及计算机技术领域,尤其涉及一种对象控制方法及设备。
背景技术
随着终端技术的不断发展,为了满足用户的娱乐需求以及锻炼用户的注意力,移动控制类的特效游戏得到快速发展。
目前,当用户在移动终端上使用移动控制类的特效游戏时,特效游戏播放相应的音频,并随机出现多个第一对象(例如,球),用户通过手指在屏幕上进行滑动,以控制第二对象(例如,某个动物)进行移动,即进行游戏交互,从而使得第二对象可以碰触到第一对象。
然而,由于需要用户通过手指在屏幕上滑动控制第一对象进行移动,控制方式不便捷,尤其是当用户双手不方便或者屏幕过大时,用户便无法使用手指控制第一对象移动,从而导致用户无法使用移动控制类的特效游戏,同时,由于特效游戏播放的音频仅是背景音乐,与特效游戏的其它对象没有关联,导致用户无法根据音频进行游戏交互。
发明内容
本公开实施例提供一种对象控制方法及设备,以解决现有技术中控制方式不便捷以及无法根据音频进行游戏交互的技术问题。
第一方面,本公开实施例提供一种对象控制方法,包括:
显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,显示第三特效。
第二方面,本公开实施例提供一种对象控制设备,包括:
显示模块,用于显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
所述显示模块,还用于在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
处理模块,用于响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
所述处理模块,还用于根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,显示第三特效。
第三方面,本公开实施例提供一种电子设备,包括:至少一个处理器和存储器。
所述存储器存储计算机执行指令。
所述至少一个处理器执行所述存储器存储的计算机执行指令,使得所述至少一个处理器执行如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第四方面,本公开实施例提供一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第五方面,本公开实施例提供一种计算机程序产品,包括计算机程序,所述计算机程序被处理器执行时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第六方面,本公开实施例提供一种计算机程序,所述计算机程序被处理器执行时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
本公开实施例提供了一种对象控制方法及设备,该方法通过利用根据第一页面播放的音频的节奏信息确定的动态显示属性,在第一页面上动态显示第一特效,即根据动态显示属性,控制第一特效进行移动,并在检测到第一页面上的目标对象图像中的第一目标对象移动时,表明用户的第一目标对象进行了移动,即输入了控制操作,则控制第二特效进行相应的移动,实现了第二特效的移动控制;无需手指在屏幕上滑动即可控制第二特效移动,提高了控制的便捷度,从而使用户可以成功地使用移动控制类的特效游戏。在控制第二特效移动的过程中,根据第二特效与第一特效之间的位置关系,显示第三特效,以告知用户其输入的控制操作所带来的游戏效果;且由于第一特效的动态显示属性是根据音频真实的节奏信息确定的,即第一页面播放的音频并不仅是背景音乐,其与第一特效存在关联,用户可以根据音频的节奏输入相应的控制操作以使第二特效碰触到第一特效,实现了用户根据音乐节奏进行无触点互动,增加交互的趣味性,同时,可以增强用户的节奏感,提升用户使用时的真实感,提高用户使用满意度。
附图说明
为了更清楚地说明本公开实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本公开的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1为本公开实施例提供的手指控制特效的示意图;
图2为本公开实施例提供的对象控制方法的流程示意图一;
图3为本公开实施例提供的对象控制的示意图一;
图4为本公开实施例提供的对象控制的示意图二;
图5为本公开实施例提供的对象控制方法的流程示意图二;
图6为本公开实施例提供的放大目标对象的示意图;
图7为本公开实施例提供的对象控制设备的结构框图;
图8为本公开实施例提供的电子设备的硬件结构示意图。
具体实施方式
为使本公开实施例的目的、技术方案和优点更加清楚,下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
目前,特效游戏播放相应的音频,并随机出现多个第一对象(如图1中所示的球)。用户通过手指在屏幕上进行滑动,以控制第二对象(如图1中所示的)进行移动,即进行游戏交互,从而使得第二对象可以碰触到第一对象,例如,当手指由左至右滑动时,猫咪也由左至右移动。然而,由于需要用户通过手指在屏幕上滑动控制第一对象进行移动,控制方式不便捷,尤其是当用户双手不方便或者由于屏幕过大,用户需要双手持握移动终端时,用户便无法使用手指控制第一对象移动,从而导致用户无法使用移动控制类的特效游戏,同时,由于特效游戏播放的音频仅是背景音乐,与特效游戏的其它对象没有关联,导致用户无法根据音频进行游戏交互。
为了解决上述问题,本公开的技术构思是在用户使用游戏时,电子设备对用户进行拍摄,以在第一页面上显示该用户的目标对象图像,用户可以通过眨眼,点头等动作表明用户输入了开始操作,即需要启动第一特效的动态显示;开始之后,用户可以左右移动头部,例如,左右移动鼻子时,响应于该移动,控制第二特效也进行左右移动,从而精准碰撞到第一页面上动态显示的第一特效,提高了控制的便捷度。同时,第一特效的动态显示属性是根据音频真实的节奏信息确定的,即第一页面播放的音频与第一特效存在关联,用户可以根据音频的节奏输入相应的控制操作以使第二特效碰触到第一特效,实现了用户根据音乐节奏进行无触点互动,并且可以增强用户的节奏感,提升用户使用时的真实感,提高用户使用满意度。
下面以具体地实施例对本公开的技术方案进行详细说明。下面这几个具体的实施例可以相互结合,对于相同或相似的概念或过程可能在某些实施例不再赘述。
参考图2,图2为本公开实施例提供的对象控制方法的流程示意图一。本实施例的方法可以应用在电子设备上,例如,智能手机、掌上电脑、平板电脑等移动终端、计算机设备(如,台式机、笔记本电脑、一体机等)等,如图2所示,该对象控制方法包括:
S201、显示第一页面,其中第一页面上实时显示用户的目标对象图像。
在本公开实施例中,当用户想要使用移动控制类的特效游戏时,可以打开电子设备上的相关应用程序,该应用程序显示第一页面,该第一页面可以实时显示电子设备采集的用户的目标对象图像,即在显示第一页面后,电子设备对应的摄像头实时采集用户的图像,并将其实时显示在第一页面上。该第一页面也可以显示与特效游戏相关的特效(例如,第一特效、第二特效、第三特效等)。
其中,目标对象图像为摄像头实时采集的用户面部图像,其他身体部位的图像,或宠物图像等。
其中,电子设备对应的摄像头可以为集成在电子设备上的摄像头(例如,前置摄像头),也可以为电子设备外接的摄像头,即该摄像头并未集成在电子设备上,但与电子设备连接,本公开不对其进行限制。
S202、在第一页面上动态显示第一特效,其中,第一特效的动态显示属性是根据第一页面播放的音频的节奏信息确定的。
在本公开实施例中,当游戏开始后,在第一页面上播放其对应的待播放音频,即播放相应的音频(例如,某首歌曲),并按照根据该音频的节奏信息确定的动态显示属性,即第一特效的显示样式,在第一页面上动态显示该音频对应的第一特效,即在第一页面上不断移动第一页面所显示的各个第一特效。
进一步,可选的,在动态显示第一特效时,可以按照第一预设移动方向,在第一页面上的预设显示区域内动态显示第一特效,其中,第一预设移动方向包括以下至少一种:从上至下方向、从下至上方向、从左到右方向和从右至左方向。
举例来说,第一预设移动方向为从上至下方向,则在播放音频的过程中,在第一页面上的预设显示区域内不断出现新的第一特效,且预设显示区域内的第一特效由上至下移动,如图3中(a)所示,预设显示区域内仅有一个第一特效,在经过一定时间后,一个新的第一特效由上之下移动至预设显示区域内,即预设显示区域内存在两个第一特效,如图3中(b)所示。
可选的,预设显示区域可以根据实际情况进行设置,例如,其为第一页面的中心区域。
可选的,第一特效的动态显示属性包括根据节奏信息确定第一特效之间的间距和/或根据节奏信息确定第一特效的移动速度。
其中,第一特效之间的间距表示两个第一特效之间的距离,例如,如图3中(b)所示的两个第一特效之间的距离。
进一步的,可选的,节奏信息包括以下一种或多种:节奏类型,节拍信息和鼓点信息。
其中,节奏类型包括快节奏类型和慢节奏类型。当音频的节奏类型为快节奏类型时,表明该音频的节奏较快,存在较多鼓点。当音频的节奏类型为慢节奏类型时,表明该音频的节奏较慢,鼓点较少。
其中,鼓点信息包括鼓点数量和/或鼓点位置。具体的,鼓点数量表示音频所包括的鼓点的数量。鼓点位置表示鼓点在音频中所处的时间位置,例如,鼓点位置包括3秒和10秒,即表明在音频的第3秒处存在一个鼓点,在音频的第10秒处也存在一个鼓点。
其中,节拍信息表示音频所包括的节拍类型,例如,节拍类型包括四二拍类型。
可选的,鼓点与第一特效一一对应,第一特效之间的间距与鼓点位置相关,即与相邻两个的鼓点位置之间的距离相关。例如,鼓点1的位置为3秒,鼓点2的位置为5秒,且鼓点1与鼓点2相邻,鼓点1和鼓点2之间的距离为2秒,则查找2秒对应的间距,并将其确定为鼓点1对应的第一特效与鼓点2对应的第二特效之间的间距为该2秒对应的间距。
另外,可选的,鼓点可以用于表示音频的声音强度特征,鼓点又可以称为重拍,其位于音频中幅度大的音频帧,因此,音频的鼓点位置可以根据音频波形进行确定,其具体过程包括:
获取待播放的音频对应的波形,并从待播放的音频对应的波形中提取目标波形段,其中,目标波形段包括待播放的音频对应的波形的峰值。将目标波形段与预设鼓点波形段进行匹配,以确定待播放的音频的鼓点位置。
具体的,从待播放的音频对应的波形中提取包括峰值的目标波形段,将该目标波形段与预设鼓点波形段进行匹配,即对于每个目标波形段,计算该目标波形段与预设鼓点波形段之间的相似度,当该相似度大于预设相似度时,表明该目标波形段中的波峰所对应的时间为一个鼓点位置。
其中,预设鼓点波形段为预先采集的鼓点音频的波形段。
另外,可选的,在确定待播放的音频的鼓点时,也可以采用其它方式进行确定,例如,利用音频的波形中的振幅进行确定,本公开不对其进行限制。
可选的,音频的节奏类型可以根据音频的鼓点数量进行确定,其具体过程为:确定待播放的音频对应的鼓点数量,即获取第一页面需播放的音频所包括的鼓点数量。根据鼓点数量确定节奏类型,即当该鼓点数量大于预设数量值时,表明音频的鼓点数量较多,音频的节奏较快,则确定该音频的节奏类型为快节奏类型。当该鼓点数量小于或等于预设数量值时,表明音频的鼓点数量较少,音频的节奏较慢,则确定该音频的节奏类型为慢节奏类型。
可选的,还可以根据音频的节奏类型确定第一特效的移动速度,当节奏类型为快节奏类型时,表明音频的鼓点数量较多,需要掉落的第一特效的数量较多,因此,可以按照较快的速度,即按照第一预设移动速度,移动第一特效;当节奏类型为慢节奏类型时,表明音频的鼓点数量较少,需要掉落的第一特效的数量较少,因此,可以按照较慢的速度,即按照第二预设移动速度,移动第一特效。
另外,可选的,快节奏类型和慢节奏类型所对应的移动速度也可以是相同的,即所有音频对应的第一特效均按照预设移动速度进行移动。
可以理解,无论不同节奏类型的音频所对应的移动速度是否相同,在到达该音频的一个鼓点时,对应该鼓点的第一特效需移动至指定区域,以实现第一特效的准确移动。
另外,可选的,也可以利用音频的节拍信息确定音频的节奏类型,其过程与上述根据鼓点数量确定音频的节奏类型的过程类似,在此,不对其进行赘述。
S203、响应于第一页面的目标对象图像中的第一目标对象的移动,控制第一页面上显示的第二特效进行实时移动。
在本公开实施例中,在检测到第一页面的目标对象图像中的第一目标对象进行移动时,表明用户输入了相应的控制操作,即需要控制第二特效移动,则控制第二特效进行实时移动,以响应用户输入的控制操作,实现无触点控制第二特效移动,即无需用户通过手指在屏幕上进行滑动来控制第二特效移动。
可选的,在控制第二特效进行移动时,根据第一目标对象的移动方向进行移动,例如,第一目标对象的移动方向为由左至右方向,则第二特效由左至右进行移动。
其中,第一目标对象包括用户的鼻子和手部等部位,也可以是其它部位,例如,眼睛、头部等,本公开不对其进行限制。
S204、根据第二特效的实时位置与第一特效对应的动态位置之间的关系,显示第三特效。
在本公开实施例中,在第二特效移动的过程中,第二特效的位置实时变化,可以根据第二特效的实时位置与第一特效对应的动态位置之间的关系确定第二特效是否触碰到第一特效, 即是否击中第一特效,并显示相应的提示特效,即第三特效,以实现与用户的互动,促进用户的操作意愿,有效提高了趣味性,从而提高用户使用满意度。
可选的,第一特效包括核心区域(如图3中的(a)或(b)所示的第一特效中的圆形区域)和边缘区域(如图3中的(a)或(b)所示的第一特效中除圆形以外的区域)。当到达音频的一个鼓点时,该鼓点对应的第一特效移动至指定区域,该第一特效中的核心区域对应精准的鼓点位置,即当用户控制第二特效击中该核心区域时,表明用户精准踩到鼓点,即精准卡到节奏。该第一特效中的边缘区域对应接近鼓点的位置,即当用户控制第二特效击中该边缘区域时,表明用户未精准踩到鼓点,即未精准卡到节奏。
另外,可选的,上述图3的(a)或(b)所示的第一特效仅为一种示例,第一特效也可以为其它样式的特效,例如,第一特效为箭头、圆等,本公开不对其进行限制。
可选的,第二特效可以为一个带有眼睛的球状物,如图4中所示的第二特效,当然,第二特效也可以为其它样式的特效,例如,第二特效为狗等动物,本公开不对其进行限制。
可选的,第三特效包括提示信息和/或提示动画。其中,提示信息包括提示文字、提示图画等信息。
以一个具体应用场景为例,如图4所示,在游戏开始后,第一页面播放音频1,第一特效10由上至下进行移动,在检测到用户通过由左至右移动头部,即鼻子时,控制第二特效20也由左至右进行移动,当到达音频1的一个鼓点时,该鼓点对应的第一特效10移动至指定区域1,第二特效20处于第一特效的核心区域,表明第二特效20准确击中第一特效,即用户精准卡到节奏,则输出“完美”的提示信息。
在本公开实施例中,按照音频真实的鼓点位置确定第一特效之间的间距,即按照真实的节奏排列第一特效,以使第一页面在播放该音频时,可以动态显示按照该音频真实的节奏排列的第一特效,从而可以使用户按照真实的节奏精准碰撞第一特效,增强用户的节奏感,并增强体验的真实性,带给用户沉浸式体验。
在本公开实施例中,在用户移动第一目标对象的过程中,第一页面中目标对象图像中的第一目标对象也在实时进行该移动,电子设备通过对该第一目标对象进行追踪,以确定该第一目标对象的移动方向,从而控制与该第一目标对象绑定的第二特效也按照该移动方向进行控制,用户只需移动第一目标对象便可以实现对第二特效的移动控制,无需用户接触电子设备的屏幕,从而实现第二特效的无触点控制,提高控制的便捷度。
从上述描述可知,利用根据第一页面播放的音频的节奏信息确定的动态显示属性,在第一页面上动态显示第一特效,即根据动态显示样属性,控制第一特效进行移动,并在检测到第一页面上的目标对象图像中的第一目标对象移动时,表明用户的第一目标对象进行了移动,即输入了控制操作,则控制第二特效进行相应的移动,实现了第二特效的移动控制;无需手指在屏幕上滑动即可控制第二特效移动,提高了控制的便捷度,从而使用户可以成功地使用移动控制类的特效游戏。在控制第二特效移动的过程中,根据第二特效与第一特效之间的位置关系,显示第三特效,以告知用户其输入的控制操作所带来的游戏效果;且由于第一特效的动态显示属性是根据音频真实的节奏信息确定的,即第一页面播放的音频并不仅是背景音乐,其与第一特效存在关联,用户可以根据音频的节奏输入相应的控制操作以使第二特效碰触到第一特效,实现了用户根据音乐节奏进 行无触点互动,增加交互的趣味性,同时,可以增强用户的节奏感,提升用户使用时的真实感,提高用户使用满意度。
参考图5,图5为本公开实施例提供的对象控制方法流程示意图二。在图2实施例的基础上,用户还可以通过不与屏幕接触开始游戏,以实现无触点触发,下面将结合一个具体实施例如何进行无触点触发游戏开始的过程进行详细描述,如图5所示,该方法包括:
S501、显示第一页面,其中第一页面上实时显示用户的目标对象图像。
在本公开实施例中,当第一页面显示电子设备采集的目标对象图像时,便对第一页面中目标对象图像中的第二目标对象进行放大处理,即对第一页面内的用户的某个部位进行放大处理,可以使用户更直观地观察到第一目标对象的移动,提高控制的准确度,并且可以增加趣味性。
举例来说,第二目标对象为用户的头部,当未对用户的头部进行放大处理时,用户的头部为正常大小(如图6中的(a)所示),当对用户的头部进行放大处理后,用户的头部变大(如图6中的(b)所示),以实现大头特效。
S502、响应于第一页面的目标对象图像中的触发动作,确定待播放的音频,并在第一页面播放待播放的音频。
在本公开实施例中,在检测到第一页面中目标对象图像中的用户在进行触发动作时,表明用户输入触发游戏开始的操作,需要开始游戏,则确定在第一页面上需播放的音频,即确定第一页面对应的待播放的音频。
可选的,触发动作可以包括头部上下移动、左右移动,手部左右移动、上下移动,眼睛的眨眼动作,鼻子上下移动、左右移动等。
可以理解,当鼻子移动时,实际是用户在进行头部移动,例如,当用户的头部左右移动时,鼻子也左右移动。
在本公开实施例中,可选的,在确定第一页面对应的待播放的音频时,可以通过以下几种方式进行确定。
一种方式为,从预设音频列表中随机选取音频作为待播放的音频,即电子设备从预设音频列表,即预设音频库中随机选取一个音频,并将其确定为第一页面对应的待播放的音频。
另一种方式为,获取用户从预设音频列表中选取的音频作为待播放的音频。即用户可以在提供的预设音频列表中选取一个音频,电子设备将用户选取的音频确定为第一页面对应的待播放的音频。
另一种方式为,获取用户上传的音频作为待播放的音频。即当预设音频列表中未存在用户想要播放的音频,用户还可以上传其想要的音频,电子设备将用户上传的音频确定为第一页面对应的待播放的音频。
另外,可选的,预设音频列表中的音频所对应的节奏信息都已经预先确定,即第一特效的动态显示属性已经预先确定,在游戏开始时,第一特效可以直接以该动态显示属性进行移动。当第一页面对应的待播放的音频为用户上传的音频时,电子设备在获取到用户上传的音频时,对其进行鼓点检测,以确定相应的节奏信息,从而根据该节奏信息确定该音频对应的第一特效的动态显示属性。
S503、在第一页面上动态显示第一特效,其中,第一特效的动态显示属性是根据第一页面播放的音频的节奏信息确定的。
在本公开实施例中,在游戏开始后,便在第一页面上不断显示动态显示属性与第一页面对应的待播放的音频的节奏信息相匹配的第一特效,即每到达音频的一个鼓点时,便有一个第一特效移动至第一页面上的指定区域上。
S504、响应于第一页面的目标对象图像中的第一目标对象的移动,控制第一页面上显示的第二特效进行实时移动。
S505、根据第二特效的实时位置与第一特效对应的动态位置之间的关系,显示第三特效。
在本公开实施例中,根据第二特效的实时位置与第一特效对应的动态位置之间的关系,确定第二特效是否碰触到第一特效,即确定通过用户某个部位控制的第二特效是否击中了第一特效,从而显示相应的反馈特效,即第三特效,其具体过程为:在所述第一特效在所述第一页面的预设显示区域内动态移动过程中,若第二特效的实时位置处于核心区域,则显示第一提示信息和/或第一提示动画。若第二特效的实时位置处于边缘区域,则显示第二提示信息和/或第二提示动画。若第二特效的实时位置与第一特效的动态位置持续未存在重叠部分,则显示第三提示信息和/或第三提示动画。
在本公开实施例中,当第一特效在第一页面内的预设显示区域内动态移动的过程中,用户可以按照第二预设移动方向(例如,从左至右)移动第一目标对象,电子设备响应于该移动,控制第二特效也按照第二预设移动方向进行移动。在第二特效移动的过程中,第二特效的实时位置不断发生变化,当第二特效的实时位置处于第一特效的核心区域时,表明第二特效的移动轨迹与该核心区域存在重叠部分,即表明第一特效击中了第一特效的核心区域,则输出第一提示信息和/或第一提示动画,以告知用户其精准击中了第一特效,即精准踩到了音频的鼓点。当第二特效的实时位置处于第一特效的边缘区域时,表明第二特效的移动轨迹仅与该边缘区域存在重叠部分,而并未与核心区域存在重叠部分,则输出第二提示信息和/或第二提示动画,以告警用户其击中了第一特效,但并未精准踩到音频的鼓点。当第二特效的实时位置持续未与第一特效的动态位置存在重叠部分时,表明第二特效的移动轨迹未与第一特效存在重叠部分,即表明第二特效持续处于第一特效以外的位置,则显示第三提示信息和/或第三提示动画,以告知用户其并未击中第一特效,即并未踩到音频的鼓点。
进一步的,可选的,还可以统计显示第三提示信息或第三提示动画的次数,并在次数大于预设阈值时,表明用户未踩到音频的鼓点的次数较多,即多次未卡到节奏,则显示结束信息,以告知用户游戏结束,用户需重新开始体验特效游戏。
其中,第一提示信息、第一提示动画、第二提示信息、第二提示动画、第三提示信息和第三提示动画,可以根据用户需求进行设置,例如,第一提示信息为“完美”,第二提示信息为“很好”,第三提示信息为“失败”。同时,预设阈值也可以根据用户需求进行设置,例如,预设阈值为2次,即当用户第三次未踩到音频的鼓点时,便显示结束信息。
可以理解,当第一特效处于预设显示区域中的指定区域时,表明到了音频的一个鼓点,此时若第二特效的实时位置处于第一特效的边缘区域或核心区域,则确定第二特效击中了第一特效的边缘区域或核心区域,即当第二特效击中了处于第一页面的指定区域上的第一特效时,表明用户踩到了音频的鼓点,即控制第一特效击中了一个鼓点对应的第一特效,当到达第一页面播放的音频的一个鼓点时,一个第一特效落在指定区域上, 此时,第二特效的位置处于该第一特效的核心区域上,则输出“完美”的提示信息。或者,当第二特效击中处于预设显示区域中的第一特效时,便可以认为第二特效击中了第一特效。
另外,可选的,若第二特效的实时位置处于核心区域或边缘区域,表明第二特效碰到了鼓点对应的第一特效,即表明用户成功踩到鼓点,则播放第一预设音频或增强第一页面当前播放的音频的音效,即增大第一页面当前播放的音频的鼓点的声音,以增加用户使用的趣味性,提高用户使用满意度。
其中,第一预设音频为预先设置的音频,例如,第一预设音频为包含“成功”的语音。
另外,可选的,电子设备还可以响应于第一页面的目标对象图像中的第一目标对象的移动方向,控制第二特效中的第三目标对象进行相应的不同方向显示,即在第一目标对象移动的过程中,按照第一目标对象的移动方向,控制第二特效中的第三目标对象进行相应的移动。
其中,第三目标对象的移动方向可以与第一目标对象的移动方向相同,例如,第一目标对象为用户的鼻子,第三目标对象为眼睛,在用户由左至右移动头部的过程中,用户的鼻子也由左至右进行移动,则控制第二特效中的眼睛也由左至右移动。
在任意公开实施例中,另外,可选的,在第一页面上显示第一特效的过程中,还可以改变第一特效的样式,即获取第一页面音频的剩余播放时长。根据剩余播放时长的不同,按照不同预设样式显示第一特效。
具体的,获取第一页面播放的音频对应的剩余播放时长,即获取距离音频播放结束的时长,查找该剩余播放时长对应的预设样式,并根据该预设样式显示第一特效,以提示用户音频播放的进度,即告知用户剩余的游戏时间,提高用户的使用满意度。
其中,预设样式包括第一特效的颜色、形状等外观样式。例如,预设样式包括第一特效的颜色,剩余播放时长为10秒,10秒对应的预设样式,即第一特效的颜色为紫色,则在剩余播放时长内,显示的第一特效的颜色变为紫色。
在本公开实施例中,在用户移动第一目标对象的过程中,表明用户输入了相应的控制操作,不仅控制第一特效按照第一目标对象的移动方向进行移动,还可以控制第二特效中的第三目标对象按照第一目标对象的移动方向进行移动,实现对用户输入的控制操作的双重响应,使用户更加清楚地获知其所输入的控制操作,并且增加趣味性。
在本公开实施例中,在用户移动第一目标对象的过程中,第二特效也进行相应的移动,当第二特效击中了处于指定区域的第一特效时,根据第二特效击中的部分,输出相应的提示信息和/或动画,以告知用户其是否精准踩到了音频的鼓点,即卡到节奏,当第二特效未击中处于指定区域的第一特效时,即第二特效未碰触到某个鼓点对应的第一特效是,也输出相应的提示信息和/或动画,以告知用户其未卡到节奏,实现与用户的互动,促进用户的操作意愿,增加趣味性。
在本公开实施例中,在检测到第一页面中目标对象图像中的用户进行触发动作后,表明用户输入了触发游戏开始的操作,用户想要开始游戏,则确定第一页面需播放的音频,以在第一页面上播放该音频,并显示动态显示属性与该音频相关的第一特效,从而使用户可以体验游戏,无需用户通过手指在屏幕上滑动以触发游戏开始,实现无触点触发,提高游戏触发的便捷度。
对应于上文实施例的对象控制方法,图7为本公开实施例提供的对象控制设备的结构框图。为了便于说明,仅示出了与本公开实施例相关的部分。参照图7,所述设备包括:显示模块701和处理模块702。
其中,显示模块701,用于显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
所述显示模块701,还用于在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
处理模块702,用于响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
所述处理模块702,还用于根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,指示所述显示模块701显示第三特效。
在本公开的一个实施例中,所述第一特效的动态显示属性包括根据所述节奏信息确定第一特效之间的间距和/或根据所述节奏信息确定第一特效的移动速度。
在本公开的一个实施例中,所述节奏信息包括以下一种或多种:节奏类型,节拍信息和鼓点信息。
在本公开的一个实施例中,所述显示模块701还用于:
按照第一预设移动方向,在所述第一页面上的预设显示区域内动态显示所述第一特效,其中,所述第一预设移动方向包括以下至少一种:从上至下方向、从下至上方向、从左到右方向和从右至左方向。
在本公开的一个实施例中,所述显示模块701还用于:
响应于所述第一页面的所述目标对象图像中的触发动作,确定待播放的音频,并在所述第一页面播放所述待播放的音频。
在本公开的一个实施例中,所述显示模块701还用于:
从预设音频列表中随机选取音频作为所述待播放的音频;
或者,
获取用户从所述预设音频列表中选取的音频作为所述待播放的音频;
或者,
获取用户上传的音频作为所述待播放的音频。
在本公开的一个实施例中,所述节奏类型包括快节奏类型和慢节奏类型;
所述处理模块702还用于:
确定待播放的音频对应的鼓点数量;
根据所述鼓点数量确定所述节奏类型。
在本公开的一个实施例中,所述鼓点信息包括鼓点位置;
所述处理模块702还用于:
获取待播放的音频对应的波形,并从所述波形中提取目标波形段,其中,所述目标波形段包括所述波形的峰值;
将所述目标波形段与预设鼓点波形段进行匹配,以确定待播放的音频的鼓点位置。
在本公开的一个实施例中,所述第一特效包括核心区域和边缘区域;所述第三特效包括提示信息和/或提示动画;
所述处理模块702还用于:
在所述第一特效在所述第一页面的预设显示区域内动态移动过程中,若所述第二特效的实时位置处于所述核心区域,则指示所述显示模块701显示第一提示信息和/或第一提示动画;
若所述第二特效的实时位置处于所述边缘区域,则指示所述显示模块701显示第二提示信息和/或第二提示动画;
若所述第二特效的实时位置与所述第一特效的动态位置持续未存在重叠部分,则指示显示模块显示第三提示信息和/或第三提示动画;和/或,统计显示第三提示信息或第三提示动画的次数,并在所述次数大于预设阈值时,指示所述显示模块701显示结束信息。
在本公开的一个实施例中,所述处理模块702还用于:
若所述第二特效的实时位置处于所述核心区域或所述边缘区域,则播放第一预设音频或增强所述第一页面当前播放的音频的音效。
在本公开的一个实施例中,所述处理模块702还用于:
对所述第一页面中所述目标对象图像中的第二目标对象进行放大处理。
在本公开的一个实施例中,所述处理模块702还用于:
响应于所述第一页面的所述目标对象图像中的第一目标对象的移动方向,控制所述第二特效中的第三目标对象进行相应的不同方向显示。
在本公开的一个实施例中,所述处理模块702还用于:
获取所述第一页面音频的剩余播放时长;
根据剩余播放时长的不同,按照不同预设样式显示所述第一特效。
本实施例提供的设备,可用于执行上述方法实施例的技术方案,其实现原理和技术效果类似,本实施例此处不再赘述。
参考图8,其示出了适于用来实现本公开实施例的电子设备800的结构示意图,该电子设备800可以为终端设备或服务器。其中,终端设备可以包括但不限于诸如移动电话、笔记本电脑、数字广播接收器、个人数字助理(Personal Digital Assistant,简称PDA)、平板电脑(Portable Android Device,简称PAD)、便携式多媒体播放器(Portable Media Player,简称PMP)、车载终端(例如车载导航终端)等等的移动终端以及诸如数字TV、台式计算机等等的固定终端。图8示出的电子设备仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图8所示,电子设备800可以包括处理装置(例如中央处理器、图形处理器等)801,其可以根据存储在只读存储器(Read Only Memory,简称ROM)802中的程序或者从存储装置806加载到随机访问存储器(Random Access Memory,简称RAM)803中的程序而执行各种适当的动作和处理。在RAM 803中,还存储有电子设备800操作所需的各种程序和数据。处理装置801、ROM 802以及RAM 803通过总线804彼此相连。输入/输出(Input/Output,简称I/O)接口805也连接至总线804。
通常,以下装置可以连接至I/O接口805:包括例如触摸屏、触摸板、键盘、鼠标、摄像头、麦克风、加速度计、陀螺仪等的输入装置806;包括例如液晶显示器(Liquid Crystal Display,简称LCD)、扬声器、振动器等的输出装置807;包括例如磁带、硬盘等的存储装置806;以及通信装置806。通信装置806可以允许电子设备800与其他设备进行无线或有线通信以交换数据。虽然图8示出了具有各种装置的电子设备800,但是应理解的是,并不要求实施或具备所有示出的装置。可以替代地实施或具备更多或更少的装置。
特别地,根据本公开的实施例,上文参考流程图描述的过程可以被实现为计算机软件程序。例如,本公开的实施例包括一种计算机程序产品,其包括承载在计算机可读介质上的计算机程序,该计算机程序包含用于执行流程图所示的方法的程序代码。在这样的实施例中,该计算机程序可以通过通信装置806从网络上被下载和安装,或者从存储装置806被安装,或者从ROM 802被安装。在该计算机程序被处理装置801执行时,执行本公开实施例的方法中限定的上述功能。
需要说明的是,本公开上述的计算机可读介质可以是计算机可读信号介质或者计算机可读存储介质或者是上述两者的任意组合。计算机可读存储介质例如可以是——但不限于——电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子可以包括但不限于:具有一个或多个导线的电连接、便携式计算机磁盘、硬盘、随机访问存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(Erasable Programmable ROM,EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(Compact Disk ROM,CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。在本公开中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。而在本公开中,计算机可读信号介质可以包括在基带中或者作为载波一部分传播的数据信号,其中承载了计算机可读的程序代码。这种传播的数据信号可以采用多种形式,包括但不限于电磁信号、光信号或上述的任意合适的组合。计算机可读信号介质还可以是计算机可读存储介质以外的任何计算机可读介质,该计算机可读信号介质可以发送、传播或者传输用于由指令执行系统、装置或者器件使用或者与其结合使用的程序。计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于:电线、光缆、射频(Radio Frequency,RF)等等,或者上述的任意合适的组合。
上述计算机可读介质可以是上述电子设备中所包含的;也可以是单独存在,而未装配入该电子设备中。
上述计算机可读介质承载有一个或者多个程序,当上述一个或者多个程序被该电子设备执行时,使得该电子设备执行上述实施例所示的方法。
可以以一种或多种程序设计语言或其组合来编写用于执行本公开的操作的计算机程序代码,上述程序设计语言包括面向对象的程序设计语言—诸如Java、Smalltalk、C++,还包括常规的过程式程序设计语言—诸如“C”语言或类似的程序设计语言。程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络——包括局域网(Local Area Network,简称LAN)或广域网(Wide Area Network,简称WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。
本公开实施例还提供了一种计算机程序产品,包括计算机程序,所述计算机程序被执行时用于实现本公开实施例提供的对象控制方法。
附图中的流程图和框图,图示了按照本公开各种实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段、或代码的一部分,该模块、程序段、或代码的一部分包含一个 或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个接连地表示的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
描述于本公开实施例中所涉及到的单元可以通过软件的方式实现,也可以通过硬件的方式来实现。其中,单元的名称在某种情况下并不构成对该单元本身的限定,例如,第一获取单元还可以被描述为“获取至少两个网际协议地址的单元”。
本文中以上描述的功能可以至少部分地由一个或多个硬件逻辑部件来执行。例如,非限制性地,可以使用的示范类型的硬件逻辑部件包括:现场可编程门阵列(Field Programmable Gate Array,FPGA)、专用集成电路(Application Specific Integrated Circuit,ASIC)、专用标准产品(Application Specific Standard Product,ASSP)、片上系统(System on Chip,SOC)、复杂可编程逻辑设备(Complex Programming Logic Device,CPLD)等等。
在本公开的上下文中,机器可读介质可以是有形的介质,其可以包含或存储以供指令执行系统、装置或设备使用或与指令执行系统、装置或设备结合地使用的程序。机器可读介质可以是机器可读信号介质或机器可读储存介质。机器可读介质可以包括但不限于电子的、磁性的、光学的、电磁的、红外的、或半导体系统、装置或设备,或者上述内容的任何合适组合。机器可读存储介质的更具体示例会包括基于一个或多个线的电气连接、便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或快闪存储器)、光纤、便捷式紧凑盘只读存储器(CD-ROM)、光学储存设备、磁储存设备、或上述内容的任何合适组合。
第一方面,根据本公开的一个或多个实施例,提供了一种对象控制方法,包括:
显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,显示第三特效。
根据本公开的一个或多个实施例,所述第一特效的动态显示属性包括根据所述节奏信息确定第一特效之间的间距和/或根据所述节奏信息确定第一特效的移动速度。
根据本公开的一个或多个实施例,所述节奏信息包括以下一种或多种:节奏类型,节拍信息和鼓点信息。
根据本公开的一个或多个实施例,所述在所述第一页面上动态显示第一特效,包括:
按照第一预设移动方向,在所述第一页面上的预设显示区域内动态显示所述第一特效,其中,所述第一预设移动方向包括以下至少一种:从上至下方向、从下至上方向、从左到右方向和从右至左方向。
根据本公开的一个或多个实施例,所述方法还包括:
响应于所述第一页面的所述目标对象图像中的触发动作,确定待播放的音频,并在所述第一页面播放所述待播放的音频。
根据本公开的一个或多个实施例,所述确定待播放的音频,包括:
从预设音频列表中随机选取音频作为所述待播放的音频;
或者,
获取用户从所述预设音频列表中选取的音频作为所述待播放的音频;
或者,
获取用户上传的音频作为所述待播放的音频。
根据本公开的一个或多个实施例,所述节奏类型包括快节奏类型和慢节奏类型;
所述方法还包括:
确定待播放的音频对应的鼓点数量;
根据所述鼓点数量确定所述节奏类型。
根据本公开的一个或多个实施例,所述鼓点信息包括鼓点位置;
所述方法还包括:
获取待播放的音频对应的波形,并从所述波形中提取目标波形段,其中,所述目标波形段包括所述波形的峰值;
将所述目标波形段与预设鼓点波形段进行匹配,以确定待播放的音频的鼓点位置。
根据本公开的一个或多个实施例,所述第一特效包括核心区域和边缘区域;所述第三特效包括提示信息和/或提示动画;
所述根据第二特效的实时位置与第一特效对应的动态位置之间的关系,显示第三特效,包括:
在所述第一特效在所述第一页面的预设显示区域内动态移动过程中,若所述第二特效的实时位置处于所述核心区域,则显示第一提示信息和/或第一提示动画;
若所述第二特效的实时位置处于所述边缘区域,则显示第二提示信息和/或第二提示动画;
若所述第二特效的实时位置与所述第一特效的动态位置持续未存在重叠部分,则显示第三提示信息和/或第三提示动画;和/或,统计显示第三提示信息或第三提示动画的次数,并在所述次数大于预设阈值时,显示结束信息。
根据本公开的一个或多个实施例,所述方法还包括:
若所述第二特效的实时位置处于所述核心区域或所述边缘区域,则播放第一预设音频或增强所述第一页面当前播放的音频的音效。
根据本公开的一个或多个实施例,所述方法还包括:
对所述第一页面中所述目标对象图像中的第二目标对象进行放大处理。
根据本公开的一个或多个实施例,所述方法还包括:
响应于所述第一页面的所述目标对象图像中的第一目标对象的移动方向,控制所述第二特效中的第三目标对象进行相应的不同方向显示。
根据本公开的一个或多个实施例,所述方法还包括:
获取所述第一页面音频的剩余播放时长;
根据剩余播放时长的不同,按照不同预设样式显示所述第一特效。
第二方面,根据本公开的一个或多个实施例,提供了一种对象控制设备,包括:
显示模块,用于显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
所述显示模块,还用于在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
处理模块,用于响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
所述处理模块,还用于根据所述第二特效的实时位置与第一特效对应的动态位置之间的关系,指示所述显示模块显示第三特效。
根据本公开的一个或多个实施例,所述第一特效的动态显示属性包括根据所述节奏信息确定第一特效之间的间距和/或根据所述节奏信息确定第一特效的移动速度。
根据本公开的一个或多个实施例,所述节奏信息包括以下一种或多种:节奏类型,节拍信息和鼓点信息。
根据本公开的一个或多个实施例,所述显示模块还用于:
按照第一预设移动方向,在所述第一页面上的预设显示区域内动态显示所述第一特效,其中,所述第一预设移动方向包括以下至少一种:从上至下方向、从下至上方向、从左到右方向和从右至左方向。
根据本公开的一个或多个实施例,所述显示模块还用于:
响应于所述第一页面的所述目标对象图像中的触发动作,确定待播放的音频,并在所述第一页面播放所述待播放的音频。
根据本公开的一个或多个实施例,所述显示模块还用于:
从预设音频列表中随机选取音频作为所述待播放的音频;
或者,
获取用户从所述预设音频列表中选取的音频作为所述待播放的音频;
或者,
获取用户上传的音频作为所述待播放的音频。
根据本公开的一个或多个实施例,所述节奏类型包括快节奏类型和慢节奏类型;
所述处理模块还用于:
确定待播放的音频对应的鼓点数量;
根据所述鼓点数量确定所述节奏类型。
根据本公开的一个或多个实施例,所述鼓点信息还包括鼓点位置;
所述处理模块还用于:
获取待播放的音频对应的波形,并从所述波形中提取目标波形段,其中,所述目标波形段包括所述波形的峰值;
将所述目标波形段与预设鼓点波形段进行匹配,以确定待播放的音频的鼓点位置。
根据本公开的一个或多个实施例,所述第一特效包括核心区域和边缘区域;所述第三特效包括提示信息,和/或提示动画;
所述处理模块还用于:
在所述第一特效在所述第一页面的预设显示区域内动态移动过程中,若所述第二特效的实时位置处于所述核心区域,则指示所述显示模块显示第一提示信息和/或第一提示动画;
若所述第二特效的实时位置处于所述边缘区域,则指示所述显示模块显示第二提示信息和/或第二提示动画;
若所述第二特效的实时位置与所述第一特效的动态位置持续未存在重叠部分,则指示显示模块显示第三提示信息和/或第三提示动画;和/或,统计显示第三提示信息或第三提示动画的次数,并在所述次数大于预设阈值时,指示所述显示模块显示结束信息。
根据本公开的一个或多个实施例,所述处理模块还用于:
若所述第二特效的实时位置处于所述核心区域或所述边缘区域,则播放第一预设音频或增强所述第一页面当前播放的音频的音效。
根据本公开的一个或多个实施例,所述处理模块还用于:
对所述第一页面中所述目标对象图像中的第二目标对象进行放大处理。
根据本公开的一个或多个实施例,所述处理模块还用于:
响应于所述第一页面的所述目标对象图像中的第一目标对象的移动方向,控制所述第二特效中的第三目标对象进行相应的不同方向显示。
根据本公开的一个或多个实施例,所述处理模块还用于:
获取所述第一页面音频的剩余播放时长;
根据剩余播放时长的不同,按照不同预设样式显示所述第一特效。
第三方面,根据本公开的一个或多个实施例,提供了一种电子设备,包括:至少一个处理器和存储器;
所述存储器存储计算机执行指令;
所述至少一个处理器执行所述存储器存储的计算机执行指令,使得所述至少一个处理器执行如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第四方面,根据本公开的一个或多个实施例,提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第五方面,根据本公开的一个或多个实施例,提供一种计算机程序产品,包括计算机程序,所述计算机程序被处理器执行时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
第六方面,根据本公开的一个或多个实施例,提供了一种计算机程序,所述计算机程序被处理器执行时,实现如上第一方面以及第一方面各种可能的设计所述的对象控制方法。
以上描述仅为本公开的较佳实施例以及对所运用技术原理的说明。本领域技术人员应当理解,本公开中所涉及的公开范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离上述公开构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本公开中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。
此外,虽然采用特定次序描绘了各操作,但是这不应当理解为要求这些操作以所示出的特定次序或以顺序次序执行来执行。在一定环境下,多任务和并行处理可能是有利的。同样地,虽然在上面论述中包含了若干具体实现细节,但是这些不应当被解释为对本公开的范围的限制。在单独的实施例的上下文中描述的某些特征还可以组合地实现在 单个实施例中。相反地,在单个实施例的上下文中描述的各种特征也可以单独地或以任何合适的子组合的方式实现在多个实施例中。
尽管已经采用特定于结构特征和/或方法逻辑动作的语言描述了本主题,但是应当理解所附权利要求书中所限定的主题未必局限于上面描述的特定特征或动作。相反,上面所描述的特定特征和动作仅仅是实现权利要求书的示例形式。

Claims (18)

  1. 一种对象控制方法,所述方法包括:
    显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
    在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
    响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
    根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,显示第三特效。
  2. 根据权利要求1所述的方法,其中,所述第一特效的动态显示属性包括根据所述节奏信息确定所述第一特效之间的间距和/或根据所述节奏信息确定所述第一特效的移动速度。
  3. 根据权利要求2所述的方法,其中,所述节奏信息包括以下一种或多种:节奏类型,节拍信息和鼓点信息。
  4. 根据权利要求1至3任一所述的方法,其中,所述在所述第一页面上动态显示第一特效,包括:
    按照第一预设移动方向,在所述第一页面上的预设显示区域内动态显示所述第一特效,其中,所述第一预设移动方向包括以下至少一种:从上至下方向、从下至上方向、从左到右方向和从右至左方向。
  5. 根据权利要求1至4任一所述的方法,其中,所述方法还包括:
    响应于所述第一页面的所述目标对象图像中的触发动作,确定待播放的音频,并在所述第一页面播放所述待播放的音频。
  6. 根据权利要求5所述的方法,其中,所述确定待播放的音频,包括:
    从预设音频列表中随机选取音频作为所述待播放的音频;
    或者,
    获取用户从所述预设音频列表中选取的音频作为所述待播放的音频;
    或者,
    获取用户上传的音频作为所述待播放的音频。
  7. 根据权利要求3所述的方法,其中,所述节奏类型包括快节奏类型和慢节奏类型;
    所述方法还包括:
    确定待播放的音频对应的鼓点数量;
    根据所述鼓点数量确定所述节奏类型。
  8. 根据权利要求3所述的方法,其中,所述鼓点信息包括鼓点位置;
    所述方法还包括:
    获取待播放的音频对应的波形,并从所述波形中提取目标波形段,其中,所述目标波形段包括所述波形的峰值;
    将所述目标波形段与预设鼓点波形段进行匹配,以确定所述待播放的音频的鼓点位置。
  9. 根据权利要求1至8任一所述的方法,其中,所述第一特效包括核心区域和边缘 区域;所述第三特效包括提示信息和/或提示动画;
    所述根据第二特效的实时位置与第一特效对应的动态位置之间的关系,显示第三特效,包括:
    在所述第一特效在所述第一页面的预设显示区域内动态移动过程中,若所述第二特效的实时位置处于所述核心区域,则显示第一提示信息和/或第一提示动画;
    若所述第二特效的实时位置处于所述边缘区域,则显示第二提示信息和/或第二提示动画;
    若所述第二特效的实时位置与所述第一特效的动态位置持续未存在重叠部分,则显示第三提示信息和/或第三提示动画;和/或,统计显示第三提示信息或第三提示动画的次数,并在所述次数大于预设阈值时,显示结束信息。
  10. 根据权利要求9所述的方法,其中,所述方法还包括:
    若所述第二特效的实时位置处于所述核心区域或所述边缘区域,则播放第一预设音频或增强所述第一页面当前播放的音频的音效。
  11. 根据权利要求1至10任一所述的方法,其中,所述方法还包括:
    对所述第一页面中所述目标对象图像中的第二目标对象进行放大处理。
  12. 根据权利要求1至11任一所述的方法,其中,所述方法还包括:
    响应于所述第一页面的所述目标对象图像中的所述第一目标对象的移动方向,控制所述第二特效中的第三目标对象进行相应的不同方向显示。
  13. 根据权利要求1至8任一所述的方法,其中,所述方法还包括:
    获取所述第一页面音频的剩余播放时长;
    根据剩余播放时长的不同,按照不同预设样式显示所述第一特效。
  14. 一种对象控制设备,所述设备包括:
    显示模块,用于显示第一页面,其中所述第一页面上实时显示用户的目标对象图像;
    所述显示模块,还用于在所述第一页面上动态显示第一特效,其中,所述第一特效的动态显示属性是根据所述第一页面播放的音频的节奏信息确定的;
    处理模块,用于响应于所述第一页面的所述目标对象图像中的第一目标对象的移动,控制所述第一页面上显示的第二特效进行实时移动;
    所述处理模块,还用于根据所述第二特效的实时位置与所述第一特效对应的动态位置之间的关系,指示显示模块显示第三特效。
  15. 一种电子设备,包括:至少一个处理器和存储器;
    所述存储器存储计算机执行指令;
    所述至少一个处理器执行所述存储器存储的计算机执行指令,使得所述至少一个处理器执行如权利要求1至13任一所述的对象控制方法。
  16. 一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机执行指令,当处理器执行所述计算机执行指令时,实现如权利要求1至13任一所述的对象控制方法。
  17. 一种计算机程序产品,包括计算机程序,所述计算机程序被处理器执行时实现权利要求1至13任一所述的对象控制方法。
  18. 一种计算机程序,所述计算机程序被处理器执行时实现权利要求1至13任一所述的对象控制方法。
PCT/CN2022/080685 2021-03-15 2022-03-14 对象控制方法及设备 WO2022194097A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110277940.5 2021-03-15
CN202110277940.5A CN112988027B (zh) 2021-03-15 2021-03-15 对象控制方法及设备

Publications (1)

Publication Number Publication Date
WO2022194097A1 true WO2022194097A1 (zh) 2022-09-22

Family

ID=76335561

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/080685 WO2022194097A1 (zh) 2021-03-15 2022-03-14 对象控制方法及设备

Country Status (2)

Country Link
CN (1) CN112988027B (zh)
WO (1) WO2022194097A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112988027B (zh) * 2021-03-15 2023-06-27 北京字跳网络技术有限公司 对象控制方法及设备
CN113744135A (zh) * 2021-09-16 2021-12-03 北京字跳网络技术有限公司 图像处理方法、装置、电子设备及存储介质
CN114329001B (zh) * 2021-12-23 2023-04-28 游艺星际(北京)科技有限公司 动态图片的显示方法、装置、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108111909A (zh) * 2017-12-15 2018-06-01 广州市百果园信息技术有限公司 视频图像处理方法及计算机存储介质、终端
CN108833818A (zh) * 2018-06-28 2018-11-16 腾讯科技(深圳)有限公司 视频录制方法、装置、终端及存储介质
CN109045688A (zh) * 2018-07-23 2018-12-21 广州华多网络科技有限公司 游戏交互方法、装置、电子设备及存储介质
US20190147841A1 (en) * 2017-11-13 2019-05-16 Facebook, Inc. Methods and systems for displaying a karaoke interface
CN111857923A (zh) * 2020-07-17 2020-10-30 北京字节跳动网络技术有限公司 特效展示方法、装置、电子设备及计算机可读介质
CN112259062A (zh) * 2020-10-20 2021-01-22 北京字节跳动网络技术有限公司 特效展示方法、装置、电子设备及计算机可读介质
CN112988027A (zh) * 2021-03-15 2021-06-18 北京字跳网络技术有限公司 对象控制方法及设备

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100028858A (ko) * 2008-09-05 2010-03-15 엔에이치엔(주) 온라인 음악 게임을 제공하는 시스템 및 그 방법
CN111857482B (zh) * 2020-07-24 2022-05-17 北京字节跳动网络技术有限公司 一种互动方法、装置、设备和可读介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190147841A1 (en) * 2017-11-13 2019-05-16 Facebook, Inc. Methods and systems for displaying a karaoke interface
CN108111909A (zh) * 2017-12-15 2018-06-01 广州市百果园信息技术有限公司 视频图像处理方法及计算机存储介质、终端
CN108833818A (zh) * 2018-06-28 2018-11-16 腾讯科技(深圳)有限公司 视频录制方法、装置、终端及存储介质
CN109045688A (zh) * 2018-07-23 2018-12-21 广州华多网络科技有限公司 游戏交互方法、装置、电子设备及存储介质
CN111857923A (zh) * 2020-07-17 2020-10-30 北京字节跳动网络技术有限公司 特效展示方法、装置、电子设备及计算机可读介质
CN112259062A (zh) * 2020-10-20 2021-01-22 北京字节跳动网络技术有限公司 特效展示方法、装置、电子设备及计算机可读介质
CN112988027A (zh) * 2021-03-15 2021-06-18 北京字跳网络技术有限公司 对象控制方法及设备

Also Published As

Publication number Publication date
CN112988027B (zh) 2023-06-27
CN112988027A (zh) 2021-06-18

Similar Documents

Publication Publication Date Title
WO2022194097A1 (zh) 对象控制方法及设备
WO2022012182A1 (zh) 特效展示方法、装置、电子设备及计算机可读介质
US10159902B2 (en) Method and system for providing game ranking information
KR20210062640A (ko) 현재 게임 시나리오에 기초한 스트리밍 게임을 위한 그래픽 오버레이를 구현하는 기법
WO2020107908A1 (zh) 多用户视频特效添加方法、装置、终端设备及存储介质
US20210081985A1 (en) Advertisement interaction methods and apparatuses, electronic devices and storage media
CN110090444B (zh) 游戏中行为记录创建方法、装置、存储介质及电子设备
WO2023045783A1 (zh) 页面处理方法、装置、设备及存储介质
US20200356234A1 (en) Animation Display Method and Apparatus, Electronic Device, and Storage Medium
WO2022257797A1 (zh) 目标内容的显示方法、装置、设备、可读存储介质及产品
US20230131975A1 (en) Music playing method and apparatus based on user interaction, and device and storage medium
CN111760272B (zh) 游戏信息显示方法及装置、计算机存储介质、电子设备
WO2022017181A1 (zh) 一种互动方法、装置、设备和可读介质
US9302182B2 (en) Method and apparatus for converting computer games between platforms using different modalities
WO2024016924A1 (zh) 视频处理方法、装置、电子设备及存储介质
JP2023085442A (ja) プログラム
JP5932905B2 (ja) プログラム、及びゲームシステム
Dong et al. Touch-move-release: studies of surface and motion gestures for mobile augmented reality
JP6924564B2 (ja) ゲームプログラム
JP2016101261A (ja) サウンドメッセージシステム
CN111159472A (zh) 多模态聊天技术
US20230128658A1 (en) Personalized vr controls and communications
CN110141854B (zh) 游戏中的信息处理方法及装置、存储介质及电子设备
US20240058700A1 (en) Mobile game trainer using console controller
WO2023051415A1 (zh) 互动方法、装置、设备、计算机可读存储介质及产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22770452

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22770452

Country of ref document: EP

Kind code of ref document: A1