EP4339759B1 - Music playing method and device - Google Patents

Music playing method and device

Info

Publication number
EP4339759B1
Authority
EP
European Patent Office
Prior art keywords
music
playing
operation instruction
target
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP22862947.3A
Other languages
English (en)
French (fr)
Other versions
EP4339759C0 (de)
EP4339759A1 (de)
EP4339759A4 (de)
Inventor
Mengfei XIE
Yufan Xue
Wei Hua
Xiaoyu Zhu
Dailong CHEN
Jia DING
Zoujie HE
Jie Weng
Chaopeng LIU
Bowen Yang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Publication of EP4339759A1
Publication of EP4339759A4
Application granted
Publication of EP4339759C0
Publication of EP4339759B1


Classifications

    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G10H1/0008 Associated control or indicating means
    • G10H1/0025 Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • G10H1/46 Volume control
    • G11B19/02 Control of operating function, e.g. switching from recording to reproducing

Definitions

  • the present disclosure relates to computer processing, in particular to a method and a device for music play.
  • users may play music through electronic devices. Users may pre-install a music application on their electronic devices to play music through that music application.
  • the music to be played may include at least one of the following: music downloaded from the music application, music uploaded from the local to the music application, and music received through social applications.
  • Embodiments of the present disclosure provide a method and a device for music play, which may enhance the above-discussed play effect of the music.
  • the solution of the present disclosure may achieve an enhancement of the play effect through at least one processing and therefore the play effect may be improved flexibly.
  • the definition on the invention is set out in the appended claims.
  • FIG. 1 illustrates a schematic diagram of a scenario for music play according to the present disclosure.
  • the electronic device may be installed with a music application for playing music.
  • the music may include audio music and/or video music.
  • the above-mentioned music application may obtain the music from a music server, a local storage device, or a social application and play the music.
  • Social applications may be used for sharing data such as pictures, text, audio and video.
  • one social application may perform data transmission with another social application via a social server, or directly with the social server.
  • the data used hereinafter and transmitted by the social server may be data about music.
  • the speaker of the electronic device may usually be used to play the music.
  • the user may feel the music through the sound.
  • the embodiments of the present disclosure may enhance the play effect of music by at least one processing, and the processing may be controlled by the operation controls in the first interface. In this way, the play effect of music can be enhanced flexibly.
  • FIG. 2 illustrates a flowchart of a method for music play according to embodiments of the present disclosure.
  • the method shown in FIG. 2 may be applied on the electronic device.
  • the method for music play comprises: S101: receiving a first operation instruction in a target application for playing music.
  • the target application may be the music application shown in FIG. 1 .
  • the target application may have a plurality of interfaces, for example, an initial interface, an interface for playing music (hereinafter may also be referred to as a third interface), an interface for selecting music, and an interface for setting up the music application.
  • the user may perform the first operation instruction on any of the above interfaces of the target application.
  • the first operation instruction is used for requesting the presentation of the first interface.
  • the first operation instruction may be an operation of a control on the interface described above, or a sliding operation on the interface.
  • FIGS. 3 and 4 illustrate schematic diagrams of entering the first interface from the third interface according to embodiments of the present disclosure.
  • the third interface includes: a music name, music lyrics, and an area for controlling the play.
  • the area for controlling the play is used for controlling the music being played, such as collecting the music, adjusting the play progress, adjusting the play mode, downloading the music, sharing the music, searching for the target content in the music, etc., which is not limited by the embodiments of the present disclosure.
  • the third interface further includes the operation control TB0.
  • the interface shown in FIG. 4 may be presented.
  • S102: in response to the first operation instruction, presenting a first interface of the target application, the first interface including an operation control for enhancing a play effect of the music through at least one processing, the processing being used for representing music content in ways beyond sound alone.
  • the first interface may be used for managing the functions that enhance the play effect. Specifically, the functions that enhance the play effect may be managed through one or more operation controls in the first interface.
  • the number of operation controls mentioned above can be selected flexibly, so that the relationship between the operation controls and the processing operations may differ.
  • the operation control may correspond to at least one processing. That is, this operation control may enhance the play effect of the music through at least one processing.
  • Each processing may correspond to a way to enhance the play effect. In this way, the number of operation controls in the first interface can be reduced and therefore the first interface may be simplified. However, the enhancement of the play effect cannot be controlled as precisely.
  • each operation control may correspond to one processing. That is, each operation control may enhance the play effect of the music through one processing, and each processing may correspond to a way to enhance the play effect. As such, each operation control may achieve one way to enhance the play effect of the music, which may help improve the accuracy of the enhancement of the play effect.
  • the number of operation controls in the first interface may be increased accordingly, and the complexity of the first interface may also be increased.
  • the at least one processing comprises at least one of the following: continuously amplifying a volume of the music, flashing a light source based on the music, flashing a screen based on the music, or vibrating based on the music.
  • the amplified volume of the music is a multiple of the original volume of the music.
  • the multiple may be set as any value.
  • the conventional way to enhance volume is usually based on preset levels, for example, a high level, a medium level, and a low level, each enhancing the volume with fixed parameters. The user may select only one of these levels rather than an arbitrary enhancement degree.
  • Embodiments of the present disclosure can provide continuous amplification parameters to the user, so that the user can select any amplification parameter to enhance the volume, which may help improve the flexibility of the volume enhancement.
  • the embodiments of the present disclosure can exceed the system maximum volume, which may help improve the play effect of music in noisy environments, especially for sodcasting.
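The continuous amplification described here can be pictured as a digital gain applied directly to the audio samples, with values pushed past the representable range clipped. The sketch below is illustrative only; the function name and the 16-bit PCM assumption are not from the patent:

```python
def amplify(samples, factor):
    """Apply a continuous gain factor to 16-bit PCM samples.

    A factor of 1.0 keeps the original volume; factors past the
    boundary point exceed the system maximum volume, so results
    are clipped to the valid 16-bit range.
    """
    clipped = []
    for s in samples:
        v = int(s * factor)
        clipped.append(max(-32768, min(32767, v)))
    return clipped

print(amplify([1000, -2000, 30000], 1.5))  # [1500, -3000, 32767]
```

Factors above the boundary point trade some clipping distortion for perceived loudness, which may be an acceptable trade-off for playback in noisy environments.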
  • Attributes such as flashing frequency, brightness, and color of the light source may relate to the music, for example, the attributes may relate to the beat point in the music, a change point of a voice in the music, and a change point of an instrument sound in the music.
  • the change point of the voice and/or the instrument sound may include a volume change point, a timbre change point, and a tone change point.
  • some of the above attributes of the light source can be set to be related to music, and the rest of the attributes can be fixed, or randomly set, or pre-set by a corresponding operation control.
  • Attributes such as the flashing frequency, brightness, and color of the screen may relate to music, for example, the attributes may relate to the beat point in music, a change point of a voice in the music, and a change point of an instrument sound in the music.
  • some of the above attributes of the screen can be set to be related to music, and the rest of the attributes can be fixed, or randomly set, or pre-set by a corresponding operation control.
  • the vibration attributes such as the vibration frequency, the vibration duration may relate to music, for example, the vibration attributes may relate to the beat point in music, a change point of a voice in the music, and a change point of an instrument sound in the music.
  • some of the above attributes of the vibration can be set to be related to music, and the rest of the attributes can be fixed, or randomly set, or pre-set by a corresponding operation control.
  • the first interface includes operation controls TB1 through TB4.
  • TB1 may be a sliding bar with an arc shape for continuously amplifying the volume of music.
  • a boundary point may be presented on the TB1.
  • the amplified volume corresponding to the boundary point is the system maximum volume, so that the boundary point can be used to distinguish whether the amplified volume exceeds the system maximum volume.
  • the indication that the system maximum volume can be exceeded may also be achieved by other ways, which will not be limited by embodiments of the present disclosure. For example, in the process of sliding the arc sliding bar by user, whether the current amplified volume exceeds the system maximum volume may be indicated by the text next to the arc sliding bar.
  • TB2 is a clickable control.
  • the text "Light flashing" under the TB2 represents that the operation control is used to turn on or off the function of light flashing based on the music. For example, a user may turn on the function of light flashing based on the music by clicking the TB2 and turn off the function of light flashing based on the music by clicking the TB2 again. In this way, a switching between the on-state and off-state may be achieved by clicking the TB2 repeatedly.
  • TB3 is a clickable control.
  • the text "vibration" under the TB3 represents that the operation control is used to turn on or off the function of vibration based on the music. For example, a user may turn on the function of vibration based on the music by clicking the TB3 and turn off the function of vibration based on the music by clicking the TB3 again. In this way, a switching between the on-state and off-state may be achieved by clicking the TB3 repeatedly.
  • TB4 is a clickable control.
  • the text "screen flashing" under the TB4 represents that the operation control is used to turn on or off the function of screen flashing based on the music. For example, a user may turn on the function of screen flashing based on the music by clicking the TB4 and turn off the function of screen flashing based on the music by clicking the TB4 again. In this way, a switching between the on-state and off-state may be achieved by clicking the TB4 repeatedly.
  • S103: receiving a second operation instruction for the operation control in the first interface. If one operation control is configured, the second operation instruction may be considered as an operation instruction for this operation control, while if at least two operation controls are configured, the second operation instruction may be considered as an operation instruction for one of the at least two operation controls.
  • the operation control and the second operation instruction may differ when the processing operations corresponding to the operation controls differ.
  • the operation control can be the sliding bar TB1 as shown in FIG. 5 and the second operation instruction may be the sliding operation of the slider bar.
  • the operation control may also be an input control and the second operation instruction may be the amplification factor input in the input control.
  • the operation control can be the operation control TB2 as shown in FIG. 5, with an on state and an off state, and the second operation instruction may be an operation instruction for switching from the off state to the on state, to enable the function of flashing the light source according to the music.
  • the operation control can also be associated with an adjustment control.
  • when the operation control is in the off state, the adjustment control is not allowed to be operated.
  • when the operation control is in the on state, the adjustment control is allowed to be operated, to adjust the flashing attributes of the light source, including but not limited to the brightness of the light source, the color of the light source, and so on.
  • the operation control can be the operation control TB4 as shown in FIG. 5, with an on state and an off state, and the second operation instruction may be an operation instruction for switching from the off state to the on state, to enable the function of flashing the screen according to the music.
  • the operation control can also be associated with an adjustment control.
  • when the operation control is in the off state, the adjustment control is not allowed to be operated.
  • when the operation control is in the on state, the adjustment control is allowed to be operated, to adjust the flashing attributes of the screen, including but not limited to the brightness of the screen, the color of the screen, and so on.
  • the operation control can be the operation control TB3 as shown in FIG. 5, with an on state and an off state, and the second operation instruction may be an operation instruction for switching from the off state to the on state, to enable the function of vibration according to the music.
  • the operation control can also be associated with an adjustment control.
  • when the operation control is in the off state, the adjustment control is not allowed to be operated.
  • when the operation control is in the on state, the adjustment control is allowed to be operated, to adjust the vibration attributes, including but not limited to a vibration intensity, a vibration duration, and a vibration frequency.
  • the first interface provided by the embodiments of the present disclosure provides the user with a way to enhance the play effect of music.
  • the first interface may include a variety of operation controls to enhance the play effect. Users may enhance the play effect of the music in various ways through the first interface to set different patterns for the enhancement of the play effect. Meanwhile, switching between different interfaces may be avoided.
  • S104: processing the music based on the second operation instruction during a process of playing the music.
  • the music is processed corresponding to the second operation instruction.
  • the second operation instruction is a sliding operation of the sliding bar shown in FIG. 5
  • the volume of the music can be amplified by N times during the playing of the music, and N is determined by the position of the sliding bar after sliding.
  • the position of the sliding bar indicates that the volume of the music is amplified by 1.5 times.
  • the second operation instruction is a click operation on the operation control TB2 shown in FIG. 5
  • the light source may flash according to the music during the process of playing the music.
  • the vibration according to the music may occur in the process of playing the music.
  • the second operation instruction is a click operation on the operation control TB4 shown in FIG. 5
  • the screen may flash according to the music in the process of playing the music.
  • the music starts to be played in the target application and the music may be processed based on the second operation instruction during the process of playing the music.
  • the play effect of the music can be enhanced without the user manually starting the music, which may reduce the operation complexity.
  • Starting playing the music may comprise at least one of the following: starting playing an example music in the target application; starting playing music from the historical play record of the target application; or starting playing the user's favorite music of the target application.
  • the music to be played is not limited by the embodiments of the present disclosure.
  • the processing may be different if the operation controls corresponding to the second operation instruction are different.
  • the following explains the different processing procedures with different operation controls.
  • the music is processed according to the second operation instruction as follows: first, the amplification parameter corresponding to the second operation instruction is determined, which may be any parameter between the minimum amplification parameter and the maximum amplification parameter; then, during the process of playing the music, the original volume of the music is amplified according to the amplification parameter, where the original volume is the volume specified in the music data of the music.
  • the minimum and maximum amplification parameters are preset values.
  • the minimum amplification parameter can be 1, and the maximum amplification parameter can be any value greater than 1, so the amplification parameter can be any value between 1 and the maximum amplification parameter.
  • the determination of the above amplification parameters is related to the type of operation control.
  • the amplification parameter can be the numerical value entered by the user in the input control.
  • the operation control is the sliding bar shown in FIG. 5
  • the amplification parameters can be determined based on the slider position of the sliding bar.
  • the unit amplification parameter is determined based on the maximum amplification parameter, the minimum amplification parameter, and the size of the operation control; then, based on the slider position and the unit amplification parameter, the amplification parameter corresponding to the second operation instruction is determined.
  • the unit amplification parameter can be determined through the following process: first, calculating the difference between the maximum amplification parameter and the minimum amplification parameter; then calculating the ratio of the difference to the size of the control to obtain the unit amplification parameter.
  • the unit amplification parameter can be accurately determined, and therefore the amplification parameter may also be accurately determined, which may help improve the accuracy of the volume amplification.
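Under the procedure above, the computation might be sketched as follows; the parameter values, units, and function names are illustrative assumptions, not from the patent:

```python
def unit_amplification(max_param, min_param, control_size):
    # ratio of the amplification-parameter range to the size
    # of the operation control (the "unit amplification parameter")
    return (max_param - min_param) / control_size

def amplification_from_slider(position, min_param, unit):
    # slider travel (in the same units as the control size)
    # times the unit amplification parameter, plus the minimum
    return min_param + position * unit

unit = unit_amplification(3.0, 1.0, 200.0)  # 0.01 per unit of travel
print(amplification_from_slider(100.0, 1.0, unit))  # halfway along -> 2.0x
```

Because the slider position varies continuously, any amplification parameter between the minimum and the maximum can be selected, which is the continuity the patent contrasts with preset levels.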
  • the operation control for amplifying the volume may be a continuous sliding control, i.e., the sliding bar.
  • during the continuous sliding of the control, at least one of the following attributes of the control changes: a color or a size. In this way, changes in volume may be indicated to the user during the sliding.
  • if the amplification parameter of the volume is increased, the color of the control can gradually become darker, and/or the size of the control can gradually increase. If the amplification parameter of the volume is reduced, the color of the operation control can gradually become lighter, and/or the size of the operation control can gradually decrease.
  • the second operation instruction is used for adjusting the operation control to an on state
  • the processing the music according to the second operation instruction during the process of playing the music comprises: performing the target processing in response to playing until a target point of the music, the target point comprising at least one of the following: a change point of a voice in the music, a change point of an instrument sound in the music, or a beat point of the music; the change point of the voice and/or the instrument sound includes a volume change point, a timbre change point, and a tone change point.
  • the volume change point can be a time point where the volume difference from the previous time point is greater than or equal to a preset threshold.
  • the timbre change point can be a time point where the timbre is different from the previous time point
  • the tone change point can be a time point where the tone is different from the previous time point.
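A hypothetical detector for the volume change points defined above, assuming a sequence of per-frame volume values and a preset threshold (both the representation and the threshold value are illustrative assumptions):

```python
def volume_change_points(volumes, threshold):
    """Indices of time points whose volume differs from the
    previous time point by at least the preset threshold."""
    return [i for i in range(1, len(volumes))
            if abs(volumes[i] - volumes[i - 1]) >= threshold]

print(volume_change_points([10, 11, 20, 21, 5], threshold=5))  # [2, 4]
```

Timbre and tone change points would need spectral features rather than a scalar volume, but the same "compare against the previous time point" structure applies.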
  • when playing reaches the target point, the light source and/or the screen may flash and/or a vibration may occur, to indicate to the user that there is a change point of voice and/or instrument sound or a beat point at the current time.
  • the user can perceive the rhythm and the sound change points through various means such as light sources, screens, and vibrations, which may not only improve the feeling of the users when they enjoy the music, but also assist them in learning music knowledge.
  • a second interface can also be presented in the target application while playing the music.
  • the second interface is used to present the target image.
  • the target image may be presented in the second interface, which is an image including at least one color.
  • the target image can be presented. In this way, the screen flashing function can be achieved through images with colors.
  • the target images presented at different target points can be the same. For example, for both sound change points and beat points, the same target image is presented.
  • the change points of voice and the sound of instruments can also be distinguished by different target images.
  • the target images corresponding to the change points of different voices are different, and the target images corresponding to the change points of the sound of different instruments are different.
  • target images corresponding to volume change points, timbre change points, and tone change points can also be different.
  • the attributes of the target image can also be changed when presenting the target image in the second interface.
  • an attribute of the target image may be determined based on a time interval between the target point and a previous target point.
  • the attribute includes at least one of the following: a size, a presenting position, a color, a presenting duration, or a shape. Then the target image may be presented in the second interface based on the attribute of the target image.
  • Different time intervals correspond to different attributes. For example, different time intervals result in different colors of the target image.
  • the larger the time interval, the larger (or smaller) the value of the attribute.
  • the larger the time interval the larger the size of the target image.
  • the larger the time interval the longer the display duration.
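One illustrative mapping from the time interval between consecutive target points to the target image's attributes, following the "larger interval, larger size and longer duration" examples above; the 2000 ms cap and the pixel/millisecond ranges are assumptions, not values from the patent:

```python
def image_attributes(interval_ms, max_interval_ms=2000.0):
    """Map the interval since the previous target point to
    presentation attributes of the target image: larger intervals
    yield a larger image and a longer presenting duration."""
    ratio = min(interval_ms / max_interval_ms, 1.0)
    return {
        "size_px": 100 + int(200 * ratio),      # 100-300 px
        "duration_ms": 200 + int(600 * ratio),  # 200-800 ms
    }

print(image_attributes(1000))  # {'size_px': 200, 'duration_ms': 500}
```

Color, shape, or presenting position could be mapped from the same ratio in an analogous way.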
  • users can close the second interface. Specifically, first, a third operation instruction for the second interface is received; then, in response to the third operation instruction, a presentation of the second interface in the target application is stopped, and a third interface or an initial interface of the target application is presented, the third interface being used for presenting the music being played.
  • the third operation instruction can be an operation instruction for closing the control in the second interface, or a sliding operation in the second interface, which is not limited by the embodiments of the present disclosure.
  • stopping presenting the second interface means the flashing of the screen is stopped while playing the music.
  • the flashing of the screen can be stopped through various policies as follows.
  • the target image can no longer be presented during the process of the playing of the current music but can continue to be presented during the process of the playing of the next music.
  • the target image can no longer be presented during the process of playing the music before the preset duration but continue to be presented during the process of the playing of the music after the preset duration.
  • the target image can no longer be presented during the process of playing the music before quitting the target application but continue to be presented during the process of playing music after restarting the target application.
  • the operation control used in the first interface to flash the screen based on the music can be automatically adjusted to the off state, so that the target image is never presented again until the operation control is adjusted to the on state again by a manual operation of the user.
  • the number of third operation instructions may be counted to determine whether to adjust the operation control to the off state.
  • if the count reaches a preset number, the operation control in the first interface used to control the flashing of the screen is adjusted to the off state. That is, if the user turns off the operation control of the flashing screen several times, the function of the flashing screen can be considered to be of no interest to the user, so that the function is automatically turned off and the screen no longer flashes in the process of playing the music. In this way, the experience can be improved for users who are not interested in the flashing screen.
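The counting policy above might be sketched like this; the class name, method name, and threshold value are illustrative assumptions:

```python
class ScreenFlashSetting:
    """Auto-disables screen flashing once the user has closed the
    second interface (a third operation instruction) a preset
    number of times."""
    def __init__(self, threshold=3):
        self.enabled = True     # on state of the operation control
        self.close_count = 0
        self.threshold = threshold

    def on_third_operation_instruction(self):
        self.close_count += 1
        if self.close_count >= self.threshold:
            self.enabled = False  # switch the control to the off state

s = ScreenFlashSetting(threshold=2)
s.on_third_operation_instruction()
print(s.enabled)  # True: one dismissal is below the threshold
s.on_third_operation_instruction()
print(s.enabled)  # False: threshold reached, control switched off
```

A manual toggle of the control back to the on state would reset `enabled` (and plausibly `close_count`), matching the patent's requirement that only a manual operation re-enables the function.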
  • performing the target processing in response to playing until the target point of the music comprises: firstly, an intensity of the light source is determined based on a time interval between the target point and a previous target point in response to playing until the target point of the music; then, the light source may flash according to the intensity of the light source. In this way, the flashing of the light source may be fancier and the experience of enjoying the music may be enhanced for the user.
  • the relationship between time interval and light source intensity can include two types, which are not limited by the embodiments of the present disclosure.
  • the first type is that the larger the time interval, the lower the intensity of the light source; by contrast, the smaller the time interval, the greater the intensity of the light source.
  • the second type is that the smaller the time interval, the lower the intensity of the light source; by contrast, the larger the time interval, the greater the intensity of the light source.
  • performing the target processing in response to playing until the target point of the music comprises: first, an intensity of a vibration is determined based on a time interval between the target point and a previous target point in response to playing until the target point of the music; then, vibration is performed according to the vibration intensity.
  • the vibration effect may be more flexible and the experience of enjoying the music may be enhanced for the user.
  • the relationship between time interval and vibration intensity can include two types, which are not limited by the embodiments of the present disclosure.
  • the first type is that the larger the time interval, the smaller the vibration intensity; conversely, the smaller the time interval, the greater the vibration intensity.
  • the second type is that the smaller the time interval, the smaller the vibration intensity; conversely, the larger the time interval, the greater the vibration intensity.
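A corresponding sketch for vibration, quantizing the interval into discrete amplitude levels in the 1–255 range that platforms such as Android's `VibrationEffect` accept; all names, constants, and the quantization scheme are assumptions for illustration only:

```python
def vibration_amplitude(interval: float, max_interval: float = 2.0,
                        levels: int = 5, dense_beats_strong: bool = True) -> int:
    """Quantize the interval between consecutive target points into one of
    `levels` discrete vibration amplitudes in 1..255 (levels must be >= 2).

    dense_beats_strong=True  -> first type: larger interval, weaker vibration.
    dense_beats_strong=False -> second type: larger interval, stronger vibration.
    """
    # Normalize the interval and clamp to [0, 1].
    t = max(0.0, min(interval / max_interval, 1.0))
    if dense_beats_strong:
        t = 1.0 - t  # short intervals (dense beats) map to strong vibration
    level = round(t * (levels - 1))          # discrete level 0 .. levels-1
    return 1 + round(level * 254 / (levels - 1))
```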
  • the electronic device can perform the following steps: first, receiving a fourth operation instruction for the first interface; then, stopping the at least one processing for the music in the process of playing the music.
  • the fourth operation instruction can be the operation instruction for the operation control TB5 in FIG. 5 .
  • the operation controls TB1 to TB4 used to enhance the playback effect in the first interface may be set to the default state.
  • the default state of TB1 is the state where the amplification parameter is 1, and the default state of TB2 to TB4 is the off state.
  • the volume of the music may no longer be amplified, and the flashing of the light source, the flashing of the screen, and the vibration based on the music may no longer occur.
  • all enhancement effects may be turned off by the user through a single operation control, which avoids operating the controls one by one to turn off the enhancement effects and therefore reduces the operation complexity.
  • the electronic device can perform the following steps: first, receiving a fifth operation instruction for the first interface; then stopping presenting the first interface in the target application, and presenting a third interface or an initial interface of the target application.
  • the third interface is used for presenting the music being played.
  • the fifth operation instruction can be the operation instruction for the operation control TB6 in FIG. 5 .
  • the current states of the operation controls TB1 to TB4 used for enhancing the play effect in the first interface will be kept and will not be set to the default state.
  • the fifth operation instruction is different from the fourth operation instruction.
  • the fifth operation instruction is used for returning to the third or the initial interface, rather than turning off the enhancement effect, while the fourth operation instruction is used for returning to the third or initial interface and turning off the enhancement effect.
  • FIG. 6 illustrates a structure block diagram of an apparatus for music play according to embodiments of the present disclosure.
  • the apparatus 200 for music play comprises a first operation instruction receiving module 201, a first interface presenting module 202, a second operation instruction receiving module 203 and a music play module 204.
  • the second operation instruction receiving module 203 is configured for receiving a second operation instruction for the operation control.
  • the music play module 204 is configured for processing the music based on the second operation instruction during a process of playing the music.
  • the apparatus may further comprise: a music turning on module 204 configured for, before processing the music based on the second operation instruction during the process of playing the music, in response to the second operation instruction, starting to play the music in the target application if the target application is not playing music.
  • the at least one processing comprises at least one of the following: continuously amplifying a volume of the music, flashing a light source based on the music, flashing a screen based on the music, or vibrating based on the music.
  • the operation control is a continuous sliding control
  • the operation control changes, during a continuous sliding process of the operation control, with at least one of the following attributes: a color, a size.
  • the music turning on module 204 is further configured for: performing the target processing in response to playing until a target point of the music, the target point comprising at least one of the following: a change point of a voice in the music, a change point of an instrument sound in the music, or a beat point of the music; the change point of the voice and/or the instrument sound includes a volume change point, a timbre change point, and a tone change point.
  • the apparatus further comprises: a second interface presentation module configured for in response to playing the music, presenting a second interface in the target application.
  • the music turning on module 204 is further configured for: presenting a target image in the second interface in response to playing until the target point of the music, the target image being an image including at least one color.
  • the music turning on module 204 is further configured for: in response to playing until the target point of the music, determining an attribute of the target image based on a time interval between the target point and a previous target point, the attribute including at least one of the following: a size, a presenting position, a color, a presenting duration, or a shape; presenting the target image in the second interface based on the attribute of the target image.
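One possible way to derive such attributes from the time interval is sketched below, using an HSV hue wheel for the color; none of the concrete mappings (linear size, hue rotation, duration scaling) are taken from the disclosure, and all names are assumptions:

```python
import colorsys

def target_image_attributes(interval: float, ref: float = 1.0) -> dict:
    """Sketch: derive size, color, and presenting duration of the target
    image from the interval to the previous target point."""
    # Normalize the interval and clamp to [0, 1].
    t = max(0.0, min(interval / ref, 1.0))
    # Color varies with the interval along the HSV hue wheel.
    r, g, b = colorsys.hsv_to_rgb(t, 1.0, 1.0)
    return {
        "size": 50 + int(150 * (1.0 - t)),           # shorter interval -> larger image
        "color": (int(r * 255), int(g * 255), int(b * 255)),
        "duration_ms": int(1000 * t) or 100,          # presenting duration tracks the interval
    }
```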
  • the apparatus further comprises a third operation instruction receiving module and a second interface presentation stopping module.
  • the second interface presentation stopping module is configured for, in response to the third operation instruction, stopping presenting the second interface in the target application, and presenting a third interface or an initial interface of the target application, the third interface being used for presenting the music being played.
  • the music turning on module 204 is further configured for: determining an intensity of the light source based on a time interval between the target point and a previous target point in response to playing until the target point of the music; and flashing the light source based on the intensity of the light source.
  • the music turning on module 204 is further configured for: determining an intensity of a vibration based on a time interval between the target point and a previous target point in response to playing until the target point of the music; and vibrating based on the intensity of the vibration.
  • the processing stopping module is configured for stopping the at least one processing for the music in the process of playing the music.
  • the apparatus further comprises a fifth operation instruction receiving module and a first interface presentation stopping module.
  • the fifth operation instruction receiving module is configured for receiving a fifth operation instruction for the first interface.
  • the first interface presentation stopping module is configured for stopping presenting the first interface in the target application and presenting a third interface or an initial interface of the target application, the third interface being used for presenting the music being played.
  • the apparatus for music play provided by embodiments of the present disclosure may be used to execute the technical solution of the method embodiment shown in FIG. 2 described above, which is implemented based on similar principles and achieves similar technical effects, which will not be repeated here.
  • FIG. 7 illustrates a structure block diagram of an electronic device 600 according to embodiments of the present disclosure.
  • the electronic device 600 comprises a memory 602 and at least one processor 601.
  • the memory 602 stores computer execution instructions.
  • the at least one processor 601 executes the computer execution instructions stored by the memory 602, to cause the electronic device 600 to carry out the method for music play as shown in FIG. 2 .
  • the electronic device 600 may also include a receiver 603 and a transmitter 604; the receiver 603 is used to receive information from other apparatuses or devices and forward it to the processor 601, and the transmitter 604 is used to transmit information to other apparatuses or devices.
  • the electronic device 900 may be a terminal device.
  • the terminal devices may include, but are not limited to, mobile terminals such as mobile phones, laptops, digital broadcasting receivers, personal digital assistants (PDAs), portable android devices (PADs), portable multimedia players (PMPs), on-board terminals (such as on-board navigation terminals), and fixed terminals such as digital TVs, desktop computers, and the like.
  • the electronic device shown in FIG. 8 is only for the purpose of illustration and should not suggest any limitations on the functionality and scope of use of the disclosed embodiments.
  • the electronic device 900 may include a processing apparatus (e.g., a central processing unit, a graphics processor, etc.) 901 that can perform various appropriate actions and processes based on programs stored in read-only memory (ROM) 902 or loaded from the storage apparatus 908 into random access memory (RAM) 903.
  • in the RAM 903, various programs and data required for the operation of the electronic device 900 are also stored.
  • the processing apparatus 901, the ROM 902, and the RAM 903 are connected to each other via a bus 904.
  • the input/output (I/O) interface 905 is also connected to bus 904.
  • an input apparatus 906 including, for example, touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.
  • an output apparatus 907 such as liquid crystal display (LCD), speakers, vibrators, etc.
  • a storage apparatus 908 such as magnetic tape, hard disk, etc.
  • the communication apparatus 909 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data.
  • although FIG. 8 shows an electronic device 900 with a variety of apparatuses, it is to be understood that not all of the apparatuses shown are required to be implemented or provided; more or fewer apparatuses may alternatively be implemented or provided.
  • the process described above with reference to the flowchart can be implemented as a computer software program.
  • the disclosed embodiments include a computer program product that includes a computer program carried on a non-transitory computer-readable medium, which includes program code for executing the method shown in the flowchart.
  • the computer program can be downloaded and installed from the network through communication apparatus 909, or installed from storage apparatus 908, or installed from ROM 902.
  • when the computer program is executed by the processing apparatus 901, the above-mentioned functions defined in the method of the disclosed embodiments are performed.
  • Computer readable medium mentioned in this disclosure can be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two.
  • a computer readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, which can be used by or in combination with an instruction execution system, apparatus, or device.
  • computer-readable signal media may include data signals propagated in the baseband or as part of a carrier wave, which carry computer-readable program code. Such propagated data signals can take various forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit programs for use by or in combination with instruction execution systems, apparatuses, or devices.
  • the program code contained on computer readable media can be transmitted using any suitable medium, including but not limited to wires, optical cables, RF (radio frequency), etc., or any suitable combination of the above.
  • the computer readable medium mentioned above can be included in the electronic device mentioned above; it can also exist separately without being assembled into the electronic device.
  • Computer program code for executing the operations disclosed herein may be written in one or more programming languages or combinations thereof, including but not limited to object-oriented programming languages such as Java, Smalltalk, C++, as well as conventional procedural programming languages such as "C" or similar programming languages.
  • Program code can be executed entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer through any type of network, including a local area network (LAN) or wide area network (WAN), or can be connected to an external computer (such as using an Internet service provider to connect through the Internet).
  • each box in a flowchart or block diagram can represent a module, program segment, or part of code that contains one or more executable instructions for implementing specified logical functions.
  • the functions indicated in the boxes can also occur in a different order than those indicated in the accompanying drawings. For example, two consecutive boxes can actually be executed in parallel, and sometimes they can also be executed in the opposite order, depending on the function involved.
  • each box in the block diagram and/or flowchart, as well as the combination of boxes in the block diagram and/or flowchart, can be implemented using dedicated hardware based systems that perform specified functions or operations, or can be implemented using a combination of dedicated hardware and computer instructions.
  • the units described in the present disclosed embodiment can be implemented through software or hardware.
  • the name of the unit does not constitute a qualification for the unit itself in a certain situation, for example, the first obtaining unit can also be described as "the unit for obtaining at least two internet protocol addresses".
  • exemplary types of hardware logic components that may be used include, without limitation: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), and Complex Programmable Logic Devices (CPLDs).
  • machine readable media can be tangible media that can contain or store programs for use by or in combination with instruction execution systems, devices, or devices.
  • a machine readable media can be machine readable signal media or machine readable storage media.
  • a machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • the computer readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • embodiments of the present disclosure provide a method for music play, comprising:
  • before processing the music according to the second operation instruction during the process of playing the music, the method further comprises: in response to the second operation instruction, starting to play the music in the target application if the target application is not playing music.
  • the at least one processing comprises at least one of the following: continuously amplifying a volume of the music, flashing a light source according to the music, flashing a screen according to the music, or vibrating according to the music.
  • the processing the music according to the second operation instruction during the process of playing the music comprises:
  • the operation control is a continuous sliding control, and the operation control changes, during a continuous sliding process of the operation control, with at least one of the following attributes: a color, a size.
  • the determining the amplification parameter corresponding to the second operation instruction comprises:
  • the second operation instruction is used for adjusting the operation control to an on state
  • the processing the music according to the second operation instruction during the process of playing the music comprises: performing the target processing in response to playing until a target point of the music, the target point comprising at least one of the following: a change point of a voice in the music, a change point of an instrument sound in the music, or a beat point of the music, the change point of the voice and/or the instrument sound includes a volume change point, a timbre change point, and a tone change point.
  • the presenting a target image in the second interface in response to playing until the target point of the music comprises:
  • the method further comprises:
  • the eleventh example of the first aspect further comprising: in response to a number of reception times of the third operation instruction exceeding or reaching a preset threshold of reception times, adjusting the operation control for controlling the flashing screen in the first interface to an off state.
  • the performing the target processing in response to playing until the target point of the music comprises:
  • the performing the target processing in response to playing until the target point of the music comprises:
  • embodiments of the present disclosure provide an apparatus for music play, comprising:
  • the apparatus further comprises: a music turning on module configured for, before the processing the music based on the second operation instruction during the process of playing the music, in response to the second operation instruction, starting playing the music in the target application if the target application is not playing music.
  • the at least one processing comprises at least one of the following: continuously amplifying a volume of the music, flashing a light source based on the music, flashing a screen based on the music, or vibrating based on the music.
  • the music turning on module is further configured for:
  • determining an amplification parameter corresponding to the second operation instruction, the amplification parameter being any parameter between a minimum amplification parameter and a maximum amplification parameter; amplifying an original volume of the music based on the amplification parameter during the process of playing the music, the original volume being a volume specified in music data of the music.
  • the operation control is a continuous sliding control, and the operation control changes, during a continuous sliding process of the operation control, with at least one of the following attributes: a color, a size.
  • the music turning on module is further configured for: when determining the amplification parameter corresponding to the second operation instruction, determining a unit amplification parameter corresponding to a unit size based on the maximum amplification parameter, the minimum amplification parameter and a size of the operation control; and determining the amplification parameter corresponding to the second operation instruction based on a slider position of the continuous sliding control and the unit amplification parameter.
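The two-step computation above can be sketched as follows; the function name and the illustrative minimum/maximum amplification parameters of 1.0 and 3.0 are assumptions, not values from the disclosure:

```python
def amplification_from_slider(slider_pos: float, control_size: float,
                              min_amp: float = 1.0, max_amp: float = 3.0) -> float:
    """Sketch of the slider-to-amplification mapping:
    1) derive a unit amplification parameter per unit size from the maximum
       and minimum amplification parameters and the control's size;
    2) scale it by the slider position and offset by the minimum parameter.
    """
    unit_amp = (max_amp - min_amp) / control_size  # amplification per unit size
    return min_amp + slider_pos * unit_amp
```

With these assumed values, a slider at position 0 yields the minimum parameter (no amplification) and a slider at the far end of a 200-unit control yields the maximum.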
  • the music turning on module is further configured for:
  • the target point comprising at least one of the following: a change point of a voice in the music, a change point of an instrument sound in the music, or a beat point of the music; the change point of the voice and/or the instrument sound includes a volume change point, a timbre change point, and a tone change point.
  • the apparatus further comprises: a second interface presentation module configured for in response to playing the music, presenting a second interface in the target application.
  • the music turning on module is further configured for: presenting a target image in the second interface in response to playing until the target point of the music, the target image being an image including at least one color.
  • the music turning on module is further configured for: in response to playing until the target point of the music, determining an attribute of the target image based on a time interval between the target point and a previous target point, the attribute including at least one of the following: a size, a presenting position, a color, a presenting duration, or a shape; presenting the target image in the second interface based on the attribute of the target image.
  • the apparatus further comprises:
  • the apparatus further comprises: a screen flashing off module configured for, in response to a number of reception times of the third operation instruction exceeding or reaching a preset threshold of reception times, adjusting the operation control for controlling the flashing screen in the first interface to an off state.
  • the music turning on module is further configured for: determining an intensity of the light source based on a time interval between the target point and a previous target point in response to playing until the target point of the music; and flashing the light source based on the intensity of the light source.
  • the music turning on module is further configured for: determining an intensity of a vibration based on a time interval between the target point and a previous target point in response to playing until the target point of the music; and vibrating based on the intensity of the vibration.
  • the apparatus further comprises:
  • the apparatus further comprises:
  • an electronic device comprising at least one processor and memory;
  • a computer readable storage medium storing computer execution instructions thereon.
  • the computer execution instructions when executed by a processor, cause a computing device to carry out the method of any of the first aspect.
  • a computer program for carrying out the method of any of the first aspect.
  • a computer program product wherein the computer program product includes computer instructions for carrying out the method of any of the first aspect.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Claims (11)

  1. Method for playing music, wherein the method comprises:
    receiving (S101) a first operation instruction in a target application for playing music;
    in response to the first operation instruction, presenting (S102) a first interface of the target application, wherein the first interface includes an operation control for enhancing a playback performance of the music through at least one processing, wherein the at least one processing is used for presenting music content in a manner that goes beyond sound, and wherein the at least one processing comprises at least one of continuously amplifying a volume of the music, flashing a light source based on the music, flashing a screen based on the music, or vibrating based on the music;
    receiving (S103) a second operation instruction for the operation control; and
    processing (S104) the music based on the second operation instruction during a process of playing the music, wherein processing the music based on the second operation instruction during the process of playing the music comprises:
    when the at least one processing comprises at least one of the following: flashing the light source based on the music, flashing the screen based on the music, or vibrating based on the music, performing the at least one processing in response to a target point of the music during the process of playing the music, wherein the second operation instruction is used for setting the operation control to an on state, and wherein the target point comprises at least one of the following:
    a change point of a voice in the music,
    a change point of an instrument sound in the music, or
    a beat point of the music; and
    wherein the change point of the voice and/or of the instrument sound comprises a volume change point, a timbre change point, or a tone change point, and
    wherein performing the at least one processing in response to the target point of the music comprises:
    determining an intensity of the light source based on a time interval between the target point and a previous target point in response to playing the music until the target point; and
    flashing the light source based on the intensity of the light source.
  2. Method according to claim 1, wherein before processing the music based on the second operation instruction during the process of playing the music, the method further comprises:
    in response to the second operation instruction, starting to play the music in the target application if the target application is not playing music.
  3. Method according to claim 1, wherein, when the operation control is used for continuously amplifying the volume of the music, processing the music based on the second operation instruction during the process of playing the music comprises:
    determining an amplification parameter corresponding to the second operation instruction, the amplification parameter being any parameter between a minimum amplification parameter and a maximum amplification parameter;
    amplifying an original volume of the music based on the amplification parameter during the process of playing the music, the original volume being a volume specified in music data of the music.
  4. Method according to claim 3, wherein determining the amplification parameter corresponding to the second operation instruction comprises:
    determining a unit amplification parameter corresponding to a unit size based on the maximum amplification parameter, the minimum amplification parameter and a size of the operation control; and
    determining the amplification parameter corresponding to the second operation instruction based on a slider position of the continuous sliding control and the unit amplification parameter.
  5. Method according to claim 1, wherein before performing the target processing in response to playing the music until the target point, the method further comprises:
    in response to playing the music, presenting a second interface in the target application;
    wherein performing the target processing in response to playing the music until the target point comprises:
    presenting a target image in the second interface in response to playing the music until the target point, the target image being an image including at least one color.
  6. Method according to claim 5, wherein presenting a target image in the second interface in response to playing the music until the target point comprises:
    in response to playing the music until the target point, determining an attribute of the target image based on a time interval between the target point and a previous target point, the attribute including at least one of the following: a size, a presenting position, a color, a presenting duration, or a shape;
    presenting the target image in the second interface based on the attribute of the target image.
  7. Method according to claim 1, wherein performing the target processing in response to playing the music until the target point comprises:
    determining an intensity of a vibration based on a time interval between the target point and a previous target point in response to playing the music until the target point;
    vibrating based on the intensity of the vibration.
  8. Apparatus (200) for playing music, comprising:
    a first operation instruction receiving module (201) configured for receiving a first operation instruction in a target application for playing music;
    a first interface presenting module (202) configured for, in response to the first operation instruction, presenting a first interface of the target application, wherein the first interface includes an operation control for enhancing a playback performance of the music through at least one processing, wherein the at least one processing is used for presenting music content in a manner that goes beyond sound, and wherein the at least one processing comprises at least one of continuously amplifying a volume of the music, flashing a light source based on the music, flashing a screen based on the music, or vibrating based on the music;
    a second operation instruction receiving module (203) configured for receiving a second operation instruction for the operation control; and
    a music play module (204) configured for processing the music based on the second operation instruction during a process of playing the music, wherein processing the music based on the second operation instruction during the process of playing the music comprises:
    when the at least one processing comprises at least one of the following: flashing the light source based on the music, flashing the screen based on the music, or vibrating based on the music, performing the at least one processing in response to a target point of the music during the process of playing the music, wherein the second operation instruction is used for setting the operation control to an on state, and wherein the target point comprises at least one of the following:
    a change point of a voice in the music,
    a change point of an instrument sound in the music, or
    a beat point of the music; and
    wherein the change point of the voice and/or of the instrument sound comprises a volume change point, a timbre change point, or a tone change point, and
    wherein performing the at least one processing in response to the target point of the music comprises:
    determining an intensity of the light source based on a time interval between the target point and a previous target point in response to playing the music until the target point; and
    flashing the light source based on the intensity of the light source.
  9. Electronic device, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method according to any one of claims 1-7.
  10. Computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions, when executed by a processor, cause a computing device to perform the method according to any one of claims 1-7.
  11. Computer program comprising instructions which, when the program is executed by a computer, cause the computer to perform the method according to any one of claims 1-7.
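The processing recited in claim 8 — flagging target points (e.g. beat points) while the music plays and deriving a flash intensity from the time interval between consecutive target points — can be sketched as follows. This is purely illustrative and not taken from the patent: the envelope-based detector, the rise threshold, and the inverse-interval mapping with its 0.25 s reference constant are all assumptions.

```python
def detect_target_points(envelope, sample_period, rise_ratio=1.5):
    """Flag beat-like target points where the amplitude envelope jumps.

    envelope: non-negative amplitude samples; sample_period: seconds per
    sample. A point is flagged when a sample exceeds its predecessor by
    at least rise_ratio. The threshold is an illustrative assumption.
    """
    return [
        i * sample_period
        for i in range(1, len(envelope))
        if envelope[i - 1] > 0 and envelope[i] / envelope[i - 1] >= rise_ratio
    ]


def light_intensity(interval, reference=0.25, max_intensity=1.0):
    """Map the interval between consecutive target points to an intensity.

    Shorter intervals (faster beats) yield brighter flashes. The 0.25 s
    reference interval is an assumed normalization constant.
    """
    return min(max_intensity, reference / max(interval, 1e-6))


def flash_plan(envelope, sample_period):
    """Pair each target point (after the first) with a flash intensity."""
    points = detect_target_points(envelope, sample_period)
    return [
        (t, light_intensity(t - prev))
        for prev, t in zip(points, points[1:])
    ]
```

For example, `flash_plan([0.1, 0.1, 0.4, 0.3, 0.9], 0.5)` detects target points at 1.0 s and 2.0 s and assigns the second flash an intensity of 0.25 (a 1.0 s interval against the 0.25 s reference), so widely spaced beats produce dimmer flashes than rapid ones.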
EP22862947.3A 2021-09-04 2022-07-22 Music playing method and device Active EP4339759B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111034867.5A CN113721821A (zh) 2021-09-04 2021-09-04 Music playing method and device
PCT/CN2022/107415 WO2023029806A1 (zh) 2021-09-04 2022-07-22 Music playing method and device

Publications (4)

Publication Number Publication Date
EP4339759A1 EP4339759A1 (de) 2024-03-20
EP4339759A4 EP4339759A4 (de) 2024-11-06
EP4339759C0 EP4339759C0 (de) 2025-12-03
EP4339759B1 true EP4339759B1 (de) 2025-12-03

Family

ID=78681768

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22862947.3A Active EP4339759B1 (de) 2021-09-04 2022-07-22 Musikabspielverfahren und -vorrichtung

Country Status (4)

Country Link
US (1) US12236925B2 (de)
EP (1) EP4339759B1 (de)
CN (1) CN113721821A (de)
WO (1) WO2023029806A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113721821A (zh) * 2021-09-04 2021-11-30 Beijing ByteDance Network Technology Co., Ltd. Music playing method and device
CN116320742A (zh) * 2023-02-24 2023-06-23 Xi'an NovaStar Tech Co., Ltd. Parameter-assisted adjustment method and apparatus for an image acquisition device, terminal and medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10055975A1 (de) * 2000-11-11 2002-05-23 Andreas E Hilburg Vorrichtung zur Wiedergabe von mindestens einem Klang
JP2010054530A (ja) * 2008-08-26 2010-03-11 Sony Corp 情報処理装置、発光制御方法およびコンピュータプログラム
JP2010276733A (ja) * 2009-05-27 2010-12-09 Sony Corp 情報表示装置、情報表示方法および情報表示プログラム
US20110015765A1 (en) * 2009-07-15 2011-01-20 Apple Inc. Controlling an audio and visual experience based on an environment
CN201608764U 2009-12-25 2010-10-13 Konka Group Co., Ltd. Mobile phone that changes its display colors according to the music being played
US9417754B2 (en) * 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
KR102207208B1 (ko) * 2014-07-31 2021-01-25 삼성전자주식회사 음악 정보 시각화 방법 및 장치
KR102358025B1 (ko) * 2015-10-07 2022-02-04 삼성전자주식회사 전자 장치 및 전자 장치의 음악 컨텐츠 시각화 방법
CN109040822B 2018-07-16 2021-06-22 Beijing QIYI Century Science & Technology Co., Ltd. Player configuration method and apparatus, and storage medium
CN110289024B * 2019-06-26 2021-03-02 Beijing ByteDance Network Technology Co., Ltd. Audio editing method and apparatus, electronic device, and storage medium
CN112866770B * 2020-12-31 2023-12-05 Beijing QIYI Century Science & Technology Co., Ltd. Device control method and apparatus, electronic device, and storage medium
CN112764711A * 2021-01-29 2021-05-07 Tencent Technology (Shenzhen) Co., Ltd. Music interaction method, apparatus, device, and storage medium
CN113721821A * 2021-09-04 2021-11-30 Beijing ByteDance Network Technology Co., Ltd. Music playing method and device

Also Published As

Publication number Publication date
US12236925B2 (en) 2025-02-25
US20240119919A1 (en) 2024-04-11
EP4339759C0 (de) 2025-12-03
CN113721821A (zh) 2021-11-30
WO2023029806A1 (zh) 2023-03-09
EP4339759A1 (de) 2024-03-20
EP4339759A4 (de) 2024-11-06

Similar Documents

Publication Publication Date Title
US11621022B2 (en) Video file generation method and device, terminal and storage medium
CN104967900B (zh) Method and apparatus for generating a video
US12236925B2 (en) Method and device for music play
EP4130956B1 (de) Multimediawiedergabeverfahren und -vorrichtung
US12142296B2 (en) Audio processing method and apparatus, and electronic device and storage medium
CN110289024B (zh) Audio editing method and apparatus, electronic device, and storage medium
US11004452B2 (en) Method and system for multimodal interaction with sound device connected to network
US11272136B2 (en) Method and device for processing multimedia information, electronic equipment and computer-readable storage medium
WO2017215507A1 (zh) Sound effect processing method and mobile terminal
WO2024140239A1 (zh) Page display method and apparatus, device, computer-readable storage medium, and product
US11941047B2 (en) Music playing method and device
WO2023279892A1 (zh) Method and device for mixed playing of programs
CN109582274A (zh) Volume adjustment method and apparatus, electronic device, and computer-readable storage medium
JP7769122B2 (ja) Virtual resource transfer method, apparatus, device, readable storage medium, and product
WO2024037480A1 (zh) Interaction method and apparatus, electronic device, and storage medium
US12572325B2 (en) Method and device for playing sound effects of music
CN113365010B (zh) Volume adjustment method, apparatus, device, and storage medium
WO2023072280A1 (zh) Media content sending method and apparatus, device, readable storage medium, and product
CN115687684B (zh) Audio playing method and apparatus, electronic device, and readable storage medium
US20240411430A1 (en) Method and apparatus for video interaction, electronic device and storage medium
US10877719B2 (en) Audio device, audio system
CN116546253A (zh) Video playing setting method and apparatus, electronic device, and storage medium
KR101871102B1 (ko) Armband-type wearable guitar multi-effector device and control method thereof
CN116931871A (zh) Playing control method and apparatus, and electronic device
KR20150051073A (ko) Electronic device and method for embedding a watermark in an image

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231214

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref legal event code: R079

Ipc: G06F0003160000

Ref country code: DE

Ref document number: 602022026333

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06F0003048400

A4 Supplementary search report drawn up and despatched

Effective date: 20241008

RIC1 Information provided on ipc code assigned before grant

Ipc: G10H 1/46 20060101ALI20241001BHEP

Ipc: G06F 3/0484 20220101ALI20241001BHEP

Ipc: G06F 3/16 20060101AFI20241001BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20250627

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: F10

Free format text: ST27 STATUS EVENT CODE: U-0-0-F10-F00 (AS PROVIDED BY THE NATIONAL OFFICE)

Effective date: 20251203

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

U01 Request for unitary effect filed

Effective date: 20251203

U07 Unitary effect registered

Designated state(s): AT BE BG DE DK EE FI FR IT LT LU LV MT NL PT RO SE SI

Effective date: 20251208