WO2022063090A1 - Method, apparatus, device and storage medium for user guidance

Method, apparatus, device and storage medium for user guidance

Info

Publication number
WO2022063090A1
Authority
WO
WIPO (PCT)
Prior art keywords
authoring
page
multimedia content
user interface
user
Prior art date
Application number
PCT/CN2021/119400
Other languages
English (en)
French (fr)
Inventor
梁琛奇
白晓双
王语汐
Original Assignee
北京字跳网络技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字跳网络技术有限公司 filed Critical 北京字跳网络技术有限公司
Priority to AU2021348459A priority Critical patent/AU2021348459B2/en
Priority to EP21871460.8A priority patent/EP4092527A4/en
Priority to CA3168363A priority patent/CA3168363A1/en
Priority to JP2022552451A priority patent/JP7270850B2/ja
Priority to BR112022016862A priority patent/BR112022016862A2/pt
Priority to MX2022010651A priority patent/MX2022010651A/es
Priority to KR1020227029489A priority patent/KR102552821B1/ko
Publication of WO2022063090A1 publication Critical patent/WO2022063090A1/zh
Priority to US17/885,538 priority patent/US11733849B2/en
Priority to US18/219,539 priority patent/US20230350553A1/en

Links

Images

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/451 — Execution arrangements for user interfaces
    • G06F3/04883 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/0481 — Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 — Selection of displayed objects or displayed text elements
    • G06F3/04845 — GUI techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/165 — Management of the audio stream, e.g. setting of volume, audio stream path
    • G06F9/453 — Help systems
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 — 2D [Two Dimensional] image generation
    • H — ELECTRICITY
    • H04M — TELEPHONIC COMMUNICATION
    • H04M1/72448 — User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions

Definitions

  • Example embodiments of the present disclosure generally relate to the field of computers, and in particular, to a method, apparatus, device, and computer-readable storage medium for user guidance.
  • More and more applications are currently designed to provide various services to users. For example, users can browse, comment, and forward various types of content in content sharing applications, including multimedia content such as videos, images, image collections, and sounds.
  • Content sharing applications also allow users to create and publish multimedia content such as photos or videos.
  • A solution for user guidance is provided to guide and facilitate a user's use of the authoring functions of an application.
  • A method for user guidance includes acquiring authoring guidance information for a user, where the authoring guidance information is used to guide the user to use an authoring function in an application for authoring multimedia content.
  • The method also includes, if the application is in an active state, presenting a visual representation corresponding to the authoring guidance information in an active page of the application.
  • The method also includes switching from the active page to an authoring page associated with the authoring function based on an interaction associated with the visual representation.
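The three steps of the method can be sketched as plain application logic. This is an illustrative sketch only, not the disclosed implementation; the class and field names (`GuidanceInfo`, `GuidedApp`, the page names) are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GuidanceInfo:
    """Hypothetical container for authoring guidance information."""
    kind: str   # e.g. "element", "history", "new_mode"
    text: str

class GuidedApp:
    """Minimal model of the acquire -> present -> switch flow."""

    def __init__(self) -> None:
        self.page = "play_page"
        self.overlay: Optional[GuidanceInfo] = None
        self.active = False

    def activate(self, guidance: GuidanceInfo) -> None:
        # Step 1 + 2: once the app is active, present the acquired
        # guidance information as a visual representation (an overlay).
        self.active = True
        self.overlay = guidance

    def interact_with_overlay(self) -> None:
        # Step 3: an interaction with the representation switches
        # from the active page to the authoring page.
        if self.active and self.overlay is not None:
            self.page = "authoring_page"
            self.overlay = None
```

A caller would construct the app, activate it with some guidance, and simulate the user's interaction to land on the authoring page.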
  • An apparatus for user guidance includes an information acquisition module configured to acquire authoring guidance information for the user, where the authoring guidance information is used to guide the user to use the authoring function for authoring multimedia content in the application.
  • the apparatus further includes an information presentation module configured to present a visual representation corresponding to the authoring guidance information in an active page of the application if the application is in an active state.
  • the apparatus also includes a page switching module configured to switch from the active page to the authoring page associated with the authoring function based on the interaction associated with the visual representation.
  • In a third aspect of the present disclosure, an electronic device includes at least one processing unit, and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit.
  • The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
  • A computer-readable storage medium is provided.
  • A computer program is stored on the medium which, when executed by a processor, implements the method of the first aspect.
  • FIG. 1 shows a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented
  • FIG. 2 shows a flowchart of a process for user guidance according to some embodiments of the present disclosure
  • FIGS. 3A-3D, 4 and 5 illustrate schematic diagrams of examples of pages of an application according to some embodiments of the present disclosure
  • FIGS. 6A-6B and 7A-7B illustrate schematic diagrams of examples of page interactions of applications according to some embodiments of the present disclosure
  • FIGS. 11A-11C illustrate schematic diagrams of examples of authoring pages for applications in accordance with some embodiments of the present disclosure
  • Figure 12 shows a block diagram of an apparatus for user guidance according to some embodiments of the present disclosure.
  • FIG. 13 shows a block diagram of a device capable of implementing various embodiments of the present disclosure.
  • FIG. 1 shows a schematic diagram of an example environment 100 in which embodiments of the present disclosure can be implemented.
  • An application 120 is installed in a terminal device 110.
  • The application 120 may be a content sharing application capable of providing a user 140 with services related to multimedia content consumption, including multimedia content browsing, commenting, forwarding, authoring (e.g., filming and/or editing), publishing, and the like.
  • multimedia content may be content in various forms, including video, audio, images, collections of images, text, and the like.
  • the terminal device 110 communicates with the server 130 to enable provisioning of services to the application 120 .
  • Terminal device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile phone, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, personal communication system (PCS) device, personal navigation device, personal digital assistant (PDA), audio/video player, digital camera/camcorder, pointing device, television receiver, radio receiver, e-book device, gaming device, or any combination of the foregoing, including accessories and peripherals of these devices or any combination thereof.
  • The terminal device 110 is also capable of supporting any type of user interface (such as "wearable" circuitry, etc.).
  • The server 130 may be any of various types of computing systems/servers capable of providing computing capability, including but not limited to a mainframe, an edge computing node, a computing device in a cloud environment, and the like.
  • The terminal device 110 may present the page 150 of the application 120 to the user 140.
  • page 150 is a play page for presenting multimedia content in the home page of application 120 .
  • the application 120 has at least an authoring function to allow the user 140 to author multimedia content through the function.
  • the authoring function of the application 120 may include a photographing function for photographing multimedia content, wherein activation of the photographing function will activate a photographing device, such as a camera, of the terminal device 110 .
  • the authoring function of the application 120 may further include an uploading function for uploading the photographed multimedia content, so as to allow the user 140 to create using the existing multimedia content in the terminal device 110 or other remote data sources.
  • the page 150 presented by the application 120 includes an authoring user interface element 152 for authoring functionality. If the user selects the authoring user interface element 152 in the page 150, the application 120 may enter the authoring mode from the current browsing mode to begin authoring the multimedia content.
  • Although the authoring user interface element 152 is shown in page 150 in FIG. 1, in some implementations the user interface element associated with activation of the capture function of application 120 may be hidden in a subordinate menu, and user 140 may need multiple operations to be able to select it.
  • the application 120 may also have an editing function for editing multimedia content, enabling the user to perform editing on the captured or uploaded multimedia content.
  • the application 120 may also have a publishing function, allowing the user 140 to publish the authored multimedia content.
  • Conventionally, authoring user interface elements in an application associated with the activation of content authoring may be placed at specific locations on a page.
  • The user actively finds and activates such a user interface element when he or she has a creative intention, so the creation of multimedia content depends entirely on the user's initiative.
  • However, users are often expected to create and publish more content. It is therefore desirable to provide guidance and incentives for user creation, so that users can participate in content creation more conveniently and actively.
  • Embodiments of the present disclosure propose a solution for user guidance.
  • In the solution, authoring guidance information for the user is presented in an active page of the application to guide or motivate the user to use the authoring function of the application. Based on an interaction with the visual representation of the authoring guidance information, a switch is made from the currently active page of the application to the authoring page associated with the authoring function.
  • By using the authoring guidance information to guide or motivate the user to use the authoring function, and by providing a jump to the authoring page from an appropriate active page, the solution allows the user to create multimedia content more conveniently and quickly.
  • FIG. 2 shows a flowchart of a process 200 for user guidance in accordance with some embodiments of the present disclosure.
  • Process 200 may be implemented at terminal device 110 .
  • process 200 will be described with reference to environment 100 of FIG. 1 .
  • the terminal device 110 obtains authoring guidance information for the user 140 .
  • the authoring guide information is used to guide the user 140 to use the authoring function of the application 120 .
  • The authoring guidance information may be based on the historical usage of the application 120 by the user 140 of the terminal device 110.
  • the historical usage may include the historical usage of specific functions of the application 120 (eg, authoring functions, publishing functions, browsing functions, commenting functions, etc.) by the user 140 .
  • The historical usage may include whether the user 140 has used the authoring function of the application 120, the number of times each authoring mode of the authoring function has been used, whether multimedia content has been published in the application 120, the frequency, quantity, and duration of publishing multimedia content, the types of published multimedia content, and the like.
  • The historical usage may also include specifics of how the user 140 browses the multimedia content presented in the application 120, such as the browsing duration, the preferred multimedia content types, the frequency of comments and likes, and the like.
  • The authoring guidance information may also be based on the personal profile information of the user 140, the historical usage of the application 120 by other users, and other information that can help determine guidance for the user.
  • Based on the personal profile information of the user 140, it is possible to determine the user 140's intention to use the authoring function, the intention to publish multimedia content in the application, preferences for specific authoring modes of the authoring function, and/or preferences for the editing functions used to edit multimedia content during the authoring process.
  • the authoring guidance information obtained by the terminal device 110 may reflect one or more of these factors, and thus can provide more targeted guidance for the user 140 .
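As a toy illustration of how such factors might be combined, the sketch below picks a guidance message from a few hypothetical history fields (`used_authoring`, `days_recorded`, `total_views`, `new_mode_unseen`); the disclosure leaves the actual factors and their weighting open.

```python
def choose_guidance(history: dict) -> str:
    """Pick a targeted guidance message from historical usage.

    All keys and the priority order are illustrative assumptions,
    not part of the disclosed method.
    """
    if history.get("new_mode_unseen"):
        # A new authoring mode the user has not tried yet takes priority.
        return "Try the new capture mode!"
    if not history.get("used_authoring"):
        # User has never authored: nudge a first creation.
        return "Tap here to create your first video"
    # Otherwise, reflect the user's own authoring/publishing history back.
    return (f"You have recorded your life for {history['days_recorded']} days; "
            f"your works have been viewed {history['total_views']} times")
```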
  • the terminal device 110 may receive part or all of the authoring guide information from the server 130 . In other words, a part or all of the authoring guide information may be generated by the server 130 .
  • the server 130 may send part or all of the generated authoring guide information to the terminal device 110 .
  • the terminal device 110 may generate a part or all of the authoring guidance information.
  • the authoring guide information may include various kinds of information that can guide or motivate the user to use the authoring function. Specific examples of authoring guidance information are discussed in greater detail below and illustrated by the accompanying figures.
  • the acquired authoring guide information is to be presented to the user.
  • the terminal device 110 presents a visual representation corresponding to the authoring guidance information in the active page of the application 120 .
  • the active page of the application 120 refers to the page that the application 120 currently presents to the user 140 in the active state.
  • Which active page or pages the visual representation corresponding to the authoring guidance information is presented on may be determined based on the type of the authoring guidance information, the manner of presentation, the guiding effect on the user, and the like.
  • the terminal device 110 may present the visual representation corresponding to the authoring guide information in the page to be displayed after the application 120 is switched from the inactive state to the active state.
  • That the application 120 is switched to the active state means that the user 140 currently has an intention to use the application 120; presenting the authoring guidance information at this time better matches the user's intention and allows the user 140 to quickly activate the authoring function.
  • The switching of the application 120 from the inactive state to the active state may be achieved by a cold start or a warm start of the application 120.
  • A so-called cold start means that when the application is started, neither the system program code nor the application code is in memory, and the code needs to be loaded from disk. Examples include launching an app for the first time after a system power-on or restart, or launching an app again a long time after it was closed.
  • A so-called warm start means that when the application starts, most of the relevant code already exists in memory. An example is launching the app again within a short period after it was last launched.
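A crude way to model the cold/warm distinction is by the time elapsed since the app last exited; real detection depends on OS process and memory state, so the function name and threshold below are purely illustrative.

```python
def classify_start(seconds_since_exit, warm_window=300.0):
    """Rough proxy for start type.

    A relaunch within `warm_window` seconds is treated as warm
    (code likely still resident in memory); a first launch or a
    long-delayed relaunch is treated as cold.
    """
    if seconds_since_exit is None:  # never run since power-on/restart
        return "cold"
    return "warm" if seconds_since_exit <= warm_window else "cold"
```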
  • the terminal device 110 may also detect that the user 140 is browsing a specific page when the application 120 is in an active state, and determine to present the authoring guide information in the specific page.
  • the presentation of authoring guidance information in the activity page may depend on various triggering conditions.
  • the terminal device 110 may present a visual representation corresponding to the authoring guide information every time the application 120 switches from an inactive state to an active state or displays a specific active page.
  • the terminal device 110 may present the authoring guide information only once in a day or other predetermined time period. If the application 120 switches from the inactive state to the active state again within a predetermined period of time, the authoring guide information will no longer be presented.
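The once-per-period trigger condition can be sketched as a simple timestamp check; the function name and the one-day default are assumptions, not part of the disclosure.

```python
def should_present(last_shown, now, period=86_400.0):
    """Return True if the guidance may be shown again.

    Show it if it has never been shown (`last_shown` is None) or if at
    least `period` seconds (one day by default) have elapsed since the
    last presentation; otherwise suppress it.
    """
    return last_shown is None or now - last_shown >= period
```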
  • the triggering conditions for the presentation of different authoring guidance information may be different.
  • the terminal device 110 switches from the active page of the application 120 to the authoring page associated with the authoring function based on the interaction associated with the visual representation corresponding to the authoring guidance information.
  • the presentation of the authoring guide information can guide and motivate the user to use the authoring function more.
  • the terminal device 110 may add an overlay to the active page for presentation and place the overlay on top of the active page.
  • the corresponding visual representation of the authoring guide information is presented in the overlay so that the authoring guide information can be more prominently displayed.
  • FIGS. 3A and 3B illustrate some example active pages in application 120 that present authoring guidance information.
  • In active page 300A of FIG. 3A, an overlay 310 is added to present a visual representation of the authoring guidance information.
  • The overlay 310 is implemented, for example, in the form of a pop-up window.
  • In active page 300B of FIG. 3B, an overlay 320 is added to present a visual representation of the authoring guidance information.
  • The overlay 320 is implemented, for example, in the form of a balloon.
  • In both cases, the overlay 310 or 320 is placed on top of the active page 300A or 300B to be presented, and the visual representation corresponding to the authoring guidance information is presented on the overlay.
  • the authoring guidance information may indicate authoring functions of the application 120 .
  • authoring user interface elements for activating authoring functions may be presented in an activity page.
  • For example, an authoring user interface element 312 is presented in the overlay 310 of FIG. 3A.
  • the authoring user interface element 312 is available for selection by the user. If the user selects the authoring user interface element 312, such as by clicking or the like, the authoring function of the application 120 will be activated. Accordingly, the application 120 will switch from the current active page 300A to the authoring page associated with the authoring function. Examples of authoring pages are described below.
  • the authoring guide information may indicate the position of the authoring function of the application 120 in the current page of the application 120 , so that the user 140 can know the authoring function of the application 120 .
  • The overlay 320, shown as a balloon, points in the form of an arrow to the authoring user interface element 322 in active page 300B for activating the authoring function.
  • the presentation of such authoring guide information can guide the user to pay attention to the authoring function and learn how to activate the authoring function.
  • selection of authoring user interface element 322 also activates the authoring functionality of application 120 .
  • the authoring guidance information may indicate historical usage by the user 140 related to the authoring function of the application 120 .
  • Such historical usage conditions can be presented, for example, by visual representations such as textual forms, graphical forms, and the like.
  • In the example of FIG. 3A, descriptive information 314 in text form is presented in the overlay 310, indicating the user's historical usage of the authoring function in the application 120 ("You have recorded your life for XX days") and of the publishing function ("Your works have been viewed NN times in total").
  • Such descriptive information about historical usage may also encourage users to continue using the application 120 for content creation and, in turn, for content publishing.
  • The authoring guidance information may indicate to the user 140 a new mode of the application 120 related to the authoring function. For example, if the application 120 is updated on the terminal device 110 to a new version that provides a new mode related to the authoring function, then when the new version is launched for the first time, or within a period of time after it is launched, authoring guidance information indicating the new mode may be presented.
  • Such guidance can stimulate the user's curiosity about the new mode and thus encourage active use of the authoring function. Indications of new modes may be presented through various visual representations, such as textual forms, graphical forms, and the like. In the example of FIG. 3B, assuming that the capture function in the authoring function has a new capture mode, descriptive information 324 in textual form and an icon 326 in graphical form are displayed in overlay 320 of active page 300B.
  • the terminal device 110 may present the authoring user interface element in the active page for activating the authoring function through animation effects.
  • the dynamic visual presentation makes the user interface element more noticeable to the user 140, thereby incentivizing the use of authoring functions.
  • the active page 300C of FIG. 3C has an authoring user interface element 330 whose activation will trigger a switch from the current active page to the authoring page.
  • The authoring user interface element 330 may visually present an animation effect of continuous enlargement and reduction; the enlarged effect of the authoring user interface element 330 is shown in FIG. 3D.
  • The size of the authoring user interface element 330 may continue to alternate between the states shown in FIGS. 3C and 3D.
  • the animation of the authoring user interface element 330 may continue for a period of time and revert to a normal visual representation after a period of time.
  • authoring user interface element 330 may also be presented with such animation effects at all times.
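The continuous enlarge/shrink effect can be modeled as a sinusoidal scale factor that reverts to normal size after an optional duration; the period and scale bounds below are illustrative assumptions, not values from the disclosure.

```python
import math

def pulse_scale(t, period=1.0, min_scale=1.0, max_scale=1.2, stop_after=None):
    """Scale factor of the element at time t (seconds).

    A cosine oscillates between min_scale and max_scale, starting from
    normal size; after `stop_after` seconds (if given) the element
    reverts to its normal visual representation.
    """
    if stop_after is not None and t >= stop_after:
        return min_scale  # animation over: back to normal size
    mid = (min_scale + max_scale) / 2
    amp = (max_scale - min_scale) / 2
    return mid - amp * math.cos(2 * math.pi * t / period)
```

A renderer would sample this each frame and apply the result as the element's scale transform.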
  • When the application 120 is started, a startup page may be presented first, and then the home page or other active pages are presented after the startup page.
  • The startup page may be used to display event information, promotional information, and the like. Rendering of the startup page may take some time.
  • a visual representation corresponding to the authoring guidance information may be presented in a startup page of the application 120, eg, presenting authoring user interface elements for activating authoring functions.
  • FIG. 4 shows such an example startup page 400 in which authoring user interface elements 410 are presented. If the authoring user interface element 410 is selected in the startup page 400, the application 120 also switches from the startup page 400 to the authoring page. In this way, the user can be quickly guided to use the authoring function during the startup process of the application.
  • The active page on which the authoring guidance information is to be presented may be a play page for playing multimedia content.
  • In general, some applications are designed to directly present to the user a play page for playing multimedia content after switching from the inactive state to the active state.
  • In this case, an authoring user interface element for activating the authoring function and a browsing user interface element for entering the browsing function may be presented in the play area of the play page for the user to select.
  • The presentation of the authoring user interface element is considered here as authoring guidance information.
  • FIG. 5 shows a presentation example of such authoring guide information.
  • FIG. 5 shows a play page 500 for playing multimedia content 502 presented by the application 120 after a cold start or a warm start (ie, after switching from an inactive state to an active state).
  • Also presented in play page 500 are authoring user interface elements 510 and browsing user interface elements 520 .
  • Authoring user interface element 510 and browsing user interface element 520 are presented, for example, in a play area of play page 500 for playing multimedia content 502 . If user 140 selects authoring user interface element 510 , application 120 is switched to the authoring page so that user 140 can use the authoring functionality of application 120 . Note that authoring pages are discussed collectively below.
  • If the user 140 selects the browsing user interface element 520 in the play page 500, e.g., through a finger tap or the like as shown in FIG. 6A, this means that the user 140 desires to enter the browsing mode.
  • In that case, the authoring user interface element 510 and the browsing user interface element 520 cease to be presented in the play page 500 of the application 120, as shown in FIG. 6B.
  • the user can start normal multimedia browsing in the play page 500 of FIG. 6B.
  • The user 140 may be allowed to stop the presentation of the authoring user interface element 510 and the browsing user interface element 520 in other ways. For example, the user 140 may dismiss the presentation of these elements by clicking somewhere in the play area of the multimedia content 502 other than the authoring user interface element 510 and the browsing user interface element 520.
  • If the play page 500 has multiple page tabs, and some or all of the page tabs can be used to play multimedia content, then switching among the multiple page tabs by the user 140 may not cause the presentation of the authoring user interface element 510 and the browsing user interface element 520 to stop.
  • When a switch instruction is received from the user 140 in the play page 500, such as one triggered by a sliding input 710, the application 120 switches from the current page tab 701 (sometimes referred to as the "first page tab") to another page tab 702 (sometimes referred to as the "second page tab").
  • The application 120 will then play the multimedia content 720 corresponding to the page tab 702 in the page 700 of FIG. 7B while maintaining the presentation of the authoring user interface element 510 and the browsing user interface element 520. That is, the authoring user interface element 510 and the browsing user interface element 520 may be continuously presented until the user 140 explicitly chooses to enter either the browsing mode or the authoring mode.
  • Once the user 140 explicitly chooses a mode, the authoring user interface element 510 and the browsing user interface element 520 may stop being presented.
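The presentation rules above (dismiss when the user explicitly selects a mode or taps elsewhere in the play area; keep presenting across page-tab switches) can be summarized in a small state sketch. The class and event names below are illustrative assumptions, not from the patent:

```python
class GuidanceOverlayState:
    """Tracks whether the authoring/browsing UI elements (510/520) stay visible."""

    def __init__(self):
        self.visible = True  # elements are shown when the play page first appears

    def on_event(self, event: str) -> bool:
        """Update visibility for a user event and return the new visibility."""
        if event in ("select_browse", "select_authoring", "tap_play_area"):
            # Explicitly choosing a mode (or tapping elsewhere) dismisses the elements.
            self.visible = False
        elif event == "switch_page_tab":
            # Switching page tabs keeps the elements presented (FIGS. 7A-7B behavior).
            pass
        return self.visible
```

In this sketch, only an explicit mode choice (or a tap outside the elements) flips the state; a tab switch deliberately leaves it untouched.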
  • In addition to being displayed in the active page presented when the application 120 is switched to the active state, the authoring guide information may also be displayed in other pages of the application 120.
  • The application 120 displays the user's personal homepage based on the user's operation, such as the personal homepage 800 shown in FIG. 8.
  • The visual representation corresponding to the authoring guidance information may also be presented on this page, for example, through a bubble prompt box rendered in the form of an overlay 810.
  • Overlay 810 may be similar to the overlay 320 in FIG. 3B, taking the form of an arrow pointing to the location of the authoring user interface element 812 in the personal home page 800 for activating the authoring function.
  • the overlay 810 may also display more authoring guidance information through textual descriptive information 814 and graphical icons 816 .
  • A visual representation corresponding to the authoring guide information may be presented to indicate the location of the authoring user interface element in the currently active page, thereby helping the user 140 understand how to find and launch the authoring function.
  • If the personal home page 800 of the user 140 has no published works (i.e., multimedia content), the authoring guidance information may be presented through the overlay 810.
  • The authoring guidance information may also be displayed in other active pages, such as those of FIGS. 3A, 3B, 4 and 5 above.
  • Rather than generally indicating the authoring function of the application 120, the authoring guide information may provide the user with authoring guidance that is more personalized or more in line with the user's preferences.
  • The authoring guide information may indicate the user 140's preference regarding the authoring modes of the authoring function of the application 120.
  • If a particular mode of the shooting function is determined to be a shooting mode of interest to the user 140, e.g., because it is determined that the user 140 continues to like using that shooting mode, or because a newly introduced shooting mode is something that the user 140 may be willing to try, such a shooting mode may be determined as a recommended shooting mode.
  • the authoring guide information may be determined to indicate such a recommended shooting mode.
  • the terminal device 110 may deform the authoring user interface element in the active page of the application 120 into a style corresponding to the recommended shooting mode.
  • The authoring user interface elements 312, 322, 330, 510, and 812 have styles corresponding to the recommended shooting mode (e.g., "snapshot mode").
  • the authoring user interface element 410 of FIG. 4 may be considered a generic style.
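As one illustrative heuristic for determining such a recommended shooting mode from historical usage, the sketch below prefers the user's most-used mode and falls back to a newly introduced one. The function, its parameters, and the heuristic itself are assumptions for illustration, not the patent's method:

```python
def recommend_shooting_mode(usage_counts, newly_introduced, default="generic"):
    """Pick a shooting mode to recommend.

    usage_counts: dict mapping mode name -> times the user used it.
    newly_introduced: iterable of mode names recently added to the app.
    """
    # Prefer the mode the user demonstrably keeps coming back to.
    if usage_counts:
        favorite = max(usage_counts, key=usage_counts.get)
        if usage_counts[favorite] > 0:
            return favorite
    # Otherwise suggest a newly introduced mode the user may be willing to try.
    for mode in newly_introduced:
        return mode
    # No signal at all: keep the generic style of the authoring element.
    return default
```

The `default` return corresponds to the generic element style mentioned for FIG. 4, where no mode-specific recommendation applies.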
  • The authoring guidance information may indicate the user 140's preference for editing functions for editing the multimedia content.
  • The application 120 may provide a variety of editing functions.
  • application 120 may have special effects editing functionality for adding special effects to edited multimedia content (eg, videos, images, etc.).
  • the application 120 may also have a filter function for applying specific filters to the edited multimedia content.
  • Other editing functions of the application 120 may also include, for example, a music adding function, a voice changing function, a sticker adding function, and the like.
  • Certain editing functions may also have multiple editing modes, which may correspond to different ways of processing the multimedia content.
  • For example, different special effect editing modes in the special effect editing function correspond to different special effects; different filter modes in the filter function correspond to different filters; and different editing modes in the music adding function correspond to different music types or different ways of adding music. Editing modes of other editing functions can be divided similarly.
  • If a particular editing mode is determined to be of interest to the user 140, e.g., because it is determined that the user 140 likes multimedia content edited with that editing mode, prefers to use that editing mode to edit multimedia content, or because the editing mode is newly introduced and the user 140 may be willing to try it, such an editing mode may be determined as a recommended editing mode.
  • Authoring guide information may be determined to indicate such a recommended editing mode.
  • the terminal device 110 may transform the authoring user interface element in the active page of the application 120 into a style corresponding to the sample multimedia content edited using the recommended editing mode.
  • the authoring user interface element 920 in the activity page 900 has a style corresponding to the recommended special effect editing mode (eg, "paint filter").
  • Both the textual descriptive information 1014 and the graphical icon 1016 presented in the overlay 1010 added on the activity page 1000 can indicate a particular recommended special effect editing mode.
  • the authoring guide information indicating the recommended editing mode may also be presented in a corresponding active page when the user 140 browses the multimedia content edited by the recommended editing mode.
  • The terminal device 110 determines whether the active page of the application 120 presents multimedia content edited by the recommended editing mode. For example, in the example of FIG. 9, it is assumed that the user is browsing multimedia content 910, which has been edited with the "oil painting filter" editing mode.
  • the terminal device 110 may present a visual representation corresponding to the authoring guide information in the active page, such as the transformed authoring user interface element 920, to indicate the recommended editing mode.
  • Other forms of authoring guidance information may also be presented, such as the visual representation 930 in the form of graphics and text.
  • The presentation of such authoring guidance information may be provided when it is determined that the user 140 has an authoring intent to publish or edit multimedia content, so as to avoid excessive interference to the user 140 from the presentation of the authoring guidance information.
  • the user may be presented with authoring guide information indicating the recommended editing mode "oil painting filter".
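As a minimal sketch of the check described above, assuming hypothetical names, a helper could surface the recommendation only when the browsed content was edited with the recommended editing mode and the user shows authoring intent:

```python
def guidance_for_page(content_edit_mode, recommended_mode, has_authoring_intent=True):
    """Return a guidance string to present on the active page, or None.

    content_edit_mode: editing mode applied to the content being browsed.
    recommended_mode: the editing mode recommended for this user.
    """
    # Only interrupt users who appear to have authoring intent, to limit interference.
    if not has_authoring_intent:
        return None
    # Surface the recommendation when the browsed content uses that very mode.
    if content_edit_mode == recommended_mode:
        return f"Try the {recommended_mode} too"
    return None
```

The returned string stands in for the visual representation (e.g., a transformed element or a graphics-and-text hint) that the terminal device would actually render.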
  • the application 120 will switch from the current page to the authoring page.
  • FIGS. 11A-11C illustrate some examples of authoring pages of the application 120.
  • the authoring page is associated with authoring functions of the application 120, for example, functions such as shooting and editing of multimedia content are provided.
  • the authoring page 1100A has a capture user interface element 1110 that can be selected to activate the capture of multimedia content, such as video, images, and the like.
  • the authoring page 1100A also has an album user interface element 1120 that, by selection, may allow the user to upload multimedia content from local or other data sources for subsequent authoring.
  • Authoring page 1100A also has user interface elements for activating editing functions for editing captured or uploaded multimedia content.
  • The user interface elements for activating the editing functions include, but are not limited to: the special effect user interface element 1130, for activating the special effect editing function; the flip user interface element 1140, for activating a flip function that flips the presentation direction of the authored multimedia content in the page; the filter user interface element 1150, for activating the filter function; the flash user interface element 1160, for turning the flash of the terminal device 110 on or off; and the music adding user interface element 1170, for activating the music adding function to add music to the authored multimedia content. It should be understood that FIG. 11A shows only example editing functions, and the authoring page may have more, fewer, or different editing functions as desired.
  • In some cases, the authoring page to be presented is an authoring page associated with activating the shooting function in the recommended shooting mode. For example, if the authoring user interface elements 312, 322, 330, 510, and 812 in the examples of FIGS. 3A-3D, 5, and 8 are selected, then after switching from the pages of FIGS. 3A-3D, 5, and 8 to the authoring page shown in FIG. 11A, the shooting function is positioned to the recommended shooting mode 1180 (e.g., "Snapshot Mode"). In this way, the user can quickly shoot using the desired shooting mode.
  • the shooting function may be positioned to the default shooting mode.
  • The recommended editing mode may be automatically applied to the multimedia content being authored in the authoring page, so as to present multimedia content automatically edited with the recommended editing mode.
  • the recommended editing mode is a recommended special effect editing mode, such as the special effect editing mode indicated by authoring user interface element 1012 in the activity page 1000 of FIG. 10 .
  • application 120 switches to authoring page 1100B.
  • In the authoring page 1100B, the special effect user interface element 1130 is transformed to indicate the recommended special effect editing mode, and the multimedia content 1102 currently being authored in the authoring page 1100B is automatically edited to add a special effect 1104 to an object in the multimedia content 1102.
  • The recommended editing mode is a recommended filter mode, such as the filter mode indicated by the authoring user interface element 920 in the activity page 900 of FIG. 9, e.g., the "oil painting filter".
  • application 120 switches to authoring page 1100C.
  • the multimedia content 1106 being authored is automatically edited by the corresponding recommended filter mode, for example, to present the effect of "oil painting filter”.
  • An indication 1108 of the recommended filter mode used may also be presented in the authoring page 1100C.
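The behavior of FIGS. 11A-11C, where the authoring page opens pre-positioned to a recommended shooting mode or with a recommended editing mode already applied, can be sketched as below. The dictionary-based page state and the (kind, name) recommendation tuple are illustrative assumptions, not the patent's data model:

```python
def open_authoring_page(recommended=None):
    """Build a minimal authoring-page state for a page switch.

    recommended: None, or a (kind, name) tuple such as ("shooting_mode",
    "snapshot") or ("editing_mode", "oil painting filter").
    """
    page = {"shooting_mode": "default", "applied_edits": []}
    if recommended is None:
        return page  # no recommendation: default shooting mode, no edits
    kind, name = recommended
    if kind == "shooting_mode":
        page["shooting_mode"] = name          # position capture to the mode (FIG. 11A)
    elif kind == "editing_mode":
        page["applied_edits"].append(name)    # auto-apply e.g. an effect or filter
    return page
```

Pre-seeding the page state this way is what lets the user start shooting or editing immediately instead of locating the mode manually.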
  • the multimedia content is photographed after selecting the photographing user interface element in the authoring page.
  • the photographing function of the authoring page can be directly activated to photograph the multimedia content.
  • The terminal device 110 may also perform content presentation based on a predetermined volume rather than the target volume of the multimedia content currently to be presented.
  • the predetermined volume selected by the terminal device 110 is lower than the target volume of the multimedia content itself. In some examples, the predetermined volume may be zero.
  • the terminal device 110 may initially set the playback volume of the multimedia content to a predetermined volume (for example, mute by default, or a lower volume).
  • the overall playback of the multimedia content may be paused, or only the picture is played without sound.
  • The terminal device 110 may present the multimedia content at the predetermined volume for a predetermined period of time (e.g., 2 seconds, 3 seconds, 5 seconds, etc.). If no operation instruction on the multimedia content (for example, the user clicking to play or raising the volume) is received within the predetermined time period, the terminal device 110 may present the multimedia content at the target volume after the predetermined time period expires.
  • the terminal device 110 may gradually increase the playback volume of the multimedia content from the predetermined volume to the target volume while the multimedia content is being presented. That is, the playback volume of the multimedia content is gradually increased. This gradually increasing process can give the user some buffer time.
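The two volume strategies just described, holding a predetermined (lower) volume for a fixed period before jumping to the target volume, and gradually ramping up to it, can be combined into one hedged helper; all parameter names and defaults are illustrative:

```python
def playback_volume(t, target, predetermined=0.0, hold=3.0, ramp=None):
    """Playback volume at time t (seconds) since the play page appeared.

    If `ramp` is given, the volume increases linearly from `predetermined`
    to `target` over `ramp` seconds; otherwise `predetermined` is held for
    `hold` seconds and then the volume jumps to `target`.
    """
    if ramp is not None:
        # Gradual-increase strategy: linear ramp toward the target volume.
        if t >= ramp:
            return target
        return predetermined + (target - predetermined) * (t / ramp)
    # Hold-then-jump strategy: predetermined volume until the period expires.
    return predetermined if t < hold else target
```

With `predetermined=0.0` the hold phase corresponds to the "mute by default" example; a small positive value gives the "lower volume" variant.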
  • FIG. 12 shows a schematic structural block diagram of an apparatus 1200 for user guidance according to some embodiments of the present disclosure.
  • the apparatus 1200 may be implemented as or included in the terminal device 110 .
  • the various modules/components in the apparatus 1200 may be implemented by hardware, software, firmware, or any combination thereof.
  • the apparatus 1200 includes an information obtaining module 1210 configured to obtain authoring guide information for the user, where the authoring guide information is used to guide the user to use the authoring function in the application for authoring multimedia content.
  • the apparatus 1200 further includes an information presentation module 1220 configured to present a visual representation corresponding to the authoring guidance information in an active page of the application if the application is in an active state.
  • The apparatus 1200 also includes a page switching module 1230 configured to switch from the active page to the authoring page associated with the authoring function based on the interaction associated with the visual representation.
  • the authoring function includes at least one of the following: a shooting function for shooting multimedia content and an uploading function for uploading local multimedia content.
  • the active page includes a page to be rendered after the application switches from the inactive state to the active state.
  • the information presentation module 1220 is configured to present authoring user interface elements for activating authoring functions in the startup page if the active page is the application's startup page.
  • the page switch module 1230 is configured to switch from the start page to the author page if the authoring user interface element is selected in the start page.
  • The information presentation module 1220 is configured to: if the active page is a play page for playing multimedia content, present, in the play area of the multimedia content, an authoring user interface element for activating the authoring function and a browsing user interface element for entering the browsing function.
  • the page switch module 1230 is configured to switch from the play page to the author page if the author user interface element is selected in the play page.
  • The apparatus 1200 further includes at least one of the following: a presentation stopping module configured to stop presenting the authoring user interface element and the browsing user interface element in the play page if the browsing user interface element is selected in the play page; and a tab switching presentation module configured to, while the authoring user interface element and the browsing user interface element are presented, maintain the presentation of the authoring user interface element and the browsing user interface element when playing the multimedia content corresponding to a second page tab if the play page is switched from a first page tab to the second page tab.
  • the information presentation module 1220 is configured to: add an overlay in the active page, the overlay is placed on top of the active page; and present a visual representation corresponding to the authoring guidance information in the overlay.
  • the authoring function includes a photographing function and the authoring guide information indicates a recommended photographing mode for the photographing function.
  • the information presentation module 1220 is configured to: morph the authoring user interface elements in the active page into a style corresponding to the recommended shooting mode.
  • the page switching module 1230 is configured to switch from the active page to the authoring page associated with activating the capture function in the recommended capture mode in response to the authoring user interface element being selected.
  • the authoring guidance information indicates a recommended editing mode for editing the multimedia content.
  • The information presentation module 1220 is configured to: determine whether the active page presents multimedia content edited by the recommended editing mode; and if it is determined that the active page presents the edited multimedia content, present, in the active page, a visual representation corresponding to the authoring guide information to indicate the recommended editing mode.
  • the authoring guidance information indicates a recommended editing mode for editing the multimedia content.
  • The information presentation module 1220 is configured to: morph the authoring user interface element in the active page into a style corresponding to sample multimedia content edited using the recommended editing mode. In some embodiments, the page switching module 1230 is configured to: in response to the authoring user interface element being selected, switch from the active page to the authoring page for activating the authoring function; and present the multimedia content automatically edited by the recommended editing mode in the authoring page.
  • the application also includes publishing functionality for publishing multimedia content.
  • The information presentation module 1220 is configured to: if the user has not published multimedia content in the application, present a visual representation corresponding to the authoring guidance information to indicate the location, in the active page, of the authoring user interface element used to activate the authoring function.
  • The authoring guidance information is related to at least one of: the user's preference for an authoring mode of the authoring function, the user's preference for an editing function for editing multimedia content, and the user's intention to publish multimedia content in the application.
  • The apparatus 1200 further includes: a volume adjustment presentation module configured to, if the application is switched to the active state and the active page is a play page for presenting multimedia content having a target volume, present the multimedia content based on a predetermined volume, the predetermined volume being lower than the target volume.
  • the volume adjustment presentation module is configured to: present the multimedia content at a predetermined volume for a predetermined period of time; and if the predetermined period of time expires, present the multimedia content at a target volume.
  • the volume adjustment presentation module is configured to gradually increase the playback volume of the multimedia content from a predetermined volume level until the target volume is reached while the multimedia content is being presented.
  • FIG. 13 shows a block diagram illustrating a computing device 1300 in which one or more embodiments of the present disclosure may be implemented. It should be understood that the computing device 1300 shown in FIG. 13 is merely exemplary and should not constitute any limitation on the functionality and scope of the embodiments described herein. The computing device 1300 shown in FIG. 13 may be used to implement the terminal device 110 of FIG. 1 .
  • computing device 1300 is in the form of a general-purpose computing device.
  • Components of computing device 1300 may include, but are not limited to, one or more processors or processing units 1310, memory 1320, storage devices 1330, one or more communication units 1340, one or more input devices 1350, and one or more output devices 1360.
  • the processing unit 1310 may be an actual or virtual processor and can perform various processes according to programs stored in the memory 1320 .
  • multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capabilities of computing device 1300 .
  • Computing device 1300 typically includes a number of computer storage media. Such media can be any available media accessible by computing device 1300, including but not limited to volatile and nonvolatile media, removable and non-removable media.
  • Memory 1320 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof.
  • Storage device 1330 may be removable or non-removable media, and may include machine-readable media, such as flash drives, magnetic disks, or any other media that can store information and/or data (e.g., training data for training) and that can be accessed within computing device 1300.
  • Computing device 1300 may further include additional removable/non-removable, volatile/non-volatile storage media.
  • For example, a disk drive may be provided for reading from or writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive may be provided for reading from or writing to a removable, non-volatile optical disk.
  • each drive may be connected to a bus (not shown) by one or more data media interfaces.
  • Memory 1320 may include a computer program product 1325 having one or more program modules configured to perform various methods or actions of various embodiments of the present disclosure.
  • the communication unit 1340 enables communication with other computing devices through a communication medium. Additionally, the functions of the components of computing device 1300 may be implemented in a single computing cluster or multiple computing machines capable of communicating through a communication connection. Accordingly, computing device 1300 may operate in a networked environment using logical connections to one or more other servers, network personal computers (PCs), or another network node.
  • Input device 1350 may be one or more input devices, such as a mouse, keyboard, trackball, and the like.
  • Output device 1360 may be one or more output devices, such as a display, speakers, printer, and the like.
  • Computing device 1300 may also communicate, as needed through communication unit 1340, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with computing device 1300, or with any device (e.g., network card, modem, etc.) that enables computing device 1300 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (not shown).
  • a computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions are executed by a processor to implement the method described above.
  • a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions executed by a processor to implement the method described above.
  • These computer-readable program instructions may be provided to a processing unit of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium. These instructions cause a computer, programmable data processing apparatus, and/or other device to operate in a specific manner, so that the computer-readable medium storing the instructions comprises an article of manufacture including instructions implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device, causing a series of operational steps to be performed on the computer, other programmable data processing apparatus, or other device to produce a computer-implemented process, such that the instructions executing on the computer, other programmable data processing apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in the flowcharts or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or can be implemented by a combination of dedicated hardware and computer instructions.


Abstract

According to embodiments of the present disclosure, a method, apparatus, device, and storage medium for user guidance are provided. The method includes obtaining authoring guidance information for a user, the authoring guidance information being used to guide the user's use of an authoring function in an application for authoring multimedia content. The method further includes, if the application is in an active state, presenting a visual representation corresponding to the authoring guidance information in an active page of the application. The method further includes switching from the active page to an authoring page associated with the authoring function based on an interaction associated with the visual representation. In this way, the user's use of the authoring function of the application can be guided and promoted.

Description

Method, apparatus, device and storage medium for user guidance
This application claims priority to Chinese Patent Application No. 202011027615.5, filed with the Chinese Patent Office on September 25, 2020 and entitled "Method, Apparatus, Device and Storage Medium for User Guidance", the entire contents of which are incorporated herein by reference.
Technical Field
Example embodiments of the present disclosure generally relate to the field of computers, and in particular to a method, apparatus, device, and computer-readable storage medium for user guidance.
Background
Currently, more and more applications are designed to provide various services to users. For example, users can browse, comment on, and forward various types of content in content-sharing applications, including multimedia content such as videos, images, image sets, and sounds. In addition, content-sharing applications also allow users to author and publish multimedia content such as photos or videos. For application providers, it is desirable that more and more users participate in content authoring and publishing, which not only provides more and richer multimedia content on the platform, but also increases user engagement with the application.
Summary
According to example embodiments of the present disclosure, a solution for user guidance is provided to guide and promote a user's use of an authoring function of an application.
In a first aspect of the present disclosure, a method for user guidance is provided. The method includes obtaining authoring guidance information for a user, the authoring guidance information being used to guide the user's use of an authoring function in an application for authoring multimedia content. The method further includes, if the application is in an active state, presenting a visual representation corresponding to the authoring guidance information in an active page of the application. The method further includes switching from the active page to an authoring page associated with the authoring function based on an interaction associated with the visual representation.
In a second aspect of the present disclosure, an apparatus for user guidance is provided. The apparatus includes an information obtaining module configured to obtain authoring guidance information for a user, the authoring guidance information being used to guide the user's use of an authoring function in an application for authoring multimedia content. The apparatus further includes an information presentation module configured to present, if the application is in an active state, a visual representation corresponding to the authoring guidance information in an active page of the application. The apparatus further includes a page switching module configured to switch from the active page to an authoring page associated with the authoring function based on an interaction associated with the visual representation.
In a third aspect of the present disclosure, an electronic device is provided. The device includes at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. A computer program is stored on the medium, and the program, when executed by a processor, implements the method of the first aspect.
It should be understood that the content described in this Summary is not intended to limit key or important features of embodiments of the present disclosure, nor is it used to limit the scope of the present disclosure. Other features of the present disclosure will become readily understood from the following description.
Brief Description of the Drawings
The above and other features, advantages, and aspects of the embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. In the drawings, the same or similar reference numerals denote the same or similar elements, in which:
FIG. 1 shows a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented;
FIG. 2 shows a flowchart of a process for user guidance according to some embodiments of the present disclosure;
FIGS. 3A to 3D, FIG. 4, and FIG. 5 show schematic diagrams of examples of pages of an application according to some embodiments of the present disclosure;
FIGS. 6A to 6B and FIGS. 7A to 7B show schematic diagrams of examples of page interactions of an application according to some embodiments of the present disclosure;
FIGS. 8 to 10 show schematic diagrams of examples of pages of an application according to some embodiments of the present disclosure;
FIGS. 11A to 11C show schematic diagrams of examples of authoring pages of an application according to some embodiments of the present disclosure;
FIG. 12 shows a block diagram of an apparatus for user guidance according to some embodiments of the present disclosure; and
FIG. 13 shows a block diagram of a device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of protection of the present disclosure.
In the description of the embodiments of the present disclosure, the term "include" and its variants should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "at least partially based on". The terms "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". Other explicit and implicit definitions may also be included below.
FIG. 1 shows a schematic diagram of an example environment 100 in which embodiments of the present disclosure can be implemented. In this example environment 100, an application 120 is installed in a terminal device 110. The application 120 may be a content-sharing application capable of providing the user 140 with services related to multimedia content consumption, including browsing, commenting, forwarding, authoring (e.g., shooting and/or editing), and publishing of multimedia content. Herein, "multimedia content" may be content in various forms, including video, audio, images, image sets, text, and the like. In some embodiments, the terminal device 110 communicates with a server 130 to enable the provisioning of services for the application 120.
The terminal device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile phone, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, personal communication system (PCS) device, personal navigation device, personal digital assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, e-book device, gaming device, or any combination of the foregoing, including accessories and peripherals of these devices or any combination thereof. In some embodiments, the terminal device 110 can also support any type of user-facing interface (such as "wearable" circuitry, etc.). The server 130 is any of various types of computing systems/servers capable of providing computing capabilities, including but not limited to mainframes, edge computing nodes, computing devices in cloud environments, and so on.
In the environment 100 of FIG. 1, if the application 120 is in an active state, the terminal device 110 may present a page 150 of the application 120 to the user 140. In the example of FIG. 1, the page 150 is a play page for presenting multimedia content in the home page of the application 120. The application 120 has at least an authoring function to allow the user 140 to author multimedia content through this function. The authoring function of the application 120 may include a shooting function for shooting multimedia content, where activation of the shooting function activates a capture apparatus of the terminal device 110, such as a camera. Alternatively or additionally, the authoring function of the application 120 may also include an upload function for uploading already-shot multimedia content, to allow the user 140 to author using multimedia content already available locally on the terminal device 110 or in other remote data sources.
In the example of FIG. 1, the page 150 presented by the application 120 includes an authoring user interface element 152 for the authoring function. If the user selects the authoring user interface element 152 in the page 150, the application 120 may enter an authoring mode from the current browsing mode, thereby starting the authoring of multimedia content.
Note that although the authoring user interface element 152 is shown in the page 150 in FIG. 1, in some implementations, the user interface element associated with activating the shooting function of the application 120 may be hidden in a lower-level menu, and the user 140 may need multiple operations to be able to select that user interface element.
In addition to the authoring function, in some embodiments, the application 120 may also have an editing function for editing multimedia content, enabling the user to edit shot or uploaded multimedia content. The application 120 may also have a publishing function that allows the user 140 to publish the authored multimedia content.
It should be understood that the structure and functions of the environment 100 are described for exemplary purposes only, without implying any limitation on the scope of the present disclosure. For example, the page 150 of FIG. 1 is merely one example page of the application, and various page designs may exist in practice.
传统上,应用中与内容创作的激活相关联的创作用户界面元件可能会被放置在页面的特定位置。用户在有创作意图的时候主动查找和激活该用户界面元件,以进行多媒体内容的创作。因此,多媒体内容的创作完全依赖于用户的主动性。对于涉及内容共享的应用,往往希望用户能够更多地进行内容的创作和发布。因此,期望能够提供对用户创作的引导和激励,使用户更方便、更积极地参与到内容创作中。
本公开的实施例提出了一种用于用户引导的方案。在该方案中，在应用的活动页面中呈现针对用户的创作引导信息，以引导或激励用户对应用的创作功能的使用。基于对创作引导信息的视觉表示的交互，将从应用的当前活动页面切换到与创作功能相关联的创作页面。根据该方案，通过利用创作引导信息来引导或激励用户使用创作功能，并且在适当的活动页面提供对创作页面的跳转，使用户能够更方便、快速地进行多媒体内容的创作。
以下将继续参考附图描述本公开的一些示例实施例。
图2示出了根据本公开的一些实施例的用于用户引导的过程200的流程图。过程200可以在终端设备110处实现。为便于讨论,将参考图1的环境100来描述过程200。
在框210,终端设备110获取针对用户140的创作引导信息。创作引导信息用于引导用户140对应用120的创作功能的使用。
在一些实施例中，创作引导信息可以基于用户140对应用120的历史使用状况。历史使用状况可以包括用户140对应用120的特定功能（例如，创作功能、发布功能、浏览功能、评论功能等）的历史使用状况。例如，历史使用状况可以包括用户140是否曾使用应用120的创作功能，对创作功能的创作模式的使用数量，在应用120中是否曾发布多媒体内容，发布多媒体内容的频率、数量、持续时间等，以及发布多媒体内容的类型。此外，历史使用状况还可以包括用户140在应用120中浏览多媒体内容时展现的特征，例如浏览时间长度、偏好的多媒体内容类型、评论和点赞的频率等。备选地或附加地，在一些实施例中，创作引导信息还可以基于用户140的个人简档信息，应用120的其他用户的历史使用状况，以及其他能够有助于确定对用户的引导的信息。
通过用户140和/或其他用户的历史使用状况,用户140的个人简档信息等,可以确定用户140对创作功能的使用意图,在应用中发布多媒体内容的意图,对创作功能的特定创作模式的偏好,和/或对在创作过程中编辑多媒体内容的编辑功能的偏好,等等。终端设备110获取的创作引导信息可以反映这些方面的因素中的一个或多个,因此能够提供针对用户140的更有针对性的引导。
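仅作为帮助理解上述确定逻辑的示意性草图（以下代码中的函数名与字段名均为本文假设，并非本申请方案或任何实际产品的限定实现），可以根据历史使用状况生成创作引导信息所要指示的内容：

```python
def build_guide_info(history: dict) -> dict:
    """根据历史使用状况生成创作引导信息的草图。

    history 中的键（如 'published_count'、'favorite_capture_mode'）
    均为本文假设的字段名。
    """
    info = {}
    if history.get("published_count", 0) == 0:
        # 用户从未发布过多媒体内容：指示创作入口在页面中的位置
        info["show_entry_position"] = True
    capture_mode = history.get("favorite_capture_mode")
    if capture_mode:
        # 用户偏好某种拍摄模式：引导信息指示该推荐拍摄模式
        info["recommended_capture_mode"] = capture_mode
    edit_mode = history.get("favorite_edit_mode")
    if edit_mode:
        # 用户偏好某种编辑模式：引导信息指示该推荐编辑模式
        info["recommended_edit_mode"] = edit_mode
    return info
```

例如，对于从未发布过作品的用户，所生成的引导信息将指示创作入口的位置；对于偏好特定拍摄模式的用户，引导信息则指示相应的推荐拍摄模式。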
在一些实施例中,终端设备110可以从服务器130接收到创作引导信息的部分或全部。换言之,可以由服务器130来生成创作引导信息的一部分或全部。服务器130可以将所生成的部分或全部创作引导信息发送给终端设备110。在一些实施例中,终端设备110可以生成创作引导信息的一部分或全部。
创作引导信息可以包括能够引导或激励用户使用创作功能的各类信息。创作引导信息的具体示例将在下文中更详细讨论和通过附图来说明。
所获取的创作引导信息要被展现给用户。在框220,如果应用120处于活动状态,终端设备110在应用120的活动页面中呈现创作引导信息对应的视觉表示。应用120的活动页面指的是在活动状态下应用120当前向用户140呈现的页面。具体在哪个或哪些活动页面中呈现创作引导信息对应的视觉表示,可以基于创作引导信息的类型、呈现方式、以及对用户的引导效果等来确定。
在一些实施例中，终端设备110可以在应用120从非活动状态切换到活动状态后要展示的页面中呈现创作引导信息对应的视觉表示。应用120被切换到活动状态意味着用户140当前具有使用应用120的意图，此时呈现创作引导信息可能更匹配用户意图，方便用户140快速激活创作功能。
应用120从非活动状态到活动状态的切换可以通过应用120的冷启动或热启动实现。所谓冷启动，是指在启动应用时，系统程序代码和应用代码均不在内存中，需要从磁盘提取这些代码。例如，在系统开机或者重新启动后首次启动应用；或者启动应用后将其关闭，然后经过一段较长时间之后再次启动。所谓热启动，是指应用启动时，大部分系统代码已经存在于内存中。例如，在首次启动应用之后，短时间内再次启动应用。
在一些实施例中,终端设备110也可以在应用120处于活动状态下,检测到用户140正在浏览特定页面,并确定在该特定页面中呈现创作引导信息。
在一些实施例中，创作引导信息在活动页面中的呈现可以取决于各种触发条件。例如，终端设备110可以在应用120每次从非活动状态切换到活动状态后或者展示特定活动页面时，均呈现与创作引导信息对应的视觉表示。又例如，终端设备110可以在一天或其他预定时间段内仅呈现一次创作引导信息。如果在预定时间段内应用120再次从非活动状态切换到活动状态，将不再呈现创作引导信息。在一些实施例中，不同创作引导信息的呈现的触发条件可以不同。
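上述"在预定时间段内仅呈现一次"的触发条件可以概括为如下示意性草图（类名与默认时长均为本文假设，并非限定实现）：

```python
import time

class GuidePresenter:
    """在预定时间段内仅呈现一次创作引导信息的触发判断草图。"""

    def __init__(self, min_interval_seconds: float = 24 * 3600):
        self.min_interval = min_interval_seconds
        self.last_shown = None  # 上次呈现创作引导信息的时间戳

    def should_show(self, now: float = None) -> bool:
        now = time.time() if now is None else now
        # 从未呈现过，或距上次呈现已超过预定时间段，才再次呈现
        if self.last_shown is None or now - self.last_shown >= self.min_interval:
            self.last_shown = now
            return True
        return False
```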
在框230,终端设备110基于与创作引导信息对应的视觉表示相关联的交互,从应用120的活动页面切换到与创作功能相关联的创作页面。由此,创作引导信息的呈现可以引导和激励用户更多地使用创作功能。
为更好地理解创作引导信息的呈现和交互的示例实施例,下面将参考示例用户界面来进行描述。
在一些实施例中，在呈现创作引导信息对应的视觉表示时，终端设备110可以在用于呈现的活动页面中添加叠加层，并将该叠加层放置在活动页面的顶层。创作引导信息对应的视觉表示被呈现在叠加层中，从而可以更突出地展示创作引导信息。
图3A和图3B示出了在应用120中呈现创作引导信息的一些示例活动页面。在图3A的活动页面300A中，叠加层310被添加用于呈现创作引导信息的视觉表示。叠加层310例如被实现为弹窗的形式。在图3B的活动页面300B中，叠加层320被添加用于呈现创作引导信息的视觉表示。叠加层320例如被实现为气泡提示框的形式。
在一些示例中,在应用120通过冷启动或热启动进入活动状态后,要呈现的活动页面300A或300B中被添加叠加层310或320,并且在叠加层310或320上呈现创作引导信息对应的视觉表示。
在一些实施例中，创作引导信息可以指示应用120的创作功能。例如，在呈现这样的创作引导信息对应的视觉表示时，可以在活动页面中呈现用于激活创作功能的创作用户界面元件，例如图3A的叠加层310中的创作用户界面元件312。该创作用户界面元件312可供用户选择。如果用户选择该创作用户界面元件312，例如通过点击等方式进行选择，那么将会激活应用120的创作功能。相应地，应用120将会从当前的活动页面300A切换到与创作功能相关联的创作页面。创作页面的示例将在下文中描述。
在一些实施例中,创作引导信息可以指示应用120的创作功能在应用120的当前页面中的位置,以方便用户140能够获知应用120所具有的创作功能。在图3B的示例中,被展示为气泡提示框的叠加层320通过箭头的形式指向活动页面300B中用于激活创作功能的创作用户界面元件322。这样的创作引导信息的呈现能够引导用户注意创作功能,并获知如何激活创作功能。类似地,创作用户界面元件322的选择也会激活应用120的创作功能。
在一些实施例中，创作引导信息可以指示用户140与应用120的创作功能相关的历史使用状况。这样的历史使用状况例如可以通过文字形式、图表形式等视觉表示来呈现。例如，在图3A的活动页面300A中，在叠加层310中呈现文字形式的描述性信息314，用于指示用户在应用120中使用创作功能的历史使用状况"你已记录生活XX天"，以及使用发布功能的历史使用状况"你的作品共被观看NN次"。这些关于历史使用状况的描述性信息也可以鼓励用户继续使用应用120进行内容创作，进而还可以进行内容发布。
在一些实施例中，创作引导信息可以指示应用120的与创作功能相关的新模式。例如，如果在终端设备110上更新应用120的新版本，该新版本中提供了与创作功能相关的新模式，那么可以在新版本的应用120被首次启动或者在一段时间内被启动时，呈现创作引导信息来指示与创作功能相关的新模式。这样的引导可以激发用户对新模式的好奇心，从而积极去使用创作功能。对与创作功能相关的新模式的指示可以通过各种视觉表示方式，例如文本形式、图形表示等来呈现。在图3B的示例中，假设创作功能中的拍摄功能具有新的拍摄模式，在活动页面300B的叠加层320中显示文字形式的描述性信息324和图形形式的图标326。
在一些实施例中，在呈现创作引导信息对应的视觉表示时，终端设备110可以将活动页面中用于激活创作功能的创作用户界面元件通过动画效果进行呈现。动态的视觉呈现方式使用户140更能注意到该用户界面元件，进而激励对创作功能的使用。例如，图3C的活动页面300C中具有创作用户界面元件330，该用户界面元件的激活将会触发从当前活动页面切换到创作页面。在应用120从非活动状态切换到活动状态时，创作用户界面元件330可以在视觉上呈现不断放大缩小的动画效果，例如创作用户界面元件330放大后的效果如图3D所示。创作用户界面元件330的大小可以在图3C和图3D所示的示例中持续交替变化。在一些实施例中，创作用户界面元件330的动画效果可以持续一段时间，并且在一段时间后恢复到正常的视觉表示形式。在一些示例中，也可以一直将创作用户界面元件330用这样的动画效果进行展示。
在一些场景中，在应用120的启动过程中，可能会首先呈现启动页面，并且在启动页面呈现完成后再呈现首页或其他活动页面。启动页面可能会用于展示活动信息、推广信息等。启动页面的呈现可能会持续一定时间。在一些实施例中，可以在应用120的启动页面中呈现创作引导信息对应的视觉表示，例如呈现用于激活创作功能的创作用户界面元件。
图4示出了这样的示例启动页面400,其中呈现了创作用户界面元件410。如果在启动页面400中创作用户界面元件410被选择,应用120也会从启动页面400切换到创作页面。这样,用户可以在应用的启动过程中快速被引导使用创作功能。
在一些场景中，要呈现创作引导信息的活动页面可能是用于播放多媒体内容的播放页面。例如，某些应用被设计为在一般情况下，在从非活动状态切换到活动状态后直接向用户呈现用于播放多媒体内容的播放页面。在这种情况下，在确定要呈现创作引导信息时，可以在播放页面的播放区域中呈现用于激活创作功能的创作用户界面元件和用于进入浏览功能的浏览用户界面元件，以供用户选择。创作用户界面元件的呈现在此处即被视为创作引导信息对应的视觉表示。
图5示出了这样的创作引导信息的呈现示例，其中应用120在冷启动或热启动后（即从非活动状态切换到活动状态后）呈现播放页面500，该播放页面用于播放多媒体内容502。播放页面500中还呈现了创作用户界面元件510和浏览用户界面元件520。创作用户界面元件510和浏览用户界面元件520例如被呈现在播放页面500中用于播放多媒体内容502的播放区域中。如果用户140选择创作用户界面元件510，应用120被切换到创作页面，从而用户140可以使用应用120的创作功能。注意，创作页面将在下文统一讨论。
在一些实施例中，如果用户140在播放页面500中例如通过手指点击等操作选择浏览用户界面元件520，如图6A所示，这意味着用户140期望进入浏览模式。应用120的播放页面500中将停止呈现创作用户界面元件510和浏览用户界面元件520，如图6B所示。用户在图6B的播放页面500中可以开始正常的多媒体浏览。在一些实施例中，除选择浏览用户界面元件520之外，还可以允许用户140通过其他方式停止创作用户界面元件510和浏览用户界面元件520的呈现。例如，用户140可以通过点击多媒体内容502的播放区域中除创作用户界面元件510和浏览用户界面元件520之外的其他位置，来停止创作用户界面元件510和浏览用户界面元件520的呈现。
在一些实施例中，如果播放页面500具有多个页面标签，并且其中一些或全部页面标签均能够用于播放多媒体内容，那么用户140在多个页面标签之间的切换可以不导致创作用户界面元件510和浏览用户界面元件520的呈现被停止。如图7A所示，如果接收到用户140在播放页面500中的切换指令，例如通过滑动输入710触发的切换指令，应用120从当前的页面标签701（有时称为"第一页面标签"）切换到另一页面标签702（有时称为"第二页面标签"）。应用120将在图7B的页面700中播放与页面标签702对应的多媒体内容720，同时保持创作用户界面元件510和浏览用户界面元件520的呈现。也就是说，在用户140明确选择进入浏览模式或创作模式之前，创作用户界面元件510和浏览用户界面元件520可以被持续呈现。
在一些实施例中,如果确定用户140选择播放页面500中其他页面标签,并且所选择的页面标签不是播放页面,而是例如消息列表页面、个人主页等,那么创作用户界面元件510和浏览用户界面元件520可以被停止呈现。
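上文描述的创作用户界面元件与浏览用户界面元件的呈现/停止呈现规则，可以概括为如下示意性状态草图（类名与方法名均为本文假设，并非限定实现）：

```python
class OverlayButtons:
    """播放页面中创作/浏览用户界面元件可见性的状态草图。"""

    def __init__(self):
        self.visible = True  # 切换到活动状态后默认呈现两个元件

    def on_browse_selected(self):
        self.visible = False  # 选择浏览用户界面元件：停止呈现

    def on_blank_area_tapped(self):
        self.visible = False  # 点击播放区域中的其他位置：同样停止呈现

    def on_tab_switched(self, target_is_play_tab: bool):
        # 在可播放多媒体内容的页面标签之间切换时维持呈现；
        # 切换到消息列表、个人主页等非播放页面时停止呈现
        if not target_is_play_tab:
            self.visible = False
```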
在一些实施例中,除在应用120切换到活动状态时呈现的活动页面中显示创作引导信息之外,还可以在应用120的其他页面中显示创作引导信息。在一个实施例中,如果应用120基于用户的操作而显示用户的个人主页,例如图8所示的个人主页800,也可以在该页面中呈现创作引导信息对应的视觉表示,例如通过气泡提示框形式的叠加层810进行呈现。叠加层810可以与图3B中的叠加层320类似,通过箭头形式指向用于激活创作功能的创作用户界面元件812在个人主页800中的位置。备选地或附加地,与图3B中的叠加层320类似,叠加层810中还可以通过文字形式的描述性信息814和图形形式的图标816来展示更多的创作引导信息。
在一些实施例中,如果用户140在应用120中未发布多媒体内容,可以呈现创作引导信息对应的视觉表示来指示创作用户界面元件在当前活动页面中的位置,从而帮助用户140了解如何查找和启动创作功能。例如,在图8的示例中,用户140的个人主页800没有作品(即多媒体内容)的发布,因此可以通过叠加层810来呈现创作引导信息。当然,在用户140未发布多媒体内容的情况下,除在个人主页800中呈现这样的创作引导信息之外或者作为备选,还可以在其他活动页面,例如以上图3A、图3B、图4和图5中呈现创作引导信息。
在一些实施例中，创作引导信息可以不是笼统地指示应用120的创作功能，而是能够向用户提供更个性化或者更符合用户偏好的创作引导。例如，如以上提及的，如果创作功能包括拍摄功能，创作引导信息可以指示用户140对应用120的创作功能的创作模式的偏好。在一些实施例中，如果确定拍摄功能的特定模式是用户140感兴趣的拍摄模式，例如确定用户140持续喜欢使用这样的拍摄模式，或者新引入的拍摄模式是用户140可能愿意尝试的，这样的拍摄模式可以被确定为推荐拍摄模式。创作引导信息可以被确定为指示这样的推荐拍摄模式。
在呈现创作引导信息对应的视觉表示时,终端设备110可以将应用120的活动页面中的创作用户界面元件变形为与推荐拍摄模式对应的样式。这样,可以从视觉上更直观地吸引用户了解和使用创作功能。例如,在图3A至图3D、图5和图8的示例中,创作用户界面元件312、322、330、510和812具有与推荐拍摄模式(例如,“快拍模式”)对应的样式。与之相比,图4的创作用户界面元件410可以认为是一般样式。
在一些实施例中，如果应用120还包括编辑功能，创作引导信息可以指示用户140对用于编辑多媒体内容的编辑功能的偏好。应用120的编辑功能可以是多种多样的。作为示例，应用120可以具有特效编辑功能，用于向所编辑的多媒体内容（例如，视频、图像等）增加特殊效果。应用120还可以具有滤镜功能，用于向所编辑的多媒体内容施加特定滤镜。应用120的其他编辑功能例如还可以包括音乐添加功能、变声功能、添加贴纸功能，等等。在一些实施例中，某些编辑功能还可以具有多种编辑模式，不同编辑模式对应于对多媒体内容的不同编辑方式。例如，特效编辑功能中的不同特效编辑模式对应于不同的特殊效果，滤镜功能中的不同滤镜模式对应不同的滤镜，音乐添加功能的不同编辑模式对应于不同音乐类型或各个不同音乐的添加。其他编辑功能的编辑模式也可以类似地划分。
在一些实施例中，如果确定特定编辑模式是用户140感兴趣的编辑模式，例如确定用户140喜欢由该编辑模式编辑后的多媒体内容、喜欢使用该编辑模式来编辑多媒体内容，或者该编辑模式是新引入的且用户140可能愿意尝试，这样的编辑模式可以被确定为推荐编辑模式。创作引导信息可以被确定为指示这样的推荐编辑模式。
在呈现创作引导信息对应的视觉表示时，终端设备110可以将应用120的活动页面中的创作用户界面元件变形为与利用推荐编辑模式编辑后的样本多媒体内容对应的样式。这样，可以从视觉上更直观地吸引用户了解和使用创作功能。例如，图9的活动页面900中的创作用户界面元件920具有与推荐编辑模式（例如，"油画滤镜"）对应的样式。
在一些实施例中,除通过变形创作用户界面元件来指示推荐编辑模式之外,还可以通过其他形式的创作引导信息进行指示。例如,在图10的示例中,与图3A的活动页面300A相比,活动页面1000上添加的叠加层1010中呈现的文字形式的描述性信息1014以及图形形式的图标1016均能够指示特定的推荐特效编辑模式。
在一些实施例中，指示推荐编辑模式的创作引导信息还可以在用户140浏览由推荐编辑模式编辑后的多媒体内容时在相应的活动页面中被呈现。例如，终端设备110确定应用120的活动页面是否呈现由推荐编辑模式编辑后的多媒体内容。例如，在图9的示例中，假设用户正在浏览多媒体内容910，其是由"油画滤镜"的编辑模式编辑后的。
如果应用120的活动页面正呈现编辑后的多媒体内容,终端设备110可以在该活动页面中呈现创作引导信息对应的视觉表示,例如变形后的创作用户界面元件920,以指示推荐编辑模式。在一些实施例中,在呈现由推荐编辑模式编辑后的多媒体内容时,还可以通过其他形式的创作引导信息,例如图9的图形和文本形式的视觉表示930来指示推荐编辑模式“油画滤镜”。在用户浏览类似内容的同时推荐生成这样的内容的编辑模式,可以更容易激励用户使用创作功能来生成类似的内容。
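判断是否在当前活动页面中呈现指示推荐编辑模式的创作引导信息，可以示意性地表示为如下草图（其中的 'edit_mode' 字段为本文假设的内容元数据，并非限定实现）：

```python
def should_show_edit_mode_guide(content: dict, recommended_mode: str) -> bool:
    """当正在浏览的多媒体内容由推荐编辑模式编辑时，返回 True。"""
    # 假设 content 携带 'edit_mode' 字段，标识其使用的编辑模式
    return content.get("edit_mode") == recommended_mode
```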
在一些实施例中，如果要在用户140浏览应用120的过程中呈现创作引导信息，可以在确定用户140具有发布或编辑多媒体内容的创作意图时提供这样的创作引导信息的呈现，以避免对用户的过多干扰。例如，在图9中，可以在确定用户可能期望拍摄多媒体内容或者发布多媒体内容时，向用户呈现指示推荐编辑模式"油画滤镜"的创作引导信息。
以上描述了创作引导信息的呈现的一些示例实施例。应当理解,以上示出的创作引导信息的呈现仅是一些示例,并且不同实施例中的创作引导信息及其呈现方式可以合并在一个实施例中,或者同一实施例中的创作引导信息及其呈现方式可以被实现在不同实施例中。例如,在图8的活动页面800中的叠加层810可以呈现与图10的叠加层1010类似的创作引导信息。本公开的实施例在此方面不受限制。
如以上提及的,在呈现创作引导信息的页面中,如果用户选择作为创作引导信息的创作用户界面元件或者根据所呈现的创作引导信息的引导而激活创作功能,应用120将从当前的页面切换到创作页面。图11A至图11C示出了应用120的创作页面的一些示例。创作页面与应用120的创作功能相关联,例如用于提供多媒体内容的拍摄、编辑等功能。
首先参考图11A,其中示出了创作页面1100A。创作页面1100A具有拍摄用户界面元件1110,通过选择该拍摄用户界面元件1110可以激活多媒体内容的拍摄,例如拍摄视频、图像等。创作页面1100A还具有相册用户界面元件1120,通过选择该用户界面元件可以允许用户上传本地或其他数据源中的多媒体内容进行后续创作。
创作页面1100A还具有用于激活编辑功能的用户界面元件，用于编辑拍摄或上传的多媒体内容。作为示例，用于激活编辑功能的用户界面元件包括但不限于：特效用户界面元件1130，用于激活特效编辑功能；翻转用户界面元件1140，用于激活翻转功能，以翻转创作的多媒体内容在页面中的呈现方向；滤镜用户界面元件1150，用于激活滤镜功能；闪光灯用户界面元件1160，用于打开或关闭终端设备110的闪光灯；音乐添加用户界面元件1170，用于激活音乐添加功能，以向创作的多媒体内容添加音乐。应当理解，图11A中示出的仅是示例编辑功能，根据需要，创作页面中可以存在更多、更少或不同的编辑功能。
在一些实施例中，如果在活动页面中呈现的创作引导信息指示应用120的拍摄功能的推荐拍摄模式，在切换到创作页面后，创作页面将被呈现为与以该推荐拍摄模式激活拍摄功能相关联的创作页面。例如，如果在图3A至图3D、图5和图8的示例中的创作用户界面元件312、322、330、510和812被选择，从图3A至图3D、图5和图8的页面切换到创作页面后，如图11A所示，拍摄功能被定位到推荐拍摄模式1180（例如，"快拍模式"）。这样，用户可以快速地以期望的拍摄模式进行拍摄。在一些实施例中，如果创作引导信息不指示特定拍摄模式，在切换到创作页面后，拍摄功能可以被定位到默认拍摄模式。
在一些实施例中,如果在活动页面中呈现的创作引导信息指示推荐编辑模式,在切换到创作页面后,在创作页面中可以自动将推荐编辑模式应用到创作的多媒体内容,以呈现由推荐编辑模式自动编辑后的多媒体内容。
在图11B的示例中,假设推荐编辑模式是推荐特效编辑模式,例如图10的活动页面1000中的创作用户界面元件1012所指示的特效编辑模式。在活动页面1000中的创作用户界面元件1012被选择后,应用120切换到创作页面1100B。在创作页面1100B中,特效用户界面元件1130被变形为指示推荐特效编辑模式,并且当前创作页面1100B中正在创作的多媒体内容1102被自动编辑,以向多媒体内容1102中的对象添加特效1104。
在图11C的示例中,假设推荐编辑模式是推荐滤镜模式,例如图9的活动页面900中的创作用户界面元件920所指示的滤镜模式,例如“油画滤镜”。在活动页面900中的创作用户界面元件920被选择后,应用120切换到创作页面1100C。在创作页面1100C中,正在创作的多媒体内容1106被相应的推荐滤镜模式自动编辑,例如呈现“油画滤镜”的效果。创作页面1100C中还可以呈现关于所使用的该推荐滤镜模式的指示1108。
在图11A至图11C的示例中，在创作页面中选择拍摄用户界面元件后再进行多媒体内容的拍摄。在其他实施例中，在切换到创作页面后，创作页面的拍摄功能可以被直接激活以进行多媒体内容的拍摄。
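切换到创作页面时根据创作引导信息定位推荐拍摄模式、或自动应用推荐编辑模式的逻辑，可以草绘如下（字段名均为本文假设，并非限定实现）：

```python
def open_creation_page(guide_info: dict) -> dict:
    """根据创作引导信息初始化创作页面状态的草图。"""
    page = {
        # 引导信息指示推荐拍摄模式时定位到该模式，否则使用默认模式
        "capture_mode": guide_info.get("recommended_capture_mode", "default"),
    }
    edit_mode = guide_info.get("recommended_edit_mode")
    if edit_mode:
        # 自动将推荐编辑模式应用到正在创作的多媒体内容
        page["applied_edit_mode"] = edit_mode
    return page
```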
在应用120的启动过程中，由于应用120可能会呈现具有声音的多媒体内容，例如带有声音的视频、音频等，在一些实施例中，如果应用120被切换到活动状态并且当前的活动页面是用于呈现具有目标音量的多媒体内容的播放页面，终端设备110还可以基于预定音量而不是当前要呈现的多媒体内容自身的目标音量来进行内容呈现。终端设备110所选择的预定音量小于多媒体内容自身的目标音量。在一些示例中，预定音量可以是零。
在进行内容呈现时,终端设备110初始时可以将多媒体内容的播放音量设置为预定音量(例如,默认为静音,或者较小的音量)。在一些实施例中,如果多媒体内容是视频并且播放音量被设置为静音,多媒体内容的整体播放可以被暂停,或者仅播放画面而没有声音。
在一些实施例中,终端设备110可以以预定音量将多媒体内容呈现一个预定时间段(例如,2秒、3秒、5秒等)。如果在预定时间段内没有接收到对多媒体内容的操作指令,例如用户点击播放或者提高音量,那么在预定时间段到期后,终端设备110可以以目标音量来呈现多媒体内容。
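该"以预定音量呈现预定时间段、若期间未接收到操作指令则在到期后切换为目标音量"的逻辑可以示意为如下草图（假设的实现，并非限定）：

```python
def playback_volume(elapsed: float, grace_period: float,
                    reduced_volume: float, target_volume: float,
                    user_interacted: bool) -> float:
    """返回播放开始后 elapsed 秒时应使用的播放音量。"""
    if user_interacted:
        # 用户已发出操作指令（例如点击播放或调高音量）
        return target_volume
    # 预定时间段内使用预定音量，到期后切换为目标音量
    return reduced_volume if elapsed < grace_period else target_volume
```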
在一些实施例中,在初始将多媒体内容的播放音量设置为预定音量后,终端设备110可以在多媒体内容被呈现的同时,从预定音量大小起逐渐增加多媒体内容的播放音量,直到目标音量。也就是说,多媒体内容的播放音量被逐渐增大。这个逐渐增大的过程可以给用户一定的缓冲时间。
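播放音量从预定音量逐渐增加到目标音量的过程，可以用一个简单的线性渐增函数来示意（渐增曲线的具体形式为本文假设，说明书并未限定）：

```python
def volume_at(t: float, ramp_duration: float,
              start_volume: float, target_volume: float) -> float:
    """返回播放开始后 t 秒时的播放音量（线性渐增草图）。"""
    if t <= 0:
        return start_volume
    if t >= ramp_duration:
        return target_volume
    # 在渐增时间段内按比例从预定音量过渡到目标音量
    return start_volume + (target_volume - start_volume) * (t / ramp_duration)
```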
图12示出了根据本公开的某些实施例的用于用户引导的装置1200的示意性结构框图。装置1200可以被实现为或者被包括在终端设备110中。装置1200中的各个模块/组件可以由硬件、软件、固件或者它们的任意组合来实现。
如图所示，装置1200包括信息获取模块1210，被配置为获取针对用户的创作引导信息，创作引导信息用于引导用户对应用中用于创作多媒体内容的创作功能的使用。装置1200还包括信息呈现模块1220，被配置为如果应用处于活动状态，在应用的活动页面中呈现创作引导信息对应的视觉表示。装置1200还包括页面切换模块1230，被配置为基于与视觉表示相关联的交互，从活动页面切换到与创作功能相关联的创作页面。
在一些实施例中,创作功能包括以下至少一项:用于拍摄多媒体内容的拍摄功能和用于上传本地多媒体内容的上传功能。
在一些实施例中,活动页面包括在应用从非活动状态切换到活动状态后要呈现的页面。
在一些实施例中，信息呈现模块1220被配置为：如果活动页面是应用的启动页面，在启动页面中呈现用于激活创作功能的创作用户界面元件。在一些实施例中，页面切换模块1230被配置为：如果在启动页面中创作用户界面元件被选择，从启动页面切换到创作页面。
在一些实施例中,信息呈现模块1220被配置为:如果活动页面是用于播放多媒体内容的播放页面,在多媒体内容的播放区域中呈现用于激活创作功能的创作用户界面元件和用于进入浏览功能的浏览用户界面元件。在一些实施例中,页面切换模块1230被配置为:如果在播放页面中创作用户界面元件被选择,从播放页面切换到创作页面。
在一些实施例中,装置1200还包括以下至少一项:呈现停止模块,被配置为如果在播放页面中浏览用户界面元件被选择,在播放页面中停止呈现创作用户界面元件和浏览用户界面元件;以及标签切换呈现模块,被配置为在创作用户界面元件和浏览用户界面元件被呈现的同时,如果播放页面从第一页面标签被切换到第二页面标签,在播放与第二页面标签对应的多媒体内容的同时,维持创作用户界面元件和浏览用户界面元件的呈现。
在一些实施例中,信息呈现模块1220被配置为:在活动页面中添加叠加层,叠加层被放置在活动页面的顶层;以及在叠加层中呈现创作引导信息对应的视觉表示。
在一些实施例中,创作功能包括拍摄功能并且创作引导信息指示拍摄功能的推荐拍摄模式。在一些实施例中,信息呈现模块1220被配置为:将活动页面中的创作用户界面元件变形为与推荐拍摄模式对应的样式。在一些实施例中,页面切换模块1230被配置为:响应于创作用户界面元件被选择,从活动页面切换到与以推荐拍摄模式激活拍摄功能相关联的创作页面。
在一些实施例中,创作引导信息指示用于编辑多媒体内容的推荐编辑模式。在一些实施例中,信息呈现模块1220被配置为:确定活动页面是否呈现由推荐编辑模式编辑后的多媒体内容;以及如果确定活动页面呈现编辑后的多媒体内容,在活动页面中呈现创作引导信息对应的视觉表示以指示推荐编辑模式。
在一些实施例中，创作引导信息指示用于编辑多媒体内容的推荐编辑模式。在一些实施例中，信息呈现模块1220被配置为：将活动页面中的创作用户界面元件变形为与利用推荐编辑模式编辑后的样本多媒体内容对应的样式。在一些实施例中，页面切换模块1230被配置为：响应于创作用户界面元件被选择，从活动页面切换到用于激活创作功能的创作页面；以及在创作页面中呈现由推荐编辑模式自动编辑后的多媒体内容。
在一些实施例中,应用还包括用于发布多媒体内容的发布功能。在一些实施例中,信息呈现模块1220被配置为:如果用户在应用中未发布多媒体内容,呈现创作引导信息对应的视觉表示以指示用于激活创作功能的创作用户界面元件在活动页面中的位置。
在一些实施例中,创作引导信息与以下中的至少一项相关:用户对创作功能的创作模式的偏好,用户对用于编辑多媒体内容的编辑功能的偏好,以及用户在应用中发布多媒体内容的意图。
在一些实施例中,装置1200还包括:音量调节呈现模块,被配置为如果应用被切换到活动状态并且活动页面是用于呈现具有目标音量的多媒体内容的播放页面,基于预定音量来呈现多媒体内容,预定音量小于目标音量。
在一些实施例中,音量调节呈现模块被配置为:以预定音量将多媒体内容呈现一个预定时间段;以及如果预定时间段到期,以目标音量来呈现多媒体内容。
在一些实施例中,音量调节呈现模块被配置为:在多媒体内容被呈现的同时,从预定音量大小起逐渐增加多媒体内容的播放音量,直到目标音量。
图13示出了其中可以实施本公开的一个或多个实施例的计算设备1300的框图。应当理解，图13所示出的计算设备1300仅仅是示例性的，而不应当构成对本文所描述的实施例的功能和范围的任何限制。图13所示出的计算设备1300可以用于实现图1的终端设备110。
如图13所示，计算设备1300是通用计算设备的形式。计算设备1300的组件可以包括但不限于一个或多个处理器或处理单元1310、存储器1320、存储设备1330、一个或多个通信单元1340、一个或多个输入设备1350以及一个或多个输出设备1360。处理单元1310可以是实际或虚拟处理器并且能够根据存储器1320中存储的程序来执行各种处理。在多处理器系统中，多个处理单元并行执行计算机可执行指令，以提高计算设备1300的并行处理能力。
计算设备1300通常包括多个计算机存储介质。这样的介质可以是计算设备1300可访问的任何可以获得的介质，包括但不限于易失性和非易失性介质、可拆卸和不可拆卸介质。存储器1320可以是易失性存储器（例如寄存器、高速缓存、随机访问存储器（RAM））、非易失性存储器（例如，只读存储器（ROM）、电可擦除可编程只读存储器（EEPROM）、闪存）或它们的某种组合。存储设备1330可以是可拆卸或不可拆卸的介质，并且可以包括机器可读介质，诸如闪存驱动、磁盘或者任何其他介质，其能够用于存储信息和/或数据（例如用于训练的训练数据）并且可以在计算设备1300内被访问。
计算设备1300可以进一步包括另外的可拆卸/不可拆卸、易失性/非易失性存储介质。尽管未在图13中示出，可以提供用于从可拆卸、非易失性磁盘（例如"软盘"）进行读取或写入的磁盘驱动和用于从可拆卸、非易失性光盘进行读取或写入的光盘驱动。在这些情况中，每个驱动可以经由一个或多个数据介质接口连接至总线（未示出）。存储器1320可以包括计算机程序产品1325，其具有一个或多个程序模块，这些程序模块被配置为执行本公开的各种实施例的各种方法或动作。
通信单元1340实现通过通信介质与其他计算设备进行通信。附加地,计算设备1300的组件的功能可以以单个计算集群或多个计算机器来实现,这些计算机器能够通过通信连接进行通信。因此,计算设备1300可以使用与一个或多个其他服务器、网络个人计算机(PC)或者另一个网络节点的逻辑连接来在联网环境中进行操作。
输入设备1350可以是一个或多个输入设备,例如鼠标、键盘、追踪球等。输出设备1360可以是一个或多个输出设备,例如显示器、扬声器、打印机等。计算设备1300还可以根据需要通过通信单元1340与一个或多个外部设备(未示出)进行通信,外部设备诸如存储设备、显示设备等,与一个或多个使得用户与计算设备1300交互的设备进行通信,或者与使得计算设备1300与一个或多个其他计算设备通信的任何设备(例如,网卡、调制解调器等)进行通信。这样的通信可以经由输入/输出(I/O)接口(未示出)来执行。
根据本公开的示例性实现方式,提供了一种计算机可读存储介质,其上存储有计算机可执行指令,其中计算机可执行指令被处理器执行以实现上文描述的方法。根据本公开的示例性实现方式,还提供了一种计算机程序产品,计算机程序产品被有形地存储在非瞬态计算机可读介质上并且包括计算机可执行指令,而计算机可执行指令被处理器执行以实现上文描述的方法。
这里参照根据本公开实现的方法、装置、设备和计算机程序产品的流程图和/或框图描述了本公开的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其他可编程数据处理装置的处理单元,从而生产出一种机器,使得这些指令在通过计算机或其他可编程数据处理装置的处理单元执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
可以把计算机可读程序指令加载到计算机、其他可编程数据处理装置、或其他设备上，使得在计算机、其他可编程数据处理装置或其他设备上执行一系列操作步骤，以产生计算机实现的过程，从而使得在计算机、其他可编程数据处理装置、或其他设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本公开的多个实现的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或动作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
以上已经描述了本公开的各实现,上述说明是示例性的,并非穷尽性的,并且也不限于所公开的各实现。在不偏离所说明的各实现的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实现的原理、实际应用或对市场中的技术的改进,或者使本技术领域的其他普通技术人员能理解本文公开的各个实现方式。

Claims (17)

  1. 一种用于用户引导的方法，包括：
    获取针对用户的创作引导信息，所述创作引导信息用于引导所述用户对应用中用于创作多媒体内容的创作功能的使用；
    如果所述应用处于活动状态,在所述应用的活动页面中呈现所述创作引导信息对应的视觉表示;以及
    基于与所述视觉表示相关联的交互,从所述活动页面切换到与所述创作功能相关联的创作页面。
  2. 根据权利要求1所述的方法,其中所述创作功能包括以下至少一项:用于拍摄多媒体内容的拍摄功能和用于上传已拍摄的多媒体内容的上传功能。
  3. 根据权利要求1所述的方法,其中呈现所述创作引导信息对应的所述视觉表示包括:
    如果所述活动页面是所述应用的启动页面,在所述启动页面中呈现用于激活所述创作功能的创作用户界面元件,并且
    其中从所述活动页面切换到所述创作页面包括:
    如果在所述启动页面中所述创作用户界面元件被选择,从所述启动页面切换到所述创作页面。
  4. 根据权利要求1所述的方法,其中呈现所述创作引导信息对应的所述视觉表示包括:
    如果所述活动页面是用于播放多媒体内容的播放页面,在所述多媒体内容的播放区域中呈现用于激活所述创作功能的创作用户界面元件和用于进入浏览功能的浏览用户界面元件,并且
    其中从所述活动页面切换到所述创作页面包括:
    如果在所述播放页面中所述创作用户界面元件被选择,从所述播放页面切换到所述创作页面。
  5. 根据权利要求4所述的方法,还包括执行以下至少一项:
    如果在所述播放页面中所述浏览用户界面元件被选择，在所述播放页面中停止呈现所述创作用户界面元件和所述浏览用户界面元件；以及
    在所述创作用户界面元件和所述浏览用户界面元件被呈现的同时,如果所述播放页面从第一页面标签被切换到第二页面标签,在播放与所述第二页面标签对应的多媒体内容的同时,维持所述创作用户界面元件和所述浏览用户界面元件的呈现。
  6. 根据权利要求1所述的方法,其中呈现所述创作引导信息对应的视觉表示包括:
    在所述活动页面中添加叠加层,所述叠加层被放置在所述活动页面的顶层;以及
    在所述叠加层中呈现所述创作引导信息对应的所述视觉表示。
  7. 根据权利要求1所述的方法,其中所述创作功能包括拍摄功能并且所述创作引导信息指示所述拍摄功能的推荐拍摄模式,并且呈现所述创作引导信息对应的所述视觉表示包括:
    将所述活动页面中的创作用户界面元件变形为与所述推荐拍摄模式对应的样式,并且
    其中从所述活动页面切换到所述创作页面包括:
    响应于所述创作用户界面元件被选择,从所述活动页面切换到与以所述推荐拍摄模式激活所述拍摄功能相关联的创作页面。
  8. 根据权利要求1所述的方法,其中所述创作引导信息指示用于编辑多媒体内容的推荐编辑模式,并且呈现所述创作引导信息对应的所述视觉表示包括:
    确定所述活动页面是否呈现由所述推荐编辑模式编辑后的多媒体内容;以及
    如果确定所述活动页面呈现所述编辑后的多媒体内容,在所述活动页面中呈现所述创作引导信息对应的所述视觉表示以指示所述推荐编辑模式。
  9. 根据权利要求1所述的方法，其中所述创作引导信息指示用于编辑多媒体内容的推荐编辑模式，并且呈现所述创作引导信息对应的所述视觉表示包括：
    将所述活动页面中的创作用户界面元件变形为与利用所述推荐编辑模式编辑后的样本多媒体内容对应的样式,并且
    其中从所述活动页面切换到所述创作页面包括:
    响应于所述创作用户界面元件被选择,从所述活动页面切换到用于激活所述创作功能的创作页面;以及
    在所述创作页面中呈现由所述推荐编辑模式自动编辑后的多媒体内容。
  10. 根据权利要求1所述的方法,其中所述应用还包括用于发布多媒体内容的发布功能,并且呈现所述创作引导信息对应的所述视觉表示包括:
    如果所述用户在所述应用中未发布多媒体内容,呈现所述创作引导信息对应的所述视觉表示以指示用于激活所述创作功能的创作用户界面元件在所述活动页面中的位置。
  11. 根据权利要求1所述的方法,其中所述创作引导信息与以下中的至少一项相关:
    所述用户对所述创作功能的创作模式的偏好,
    所述用户对用于编辑多媒体内容的编辑功能的偏好,以及
    所述用户在所述应用中发布多媒体内容的意图。
  12. 根据权利要求1至11中任一项所述的方法,还包括:
    如果所述应用被切换到活动状态并且所述活动页面是用于呈现具有目标音量的多媒体内容的播放页面,基于预定音量来呈现所述多媒体内容,所述预定音量小于所述目标音量。
  13. 根据权利要求12所述的方法,其中呈现所述多媒体内容包括:
    以所述预定音量将所述多媒体内容呈现一个预定时间段;以及
    如果所述预定时间段到期,以所述目标音量来呈现所述多媒体内容。
  14. 根据权利要求12所述的方法,其中呈现所述多媒体内容包括:
    在所述多媒体内容被呈现的同时,从所述预定音量大小起逐渐增加所述多媒体内容的播放音量,直到所述目标音量。
  15. 一种用于用户引导的装置，包括：
    信息获取模块，被配置为获取针对用户的创作引导信息，所述创作引导信息用于引导所述用户对应用中用于创作多媒体内容的创作功能的使用；
    信息呈现模块,被配置为如果所述应用处于活动状态,在所述应用的活动页面中呈现所述创作引导信息对应的视觉表示;以及
    页面切换模块,被配置为基于与所述视觉表示相关联的交互,从所述活动页面切换到与所述创作功能相关联的创作页面。
  16. 一种电子设备,包括:
    至少一个处理单元;以及
    至少一个存储器,所述至少一个存储器被耦合到所述至少一个处理单元并且存储用于由所述至少一个处理单元执行的指令,所述指令在由所述至少一个处理单元执行时使所述设备执行根据权利要求1至14中任一项所述的方法。
  17. 一种计算机可读存储介质,其上存储有计算机程序,所述程序被处理器执行时实现根据权利要求1至14中任一项所述的方法。
PCT/CN2021/119400 2020-09-25 2021-09-18 用于用户引导的方法、装置、设备和存储介质 WO2022063090A1 (zh)
