WO2021031909A1 - Method and apparatus for outputting data content, electronic device and computer-readable medium - Google Patents


Info

Publication number
WO2021031909A1
WO2021031909A1 (PCT/CN2020/108256, CN2020108256W)
Authority
WO
WIPO (PCT)
Prior art keywords
information
environment
target data
output
terminal
Prior art date
Application number
PCT/CN2020/108256
Other languages
English (en)
Chinese (zh)
Inventor
王俊豪
Original Assignee
北京字节跳动网络技术有限公司 (Beijing ByteDance Network Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京字节跳动网络技术有限公司 (Beijing ByteDance Network Technology Co., Ltd.)
Publication of WO2021031909A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/957 Browsing optimisation, e.g. caching or content distillation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/44 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/953 Querying, e.g. by the use of web search engines
    • G06F 16/9536 Search customisation based on social or collaborative filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output

Definitions

  • The present disclosure relates to the field of computer technology, and specifically to a method, apparatus, electronic device, and computer-readable medium for outputting data content.
  • A feed is a content aggregator that combines several news sources a user actively subscribes to, helping the user continuously obtain the latest feed content.
  • In a typical feed scenario, friends or followed objects serve as the information sources, and the content consists of their dynamics (possibly simplified content) and other social behaviors.
  • As long as friends or followed objects post frequently enough, the user receives content continuously.
  • In the related art, the content displayed by the terminal to the user is generally published by the user's friends, followed objects, or other matching users; that is, whatever content those users publish is what the terminal displays to the user. The display form of the content is therefore relatively uniform, which cannot meet users' diverse needs.
  • A first aspect of the present disclosure provides a method for outputting data content, the method including: when a playback trigger operation performed on a first preview area of the current application display page is detected, obtaining target data corresponding to the first preview area; determining, according to the content information of the target data and the environment information of the environment where the terminal is located, an output effect corresponding to the content information and the environment information; and outputting the target data based on the output effect.
  • A second aspect of the present disclosure provides a data content output device, which includes:
  • a transceiver module, configured to obtain target data corresponding to the first preview area when a playback trigger operation performed on the first preview area of the current application display page is detected;
  • an output effect determination module, configured to determine the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located; and
  • a target data output module, configured to output the target data based on the output effect.
  • A third aspect of the present disclosure provides an electronic device, which includes:
  • a memory in which a computer program is stored; and
  • a processor configured to execute any one of the methods of the first aspect when running the computer program.
  • A fourth aspect of the present disclosure provides a computer-readable medium on which a computer program is stored.
  • When the program is executed by a processor, the method of any one of the first aspect is implemented.
  • When the present disclosure outputs target data, considering the content information of the target data itself can enhance the presentation of that content information, and considering the environment information of the environment where the terminal is located allows the presentation of the content information to match the actual environment. The output effect corresponding to the content information and the environment information is therefore determined from both, and the target data is output based on that effect, so that the output effect matches the content information and the environment information. This gives the user an immersive experience, ultimately allowing the user to experience the content information of the target data more immersively.
  • FIG. 1 is a schematic flowchart of a method for outputting data content provided by an embodiment of the disclosure
  • FIG. 2 is a schematic diagram of a scene of a method for outputting data content provided by an embodiment of the disclosure
  • FIG. 3 is a schematic structural diagram of a data content output device provided by an embodiment of the disclosure.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
  • The terms "first" and "second" mentioned in the present disclosure are only used to distinguish devices, modules, or units, and are not used to limit these devices, modules, or units to being different devices, modules, or units, nor to limit the order or interdependence of the functions they perform.
  • In a feed scenario, a friend or an object of interest is used as the information source.
  • One implementation requires the user to follow an object unidirectionally in order to obtain the dynamics (or content information) published by that object; another requires the user and the object to follow each other bidirectionally; a third does not require the user to actively follow the object: the platform corresponding to the application treats the user as already following the object and can directly feed back to the user the dynamics (or content information) the object publishes on the platform.
  • For example, some applications can automatically determine user preferences based on the user's browsing history and push relevant content information to the user.
  • the present disclosure provides a method for outputting data content.
  • the method can be executed by an electronic device.
  • the electronic device may be a terminal device, and the terminal device may be a desktop device or a mobile terminal.
  • The method provided by the present disclosure includes: S101: When a playback trigger operation performed on the first preview area of the current application display page is detected, obtain the target data corresponding to the first preview area.
  • The current display page of the terminal device is divided into at least one preview area, and the at least one preview area includes the first preview area. It is understandable that if the number of preview areas is one, the current display page is the first preview area. The size of each preview area may be the same or different: for example, the first preview area may have a first size while the preview areas corresponding to target data uploaded by ordinary users of the application (including the user in the present disclosure) have a second size, where the first size and the second size are different and the first size may be larger than the second size.
  • A preview area is the display area of the data, or of a cover formed from the content information of the target data; its purpose is to give the user a preliminary understanding of the content information of the target data.
  • For example, if the target data is a video, the preview area can be the display area of the video's promotional poster.
  • Each preview area corresponds to one piece of target data.
  • When the terminal device detects a playback trigger operation performed on the first preview area by the user of the terminal device, it obtains the target data corresponding to the first preview area.
  • The target data can be any of video data, image data, text data, and animation data.
  • Each preview area of the terminal device can display a cover for the content information of the corresponding target data; the cover can be a few frames of that content information.
  • In some scenarios, the target data corresponding to the first preview area is an advertisement uploaded by an advertiser.
  • When the terminal device detects a play trigger operation performed on the first preview area of the current application display page, it can acquire the target data corresponding to the first preview area.
  • The play trigger operation may be the user's click operation on the first preview area.
  • In one possible situation, the user clicks the first preview area according to preference, and the terminal device receives the user's play trigger operation.
  • Alternatively, the play trigger operation can be a sliding operation by the user whose end position falls within the first preview area: for example, the user's finger slides on the application display interface to the current page, and the sliding operation ends within the first preview area of the current application display interface, whereupon the terminal device receives the user's play trigger operation.
  • The play trigger operation can also be the operation of the user sliding to the current application display interface: for example, the user swipes on the terminal interface according to preset rules, and the application display page is switched each time a sliding operation is performed.
  • When the user stops, the terminal device receives the user's play trigger operation for the first preview area of the current application display page.
  • S102 Determine an output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment in which the terminal device is located.
  • the content information of the target data refers to the specific content contained in the target data, such as image content, text content, and sound content in the video data.
  • the environmental information of the environment in which the terminal device is located refers to the environmental information of the physical environment in which the terminal device is currently located.
  • When the terminal device determines the output effect corresponding to the content information and the environment information: if a first output effect determined from the content information alone and a second output effect determined from the environment information alone do not conflict with or influence each other,
  • then the output effect corresponding to the content information and the environment information can include at least one of the first output effect determined from the content information alone and the second output effect determined from the environment information alone;
  • otherwise, the output effect corresponding to the content information and the environment information may include a third output effect determined jointly from the content information and the environment information,
  • and may additionally include at least one of a fourth output effect determined solely from the content information and a fifth output effect determined solely from the environment information.
  • The output effect can also be called an output special effect.
  • The terminal device presents the output special effect when outputting the target data, so that the special effect appears in the application display page (for target data, "presenting" means displaying or playing its content information).
  • The terminal device can present the content information of the target data on the application display interface automatically, or after a manual operation by the user.
  • The content information of the target data can be presented in full screen.
  • Outputting the target data based on the output effect can enhance the presentation of its content information: for example, determining the playback effect of a video according to at least one of the image content, text content, or sound content in the video data can enhance the playback of the video.
  • Because the environment information of the terminal device is taken into account when determining the output effect in S102, the content information of the target data is matched with the actual environment, so the user can have an immersive feeling when experiencing the content information.
  • Considering content information and environment information together can thus enhance the user's immersive experience.
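The overall flow above (obtain the target data on a play trigger, determine an output effect from content information and environment information in S102, then output the data fused with that effect) can be sketched as follows. This is an illustrative sketch only: the data shapes, effect names, and condition checks are assumptions, not the disclosure's implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TargetData:
    content_info: dict  # e.g. {"image": "person drinking coffee"}


@dataclass
class Environment:
    weather: Optional[str] = None      # e.g. "rain"
    brightness: Optional[float] = None  # ambient brightness in [0.0, 1.0]


def determine_output_effects(data: TargetData, env: Environment) -> list:
    """S102: determine output effects from content info and environment info."""
    effects = []
    if env.weather == "rain":
        effects.append("rain_effect")           # effect from environment info
    if env.brightness is not None:
        effects.append("display_brightness")    # effect from environment info
    if "image" in data.content_info:
        effects.append("matched_sound_effect")  # effect from content info
    return effects


def output_target_data(data: TargetData, effects: list) -> dict:
    """Output the target data fused with the determined effects."""
    return {"content": data.content_info, "effects": effects}
```

A caller would first obtain the `TargetData` for the triggered preview area, then pass it with the sensed `Environment` through `determine_output_effects` and `output_target_data`.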
  • In some embodiments, before the playback trigger operation performed on the first preview area of the current application display page is detected, the method further includes:
  • displaying each preview area in the current application display page when the user's operation of stopping sliding on the application display page is detected, where the preview areas include the first preview area.
  • Obtaining the target data corresponding to the first preview area then includes any of the following:
  • The application display interface is divided into at least one preview area, and each preview area has corresponding target data.
  • The user performs a sliding operation on the application display interface of the terminal device to change the preview areas displayed on the interface.
  • The operation can change all or part of the displayed preview areas, thereby changing the corresponding target data.
  • For example, the application display interface may display three preview areas corresponding to three short videos A, B, and C; after sliding on the interface, it may display four preview areas corresponding to the four short videos C, D, E, and F.
  • When the terminal device detects that the user stops sliding on the application display page, it displays the corresponding preview areas in the current application display page. If the user wants to view the content information of the target data corresponding to the first preview area, the user executes the play trigger operation on the first preview area; as described above, this may be a click operation. The terminal device then obtains the target data corresponding to the first preview area based on the user operation, in either of the following ways:
  • First, the terminal device can send a data acquisition request to the server.
  • The data acquisition request carries the identification information of the first preview area.
  • The server determines the target data corresponding to that identification information and feeds the target data back to the terminal device.
  • Second, if the terminal device has already received the video stream information sent by the server (that is, the terminal device itself stores the video stream information), it can directly extract the target data corresponding to the identification information of the first preview area according to the user operation.
  • The video stream may be a video stream published by the user or other users in a feed scenario.
  • The identification information of a preview area may be an identification (ID) previously set by the terminal device or the server for the preview area, from which the terminal device or the server can determine the corresponding target data.
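The two acquisition paths above (requesting from the server with the preview area's identification information, or extracting directly from locally stored stream data) can be sketched as below; the function names, ID format, and returned payloads are hypothetical stand-ins.

```python
# Locally stored video stream information, keyed by preview-area ID.
local_store = {}


def request_from_server(preview_area_id: str) -> bytes:
    # Stand-in for a data acquisition request carrying the preview area's
    # identification information; a real server would resolve the ID to data.
    return ("target-data-for-" + preview_area_id).encode()


def get_target_data(preview_area_id: str) -> bytes:
    # Path 2: the terminal already stores the stream, so extract directly.
    if preview_area_id in local_store:
        return local_store[preview_area_id]
    # Path 1: send a data acquisition request to the server.
    data = request_from_server(preview_area_id)
    local_store[preview_area_id] = data  # keep for later direct extraction
    return data
```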
  • In some embodiments, the environment information of the environment where the terminal is located includes at least one of the geographic location of the terminal, the weather conditions of the environment where the terminal is located, and the brightness information of the environment where the terminal is located.
  • Determining the output effect corresponding to the content information and the environment information then includes: determining, according to the weather conditions, an output special effect matching those weather conditions,
  • where the output special effects include at least one of a rain effect, a snow effect, a lightning effect, a hail effect, a wind speed effect, a rainbow effect, and a thunder sound effect.
  • The geographic location of the environment where the terminal is located can be determined by the Global Positioning System (GPS).
  • The brightness information of the environment where the terminal is located can be determined by the terminal device itself.
  • The weather conditions of the environment where the terminal is located can be determined by the terminal device itself or sent by the server to the terminal device.
  • The terminal device can determine the output special effect matching the weather conditions of the environment in which it is located, so that the special effect is fused into the output when the target data is output.
  • The output special effects may include at least one of rain, snow, lightning, hail, wind speed, rainbow, and thunder sound effects.
  • Taking the rain effect as an example: after the terminal device outputs the target data, the rain effect is presented together with the content information on the application display interface. For example, a video of someone drinking coffee can be played combined with a raining scene. In the same way, videos can be played with different special effects or combinations of special effects, matched with at least one of snow, lightning, hail, wind, rainbow, and thunder.
  • The wind speed effect matching the weather conditions is a wind speed effect at a specific speed determined from the wind speed of the actual environment.
  • By presenting the content information of the target data according to the weather information of the environment where the terminal is actually located, the present disclosure gives the user an immersive feeling when experiencing the content information, which enhances the user's immersive experience.
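A minimal sketch of matching weather conditions to the special effects listed above. The mapping itself (which condition strings trigger which effects) is an assumption for illustration; only the effect names come from the disclosure.

```python
# Assumed mapping from weather condition strings to output special effects.
WEATHER_EFFECTS = {
    "rain": ["rain_effect"],
    "snow": ["snow_effect"],
    "thunderstorm": ["rain_effect", "lightning_effect", "thunder_sound_effect"],
    "hail": ["hail_effect"],
}


def effects_for_weather(condition: str, wind_speed: float = 0.0) -> list:
    effects = list(WEATHER_EFFECTS.get(condition, []))
    if wind_speed > 0.0:
        # The wind speed effect is parameterized by the actual wind speed
        # of the environment, as described above.
        effects.append(("wind_speed_effect", wind_speed))
    return effects
```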
  • In some embodiments, when the environment information of the environment where the terminal is located includes the brightness information of that environment, determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information includes:
  • determining the display brightness corresponding to the brightness information.
  • The terminal device can determine the brightness information of its environment by itself and determine the corresponding display brightness, so as to fuse that display brightness into the output of the target data. After the target data is output, its content information is presented at that display brightness: if the external environment is brighter, the display brightness can be increased; otherwise, it can be reduced.
  • Presenting the content information of the target data in this way changes its display brightness according to the brightness of the external environment and displays the content information at a more suitable brightness, which enhances the user experience.
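The brightness adaptation can be sketched as a monotone mapping from ambient brightness to display brightness (brighter surroundings raise the display brightness, darker surroundings lower it). The value ranges and the linear form are assumptions.

```python
def display_brightness(ambient: float, lo: float = 0.3, hi: float = 1.0) -> float:
    """Map ambient brightness in [0.0, 1.0] to a display brightness in [lo, hi]."""
    ambient = min(max(ambient, 0.0), 1.0)  # clamp the sensor reading
    return lo + (hi - lo) * ambient
```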
  • In some embodiments, determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located includes:
  • determining, according to target content information in the content information of the target data, an output effect matching the target content information; for example, a sound special effect matching the image content in video data.
  • The terminal device can determine the output special effect matching the target content information, where the target content information can be all or part of the content information of the target data.
  • The target data may be video data,
  • and the video data may include at least one of image content, text content, and sound content.
  • If the target content information is the image content in the video data,
  • the terminal device can determine sound special effects matching the image content to enhance the display effect of the video.
  • For example, in a promotional advertisement for a car brand whose content is the car driving through a tunnel,
  • the terminal device configures the sound effect of a car driving through a tunnel according to the images of the car and the tunnel.
  • If the target content information is the sound content in the video data,
  • the terminal device can recognize the sound while playing the video and configure text corresponding to the content expressed by the sound.
  • The terminal device may also determine, according to the content information of the target data, whether to output the target data with a 2D effect or a 3D effect.
  • Outputting the target data in combination with its content information can enhance the display effect of the target data and the user's immersive experience.
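The image-to-sound matching in the car/tunnel example can be sketched as a lookup from recognized image labels to a sound special effect. The label sets and sound names are hypothetical; a real system would obtain the labels from an image recognition model.

```python
# Hypothetical library mapping required image labels to a sound special effect.
SOUND_LIBRARY = [
    ({"car", "tunnel"}, "car_in_tunnel_sound"),
    ({"coffee", "cup"}, "pouring_coffee_sound"),
]


def match_sound_effect(image_labels: set) -> str:
    for required, sound in SOUND_LIBRARY:
        if required <= image_labels:  # all required labels were recognized
            return sound
    return ""  # no matching sound special effect
```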
  • When determining the output effect, the terminal device may also take into account the geographic location of the environment where the terminal is located.
  • For example, if the user corresponding to the current terminal is at a certain scenic spot (such as a cherry blossom viewing spot or a waterfall), the terminal device may combine the output effect with the location of that scenic spot.
  • In some embodiments, before determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located, the method further includes: obtaining the user portrait of the user.
  • Determining the output effect corresponding to the content information and the environment information then includes: jointly determining the corresponding output effect according to the three factors of content information, environment information, and user portrait,
  • where the user portrait includes at least one of user attributes, user preferences, and user historical operation information.
  • The target data is then output based on the output effect, that is, the target data is output with the output effect determined from these three factors.
  • The jointly determined output effect may separately include an output effect corresponding to the user portrait, determined from the user portrait alone.
  • The user portrait can be obtained from the user's browsing records on the terminal device, for example the types of videos browsed and search keywords. If the user portrait includes the user's historical operation information, and the historical operation information relates to the user's use of music applications on the terminal device, then the output effect includes a music special effect indicating that, when the content information of the target data is output, music corresponding to the relevant records is fused into the output so that the music plays while the content information is presented. If the user portrait includes user attributes, music matching those attributes can be determined based on them; for example, if the user's gender is female, soft music can be played for the user, and the output effect includes the music special effect corresponding to that soft music. If the user portrait includes user preferences, these can be the user's favorite or downloaded music, and the output effect includes the corresponding music special effect, so that music the user likes is played when the target data is output.
  • Determining the special effects for outputting the target data in combination with the user portrait can make the presented content information of the target data better match user preferences, enhancing the user's immersive experience.
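The way the user portrait contributes a music special effect can be sketched as below. The portrait keys (`preferences`, `history`, `attributes`) and the priority order among them are illustrative assumptions, not the disclosure's design.

```python
def music_effect(portrait: dict) -> str:
    # User preferences (favorited or downloaded music) are used first...
    if portrait.get("preferences"):
        return portrait["preferences"][0]
    # ...then historical operation information from music applications...
    if portrait.get("history"):
        return portrait["history"][-1]
    # ...then a genre matched to user attributes.
    if portrait.get("attributes", {}).get("prefers_soft_music"):
        return "soft_music"
    return ""  # no music special effect from the portrait
```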
  • In some embodiments, outputting the target data based on the output effect includes:
  • displaying at least one special effect option corresponding to the output effect, receiving the user's selection of a target special effect option, and fusing the target special effect corresponding to the selected option into the output of the target data.
  • The output effect corresponding to the content information (determined from the content information of the target data), the output effect corresponding to the environment information (determined from the environment information of the terminal's environment), and the output effect corresponding to the user portrait (determined from the user portrait) do not affect one another; that is, the output effects determined from these three factors are different types of output effects, and one or more output effects (or output special effects) can be determined from each factor.
  • It is also possible that the terminal device obtains no matching output effect from certain factors. For example, after analyzing the video content, it may find that it can directly use the video obtained from the server or from its own storage space, in which case the number of subsequent special effect options may be one.
  • The output effect determined from the three factors, or from the content information and the environment information, corresponds to at least one special effect option.
  • At least one special effect option is displayed on the terminal's application display interface; the user clicks a target special effect option, and the terminal device receives the selection.
  • The terminal device can then fuse the target special effect corresponding to the selected option when outputting the target data, so as to present the content information of the target data in the form of the target special effect.
  • The number of target special effects selected by the user is not limited.
  • The user can select all of the special effect options or only some of them.
  • For example, if the output special effects include a rain effect, a music effect, and a sound effect corresponding to the image, the user can select only the rain effect.
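Fusing only the user-selected special effect options can be sketched as a filter over the displayed options; the option names reuse the rain/music/sound example above and are otherwise assumptions.

```python
def fuse_selected(options: list, selected: list) -> list:
    # The user may pick all options, some of them, or none; selections that
    # are not among the displayed options are ignored.
    return [opt for opt in options if opt in selected]


displayed = ["rain_effect", "music_effect", "image_sound_effect"]
```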
  • The played video can be stopped after the terminal device receives the user's play-stop operation; the play-stop operation can be the user swiping away from or jumping out of the page, any non-immersive operation such as clicking the video being played, or the user shaking the terminal device.
  • Video playback can also stop when a specified playback duration elapses or when the video ends.
  • The present disclosure can be used in a feed scenario,
  • where the target data is the data in the feed data stream.
  • Feed-stream video preview areas are displayed on the application display page.
  • The user selects one of the feed-stream video preview areas (that is, the above first preview area) according to preference, and the terminal device obtains the video data corresponding to the selected preview area.
  • The terminal device performs background processing on the video data, specifically determining one or more special effects based on the content information of the video data itself and the environment information of the environment where the terminal is located. For example, if the video content is someone drinking coffee and the environment information indicates that the weather is light rain, rain effects can be configured accordingly.
  • The rain effects include the sound of rain, the visual effect of rain, and so on; combined with them, the video content of someone drinking coffee is presented to the user in the rain. The terminal device also combines effects with the video content itself, for example configuring the sound of pouring coffee when the user in the video pours coffee, to enhance the presentation. The terminal device can also obtain the brightness information of the current environment.
  • If the environment is bright, the display brightness of the video can be correspondingly increased; otherwise, it can be reduced. At the same time, the presentation of the video content can be combined with the user's music records, playing background music suited to drinking coffee on a rainy day.
  • the terminal device can display rain special effects, sound special effects, display brightness, and special effect options of the music special effects corresponding to the above background music to the user.
  • the user can select special effects.
  • the terminal device can play the video according to the user's selection.
  • video playback can be started manually by the user, or it can start automatically after the user selects a special effect, or, in scenarios that do not require the user to select a special effect, automatically after the terminal device determines the special effect.
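The effect-selection step in the walkthrough above can be sketched in code. The sketch below is purely illustrative: the weather-to-effect mapping, the field names (`actions`, `ambient_brightness`), and the brightness thresholds are assumptions for demonstration, not part of the disclosure.

```python
# Illustrative sketch: the terminal combines the video's content information
# with the environment information (weather, ambient brightness) to pick
# output effects. All names and mappings here are hypothetical.

WEATHER_EFFECTS = {
    "light rain": ["rain sound", "rain visual effect"],
    "snow": ["snow visual effect"],
    "thunderstorm": ["thunder sound", "lightning visual effect"],
}

def determine_output_effects(content_info, environment_info):
    effects = []
    # Environment-driven effects, e.g. rain sound + rain visuals on a rainy day.
    effects.extend(WEATHER_EFFECTS.get(environment_info.get("weather", ""), []))
    # Content-driven effects, e.g. a pouring sound when coffee is poured.
    if "pouring coffee" in content_info.get("actions", []):
        effects.append("coffee pouring sound")
    # Brightness: raise display brightness in bright surroundings, lower it in dim ones.
    ambient = environment_info.get("ambient_brightness", 0.5)  # 0.0 dark .. 1.0 bright
    display_brightness = (
        "enhanced" if ambient > 0.7 else "reduced" if ambient < 0.3 else "unchanged"
    )
    return {"special_effects": effects, "display_brightness": display_brightness}

result = determine_output_effects(
    {"actions": ["drinking coffee", "pouring coffee"]},
    {"weather": "light rain", "ambient_brightness": 0.2},
)
print(result)
```

In this hypothetical scenario the function yields the rain sound, the rain visual effect, and the coffee-pouring sound, and reduces the display brightness because the surroundings are dim.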
  • Fig. 3 is a schematic structural diagram of a data content output device provided by another embodiment of the present disclosure. As shown in Fig. 3, the device of the embodiment of the present disclosure may include:
  • the transceiver module 301 is configured to obtain target data corresponding to the first preview area when a playback trigger operation performed on the first preview area of the current application display page is detected;
  • the output effect determination module 302 is configured to determine the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located;
  • the target data output module 303 is configured to output target data based on the output effect.
  • before a play trigger operation performed on the first preview area of the current application display page is detected, the device further includes:
  • the display module is configured to display each preview area in the current application display page when the user's stop operation for sliding on the application display page is detected, and each preview area includes the first preview area;
  • when the transceiver module obtains the target data corresponding to the first preview area, it is specifically configured to perform at least one of the following operations:
  • the environmental information of the environment where the terminal is located includes at least one of the geographic location of the terminal, weather conditions of the environment where the terminal is located, and brightness information of the environment where the terminal is located;
  • when the output effect determination module 302 determines the output effect corresponding to the content information and the environment information based on the content information of the target data and the environment information of the environment where the terminal is located, it is specifically configured to:
  • the output special effects include at least one of rain special effects, snow special effects, lightning special effects, hail special effects, wind speed special effects, rainbow special effects, and thunder sound special effects.
  • when the output effect determination module 302 determines the output effect corresponding to the content information and the environment information based on the content information of the target data and the environment information of the environment where the terminal is located, it is specifically configured to:
  • the display brightness corresponding to the brightness information is determined.
  • when determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located, the output effect determination module 302 is specifically configured to:
  • the output effect matching the target content information includes a sound special effect matching the image content in the video data.
  • the transceiver module 301 is further configured to:
  • when determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located, the output effect determination module 302 is specifically configured to:
  • the user portrait includes at least one of user attributes, user preferences, and user historical operation information.
  • when the target data output module 303 outputs target data based on the output effect, it is specifically configured to:
  • merge the target special effect corresponding to the selected target special effect option into the output.
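The three modules described above (transceiver module 301, output effect determination module 302, and target data output module 303) form a simple pipeline. The sketch below shows one plausible way they could cooperate; the class and field names, the feed structure, and the rendering string are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the device of Fig. 3: the transceiver module fetches
# the target data for the selected preview area, the output-effect module
# derives effects from content and environment information, and the output
# module renders the target data with those effects.

class TransceiverModule:
    def __init__(self, feed):
        self.feed = feed  # maps preview-area id -> target data

    def get_target_data(self, preview_area_id):
        return self.feed[preview_area_id]

class OutputEffectDeterminationModule:
    def determine(self, target_data, environment_info):
        effects = []
        # Example rule: rainy weather plus outdoor content yields a rain effect.
        if environment_info.get("weather") == "rain" and "outdoor" in target_data.get("tags", []):
            effects.append("rain special effect")
        return effects

class TargetDataOutputModule:
    def output(self, target_data, effects):
        return f"playing {target_data['title']} with {', '.join(effects) or 'no effects'}"

transceiver = TransceiverModule({"area-1": {"title": "coffee video", "tags": ["outdoor"]}})
data = transceiver.get_target_data("area-1")
effects = OutputEffectDeterminationModule().determine(data, {"weather": "rain"})
print(TargetDataOutputModule().output(data, effects))
```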
  • FIG. 4 shows a schematic structural diagram of an electronic device (such as the terminal device in FIG. 1) 600 suitable for implementing embodiments of the present disclosure.
  • the terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and vehicle-mounted terminals (e.g., car navigation terminals), as well as fixed terminals such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 4 is only an example, and should not bring any limitation to the function and use range of the embodiments of the present disclosure.
  • the electronic device includes a memory and a processor.
  • the processor here may be referred to as the processing device 601 below, and the memory may include a read-only memory (ROM) 602, a random access memory (RAM) 603, and a storage device 608 below.
  • the electronic device 600 may include a processing device (such as a central processing unit, a graphics processor, etc.) 601, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603.
  • the RAM 603 also stores various programs and data required for the operation of the electronic device 600.
  • the processing device 601, ROM 602, and RAM 603 are connected to each other through a bus 604.
  • An input/output (I/O) interface 605 is also connected to the bus 604.
  • the following devices can be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, and gyroscope; output devices 607 including, for example, a liquid crystal display (LCD), speakers, and vibrators; and storage devices 608 including, for example, a magnetic tape and a hard disk.
  • the communication device 609 may allow the electronic device 600 to perform wireless or wired communication with other devices to exchange data.
  • FIG. 4 shows an electronic device 600 having various devices, it should be understood that it is not required to implement or have all the illustrated devices. It may alternatively be implemented or provided with more or fewer devices.
  • an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a non-transitory computer readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from the network through the communication device 609, or installed from the storage device 608, or installed from the ROM 602.
  • when the computer program is executed by the processing device 601, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
  • the aforementioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable signal medium may send, propagate, or transmit the program for use by or in combination with the instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • the client and the server can communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (e.g., a communication network).
  • examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being assembled into the electronic device.
  • the above-mentioned computer-readable medium carries one or more programs.
  • when the above-mentioned one or more programs are executed by the electronic device, the electronic device is caused to: when a playback trigger operation performed on the first preview area of the current application display page is detected, obtain the target data corresponding to the first preview area; determine, according to the content information of the target data and the environment information of the environment where the terminal is located, the output effect corresponding to the content information and the environment information; and output the target data based on the output effect.
  • the computer program code used to perform the operations of the present disclosure can be written in one or more programming languages or a combination thereof.
  • the above-mentioned programming languages include, but are not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram can represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function.
  • the functions marked in the block may also occur in a different order from the order marked in the drawings. For example, two blocks shown in succession can actually be executed substantially in parallel, or they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of the blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or can be realized by a combination of dedicated hardware and computer instructions.
  • the modules or units involved in the described embodiments of the present disclosure can be implemented in software or hardware. Wherein, the name of the module or unit does not constitute a limitation on the unit itself under certain circumstances.
  • the transceiver module can also be described as "a module for obtaining target data corresponding to the first preview area".
  • exemplary types of hardware logic components include: Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), Application Specific Standard Product (ASSP), System on Chip (SOC), Complex Programmable Logical device (CPLD) and so on.
  • a machine-readable medium may be a tangible medium, which may contain or store a program for use by or in combination with the instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • the machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • machine-readable storage media would include electrical connections based on one or more wires, portable computer disks, hard drives, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • a method for outputting data content including:
  • determining, according to the content information of the target data and the environment information of the environment where the terminal is located, the output effect corresponding to the content information and the environment information;
  • before a playback trigger operation performed on the first preview area of the current application display page is detected, the method further includes:
  • displaying each preview area in the current application display page, where the preview areas include the first preview area;
  • Obtaining the target data corresponding to the first preview area includes any of the following:
  • the environmental information of the environment where the terminal is located includes at least one of the geographic location of the terminal, weather conditions of the environment where the terminal is located, and brightness information of the environment where the terminal is located;
  • the output effect corresponding to the content information and environmental information is determined, including:
  • the output special effects include at least one of rain special effects, snow special effects, lightning special effects, hail special effects, wind speed special effects, rainbow special effects, and thunder sound special effects.
  • determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located includes:
  • the display brightness corresponding to the brightness information is determined.
  • determining the output effect corresponding to the content information and the environment information includes:
  • the output effect matching the target content information includes a sound special effect matching the image content in the video data.
  • before the output effect corresponding to the content information and the environment information is determined according to the content information of the target data and the environment information of the environment where the terminal is located, the method further includes:
  • determining the output effect corresponding to the content information and the environment information includes:
  • the user portrait includes at least one of user attributes, user preferences, and user historical operation information.
  • outputting the target data based on the output effect includes:
  • merging the target special effect corresponding to the selected target special effect option into the output.
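The user-portrait filtering and special-effect-option steps summarized in the claims above can be sketched as follows. The field names (`disliked_effects`, `applied_effects`), the filtering rule, and the merge representation are illustrative assumptions for demonstration only.

```python
# Hypothetical sketch: candidate effects are filtered against the user portrait
# (attributes, preferences, historical operations), the surviving options are
# presented to the user, and the effect the user selects is merged into the
# target data output.

def filter_by_user_portrait(candidate_effects, user_portrait):
    # Example rule: drop any effect the portrait marks as disliked.
    disliked = set(user_portrait.get("disliked_effects", []))
    return [effect for effect in candidate_effects if effect not in disliked]

def merge_selected_effect(target_data, options, selected_index):
    # Merge the effect corresponding to the selected option into the output.
    effect = options[selected_index]
    applied = target_data.get("applied_effects", []) + [effect]
    return dict(target_data, applied_effects=applied)

portrait = {"preferences": ["ambient sound"], "disliked_effects": ["lightning special effect"]}
options = filter_by_user_portrait(
    ["rain special effect", "lightning special effect", "background music"], portrait
)
played = merge_selected_effect({"title": "coffee video"}, options, 0)
print(options, played["applied_effects"])
```

Here the lightning effect is filtered out by the portrait, the remaining two options are offered, and the user's first choice is merged into the played video.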
  • a data content output device which includes:
  • the transceiver module is configured to obtain target data corresponding to the first preview area when a playback trigger operation performed on the first preview area of the current application display page is detected;
  • the output effect determination module is used to determine the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located;
  • the target data output module is used to output target data based on the output effect.
  • before a play trigger operation performed on the first preview area of the current application display page is detected, the device further includes:
  • the display module is configured to display each preview area in the current application display page when the user's stop operation for sliding on the application display page is detected, and each preview area includes the first preview area;
  • when the transceiver module obtains the target data corresponding to the first preview area, it is specifically configured to perform at least one of the following operations:
  • the environmental information of the environment where the terminal is located includes at least one of the geographic location of the terminal, weather conditions of the environment where the terminal is located, and brightness information of the environment where the terminal is located;
  • when the output effect determination module determines the output effect corresponding to the content information and the environment information based on the content information of the target data and the environment information of the environment where the terminal is located, it is specifically configured to:
  • the output special effects include at least one of rain special effects, snow special effects, lightning special effects, hail special effects, wind speed special effects, rainbow special effects, and thunder sound special effects.
  • when the output effect determination module determines the output effect corresponding to the content information and the environment information based on the content information of the target data and the environment information of the environment where the terminal is located, it is specifically configured to:
  • the display brightness corresponding to the brightness information is determined.
  • when determining the output effect corresponding to the content information and the environment information according to the content information of the target data and the environment information of the environment where the terminal is located, the output effect determination module is specifically configured to:
  • the output effect matching the target content information includes a sound special effect matching the image content in the video data.
  • before determining the output effect corresponding to the content information and the environment information based on the content information of the target data and the environment information of the environment where the terminal is located, the transceiver module is further configured to:
  • the user portrait includes at least one of user attributes, user preferences, and user historical operation information.
  • when the target data output module outputs target data based on the output effect, it is specifically configured to:
  • merge the target special effect corresponding to the selected target special effect option into the output.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a data content output method and apparatus, an electronic device, and a computer-readable medium. The method includes the steps of: when a playback trigger operation performed on a first preview area of the current application display page is detected, acquiring target data corresponding to the first preview area (S101); determining, according to content information of the target data and environment information of the environment where a terminal is located, an output effect corresponding to the content information and the environment information (S102); and outputting the target data based on the output effect (S103). In the method, the environment information of the environment where the terminal is located is taken into account when the target data is output, and the presentation of the content information of the target data can match the real environment, so that the user can have a more immersive experience.
PCT/CN2020/108256 2019-08-19 2020-08-10 Procédé et appareil de production de contenu de données, dispositif électronique et support lisible par ordinateur WO2021031909A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910765709.3A CN112395530A (zh) 2019-08-19 2019-08-19 数据内容的输出方法、装置、电子设备及计算机可读介质
CN201910765709.3 2019-08-19

Publications (1)

Publication Number Publication Date
WO2021031909A1 true WO2021031909A1 (fr) 2021-02-25

Family

ID=74603559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/108256 WO2021031909A1 (fr) 2019-08-19 2020-08-10 Procédé et appareil de production de contenu de données, dispositif électronique et support lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN112395530A (fr)
WO (1) WO2021031909A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116257309A (zh) * 2021-12-10 2023-06-13 北京字跳网络技术有限公司 内容展示方法、装置、电子设备、存储介质和程序产品

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105915979A (zh) * 2016-04-19 2016-08-31 乐视控股(北京)有限公司 视频播放方法及装置
CN106201159A (zh) * 2015-05-08 2016-12-07 阿里巴巴集团控股有限公司 基于网页区块的信息预展示方法、装置及系统
CN107613351A (zh) * 2017-09-27 2018-01-19 惠州Tcl移动通信有限公司 一种视频区域亮度控制方法、移动终端及存储介质
WO2018057944A1 (fr) * 2016-09-23 2018-03-29 Apple Inc. Dispositifs, procédés et interfaces utilisateur permettant d'interagir avec des objets d'interface utilisateur par l'intermédiaire d'entrées basées sur la proximité et basées sur le contact
CN108259941A (zh) * 2018-03-01 2018-07-06 北京达佳互联信息技术有限公司 视频播放方法和装置
CN109688344A (zh) * 2018-12-18 2019-04-26 广州励丰文化科技股份有限公司 一种基于时间轴的演出实时预览控制方法及系统

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9473813B2 (en) * 2009-12-31 2016-10-18 Infosys Limited System and method for providing immersive surround environment for enhanced content experience
DE112013006049T5 (de) * 2012-12-18 2015-09-10 Samsung Electronics Co., Ltd. Anzeigevorrichtung und Bildbearbeitungsverfahren dafür
KR102358025B1 (ko) * 2015-10-07 2022-02-04 삼성전자주식회사 전자 장치 및 전자 장치의 음악 컨텐츠 시각화 방법
CN108073649B (zh) * 2016-11-15 2022-02-25 北京搜狗科技发展有限公司 一种信息处理方法和装置、一种用于信息处理的装置
CN108307127A (zh) * 2018-01-12 2018-07-20 广州市百果园信息技术有限公司 视频处理方法及计算机存储介质、终端
CN109213557A (zh) * 2018-08-24 2019-01-15 北京海泰方圆科技股份有限公司 浏览器换肤方法、装置、计算装置和存储介质
CN109976858A (zh) * 2019-03-28 2019-07-05 北京字节跳动网络技术有限公司 电子设备中应用程序界面的显示控制方法、装置及其设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106201159A (zh) * 2015-05-08 2016-12-07 阿里巴巴集团控股有限公司 基于网页区块的信息预展示方法、装置及系统
CN105915979A (zh) * 2016-04-19 2016-08-31 乐视控股(北京)有限公司 视频播放方法及装置
WO2018057944A1 (fr) * 2016-09-23 2018-03-29 Apple Inc. Dispositifs, procédés et interfaces utilisateur permettant d'interagir avec des objets d'interface utilisateur par l'intermédiaire d'entrées basées sur la proximité et basées sur le contact
CN107613351A (zh) * 2017-09-27 2018-01-19 惠州Tcl移动通信有限公司 一种视频区域亮度控制方法、移动终端及存储介质
CN108259941A (zh) * 2018-03-01 2018-07-06 北京达佳互联信息技术有限公司 视频播放方法和装置
CN109688344A (zh) * 2018-12-18 2019-04-26 广州励丰文化科技股份有限公司 一种基于时间轴的演出实时预览控制方法及系统

Also Published As

Publication number Publication date
CN112395530A (zh) 2021-02-23

Similar Documents

Publication Publication Date Title
WO2021052085A1 (fr) Procédé et appareil de recommandation de vidéo, dispositif électronique et support lisible par ordinateur
CN111510760B (zh) 视频信息展示方法和装置、存储介质和电子设备
WO2021196903A1 (fr) Procédé et dispositif de traitement vidéo, support lisible et dispositif électronique
US10397320B2 (en) Location based synchronized augmented reality streaming
WO2021008223A1 (fr) Procédé et appareil de détermination d'informations, et dispositif électronique associé
WO2020007012A1 (fr) Procédé et dispositif d'affichage de page de recherche, terminal et support d'informations
CN112380379B (zh) 歌词特效展示方法、装置、电子设备及计算机可读介质
US20140164921A1 (en) Methods and Systems of Augmented Reality on Mobile Devices
WO2023072296A1 (fr) Procédé et appareil de traitement d'informations multimédias, dispositif électronique et support de stockage
WO2023051297A1 (fr) Procédé et appareil d'affichage d'informations, dispositif électronique et support d'enregistrement
WO2023169340A1 (fr) Procédé et appareil d'affichage d'informations, dispositif électronique, support d'enregistrement, et produit-programme
WO2023088442A1 (fr) Procédé et appareil de prévisualisation en continu en direct, et dispositif, produit-programme et support
WO2022042389A1 (fr) Procédé et appareil d'affichage de résultat de recherche, support lisible et dispositif électronique
WO2023151589A1 (fr) Procédé et appareil d'affichage de vidéo, dispositif électronique et support de stockage
WO2023051294A9 (fr) Procédé et appareil de traitement de support, dispositif et support
CN110958481A (zh) 视频页面显示方法、装置、电子设备和计算机可读介质
WO2023116479A1 (fr) Procédé et appareil de publication de vidéo, dispositif électronique, support de stockage et produit-programme
CN111970571A (zh) 视频制作方法、装置、设备及存储介质
CN111246304A (zh) 视频处理方法、装置、电子设备及计算机可读存储介质
WO2023116480A1 (fr) Procédé et appareil de publication de contenu multimédia, et dispositif, support et produit de programme
US20230421857A1 (en) Video-based information displaying method and apparatus, device and medium
CN115190366B (zh) 一种信息展示方法、装置、电子设备、计算机可读介质
WO2020233144A1 (fr) Procédé et dispositif de fourniture d'un mode d'entrée de commentaire
CN109635131B (zh) 多媒体内容榜单显示方法、推送方法,装置及存储介质
WO2021031909A1 (fr) Procédé et appareil de production de contenu de données, dispositif électronique et support lisible par ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20853777

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20853777

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.08.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20853777

Country of ref document: EP

Kind code of ref document: A1