CN114780180A - Object data display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114780180A
Authority
CN
China
Prior art keywords: playing, data, analysis, target object, area
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202111574859.XA
Other languages
Chinese (zh)
Inventor
张夏楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202111574859.XA
Publication of CN114780180A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F 16/70: Information retrieval of video data
    • G06F 16/73: Querying
    • G06F 16/735: Filtering based on additional data, e.g. user or group profiles
    • G06F 16/738: Presentation of query results
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object or setting a parameter value
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q 30/00: Commerce
    • G06Q 30/02: Marketing; price estimation or determination; fundraising
    • G06Q 30/0282: Rating or review of business operators or products
    • G06Q 50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/01: Social networking

Abstract

The disclosure relates to an object data display method and apparatus, an electronic device, and a storage medium. The method includes: displaying an object analysis page that includes an object playing area and a data display area; in response to a first playing instruction triggered based on a playing control corresponding to a target object, playing the target object in the object playing area; and displaying first analysis data and second analysis data corresponding to the target object in the data display area, where the first analysis data includes analysis data of objects of the same type as the target object, and both sets of analysis data include data corresponding to preset time points in the target object. The per-time-point data helps the creator locate the parts of the target object that need improvement, and the analysis data of other objects of the same type lets the creator gauge the target object's quality against comparable works, so that the creator can improve the target object in a targeted manner and raise the quality of the work.

Description

Object data display method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a method and an apparatus for displaying object data, an electronic device, and a storage medium.
Background
With the continuous development of mobile internet technology, mobile network speeds have improved greatly, and scenarios in which people consume objects on mobile devices are increasingly common. Mobile objects are also increasingly popular because of their convenience and immediacy. For example, people can publish or watch object or live-broadcast content in leisure time, or even in fragmented time while commuting, making full use of otherwise trivial time to create or watch programs of interest.
As more and more objects become available to viewers, viewers' expectations of object quality keep rising, which pushes creators to produce higher-quality works to meet viewers' needs. In the current environment, a creator can roughly sense the quality of a created object through a few simple metrics shown on the page (such as view count and like count), but it is difficult to learn anything more detailed from such data, so the creator cannot make targeted improvements to raise the quality of the work.
Disclosure of Invention
The present disclosure provides an object data display method and apparatus, an electronic device, and a storage medium. The technical scheme of the present disclosure is as follows:
according to a first aspect of an embodiment of the present disclosure, there is provided an object data display method including:
displaying an object analysis page; the object analysis page comprises an object playing area and a data display area;
responding to a first playing instruction triggered by a playing control corresponding to the target object, and playing the target object in an object playing area;
displaying the first analysis data and second analysis data corresponding to the target object in a data display area; the first analysis data includes analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the target object.
In some possible embodiments, the object analysis page further includes a time display area; the method further comprises the following steps:
displaying a time progress display control in a time display area;
the display progress of the time progress display control is consistent with the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a play guidance control, and the method further includes:
displaying a play guide control in a time display area and a data display area;
the positions of the playing guide control on the first analysis data, the second analysis data and the time progress display control are changed along with the change of the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a to-be-selected object display area; the method further comprises the following steps:
and displaying at least one object to be selected which is the same as the target object in the object to be selected display area.
In some possible embodiments, the method further comprises:
switching a target object in an object playing area into a selected object in response to a switching instruction triggered by a switching control corresponding to the selected object in at least one object to be selected;
responding to a second playing instruction triggered by the playing control based on the selected object, and playing the selected object in the object playing area;
and displaying the first analysis data and the second analysis data corresponding to the selected object in the data display area.
In some possible embodiments, the method further comprises:
responding to an adding instruction triggered by an adding control corresponding to a selected object in at least one object to be selected, and dividing an object playing area into a target object playing area and a selected object playing area;
responding to a third playing instruction triggered by a comprehensive playing control based on the target object and the selected object, playing the target object in a target object playing area, and playing the selected object in a selected object playing area;
and displaying the first analysis data and the second analysis data corresponding to the selected object in the data display area.
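The area-splitting step in this embodiment, dividing one object playing area into a target-object playing area and a selected-object playing area for side-by-side playback, can be sketched as below. The `Rect` type and the even left/right split are assumptions for illustration; the patent does not fix the geometry:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: float
    y: float
    w: float
    h: float

def split_play_area(area: Rect) -> tuple[Rect, Rect]:
    """Divide one object playing area into a target-object area (left)
    and a selected-object area (right) for comparative playback."""
    half = area.w / 2
    left = Rect(area.x, area.y, half, area.h)
    right = Rect(area.x + half, area.y, half, area.h)
    return left, right

left, right = split_play_area(Rect(0, 0, 400, 300))
```

On the third playing instruction, the client would render the target object into `left` and the selected object into `right`.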
In some possible embodiments, displaying the first analysis data and the second analysis data corresponding to the target object in the data display area includes:
displaying first analysis data and second analysis data corresponding to at least one object to be selected in a data display area;
or;
displaying first analysis data and second analysis data corresponding to the selectable object set in a data display area; the selectable objects in the selectable object set comprise objects which are the same as the target objects in type; the set of selectable objects includes at least one object to be selected.
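One plausible way to produce first analysis data for a selectable object set is to average the per-time-point series of its same-type members into a single benchmark curve. The patent does not specify how the set's data is combined, so the mean used here is an assumption:

```python
def aggregate_first_analysis(series_by_object: dict[str, list[float]]) -> list[float]:
    """Combine the per-time-point analysis values of the objects in a
    selectable object set into one benchmark series, shown alongside the
    target object's own (second) analysis data."""
    if not series_by_object:
        return []
    # Truncate to the shortest series so every time point has a value
    # from every object.
    n_points = min(len(s) for s in series_by_object.values())
    return [
        sum(s[i] for s in series_by_object.values()) / len(series_by_object)
        for i in range(n_points)
    ]

benchmark = aggregate_first_analysis({"obj_a": [1.0, 2.0], "obj_b": [3.0, 4.0]})
# benchmark == [2.0, 3.0]
```

A median or percentile could be substituted without changing the display logic.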
In some possible embodiments, the method further comprises:
and setting a display target object in the display area of the object to be selected.
In some possible embodiments, the to-be-selected object display area further includes a to-be-selected object addition control, and the method further includes:
in response to a to-be-selected object adding instruction triggered by the to-be-selected object adding control, adding a to-be-selected object in a to-be-selected object display area; the adding instruction of the object to be selected comprises the object identification of the added object to be selected.
In some possible embodiments, the method further comprises:
and when a viewing guide instruction based on the target time point is detected, displaying a viewing guide control at the corresponding positions of the first analysis subdata, the second analysis subdata and the time progress display control corresponding to the target time point.
In some possible embodiments, the method further comprises:
displaying a key frame display area on the upper layer of the object playing area;
and displaying the key frames in the target objects corresponding to the target time points in the key frame display area.
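Locating the key frame to show for a target time point can be sketched as a lookup over the object's key frame timestamps. The nearest-preceding-keyframe rule used here is an assumption; the patent only requires that the displayed key frame correspond to the target time point:

```python
import bisect

def keyframe_for(target_time: float, keyframe_times: list[float]) -> float:
    """Return the timestamp of the latest key frame at or before the
    target time point, for display in the key frame area shown on the
    upper layer of the object playing area."""
    i = bisect.bisect_right(keyframe_times, target_time)
    if i == 0:
        return keyframe_times[0]  # before the first key frame: show it
    return keyframe_times[i - 1]

print(keyframe_for(7.5, [0.0, 5.0, 10.0]))  # 5.0
```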
In some possible embodiments, the method further comprises:
displaying the numerical value of the target time point at the corresponding position of the time progress display control;
displaying the analysis numerical value of the first analysis subdata in a preset area corresponding to the first analysis subdata;
and displaying the analysis numerical value of the second analysis subdata in a preset area corresponding to the second analysis subdata.
In some possible embodiments, the first analysis data comprises first account retention data and the second analysis data comprises second account retention data;
and/or;
the first analysis data comprises first interaction data, and the second analysis data comprises second interaction data;
and/or;
the first analytical data includes first operational data and the second analytical data includes second operational data.
In some possible embodiments, the method further comprises:
hiding a display area of the object to be selected in response to a first playing instruction triggered based on a playing control corresponding to the target object;
expanding an object playing area;
and playing the target object in the enlarged object playing area.
In some possible embodiments, displaying at least one candidate object of the same type as the target object in the candidate object display area includes:
displaying at least one object to be selected with the same time length type as the target object in an object to be selected display area;
and/or;
displaying at least one object to be selected with the same time length label type as the target object in the object to be selected display area;
and/or;
displaying at least one object to be selected with the same operation amount type as the target object in the object to be selected display area;
and/or;
displaying at least one object to be selected with the same preference type as the target object in the object to be selected display area;
and/or;
and displaying at least one object to be selected with the same attribute type as the target object in the object to be selected display area.
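The type-matching criteria enumerated above (time length, time length label, operation amount, preference, attribute) may be combined via and/or. A hedged sketch of such filtering; the field names and bucket values are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class CandidateObject:
    object_id: str
    duration_bucket: str    # e.g. "short" / "medium" / "long"
    label: str
    operation_bucket: str
    preference_bucket: str
    attribute: str

def same_type_candidates(target, pool, criteria=("duration_bucket",)):
    """Keep candidates that match the target object on every chosen
    criterion; any combination of the dimensions may be requested."""
    return [
        obj for obj in pool
        if all(getattr(obj, c) == getattr(target, c) for c in criteria)
    ]

target = CandidateObject("t", "short", "music", "high", "dance", "a")
pool = [
    CandidateObject("a", "short", "news", "high", "dance", "a"),
    CandidateObject("b", "long", "music", "low", "food", "b"),
]
matches = same_type_candidates(target, pool)  # matches object "a" on duration
```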
According to a second aspect of the embodiments of the present disclosure, there is provided an object data display apparatus including:
a page display module configured to execute a display object analysis page; the object analysis page comprises an object playing area and a data display area;
the playing module is configured to execute a first playing instruction triggered based on a playing control corresponding to the target object, and play the target object in the object playing area;
the data display module is configured to display the first analysis data and second analysis data corresponding to the target object in the data display area; the first analysis data includes analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the target object.
In some possible embodiments, the object analysis page further comprises a time display area; the apparatus further includes:
a time control display module configured to execute displaying a time progress display control in a time display area;
the display progress of the time progress display control is consistent with the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a play guidance control, and the apparatus further includes:
a play guidance control display module configured to perform display of a play guidance control in the time display area and the data display area;
the position of the playing guide control on the first analysis data, the second analysis data and the time progress display control changes along with the change of the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a to-be-selected object display area; the apparatus further includes:
and the object display module is configured to display at least one object to be selected, which is of the same type as the target object, in the object display area to be selected.
In some possible embodiments, the apparatus further comprises:
the switching module is configured to execute a switching instruction triggered by a switching control piece corresponding to a selected object in at least one object to be selected, and switch a target object in the object playing area to the selected object;
the playing module is configured to execute a second playing instruction triggered by the playing control based on the selected object, and play the selected object in the object playing area;
and the data display module is configured to display the first analysis data and the second analysis data corresponding to the selected object in the data display area.
In some possible embodiments, the apparatus further comprises:
the region segmentation module is configured to execute an adding instruction triggered by an adding control corresponding to a selected object in at least one object to be selected, and divide the object playing region into a target object playing region and a selected object playing region;
the playing module is configured to execute a third playing instruction triggered by the comprehensive playing control based on the target object and the selected object, play the target object in the target object playing area and play the selected object in the selected object playing area;
and the data display module is configured to display the first analysis data and the second analysis data corresponding to the selected object in the data display area.
In some possible embodiments, the data display module is configured to display, in the data display area, first analysis data and second analysis data corresponding to at least one object to be selected;
or;
the data display module is configured to display first analysis data and second analysis data corresponding to the selectable object sets in the data display area; the selectable objects in the selectable object set comprise objects which are the same as the target objects in type; the set of selectable objects includes at least one object to be selected.
In some possible embodiments, the object presentation module is configured to display the target object at a set position (e.g., the top) of the to-be-selected object display area.
In some possible embodiments, the candidate object display area further includes a candidate object addition control, and the apparatus further includes:
the object adding module is configured to execute a to-be-selected object adding instruction triggered based on the to-be-selected object adding control and add the to-be-selected object in the to-be-selected object display area; the adding instruction of the objects to be selected comprises the object identification of the added objects to be selected.
In some possible embodiments, the apparatus further comprises:
and the viewing guide control display module is configured to execute that when a viewing guide instruction based on the target time point is detected, the viewing guide control is displayed at the corresponding position of the first analysis subdata, the second analysis subdata and the time progress display control corresponding to the target time point.
In some possible embodiments, the apparatus further comprises a key frame display module configured to perform:
displaying a key frame display area on the upper layer of the object playing area;
and displaying the key frames in the target objects corresponding to the target time points in the key frame display area.
In some possible embodiments, the apparatus further comprises a numerical display module configured to perform:
displaying the numerical value of the target time point at the corresponding position of the time progress display control;
displaying an analysis numerical value of the first analysis subdata in a preset area corresponding to the first analysis subdata;
and displaying the analysis numerical value of the second analysis subdata in a preset area corresponding to the second analysis subdata.
In some possible embodiments, the first analysis data comprises first account retention data and the second analysis data comprises second account retention data;
and/or;
the first analysis data comprises first interaction data, and the second analysis data comprises second interaction data;
and/or;
the first analysis data includes first operation data and the second analysis data includes second operation data.
In some possible embodiments, the apparatus further comprises:
the hiding module is configured to execute a first playing instruction triggered by a playing control corresponding to the target object and hide the display area of the object to be selected;
an area enlarging module configured to perform enlarging the object play area;
a playing module configured to execute playing the target object in the enlarged object playing region.
In some possible embodiments, the object representation module is configured to perform:
displaying at least one object to be selected which is of the same time length type as the target object in the object to be selected display area;
and/or;
displaying at least one object to be selected in the object to be selected display area, wherein the object to be selected and the target object are of the same time length label type;
and/or;
displaying at least one object to be selected which is in the same operation amount type with the target object in an object to be selected display area;
and/or;
displaying at least one object to be selected with the same preference type as the target object in the object to be selected display area;
and/or;
and displaying at least one object to be selected with the same attribute type as the target object in the object to be selected display area. According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to execute instructions to implement the method of any one of the first aspects as described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform the method of any one of the first aspects of embodiments of the present disclosure.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program, the computer program being stored in a readable storage medium, the computer program being read from the readable storage medium and executed by at least one processor of a computer device, such that the computer device performs the method of any one of the first aspects of embodiments of the present disclosure.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the method comprises the steps of displaying an object analysis page, wherein the object analysis page comprises an object playing area and a data display area, responding to a first playing instruction triggered by a playing control corresponding to a target object, playing the target object in the object playing area, and displaying first analysis data and second analysis data corresponding to the target object in the data display area, wherein the first analysis data comprise analysis data of objects of the same type as the target object, and the first analysis data and the second analysis data comprise data corresponding to preset time points in the target object. In the embodiment of the application, the analysis data of the corresponding preset time point of the target object can be displayed while the target object is played, in addition, the analysis data of the object of the same type can be displayed, namely the target object is helped to position the object part corresponding to the time point to be improved, and the quality of the target object can be better known by the creator through displaying the analysis data of other objects, so that the creator can improve the target object in a targeted manner, and the quality is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a schematic diagram illustrating an application environment in accordance with an illustrative embodiment;
FIG. 2 is a flow chart illustrating a method of displaying object data in accordance with an exemplary embodiment;
FIG. 3 is a diagram illustrating an object analysis page in accordance with an illustrative embodiment;
FIG. 4 is a schematic diagram illustrating an object analysis page in accordance with an exemplary embodiment;
FIG. 5 is a diagram illustrating an object analysis page in accordance with an illustrative embodiment;
FIG. 6 is a diagram illustrating an object analysis page in accordance with an illustrative embodiment;
FIG. 7 is a diagram illustrating an object analysis page in accordance with an illustrative embodiment;
FIG. 8 is a diagram illustrating an object analysis page in accordance with an illustrative embodiment;
FIG. 9 is a schematic diagram illustrating an object analysis page in accordance with an exemplary embodiment;
FIG. 10 is a schematic diagram illustrating an object analysis page in accordance with an exemplary embodiment;
FIG. 11 is a block diagram illustrating an object data display apparatus in accordance with an exemplary embodiment;
FIG. 12 is a block diagram illustrating an electronic device, according to an example embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
All data about a user in the present application are data authorized by the user.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an application environment of an object data display method according to an exemplary embodiment, and as shown in fig. 1, the application environment may include a client 01 and a server 02.
In an alternative embodiment, the client 01 may obtain the target object from the server, and the analysis data of the target object and the analysis data of the same type of object. Specifically, the client 01 may display an object analysis page, where the object analysis page includes an object playing area and a data display area, and in response to a first playing instruction triggered based on a playing control corresponding to a target object, play the target object in the object playing area, and display first analysis data and second analysis data corresponding to the target object in the data display area, where the first analysis data includes analysis data of an object of the same type as the target object, and the first analysis data and the second analysis data include data corresponding to a preset time point in the target object.
The client 01 may include, but is not limited to, a smartphone, a desktop computer, a tablet computer, a notebook computer, a smart speaker, a digital assistant, an Augmented Reality (AR)/Virtual Reality (VR) device, a smart wearable device, and other types of electronic devices. The software running on the electronic device may be an application program, an applet, or the like. Optionally, the operating system running on the electronic device may include, but is not limited to, Android, iOS, Linux, Windows, Unix, and the like.
In an optional embodiment, the server 02 may include an independent physical server, or a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a CDN (Content Delivery Network), a big data and artificial intelligence platform, and the like.
In addition, it should be noted that fig. 1 shows only one application environment of the object data display method provided by the present disclosure, and in practical applications, other application environments may also be included.
Fig. 2 is a flowchart illustrating an object data display method according to an exemplary embodiment, where as shown in fig. 2, the object data display method may be applied to a client, and includes the following steps:
in step S201, an object analysis page is displayed; the object analysis page includes an object play area and a data display area.
In some possible embodiments, after the client detects that the page jump control corresponding to the object analysis page is touched, the client may jump to the object analysis page. As such, the client may display an object analysis page on the interface, which may include an object play area and a data display area.
Fig. 3 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment, and as shown in fig. 3, includes an object analysis page 300, and an object playing area 301 and a data display area 302 located on the object analysis page 300.
In step S203, in response to a first play instruction triggered based on the play control corresponding to the target object, the target object is played in the object play area.
In the embodiment of the application, the client can respond to a first playing instruction triggered based on a playing control corresponding to the target object, and play the target object in the object playing area.
In an optional embodiment, the page jump control may be the target object itself; optionally, after detecting that a preset operation is performed on the target object, the client may jump to the object analysis page. Optionally, the preset operation may be a long press on the target object (whether playing or not), a double click on it, or another action performed in a preset area on it.
If the target object has not been played, the client may not yet have acquired its resource. Therefore, when jumping to the object analysis page, the client may acquire the resource of the target object. When the client detects a first play instruction triggered by the play control corresponding to the target object, the target object can be played in the object play area.
Optionally, the playing control corresponding to the target object may be a triangular control in the object playing area 301 as shown in fig. 3, or may be other forms of controls disposed at other positions on the object analysis page.
In step S205, displaying the first analysis data and the second analysis data corresponding to the target object in the data display area; the first analysis data includes analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the duration of the target object.
In this embodiment, the client may display the first analysis data and the second analysis data corresponding to the target object in the data display area while playing the target object in the object playing area. The first analysis data may be analysis data of an object of the same type as the target object, and the second analysis data may be analysis data of the target object.
As can be seen from fig. 3, the data display area 302 may exemplarily include first analysis data 3021 and second analysis data 3022. Assume that the target object has a duration of 12 seconds and that the horizontal axis at the bottom of the data display area corresponds to that 12-second duration. The client may then display the first analysis data and the second analysis data corresponding to each preset time point in the target object.
Alternatively, the preset time points may fall on every whole second, every half second, or at smaller or larger intervals.
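The sampling of analysis data at preset time points can be sketched as follows (a minimal illustration; the function name and interval parameter are hypothetical, not part of the disclosure):

```python
def preset_time_points(duration_s: float, interval_s: float = 1.0) -> list:
    """Return the time points at which per-second analysis data is displayed."""
    points = []
    t = 0.0
    while t <= duration_s:
        points.append(round(t, 3))
        t += interval_s
    return points

# Whole-second sampling of a 12-second object yields 13 points (0..12).
print(len(preset_time_points(12)))       # prints 13
# Half-second sampling doubles the resolution.
print(len(preset_time_points(12, 0.5)))  # prints 25
```

Smaller intervals simply produce more sample points along the same bottom axis of the data display area.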
In some possible embodiments, the first analysis data may be the analysis data of a single object of the same type as the target object, an average of the analysis data of a plurality of objects of the same type, or even multiple sets of analysis data from a plurality of objects of the same type.
In the embodiment of the present application, the above objects may be various kinds of multimedia content, such as video, audio, 3D works (for example, AR and VR works), immersive 4D audiovisual content, and the like.
Mapping the bottom axis of the data display area to a 12-second target object, as described above, is only one alternative embodiment. In a particular application, the target object may be longer than 12 seconds, such as 24 seconds. Assuming the duration of the target object is 24 seconds, if the client displays only the content shown in fig. 3, then during playback the user cannot tell which second of content is currently playing, nor which second a given piece of analysis data corresponds to.
Based on this, fig. 4 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment, and as shown in fig. 4, in addition to the contents shown in fig. 3, a time display area 303, and a time progress display control 3031 and a slider 3032 located in the time display area 303 may be further included.
Optionally, the client may display the first analysis data and the second analysis data corresponding to the target object in the data display area while playing the target object in the object playing area, and display the time progress display control in the time display area.
The display progress of the time progress display control is consistent with the playing progress of the target object. That is, the slider 3032, shown at the starting point of the time progress display control 3031 in fig. 4, moves along the control as the target object plays. For example, when the target duration is 24 seconds, the full extent of the control corresponds to 24 seconds; when the target object has played for 10 seconds, the slider 3032 is located 10/24 of the way along the control (front to back), and when the target object has played for 24 seconds, the slider 3032 is at the end point.
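The mapping from playing progress to slider position is a simple proportion; a minimal sketch (function name hypothetical):

```python
def slider_fraction(played_s: float, duration_s: float) -> float:
    """Fraction of the time progress display control, front to back,
    at which the slider sits for the current playing progress."""
    if duration_s <= 0:
        return 0.0
    return min(max(played_s / duration_s, 0.0), 1.0)

print(slider_fraction(10, 24))  # 10/24, matching the example above
print(slider_fraction(24, 24))  # prints 1.0 -- slider at the end point
```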
Optionally, as the target object plays, the portion of the time progress display control 3031 ahead of the slider 3032 may be highlighted.
In this way, during playback of the target object, the user can roughly know which second of content is being played, and can roughly tell which second the upcoming analysis data corresponds to.
However, the above-described embodiment still has a problem: the time cannot be determined precisely. Fig. 5 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment; as shown in fig. 5, in addition to the content shown in fig. 4, time information may also be displayed at a preset position in the time display area 303, for example, at the lower left corner shown in fig. 5. The time information takes the form current/total: the part before the slash is the time point corresponding to the current position of the slider 3032 (i.e., the current playing progress of the target object), and the part after the slash is the duration of the target object.
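Rendering that current/total time information can be sketched as follows (a minimal illustration with a hypothetical function name):

```python
def format_time_info(played_s: int, duration_s: int) -> str:
    """Render the time information shown in the time display area,
    e.g. the slash-separated current position / total duration."""
    def mmss(t: int) -> str:
        return f"{t // 60}:{t % 60:02d}"
    return f"{mmss(played_s)}/{mmss(duration_s)}"

# Second 10 of a 24-second object:
print(format_time_info(10, 24))  # prints 0:10/0:24
```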
In this embodiment, during playback of the target object, the user can know precisely which second of content is being played, and can also know precisely which second the upcoming analysis data corresponds to.
However, although the object analysis page shown in fig. 5 lets the user know precisely which second is being played and which second the upcoming analysis data corresponds to, it is still not easy for the user to align that time point with the corresponding analysis data in the data display area.
Fig. 6 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment; as shown in fig. 6, it may include a play guidance control 304 in addition to the content shown in fig. 5.
Optionally, the client may display a play guidance control in the time display area and the data display area while playing the target object in the object play area, as shown in fig. 6, where the play guidance control may vertically span the first analysis data, the second analysis data, and the time progress display control. The position of the playing guidance control on the first analysis data, the second analysis data and the time progress display control can be changed along with the change of the playing progress of the target object. Therefore, the user can easily align the time point of the current playing progress displayed in the time display area with the analysis data corresponding to the time point.
On the basis of any one of the schematic diagrams shown in fig. 3-6, the object analysis page may further include a to-be-selected object display area. Fig. 7 is a schematic diagram of an object analysis page according to an exemplary embodiment; fig. 7 is based on fig. 6, and as shown in fig. 7, in addition to all the content shown in fig. 6, a to-be-selected object display area 305 is included on the object analysis page 300. The client may present at least one candidate object 3051 of the same type as the target object in the to-be-selected object display area 305.
Optionally, each candidate object in the at least one candidate object may be an object of the same type as the target object.
Optionally, the label of each candidate object may be the same as the label of the target object, such as all of the "parent-child" types.
Optionally, the duration of each candidate object may be the same as the duration of the target object, such as 24 seconds.
Optionally, the label and duration of each candidate object may be the same as those of the target object, for example, both are of "parent-child" type and both are 24 seconds.
Optionally, the interaction volume of each candidate object may be of the same order as that of the target object. For example, when measured by like count, the like counts of the candidate object and the target object are of the same order of magnitude, such as both between one million and two million likes. Similarly, the interaction volume may also include the favorite count, forward count, comment count, the number of works published by the publisher to which the object belongs, the number of times the object has been searched, and the like.
Optionally, the popularity of each candidate object may be of the same order as that of the target object. For example, when measured by fan growth, the fan growth of the candidate object and the target object are of the same order of magnitude, such as both between ten thousand and twenty thousand new followers. Similarly, popularity may also include the number of fans of the publisher to which the object belongs.
Optionally, the preset content of each candidate object may be the same as that of the target object; for example, both are located in the same city or the same area, or both use the same background music.
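One possible combination of the optional filters above can be sketched as follows; all names, field keys, and the specific filter combination are hypothetical, chosen only to illustrate the same-label, same-duration, same-order-of-magnitude matching described:

```python
import math

def same_order(a: int, b: int) -> bool:
    """True when two counts share an order of magnitude,
    e.g. 1.5M and 1.9M likes both fall in the 10^6 bucket."""
    if a <= 0 or b <= 0:
        return a == b
    return int(math.log10(a)) == int(math.log10(b))

def is_candidate(obj: dict, target: dict) -> bool:
    """Same label, same duration, and like counts of the same order."""
    return (obj["label"] == target["label"]
            and obj["duration_s"] == target["duration_s"]
            and same_order(obj["likes"], target["likes"]))

target = {"label": "parent-child", "duration_s": 24, "likes": 1_500_000}
other  = {"label": "parent-child", "duration_s": 24, "likes": 1_900_000}
print(is_candidate(other, target))  # prints True
```

Other measures (favorites, forwards, fan growth) would slot into the same pattern as extra conjuncts.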
Optionally, the first analysis data may be analysis data of any one object to be selected among the at least one object to be selected, and may also be average analysis data of a plurality of objects to be selected among the at least one object to be selected.
This is because the present application aims to give the creator a better sense of the quality of the target object by displaying the analysis data of other objects of the same type, so that the creator can improve the quality of subsequent works based on that data. The application therefore needs to search specifically for objects of the same type as the target object for comparison; otherwise the comparison loses its meaning and resources are wasted.
In the embodiment of the application, the client can also play any one of the objects to be selected in the object playing area.
In an alternative embodiment, the client may switch the target object in the object playing area to the selected object in response to a switching instruction triggered by a switching control corresponding to the selected object in the at least one object to be selected, play the selected object in the object playing area in response to a second playing instruction triggered by a playing control based on the selected object, and display first analysis data and second analysis data corresponding to the selected object in the data display area.
For example, assuming that the object 1 to be selected in the object display area to be selected is a selected object, when the client detects a switching instruction triggered by the switching control of the selected object, the resource of the selected object may be obtained, and the target object in the object playing area is switched to the selected object. When a second play instruction triggered by the play control of the selected object is detected, the selected object can be played in the object play area, and first analysis data corresponding to the selected object and second analysis data corresponding to the target object are displayed in the data display area.
The first analysis data and the second analysis data comprise data corresponding to a preset time point in the duration of the target object.
In an alternative embodiment, the object playing area may be divided into a plurality of areas, for example, the object playing area may be divided into two areas for playing the target object and the selected object respectively. Fig. 8 is a schematic diagram of an object analysis page according to an exemplary embodiment, and fig. 8 is a schematic diagram based on fig. 7, and as shown in fig. 8, the object playing area 301 may be divided into two areas, including a target object playing area 3011 and a selected object playing area 3012.
Optionally, dividing the object playing area into two regions for playing the target object and the selected object is only one alternative embodiment; in practice, the number of regions may be determined by the number of selected objects. For example, assuming the number of selected objects is 2, the object playing area may be divided into three parts: a target object playing area, a first selected object playing area, and a second selected object playing area.
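Determining the sub-areas from the number of selected objects can be sketched as an even split (a minimal illustration with hypothetical names; real layouts would account for aspect ratios and padding):

```python
def split_play_area(width: int, n_selected: int) -> list:
    """Divide the object playing area into 1 + n_selected sub-areas:
    one for the target object plus one per selected object."""
    n = 1 + n_selected
    base = width // n
    widths = [base] * n
    widths[-1] += width - base * n  # give leftover pixels to the last area
    return widths

print(split_play_area(900, 1))  # [450, 450]: target area + one selected area
print(split_play_area(900, 2))  # [300, 300, 300]: target + two selected areas
```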
Optionally, the client may divide the object playing area into a target object playing area and a selected object playing area in response to an adding instruction triggered based on an adding control corresponding to a selected object in the at least one object to be selected. Subsequently, the client may respond to a third play instruction triggered by the integrated play control based on the target object and the selected object, play the target object in the target object play area, and play the selected object in the selected object play area. The integrated playing control may be a triangle control in the target object playing region 3011 shown in fig. 8, or may be a triangle control in the selected object playing region 3012. That is, when any one of the triangle controls in the target object playing region 3011 and the selected object playing region 3012 is touched, the synchronous playing of the target object and the selected object can be started.
And, while the target object and the selected object are played synchronously, the client may display the first analysis data corresponding to the selected object and the second analysis data corresponding to the target object in the data display area. The first analysis data and the second analysis data comprise data corresponding to a preset time point in the duration of the target object.
In the embodiment of the application, when the selected object is playing in the object playing area, the first analysis data is the analysis data corresponding to the selected object.
In this embodiment of the present application, when only the target object is playing, the first analysis data may be the first analysis data corresponding to the at least one object to be selected. That is, when the target object is played, the client may display, in the data display area, the first analysis data corresponding to the at least one object to be selected together with the second analysis data.
In this embodiment of the application, when only the target object is played, the first analysis data may be analysis data corresponding to the selectable object in the selectable object set. The selectable objects are all objects of the same type as the target object, and the selectable objects may include objects to be selected displayed in an object display area to be selected, and may also include objects not displayed in the object display area to be selected. That is, when the target object is played, the client may display the first analysis data and the second analysis data corresponding to the selectable object set in the data display area.
For example, if the object playing area is divided into three parts including the target object playing area, the first selected object playing area, and the second selected object playing area, the data display area may display the first analysis data of the first selected object played in the first selected object playing area, the first analysis data of the second selected object played in the second selected object playing area, and the second analysis data of the target object played in the target object playing area.
In the embodiment of the application, the target object may be pinned to the top of the to-be-selected object display area. Alternatively, while the target object is playing in the object playing area, it is not shown in the to-be-selected object display area; when the target object is not playing, for example after switching to playing the selected object, the target object may be displayed at the top of the to-be-selected object display area.
Because the objects in the to-be-selected object display area may not include the selected object the user wants, the embodiment of the application may provide a to-be-selected object addition control in that area. When a to-be-selected object addition instruction triggered by this control is detected, the client may parse the instruction to obtain the object identifier of the added object, and then add the new to-be-selected object to the to-be-selected object display area.
Optionally, when a new candidate object is added to the to-be-selected object display area, in view of the size of the area, an original candidate object may be removed, the candidate objects may be uniformly shrunk to fit more of them, or the candidate objects may be arranged as a list that can be scrolled up and down so that different candidate objects are shown in turn. In the list arrangement, adding a new candidate object does not affect the layout; it simply lengthens the list.
On the basis of any one of the schematic diagrams shown in fig. 4-7, the object analysis page may further include a viewing guidance control. Fig. 9 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment; fig. 9 is based on fig. 7, and as shown in fig. 9, a viewing guidance control 306 may be included in addition to all of the content shown in fig. 7.
Specifically, when the client detects a viewing guidance instruction based on a target time point, the viewing guidance control may be displayed at the positions of the first analysis subdata, the second analysis subdata, and the time progress display control that correspond to the target time point. Analyzing in conjunction with fig. 9, when the client detects a viewing guidance instruction based on the target time point 0:10 (i.e., the 10th second), the viewing guidance control may be displayed at the positions corresponding to 0:10 on the first analysis subdata, the second analysis subdata, and the time progress display control.
The viewing guidance instruction may be generated when the client detects a touch at the position on the time progress display control corresponding to the target time point 0:10 (the circle area corresponding to the 10th second in fig. 9). It may likewise be generated when the client detects a touch on the circle area on the first analysis data, or on the circle area on the second analysis data.
As shown in fig. 9, after the viewing guidance control appears, the client may display the value 0:10 of the target time point at the corresponding position on the time progress display control, display the analysis value "XXX" of the first analysis subdata in a preset area corresponding to the first analysis subdata, and display the analysis value "YYY" of the second analysis subdata in a preset area corresponding to the second analysis subdata.
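Looking up the two analysis values for the touched time point can be sketched as a pair of dictionary lookups; the function name, the dictionary keys, and the sample retention figures are hypothetical stand-ins for the "XXX"/"YYY" values of fig. 9:

```python
def values_at(time_point: int, first: dict, second: dict):
    """Fetch the first and second analysis subdata values for the
    target time point touched by the user (None when no sample exists)."""
    return first.get(time_point), second.get(time_point)

first_analysis  = {9: 0.82, 10: 0.80, 11: 0.79}  # e.g. same-type objects' data
second_analysis = {9: 0.75, 10: 0.71, 11: 0.70}  # e.g. the target object's data
print(values_at(10, first_analysis, second_analysis))  # prints (0.8, 0.71)
```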
Fig. 10 is a schematic diagram illustrating an object analysis page according to an exemplary embodiment, and as shown in fig. 10, a key frame display area 307 may be included in addition to the entire contents shown in fig. 9. The key frame display area 307 is located in the object playback area and on the upper layer of the object playback area.
Optionally, when the client detects a viewing guidance instruction based on the target time point and displays the viewing guidance control at the positions of the first analysis subdata, the second analysis subdata, and the time progress display control corresponding to the target time point, a key frame display area may be displayed on the upper layer of the object playing area, and the key frame in the target object corresponding to the target time point, such as the key frame corresponding to the 10th second, may be displayed in the key frame display area. The key frame corresponding to the 10th second may be the frame exactly at the 10-second mark (if one exists), the first or last frame within that second, or a random frame within that second.
In this manner, the client can help the user locate the analysis data corresponding to a specific frame at a given second.
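The key-frame choices described above (the frame on the second mark, or the first/last frame within that second) can be sketched as frame-index arithmetic at a fixed frame rate; names and the 30 fps figure are hypothetical:

```python
def key_frame_index(target_s: int, fps: int, mode: str = "first") -> int:
    """Frame index for a whole-second target time point: the first frame
    of that second (also the frame on the mark) or the last frame in it."""
    if mode == "first":
        return target_s * fps
    if mode == "last":
        return (target_s + 1) * fps - 1
    raise ValueError(f"unknown mode: {mode}")

print(key_frame_index(10, 30))          # prints 300: first frame of second 10
print(key_frame_index(10, 30, "last"))  # prints 329: last frame of second 10
```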
Optionally, when the client detects a first play instruction triggered by a play control corresponding to the target object in the object analysis page, the to-be-selected object display area may be hidden to enlarge the user's viewing field, the object playing area may be expanded into the area originally occupied by the to-be-selected object display area, and the target object may be played in the enlarged object playing area. Alternatively, after the to-be-selected object display area is hidden, the object playing area, the time display area, and the data display area may all be expanded synchronously into the original to-be-selected object display area.
In this embodiment of the application, the first analysis data may include first account retention data corresponding to the selected object, and the second analysis data may include second account retention data corresponding to the target object. Account retention data helps the user analyze how many viewers of the corresponding object are still watching at each second. Optionally, it may be expressed as a percentage, the percentage indicating what proportion of viewers remain at the current second.
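The per-second retention percentage can be sketched as follows (a minimal illustration; the function name and the sample viewer counts are hypothetical):

```python
def retention_percent(viewers_at_second: list) -> list:
    """Per-second retention as a percentage of the starting audience."""
    start = viewers_at_second[0]
    if start == 0:
        return [0.0] * len(viewers_at_second)
    return [round(100.0 * v / start, 1) for v in viewers_at_second]

# 1000 viewers at second 0, dropping off over the following seconds.
print(retention_percent([1000, 950, 900, 720]))  # [100.0, 95.0, 90.0, 72.0]
```

Plotting one such series for the selected object and one for the target object yields the two retention curves shown in the data display area.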
In this embodiment of the application, the first analysis data may include first interaction data corresponding to the selected object, and the second analysis data may include second interaction data corresponding to the target object. For example, the interaction data may be like data, forward data, comment data, favorite data, search-count data, fan-growth data, or browsing and viewing data. Interaction data helps the user analyze at which seconds viewers of the corresponding object interacted with the work. Optionally, it may be expressed as a percentage; for example, the percentage indicates what proportion of all likes occurred at the current second.
In this embodiment of the application, the first analysis data may include first operation data corresponding to the selected object, and the second analysis data may include second operation data corresponding to the target object. For example, the operation data may be reward (tipping) data. Optionally, it may be expressed as a percentage; for example, the percentage indicates what proportion of all rewards were given at the current second.
In this embodiment of the application, the first analysis data may include first account retention data and first interaction data corresponding to the selected object, and the second analysis data may include second account retention data and second interaction data corresponding to the target object. The data display area may thus display four broken lines, each in a different color.
Optionally, the broken lines shown in the above figures are only one optional implementation; besides broken lines, bar charts, combined charts, three-dimensional statistical charts, or dynamic charts may be used to display the analysis data, so that users with different viewing habits can switch among them.
The analysis data may also include other analysis data suitable for the present application, which is not described herein in detail.
In summary, the embodiment of the application provides a brand-new interactive mode in which a creator can view per-second, real-time data for an object while the object plays. This helps the creator analyze the object more thoroughly, can raise the creator's enthusiasm and desire to create, and is a significant innovation in the field of object data visualization.
In addition, the method and the device provide an opportunity for same-screen, multi-dimensional comparison with works of the same type: high-quality works of the same type are presented in a recommendation area alongside the creator's own work. The creator can watch and compare these works, analyze the content of an excellent object frame by frame together with the data at the corresponding second, and contrast it with the creator's own work, thereby encouraging the creator to produce works of ever higher quality.
Fig. 11 is a block diagram illustrating an object data display apparatus according to an exemplary embodiment. Referring to fig. 11, the apparatus includes a page display module 1101, a play module 1102, and a data display module 1103.
A page display module 1101 configured to execute a display object analysis page; the object analysis page comprises an object playing area and a data display area;
the playing module 1102 is configured to execute a first playing instruction triggered based on a playing control corresponding to the target object, and play the target object in the object playing area;
a data display module 1103 configured to perform displaying the first analysis data and second analysis data corresponding to the target object in a data display area; the first analysis data includes analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the duration of the target object.
In some possible embodiments, the object analysis page further comprises a time display area; the apparatus further includes:
a time control display module configured to perform displaying a time progress display control in a time display area;
the display progress of the time progress display control is consistent with the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a play guidance control, and the apparatus further includes:
a play guidance control display module configured to perform display of a play guidance control in the time display area and the data display area;
the position of the playing guide control on the first analysis data, the second analysis data and the time progress display control changes along with the change of the playing progress of the target object.
In some possible embodiments, the object analysis page further includes a to-be-selected object display area; the apparatus further includes:
the object display module is configured to display at least one object to be selected, which is of the same type as the target object, in the object display area to be selected;
the same type includes the same duration type or the same label type.
In some possible embodiments, the apparatus further comprises:
the switching module is configured to execute a switching instruction triggered by a switching control piece corresponding to a selected object in at least one object to be selected, and switch a target object in the object playing area to the selected object;
the playing module is configured to execute a second playing instruction triggered by the playing control based on the selected object, and play the selected object in the object playing area;
and the data display module is configured to display the first analysis data and the second analysis data corresponding to the selected object in the data display area.
In some possible embodiments, the apparatus further comprises:
the region segmentation module is configured to execute an adding instruction triggered by an adding control corresponding to a selected object in at least one object to be selected, and divide the object playing region into a target object playing region and a selected object playing region;
the playing module is configured to execute a third playing instruction triggered by the comprehensive playing control based on the target object and the selected object, play the target object in the target object playing area and play the selected object in the selected object playing area;
and the data display module is configured to display the first analysis data and the second analysis data corresponding to the selected object in the data display area.
In some possible embodiments, the data display module is configured to perform displaying, in the data display area, first analysis data and second analysis data corresponding to at least one object to be selected;
or;
the data display module is configured to display first analysis data and second analysis data corresponding to the selectable object sets in the data display area; the selectable objects in the selectable object set comprise objects which are the same as the target objects in type; the selectable object set includes at least one candidate object.
In some possible embodiments, the object display module is configured to perform displaying the target object on top of the to-be-selected object display area.
In some possible embodiments, the to-be-selected object display area further includes a to-be-selected object addition control, and the apparatus further includes:
the object adding module is configured to execute a to-be-selected object adding instruction triggered based on the to-be-selected object adding control and add the to-be-selected object in the to-be-selected object display area; the candidate object adding instruction comprises an object identifier of the added candidate object.
In some possible embodiments, the apparatus further comprises:
and a viewing guidance control display module configured to, when a viewing guidance instruction based on the target time point is detected, display the viewing guidance control at the positions of the first analysis subdata, the second analysis subdata, and the time progress display control corresponding to the target time point.
In some possible embodiments, the apparatus further comprises a key frame display module configured to perform:
displaying a key frame display area overlaid on the object playing area;
and displaying, in the key frame display area, the key frame of the target object that corresponds to the target time point.
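One way to realize the key-frame lookup above is a nearest-timestamp search over the target object's key-frame times. This is an illustrative sketch under the assumption of sorted timestamps, not the patent's actual implementation:

```python
from bisect import bisect_left

def nearest_key_frame(key_frame_times, target_time):
    """Return the key-frame timestamp closest to target_time.

    key_frame_times is a sorted list of timestamps (e.g. seconds from
    the start of the target object).
    """
    if not key_frame_times:
        raise ValueError("target object has no key frames")
    i = bisect_left(key_frame_times, target_time)
    # the nearest key frame is either the one just before or just after
    candidates = key_frame_times[max(0, i - 1):i + 1]
    return min(candidates, key=lambda t: abs(t - target_time))
```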
In some possible embodiments, the apparatus further comprises a numerical display module configured to perform:
displaying the numerical value of the target time point at the corresponding position of the time progress display control;
displaying the analysis value of the first analysis sub-data in a preset area corresponding to the first analysis sub-data;
and displaying the analysis value of the second analysis sub-data in a preset area corresponding to the second analysis sub-data.
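A minimal sketch of the numeric readout, assuming each analysis sub-data series is keyed by time point (the fall-back to the latest earlier sample is an assumption of this sketch, not something stated in the patent):

```python
def values_at(time_point, first_series, second_series):
    """Return the analysis values of the first and second analysis
    sub-data at the given time point.

    Each series maps time points to analysis values; when the exact
    time point is absent, the latest earlier sample is used, the way a
    progress-linked readout would hold its last value.
    """
    def lookup(series):
        eligible = [t for t in series if t <= time_point]
        return series[max(eligible)] if eligible else None

    return lookup(first_series), lookup(second_series)
```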
In some possible embodiments, the first analysis data comprises first account retention data and the second analysis data comprises second account retention data;
and/or
the first analysis data comprises first interaction data, and the second analysis data comprises second interaction data;
and/or
the first analysis data includes first operation data and the second analysis data includes second operation data.
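The three data pairings above suggest a simple per-object container. The field names in this dataclass sketch (retention, interaction, and operation series keyed by time point) are hypothetical stand-ins for the patent's terms:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class AnalysisData:
    # each series maps a time point in the object to an analysis value
    account_retention: Dict[int, float] = field(default_factory=dict)
    interaction: Dict[int, float] = field(default_factory=dict)
    operation: Dict[int, float] = field(default_factory=dict)

@dataclass
class ObjectAnalysis:
    object_id: str
    # first: analysis data over objects of the same type as the target
    first: AnalysisData
    # second: analysis data for the target object itself
    second: AnalysisData
```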
In some possible embodiments, the apparatus further comprises:
a hiding module configured to, in response to a first playing instruction triggered via the playing control corresponding to the target object, hide the to-be-selected object display area;
an area enlargement module configured to enlarge the object playing area;
and a playing module configured to play the target object in the enlarged object playing area.
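The hide-then-enlarge sequence can be modeled as a small layout state change. The widths and names below are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class AnalysisPageLayout:
    candidate_area_visible: bool = True
    play_area_width: int = 640

    def on_first_play_instruction(self, enlarged_width: int = 960) -> None:
        # hide the to-be-selected object display area, then enlarge
        # the object playing area before the target object plays
        self.candidate_area_visible = False
        self.play_area_width = enlarged_width
```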
In some possible embodiments, the object display module is configured to perform:
displaying, in the to-be-selected object display area, at least one to-be-selected object of the same duration type as the target object;
and/or
displaying, in the to-be-selected object display area, at least one to-be-selected object of the same label type as the target object;
and/or
displaying, in the to-be-selected object display area, at least one to-be-selected object of the same operation amount type as the target object;
and/or
displaying, in the to-be-selected object display area, at least one to-be-selected object of the same preference type as the target object;
and/or
displaying, in the to-be-selected object display area, at least one to-be-selected object of the same attribute type as the target object.
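The and/or branches above reduce to filtering a pool of objects on any shared type dimension. The key names here are hypothetical stand-ins for the patent's duration, label, operation amount, preference, and attribute types:

```python
TYPE_KEYS = ("duration_type", "label_type", "operation_amount_type",
             "preference_type", "attribute_type")

def same_type_candidates(target, pool, keys=TYPE_KEYS):
    """Return objects from `pool` that share at least one type
    dimension with `target`; objects are plain dicts."""
    return [
        obj for obj in pool
        if obj is not target and any(
            obj.get(k) is not None and obj.get(k) == target.get(k)
            for k in keys
        )
    ]
```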
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 12 is a block diagram illustrating an apparatus 2000 for data processing in accordance with an example embodiment. For example, the apparatus 2000 may be a mobile telephone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 12, the apparatus 2000 may include one or more of the following components: a processing component 2002, a memory 2004, a power component 2006, a multimedia component 2008, an audio component 2010, an input/output (I/O) interface 2012, a sensor component 2014, and a communication component 2016.
The processing component 2002 generally controls the overall operation of the device 2000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 2002 may include one or more processors 2020 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 2002 can include one or more modules that facilitate interaction between the processing component 2002 and other components. For example, the processing component 2002 may include a multimedia module to facilitate interaction between the multimedia component 2008 and the processing component 2002.
The memory 2004 is configured to store various types of data to support operation at the device 2000. Examples of such data include instructions for any application or method operating on device 2000, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 2004 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 2006 provides power to the various components of the device 2000. The power components 2006 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 2000.
The multimedia component 2008 includes a screen providing an output interface between the device 2000 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 2008 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 2000 is in an operation mode, such as a photographing mode or an object mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
Audio component 2010 is configured to output and/or input audio signals. For example, audio component 2010 includes a Microphone (MIC) configured to receive external audio signals when apparatus 2000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in memory 2004 or transmitted via communications component 2016. In some embodiments, audio assembly 2010 also includes a speaker for outputting audio signals.
The I/O interface 2012 provides an interface between the processing component 2002 and peripheral interface modules, which can be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor assembly 2014 includes one or more sensors for providing status assessments of various aspects of the apparatus 2000. For example, the sensor assembly 2014 may detect an open/closed state of the apparatus 2000, the relative positioning of components such as the display and keypad of the apparatus 2000, a change in position of the apparatus 2000 or a component of the apparatus 2000, the presence or absence of user contact with the apparatus 2000, an orientation or acceleration/deceleration of the apparatus 2000, and a change in temperature of the apparatus 2000. The sensor assembly 2014 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 2014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 2014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 2016 is configured to facilitate wired or wireless communication between the apparatus 2000 and other devices. The apparatus 2000 may access a wireless network based on a communication standard, such as WiFi, an operator network (such as 2G, 3G, 4G, or 5G), or a combination thereof. In an exemplary embodiment, the communication component 2016 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 2016 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 2000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a storage medium comprising instructions, such as the memory 2004 comprising instructions, executable by the processor 2020 of the apparatus 2000 to perform the above-described method is also provided. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.

Claims (10)

1. An object data display method, comprising:
displaying an object analysis page; the object analysis page comprises an object playing area and a data display area;
responding to a first playing instruction triggered based on a playing control corresponding to a target object, and playing the target object in the object playing area;
displaying the first analysis data and the second analysis data corresponding to the target object in a data display area; the first analysis data comprises analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the target object.
2. The object data display method according to claim 1, wherein the object analysis page further includes a time display area; the method further comprises the following steps:
displaying a time progress display control in the time display area;
and the display progress of the time progress display control is consistent with the playing progress of the target object.
3. The method of claim 1, wherein the object analysis page further comprises a play guidance control, the method further comprising:
displaying the play guidance control in the time display area and the data display area;
the positions of the playing guide control on the first analysis data, the second analysis data and the time progress display control change as the playing progress of the target object changes.
4. The object data display method according to any one of claims 1 to 3, wherein the object analysis page further includes a to-be-selected object display area; the method further comprises the following steps:
and displaying, in the to-be-selected object display area, at least one to-be-selected object of the same type as the target object.
5. The method of claim 4, further comprising:
switching the target object in the object playing area to the selected object in response to a switching instruction triggered based on a switching control corresponding to the selected object in the at least one object to be selected;
responding to a second playing instruction triggered by a playing control based on the selected object, and playing the selected object in the object playing area;
and displaying the first analysis data and the second analysis data corresponding to the selected object in the data display area.
6. The object data display method according to claim 4, characterized in that the method further comprises:
responding to an adding instruction triggered by an adding control corresponding to a selected object in the at least one object to be selected, and dividing the object playing area into a target object playing area and a selected object playing area;
responding to a third playing instruction triggered by a comprehensive playing control based on the target object and the selected object, playing the target object in the target object playing area, and playing the selected object in the selected object playing area;
and displaying the first analysis data and the second analysis data corresponding to the selected object in the data display area.
7. An object data display apparatus, comprising:
a page display module configured to execute a display object analysis page; the object analysis page comprises an object playing area and a data display area;
the playing module is configured to execute a first playing instruction triggered based on a playing control corresponding to a target object, and play the target object in the object playing area;
the data display module is configured to display the first analysis data and second analysis data corresponding to the target object in a data display area; the first analysis data comprises analysis data of an object of the same type as the target object; the first analysis data and the second analysis data include data corresponding to a preset time point in the target object.
8. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the object data display method of any of claims 1 to 6.
9. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the object data display method of any one of claims 1 to 6.
10. A computer program product, characterized in that the computer program product comprises a computer program, the computer program being stored in a readable storage medium, from which at least one processor of a computer device reads and executes the computer program, causing the computer device to perform the object data display method according to any one of claims 1 to 6.
CN202111574859.XA 2021-12-21 2021-12-21 Object data display method and device, electronic equipment and storage medium Pending CN114780180A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111574859.XA CN114780180A (en) 2021-12-21 2021-12-21 Object data display method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114780180A true CN114780180A (en) 2022-07-22

Family

ID=82423188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111574859.XA Pending CN114780180A (en) 2021-12-21 2021-12-21 Object data display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114780180A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753952A (en) * 2008-12-04 2010-06-23 乐金电子(中国)研究开发中心有限公司 Sending and displaying method of audience rating data and device
CN103200453A (en) * 2011-12-06 2013-07-10 Lg电子株式会社 Image display apparatus and methods for operating the same
CN106341694A (en) * 2016-08-29 2017-01-18 广州华多网络科技有限公司 Method and device for obtaining live streaming operation data
CN106792081A (en) * 2016-12-07 2017-05-31 腾讯科技(深圳)有限公司 The method for pushing and device of live video
CN109688437A (en) * 2018-12-10 2019-04-26 未来电视有限公司 A kind of method for exhibiting data, device, electronic equipment and readable storage medium storing program for executing
CN109769146A (en) * 2018-12-25 2019-05-17 国家新闻出版广电总局广播电视规划院 The determination method and device of broadcast TV program audience ratings
CN111817943A (en) * 2019-04-12 2020-10-23 腾讯科技(深圳)有限公司 Data processing method and device based on instant messaging application
CN113259780A (en) * 2021-07-15 2021-08-13 中国传媒大学 Holographic multidimensional audio and video playing progress bar generating, displaying and playing control method
CN113727201A (en) * 2021-08-30 2021-11-30 北京字节跳动网络技术有限公司 Data processing method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination