CN109992187B - Control method, device, equipment and storage medium - Google Patents

Control method, device, equipment and storage medium

Info

Publication number
CN109992187B
Authority
CN
China
Prior art keywords
image
control
adjusted
interactive control
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910289631.2A
Other languages
Chinese (zh)
Other versions
CN109992187A (en)
Inventor
徐锐
刘梓欣
Current Assignee
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201910289631.2A
Publication of CN109992187A
Application granted
Publication of CN109992187B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a control method, apparatus, device, and storage medium. The method comprises the following steps: displaying an image and an interactive control associated with the image in a list interface of a client or in a detail interface of a list element; receiving a triggering operation on the interactive control, and determining the image associated with the triggered interactive control; acquiring the associated image as an image to be adjusted; determining image adjustment information corresponding to the interactive control; and adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image, wherein the image adjustment information matches the interaction information conveyed by the interactive control.

Description

Control method, device, equipment and storage medium
Technical Field
The present disclosure relates to image processing technologies, and in particular, to a control method, apparatus, device, and storage medium.
Background
At present, users' leisure time is closely tied to images: users enjoy browsing pictures and videos and derive much of their entertainment from them, and image processing technology continues to develop to enhance the entertainment experience of users.
With the increasing demands of users, how to use image processing technology to improve the entertainment experience of users has become a technical problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the present disclosure provide a control method, apparatus, device, and storage medium. When a user performs an interactive operation through an interactive control, the associated image is adjusted accordingly, so that the display effect of the image matches the interaction information between the user and the image conveyed by the interactive control. This improves the efficiency and richness of information transmission and thereby improves the entertainment experience of the user.
In a first aspect, an embodiment of the present disclosure provides a control method, including: displaying an image and an interactive control associated with the image in a list interface of a client or in a detail interface of a list element;
receiving a triggering operation on the interactive control, and determining the image associated with the triggered interactive control;
acquiring the associated image as an image to be adjusted;
determining image adjustment information corresponding to the interactive control;
and adjusting the image to be adjusted based on the image adjustment information, and displaying the adjusted image, wherein the image adjustment information matches the interaction information conveyed by the interactive control.
In a second aspect, an embodiment of the present disclosure further provides a control apparatus, including:
a first display module, configured to display an image and an interactive control associated with the image in a list interface of a client or in a detail interface of a list element;
an image determining module, configured to receive a triggering operation on the interactive control and determine the image associated with the triggered interactive control;
an acquisition module, configured to acquire the associated image as an image to be adjusted;
an information determining module, configured to determine image adjustment information corresponding to the interactive control;
and a second display module, configured to adjust the image to be adjusted based on the image adjustment information and display the adjusted image, wherein the image adjustment information matches the interaction information conveyed by the interactive control.
In a third aspect, an embodiment of the present disclosure further provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the control method according to any one of the embodiments of the present disclosure when executing the program.
In a fourth aspect, the disclosed embodiments also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the control method according to any one of the disclosed embodiments.
The embodiments of the present disclosure display an image and an interactive control associated with the image in a list interface of a client or in a detail interface of a list element; receive a triggering operation on the interactive control and determine the image associated with the triggered interactive control; acquire the associated image as the image to be adjusted; determine the image adjustment information corresponding to the interactive control; and adjust the image to be adjusted based on the image adjustment information and display the adjusted image, wherein the image adjustment information matches the interaction information conveyed by the interactive control. Because the image is adjusted in response to the user's interactive operation through the interactive control, the display effect of the image matches the interaction information between the user and the image that the interactive control conveys, which improves the efficiency and richness of information transmission and thereby improves the entertainment experience of the user.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present disclosure and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings may be obtained from these drawings without inventive effort.
Fig. 1 is a flowchart of a control method in a first embodiment of the disclosure;
fig. 2A is a flowchart of a control method in a second embodiment of the disclosure;
fig. 2B is a list interface of a client in the second embodiment of the present disclosure;
fig. 2C is a detail interface diagram of the list element image a in the second embodiment of the present disclosure;
fig. 2D is a detail interface diagram of a list element image B in the second embodiment of the present disclosure;
FIG. 2E is a detailed interface diagram of a still picture in the second embodiment of the disclosure;
fig. 2F is a detail interface illustration of a video in a second embodiment of the disclosure;
fig. 3A is a flowchart of a control method in a third embodiment of the present disclosure;
fig. 3B is an illustration of a detail interface of a video after a user clicks the like control in a third embodiment of the disclosure;
fig. 4 is a schematic structural diagram of a control device in a fourth embodiment of the disclosure;
fig. 5 is a schematic structural diagram of a computer device in a fifth embodiment of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the disclosure and are not limiting of the disclosure. It should be further noted that, for the convenience of description, only some of the structures relevant to the present disclosure are shown in the drawings, not all of them.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present disclosure, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
Example one
Fig. 1 is a flowchart of a control method provided in an embodiment of the present disclosure. This embodiment is applicable to control scenarios, and the method may be executed by the control apparatus of the embodiments of the present disclosure, which may be implemented in software and/or hardware. As shown in fig. 1, the method specifically includes the following steps:
s110, displaying the image and the interactive control related to the image in a list interface of the client or a detail interface of a list element.
The list interface may be an interface including at least one list element, for example, a home page interface of a certain user, and the interface includes at least one work image of the user and interaction controls such as likes and comments associated with the work image.
The list interface may include at least one image, and the at least one image may be arranged in a list form, for example, the list interface may include 6 images, and the 6 images are arranged in sequence according to 2 rows and 3 columns, or in sequence according to a single column and 6 rows.
The interactive control may be one or more of a browsing control, a bullet-screen comment (barrage) control, a sharing control, a like control, a comment control, a forwarding control, a playing control, and a rewarding (tipping) control. The image may be a still picture, some or all of the frames in an animated picture, or some or all of the frames in a video, which is not limited by the embodiments of the present disclosure.
The detail interface of the list element comprises at least one image and an interaction control associated with the image, for example, the detail interface of the list element comprises at least one work image and interaction controls such as praise and comment associated with the work image.
The image and its associated interactive control may be displayed in several ways: the interactive control may be displayed below the image, displayed at the end of the interface where the image is located, or displayed floating over the bottom of the image.
Specifically, displaying the image and the interactive control associated with the image in the list interface of the client or in the detail interface of a list element may be, for example: in the homepage interface of user A, the images of user A are displayed while interactive controls such as like and comment controls float over the bottom of each image; or in the detail interface of image W, image W is displayed while the like and comment controls are displayed at the bottom of the detail interface.
And S120, receiving a triggering operation aiming at the interactive control, and determining an image associated with the triggered interactive control.
The triggering operation on the interactive control may be the user clicking the interactive control displayed on a touch screen; it may also be identified by recognizing voice information input by the user and determining the interactive control corresponding to that voice, or by recognizing a user gesture and determining the interactive control corresponding to the gesture. The embodiments of the present disclosure are not limited in this respect.
The image associated with the triggered interactive control may be determined in several ways: a database of correspondences between interactive controls and images may be built in advance and queried for the image associated with the triggered control; the image closest to the position region of the triggered interactive control may be determined and taken as the associated image; or the association between each interactive control and its image may be established in advance so that the associated image is determined directly from the triggered control. The embodiments of the present disclosure are not limited in this respect.
Specifically, a triggering operation on the interactive control is received and the image associated with the triggered interactive control is determined. For example, suppose the user clicks a like control S at the bottom of a detail interface containing image W, image Q, and image P, and a pre-built database of correspondences between interactive controls and images records that the image associated with the like control S is image Q; then the image associated with the like control S is determined to be image Q.
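The pre-built database approach described above can be sketched as a simple lookup; the control and image identifiers are illustrative and not part of the disclosure:

```python
# Hypothetical correspondence database mapping each interactive control
# to its associated image, as in the like-control-S / image-Q example.
CONTROL_IMAGE_DB = {
    "like_control_S": "image_Q",
    "comment_control_T": "image_W",
}

def image_for_control(control_id):
    """Return the image associated with a triggered control, or None."""
    return CONTROL_IMAGE_DB.get(control_id)
```

With this table, triggering the like control S resolves to image Q, while an unknown control yields no associated image.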
S130, acquiring the associated image as an image to be adjusted.
Specifically, the associated image is acquired as the image to be adjusted. Continuing the example above, after the image associated with the like control S is determined to be image Q, image Q is acquired as the image to be adjusted.
S140, determining image adjustment information corresponding to the interactive control.
The image adjustment information may be: adding a special effect at a preset position of the image; replacing a preset position of the image with a preset image; replacing the image with a preset animated picture; or replacing the expression of a person or an animal in the image with a preset expression. The embodiments of the present disclosure are not limited in this respect.
Specifically, the image adjustment information corresponding to the interactive control may be determined by building, in advance, a database of correspondences between interactive controls and image adjustment information, and querying the database with the triggered interactive control. For example, the image adjustment information corresponding to interactive control U may be to adjust the person in the image to make a finger-heart gesture; that of interactive control O may be to adjust the person to blow a kiss; that of interactive control X may be to adjust the expression of a person or animal in the image to a pout; and that of interactive control Y may be to adjust that expression to a blink. On receiving a triggering operation on interactive control X, the corresponding image adjustment information is obtained from the database.
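A minimal sketch of the correspondence database between interactive controls and image adjustment information, using the example controls U, O, X, and Y named above (the data representation is assumed, not specified by the disclosure):

```python
# Hypothetical mapping from interactive controls to adjustment information.
# "finger_heart", "kiss", "pout", "blink" mirror the examples in the text.
ADJUSTMENT_DB = {
    "control_U": {"type": "gesture", "value": "finger_heart"},
    "control_O": {"type": "gesture", "value": "kiss"},
    "control_X": {"type": "expression", "value": "pout"},
    "control_Y": {"type": "expression", "value": "blink"},
}

def adjustment_for_control(control_id):
    """Return the adjustment information for a triggered control, or None."""
    return ADJUSTMENT_DB.get(control_id)
```

Triggering control X then yields the pout-expression adjustment described in the example.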
S150, adjusting the image to be adjusted based on the image adjusting information, and displaying the adjusted image, wherein the image adjusting information is matched with the interactive information transmitted by the interactive control.
The interaction information conveyed by the interactive control may be determined from the category of the control, or from the input content associated with the control. For example, if the interactive control is a like control, the conveyed interaction information is that the user likes the image. If the interactive control is a comment control, the conveyed information is determined by analyzing the comment content input by the user: if the comment indicates that the user likes the image, the conveyed interaction information is that the user likes the image; if the comment indicates that the user dislikes the image, the conveyed interaction information is that the user dislikes the image. If the interactive control is a downvote (dislike) control, the conveyed interaction information is that the user dislikes the image. The embodiments of the present disclosure are not limited in this respect.
The image adjustment information matches the interaction information conveyed by the interactive control. For example, if the conveyed interaction information is that the user likes the image, the image adjustment information may adjust the person's expression to a happy one, such as a pout, a blink, or a smile; if the conveyed interaction information is that the user dislikes the image, the image adjustment information may adjust the person's expression to a sad one, such as crying.
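The matching between conveyed interaction information and expression adjustments can be sketched as follows; the labels are illustrative stand-ins:

```python
# Expressions matching the conveyed interaction info, per the examples above.
HAPPY_EXPRESSIONS = ("pout", "blink", "smile")
SAD_EXPRESSIONS = ("cry",)

def matched_expression(interaction_info):
    """Pick an expression adjustment matching the conveyed interaction info."""
    if interaction_info == "user_likes_image":
        return HAPPY_EXPRESSIONS[0]
    if interaction_info == "user_dislikes_image":
        return SAD_EXPRESSIONS[0]
    return None
```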
The image to be adjusted may be adjusted by replacing it with a preset image, replacing a partial region of it with a preset image, modifying a preset region of it, or adding a preset special effect to it. The embodiments of the present disclosure are not limited in this respect.
Specifically, the image to be adjusted is adjusted based on the image adjustment information, and the adjusted image is displayed. For example, if the interactive control is a like control and the image adjustment information is to adjust the expression of the person in the image to a pout, then the person's expression in the image to be adjusted is adjusted to a pout and the adjusted image is displayed; that is, after the user clicks the like control, the person in the image responds by making a pouting expression.
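Steps S120 through S150 can be sketched end to end as follows; the lookup tables and the adjustment function are stand-ins for an actual implementation:

```python
def handle_trigger(control_id, control_image_db, adjustment_db, apply_fn):
    """Sketch of steps S120-S150: resolve the image associated with the
    triggered control, look up its adjustment information, apply the
    adjustment, and return the adjusted image."""
    image = control_image_db.get(control_id)      # S120/S130: associated image
    adjustment = adjustment_db.get(control_id)    # S140: adjustment information
    if image is None or adjustment is None:
        return None
    return apply_fn(image, adjustment)            # S150: adjust for display

# Example: a like control whose adjustment changes the expression to a pout.
result = handle_trigger(
    "like_S",
    {"like_S": "image_Q"},
    {"like_S": {"type": "expression", "value": "pout"}},
    lambda img, adj: f"{img}+{adj['value']}",
)
```

Here `result` is the tagged string `"image_Q+pout"`, standing in for the adjusted image that would be displayed.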
Optionally, the positional relationship between the interactive control and the image to be adjusted includes one or more of the following: the interactive control is located in the bottom region of the image to be adjusted; the interactive control is located in the right region of the image to be adjusted; and the interactive control floats over the image to be adjusted.
In a specific example, as shown in the figures, the interactive control may be located in the bottom region of the image to be adjusted, in the right region of the image to be adjusted, or floating over the image to be adjusted.
Optionally, receiving the triggering operation on the interactive control and determining the image associated with the triggered interactive control includes:
receiving the triggering operation on the interactive control, and acquiring position information of the triggered interactive control;
and determining, according to the position information, the image associated with the triggered interactive control.
Specifically, the list interface of the client and the detail interface of the list element each include at least one image. The triggering operation on the interactive control is received, the position information of the triggered interactive control is acquired, and, according to this position information, the image closest to the triggered interactive control is determined as the image associated with it.
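The position-based association can be sketched as a nearest-image search; the coordinates and image names are illustrative:

```python
# Pick the image whose position is closest to the triggered control,
# using squared Euclidean distance between screen coordinates.
def nearest_image(control_pos, image_positions):
    """Return the name of the image nearest to the control position."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(image_positions,
               key=lambda name: sq_dist(control_pos, image_positions[name]))

# Illustrative layout: three images at different screen positions.
images = {"image_A": (0, 0), "image_B": (100, 0), "image_C": (0, 200)}
closest = nearest_image((90, 10), images)
```

A control triggered at (90, 10) is associated with `image_B`, the image nearest that position.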
Optionally, the image adjustment information includes: an adjustment region in the image to be adjusted;
correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image includes:
adjusting the adjustment region in the image to be adjusted, and displaying the adjusted image.
Specifically, an adjustment region in the image to be adjusted corresponding to the interactive control is determined, the adjustment region in the image to be adjusted is adjusted, and the adjusted image is displayed.
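Adjusting only the adjustment region can be sketched on a simple pixel grid; a real implementation would operate on decoded image buffers rather than nested lists:

```python
# Adjust a rectangular (top, left, height, width) region of a 2D image
# grid, leaving the rest of the image and the original grid untouched.
def adjust_region(image, region, fn):
    top, left, height, width = region
    out = [row[:] for row in image]  # copy so the original is preserved
    for r in range(top, top + height):
        for c in range(left, left + width):
            out[r][c] = fn(out[r][c])
    return out

img = [[0] * 4 for _ in range(4)]
adjusted = adjust_region(img, (1, 1, 2, 2), lambda v: v + 9)
```

Only the 2x2 interior region is modified; pixels outside the adjustment region keep their original values.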
Optionally, the interactive control includes one or more of the following: a browsing control, a bullet-screen comment (barrage) control, a sharing control, a like control, a comment control, a forwarding control, a playing control, and a rewarding (tipping) control.
Optionally, the image to be adjusted includes one or more of the following:
a still picture;
some or all of the frames in an animated picture;
some or all of the frames in a video.
It should be noted that the embodiments of the present disclosure can adjust the image to be adjusted directly, so that the adjusted image is more realistic, which improves the efficiency and richness of information transmission and further improves the entertainment experience of the user.
According to the technical solution of this embodiment, an image and an interactive control associated with the image are displayed in a list interface of a client or in a detail interface of a list element; a triggering operation on the interactive control is received, and the image associated with the triggered interactive control is determined; the associated image is acquired as the image to be adjusted; the image adjustment information corresponding to the interactive control is determined; and the image to be adjusted is adjusted based on the image adjustment information, and the adjusted image is displayed, wherein the image adjustment information matches the interaction information conveyed by the interactive control. Because the image is adjusted in response to the user's interactive operation through the interactive control, the display effect of the image matches the interaction information between the user and the image that the interactive control conveys, improving the efficiency and richness of information transmission and further improving the entertainment experience of the user.
Example two
Fig. 2A is a flowchart of a control method in a second embodiment of the present disclosure. This embodiment is optimized on the basis of the embodiment above. In this embodiment, determining the image adjustment information corresponding to the interactive control includes: determining a target character expression corresponding to the interactive control. Correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image includes: adjusting the character expression in the image to be adjusted according to the target character expression, and displaying the adjusted image.
As shown in fig. 2A, the method of this embodiment specifically includes the following steps:
s210, displaying the image and the interactive control related to the image in a list interface of the client or a detail interface of a list element.
Specifically, the list interface of the client is shown in fig. 2B and includes six images: image A, image B, image C, image D, image E, and image F, with a like control floating at the bottom of each image. The detail interface of list element image A is shown in fig. 2C: it contains image A, with like and comment controls displayed at the bottom of the interface. The detail interface of list element image B is shown in fig. 2D: it contains image B, image Q, image W, and image E, with like and comment controls displayed at the bottom of the interface.
And S220, receiving a triggering operation aiming at the interactive control, and determining an image associated with the triggered interactive control.
And S230, acquiring the associated image as an image to be adjusted.
S240, determining the target character expression corresponding to the interactive control.
The target character expression may be a pouting expression, a blinking expression, a smiling expression, a crying expression, and so on, which is not limited by the embodiments of the present disclosure.
Specifically, a database of correspondences between interactive controls and character expressions is built in advance; this database is queried with the triggered interactive control to obtain the corresponding target character expression. For example, interactive control S may correspond to a pouting expression, interactive control I to a blinking expression, and interactive control D to a smiling expression.
And S250, adjusting the character expression in the image to be adjusted according to the target character expression, and displaying the adjusted image.
The character expression in the image to be adjusted may be adjusted according to the target character expression in several ways: the facial image region of the character may be extracted in advance, the expression in that region adjusted to the target character expression, and the original facial region replaced with the adjusted one; the character expression may be adjusted to the target character expression directly in the image; or a partial image corresponding to the target character expression may be prepared in advance and superimposed at a preset position of the image to be adjusted.
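The first approach above (extract the facial region, adjust it, and paste it back) can be sketched as follows; the expression renderer is a stand-in for an actual image-processing model, and the image is a toy pixel grid:

```python
# Extract the face region of a 2D image grid, re-render it with the
# target expression, and paste the adjusted region back into a copy.
def replace_face_expression(image, face_box, target_expression, render_expression):
    top, left, height, width = face_box
    face = [row[left:left + width] for row in image[top:top + height]]
    new_face = render_expression(face, target_expression)  # assumed renderer
    out = [row[:] for row in image]
    for i in range(height):
        out[top + i][left:left + width] = new_face[i]
    return out

# Dummy renderer: fill the face region with the expression's first letter.
img = [["."] * 3 for _ in range(3)]
adjusted_img = replace_face_expression(
    img, (0, 0, 2, 2), "pout",
    lambda face, expr: [[expr[0]] * len(row) for row in face],
)
```

The 2x2 face region is replaced while the rest of the image, and the original grid, remain unchanged.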
Specifically, the character expression in the image to be adjusted is adjusted according to the target character expression, and the adjusted image is displayed. For example, if the target character expression is a pouting expression, the character expression in the image to be adjusted is adjusted to a pout, and the resulting image is displayed.
Specifically, the image may be adjusted by analyzing the scene in real time and adjusting the image according to the analysis result. The server or the client analyzes the content itself through scene analysis technologies (such as face recognition, human body contour recognition, natural language processing, and speech recognition) and identifies the scene depicted in the content. The identified scene is then matched with a corresponding feedback effect (for example, if natural language processing recognizes that the text depicts a rainy day, a rain special effect is matched as the feedback). When the user interacts with the content in that scene, the feedback effect is presented (for example, a rain effect appears on the screen when the user browses a rainy scene). The image may also be adjusted through additional data attached to the content: the content creator may attach pre-made feedback effects, such as a character smiling, blinking, or blowing a kiss. The content and its pre-made feedback effects reach the user's terminal (mobile phone or computer) together and are preloaded. When the user interacts with the content, the pre-made feedback effect is triggered; preloading is used so that the feedback effect plays smoothly and naturally.
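The pre-made feedback effect approach with preloading can be sketched as follows; the class, media, and effect names are illustrative:

```python
# Content arrives bundled with pre-made feedback effects; effects are
# preloaded so they can be triggered smoothly on interaction.
class ContentItem:
    def __init__(self, media, effects):
        self.media = media
        self.effects = effects      # interaction name -> effect payload
        self.preloaded = {}

    def preload(self):
        # A real client would decode and cache effect assets here;
        # this sketch just marks each effect as ready.
        self.preloaded = {name: True for name in self.effects}

    def on_interact(self, action):
        """Return the pre-made effect to play for this interaction, if ready."""
        effect = self.effects.get(action)
        if effect is not None and self.preloaded.get(action):
            return effect
        return None

item = ContentItem("video_1", {"like": "smile_animation"})
item.preload()
played = item.on_interact("like")
```

Liking the preloaded content triggers its bundled smile animation, while interactions with no attached effect produce no feedback.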
In a specific example, fig. 2E is a detailed interface illustration of a still picture. As shown in fig. 2E, the image to be adjusted is shown on the left side of the figure; after the user clicks the like control, the expression of the character in the image to be adjusted is adjusted to the pouting expression shown on the right side of the figure. Besides still pictures, the system can also recognize dynamic video pictures and change the facial expressions and special effects in the content. Fig. 2F is a detailed interface illustration of a video. As shown in fig. 2F, the image to be adjusted is shown on the left side of the figure; after the user clicks the like control, the expression of the character in the image to be adjusted is adjusted to the pouting expression shown on the right side of the figure. When the content consumer interacts with the content, the content feeds back to the consumer according to the interaction form and the current scene, so that the content becomes lively and vivid. Forms of content consumer interaction include: browse, click, pop-up, share, like, comment, forward, play, and reward (not limited to virtual currency or legal currency).
According to the technical scheme of this embodiment, an image and an interactive control associated with the image are displayed in a list interface of a client or a detail interface of a list element; a triggering operation for the interactive control is received, and the image associated with the triggered interactive control is determined; the associated image is acquired as the image to be adjusted; the target character expression corresponding to the interactive control is determined; and the character expression in the image to be adjusted is adjusted according to the target character expression and the adjusted image is displayed. When the user performs an interactive operation through the interactive control, the character expression in the image is thus adjusted in an associated manner, so that the character expression of the image matches the interaction information between the user and the image embodied by the interactive control, which improves the efficiency and richness of information transmission and further improves the entertainment experience of the user.
EXAMPLE III
Fig. 3A is a flowchart of a control method in a third embodiment of the present disclosure. In this embodiment, in combination with the above embodiments, determining the image adjustment information corresponding to the interactive control includes: if the interactive control is a first-type interactive control, acquiring a first image adjustment strategy matched with the first-type interactive control; and if the interactive control is a second-type interactive control, acquiring a second image adjustment strategy matched with the second-type interactive control, where the first-type and second-type interactive controls are different interactive controls.
As shown in fig. 3A, the method of this embodiment specifically includes the following steps:
s310, displaying the image and the interactive control related to the image in a list interface of the client or a detail interface of the list element.
And S320, receiving a triggering operation aiming at the interactive control, and determining an image associated with the triggered interactive control.
S330, acquiring the associated image as an image to be adjusted.
And S340, judging whether the interactive control is the first type of interactive control, if so, executing S350, and if not, executing S360.
And S350, acquiring a first image adjusting strategy matched with the first type of interactive control.
The first type of interactive control and the second type of interactive control are different interactive controls.
The first type of interactive control may be an interactive control clicked when the user's preference for the image is greater than or equal to a set threshold, for example, a like control or a reward control; it may also be a preset type of interactive control, for example, the sharing control, the forwarding control, and the playing control may be set as first-type interactive controls; or it may be an interactive control clicked when the user's preference for the image is less than the set threshold, for example, a dislike control, which is not limited in this disclosure.
Specifically, if the interactive control is a first-type interactive control, a first image adjustment strategy matched with the first-type interactive control is obtained. It may first be judged whether the triggered interactive control is a first-type interactive control; a database of the correspondence between interactive controls and image adjustment strategies is established in advance, and if the triggered interactive control is a first-type interactive control, the database is queried to obtain the first image adjustment strategy corresponding to the first-type interactive control.
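The pre-built correspondence database between interactive controls and adjustment strategies can be sketched as a simple lookup table. The control names and strategy names below are illustrative assumptions; the patent only requires that such a correspondence exist and be queryable.

```python
# Hypothetical correspondence database: triggered control -> matched
# image adjustment strategy (first-type controls map to the first
# strategy, second-type controls to the second).
POLICY_DB = {
    "like": "first_image_adjustment_strategy",
    "reward": "first_image_adjustment_strategy",
    "dislike": "second_image_adjustment_strategy",
}

def get_adjustment_policy(control_name):
    """Query the database for the strategy matched to the triggered control."""
    policy = POLICY_DB.get(control_name)
    if policy is None:
        raise KeyError(f"no adjustment strategy registered for {control_name!r}")
    return policy
```

The same lookup serves both branches of the flowchart (S350 and S370); only the stored value differs per control type.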
And S380, adjusting the image to be adjusted based on the first image adjusting strategy, and displaying the adjusted image.
And S360, judging whether the interactive control is the second type interactive control, and if so, executing S370.
S370, obtaining a second image adjusting strategy matched with the second type of interactive control.
The second type of interactive control may be an interactive control clicked when the user's preference for the image is greater than or equal to a set threshold, for example, a like control or a reward control; it may also be a preset type of interactive control, for example, the sharing control, the forwarding control, and the playing control may be set as second-type interactive controls; or it may be an interactive control clicked when the user's preference for the image is less than the set threshold, for example, a dislike control. It should be noted that the second type of interactive control differs from the first type: if the first-type interactive control is one clicked when the user's preference for the image is greater than or equal to the set threshold, the second-type interactive control is one clicked when the preference is less than the set threshold; if the first-type interactive control is one clicked when the preference is less than the set threshold, the second-type interactive control is one clicked when the preference is greater than or equal to the set threshold. The two types do not overlap; that is, if an interactive control A belongs to the first type, it does not belong to the second type. The two types may also be interactive controls expressing opposite emotions; for example, if the like control is a first-type interactive control, the dislike control is a second-type interactive control, which is not limited by the embodiment of the present disclosure.
Specifically, if the interactive control is a second-type interactive control, a second image adjustment strategy matched with the second-type interactive control is obtained. It may first be judged whether the triggered interactive control is a second-type interactive control; a database of the correspondence between interactive controls and image adjustment strategies is established in advance, and if the triggered interactive control is a second-type interactive control, the database is queried to obtain the second image adjustment strategy corresponding to the second-type interactive control.
And S390, adjusting the image to be adjusted based on the second image adjustment strategy, and displaying the adjusted image.
Optionally, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image includes:
and adjusting the image to be adjusted according to the first image adjustment strategy, and displaying the adjusted image.
Specifically, if the triggered interactive control is a first-class interactive control, a first image adjustment strategy matched with the first-class interactive control is obtained, an image to be adjusted is adjusted according to the first image adjustment strategy, and the adjusted image is displayed.
Or;
and adjusting the image to be adjusted according to the second image adjustment strategy, and displaying the adjusted image, wherein the first image adjustment strategy is different from the second image adjustment strategy.
Specifically, if the triggered interactive control is a second type of interactive control, a second image adjustment strategy matched with the second type of interactive control is obtained, the image to be adjusted is adjusted according to the second image adjustment strategy, and the adjusted image is displayed.
Optionally, the first type of interactive control is an interactive control clicked when the user's preference for the image is greater than or equal to a set threshold, and the second type is an interactive control clicked when the preference is less than the set threshold. The set threshold may be understood as an expectation value of the user; the expectation values of different users may be the same or different. The expectation value serves only to distinguish whether the user triggers a first-type or a second-type interactive control and need not be stored in the terminal. The terminal simply detects which type of control is triggered: when a first-type interactive control is triggered, the terminal can infer that the user's preference for the image is greater than or equal to the set threshold; when a second-type interactive control is triggered, it can infer that the preference is less than the set threshold.
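The key point of the paragraph above — that the threshold itself is never stored and preference is inferred purely from which class of control was triggered — can be sketched as follows. The control groupings are illustrative assumptions.

```python
# Hypothetical class membership; which concrete controls belong to each
# class is a product decision, not fixed by the patent.
FIRST_TYPE = {"like", "reward"}    # clicked when preference >= threshold
SECOND_TYPE = {"dislike"}          # clicked when preference < threshold

def infer_preference(control_name):
    """Infer the user's preference relative to the (implicit) threshold
    from the class of the triggered control. No threshold value is
    stored or compared on the terminal."""
    if control_name in FIRST_TYPE:
        return "at_or_above_threshold"
    if control_name in SECOND_TYPE:
        return "below_threshold"
    return "unknown"
```

This is why differing per-user expectation values cause no implementation burden: the terminal only ever sees the triggered control, never the value.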
Optionally, the first image adjustment strategy is to adjust the expression of the character in the image to be adjusted to a happy expression, and the second image adjustment strategy is to adjust the expression of the character in the image to be adjusted to a sad expression.
In particular, feedback may be obtained from real-time analysis of the scene. The content system can analyze and process the picture in real time using face recognition technology, and when an interactive behavior occurs, the character in the image can be made to respond with expressions or actions (such as smiling, pouting, or blinking) to feed back the interaction of the content consumer. Similarly, the content system may analyze scenes in text based on natural language analysis techniques, such as weather (rain, snow, wind, fog, etc.) or environment (in darkness, on a street, at sea, in a forest), and make feedback actions matched to the scene (e.g., raindrops on the screen when it rains, snowflakes when it snows, fireflies in the dark, etc.). Feedback may also be obtained from additional data of the content. The content creator may attach additional data during the authoring process to form feedback on the user's interactive behavior. For example, in the creation of a comic, an animation can be added to a certain page, and when the user performs an interactive behavior, the animation gives feedback, such as a character smiling, blinking, pouting, or kissing. The embodiment of the disclosure makes the content vivid and lively in the consumption process through the feedback behavior generated when the user interacts with it, thereby improving the user experience and the user's enthusiasm for interacting with the content.
In a specific example, when the user performs an interactive operation through the interactive control, the image is adjusted in an associated manner by adding a special effect, so that the display effect of the image matches the interaction information between the user and the image embodied by the interactive control. As shown in fig. 3B, after the user clicks the like control, a special-effect cat head is added at the face position of the image to be adjusted; that is, the face position in the image to be adjusted is adjusted to a cat head. The specific adjustment may be to superimpose a cat head image of the same size on the face position of the image to be adjusted, or to replace the face position of the image to be adjusted with a cat head image of the same size. The special effect may also take other forms, for example an element flying out of the mouth of the character in the image to be adjusted; the embodiment of the disclosure does not specifically limit the form of the special effect.
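The superimpose-vs-replace distinction above can be sketched with a simple alpha composite over the detected face region: with a fully opaque effect image this reduces to outright replacement, while a partially transparent effect is superimposed. The NumPy representation and function name are assumptions for illustration only.

```python
import numpy as np

def overlay_effect(image, face_box, effect_rgba):
    """Composite an equal-sized effect image (RGBA, e.g. a cat head) over
    the face region of the image. With alpha == 255 everywhere this is
    equivalent to replacing the region, the other mode described above."""
    out = image.astype(np.float32).copy()
    t, l, b, r = face_box
    rgb = effect_rgba[..., :3].astype(np.float32)
    alpha = effect_rgba[..., 3:4].astype(np.float32) / 255.0
    # Standard "over" compositing restricted to the face region.
    out[t:b, l:r] = alpha * rgb + (1.0 - alpha) * out[t:b, l:r]
    return out.astype(np.uint8)
```

The face box would come from the face recognition step described earlier; the effect image is assumed pre-scaled to the same size as that region.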
According to the technical scheme of this embodiment, an image and an interactive control associated with the image are displayed in a list interface of a client or a detail interface of a list element; a triggering operation for the interactive control is received, and the image associated with the triggered interactive control is determined; the associated image is acquired as the image to be adjusted; if the interactive control is a first-type interactive control, a first image adjustment strategy matched with the first-type interactive control is acquired; if the interactive control is a second-type interactive control, a second image adjustment strategy matched with the second-type interactive control is acquired, the first-type and second-type interactive controls being different interactive controls; and the image to be adjusted is adjusted based on the first or second image adjustment strategy and the adjusted image is displayed. When the user performs an interactive operation through an interactive control, the interactive controls having been classified in advance and different image adjustment strategies set for different types of interactive controls, the image is adjusted in an associated manner, so that the display effect of the image matches the interaction information between the user and the image embodied by the interactive control, which improves the efficiency and richness of information transmission and further improves the entertainment experience of the user.
Example four
Fig. 4 is a schematic structural diagram of a control device according to a fourth embodiment of the present disclosure. This embodiment is applicable to control scenarios. The apparatus may be implemented in software and/or hardware and may be integrated in any device providing a control function. As shown in fig. 4, the control apparatus specifically includes: a first display module 410, an image determination module 420, an acquisition module 430, an information determination module 440, and a second display module 450.
The first display module 410 is configured to display an image and an interaction control associated with the image in a list interface of a client or a detail interface of a list element;
an image determining module 420, configured to receive a triggering operation for an interaction control, and determine an image associated with the triggered interaction control;
an obtaining module 430, configured to obtain the associated image as an image to be adjusted;
the information determining module 440 is configured to determine image adjustment information corresponding to the interactive control;
a second display module 450, configured to adjust the image to be adjusted based on the image adjustment information, and display the adjusted image, where the image adjustment information matches the interaction information conveyed by the interaction control.
Optionally, the list interface includes at least one image, and the at least one image is arranged in a list.
Optionally, the information determining module includes:
the expression determining unit is used for determining the target character expression corresponding to the interaction control;
accordingly, the second display module includes:
and the image display unit is used for adjusting the character expression in the image to be adjusted according to the target character expression and displaying the adjusted image.
Optionally, the information determining module includes:
the first obtaining unit is used for obtaining a first image adjusting strategy matched with the first type of interactive control if the interactive control is the first type of interactive control;
and the second obtaining unit is used for obtaining a second image adjustment strategy matched with the second type of interactive control if the interactive control is the second type of interactive control, wherein the first type of interactive control and the second type of interactive control are different interactive controls.
Optionally, the second display module includes:
the first display unit is used for adjusting the image to be adjusted according to the first image adjustment strategy and displaying the adjusted image;
or;
and the second display unit is used for adjusting the image to be adjusted according to the second image adjustment strategy and displaying the adjusted image, wherein the first image adjustment strategy is different from the second image adjustment strategy.
Optionally, the first type of interactive control is an interactive control clicked when the preference degree of the user to the image is greater than or equal to a set threshold, and the second type of interactive control is an interactive control clicked when the preference degree of the user to the image is less than the set threshold.
Optionally, the first image adjustment strategy is to adjust the expression of the character in the image to be adjusted to a happy expression, and the second image adjustment strategy is to adjust the expression of the character in the image to be adjusted to a sad expression.
Optionally, the position relationship between the interactive control and the image to be adjusted includes one or more of the following: the interactive control is located in the bottom area of the image to be adjusted; the interactive control is located in the right area of the image to be adjusted; and the interactive control floats over the image to be adjusted.
Optionally, the image determining module is specifically configured to:
receiving a trigger operation aiming at an interactive control, and acquiring position information of the interactive control;
and determining an image associated with the triggered interaction control according to the position information.
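The position-based association in the image determining module can be sketched as a nearest-region search: the control's position is compared against the on-screen regions of the displayed images and the closest one is taken as the associated image. Representing image regions by their center coordinates is an assumption made for this sketch.

```python
def find_associated_image(control_pos, image_centers):
    """Determine the image associated with a triggered control from the
    control's position information.

    control_pos: (x, y) position of the triggered control.
    image_centers: {image_id: (cx, cy)} center of each displayed image.
    Returns the id of the image whose center is nearest to the control.
    """
    def dist_sq(center):
        return ((center[0] - control_pos[0]) ** 2
                + (center[1] - control_pos[1]) ** 2)
    return min(image_centers, key=lambda img_id: dist_sq(image_centers[img_id]))
```

In a list interface, where controls sit directly below or beside their image, the nearest center reliably identifies the intended list element.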
Optionally, the image adjustment information includes: an adjustment region in the image to be adjusted;
correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image includes:
and adjusting the adjusting area in the image to be adjusted, and displaying the adjusted image.
Optionally, the interactive control includes one or more of: a browsing control, a pop-up control, a sharing control, a like control, a comment control, a forwarding control, a playing control, and a reward control.
Optionally, the image to be adjusted includes one or more of the following:
a still picture;
part or all of the frames in the moving picture;
some or all of the frames in the video.
The product can execute the method provided by any embodiment of the disclosure, and has corresponding functional modules and beneficial effects of the execution method.
According to the technical scheme of this embodiment, an image and an interactive control associated with the image are displayed in a list interface of a client or a detail interface of a list element; a triggering operation for the interactive control is received, and the image associated with the triggered interactive control is determined; the associated image is acquired as the image to be adjusted; the image adjustment information corresponding to the interactive control is determined; and the image to be adjusted is adjusted based on the image adjustment information and the adjusted image is displayed, where the image adjustment information matches the interaction information conveyed by the interactive control. The image is thus adjusted in an associated manner when the user performs an interactive operation through the interactive control, so that the display effect of the image matches the interaction information between the user and the image embodied by the interactive control, which improves the efficiency and richness of information transmission and further improves the entertainment experience of the user.
EXAMPLE five
Referring now to FIG. 5, a block diagram of an electronic device 500 suitable for use in implementing embodiments of the present disclosure is shown. The electronic device 500 in the disclosed embodiment may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device 500 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: displaying an image and an interactive control associated with the image in a list interface of a client or a detail interface of a list element; receiving a triggering operation aiming at the interaction control, and determining an image associated with the triggered interaction control; acquiring the associated image as an image to be adjusted; determining image adjustment information corresponding to the interactive control; and adjusting the image to be adjusted based on the image adjustment information, and displaying the adjusted image, wherein the image adjustment information is matched with the interaction information transmitted by the interaction control.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or electronic device. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. In some cases, the name of a unit does not constitute a limitation on the unit itself.
EXAMPLE six
The disclosed embodiments also provide a computer-readable storage medium on which a computer program is stored, the program, when executed by a control apparatus, implementing a control method according to a first embodiment of the disclosure, the method including: displaying an image and an interactive control associated with the image in a list interface of a client or a detail interface of a list element; receiving a triggering operation aiming at the interaction control, and determining an image associated with the triggered interaction control; acquiring the associated image as an image to be adjusted; determining image adjustment information corresponding to the interactive control; and adjusting the image to be adjusted based on the image adjustment information, and displaying the adjusted image, wherein the image adjustment information is matched with the interaction information transmitted by the interaction control.
Of course, the computer program stored on the computer readable storage medium provided by the embodiments of the present disclosure is not limited to implement the method operations described above when being executed, and may also implement the relevant operations in the control method provided by any embodiments of the present disclosure.
From the above description of the embodiments, it is obvious for a person skilled in the art that the present disclosure can be implemented by software and necessary general hardware, and certainly can be implemented by hardware, but in many cases, the former is a better embodiment. Based on such understanding, the technical solutions of the present disclosure may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, an electronic device, or a network device) to execute the methods according to the embodiments of the present disclosure.
It should be noted that, in the embodiment of the control device, the included units and modules are merely divided according to the functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the present disclosure.
The foregoing description covers only the preferred embodiments of the disclosure and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to the particular combinations of features described above, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, solutions in which the above features are replaced with features of similar function disclosed in (but not limited to) this disclosure.

Claims (12)

1. A control method, comprising:
displaying a plurality of images, and interactive controls associated with the images, in a list interface of a client or a detail interface of a list element, wherein the images are arranged in a list;
receiving a triggering operation on an interactive control, and determining the image associated with the triggered interactive control;
wherein receiving the triggering operation on the interactive control and determining the image associated with the triggered interactive control comprises:
searching a pre-established database of correspondences between interactive controls and images to obtain the image associated with the triggered interactive control;
or determining the associated image directly from the triggered interactive control, according to a pre-established association between the interactive control and the image;
or determining, according to the position area where the triggered interactive control is located, the image closest to that position area, and taking that image as the image associated with the triggered interactive control;
acquiring the associated image as an image to be adjusted;
determining image adjustment information corresponding to the interactive control;
adjusting the image to be adjusted based on the image adjustment information, and displaying the adjusted image, wherein adjusting the image to be adjusted comprises analyzing the scene of the image to be adjusted in real time and matching the image to be adjusted with a corresponding feedback behavior; and wherein the image adjustment information matches the interaction information conveyed by the interactive control.
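The third association strategy in claim 1 (choosing the image nearest to the triggered control's position area) can be sketched as a nearest-neighbour lookup. The coordinate model and all names are illustrative assumptions, not part of the claim.

```python
# Sketch of claim 1's position-based association: pick the image whose
# position is closest to the triggered control. Coordinates are hypothetical.

def nearest_image(control_pos, image_positions):
    """Return the name of the image whose (x, y) position is closest to the control."""
    def dist_sq(pos):
        return (pos[0] - control_pos[0]) ** 2 + (pos[1] - control_pos[1]) ** 2
    return min(image_positions, key=lambda name: dist_sq(image_positions[name]))

# Three list items stacked vertically; the control sits just above img_b.
images = {"img_a": (0, 0), "img_b": (0, 300), "img_c": (0, 600)}
print(nearest_image((10, 290), images))  # img_b
```

In a real client the positions would come from the layout engine; a squared-distance comparison avoids the square root without changing which image wins.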
2. The method of claim 1, wherein determining the image adjustment information corresponding to the interactive control comprises:
determining a target character expression corresponding to the interactive control;
correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image comprises:
adjusting the character expression in the image to be adjusted according to the target character expression, and displaying the adjusted image.
3. The method of claim 1, wherein determining the image adjustment information corresponding to the interaction control comprises:
if the interactive control is a first type of interactive control, acquiring a first image adjustment strategy matched with the first type of interactive control; or
if the interactive control is a second type of interactive control, acquiring a second image adjustment strategy matched with the second type of interactive control, wherein the first type of interactive control and the second type of interactive control are different interactive controls;
correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image comprises:
adjusting the image to be adjusted according to the first image adjustment strategy, and displaying the adjusted image;
or
adjusting the image to be adjusted according to the second image adjustment strategy, and displaying the adjusted image, wherein the first image adjustment strategy is different from the second image adjustment strategy.
4. The method of claim 3, wherein the first type of interactive control is an interactive control clicked when the user's preference for the image is greater than or equal to a set threshold, and the second type of interactive control is an interactive control clicked when the user's preference for the image is less than the set threshold.
5. The method of claim 4, wherein the first image adjustment strategy is to adjust a character expression in the image to be adjusted to a happy expression, and the second image adjustment strategy is to adjust a character expression in the image to be adjusted to a sad expression.
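Claims 3-5 together describe a dispatch: the control type encodes whether the user's preference meets a set threshold, and the type selects the adjustment strategy. A minimal sketch, assuming a numeric preference score and the threshold value below (both illustrative):

```python
# Sketch of claims 3-5: map a preference degree to the expression adjustment.
# The threshold value 0.5 and the string labels are illustrative assumptions.

PREFERENCE_THRESHOLD = 0.5

def adjustment_strategy(preference):
    """Select the image adjustment strategy from the user's preference degree."""
    if preference >= PREFERENCE_THRESHOLD:
        # First type of control -> first strategy: happy expression.
        return "happy"
    # Second type of control -> second strategy: sad expression.
    return "sad"

print(adjustment_strategy(0.9))  # happy
print(adjustment_strategy(0.1))  # sad
```

In practice the client would never see a raw score; it would simply attach the "happy" strategy to the like-type control and the "sad" strategy to the dislike-type control.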
6. The method of claim 1, wherein the positional relationship between the interactive control and the image to be adjusted comprises one or more of: the interactive control being located in the bottom area of the image to be adjusted, the interactive control being located in the right area of the image to be adjusted, and the interactive control floating over the image to be adjusted.
7. The method of claim 1, wherein the image adjustment information comprises: an adjustment region in the image to be adjusted;
correspondingly, adjusting the image to be adjusted based on the image adjustment information and displaying the adjusted image comprises:
adjusting the adjustment region in the image to be adjusted, and displaying the adjusted image.
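Claim 7 restricts the adjustment to a named region of the image. A minimal sketch, modelling the image as a 2-D list of pixel values and using a brightness bump as the (purely illustrative) adjustment:

```python
# Sketch of claim 7: only the pixels inside the adjustment region change.
# The region format (top, left, bottom, right) and the delta are assumptions.

def adjust_region(image, region, delta=10):
    """Brighten only the pixels inside region = (top, left, bottom, right)."""
    top, left, bottom, right = region
    for y in range(top, bottom):
        for x in range(left, right):
            image[y][x] += delta
    return image

img = [[0] * 4 for _ in range(4)]
adjust_region(img, (0, 0, 2, 2))
print(img[0][0], img[3][3])  # 10 0
```

Pixels outside the region are untouched, which is exactly the distinction the claim draws between the adjustment region and the rest of the image.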
8. The method of any one of claims 1-7, wherein the interactive control comprises one or more of: a browsing control, a pop-up control, a sharing control, a like control, a comment control, a forwarding control, a playing control, and a reward control.
9. The method according to any one of claims 1-7, wherein the image to be adjusted comprises one or more of:
a still picture;
some or all frames of an animated picture;
some or all frames of a video.
10. A control device, comprising:
a first display module, configured to display a plurality of images, and interactive controls associated with the images, in a list interface of a client or a detail interface of a list element, wherein the images are arranged in a list;
an image determining module, configured to receive a triggering operation on an interactive control and determine the image associated with the triggered interactive control;
wherein receiving the triggering operation on the interactive control and determining the image associated with the triggered interactive control comprises:
searching a pre-established database of correspondences between interactive controls and images to obtain the image associated with the triggered interactive control;
or determining the associated image directly from the triggered interactive control, according to a pre-established association between the interactive control and the image;
or determining, according to the position area where the triggered interactive control is located, the image closest to that position area, and taking that image as the image associated with the triggered interactive control;
an acquisition module, configured to acquire the associated image as an image to be adjusted;
an information determining module, configured to determine image adjustment information corresponding to the interactive control; and
a second display module, configured to adjust the image to be adjusted based on the image adjustment information and display the adjusted image, wherein adjusting the image to be adjusted comprises analyzing the scene of the image to be adjusted in real time and matching the image to be adjusted with a corresponding feedback behavior; and wherein the image adjustment information matches the interaction information conveyed by the interactive control.
11. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1-9 when executing the program.
12. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-9.
CN201910289631.2A 2019-04-11 2019-04-11 Control method, device, equipment and storage medium Active CN109992187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910289631.2A CN109992187B (en) 2019-04-11 2019-04-11 Control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910289631.2A CN109992187B (en) 2019-04-11 2019-04-11 Control method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109992187A CN109992187A (en) 2019-07-09
CN109992187B true CN109992187B (en) 2021-12-10

Family

ID=67133304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910289631.2A Active CN109992187B (en) 2019-04-11 2019-04-11 Control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109992187B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827378B (en) * 2019-10-31 2023-06-09 北京字节跳动网络技术有限公司 Virtual image generation method, device, terminal and storage medium
CN111831203B (en) * 2020-07-03 2022-05-17 Oppo广东移动通信有限公司 Information processing method, information processing apparatus, storage medium, and electronic device
CN112131487A (en) * 2020-09-11 2020-12-25 深圳市大成天下信息技术有限公司 Interaction method and computing device
CN113419800B (en) * 2021-06-11 2023-03-24 北京字跳网络技术有限公司 Interaction method, device, medium and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101363691B1 (en) * 2006-01-17 2014-02-14 가부시키가이샤 시세이도 Makeup simulation system, makeup simulation device, makeup simulation method, and makeup simulation program
CN107809375A (en) * 2017-10-25 2018-03-16 陕西舜洋电子科技有限公司 Information interacting method and storage medium based on internet social networks
CN107728887A (en) * 2017-10-25 2018-02-23 陕西舜洋电子科技有限公司 The information interaction system of internet social networks
CN108769814B (en) * 2018-06-01 2022-02-01 腾讯科技(深圳)有限公司 Video interaction method, device, terminal and readable storage medium
CN109302631B (en) * 2018-09-20 2021-06-08 阿里巴巴(中国)有限公司 Video interface display method and device

Also Published As

Publication number Publication date
CN109992187A (en) 2019-07-09

Similar Documents

Publication Publication Date Title
CN109992187B (en) Control method, device, equipment and storage medium
US10924800B2 (en) Computerized system and method for automatically detecting and rendering highlights from streaming videos
CN109688463B (en) Clip video generation method and device, terminal equipment and storage medium
CN110827378B (en) Virtual image generation method, device, terminal and storage medium
CN107924414B (en) Personal assistance to facilitate multimedia integration and story generation at a computing device
US11231838B2 (en) Image display with selective depiction of motion
JP2022523606A (en) Gating model for video analysis
CN111708941A (en) Content recommendation method and device, computer equipment and storage medium
CN111930994A (en) Video editing processing method and device, electronic equipment and storage medium
CN109348277B (en) Motion pixel video special effect adding method and device, terminal equipment and storage medium
CN111666898B (en) Method and device for identifying class to which vehicle belongs
KR102454421B1 (en) Personalized Auto Video Crop
CN113806306B (en) Media file processing method, device, equipment, readable storage medium and product
KR20230079413A (en) Ad breakpoints in video within the messaging system
KR20230079261A (en) Inserting advertisements into videos within the messaging system
CN111158924A (en) Content sharing method and device, electronic equipment and readable storage medium
CN110413834B (en) Voice comment modification method, system, medium and electronic device
CN112308950A (en) Video generation method and device
CN115022702B (en) Display method, device, equipment and medium for live broadcast room gift
CN112927326A (en) Image processing method and device, electronic equipment and computer readable storage medium
CN115994266A (en) Resource recommendation method, device, electronic equipment and storage medium
CN111506184A (en) Avatar presenting method and electronic equipment
CN111507142A (en) Facial expression image processing method and device and electronic equipment
US10126821B2 (en) Information processing method and information processing device
CN111246246A (en) Video playing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant