CN115379274B - Picture-based interaction method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115379274B
CN115379274B
Authority
CN
China
Prior art keywords
picture
interaction
control
target
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210988384.7A
Other languages
Chinese (zh)
Other versions
CN115379274A (en)
Inventor
王乐言
高天宇
陈振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202210988384.7A
Publication of CN115379274A
Application granted
Publication of CN115379274B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/4221Dedicated function buttons, e.g. for the control of an EPG, subtitles, aspect ratio, picture-in-picture or teletext
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4858End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

The disclosure provides a picture-based interaction method and device, an electronic device, and a storage medium, belonging to the field of Internet technology. The method includes the following steps: displaying a picture display interface; in response to a triggering operation on the thumbnail of a target picture, switching the picture display interface to a detail interface of the target picture, and displaying a picture display area and a control area in the detail interface; and in response to a triggering operation on a target interaction control among the at least one interaction control, performing a target interaction behavior with the release object of the target picture. With this scheme, a browsing object viewing the target picture can interact with the release object of the target picture at any time through the controls in the control area, improving interaction efficiency.

Description

Picture-based interaction method and device, electronic equipment and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular, to a picture-based interaction method and apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology, people can view multimedia resources such as articles, pictures, and videos shared by others through terminals such as mobile phones. Taking the display of a long picture larger than the interface size as an example, object A can browse a long picture published by object B by inputting sliding operations in the interface. Only when object A browses to the end of the long picture does the terminal display the description information of the long picture and an interaction control in the interface, the interaction control being used, once triggered, to interact with object B. With this display mode, object A can interact with object B only after finishing browsing the long picture, so interaction efficiency is low.
Disclosure of Invention
The disclosure provides a picture-based interaction method and apparatus, an electronic device, and a storage medium, so that a browsing object viewing a target picture can interact with the release object of the target picture at any time through the controls in a control area, improving interaction efficiency. The technical scheme of the present disclosure is as follows:
according to an aspect of the embodiments of the present disclosure, there is provided a picture-based interaction method, including:
displaying a picture display interface, wherein the picture display interface displays thumbnail images of a plurality of pictures;
in response to a triggering operation on the thumbnail of a target picture, switching the picture display interface to a detail interface of the target picture, and displaying a picture display area and a control area in the detail interface, wherein the size of the target picture is larger than that of the picture display area, the picture display area displays the picture content of a partial area of the target picture, and the control area displays at least one interaction control;
and in response to a triggering operation on a target interaction control among the at least one interaction control, performing a target interaction behavior with the release object of the target picture, wherein the target interaction behavior corresponds to the target interaction control.
According to another aspect of the embodiments of the present disclosure, there is provided a picture-based interaction apparatus, including:
a display unit configured to display a picture display interface, the picture display interface displaying thumbnails of a plurality of pictures;
the display unit being further configured to, in response to a triggering operation on the thumbnail of a target picture, switch the picture display interface to a detail interface of the target picture, and display a picture display area and a control area in the detail interface, wherein the size of the target picture is larger than that of the picture display area, the picture display area displays the picture content of a partial area of the target picture, and the control area displays at least one interaction control;
and a behavior execution unit configured to, in response to a triggering operation on a target interaction control among the at least one interaction control, perform a target interaction behavior with the release object of the target picture, wherein the target interaction behavior corresponds to the target interaction control.
In some embodiments, the display unit is further configured to display the picture display area in the detail interface; update the picture content displayed in the picture display area based on a sliding operation on the picture display area; and display the control area in the detail interface when the picture content displayed in the picture display area is related to the picture theme.
In some embodiments, the display unit is further configured to display the picture display area in the detail interface; update the picture content displayed in the picture display area based on a sliding operation on the picture display area; and display the control area in the detail interface when the picture content displayed in the picture display area is related to an interest of the browsing object viewing the target picture.
In some embodiments, the display unit is further configured to display the picture display area in the detail interface; and display the control area in the detail interface in response to the browsing object viewing the target picture belonging to a target object type.
In some embodiments, the control area displays a plurality of interaction controls;
the display unit is further configured to display the picture display area in the detail interface; and display, in the detail interface, the plurality of interaction controls in the control area based on the behavior priorities of a plurality of interaction behaviors, wherein the display position of each interaction control in the control area is related to the behavior priority of the interaction behavior of that interaction control.
In some embodiments, the location of the control area is any one of the following:
the control area is displayed at the top of the detail interface;
the control area is displayed at the bottom of the detail interface;
the control area is displayed on the left side of the detail interface;
and the control area is displayed on the right side of the detail interface.
In some embodiments, the display unit is further configured to hide the control area in the detail interface in response to a region hiding operation.
In some embodiments, the region hiding operation includes any one of:
triggering operation of the area hiding control;
a first sliding operation for sliding the target picture, the sliding distance of the first sliding operation being not less than a first distance;
a second sliding operation for sliding the control area, the sliding distance of the second sliding operation being not less than a second distance;
and a third sliding operation for inputting a first sliding track in the detail interface.
In some embodiments, the behavior execution unit is further configured to determine, in response to detecting a triggering operation of a combination key, a first interaction behavior, the first interaction behavior being the interaction behavior indicated by the combination key; and perform the first interaction behavior with the release object.
In some embodiments, the behavior execution unit is further configured to determine, in a case where a second sliding track is detected in the detail interface, a second interaction behavior, the second interaction behavior being the interaction behavior indicated by the second sliding track; and perform the second interaction behavior with the release object.
According to another aspect of the embodiments of the present disclosure, there is provided an electronic device including:
one or more processors;
a memory for storing program code executable by the one or more processors;
wherein the one or more processors are configured to execute the program code to implement the above picture-based interaction method.
According to another aspect of the embodiments of the present disclosure, there is provided a storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the above picture-based interaction method.
According to another aspect of the disclosed embodiments, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the picture-based interaction method of the above aspects.
The embodiments of the disclosure provide a picture-based interaction method in which a control area is displayed alongside a partial area of the target picture in the detail interface of the target picture, so that a browsing object viewing the target picture can interact with the release object of the target picture at any time through the controls in the control area, improving interaction efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
Fig. 1 is a schematic diagram illustrating an implementation environment of a picture-based interaction method according to an exemplary embodiment.
Fig. 2 is a flow chart illustrating a picture-based interaction method according to an exemplary embodiment.
FIG. 3 is a flowchart illustrating another picture-based interaction method according to an example embodiment.
FIG. 4 is a schematic diagram of a detail interface provided in accordance with an exemplary embodiment.
Fig. 5 is a schematic diagram of another detailed interface provided in accordance with an exemplary embodiment.
FIG. 6 is a schematic diagram of a follow control provided in accordance with an exemplary embodiment.
Fig. 7 is a block diagram illustrating a picture-based interaction device, according to an example embodiment.
Fig. 8 is a block diagram of a terminal according to an exemplary embodiment.
Fig. 9 is a block diagram of a server, according to an example embodiment.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the disclosure described herein can be practiced in sequences other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with aspects of the disclosure as detailed in the appended claims.
It should be noted that, the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals related to the present disclosure are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of relevant data is required to comply with relevant laws and regulations and standards of relevant countries and regions. For example, the pictures referred to in this disclosure are taken with sufficient authorization.
In the embodiment of the present disclosure, an electronic device may be provided as a terminal or a server, and when the electronic device is provided as a terminal, the solution provided by the present disclosure may be implemented by the terminal; when provided as a server, the aspects provided by the present disclosure may be implemented by the server; the solution provided by the present disclosure may also be implemented by the interaction of the server and the terminal, which is not limited by the embodiments of the present disclosure.
Fig. 1 is a schematic diagram illustrating an implementation environment of a picture-based interaction method according to an exemplary embodiment. Taking an example in which the electronic device is provided as a server, referring to fig. 1, the implementation environment specifically includes: a terminal 101 and a server 102.
The terminal 101 may be at least one of a smart phone, a smart watch, a desktop computer, a laptop computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, and an MP4 (Moving Picture Experts Group Audio Layer IV) player. An application may be installed and run on the terminal 101, and a user may log in to the application through the terminal 101 to obtain the services provided by the application. The terminal 101 may be connected to the server 102 through a wireless network or a wired network.
The terminal 101 may refer broadly to one of a plurality of terminals; this embodiment is illustrated with the terminal 101 only. Those skilled in the art will recognize that the number of terminals may be greater or smaller: there may be only a few terminals, or tens or hundreds of them, or more. The number and device types of the terminals are not limited in the embodiments of the present disclosure.
Server 102 may be at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 102 may be connected to the terminal 101 and other terminals via a wireless network or a wired network. Alternatively, the number of servers may be greater or lesser, which is not limited by the embodiments of the present disclosure. Of course, the server 102 may also include other functional servers to provide more comprehensive and diverse services.
Fig. 2 is a flowchart illustrating a picture-based interaction method according to an exemplary embodiment. Taking the method as being performed by a terminal as an example, referring to fig. 2, the method includes:
in step S201, the terminal displays a picture display interface, which displays thumbnails of a plurality of pictures.
In the embodiment of the disclosure, the terminal is a terminal of the first object, and the terminal is provided with and runs an application program supporting to view pictures published by the second object. The second object is an object other than the first object, and the application program can also be used for viewing videos or articles published by the object. The application includes a picture display interface capable of displaying a thumbnail of a picture to facilitate viewing of the picture by a first object. Alternatively, the plurality of pictures may be published by the same second object, or may be published by different second objects, which is not limited by the embodiments of the present disclosure.
Note that, since the first object is an object that views pictures, the first object may also be referred to as a browsing object; since the second object is an object that releases pictures, the second object may also be referred to as a release object.
In step S202, in response to a triggering operation on the thumbnail of a target picture, the terminal switches the picture display interface to a detail interface of the target picture. A picture display area and a control area are displayed in the detail interface; the size of the target picture is larger than that of the picture display area, the picture display area displays the picture content of a partial area of the target picture, and the control area displays at least one interaction control.
In the embodiment of the disclosure, the browsing object may trigger the thumbnail of any picture to view that picture. After detecting the triggering operation on the thumbnail of the target picture, the terminal can display the detail interface of the target picture. Because the size of the target picture is larger than the size of the picture display area in the detail interface, the picture display area displays the picture content of a partial area of the target picture, and the picture content of the other areas of the target picture can be viewed by sliding the target picture. The control area in the detail interface displays one or more interaction controls. An interaction control, once triggered, is used to perform an interaction behavior between the browsing object and the release object of the target picture.
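Since the target picture is larger than the picture display area, sliding effectively moves a clamped viewport over the picture. The following is a minimal sketch of this viewport logic; the class and method names, pixel units, and clamping details are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch (names and units are hypothetical): a viewport over a
# target picture that is taller than the picture display area.

class PictureViewport:
    def __init__(self, picture_height, area_height):
        self.picture_height = picture_height
        self.area_height = area_height
        self.offset = 0  # top edge of the visible window within the picture

    def slide(self, distance):
        """Slide by `distance` pixels (positive scrolls down), clamping so
        the visible window always stays inside the picture."""
        max_offset = max(0, self.picture_height - self.area_height)
        self.offset = min(max(self.offset + distance, 0), max_offset)
        return self.offset

    def visible_range(self):
        """Return the (top, bottom) range of the picture currently shown."""
        return (self.offset, self.offset + self.area_height)
```

For example, on a 3000-pixel-tall picture shown in an 800-pixel-tall area, sliding can never move the window past offset 2200, so the bottom of the picture is reached without overscrolling.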
In step S203, in response to a triggering operation on a target interaction control among the at least one interaction control, the terminal performs a target interaction behavior with the release object of the target picture, where the target interaction behavior corresponds to the target interaction control.
In the embodiment of the disclosure, the browsing object can interact with the release object of the target picture by triggering an interaction control. After detecting the triggering operation on the target interaction control, the terminal can determine the target interaction behavior corresponding to the target interaction control, and then perform the target interaction behavior with the release object. Optionally, the target interaction behavior is a follow behavior, a like behavior, a resource transfer behavior, or the like, which is not limited by the embodiments of the present disclosure.
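The control-to-behavior dispatch described in step S203 can be sketched as a simple lookup. The control identifiers and behavior names below are hypothetical examples, and performing the behavior is simulated by appending to a log rather than by a real follow, like, or transfer action.

```python
# Illustrative sketch; control ids and behavior names are assumptions.

BEHAVIORS = {
    "follow_control": "follow",
    "like_control": "like",
    "transfer_control": "resource_transfer",
}

def on_control_triggered(control_id, release_object, log):
    """Resolve the triggered control to its interaction behavior and
    perform it with the release object (simulated here by logging)."""
    behavior = BEHAVIORS.get(control_id)
    if behavior is None:
        return None  # unknown control: no behavior performed
    log.append((behavior, release_object))
    return behavior
```

A real implementation would replace the log append with a request to the server that records the follow, like, or transfer against the release object.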
The embodiments of the disclosure provide a picture-based interaction method in which a control area is displayed alongside a partial area of the target picture in the detail interface of the target picture, so that a browsing object viewing the target picture can interact with the release object of the target picture at any time through the controls in the control area, improving interaction efficiency.
In some embodiments, displaying the picture display area and the control area in the detail interface includes:
displaying the picture display area in the detail interface;
updating the picture content displayed in the picture display area based on a sliding operation on the picture display area;
and displaying the control area in the detail interface when the picture content displayed in the picture display area is related to the picture theme.
According to the embodiments of the disclosure, the control area is displayed only when the picture content currently displayed in the picture display area is related to the picture theme. A browsing object can therefore skim picture content with low or no relevance to the picture theme and, when reaching picture content highly relevant to the picture theme, browse carefully and interact with the release object of the target picture by triggering the controls in the control area. This better matches the browsing habits of the browsing object and improves interaction efficiency between objects.
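The theme-relevance condition above can be modeled as a threshold check applied as the browsing object slides through the picture. The function names, relevance scores, and threshold value below are assumptions for illustration; the disclosure does not specify how relevance is computed.

```python
# Illustrative sketch; scores and threshold are hypothetical values.

def should_show_control_area(relevance_score, threshold=0.5):
    """Return True when the picture content currently visible in the
    picture display area is sufficiently related to the picture theme,
    i.e. the control area should be displayed in the detail interface."""
    return relevance_score >= threshold

def first_display_step(region_scores, threshold=0.5):
    """Given relevance scores for the regions seen while sliding, return
    the slide step at which the control area would first appear, or None
    if no region is relevant enough."""
    for step, score in enumerate(region_scores):
        if should_show_control_area(score, threshold):
            return step
    return None
```

The same structure applies to the interest-based variant described below, with the score measuring relevance to the browsing object's interests instead of the picture theme.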
In some embodiments, displaying the picture display area and the control area in the detail interface includes:
displaying the picture display area in the detail interface;
updating the picture content displayed in the picture display area based on a sliding operation on the picture display area;
and displaying the control area in the detail interface when the picture content displayed in the picture display area is related to an interest of the browsing object viewing the target picture.
According to the embodiments of the disclosure, the control area is displayed only when the picture content currently displayed in the picture display area is content the browsing object is interested in, so that when the browsing object browses content of interest, it can interact with the release object of the target picture by triggering the controls in the control area, improving interaction efficiency between objects.
In some embodiments, displaying the picture display area and the control area in the detail interface includes:
displaying the picture display area in the detail interface;
and displaying the control area in the detail interface in response to the browsing object viewing the target picture belonging to a target object type.
According to the embodiments of the disclosure, whether to display the control area is decided according to the object type of the browsing object, so that the display mode of the control area conforms to the usage habits of the browsing object.
In some embodiments, the control area displays a plurality of interaction controls;
displaying the picture display area and the control area in the detail interface includes:
displaying the picture display area in the detail interface;
and displaying, in the detail interface, the plurality of interaction controls in the control area based on the behavior priorities of a plurality of interaction behaviors, wherein the display position of each interaction control in the control area is related to the behavior priority of the interaction behavior of that interaction control.
According to the embodiments of the disclosure, the position of each interaction control in the control area is determined based on the behavior priority of its interaction behavior, so that interaction behaviors with high priority are easier to trigger, improving interaction efficiency between objects.
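One way to realize this priority-based layout is to sort the controls by behavior priority before assigning display slots, so higher-priority behaviors occupy the easiest-to-reach positions. The control identifiers and priority values below are illustrative and not taken from the disclosure.

```python
# Illustrative sketch; control ids and priority values are assumptions.

def layout_controls(controls):
    """controls: list of (control_id, behavior_priority) pairs.
    Returns control ids in display order, with the highest-priority
    behavior placed in the first (easiest-to-reach) slot."""
    return [cid for cid, _ in sorted(controls, key=lambda c: -c[1])]
```

With this layout, raising a behavior's priority automatically moves its control toward the front of the control area without any other layout change.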
In some embodiments, the location of the control area is any one of the following:
the control area is displayed at the top of the detail interface;
the control area is displayed at the bottom of the detail interface;
the control area is displayed on the left side of the detail interface;
the control area is displayed on the right side of the detail interface.
According to the embodiments of the disclosure, the display position of the control area is not limited, so that no matter where the control area is displayed in the detail interface, the browsing object and the release object can interact.
In some embodiments, the method further comprises:
and hiding the control area in the detail interface in response to a region hiding operation.
According to the embodiments of the disclosure, a region hiding function is provided, so that a browsing object can hide the control area when it is not needed, allowing the picture display area to display more picture content and improving picture viewing efficiency.
In some embodiments, the region hiding operation includes any one of:
triggering operation of the area hiding control;
a first sliding operation for sliding the target picture, the sliding distance of the first sliding operation being not less than the first distance;
a second sliding operation for sliding the control area, the sliding distance of the second sliding operation being not less than the second distance;
and a third sliding operation for inputting the first sliding track in the detail interface.
According to the embodiment of the disclosure, by providing various region hiding operations, a manner of hiding the control region is increased, and the man-machine interaction efficiency is improved.
In some embodiments, the method further comprises:
in response to detecting a triggering operation of the combination key, determining a first interaction behavior, wherein the first interaction behavior is an interaction behavior indicated by the combination key;
A first interaction with the published object is performed.
According to the embodiment of the disclosure, interaction between objects is achieved by triggering combination keys, increasing the interaction modes between objects and improving interaction efficiency.
In some embodiments, the method further comprises:
under the condition that a second sliding track is detected in the detail interface, determining a second interaction behavior, wherein the second interaction behavior is the interaction behavior indicated by the second sliding track;
and executing the second interaction behavior with the release object.
According to the embodiment of the disclosure, the interaction with the release object is performed by sliding a certain track on the detail interface, so that the interaction mode between the objects is increased, and the interaction efficiency is improved.
The foregoing fig. 2 shows a basic flow of the present disclosure, and the scheme provided in the present disclosure is further described below based on a specific implementation, and fig. 3 is a flowchart illustrating another picture-based interaction method according to an exemplary embodiment. Taking an example in which an electronic device is provided as a terminal, see fig. 3, the method comprises:
in step S301, the terminal displays a picture display interface, which displays thumbnails of a plurality of pictures.
For this step, refer to step S201 above; details are not repeated here.
In step S302, in response to a triggering operation on a thumbnail of a target picture, the terminal switches the picture display interface to a detail interface of the target picture, a picture display area is displayed in the detail interface, the size of the target picture is larger than that of the picture display area, and the picture display area displays picture content in a partial area of the target picture.
In the embodiment of the disclosure, the browsing object may trigger the thumbnail of the target picture by clicking the thumbnail, long-pressing the thumbnail, triggering a view control on the thumbnail, or the like. The target picture may be a long picture, that is, a picture whose size is larger than that of the picture display area, so that the picture display area cannot display the target picture completely and can only display the picture content of a partial area of the target picture. The browsing object may view different areas of the target picture by sliding the target picture. Optionally, the terminal may use multiple switching modes to perform interface switching, such as fade-in/fade-out, fly-in, or shutter; the embodiment of the present disclosure does not limit the switching mode of the interface.
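The long-picture behavior described above can be sketched as a simple viewport calculation. This is an illustrative model only; the function names and a pixel-based vertical offset are assumptions, not part of the disclosed embodiment:

```python
def clamp_offset(offset: int, picture_height: int, area_height: int) -> int:
    """Keep the top of the visible region inside the bounds of the target picture."""
    max_offset = max(picture_height - area_height, 0)
    return min(max(offset, 0), max_offset)

def slide(offset: int, delta: int, picture_height: int, area_height: int) -> int:
    """Update the visible region of the target picture after a sliding operation."""
    return clamp_offset(offset + delta, picture_height, area_height)
```

For a 3000-pixel-tall long picture in a 1000-pixel display area, the offset ranges from 0 to 2000, so sliding past either end simply pins the viewport at that end.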
In step S303, the terminal displays a control area in the detail interface, where at least one interactive control is displayed in the control area.
In the embodiment of the disclosure, the terminal may display the control area while displaying the above-mentioned picture display area in the detail interface, or may display the control area after meeting the display condition of the control area. Alternatively, the display condition of the control region may be related to at least one of a picture subject of the target picture, an interest of the browsing object, or an object type of the browsing object, see the following three cases.
In the first case, the display condition of the control area is that the picture content displayed in the picture display area is related to the picture theme. Accordingly, after the terminal displays the picture display area in the detail interface, the picture content displayed in the picture display area is updated based on a sliding operation on the picture display area, and the terminal determines whether the displayed picture content is related to the picture theme. In the case that the picture content displayed in the picture display area is related to the picture theme, the terminal displays the control area in the detail interface. By displaying the control area only when the currently displayed picture content is related to the picture theme, a browsing object can browse roughly when viewing picture content with low or no relevance to the picture theme, and can browse carefully and interact with the release object of the target picture through the controls in the control area when viewing picture content highly relevant to the picture theme. This conforms to the browsing habits of the browsing object and improves both picture browsing and inter-object interaction efficiency. Content with low or no relevance may be, for example, a background area in a person picture, a blank area in a landscape picture, or a sky area in a building picture.
For example, fig. 4 is a schematic diagram of a detail interface provided according to an exemplary embodiment. Referring to fig. 4, the target picture is a building picture, and its theme is the building. The upper third of the target picture is sky, and the lower two thirds are the main body of the building. As shown in (a) of fig. 4, the terminal initially displays the picture content of the upper third of the target picture, i.e., the sky, in the picture display area of the detail interface. The browsing object updates the picture content displayed in the picture display area by sliding the target picture upward, see (b) in fig. 4. When the picture content in the picture display area is mostly the building, the terminal determines that it is related to the picture theme and displays the control area in the detail interface, in which control 1, control 2, and control 3 are displayed, as shown in (c) in fig. 4.
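The first display condition can be sketched as a majority check over the visible content. The per-region content labels and the majority threshold below are illustrative assumptions; the disclosure does not specify how theme relevance is computed:

```python
def should_show_control_area(visible_labels, picture_theme, threshold=0.5):
    """Show the control area once most of the visible picture content
    matches the picture theme (e.g. 'building' rather than 'sky')."""
    if not visible_labels:
        return False
    related = sum(1 for label in visible_labels if label == picture_theme)
    return related / len(visible_labels) > threshold
```

In the fig. 4 scenario, a viewport showing only sky yields no control area, while a viewport that is mostly building triggers its display.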
In the second case, the display condition of the control area is that the picture content displayed in the picture display area is related to the interest of the browsing object. Accordingly, after the terminal displays the picture display area in the detail interface, the picture content displayed in the picture display area is updated based on a sliding operation on the picture display area. The terminal acquires a content tag of the picture content; if the content tag is the same as an interest tag of the browsing object, the terminal confirms that the picture content displayed in the picture display area is related to the interest of the browsing object viewing the target picture, and in this case displays the control area in the detail interface. By displaying the control area only when the currently displayed picture content interests the browsing object, the browsing object can trigger the controls in the control area to interact with the release object of the target picture while browsing content of interest, improving interaction efficiency between objects.
In the third case, the display condition of the control area is that the object type of the browsing object is the target object type. Object types may be divided according to attributes of the browsing object, such as age group, or according to browsing habits of the browsing object, which is not limited by the embodiments of the present disclosure. Correspondingly, after displaying the picture display area in the detail interface, the terminal determines the object type of the browsing object, and displays the control area in the detail interface in the case that the browsing object viewing the target picture belongs to the target object type. Displaying the control area according to the object type of the browsing object makes the display mode of the control area conform to the usage habits of the browsing object. For example, an older browsing object may learn new interactions more slowly than a younger one; if the control area were not displayed directly, the older browsing object might fail to call it out and thus be unable to interact with the release object of the target picture. A younger browsing object, in contrast, can call out the control area when needed and browse the picture more immersively, improving picture browsing efficiency.
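The second and third display conditions reduce to simple membership checks. The tag and object-type values below are hypothetical placeholders for illustration:

```python
def interest_condition_met(content_tag: str, interest_tags: set) -> bool:
    """Second case: the content tag of the displayed picture content
    matches an interest tag of the browsing object."""
    return content_tag in interest_tags

def object_type_condition_met(object_type: str, target_types: set) -> bool:
    """Third case: the browsing object belongs to the target object type,
    e.g. an age group for which the control area is shown directly."""
    return object_type in target_types
```

Either condition returning true would cause the terminal to display the control area in the detail interface.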
In some embodiments, the control area may be displayed at any location in the detail interface that does not affect browsing by the browsing object. Optionally, the control area is displayed at the top of the detail interface; or at the bottom of the detail interface; or on the left side of the detail interface; or on the right side of the detail interface. The embodiment of the disclosure does not limit the display position of the control area.
In some embodiments, the control area may be displayed in any form, such as a window, a floating layer, or a wheel. Taking the wheel form as an example, the wheel may be displayed in the lower right corner of the detail interface. The wheel has two states, expanded and collapsed: in the expanded state, a plurality of interactive controls are displayed on the expanded wheel; in the collapsed state, a wheel expansion control is displayed, which expands the wheel when triggered.
In some embodiments, the control area may be hidden: in the case where the detail interface displays the control area, the terminal hides the control area in the detail interface in response to a region hiding operation. By hiding the control area, the picture display area can display more picture content, improving picture viewing efficiency. Optionally, the region hiding operation includes: a triggering operation of the area hiding control; or a first sliding operation for sliding the target picture, the sliding distance of the first sliding operation being not less than the first distance; or a second sliding operation for sliding the control area, the sliding distance of the second sliding operation being not less than the second distance; or a third sliding operation for inputting the first sliding track in the detail interface. Alternatively, the region hiding operation may also be a picture enlarging operation or a picture reducing operation, which is not limited by the embodiments of the present disclosure.
The area hiding control may be displayed inside the control area, for example as a right arrow, or outside the control area, for example as an eye-shaped control. The first sliding operation may slide the target picture in any direction. The second sliding operation may slide the control area from left to right, from right to left, from top to bottom, or from bottom to top, which is not limited by the embodiments of the present disclosure. The third sliding operation may input any preset track, such as a C-shaped track, an S-shaped track, a W-shaped track, or a user-defined track, which is not limited by the embodiments of the present disclosure.
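Classifying an input event as a region hiding operation might look like the sketch below. The event encoding, the pixel thresholds, and the set of preset tracks are all assumptions made for illustration; the disclosure leaves them unspecified:

```python
FIRST_DISTANCE = 80              # assumed threshold for sliding the target picture
SECOND_DISTANCE = 40             # assumed threshold for sliding the control area
PRESET_TRACKS = {"C", "S", "W"}  # assumed first-sliding-track shapes

def is_region_hiding_operation(event: dict) -> bool:
    """Return True if the event matches any of the region hiding operations."""
    kind = event.get("kind")
    if kind == "tap":                          # triggering the area hiding control
        return event.get("target") == "hide_control"
    if kind == "slide_picture":                # first sliding operation
        return event.get("distance", 0) >= FIRST_DISTANCE
    if kind == "slide_control_area":           # second sliding operation
        return event.get("distance", 0) >= SECOND_DISTANCE
    if kind == "track":                        # third sliding operation
        return event.get("track") in PRESET_TRACKS
    return False
```

A picture slide just under the first distance is treated as ordinary browsing, while one at or beyond it hides the control area.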
It should be noted that, in the case where the control area is in the hidden state, the terminal may redisplay the control area in the detail interface in response to an area display operation. The area display operation may be a triggering operation of an area display control, or a fourth sliding operation for inputting a third sliding track in the detail interface.
In some embodiments, the control area displays a plurality of interactive controls, and the display positions of the interactive controls in the control area are determined according to the behavior priority of the corresponding interactive behaviors. Correspondingly, the terminal displays a picture display area in the detail interface, then displays a plurality of interactive controls in the control area based on the behavior priorities of the interactive behaviors in the detail interface, and the display positions of the interactive controls in the control area are related to the behavior priorities of the interactive behaviors of the interactive controls. The position of the interactive control in the control area is determined based on the behavior priority of the interactive behavior of the interactive control, so that the interactive behavior with high behavior priority is easier to trigger, and the interaction efficiency between objects is improved.
For example, FIG. 5 is a schematic diagram of another detail interface provided according to an exemplary embodiment. Referring to fig. 5, four interactive controls are displayed in the control area: a comment control 501, an attention control 502, a favorites control 503, and a sharing control 504. The behavior priority of the comment behavior corresponding to the comment control 501 is 2, that of the attention behavior corresponding to the attention control 502 is 1, that of the collection behavior corresponding to the favorites control 503 is 2, and that of the sharing behavior corresponding to the sharing control 504 is 3, where behavior priority 1 is the highest. Fig. 5 (a) exemplarily shows a scenario in which the interactive controls are displayed from left to right according to behavior priority; fig. 5 (b) exemplarily shows a scenario in which they are displayed from the middle toward the two sides according to behavior priority. Alternatively, the controls may have a variety of display forms. Fig. 6 is a schematic diagram of an attention control provided according to an exemplary embodiment: the attention control may display the avatar and name of the release object, as shown in (a) of fig. 6, or the avatar of the release object and a plus sign, as shown in (b) of fig. 6, which is not limited by the embodiment of the present disclosure.
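The two layouts of fig. 5 can be sketched as follows, using the priorities listed above (priority 1 highest). The tie-breaking rule and slot-assignment scheme are assumptions; the disclosure specifies only that position correlates with priority:

```python
def layout_left_to_right(controls):
    """controls: list of (name, priority) pairs; priority 1 is highest.
    Highest priority is placed leftmost, as in fig. 5 (a)."""
    return [name for name, _ in sorted(controls, key=lambda c: c[1])]

def layout_middle_out(controls):
    """Highest priority in the middle, lower priorities alternating
    outward, as in fig. 5 (b)."""
    n = len(controls)
    ordered = sorted(controls, key=lambda c: c[1])
    center = (n - 1) // 2
    # fill slots in order of increasing distance from the center slot
    positions = sorted(range(n), key=lambda i: (abs(i - center), i))
    slots = [None] * n
    for pos, (name, _) in zip(positions, ordered):
        slots[pos] = name
    return slots

controls = [("comment", 2), ("attention", 1), ("favorite", 2), ("share", 3)]
```

With these priorities, the left-to-right layout places the attention control first, while the middle-out layout places it in the center slot with the sharing control furthest out.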
In step S304, in response to a triggering operation on a target interaction control in the at least one interaction control, the terminal executes a target interaction behavior between the browsing object and the release object of the target picture, where the target interaction behavior corresponds to the target interaction control.
In the embodiment of the disclosure, the browsing object can interact with the release object of the target picture by triggering any interaction control in the control area. The target interaction control is any interaction control in the control area, when the terminal detects triggering operation on the target interaction control, the terminal determines target interaction behavior corresponding to the target interaction control, and then the terminal executes target interaction behavior between the browsing object and the release object, such as attention, praise, collection or sharing.
In some embodiments, in addition to triggering an interaction control, the browsing object can interact with the release object in the following two ways.
In the first mode, the browsing object interacts with the release object by triggering a combination key. Correspondingly, in response to detecting the triggering operation of the combination key, the terminal determines a first interaction behavior, the first interaction behavior being the interaction behavior indicated by the combination key. The terminal then performs the first interaction behavior with the release object. Interacting with the release object by triggering a combination key increases the interaction modes between objects and improves interaction efficiency.
In the second mode, the browsing object interacts with the release object by sliding on the detail interface. Correspondingly, in the case that a second sliding track is detected in the detail interface, the terminal determines a second interaction behavior, the second interaction behavior being the interaction behavior indicated by the second sliding track. The terminal then performs the second interaction behavior with the release object. The second sliding track may be a C-shaped track, an S-shaped track, a W-shaped track, or a user-defined track, which is not limited by the embodiments of the present disclosure. Interacting with the release object by sliding a certain track on the detail interface increases the interaction modes between objects and improves interaction efficiency.
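Both shortcut modes reduce to a lookup from the input to an interaction behavior. The key bindings and track-to-behavior mappings below are purely hypothetical examples; the disclosure does not define any particular assignment:

```python
COMBO_KEY_BEHAVIORS = {          # hypothetical combination-key bindings
    ("ctrl", "f"): "attention",
    ("ctrl", "s"): "favorite",
}
TRACK_BEHAVIORS = {              # hypothetical second-sliding-track mappings
    "C": "favorite",
    "S": "share",
    "W": "attention",
}

def behavior_for(event: dict):
    """Return the interaction behavior indicated by a combination key or a
    sliding track, or None if the input indicates no interaction."""
    if event.get("kind") == "combo":
        return COMBO_KEY_BEHAVIORS.get(tuple(event.get("keys", ())))
    if event.get("kind") == "track":
        return TRACK_BEHAVIORS.get(event.get("track"))
    return None
```

Once a behavior is determined, the terminal executes it between the browsing object and the release object, exactly as for a control triggered in the control area.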
The embodiment of the disclosure provides an interaction method based on a picture, which enables a browsing object viewing a target picture to interact with a release object of the target picture at any time through a control in a control area by displaying the control area while displaying a partial area of the target picture in a detail interface of the target picture, thereby improving interaction efficiency.
Any combination of the above-mentioned optional solutions may be adopted to form an optional embodiment of the present disclosure, which is not described herein in detail.
Fig. 7 is a block diagram illustrating a picture-based interaction device, according to an example embodiment. Referring to fig. 7, the apparatus includes: a display unit 701 and a behavior execution unit 702.
A display unit 701 configured to display a picture display interface, the picture display interface displaying thumbnails of a plurality of pictures;
the display unit 701 is further configured to switch the picture display interface to a detail interface of the target picture in response to a triggering operation on the thumbnail of the target picture, wherein the detail interface displays a picture display area and a control area, the size of the target picture is larger than that of the picture display area, the picture display area displays picture content in a part of the area of the target picture, and the control area displays at least one interactive control;
the behavior execution unit 702 is configured to execute a target interaction behavior between the browsing object and the release object of the target picture in response to a triggering operation on a target interaction control in the at least one interaction control, where the target interaction behavior corresponds to the target interaction control.
In some embodiments, the display unit 701 is further configured to display a picture display area in the detail interface; update the picture content displayed in the picture display area based on a sliding operation on the picture display area; and display a control area in the detail interface in the case that the picture content displayed in the picture display area is related to the picture theme.
In some embodiments, the display unit 701 is further configured to display a picture display area in the detail interface; update the picture content displayed in the picture display area based on a sliding operation on the picture display area; and display a control area in the detail interface in the case that the picture content displayed in the picture display area is related to the interest of the browsing object viewing the target picture.
In some embodiments, the display unit 701 is further configured to display a picture display area in the detail interface; and display a control area in the detail interface in the case that the browsing object viewing the target picture belongs to the target object type.
In some embodiments, the control region displays a plurality of interactive controls;
a display unit 701 further configured to display a picture display area in the detail interface; in the detail interface, based on the behavior priorities of the interaction behaviors, displaying the interaction controls in a control area, wherein the display positions of the interaction controls in the control area are related to the behavior priorities of the interaction behaviors of the interaction controls.
In some embodiments, the location of the control region is any of:
the control area is displayed at the top of the detail interface;
The control area is displayed at the bottom of the detail interface;
the control area is displayed on the left side of the detail interface;
the control area is displayed on the right side of the detail interface.
In some embodiments, the display unit 701 is further configured to hide the control region in the detail interface in response to a region hiding operation.
In some embodiments, the region hiding operation includes any one of:
triggering operation of the area hiding control;
a first sliding operation for sliding the target picture, the sliding distance of the first sliding operation being not less than the first distance;
a second sliding operation for sliding the control area, the sliding distance of the second sliding operation being not less than the second distance;
and a third sliding operation for inputting the first sliding track in the detail interface.
In some embodiments, the behavior execution unit 702 is further configured to determine, in response to detecting a triggering operation of the combination key, a first interaction behavior, the first interaction behavior being an interaction behavior indicated by the combination key; a first interaction with the published object is performed.
In some embodiments, the behavior execution unit 702 is further configured to determine a second interaction behavior in the case that a second sliding track is detected in the detail interface, the second interaction behavior being the interaction behavior indicated by the second sliding track; and execute the second interaction behavior with the release object.
The embodiment of the disclosure provides an interaction device based on a picture, which enables a browsing object viewing a target picture to interact with a release object of the target picture at any time through a control in a control area by displaying the control area while displaying a partial area of the target picture in a detail interface of the target picture, thereby improving interaction efficiency.
It should be noted that, when the interaction device based on the picture provided in the above embodiment performs interaction, only the division of the above functional units is used for illustration, in practical application, the above functional allocation may be performed by different functional units according to needs, that is, the internal structure of the electronic device is divided into different functional units, so as to complete all or part of the functions described above. In addition, the image-based interaction device and the image-based interaction method provided in the above embodiments belong to the same concept, and detailed implementation processes of the image-based interaction device and the image-based interaction method are detailed in the method embodiments, which are not repeated here.
The specific manner in which the various modules perform the operations in the apparatus of the above embodiments have been described in detail in connection with the embodiments of the method, and will not be described in detail herein.
When the electronic device is provided as a terminal, fig. 8 is a block diagram of a terminal 800 according to an exemplary embodiment of the present disclosure. The terminal 800 may be: a smart phone, a tablet computer, an MP3 player, an MP4 player, a notebook computer, or a desktop computer. The terminal 800 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 800 includes: a processor 801 and a memory 802.
The processor 801 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 801 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 801 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 801 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content to be displayed by the display screen. In some embodiments, the processor 801 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 802 may include one or more computer-readable storage media, which may be non-transitory. Memory 802 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 802 is used to store at least one program code for execution by processor 801 to implement the picture-based interaction methods provided by the method embodiments in the present disclosure.
In some embodiments, the terminal 800 may further optionally include: a peripheral interface 803, and at least one peripheral. The processor 801, the memory 802, and the peripheral interface 803 may be connected by a bus or signal line. Individual peripheral devices may be connected to the peripheral device interface 803 by buses, signal lines, or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 804, a display 805, a camera assembly 806, audio circuitry 807, a positioning assembly 808, and a power supply 809.
The peripheral interface 803 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 801 and the memory 802. In some embodiments, the processor 801, the memory 802, and the peripheral interface 803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral interface 803 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 804 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 804 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 804 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 804 may also include NFC (Near Field Communication) related circuitry, which is not limited by the present disclosure.
The display 805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 805 is a touch display, the display 805 also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 801 as a control signal for processing. At this time, the display 805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 805, providing the front panel of the terminal 800; in other embodiments, there may be at least two displays 805, respectively disposed on different surfaces of the terminal 800 or in a folded design; in still other embodiments, the display 805 may be a flexible display disposed on a curved or folded surface of the terminal 800. Moreover, the display 805 may be arranged in an irregular, non-rectangular pattern, i.e., an irregularly shaped screen. The display 805 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 806 is used to capture images or video. Optionally, the camera assembly 806 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and Virtual Reality (VR) shooting functions or other fusion shooting functions. In some embodiments, the camera assembly 806 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
Audio circuitry 807 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 801 for processing, or inputting the electric signals to the radio frequency circuit 804 for voice communication. For stereo acquisition or noise reduction purposes, a plurality of microphones may be respectively disposed at different portions of the terminal 800. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 807 may also include a headphone jack.
The positioning component 808 is used to locate the current geographic location of the terminal 800 to enable navigation or LBS (Location Based Service). The positioning component 808 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
A power supply 809 is used to power the various components in the terminal 800. The power supply 809 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 809 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast-charge technology.
In some embodiments, the terminal 800 also includes one or more sensors 810. The one or more sensors 810 include, but are not limited to: acceleration sensor 811, gyroscope sensor 812, pressure sensor 813, fingerprint sensor 814, optical sensor 815, and proximity sensor 816.
The acceleration sensor 811 can detect the magnitudes of acceleration on the three coordinate axes of the coordinate system established with the terminal 800. For example, the acceleration sensor 811 may be used to detect the components of gravitational acceleration on the three coordinate axes. Based on the gravitational acceleration signal acquired by the acceleration sensor 811, the processor 801 may control the display screen 805 to display the user interface in a landscape or portrait view. The acceleration sensor 811 may also be used to collect game or user motion data.
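As a loose illustration of the orientation logic described above, the following Python sketch (not taken from the patent; the function name and axis convention are assumptions) picks a landscape or portrait view from the gravity components reported on two device axes:

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick a view from gravity along the device's x (short) and y (long) axes."""
    # Gravity dominates whichever axis points downward: a larger |gy|
    # means the long edge is vertical (portrait); a larger |gx| means
    # the device is lying on its side (landscape).
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```

With the device held upright, gravity (about 9.8 m/s²) falls almost entirely on the long axis, so `choose_orientation(0.3, 9.7)` yields `"portrait"`.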
The gyro sensor 812 may detect the body orientation and rotation angle of the terminal 800, and may cooperate with the acceleration sensor 811 to collect the user's 3D actions on the terminal 800. Based on the data collected by the gyro sensor 812, the processor 801 may implement the following functions: motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 813 may be disposed on a side frame of the terminal 800 and/or underneath the display 805. When the pressure sensor 813 is disposed on a side frame of the terminal 800, it can detect the user's grip signal on the terminal 800, and the processor 801 performs left-/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 813. When the pressure sensor 813 is disposed underneath the display screen 805, the processor 801 controls the operability controls on the UI according to the user's pressure operations on the display screen 805. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
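The left-/right-hand recognition mentioned above could, under one simple assumption, come down to comparing grip contacts on the two side frames. The sketch below is hypothetical (the heuristic and names are not from the patent): in a right-handed grip the four fingers typically rest on the left frame while the palm and thumb press the right frame.

```python
def detect_grip_hand(left_contacts: int, right_contacts: int) -> str:
    """Guess which hand holds the device from side-frame contact counts."""
    # More distinct contact points on the left frame suggests four
    # fingers there, i.e. a right-handed grip -- and vice versa.
    return "right" if left_contacts > right_contacts else "left"
```

A real implementation would also weigh contact area and pressure, but the comparison above captures the basic idea of grip-signal classification.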
The fingerprint sensor 814 is used to collect a user's fingerprint, and the processor 801 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 identifies the user's identity based on the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 801 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 814 may be provided on the front, back, or side of the terminal 800. When a physical key or vendor logo is provided on the terminal 800, the fingerprint sensor 814 may be integrated with the physical key or vendor logo.
The optical sensor 815 is used to collect the ambient light intensity. In one embodiment, the processor 801 may control the display brightness of the display screen 805 based on the ambient light intensity collected by the optical sensor 815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 805 is increased; when the ambient light intensity is low, the display brightness of the display screen 805 is decreased. In another embodiment, the processor 801 may also dynamically adjust the shooting parameters of the camera assembly 806 based on the ambient light intensity collected by the optical sensor 815.
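The brightness adjustment described in this embodiment can be sketched as a mapping from illuminance to a brightness level. The curve below is an assumption, not the patent's method: perceived brightness roughly tracks the logarithm of illuminance, so a log mapping dims smoothly indoors and ramps up in daylight.

```python
import math

def display_brightness(lux: float, min_b: float = 0.05, max_b: float = 1.0) -> float:
    """Map ambient illuminance (lux) to a display brightness in [min_b, max_b]."""
    if lux <= 1.0:
        return min_b
    # Treat ~40000 lux (direct daylight) as full brightness; clamp above that.
    frac = min(math.log10(lux) / math.log10(40000.0), 1.0)
    return min_b + (max_b - min_b) * frac
```

The exact anchor points (1 lux and 40000 lux here) would in practice be tuned per device panel.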
A proximity sensor 816, also referred to as a distance sensor, is typically provided on the front panel of the terminal 800. The proximity sensor 816 is used to collect the distance between the user and the front of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front of the terminal 800 gradually decreases, the processor 801 controls the display 805 to switch from the bright-screen state to the off-screen state; when the proximity sensor 816 detects that the distance between the user and the front of the terminal 800 gradually increases, the processor 801 controls the display 805 to switch from the off-screen state to the bright-screen state.
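The screen-state switching above can be sketched as a small state machine driven by successive distance readings. This is a hypothetical illustration, not the patent's implementation; the threshold values and class name are assumptions, and two thresholds (hysteresis) are used so the screen does not flicker when a reading hovers around a single cut-off.

```python
class ProximityScreenController:
    NEAR_CM = 3.0   # closer than this while on: turn the screen off
    FAR_CM = 5.0    # farther than this while off: turn it back on

    def __init__(self) -> None:
        self.screen_on = True

    def on_distance(self, distance_cm: float) -> bool:
        """Update the screen state from one proximity reading; return it."""
        if self.screen_on and distance_cm < self.NEAR_CM:
            self.screen_on = False
        elif not self.screen_on and distance_cm > self.FAR_CM:
            self.screen_on = True
        return self.screen_on
```

Feeding in a decreasing then increasing distance reproduces the behavior described: the display goes dark as the device approaches the user's face and lights up again as it moves away.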
Those skilled in the art will appreciate that the structure shown in fig. 8 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
When the electronic device is provided as a server, fig. 9 is a block diagram of a server 900 according to an exemplary embodiment. The server 900 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPU) 901 and one or more memories 902, where the memories 902 store at least one piece of program code, and the at least one piece of program code is loaded and executed by the processors 901 to implement the picture-based interaction method provided by the above method embodiments. Of course, the server may also have a wired or wireless network interface, a keyboard, an input/output interface, and the like for input/output, and the server 900 may also include other components for implementing the functions of the device, which are not described herein.
In an embodiment of the present disclosure, there is also provided a storage medium including program code, for example a memory 802 or a memory 902 including program code, which is executable by the processor 801 of the terminal 800 or the processor 901 of the server 900 to accomplish the above picture-based interaction method. Optionally, the storage medium may be a non-transitory computer-readable storage medium, which may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an embodiment of the present disclosure, a computer program product is also provided, including a computer program, which when executed by a processor implements the above-mentioned picture-based interaction method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A picture-based interaction method, the method comprising:
displaying a picture display interface, wherein the picture display interface displays thumbnail images of a plurality of pictures;
in response to a triggering operation on a thumbnail of a target picture, switching the picture display interface to a detail interface of the target picture, and displaying a picture display area in the detail interface, wherein the size of the target picture is larger than that of the picture display area, and the picture display area displays picture content in a partial area of the target picture;
updating the picture content displayed in the picture display region based on a sliding operation on the picture display region;
displaying a control area in the detail interface under the condition that the picture content displayed in the picture display area is related to a picture theme, wherein at least one interactive control is displayed in the control area;
and in response to a triggering operation on a target interaction control in the at least one interaction control, performing a target interaction behavior with a release object of the target picture, wherein the target interaction behavior corresponds to the target interaction control.
2. The picture-based interaction method of claim 1, wherein the method further comprises:
and displaying the control area in the detail interface under the condition that the picture content displayed in the picture display area is related to the interest of the browsing object for viewing the target picture.
3. The picture-based interaction method of claim 1, wherein the method further comprises:
and responding to the fact that the browse object for viewing the target picture belongs to the target object type, and displaying the control area in the detail interface.
4. The picture-based interaction method of claim 1, wherein the control area displays a plurality of interaction controls;
the method further comprises the steps of:
and displaying the plurality of interactive controls in the control area based on the behavior priorities of the plurality of interactive behaviors in the detail interface, wherein the display positions of the interactive controls in the control area are related to the behavior priorities of the interactive behaviors of the interactive controls.
5. The picture-based interaction method of claim 1, wherein the location of the control region is any one of:
the control area is displayed at the top of the detail interface;
the control area is displayed at the bottom of the detail interface;
the control area is displayed on the left side of the detail interface;
and the control area is displayed on the right side of the detail interface.
6. The picture-based interaction method of claim 1, wherein the method further comprises:
And hiding the control region in the detail interface in response to a region hiding operation.
7. The picture-based interaction method of claim 6, wherein the region hiding operation comprises any one of:
triggering operation of the area hiding control;
a first sliding operation for sliding the target picture, the sliding distance of the first sliding operation being not less than a first distance;
a second sliding operation for sliding the control region, a sliding distance of the second sliding operation being not less than a second distance;
and a third sliding operation for inputting a first sliding track in the detail interface.
8. The picture-based interaction method of any of claims 1-7, wherein the method further comprises:
in response to detecting a triggering operation of a combination key, determining a first interaction behavior, wherein the first interaction behavior is the interaction behavior indicated by the combination key;
and performing the first interaction behavior with the release object.
9. The picture-based interaction method of any of claims 1-7, wherein the method further comprises:
Under the condition that a second sliding track is detected in the detail interface, determining a second interaction behavior, wherein the second interaction behavior is the interaction behavior indicated by the second sliding track;
and performing the second interaction behavior with the release object.
10. A picture-based interactive apparatus, the apparatus comprising:
a display unit configured to display a picture display interface, the picture display interface displaying thumbnails of a plurality of pictures;
the display unit is further configured to switch the picture display interface to a detail interface of the target picture in response to a triggering operation of a thumbnail of the target picture, wherein a picture display area is displayed in the detail interface, the size of the target picture is larger than that of the picture display area, and the picture display area displays picture contents in a partial area of the target picture; updating the picture content displayed in the picture display region based on a sliding operation on the picture display region; displaying a control area in the detail interface under the condition that the picture content displayed in the picture display area is related to a picture theme, wherein at least one interactive control is displayed in the control area;
and a behavior execution unit configured to, in response to a triggering operation on a target interaction control in the at least one interaction control, perform a target interaction behavior with a release object of the target picture, wherein the target interaction behavior corresponds to the target interaction control.
11. An electronic device, the electronic device comprising:
one or more processors;
a memory for storing the processor-executable program code;
wherein the processor is configured to execute the program code to implement the picture-based interaction method of any of claims 1 to 9.
12. A storage medium, characterized in that the program code in the storage medium, when executed by a processor of an electronic device, enables the electronic device to perform the picture-based interaction method of any of claims 1 to 9.
CN202210988384.7A 2022-08-17 2022-08-17 Picture-based interaction method and device, electronic equipment and storage medium Active CN115379274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210988384.7A CN115379274B (en) 2022-08-17 2022-08-17 Picture-based interaction method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN115379274A CN115379274A (en) 2022-11-22
CN115379274B true CN115379274B (en) 2023-10-03

Family

ID=84065592

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210988384.7A Active CN115379274B (en) 2022-08-17 2022-08-17 Picture-based interaction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115379274B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105828145A (en) * 2016-03-18 2016-08-03 广州酷狗计算机科技有限公司 Interaction method and interaction device
WO2020233553A1 (en) * 2019-05-22 2020-11-26 华为技术有限公司 Photographing method and terminal
CN113204298A (en) * 2021-04-30 2021-08-03 北京达佳互联信息技术有限公司 Method and device for displaying release progress, electronic equipment and storage medium
CN113908559A (en) * 2021-10-13 2022-01-11 腾讯科技(深圳)有限公司 Interface display method, device, terminal and storage medium


Also Published As

Publication number Publication date
CN115379274A (en) 2022-11-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant