CN110019897B - Method and device for displaying picture - Google Patents

Method and device for displaying picture

Info

Publication number
CN110019897B
Authority
CN
China
Prior art keywords
picture
user
information
preset
determining
Prior art date
Legal status
Active
Application number
CN201710646777.9A
Other languages
Chinese (zh)
Other versions
CN110019897A (en)
Inventor
陈悦
刘颖
梁于阳
张蕊
Current Assignee
Miaomiaoce Technology Beijing Co ltd
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Miaomiaoce Technology Beijing Co ltd
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Miaomiaoce Technology Beijing Co ltd and Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201710646777.9A
Publication of CN110019897A
Application granted
Publication of CN110019897B
Current legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/51 Indexing; Data structures therefor; Storage structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/54 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Library & Information Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a method and an apparatus for displaying pictures. The method includes: during dynamic playback of pictures, acquiring user response information for the currently displayed picture; if the user response information satisfies a preset attention condition, determining, based on the currently displayed picture, a target picture that the user is paying attention to; and highlighting the target picture. With this method of displaying pictures, pictures that interest the user can be displayed intelligently, improving the user experience of the electronic device.

Description

Method and device for displaying picture
Technical Field
The present disclosure relates to the field of computer communication technologies, and in particular, to a method and an apparatus for displaying pictures.
Background
With the development of camera and storage technology, people can take large numbers of photos with electronic devices such as mobile phones, tablet computers and digital cameras; for example, people often come back from a holiday or a trip with thousands of photos. To make it easier to review and share them, electronic devices provide a function of dynamically playing photos. In the dynamic play mode of the related art, however, the play interval between pictures is fixed: if users want to spend more time browsing a picture they are interested in, they must manually trigger the pause function, or manually find the picture in the picture database after the automatic playback has finished. If a user is interested in many photos in the database, the user has to repeatedly switch between the pause and resume functions, or pick out the photos of interest from the database for static viewing, which is tedious and wastes the user's time and energy.
Disclosure of Invention
In view of this, the present disclosure provides a method and an apparatus for displaying a picture, which can intelligently highlight pictures the user is interested in and save the user time and effort in browsing them.
According to a first aspect of the embodiments of the present disclosure, there is provided a method of displaying a picture, the method including:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture concerned by the user according to the current display picture;
highlighting the target picture.
Optionally, the user reaction information includes at least one of:
expression information, motion information, sound information, language information, physiological characteristic information.
The user response information meeting a preset attention condition includes at least one of the following:
the expression information of the user accords with preset expression characteristics;
the action information of the user meets the preset action characteristics;
the voice information of the user meets the preset voice characteristics;
the language information of the user comprises preset evaluation keywords;
the physiological characteristic information of the user conforms to the preset physiological change characteristic.
Optionally, the determining, according to the currently displayed picture, a target picture that a user pays attention to includes:
and determining the current display picture as a target picture concerned by the user.
Optionally, if the user response information satisfies a preset attention condition, determining a target picture focused by the user according to the currently displayed picture, including:
determining a related picture of the current display picture;
and determining the current display picture and the associated picture of the current display picture as a target picture set.
Optionally, the user reaction information satisfies a preset attention condition, including:
determining a user response value according to different types of user response information, different response degrees of the same type of response information, or weights of different types of response information;
and if the user response value is greater than a preset threshold, determining that the user response information meets a preset attention condition.
Optionally, the determining a picture associated with the currently displayed picture includes:
extracting image characteristic information of the current display picture;
determining the similarity between the residual pictures in a picture database and the currently displayed picture according to the image characteristic information;
and if the similarity exceeds a preset association threshold, determining the corresponding residual pictures as the associated pictures of the current display picture.
Optionally, the highlighting of the target picture includes:
and displaying the target picture in a mode of prolonging display time.
Optionally, highlighting the target picture in a manner of extending a display duration includes:
inquiring a preset display list according to the user response value of the target picture to obtain the reset display duration of the target picture, wherein the preset display list comprises: the corresponding relation between the user response value of the target picture and the reset display duration;
and under the automatic playing mode, automatically displaying the target picture according to the reset display duration.
Optionally, the method further comprises:
and marking and storing the target picture.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus for displaying pictures, the apparatus comprising:
the response information acquisition module is configured to acquire user response information of a currently displayed picture in the process of dynamically playing the picture;
the target determining module is configured to determine a target picture that the user pays attention to according to the currently displayed picture, under the condition that the user response information meets a preset attention condition;
a highlighting module configured to highlight the target picture.
Optionally, the user response information acquired by the response information acquiring module includes at least one of the following:
expression information, motion information, sound information, language information, physiological characteristic information.
The user response information meeting a preset attention condition includes at least one of the following:
the expression information of the user accords with preset expression characteristics;
the action information of the user meets the preset action characteristics;
the voice information of the user meets the preset voice characteristics;
the language information of the user comprises preset evaluation keywords;
the physiological characteristic information of the user conforms to the preset physiological change characteristic.
Optionally, the target determining module is configured to determine the current display picture as a target picture focused by the user.
Optionally, the target determining module includes:
an associated picture determination submodule configured to determine an associated picture of the currently displayed picture;
a target determination submodule configured to determine the current display picture and an associated picture of the current display picture as a target picture set.
Optionally, the target determining module includes:
the reaction value determining submodule is configured to determine a user reaction value according to different types of user reaction information, different reaction degrees of the same type of reaction information, or weights of different reaction information types;
an attention determining submodule configured to determine that the user response information satisfies a preset attention condition, when the user response value is greater than a preset threshold value.
Optionally, the associated picture determining sub-module includes:
a feature extraction unit configured to extract image feature information of the currently displayed picture;
the similarity determining unit is configured to determine the similarity between the residual pictures in the picture database and the current display picture according to the image characteristic information;
and the association determining unit is configured to determine the corresponding residual pictures as the associated pictures of the current display picture under the condition that the similarity exceeds a preset association threshold value.
Optionally, the highlighting module is configured to display the target picture in a manner of prolonging a display duration.
Optionally, the highlighting module includes:
a display duration determining submodule configured to query a preset display list according to the user response value of the target picture, and obtain a reset display duration of the target picture, where the preset display list includes: the corresponding relation between the user response value of the target picture and the reset display duration;
and the time delay display submodule is configured to automatically display the target picture according to the reset display duration in an automatic play mode.
Optionally, the apparatus further comprises:
and the target storage module is configured to mark and store the target picture.
According to a third aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any one of the first aspect described above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an apparatus for displaying pictures, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture concerned by the user according to the current display picture;
highlighting the target picture.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the method and the device, in the process of automatically playing the picture, the electronic equipment can acquire the user response information of the picture viewer to the currently displayed picture, intelligently determines the target picture concerned by the user according to the user response information, and highlights each target picture. Therefore, when the user plays the pictures in an automatic playing mode, the user is prevented from browsing interested pictures in a manual operation mode, the intelligent degree of dynamically playing the pictures by the electronic equipment is effectively improved, and the user experience of the electronic equipment is improved.
According to the method and the device, when the electronic equipment plays the picture automatically, the user reaction information such as expression information, action information, sound information, language information and physiological characteristic information of the user can be detected automatically, and then the target picture concerned by the user is judged accurately according to the comparison between the reaction information and the corresponding preset concerned condition, so that the accuracy of judging the target picture is ensured.
In the present disclosure, the electronic device may adopt the above-mentioned intelligent target picture judgment manner, and determine the target picture according to the response information of the user to each picture in the automatic picture playing process, so as to ensure the accuracy of the target picture judgment.
According to the method and the device, under the condition that the current display picture is determined to be the target picture concerned by the user, the electronic equipment can also determine the associated picture according to the current display picture, determine the associated picture to be the target picture interested by the user, obtain the target picture set, further save time spent on judging the target picture and improve the target picture judging efficiency.
According to the method and the device, the electronic equipment can determine the user response value of the current display picture according to different types of user response information, different response degrees of the same type of response information, or different weights of the types of response information, compare the user response value with a preset threshold value, and determine that the user response information meets a preset attention condition if the user response value exceeds the preset threshold value, so that the accuracy of target picture judgment is improved.
According to the method and the device, when the electronic equipment determines the associated picture of the current display picture, the image characteristic information of the current display picture can be firstly extracted, then the similarity between the residual picture and the current display picture in the picture database is calculated, and when the similarity is higher than a preset associated threshold value, the corresponding residual picture is taken as the associated picture of the current display picture, so that the accuracy of judging the associated picture is improved, the judging time of the subsequent picture is saved, and the judging efficiency of the target picture is integrally improved.
In the disclosure, in the automatic picture playing mode, the electronic device may highlight the target picture by extending the display duration of the target picture, so as to reduce the regret that the user cannot browse the interested picture for a longer time, and improve the user experience.
According to the method and the device, the electronic equipment can also personally and automatically select different time delay display durations according to different user response values of the target picture, so that the electronic equipment can display the picture which is interested by the user more intelligently.
According to the method and the device, after the target picture concerned by the user is determined, the target picture can be marked and stored by the electronic equipment, so that the marked target picture information can be displayed in a preset highlighting mode in a follow-up static preview mode, the user can conveniently and quickly select the interested picture from the picture database, the time spent by the user for selecting the interested picture is saved, and the user experience of the electronic equipment is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an application scenario showing a display picture according to an exemplary embodiment of the present disclosure;
FIG. 3 is a flow chart illustrating another method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 4 is a flow chart illustrating another method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating another method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 6 is a flow chart illustrating another method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 7 is a flow chart illustrating another method of displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic diagram of another application scenario showing a display picture according to an example embodiment of the present disclosure;
FIG. 9 is a block diagram of an apparatus for displaying pictures according to an exemplary embodiment of the present disclosure;
FIG. 10 is a block diagram of another apparatus for displaying pictures shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 11 is a block diagram of another apparatus for displaying pictures shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 12 is a block diagram of another apparatus for displaying pictures shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 13 is a block diagram of another apparatus for displaying pictures shown in accordance with an exemplary embodiment of the present disclosure;
FIG. 14 is a block diagram of another apparatus for displaying pictures shown in accordance with an exemplary embodiment of the present disclosure;
fig. 15 is a schematic structural diagram illustrating an apparatus for displaying pictures according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
In the present disclosure, the execution body includes an electronic device that is provided with a picture viewing tool, such as a picture player, and can automatically play pictures; the electronic device may be a smart television, a smartphone, a computer, a PDA (Personal Digital Assistant), an electronic album, a tablet device, or the like. The execution body may further include a wearable device, such as a smart watch, a smart band or smart running shoes, which can detect the user's physiological characteristics and vital-sign data such as blood pressure and heart rate. The electronic device can establish a communication connection with the wearable device through at least one communication mode such as a mobile communication network, Bluetooth, WiFi (Wireless Fidelity) or infrared communication. In a specific implementation, the electronic device and the wearable device are independent of each other yet associated with each other, and jointly implement the technical solution provided by the present disclosure. The method for displaying pictures provided by the present disclosure is described in detail below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a method for displaying pictures according to an exemplary embodiment is shown, where the method is applied to an electronic device, and the method may include the following steps:
in step 11, in the process of dynamically playing the picture, obtaining user response information of the currently displayed picture;
Here, the user response information of the currently displayed picture refers to the sensory reaction information of the user to the currently displayed picture.
The reaction information of the user to the currently displayed picture is detected within the display time period of each picture. The display time period of a picture refers to the period from the start of its display to the end of its display in the preset automatic play mode.
The user response information may be at least one of the following: expression reaction information of the user, such as a smile or laughter; action reaction information, such as clapping hands or excitedly waving an arm; language reaction information, such as evaluative speech like "it looks great", "so beautiful" or "awesome"; and physiological characteristic reaction information, such as an accelerated heart rate or a blood pressure increase within a reasonable range.
In the application examples of the present disclosure, only positive reaction information, in which the currently displayed picture prompts the user to show happiness, excitement and the like, is taken as an example for explanation; this should not be construed as limiting the user's reaction information. It should be understood that the user response information may also be reaction information in which the user appears surprised, sad, and so on.
Taking a smartphone as an example of the electronic device, when user A uses the smartphone to automatically play 200 photos in a preset picture playing mode, such as a slide-show mode, the expression information, body action information and the like of the photo viewer, namely user A, can be collected through a built-in camera; the language information and sound information uttered by user A can also be collected by a sound collection device such as a microphone, where the sound information can include information such as tone and volume.
In one application scenario, the electronic device playing the photos can communicate with a wearable device worn by the photo viewer. As shown in fig. 2, the smartphone 100 establishes a communication connection through WiFi with the smart band 200 worn by user A, and while the smartphone 100 dynamically plays photos it can acquire in real time the physiological characteristic information, such as the heart rate and blood pressure data of user A, monitored by the smart band 200. In other application scenarios, the smartphone 100 may also acquire data monitored by multiple wearable devices through communication connections, so as to obtain physiological characteristic data of multiple users watching the photos together.
In step 12, if the user response information meets a preset attention condition, determining a target picture which is concerned by the user according to the current display picture;
in an embodiment of the present disclosure, the preset attention condition may include at least one of the following:
The expression information of the user conforms to preset expression characteristics. The preset expression characteristics may be a smile, laughter, a frown, and the like. For example, the mobile phone captures a smiling face of the viewer during the automatic playing of the photo IMG_101.JPG, for example within 8s.
The action information of the user meets preset action characteristics. The preset action characteristics may be nodding, clapping hands, a thumbs-up gesture, shaking the head, and the like. For example, the mobile phone captures the viewer giving a thumbs-up during the automatic playing of the photo IMG_101.JPG.
The sound information of the user meets preset sound characteristics. The preset sound characteristics may be sound types such as laughter or exclamations of surprise, and may also include the volume, the number of times a sound occurs, and the like. For example, during the automatic playing of the photo IMG_101.JPG, the mobile phone picks up the user's laughter through the microphone, and the volume of the laughter exceeds a preset decibel threshold.
The language information fed back by the user includes a preset evaluation keyword. The preset evaluation keyword may be a preset praise keyword or the like. For example, during the automatic playing of the photo IMG_101.JPG, the mobile phone collects, through the microphone, language information uttered by the user such as "really beautiful", "awesome" or "I want to print it out or post it to my friend circle".
The physiological characteristic information of the user conforms to preset physiological change characteristics; the physiological characteristic information may be, for example, the user's blood pressure or heart rate values. For example, during the automatic playing of the photo IMG_101.JPG, the mobile phone determines, from the blood pressure and heart rate data sent by the wearable device, that the user's heart rate has accelerated within a reasonable range, or that the blood pressure has risen within a reasonable range, while the user browses the photo.
When the user response information satisfies at least one of the preset attention conditions, or satisfies a preset number of types of attention conditions, the picture IMG_101.JPG is determined as a target picture that the user is interested in.
For example, if, within the preset display duration of the automatically played photo IMG_101.JPG, such as 8s, the mobile phone detects not only the user's smiling face but also the user's laughter whose volume exceeds a preset volume threshold, and further detects that the user utters language information such as "I want to post this to my friend circle", the mobile phone automatically marks the photo IMG_101.JPG as a target picture of interest to the user.
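As a non-authoritative illustration of this step, the following Python sketch checks whether detected reaction information satisfies at least one preset attention condition. It is not part of the original disclosure; the field names, keyword list and thresholds (for example the 60 dB volume threshold and the 10 bpm heart-rate change) are assumptions chosen only for the example.

    from dataclasses import dataclass, field

    @dataclass
    class ReactionInfo:
        expression: str = ""                           # e.g. "smile", "laugh", "neutral"
        actions: list = field(default_factory=list)    # e.g. ["thumbs_up", "nod"]
        sound_volume_db: float = 0.0                   # peak volume of laughter or exclamation
        speech_text: str = ""                          # speech recognized during the display period
        heart_rate_delta: float = 0.0                  # change versus resting heart rate, in bpm

    PRESET_CONDITIONS = {
        "expressions": {"smile", "laugh"},
        "actions": {"nod", "clap", "thumbs_up"},
        "volume_threshold_db": 60.0,
        "praise_keywords": {"beautiful", "awesome", "love it"},
        "heart_rate_delta_bpm": 10.0,
    }

    def meets_attention_condition(r, preset=PRESET_CONDITIONS):
        """Return True if the reaction satisfies at least one preset attention condition."""
        return (
            r.expression in preset["expressions"]
            or any(a in preset["actions"] for a in r.actions)
            or r.sound_volume_db >= preset["volume_threshold_db"]
            or any(k in r.speech_text.lower() for k in preset["praise_keywords"])
            or r.heart_rate_delta >= preset["heart_rate_delta_bpm"]
        )

    print(meets_attention_condition(ReactionInfo(expression="smile")))  # True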
In another embodiment of the present disclosure, the electronic device may further determine a user response value according to the at least one piece of user response information and a preset policy, and then accurately determine whether the user is interested in the currently displayed picture according to the magnitude of the user response value. Referring to fig. 3 specifically, according to another flowchart of a method for displaying a picture according to an exemplary embodiment, on the basis of the embodiment shown in fig. 1, the method for determining that the user response information satisfies the preset attention condition in step 12 may include:
in step 121, determining a user response value according to different types of user response information, different response degrees of the same type of response information, or weights of different types of response information;
in the embodiment of the present disclosure, the user response value of the target picture may be determined according to the user response information of the picture in any one of the following manners.
In the first mode, the user response value is determined according to different response degrees of the same type of user response information.
Assume that the same type of user response information is a smiling expression. The user device may match a corresponding smiley-face emoticon according to the detected degree of happiness of the smile, and then determine a user response value according to the matched emoticon. For example, assume that a first preset list is preset in the electronic device, and that this first preset list records the correspondence among the user reaction degree, the emoticon, and the user response value, as shown in Table 1.
Table 1 (correspondence among user reaction degree, emoticon, and user response value) is provided as an image in the original publication (Figure BDA0001367010130000111).
Suppose that a smiling expression of the user is captured while the electronic device plays the photo IMG_101.JPG; the corresponding emoticon is determined through image analysis, and the user response value is then determined by looking up Table 1 with that emoticon: 80 points.
In another embodiment of the present disclosure, assume that the same type of user response information is the number of times praise words are detected during the display of the picture. For example, assume that a second preset list is preset in the electronic device, and that it records the correspondence between the number of occurrences of praise words and the user response value, as shown in Table 2:

Table 2
    Occurrences of praise words    User response value
    1 time                         10 points
    2 to 4 times                   20 points
    5 to 8 times                   30 points
If the electronic device collects, through the sound collection device, the information that the user uttered praise 3 times while the photo IMG_101.JPG was automatically playing, Table 2 can be queried to determine the user response value: 20 points.
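The table lookups above can be sketched as follows. This is an illustrative assumption rather than the patent's implementation: Table 1 is only available as an image, so the emoticon scores are invented except for the smiling expression, which scores 80 points as in the worked example; the praise-word buckets follow Table 2.

    # Table 1 is only provided as an image, so these scores are assumptions,
    # except that a smiling expression scoring 80 points follows the worked example.
    EMOTICON_SCORES = {"slight_smile": 60, "smile": 80, "laugh": 90}

    # (min_count, max_count, score) buckets, per Table 2
    PRAISE_COUNT_SCORES = [(1, 1, 10), (2, 4, 20), (5, 8, 30)]

    def score_from_praise_count(count):
        """Map the number of detected praise utterances to a user response value."""
        for low, high, score in PRAISE_COUNT_SCORES:
            if low <= count <= high:
                return score
        return 0

    print(EMOTICON_SCORES["smile"])    # 80 points, as in the Table 1 example
    print(score_from_praise_count(3))  # 20 points, as in the Table 2 example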
In the second mode, the user response value is calculated according to different types of response information and weights of different types of response information.
In an embodiment of the present disclosure, the user response value may also be determined according to a preset policy based on the weights of different types of user response information.
For example, on the basis of the above embodiment, assume that the weighting factor of the user's smiling expression is 0.7 and the weighting factor of the number of occurrences of praise words is 0.3. According to these weighting factors and the preset lists shown in Table 1 and Table 2, the user response value for a picture to which the user shows both types of reaction can be calculated. Still taking the playing of photo IMG_101.JPG as an example, the user response value is 0.7 × 80 + 0.3 × 20 = 62 points.
It should be noted that the above is only an application example, and is not used to limit the specific generation strategy and manner of the user response value in the present disclosure.
In step 122, if the user response value is greater than a preset threshold, it is determined that the user response information satisfies a preset attention condition.
After the user response value of a picture is determined, it can be compared with a preset threshold, for example 60 points; if the user response value exceeds the preset threshold, it is determined that the user response information of the currently displayed picture satisfies the preset attention condition.
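A minimal sketch of the weighted scoring and the threshold check of step 122, reusing the numbers of the worked example (weights 0.7 and 0.3, scores 80 and 20, threshold 60 points). The function and variable names are assumptions for illustration only.

    WEIGHTS = {"expression": 0.7, "praise_count": 0.3}
    ATTENTION_THRESHOLD = 60  # points

    def user_response_value(scores, weights=WEIGHTS):
        """Weighted combination of per-type reaction scores."""
        return sum(weights.get(kind, 0.0) * value for kind, value in scores.items())

    value = user_response_value({"expression": 80, "praise_count": 20})
    print(value)                        # 62.0
    print(value > ATTENTION_THRESHOLD)  # True: the preset attention condition is satisfied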
Under the condition that the user response information meets the preset attention condition, determining a target picture interested by the user by adopting at least two modes as follows:
in the first mode, a current display picture is determined as a target picture concerned by a user;
as in the above example, when the smartphone automatically plays 200 photos in a slideshow manner according to the interval duration of 8s, if it is detected that the user response information meets the above preset attention condition within the duration of 8s during which the smartphone displays IMG _101.JPG, the currently displayed picture IMG _101.JPG is determined as the target picture that the user is interested in.
In the second mode, a target picture set is determined according to a current display picture;
in the embodiment of the present disclosure, on the basis of the first manner, not only the current display picture is determined as the target picture that the user is interested in, but also a related picture can be further searched in the designated picture database according to the current display picture, and a target picture set is determined. The specified picture database may include: the picture data corresponding to the preset time period, for example, the shooting time is: all photos from 1/5/2017 to 3/5/2017.
Referring to fig. 4, which is a flowchart illustrating another method for displaying pictures according to an exemplary embodiment, the determining, in step 12, a target picture focused by a user according to a current display picture may include:
in step 123, determining a picture associated with the currently displayed picture;
in the embodiment of the disclosure, under the condition that the current display picture meets the preset attention condition, it can be presumed that the user is interested in the same type of pictures related to the current display picture, so that the associated pictures can be searched from the remaining pictures in the preset picture database in advance, the display duration can be reset in advance, whether the pictures belong to the target pictures interested by the user or not can be judged one by one according to the user response information, and the calculation amount can be saved.
Referring to fig. 5, which is a flowchart illustrating another method for displaying pictures according to an exemplary embodiment, on the basis of the embodiment shown in fig. 4, the step 123 may include:
in step 1231, extracting image feature information of the current display picture;
The image characteristic information of the currently displayed picture may include the scene, the people, the contrast, the color, and the like. The people information may include the number of people, facial features, expression features, and other information.
In step 1232, determining similarity between the remaining pictures in the picture database and the currently displayed picture according to the image feature information;
still taking the above database containing 200 photos, such as the folder P1 as an example, when the mobile phone automatically plays the 101 st photo IMG _101.JPG, it is determined that the user response value of the IMG _101.JPG exceeds the preset threshold value by 60 points, and the preset attention condition is met. Extracting image characteristic information of the IMG _101.JPG according to a preset image processing algorithm, and calculating the similarity between the remaining 99 photos and the IMG _101. JPG.
In step 1233, if the similarity exceeds a preset association threshold, the corresponding remaining pictures are determined as the associated pictures of the currently displayed picture.
Assume that the similarity between two pictures is expressed as a percentage and that the preset association threshold is 80%. If the similarity between 9 of the 99 remaining pictures and the currently displayed picture IMG_101.JPG exceeds 80%, those 9 pictures are determined to be associated pictures of IMG_101.JPG.
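Steps 1231 to 1233 could be sketched as follows. The patent does not specify the feature extractor or the similarity measure, so cosine similarity over precomputed feature vectors is used here purely as an assumed example; only the 80% association threshold comes from the text.

    import numpy as np

    ASSOCIATION_THRESHOLD = 0.80  # the "80%" association threshold from the example

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def associated_pictures(current_features, remaining_features):
        """remaining_features maps picture name -> feature vector; returns associated names."""
        return [name for name, vec in remaining_features.items()
                if cosine_similarity(current_features, vec) > ASSOCIATION_THRESHOLD]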
In step 124, the current display picture and the associated picture of the current display picture are determined as a target picture set.
As in the above example, the currently displayed picture IMG_101.JPG and the 9 associated photos similar to it are determined as the target picture set.
In step 13, the target picture is highlighted.
In the embodiment of the present disclosure, in the automatic picture playing mode, the target picture may be highlighted by extending the display duration.
According to different delay display strategies, the automatic picture playing in a delay display mode can include at least two situations:
in the first case, the target picture is displayed according to the unified preset extended display duration.
For example, while the electronic device automatically plays pictures with a preset play duration of, for example, 8s per picture, if the currently displayed picture is detected to be a target picture that the user pays attention to before its preset 8s play duration ends, the display duration of the currently displayed picture is automatically extended, for example to 10s. In another embodiment, all the pictures in the target picture set to which the currently displayed picture belongs may be displayed with the preset extended display duration of 10s.
And in the second situation, according to different user response values of the target pictures, displaying the target pictures according to different extended display durations when the target pictures are automatically played.
Referring to fig. 6, which is a flowchart illustrating another method for displaying pictures according to an exemplary embodiment, on the basis of the embodiment shown in fig. 3, the step 13 may include:
in step 131, inquiring a preset display list according to the user response value of the target picture, and determining the reset display duration of the target picture;
Here, the preset display list includes the correspondence between the user response value of a target picture and the reset display duration. The higher the user response value, the longer the corresponding reset display duration. The reset display duration is obtained by extending the original display duration.
For example, the preset display list may be as shown in Table 3:

Table 3
    User response value    Reset display duration
    60 to 69 points        10s
    70 to 79 points        12s
    80 to 89 points        14s
    90 to 99 points        15s
Suppose the user response values of the 4 target pictures PIG1, PIG2, PIG3 and PIG4 are 65 points, 70 points, 80 points and 90 points, respectively. Then, by looking up Table 3, the reset display durations of PIG1, PIG2, PIG3 and PIG4 are 10s, 12s, 14s and 15s, respectively.
It will be readily appreciated that Table 3 is merely illustrative, provided for a further understanding of the present disclosure. Its entries may be pre-defined by the user, or may be preset when the operating system of the electronic device is initialized.
In step 132, in the automatic playing mode, the target picture is automatically displayed according to the reset display duration.
Assume that the automatic play mode described above is a "slide show" mode, in which the original play duration of each picture is 8s. After the 4 pictures are determined to be target pictures that the user pays attention to, when their turn in the play sequence arrives, the electronic device displays PIG1, PIG2, PIG3 and PIG4 for 10s, 12s, 14s and 15s, respectively. That is, in the current automatic play mode, the display duration of a target picture is dynamically adjusted to its reset display duration; alternatively, the adjustment takes effect when the user restarts the automatic play mode. This extends the time the user has to browse the target pictures, meets the user's wish to spend more time viewing pictures of interest, and improves the user's picture-viewing experience.
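A sketch of steps 131 and 132, assuming the buckets of Table 3 and the 8s default slideshow interval from the example; the playback loop and function names are illustrative assumptions, not the device's actual display API.

    import time

    DISPLAY_LIST = [(60, 69, 10), (70, 79, 12), (80, 89, 14), (90, 99, 15)]  # per Table 3
    DEFAULT_DURATION = 8  # original slideshow interval in the example, in seconds

    def reset_duration(response_value):
        """Return the reset display duration for a target picture with this response value."""
        for low, high, seconds in DISPLAY_LIST:
            if low <= response_value <= high:
                return seconds
        return DEFAULT_DURATION

    def show_picture(name):
        print(f"displaying {name}")  # placeholder for the device's actual display call

    def auto_play(pictures, response_values):
        """pictures: ordered list of names; response_values: name -> score of target pictures."""
        for name in pictures:
            show_picture(name)
            time.sleep(reset_duration(response_values.get(name, 0)))

    # auto_play(["PIG1", "PIG4"], {"PIG1": 65, "PIG4": 90}) shows them for 10s and 15s.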
Referring to fig. 7, which is a flowchart illustrating another method for displaying pictures according to an exemplary embodiment, on the basis of the embodiment illustrated in fig. 1, the method may further include:
in step 14, the target picture is marked and stored.
In the present disclosure, the electronic device may further mark the target picture and store the marked target picture data, so that when the electronic device displays the target picture information in the static preview mode, triggered by the user, it can highlight it in ways such as marking, pinning to the top, highlighting, or animation.
When the user needs to browse the picture database statically, for example when previewing the picture database, a preset highlighting mode such as pinning to the top, highlighting, displaying a marker, or dynamic display can be adopted to highlight the target picture information, so that the user can quickly find the pictures of interest. This shortens the time the user spends selecting pictures of interest, improves the intelligence of the electronic device, and improves the user experience.
Referring to fig. 8, which is a schematic diagram of another application scenario for displaying pictures according to an exemplary embodiment: after the smartphone 100 determines the 4 target pictures PIG1, PIG2, PIG3 and PIG4, when the user reopens the picture database, such as a folder of trip photos, the smartphone 100 automatically pins PIG1, PIG2, PIG3 and PIG4 to the top of the preview display interface and informs the user, through a "√" marker, that these four photos may be of interest, helping the user quickly select the photos of interest, such as photos to be printed or photos to be posted to a friend circle.
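As an assumed illustration of step 14 and the pinned preview shown in fig. 8 (not the patent's actual implementation), the sketch below sorts marked target pictures to the top of a static preview and prefixes them with the "√" marker.

    def preview_order(all_pictures, marked):
        """Pin marked target pictures to the top of the static preview, with a check mark."""
        pinned = ["√ " + name for name in all_pictures if name in marked]
        others = [name for name in all_pictures if name not in marked]
        return pinned + others

    print(preview_order(["IMG_001", "PIG1", "IMG_002", "PIG2"], {"PIG1", "PIG2"}))
    # ['√ PIG1', '√ PIG2', 'IMG_001', 'IMG_002']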
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combinations of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the described order of acts, as some steps may, in accordance with the present disclosure, be performed in other orders or concurrently.
Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules referred to are not necessarily required by the disclosure.
Corresponding to the embodiment of the application function implementation method, the disclosure also provides an embodiment of an application function implementation device and a corresponding terminal.
Referring to fig. 9, a block diagram of an apparatus for displaying pictures according to an exemplary embodiment is shown, the apparatus may include:
the response information acquiring module 21 is configured to acquire user response information of a currently displayed picture in the process of dynamically playing the picture;
in an embodiment of the present disclosure, the user response information acquired by the response information acquiring module 21 may include at least one of the following:
expression information, motion information, sound information, language information, physiological characteristic information.
The user response information meets preset attention conditions and comprises at least one of the following items:
the expression information of the user accords with preset expression characteristics;
the action information of the user meets the preset action characteristics;
the voice information of the user meets the preset voice characteristics;
the language information of the user comprises preset evaluation keywords;
the physiological characteristic information of the user conforms to the preset physiological change characteristic.
A target determining module 22, configured to determine a target picture that the user pays attention to according to the currently displayed picture when the user response information satisfies a preset attention condition;
in an embodiment of the present disclosure, the target determination module 22 may be configured to determine the current display picture as a target picture focused by the user.
A highlighting module 23 configured to highlight the target picture.
In an embodiment of the present disclosure, the highlighting module 23 may be configured to display the target picture in a manner of extending a display duration.
Referring to fig. 10, which is a block diagram of another apparatus for displaying pictures according to an exemplary embodiment, on the basis of the apparatus embodiment shown in fig. 9, the target determining module 22 may include:
an associated picture determining sub-module 221 configured to determine an associated picture of the currently displayed picture;
a target determination sub-module 222 configured to determine the current display picture and the associated picture of the current display picture as a target picture set.
Referring to fig. 11, which is a block diagram of another apparatus for displaying pictures according to an exemplary embodiment, on the basis of the embodiment of the apparatus shown in fig. 10, the target determining module 22 may further include:
the reaction value determining submodule 22-1 is configured to determine a user reaction value according to different types of user reaction information, different reaction degrees of the same type of reaction information, or weights of different types of reaction information;
an attention determining submodule 22-2 configured to determine that the user response information satisfies a preset attention condition if the user response value is greater than a preset threshold.
Referring to fig. 12, which is a block diagram of another apparatus for displaying pictures according to an exemplary embodiment, on the basis of the apparatus embodiment shown in fig. 10 or fig. 11, the associated picture determining sub-module 221 may include:
a feature extraction unit 2211 configured to extract image feature information of the current display picture;
a similarity determining unit 2212 configured to determine similarity between the remaining pictures in the picture database and the current display picture according to the image feature information;
an association determining unit 2213 configured to determine, if the similarity exceeds a preset association threshold, the corresponding remaining picture as an associated picture of the currently displayed picture.
Referring to fig. 13, which is a block diagram of another apparatus for displaying pictures according to an exemplary embodiment, on the basis of the embodiment of the apparatus shown in fig. 11, the highlighting module 23 may include:
a display duration determining sub-module 231, configured to query a preset display list according to the user response value of the target picture, to obtain a reset display duration of the target picture, where the preset display list includes: the corresponding relation between the user response value of the target picture and the reset display duration;
and the delay display sub-module 232 is configured to automatically display the target picture according to the reset display duration in the automatic play mode.
Referring to fig. 14, which is a block diagram of another apparatus for displaying pictures according to an exemplary embodiment, on the basis of the embodiment of the apparatus shown in fig. 9, the apparatus may further include:
and the target storage module 24 is configured to mark and store the target picture.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Accordingly, in one aspect, an embodiment of the present disclosure provides an apparatus for displaying pictures, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture concerned by the user according to the current display picture;
highlighting the target picture.
Fig. 15 is a schematic structural diagram illustrating an apparatus 1500 for displaying pictures according to an exemplary embodiment. For example, the apparatus 1500 may be a user device, which may be embodied as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, a wearable device such as a smart watch, smart glasses, a smart bracelet, a smart running shoe, and the like.
Referring to fig. 15, apparatus 1500 may include one or more of the following components: processing components 1502, memory 1504, power components 1506, multimedia components 1508, audio components 1510, input/output (I/O) interfaces 1512, sensor components 1514, and communication components 1516.
The processing component 1502 generally controls overall operation of the device 1500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1502 may include one or more processors 1520 executing instructions to perform all or a portion of the steps of the methods described above. Further, processing component 1502 may include one or more modules that facilitate interaction between processing component 1502 and other components. For example, processing component 1502 may include a multimedia module to facilitate interaction between multimedia component 1508 and processing component 1502.
The memory 1504 is configured to store various types of data to support operation at the device 1500. Examples of such data include instructions for any application or method operating on the device 1500, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1504 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 1506 provides power to the various components of the device 1500. The power components 1506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 1500.
The multimedia component 1508 includes a screen that provides an output interface between the device 1500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1508 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 1500 is in an operating mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1510 is configured to output and/or input audio signals. For example, the audio component 1510 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1500 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1504 or transmitted via the communication component 1516. In some embodiments, audio component 1510 also includes a speaker for outputting audio signals.
The I/O interface 1512 provides an interface between the processing component 1502 and peripheral interface modules, which can be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1514 includes one or more sensors for providing status assessments of various aspects of the apparatus 1500. For example, the sensor assembly 1514 can detect an open/closed state of the device 1500 and the relative positioning of components, such as the display and keypad of the apparatus 1500. The sensor assembly 1514 can also detect a change in position of the apparatus 1500 or a component of the apparatus 1500, the presence or absence of user contact with the apparatus 1500, the orientation or acceleration/deceleration of the apparatus 1500, and a change in temperature of the apparatus 1500. The sensor assembly 1514 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1516 is configured to facilitate wired or wireless communication between the apparatus 1500 and other devices. The apparatus 1500 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1516 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1516 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, there is also provided a non-transitory computer readable storage medium, such as the memory 1504 including instructions that, when executed by the processor 1520 of the apparatus 1500, enable the apparatus 1500 to perform a method of displaying pictures, the method comprising:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture that the user pays attention to according to the currently displayed picture;
highlighting the target picture.
The non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A method for displaying pictures, the method comprising:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture that the user pays attention to according to the currently displayed picture, wherein the determining comprises: determining a user response value according to the response types of a plurality of pieces of response information of the user, and the response value and the weight factor of each response type; and if the user response value is greater than a preset threshold, determining that the user response information meets the preset attention condition, and determining the target picture that the user pays attention to according to the currently displayed picture;
highlighting the target picture, comprising: querying a preset display list according to the user response value of the target picture to obtain a reset display duration of the target picture, wherein the preset display list comprises a correspondence between user response values of target pictures and reset display durations; and, in an automatic play mode, automatically displaying the target picture according to the reset display duration.
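Claim 1 combines two small computations: a weighted aggregation of per-type response values compared against a preset threshold, and a lookup of a reset display duration in a preset display list. The sketch below illustrates one plausible reading of both; the weight factors, the 0.5 threshold, and the duration table are invented for the example and are not values fixed by the patent.

```python
# Per-type weight factors and the preset display list are assumed example values.
TYPE_WEIGHTS = {"expression": 0.4, "action": 0.2, "sound": 0.2, "language": 0.1, "physiological": 0.1}

# Preset display list: (lower bound of user response value, reset display duration in seconds).
PRESET_DISPLAY_LIST = [(0.9, 10.0), (0.7, 6.0), (0.5, 4.0)]

def user_response_value(responses):
    """responses: mapping of response type -> per-type response value in [0, 1]."""
    return sum(TYPE_WEIGHTS.get(kind, 0.0) * value for kind, value in responses.items())

def reset_display_duration(response_value, default=2.0):
    """Query the preset display list for the reset display duration of a target picture."""
    for lower_bound, duration in PRESET_DISPLAY_LIST:
        if response_value >= lower_bound:
            return duration
    return default

responses = {"expression": 0.9, "sound": 0.6, "language": 1.0}  # e.g. smile + laughter + "great shot"
value = user_response_value(responses)                          # 0.4*0.9 + 0.2*0.6 + 0.1*1.0 = 0.58
if value > 0.5:                                                 # preset threshold (assumed)
    print(f"target picture; display for {reset_display_duration(value)} s in automatic play mode")
```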
2. The method of claim 1, wherein the user response information comprises at least one of the following:
expression information, action information, sound information, language information, and physiological characteristic information;
and the user response information meeting the preset attention condition comprises at least one of the following:
the expression information of the user conforms to a preset expression characteristic;
the action information of the user conforms to a preset action characteristic;
the sound information of the user conforms to a preset sound characteristic;
the language information of the user comprises a preset evaluation keyword;
the physiological characteristic information of the user conforms to a preset physiological change characteristic.
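Claim 2 enumerates per-type checks, any one of which can satisfy the attention condition. The sketch below illustrates a subset of them (expression, action, evaluation keyword, and a physiological change) under assumed preset values; the labels, keyword list, and heart-rate delta are placeholders, and the sound-characteristic check is omitted for brevity.

```python
PRESET_EXPRESSIONS = {"smile", "surprise"}            # preset expression characteristics
PRESET_ACTIONS = {"lean_in", "point_at_screen"}       # preset action characteristics
PRESET_EVALUATION_KEYWORDS = {"beautiful", "great", "love this"}
HEART_RATE_DELTA_BPM = 15                             # preset physiological change characteristic

def meets_attention_condition(response):
    """response: dict with optional keys 'expression', 'action', 'speech_text', 'heart_rate_delta'."""
    speech = response.get("speech_text", "").lower()
    checks = [
        response.get("expression") in PRESET_EXPRESSIONS,
        response.get("action") in PRESET_ACTIONS,
        any(keyword in speech for keyword in PRESET_EVALUATION_KEYWORDS),
        response.get("heart_rate_delta", 0) >= HEART_RATE_DELTA_BPM,
    ]
    return any(checks)

print(meets_attention_condition({"expression": "smile"}))                 # True
print(meets_attention_condition({"speech_text": "I love this picture"}))  # True
print(meets_attention_condition({"action": "walk_away"}))                 # False
```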
3. The method of claim 1, wherein determining a target picture that the user pays attention to according to the currently displayed picture comprises:
determining the currently displayed picture as the target picture that the user pays attention to.
4. The method according to claim 1, wherein, if the user response information meets the preset attention condition, determining a target picture that the user pays attention to according to the currently displayed picture comprises:
determining an associated picture of the currently displayed picture; and
determining the currently displayed picture and the associated picture of the currently displayed picture as a target picture set.
5. The method of claim 4, wherein determining the associated picture of the currently displayed picture comprises:
extracting image feature information of the currently displayed picture;
determining, according to the image feature information, the similarity between each of the remaining pictures in a picture database and the currently displayed picture; and
if the similarity exceeds a preset association threshold, determining the corresponding remaining picture as an associated picture of the currently displayed picture.
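Claim 5 describes a feature-based association step: extract features from the currently displayed picture, score its similarity against the remaining pictures in the database, and keep those above a preset association threshold. The sketch below shows this with cosine similarity over toy three-element feature vectors; the feature representation, the similarity measure, and the 0.9 threshold are assumptions rather than details fixed by the patent.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def associated_pictures(current_id, features, association_threshold=0.9):
    """features: mapping of picture id -> feature vector; returns ids associated with current_id."""
    current = features[current_id]
    return [pid for pid, vec in features.items()
            if pid != current_id and cosine_similarity(current, vec) > association_threshold]

# Toy picture database with hand-made 3-element feature vectors.
feature_db = {
    "beach_1.jpg": [0.8, 0.1, 0.1],
    "beach_2.jpg": [0.7, 0.2, 0.1],
    "city_1.jpg":  [0.1, 0.2, 0.7],
}
print(associated_pictures("beach_1.jpg", feature_db))  # ['beach_2.jpg']
```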
6. The method of claim 1, further comprising:
marking and storing the target picture.
7. An apparatus for displaying pictures, the apparatus comprising:
a response information acquisition module configured to acquire user response information of a currently displayed picture in the process of dynamically playing the picture;
a target determination module configured to determine, in a case where the user response information meets a preset attention condition, a target picture that the user pays attention to according to the currently displayed picture, wherein the determining comprises: determining a user response value according to the response types of a plurality of pieces of response information of the user, and the response value and the weight factor of each response type; and if the user response value is greater than a preset threshold, determining that the user response information meets the preset attention condition, and determining the target picture that the user pays attention to according to the currently displayed picture;
a highlighting module configured to highlight the target picture, wherein the highlighting comprises: querying a preset display list according to the user response value of the target picture to obtain a reset display duration of the target picture, wherein the preset display list comprises a correspondence between user response values of target pictures and reset display durations; and, in an automatic play mode, automatically displaying the target picture according to the reset display duration.
8. The apparatus according to claim 7, wherein the user response information acquired by the response information acquisition module comprises at least one of the following:
expression information, action information, sound information, language information, and physiological characteristic information;
and the user response information meeting the preset attention condition comprises at least one of the following:
the expression information of the user conforms to a preset expression characteristic;
the action information of the user conforms to a preset action characteristic;
the sound information of the user conforms to a preset sound characteristic;
the language information of the user comprises a preset evaluation keyword;
the physiological characteristic information of the user conforms to a preset physiological change characteristic.
9. The apparatus of claim 7, wherein the target determination module is configured to determine the currently displayed picture as the target picture that the user pays attention to.
10. The apparatus of claim 7, wherein the target determination module comprises:
an associated picture determination submodule configured to determine an associated picture of the currently displayed picture; and
a target determination submodule configured to determine the currently displayed picture and the associated picture of the currently displayed picture as a target picture set.
11. The apparatus of claim 10, wherein the associated picture determination submodule comprises:
a feature extraction unit configured to extract image feature information of the currently displayed picture;
a similarity determination unit configured to determine, according to the image feature information, the similarity between each of the remaining pictures in the picture database and the currently displayed picture; and
an association determination unit configured to determine the corresponding remaining picture as an associated picture of the currently displayed picture in a case where the similarity exceeds a preset association threshold.
12. The apparatus of claim 7, further comprising:
a target storage module configured to mark and store the target picture.
13. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the steps of the method of any of claims 1 to 6.
14. An apparatus for displaying pictures, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in the process of dynamically playing the picture, acquiring user response information of the currently displayed picture;
if the user response information meets a preset attention condition, determining a target picture that the user pays attention to according to the currently displayed picture, wherein the determining comprises: determining a user response value according to the response types of a plurality of pieces of response information of the user, and the response value and the weight factor of each response type; and if the user response value is greater than a preset threshold, determining that the user response information meets the preset attention condition, and determining the target picture that the user pays attention to according to the currently displayed picture;
highlighting the target picture, comprising: querying a preset display list according to the user response value of the target picture to obtain a reset display duration of the target picture, wherein the preset display list comprises a correspondence between user response values of target pictures and reset display durations; and, in an automatic play mode, automatically displaying the target picture according to the reset display duration.
CN201710646777.9A 2017-08-01 2017-08-01 Method and device for displaying picture Active CN110019897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710646777.9A CN110019897B (en) 2017-08-01 2017-08-01 Method and device for displaying picture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710646777.9A CN110019897B (en) 2017-08-01 2017-08-01 Method and device for displaying picture

Publications (2)

Publication Number Publication Date
CN110019897A CN110019897A (en) 2019-07-16
CN110019897B true CN110019897B (en) 2022-02-08

Family

ID=67186059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710646777.9A Active CN110019897B (en) 2017-08-01 2017-08-01 Method and device for displaying picture

Country Status (1)

Country Link
CN (1) CN110019897B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10820060B1 (en) * 2018-06-27 2020-10-27 Facebook, Inc. Asynchronous co-watching
CN110990212A (en) * 2019-10-15 2020-04-10 厦门美柚股份有限公司 Control state detection method and device
CN113849142B (en) * 2021-09-26 2024-05-28 深圳市火乐科技发展有限公司 Image display method, device, electronic equipment and computer readable storage medium
CN116861019A (en) * 2022-01-14 2023-10-10 荣耀终端有限公司 Picture display method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101495945A (en) * 2006-07-28 2009-07-29 皇家飞利浦电子股份有限公司 Gaze interaction for information display of gazed items
CN105339969A (en) * 2013-04-18 2016-02-17 微软技术许可有限责任公司 Linked advertisements
CN105893490A (en) * 2016-03-29 2016-08-24 努比亚技术有限公司 Picture display device and method
CN106528689A (en) * 2016-10-24 2017-03-22 北京小米移动软件有限公司 Page content displaying method and device, and electronic device
CN106971317A (en) * 2017-03-09 2017-07-21 杨伊迪 The advertisement delivery effect evaluation analyzed based on recognition of face and big data and intelligently pushing decision-making technique

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6867807B2 (en) * 2001-09-04 2005-03-15 Eastman Kodak Company Camera having single-button timed display of most-recently viewed image and default display of last verification image and method
US8264364B2 (en) * 2008-09-08 2012-09-11 Phillip Roger Sprague Psychophysiological touch screen stress analyzer
CN103207662A (en) * 2012-01-11 2013-07-17 联想(北京)有限公司 Method and device for obtaining physiological characteristic information
CN104681048A (en) * 2013-11-28 2015-06-03 索尼公司 Multimedia read control device, curve acquiring device, electronic equipment and curve providing device and method
WO2016032806A1 (en) * 2014-08-26 2016-03-03 Apple Inc. User interface for limiting notifications and alerts
KR102400014B1 (en) * 2014-12-26 2022-05-20 삼성전자주식회사 Method and device for performing service using data brodacasting of a mobile device
CN106383870A (en) * 2016-09-05 2017-02-08 广东欧珀移动通信有限公司 Picture playing method and mobile terminal

Also Published As

Publication number Publication date
CN110019897A (en) 2019-07-16

Similar Documents

Publication Publication Date Title
CN106776890B (en) Method and device for adjusting video playing progress
CN107454465B (en) Video playing progress display method and device and electronic equipment
CN105845124B (en) Audio processing method and device
CN113115099B (en) Video recording method and device, electronic equipment and storage medium
EP3179408A2 (en) Picture processing method and apparatus, computer program and recording medium
CN110019897B (en) Method and device for displaying picture
CN110677734B (en) Video synthesis method and device, electronic equipment and storage medium
CN109189986B (en) Information recommendation method and device, electronic equipment and readable storage medium
CN106095465B (en) Method and device for setting identity image
CN111753135B (en) Video display method, device, terminal, server, system and storage medium
CN107423386B (en) Method and device for generating electronic card
CN113099297B (en) Method and device for generating click video, electronic equipment and storage medium
CN109257649B (en) Multimedia file generation method and terminal equipment
CN111800652A (en) Video processing method and device, electronic equipment and storage medium
CN113032627A (en) Video classification method and device, storage medium and terminal equipment
CN114245154B (en) Method and device for displaying virtual articles in game live broadcast room and electronic equipment
CN107229707B (en) Method and device for searching image
CN111629270A (en) Candidate item determination method and device and machine-readable medium
CN106447747B (en) Image processing method and device
CN113157972A (en) Recommendation method and device for video cover documents, electronic equipment and storage medium
CN111698532B (en) Bullet screen information processing method and device
CN112000840A (en) Business object display method and device
CN112004033B (en) Video cover determining method and device and storage medium
CN110769282A (en) Short video generation method, terminal and server
CN114189719A (en) Video information extraction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant