CN116302293A - Picture display method and device


Info

Publication number
CN116302293A
Authority
CN
China
Prior art keywords
pictures
picture
candidate pictures
candidate
index
Prior art date
Legal status
Pending
Application number
CN202310558298.7A
Other languages
Chinese (zh)
Inventor
魏巍
宓振鹏
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202310558298.7A
Publication of CN116302293A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of terminals, and in particular to a picture display method and device. The picture display method includes: first, selecting a plurality of candidate pictures from captured pictures, and screening a plurality of target pictures from the candidate pictures according to color features of the candidate pictures; then, generating a picture set based on the target pictures; and finally, displaying a first page in response to a trigger operation on a first control in a target application interface, and displaying the picture set on the first page. With this technical solution, the pictures in the same picture set can have a uniform tone, which reduces visual jumpiness for the user, makes the picture set more aesthetically pleasing as a whole, and improves user experience.

Description

Picture display method and device
Technical Field
The application relates to the technical field of terminals, and in particular to a picture display method and device.
Background
Picture applications are installed on many terminal devices and can provide different pages to display pictures taken by the user in different forms. In general, a gallery application may provide pages such as a "Photos" page, an "Albums" page, and a "Moments" page. The "Moments" page displays pictures in the form of picture sets, and the pictures contained in each picture set are selected from the captured pictures according to specific principles such as shooting time and place. However, when browsing the picture sets on the "Moments" page, the user often finds that the pictures contained in the same picture set look disordered as a whole and lack aesthetic appeal, resulting in a poor user experience.
Disclosure of Invention
The application provides a picture display method and device, which can further screen the preliminarily selected candidate pictures according to the color distribution of each image, obtaining pictures with smaller color differences to generate a picture set for display, so that the picture set has a better overall appearance.
In a first aspect, the present technical solution provides a picture display method, including: selecting a plurality of candidate pictures from captured pictures; screening a plurality of target pictures from the plurality of candidate pictures according to color features of the plurality of candidate pictures; generating a picture set based on the plurality of target pictures; displaying a first page in response to a trigger operation on a first control in a target application interface; and displaying the picture set on the first page.
In the above implementation, after the candidate pictures for the picture set are selected, they can be further screened based on the tone features of each candidate picture, so that pictures with a large tone difference are filtered out. On this basis, the pictures in the same picture set can have a uniform tone, visual jumpiness is alleviated, each picture in the picture set looks better, and user experience is improved.
With reference to the first aspect, in certain implementation manners of the first aspect, selecting a plurality of candidate pictures from the captured pictures includes: detecting the current time; and responding to the current time reaching a preset period node, and selecting a plurality of candidate pictures from the shot pictures.
With reference to the first aspect, in some implementations of the first aspect, in response to the current time reaching a preset period node, selecting a plurality of candidate pictures from the captured pictures includes: detecting the current equipment state in response to the current time reaching a preset period node; and selecting a plurality of candidate pictures from the shot pictures in response to the device power exceeding the power threshold.
In this implementation, execution of the picture display method provided by the application is started only after it is determined that the device state meets a preset condition, which prevents the device from further increasing power consumption and resource occupation in a low-battery or high-load state.
With reference to the first aspect, in some implementations of the first aspect, selecting, according to color features of the plurality of candidate pictures, a plurality of target pictures from the plurality of candidate pictures includes: calculating a first index value among the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures; and screening a plurality of target pictures from the plurality of candidate pictures according to the first index value.
With reference to the first aspect, in certain implementation manners of the first aspect, calculating a first index value between the plurality of candidate pictures according to color features of the plurality of candidate pictures includes: respectively generating chromaticity distribution diagrams corresponding to a plurality of candidate pictures; and calculating a first index value among the plurality of candidate pictures according to the chromaticity distribution diagrams corresponding to the plurality of candidate pictures.
With reference to the first aspect, in some implementations of the first aspect, calculating, according to a chromaticity distribution map corresponding to the plurality of candidate pictures, a first index value between the plurality of candidate pictures includes: selecting a reference picture from a plurality of candidate pictures; and respectively calculating first index values between the reference picture and each other candidate picture according to the chromaticity distribution diagram corresponding to each reference picture and each other candidate picture.
With reference to the first aspect, in some implementations of the first aspect, selecting, according to the first index value, a plurality of target pictures from a plurality of candidate pictures includes: determining the picture to be removed according to the statistical characteristics of each first index value; and removing the picture to be removed from the candidate pictures to obtain a plurality of target pictures.
With reference to the first aspect, in some implementations of the first aspect, determining the picture to be removed according to the statistical features of the first index values includes: calculating the average value of each first index value respectively; determining a target distribution range of the first index values according to the average values; and determining the picture to be removed according to the target distribution range, where the first index value corresponding to the picture to be removed is located outside the target distribution range.
With reference to the first aspect, in certain implementation manners of the first aspect, before calculating the first index value between the plurality of candidate pictures according to the color features of the plurality of candidate pictures, the method further includes: respectively calculating second index values of the plurality of candidate pictures according to the content characteristics of the plurality of candidate pictures; and determining the index type of the first index value according to the second index value.
In this implementation, different types of first index values can be selected for chromaticity comparison of the candidate pictures based on the complexity of their content, which avoids unnecessary computation while ensuring accuracy.
With reference to the first aspect, in certain implementations of the first aspect, the second index value includes a hash value; determining the index type of the first index value according to the second index value includes: when the difference between the second index values of the candidate pictures is smaller than a set threshold, determining that the first index value includes at least one index type; and when the difference between the second index values of the candidate pictures is greater than the set threshold, determining that the first index value includes multiple index types.
With reference to the first aspect, in certain implementations of the first aspect, the at least one index type includes a centroid; the multiple index types include the centroid, a correlation coefficient, a Bhattacharyya distance, an Earth Mover's Distance, and a JS divergence.
In a second aspect, the present technical solution provides a picture display device, including: a selecting unit for selecting a plurality of candidate pictures from the photographed pictures; screening a plurality of target pictures from the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures; a generation unit configured to generate a picture set based on a plurality of target pictures; the display unit is used for responding to the triggering operation of the first control in the target application interface and displaying a first page; the picture set is displayed on a first page.
In the above implementation, after the candidate pictures for the picture set are selected, the picture display device may further screen each candidate picture based on its tone features, so as to filter out pictures with a large tone difference. On this basis, the pictures in the same picture set can have a uniform tone, visual jumpiness is alleviated, each picture in the picture set looks better, and user experience is improved.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to detect a current time; and responding to the current time reaching a preset period node, and selecting a plurality of candidate pictures from the shot pictures.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to detect a current device state in response to a current time reaching a preset period node; and selecting a plurality of candidate pictures from the shot pictures in response to the device power exceeding the power threshold.
In the above implementation, execution of the picture display method provided by the application is started only after it is determined that the device state meets a preset condition, which prevents the device from further increasing power consumption and resource occupation in a low-battery or high-load state.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to calculate a first index value between the plurality of candidate pictures according to color features of the plurality of candidate pictures; and screening a plurality of target pictures from the plurality of candidate pictures according to the first index value.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to generate chromaticity distribution graphs corresponding to the multiple candidate pictures respectively; and calculating a first index value among the plurality of candidate pictures according to the chromaticity distribution diagrams corresponding to the plurality of candidate pictures.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to select a reference picture from a plurality of candidate pictures; and respectively calculating first index values between the reference picture and each other candidate picture according to the chromaticity distribution diagram corresponding to each reference picture and each other candidate picture.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to determine, according to statistical features of each first index value, a picture to be removed; and removing the picture to be removed from the candidate pictures to obtain a plurality of target pictures.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is specifically configured to calculate an average value of each first index value separately; determining a target distribution range of the first index values according to the average value of the first index values; and determining the picture to be removed according to the target distribution range, wherein a first index value corresponding to the picture to be removed is located outside the target distribution range.
With reference to the second aspect, in some implementations of the second aspect, the selecting unit is further configured to calculate second index values of the plurality of candidate pictures according to content features of the plurality of candidate pictures, respectively; and determining the index type of the first index value according to the second index value.
In this implementation, different types of first index values can be selected for chromaticity comparison of the candidate pictures based on the complexity of their content, which avoids unnecessary computation while ensuring accuracy.
With reference to the second aspect, in certain implementations of the second aspect, the second index value includes a hash value; the selecting unit is specifically configured to: determine that the first index value includes at least one index type when the difference between the second index values of the candidate pictures is smaller than a set threshold; and determine that the first index value includes multiple index types when the difference between the second index values of the candidate pictures is greater than the set threshold.
With reference to the second aspect, in certain implementations of the second aspect, the at least one index type includes a centroid; the multiple index types include the centroid, a correlation coefficient, a Bhattacharyya distance, an Earth Mover's Distance, and a JS divergence.
In a third aspect, the present technical solution provides a picture display device, including: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions that, when executed by the device, cause the device to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a fourth aspect, the present technical solution provides a picture display device, where the device includes a storage medium and a central processing unit, where the storage medium may be a non-volatile storage medium, where a computer executable program is stored in the storage medium, and where the central processing unit is connected to the non-volatile storage medium and executes the computer executable program to implement the first aspect or a method in any possible implementation manner of the first aspect.
In a fifth aspect, the present technical solution provides a chip, the chip including a processor and a data interface, the processor reading instructions stored on a memory through the data interface, and executing the method in the first aspect or any possible implementation manner of the first aspect.
Optionally, as an implementation manner, the chip may further include a memory, where the memory stores instructions, and the processor is configured to execute the instructions stored on the memory, where the instructions, when executed, are configured to perform the method in the first aspect or any of the possible implementation manners of the first aspect.
In a sixth aspect, the present technical solution provides a computer readable storage medium storing program code for execution by a device, the program code comprising instructions for performing the method of the first aspect or any possible implementation of the first aspect.
Drawings
FIG. 1 is a schematic scene diagram of a picture display method provided in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of a picture display method provided in an embodiment of the present application;
FIG. 5 is another schematic scene diagram of a picture display method provided by an embodiment of the present application;
FIG. 6 is another schematic scene diagram of a picture display method provided by an embodiment of the present application;
FIG. 7 is another schematic scene diagram of a picture display method provided by an embodiment of the present application;
FIG. 8 is another schematic scene diagram of a picture display method provided by an embodiment of the present application;
FIG. 9 is another schematic flowchart of a picture display method provided in an embodiment of the present application;
FIG. 10 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Prior to describing the embodiments of the present application, a description will be given first of the related art.
In the embodiment of the present application, a picture application may include any application with a picture display function; common picture applications include gallery applications (also referred to as "album applications", "photo applications", etc.) and resource storage applications. A picture application can be used to display pictures taken by the user, where those pictures may have been taken and stored with the device itself, or taken with other devices and transmitted to the device for storage.
For convenience of description, the above-mentioned picture applications are referred to as target applications in this application. Fig. 1 is a schematic diagram of a target application interface. As shown in Fig. 1, the target application interface 700 may include a plurality of trigger controls, such as a photo control, an album control, a moment control, and an authoring control. The trigger controls may be used to trigger the display of different picture display pages, each of which may display the pictures taken by the user in a different display form. For example, the "Photos" page triggered by the photo control may display pictures sequentially according to shooting time; the "Albums" page triggered by the album control may display pictures classified by picture source; and the "Moments" page triggered by the moment control may display pictures in the form of picture sets.
For the "time" page, the pictures contained in each picture set are selected from the pictures shot by the user according to specific principles such as shooting time and place, and are used for displaying the highlight pictures recorded at the specific time and place to the user, and have certain aesthetic properties. However, when a user browses each picture set, the user often finds that the pictures contained in the same picture set are disordered in overall appearance, and the aesthetic feeling is not satisfied, so that the aesthetic interest of the user cannot be satisfied, and the expected ornamental target of the user cannot be reached.
The present application is proposed to solve the above-described problems.
In this application, after the candidate pictures are selected, they are further screened based on the color distribution of each candidate picture and the saturation of different hues, and a picture set is generated and displayed based on pictures with consistent hues. With this scheme, the tone of the pictures in the same picture set can be kept consistent, which prevents the user from experiencing visual jumpiness when browsing the picture set and alleviates the problem of a messy overall appearance.
The image display method provided by the embodiment of the application can be applied to any electronic equipment comprising the target application. For example, a cell phone, a tablet computer, a wearable device, an in-vehicle device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), and the like, the specific type of the electronic device is not limited in the embodiments of the present application.
Fig. 2 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, a sensor module 180, keys 190, an indicator 192, a camera 193, a display 194, and a user identification module (subscriber identification module, SIM) card interface 195, etc., and optionally, the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an acceleration sensor 180E, a distance sensor 180F, a fingerprint sensor 180H, a touch sensor 180K.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program (such as a picture display function) required for at least one function, and the like. The storage data area may store data created during use of the electronic device 100 (e.g., tone data of a picture, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, etc.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
Fig. 3 is a software block diagram of the electronic device 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided, from top to bottom, into an application layer, an application framework layer, Android Runtime (ART) and system libraries, a hardware abstraction layer, and a kernel layer.
The application layer may include a series of application packages. As shown in Fig. 3, the application packages may include applications such as Camera, Gallery, System Manager, an OCR engine, and smart screenshot. The Gallery application can be used to generate a picture set based on the picture display method provided by this application and to display the picture set through the display driver.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include a window manager, a content provider, a notification manager, a battery manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include images, video, audio, etc.
The notification manager may be used to implement message notifications, such as to implement off-screen charge message notifications.
The battery manager can be used for realizing battery state judgment, off-screen state judgment and off-screen charging message notification.
Android Runtime includes a core library and a virtual machine. Android Runtime is responsible for scheduling and managing the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (Media Libraries), SQLite database, two-dimensional graphics processing library, etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like.
The two-dimensional graphic processing library is used for realizing two-dimensional image clipping, scaling and the like.
The SQLite database is used to store picture data, picture attributes, and the like.
The hardware abstraction layer may include a graphics module, a camera module, a bluetooth library module, a WIFI module, a hardware synthesizer, and the like. Wherein the graphics module may be used to generate a picture.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver. The display driver can be used to display the picture set obtained in the embodiments of the present application.
For convenience of understanding, the following embodiments of the present application will take an electronic device having the structure shown in fig. 2 and fig. 3 as an example, and the picture display method provided in the embodiments of the present application will be described with reference to the scenario shown in fig. 1. As shown in fig. 4, the method for displaying a picture provided in the embodiment of the present application includes:
101, selecting a plurality of candidate pictures from the photographed pictures.
In this embodiment of the present application, the shot picture may be a picture shot and stored by the device, or may be a picture shot and transmitted to the device by other devices.
The candidate pictures can be selected from the captured pictures according to any one or more factors such as shooting time, shooting place, shooting object, and aesthetic quality of the pictures. For example, pictures taken at the same shooting location on the same day may be selected as candidate pictures; or pictures taken within one month with the same shooting object as the main subject may be selected as candidate pictures. In this embodiment of the present application, the number of candidate pictures selected may be determined based on a preset value, or may be flexibly determined based on the actual screening result.
Further, in the embodiment of the present application, the electronic device may generate a new picture set according to a set period, and after the user starts the target application and triggers the display of the first page on the target application interface, the newly generated picture set is displayed on the first page. Meanwhile, the previously generated picture set may be displayed together with the newly generated picture set. The setting period may be flexibly set, and may be, for example, a period of a week, a period of a month, or the like.
Based on the implementation manner, in the embodiment of the application, the electronic device may detect the current time, and after detecting that the current time reaches the set period node, may trigger to select a candidate picture from the captured pictures, so as to generate a new picture set.
In other implementations, the device state may be detected first after detecting that the current time has reached the set period node. The device state may include, for example, the battery level, the device operating status, and the screen display status. Selection of candidate pictures from the captured pictures is triggered, so as to generate a new picture set, when it is determined that the device battery level is greater than a preset battery threshold, and/or the current operating status of the device is good and the resource occupancy is lower than a set threshold, and/or the device is in an off-screen charging state. When it is determined that the device battery level is lower than the preset battery threshold, and/or the current resource occupancy of the device is higher than the set threshold, and/or the device is in a screen-on state, the device state can continue to be detected until the above conditions are met, and selection of candidate pictures from the captured pictures is then triggered. In this way, the power consumption of the device can be prevented from increasing in a low-battery state, and the resource consumption can be prevented from further increasing in a high-occupancy state, which would reduce the operating efficiency of other applications.
Or the electronic device can trigger to generate a new picture set according to the number of the newly added pictures in the target time period. Specifically, the electronic device may count the number of subsequent newly added pictures after generating the new picture set each time, and trigger to generate the new picture set when determining that the number of newly added pictures reaches the set threshold.
Alternatively, in another implementation, in the case that the number of newly added pictures is detected to reach the set threshold, similar to the previous implementation, the device state may be detected first, and in the case that it is determined that the device state meets the above condition, the selection of the candidate picture from the already taken pictures is triggered so as to generate a new picture set.
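As a rough illustration only, the following Python sketch combines the period-node check, the newly-added-picture count, and the device-state conditions described above into a single trigger decision; the function names, thresholds, and DeviceState fields are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class DeviceState:
    battery_percent: float       # current battery level, 0-100
    resource_usage: float        # CPU/memory occupancy ratio, 0.0-1.0
    screen_off_charging: bool    # True if the device is charging with the screen off


def should_generate_picture_set(reached_period_node: bool,
                                new_picture_count: int,
                                state: DeviceState,
                                battery_threshold: float = 20.0,
                                resource_threshold: float = 0.8,
                                new_picture_threshold: int = 50) -> bool:
    """Decide whether to trigger selection of candidate pictures for a new picture set."""
    # Trigger only if the set period node was reached or enough new pictures accumulated.
    if not (reached_period_node or new_picture_count >= new_picture_threshold):
        return False
    # Off-screen charging is treated as a favourable state in which to run the task.
    if state.screen_off_charging:
        return True
    # Otherwise require sufficient battery and a low enough resource occupancy.
    return (state.battery_percent > battery_threshold
            and state.resource_usage < resource_threshold)
```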
102, screening a plurality of target pictures from the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures.
After the candidate pictures are obtained, in the embodiment of the application, the candidate pictures can be further screened, so that a plurality of target pictures with consistent color tones are obtained.
First, a first index value between a plurality of candidate pictures may be calculated according to color characteristics of the plurality of candidate pictures.
In this embodiment of the present application, a chrominance distribution map corresponding to each candidate picture may be generated first, and then, based on the chrominance distribution map, a first index value between each candidate picture is calculated. The first index value may reflect a degree of similarity in chromaticity of each candidate picture.
The following describes a method for generating a chromaticity distribution map.
Specifically, each candidate picture may first be converted from RGB format to HSV format, where H represents chromaticity (hue), S represents saturation, and V represents luminance. Fig. 5 is a schematic view of a color wheel; as shown in Fig. 5, the distribution range of chromaticity is 0° to 360°, each degree corresponding to a particular chromaticity, where 0° corresponds to red, 60° to yellow, 120° to green, 180° to cyan, 240° to blue, and 300° to violet. Since a typical image is stored in 8 bits, the readable chromaticity range is 0-180, with each degree representing two degrees of the actual distribution. Using the function

$$H_{360} = 2 \times H_{180}$$

the distribution can be converted to 0-360.
Furthermore, based on the candidate pictures in HSV format, after the H-channel images are acquired, the chromaticity of each pixel in the H channel of a candidate picture can be counted to generate the chromaticity distribution map corresponding to that candidate picture. Fig. 6 is an exemplary representation of a chromaticity distribution map. As shown in Fig. 6, chromaticity coordinates may be generated at corresponding locations on the color wheel based on the distribution of the respective chromaticities in the candidate picture: for any chromaticity, the larger its share of the distribution, the higher the coordinate value at the corresponding position on the color wheel.
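As a minimal sketch of how such a chromaticity distribution map could be computed, assuming OpenCV and NumPy are available, the function below reads a candidate picture, extracts the H channel, converts it to the 0-360 color wheel, and counts the chromaticity of each pixel; the function name and binning are illustrative, not part of the patent.

```python
import cv2
import numpy as np


def chroma_histogram(image_path: str, bins: int = 360) -> np.ndarray:
    """Build a normalized chromaticity (hue) distribution for one candidate picture."""
    bgr = cv2.imread(image_path)                    # OpenCV loads images in BGR order
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)      # 8-bit H channel is stored as 0-179
    hue = hsv[:, :, 0].astype(np.int32) * 2         # convert to the 0-360 color wheel
    hist, _ = np.histogram(hue, bins=bins, range=(0, 360))
    return hist / hist.sum()                        # normalize so the histogram sums to 1
```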
Further, a method for calculating the first index value between each candidate picture is described below.
In the embodiment of the application, any one picture may be selected from the candidate pictures as the reference picture. Further, first index values between each remaining candidate picture and the reference picture may be calculated based on the chromaticity distribution maps. The first index value may include various types, such as the centroid, the correlation coefficient (CORREL), the Bhattacharyya distance, the Earth Mover's Distance (EMD), and the JS divergence. The centroid reflects the center of the chromaticity distribution of a candidate picture; the closer the chromaticity centroids of two pictures are, the smaller their hue difference. The correlation coefficient, Bhattacharyya distance, Earth Mover's Distance, and JS divergence can all be used to characterize the similarity between any two candidate pictures: the higher the correlation coefficient, and the smaller the Bhattacharyya distance, the Earth Mover's Distance, and the JS divergence, the higher the chromaticity similarity between the two candidate pictures.
The calculation formula of each first index value may refer to the prior art, and is briefly described in the embodiments of the present application.
The correlation coefficient has a value of 0-1; the larger the value, the more similar the chromaticity. For two chromaticity distributions H1 and H2, the calculation formula is:

$$d(H_1,H_2)=\frac{\sum_i \big(H_1(i)-\bar H_1\big)\big(H_2(i)-\bar H_2\big)}{\sqrt{\sum_i \big(H_1(i)-\bar H_1\big)^2 \sum_i \big(H_2(i)-\bar H_2\big)^2}}$$

where $\bar H_k$ denotes the mean value of distribution $H_k$.
The Bhattacharyya distance is used to measure the similarity of two discrete or continuous probability distributions. For discrete probability distributions p and q over the same domain X, the Bhattacharyya distance is defined as:

$$D_B(p,q) = -\ln\big(BC(p,q)\big)$$

$$BC(p,q) = \sum_{x\in X}\sqrt{p(x)\,q(x)}$$

where BC(p, q) is the Bhattacharyya coefficient.
The Earth Mover's Distance represents the minimum average moving distance required to move the discrete data distribution P into the discrete data distribution Q, and its calculation formula is:

$$W(P,Q)=\inf_{\gamma\in\Pi(P,Q)}\ \mathbb{E}_{(x,y)\sim\gamma}\big[\lVert x-y\rVert\big]$$

where Π(P, Q) is the set of all possible joint distributions of P and Q combined. For each possible joint distribution γ, a pair of samples (x, y) can be drawn from γ, yielding a sample x from P and a sample y from Q, and the distance between the pair of samples is calculated. The expected value of this sample distance under the joint distribution γ is then obtained, and the infimum of this expected value over all possible joint distributions is the Wasserstein distance.
The JS divergence can be used to characterize the similarity between two probability distributions; its value lies between 0 and 1, and its calculation formula is:

$$JS(P\|Q)=\frac{1}{2}KL\left(P\,\middle\|\,M\right)+\frac{1}{2}KL\left(Q\,\middle\|\,M\right)$$

$$M=\frac{1}{2}(P+Q)$$

where KL denotes the Kullback-Leibler divergence; with base-2 logarithms the value lies in [0, 1].
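For illustration only, the sketch below computes the five kinds of first index values for a pair of normalized chromaticity histograms produced as above. It simplifies the Earth Mover's Distance to its one-dimensional cumulative-sum form and treats the hue centroid linearly rather than circularly; the patent does not specify these implementation details.

```python
import numpy as np


def first_index_values(p: np.ndarray, q: np.ndarray) -> dict:
    """Compare two normalized chromaticity histograms p and q of equal length."""
    eps = 1e-12
    degrees = np.arange(len(p)) * (360.0 / len(p))

    # Centroid of each chromaticity distribution (linear, ignoring circular wrap-around).
    centroid_p = float(np.sum(degrees * p))
    centroid_q = float(np.sum(degrees * q))

    # Correlation coefficient between the two histograms.
    correl = float(np.corrcoef(p, q)[0, 1])

    # Bhattacharyya distance.
    bhattacharyya = float(-np.log(np.sum(np.sqrt(p * q)) + eps))

    # 1-D Earth Mover's Distance via cumulative distribution functions.
    emd = float(np.sum(np.abs(np.cumsum(p) - np.cumsum(q))))

    # JS divergence (log base 2 so the value lies in [0, 1]).
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log2((a + eps) / (b + eps))))
    js = 0.5 * kl(p, m) + 0.5 * kl(q, m)

    return {
        "centroid_diff": abs(centroid_p - centroid_q),
        "correlation": correl,
        "bhattacharyya": bhattacharyya,
        "emd": emd,
        "js_divergence": js,
    }
```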
table 1 below is a schematic of one possible result of the first index value between the candidate pictures. In the case where the selected candidate picture 03 is a reference picture, a first index value between each of the remaining candidate pictures and the candidate picture 03 may be obtained as shown in table 1.
[Table 1: first index values (correlation coefficient, Bhattacharyya distance, Earth Mover's Distance, JS divergence, centroid) between candidate picture 03 and the remaining candidate pictures; the original table image is not reproduced here]
Then, a plurality of target pictures with consistent hues can be screened from a plurality of candidate pictures according to the first index value.
Specifically, the picture to be removed may be determined from each candidate picture according to the statistical characteristics of each first index value.
In one possible implementation, an average value of each first index value may be calculated. Further, at least one first index value with the largest difference from the average value in the first index values can be determined, and the candidate picture corresponding to the at least one first index value is determined as the picture to be removed.
For ease of understanding, still taking Table 1 above as an example, the average values of the first index values may be calculated as 0.7134 (correlation coefficient), 0.3296 (Bhattacharyya distance), 5.3447 (Earth Mover's Distance), 0.1329 (JS divergence), and 71.83 (centroid), respectively. It may be determined that the first index values between candidate picture 05 and the reference picture differ most from these average values; therefore, candidate picture 05 may be determined as the picture to be removed.
In another possible implementation, after calculating the average value of each first index value, the target distribution range of the first index values may be further determined based on the average value. For example, the range obtained by adding and subtracting the variance to and from the average value may be determined as the target distribution range, or the range obtained by adding and subtracting the standard deviation may be used. Finally, the candidate pictures whose corresponding first index values fall outside the target distribution ranges are determined as pictures to be removed.
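A minimal sketch of the outlier rejection just described follows, assuming one index type (for example, the JS divergence) has been computed per candidate against the reference picture; the mean-plus-or-minus-one-standard-deviation range is only one of the choices mentioned above, and the names are hypothetical.

```python
import numpy as np


def pictures_to_remove(index_values: dict) -> list:
    """index_values maps a candidate picture id to its first index value
    (e.g. JS divergence) against the reference picture."""
    ids = list(index_values.keys())
    values = np.array([index_values[i] for i in ids], dtype=float)

    mean = values.mean()
    std = values.std()
    low, high = mean - std, mean + std    # target distribution range: mean +/- one standard deviation

    # Candidates whose index value falls outside the target range are marked for removal.
    return [i for i, v in zip(ids, values) if v < low or v > high]
```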
Further, in the actual execution scene, because of randomness of the selection result of the reference picture, the selected reference picture may be the picture with the largest hue difference, and at this time, each calculated first index value may be located in the target distribution range, and the picture to be removed cannot be determined by the method.
In view of the above, in the embodiment of the present application, when each calculated first index value is located within the target distribution range, the reference picture may be reselected, and the first index value between the new reference picture and the remaining candidate pictures may be recalculated. And further, determining the picture to be removed based on the new calculation result.
Alternatively, in the embodiment of the present application, each candidate picture may be sequentially selected as a reference picture, the first index value is calculated multiple times, and the picture to be removed is determined based on the calculation result of each time. Therefore, the candidate pictures with larger tone difference can be prevented from being missed in the screening process, and the reliability of the scheme is further improved.
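The following sketch illustrates this strategy by reusing the hypothetical first_index_values and pictures_to_remove helpers sketched earlier: each candidate serves once as the reference picture, and any picture flagged for some reference is excluded from the target pictures.

```python
def screen_target_pictures(histograms: dict) -> list:
    """histograms maps a candidate picture id to its normalized chromaticity histogram.
    Each candidate is used in turn as the reference picture, and any picture whose
    index value falls outside the target range for some reference is removed."""
    to_remove = set()
    for ref_id, ref_hist in histograms.items():
        index_values = {
            pid: first_index_values(ref_hist, hist)["js_divergence"]
            for pid, hist in histograms.items() if pid != ref_id
        }
        to_remove.update(pictures_to_remove(index_values))
    return [pid for pid in histograms if pid not in to_remove]
```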
And 103, generating a picture set based on the plurality of target pictures.
104, responding to the triggering operation of the first control in the target application interface, and displaying a first page.
And 105, displaying the picture set on the first page.
In the embodiment of the application, after the target pictures are obtained, a picture set can be generated based on each target picture, and after the user triggers the target application and enters the first page of the target application, the newly generated picture set is displayed in the first page.
In response to a trigger operation by the user on the target application icon in the home screen of the electronic device, the target application interface is displayed. A plurality of page controls can be displayed in the target application interface, each page control can be used to trigger the display of a different picture display page, and each picture display page can display the captured pictures in a different form.
In response to a trigger operation by the user on the first control among the page controls, a first page is displayed, and the first page can display the newly generated picture set in this embodiment. Meanwhile, the historical picture sets can be displayed together with it.
Specifically, each picture set may be displayed as follows. On the one hand, picture set identification information is generated according to factors such as the shooting object, shooting time, and shooting place of the pictures included in the picture set. The picture set identification information may include, for example, the picture set name and the shooting time and place of the pictures, such as "Portraits, June to December 2022" or "Memories of Beijing". On the other hand, any picture in the picture set can be selected as the cover. The cover of each picture set and the corresponding picture set identification information can then be displayed.
As shown in Fig. 1, for example, the target application interface 700 may include four page controls, namely a photo control, an album control, a moment control, and an authoring control. In response to a trigger operation on the moment control, a "Moments" page may be displayed, in which the newly generated picture set and the historical picture sets are shown.
Fig. 7 is a schematic view of a first page according to an embodiment of the present application. As shown in fig. 7, a newly generated picture set 801 may be displayed in the upper half of the first page 800, while the lower half of the first page 800 in turn displays various historical picture sets 802 that have been previously generated. At this time, the up-down scrolling of the page may also be implemented in response to the user's sliding operation on the first page 800, so as to display more history picture sets 802.
After the picture sets are displayed, the electronic device may also display a second page in response to a triggering operation on any one of the picture sets. The second page may display the cover of that picture set, its identification information, and each picture included in it. The cover of the picture set may also include a play control; in response to a triggering operation on the play control, the pictures in the picture set can be played in sequence in video form. The triggering operation on the play control may be clicking the play control or clicking anywhere on the picture set cover. Before the pictures contained in the picture set are displayed on the second page, each picture may be cropped and scaled according to the layout requirements, and the cropped and scaled pictures are then arranged and displayed.
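A rough sketch of the cropping and scaling just mentioned is given below: it center-crops a picture to the aspect ratio of a layout tile and then scales it to the tile size. The tile dimensions and the choice of a center crop are assumptions made only for illustration.

from PIL import Image

def fit_to_tile(path, tile_w=360, tile_h=360):
    # Center-crop the picture to the tile's aspect ratio, then scale it to the
    # tile size used by the second page's layout (illustrative values).
    img = Image.open(path)
    src_w, src_h = img.size
    target_ratio = tile_w / tile_h
    if src_w / src_h > target_ratio:       # too wide: trim left and right
        new_w = int(src_h * target_ratio)
        left = (src_w - new_w) // 2
        box = (left, 0, left + new_w, src_h)
    else:                                   # too tall: trim top and bottom
        new_h = int(src_w / target_ratio)
        top = (src_h - new_h) // 2
        box = (0, top, src_w, top + new_h)
    return img.crop(box).resize((tile_w, tile_h), Image.LANCZOS)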
Fig. 8 is a schematic view of a second page according to an embodiment of the present application. As shown in fig. 8, the upper half of the second page 900 may display the cover 901 of the picture set, and a play control 9011 may be displayed in the cover 901. In response to a triggering operation on the play control 9011, the pictures included in the picture set can be played in video form. The lower half of the second page 900 may display the individual pictures 902 contained in the picture set. In response to the user's sliding operation on the second page 900, the page can be scrolled up and down to display more pictures of the picture set.
Through the above technical solution, pictures can be filtered based on their color characteristics so that the pictures contained in the same picture set have consistent color tones. This alleviates the visual jarring perceived by the user, enhances the viewing experience of the picture set, and improves the user experience.
Fig. 9 is another schematic flowchart of a picture display method provided in an embodiment of the present application. In another embodiment of the present application, the method for displaying a picture may include:
201, selecting a plurality of candidate pictures from the shot pictures.
202, respectively calculating second index values of the plurality of candidate pictures according to content characteristics of the plurality of candidate pictures.
203, determining the index type of the first index value according to the second index value.
In this embodiment of the present application, after the candidate pictures are selected, the complexity of performing tone comparison on the candidate pictures may be determined based on their image content. When the complexity is low, at least one type of first index value may be calculated according to the color characteristics of the candidate pictures, and the target pictures with consistent color tone may be selected based on that at least one type of first index value, which reduces the amount of computation. For example, only the centroid of the chromaticity distribution of each candidate picture may be calculated, and the target pictures with consistent hue may be screened out based on the chromaticity centroid.
Conversely, when the complexity is high, multiple types of first index values may be calculated according to the color characteristics of the candidate pictures, and the target pictures with consistent color tones may be screened based on these multiple types of first index values, which improves the accuracy of the tone comparison. For example, several first index values such as the centroid, correlation coefficient, Bhattacharyya distance, earth mover's distance, and JS divergence may be calculated simultaneously.
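For concreteness, the sketch below computes these index values between two normalized hue histograms. The 180-bin HSV-style histogram and the use of SciPy are illustrative assumptions, and the centroid here is a plain arithmetic mean over bins (hue circularity is ignored for simplicity).

import numpy as np
from scipy.stats import wasserstein_distance
from scipy.spatial.distance import jensenshannon

def hue_histogram(rgb):
    # Normalized 180-bin hue histogram of an H x W x 3 uint8 RGB array
    # (per-pixel loop kept simple for clarity; it is not optimized).
    import colorsys
    pixels = rgb.reshape(-1, 3) / 255.0
    hues = np.array([colorsys.rgb_to_hsv(*p)[0] for p in pixels]) * 180.0
    hist, _ = np.histogram(hues, bins=180, range=(0.0, 180.0))
    return hist / max(hist.sum(), 1)

def first_index_values(p, q):
    bins = np.arange(len(p))
    return {
        "centroid_diff": abs(float((bins * p).sum() - (bins * q).sum())),
        "correlation": float(np.corrcoef(p, q)[0, 1]),
        "bhattacharyya": float(-np.log(np.sum(np.sqrt(p * q)) + 1e-12)),
        "earth_mover": float(wasserstein_distance(bins, bins, p, q)),
        "js_divergence": float(jensenshannon(p, q) ** 2),
    }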
Specifically, for each candidate picture in RGB format, a hash algorithm may be used to calculate a hash value of the picture. The hash algorithm may be an average hash algorithm, a difference hash algorithm, a perceptual hash algorithm, or the like. The hash value of a picture can be used as a "fingerprint" that uniquely identifies the picture. By comparing hash values, the content similarity of two pictures can be obtained: the closer the hash values are, the higher the content similarity of the two pictures, and the lower otherwise.
Based on this, in this embodiment of the present application, when the hash-value difference between the candidate pictures is smaller than a set threshold, the content similarity of the pictures may be considered high; for example, they may be pictures of the same subject taken from similar shooting angles. In this case the complexity of tone comparison of the candidate pictures is relatively low, and it may be determined that at least one type of first index value is used for the tone comparison. Conversely, when the hash-value difference between the candidate pictures is larger than the set threshold, the content similarity of the pictures is considered low; for example, the pictures may have been taken of different subjects. In this case the complexity of tone comparison of the candidate pictures may be high, and it may be determined that multiple types of first index values are used for the tone comparison.
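The hash-based complexity check described above can be sketched as follows with an average (mean) hash. The 8 x 8 hash size, the Hamming-distance threshold of 10, and the use of the maximum pairwise difference are illustrative assumptions rather than values fixed by this embodiment.

import numpy as np
from PIL import Image

def average_hash(path, hash_size=8):
    # Mean hash: shrink to hash_size x hash_size grayscale and threshold each
    # pixel at the mean, yielding a 64-bit boolean "fingerprint".
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return (pixels > pixels.mean()).flatten()

def hamming_distance(h1, h2):
    return int(np.count_nonzero(h1 != h2))

def choose_index_types(paths, threshold=10):
    # If every pair of candidates differs by less than the threshold, tone
    # comparison is treated as low-complexity and one index type suffices;
    # otherwise multiple index types are used.
    hashes = [average_hash(p) for p in paths]
    max_diff = max(
        (hamming_distance(a, b) for i, a in enumerate(hashes) for b in hashes[i + 1:]),
        default=0,
    )
    if max_diff < threshold:
        return ["centroid"]
    return ["centroid", "correlation", "bhattacharyya", "earth_mover", "js_divergence"]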
204, calculating a first index value between the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures.
205, screening a plurality of target pictures from the plurality of candidate pictures according to the first index value.
206, generating a picture set based on the plurality of target pictures.
207, in response to a triggering operation of the first control in the target application interface, displaying a first page.
208, displaying the picture set on the first page.
After the type of the first index value is determined, the first index values between the candidate pictures can be calculated according to their color characteristics, and the target pictures with consistent tone can be determined based on these first index values. A picture set may then be generated from the target pictures, and after a triggering operation on the first control is detected, the corresponding picture set may be displayed on the first page. For the specific implementation, reference may be made to the foregoing embodiments; details are not repeated here.
Through the above technical solution, when picture complexity is high, the tone differences between the candidate pictures can be compared using multiple types of similarity evaluation indexes, and when picture complexity is low, fewer types of similarity evaluation indexes can be used. Computing resources are thereby saved while the accuracy of the result is ensured.
It will be appreciated that the electronic device, in order to achieve the above functions, includes corresponding hardware and/or software modules that perform the respective functions. The steps of the examples described in connection with the embodiments disclosed herein may be implemented in hardware or in a combination of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends on the particular application and the design constraints imposed on the solution. Those skilled in the art may implement the described functionality in different ways for each particular application in conjunction with the embodiments.
In this embodiment, the electronic device may be divided into functional modules according to the above method example. For example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
When each functional module is divided corresponding to each function, fig. 10 shows a schematic diagram of one possible composition of the electronic device involved in the above embodiments. As shown in fig. 10, the electronic device 600 may include a selecting unit 601, a generating unit 602, and a display unit 603, where:
A selecting unit 601, configured to select a plurality of candidate pictures from the captured pictures; and screen a plurality of target pictures from the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures.
A generating unit 602, configured to generate a picture set based on the plurality of target pictures.
A display unit 603, configured to display a first page in response to a triggering operation on the first control in the target application interface; and display the picture set on the first page.
In one possible implementation, the selecting unit 601 is specifically configured to detect a current time; and, in response to the current time reaching a preset period node, select a plurality of candidate pictures from the shot pictures.
In one possible implementation, the selecting unit 601 is specifically configured to detect a current device state in response to the current time reaching a preset period node; and select a plurality of candidate pictures from the shot pictures in response to the device power exceeding the power threshold.
In one possible implementation, the selecting unit 601 is specifically configured to calculate a first index value between the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures; and screen a plurality of target pictures from the plurality of candidate pictures according to the first index value.
In one possible implementation, the selecting unit 601 is specifically configured to respectively generate the chromaticity distribution diagrams corresponding to the plurality of candidate pictures; and calculate a first index value between the plurality of candidate pictures according to the chromaticity distribution diagrams corresponding to the plurality of candidate pictures.
In one possible implementation, the selecting unit 601 is specifically configured to select a reference picture from the plurality of candidate pictures; and respectively calculate the first index values between the reference picture and the other candidate pictures according to the chromaticity distribution diagrams corresponding to the reference picture and the other candidate pictures.
In one possible implementation, the selecting unit 601 is specifically configured to determine a picture to be removed according to the statistical characteristics of the first index values; and remove the picture to be removed from the plurality of candidate pictures to obtain a plurality of target pictures.
In one possible implementation, the selecting unit 601 is specifically configured to respectively calculate the average value of the first index values; determine a target distribution range of the first index values according to the average value; and determine the picture to be removed according to the target distribution range, where the first index value corresponding to the picture to be removed lies outside the target distribution range.
In one possible implementation, the selecting unit 601 is further configured to respectively calculate second index values of the plurality of candidate pictures according to the content characteristics of the plurality of candidate pictures; and determine the index type of the first index value according to the second index values.
In one possible implementation, the second index value comprises a hash value; the selecting unit 601 is specifically configured to determine that the first index value includes at least one index type when the difference between the second index values of the plurality of candidate pictures is smaller than a set threshold, and to determine that the first index value includes multiple index types when the difference between the second index values of the plurality of candidate pictures is larger than the set threshold.
In one possible implementation, the at least one index type includes the centroid; the multiple index types include the centroid, the correlation coefficient, the Bhattacharyya distance, the earth mover's distance, and the JS divergence.
It should be understood that the electronic device herein is embodied in the form of functional units. The term "unit" herein may be implemented in software and/or hardware, which is not specifically limited. For example, a "unit" may be a software program, a hardware circuit, or a combination of both that implements the functions described above. The hardware circuitry may include an application-specific integrated circuit (ASIC), an electronic circuit, a processor (for example, a shared, dedicated, or group processor) and memory for executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functions.
The application also provides a picture display device, which includes a storage medium and a central processing unit. The storage medium may be a non-volatile storage medium in which a computer-executable program is stored; the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the picture display method described above.
The present application also provides a computer-readable storage medium having instructions stored therein that, when executed on a computer, cause the computer to perform the steps of the picture display method of the present application.
The present application also provides a computer program product including instructions that, when run on a computer or on at least one processor, cause the computer to perform the steps of the picture display method of the present application.
The application also provides a chip, including a processor and a data interface, where the processor reads, through the data interface, instructions stored in a memory to perform the corresponding operations and/or processes of the picture display method.
Optionally, the chip further comprises a memory, the memory is connected with the processor through a circuit or a wire, and the processor is used for reading and executing the computer program in the memory. Further optionally, the chip further comprises a communication interface, and the processor is connected to the communication interface. The communication interface is used for receiving data and/or information to be processed, and the processor acquires the data and/or information from the communication interface and processes the data and/or information. The communication interface may be an input-output interface.
The memory may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, or the like), a magnetic disk storage medium or another magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In this embodiment, "and/or" describes an association relationship of an association object, which means that there may be three relationships, for example, a and/or B, and may mean that there is a alone, a and B together, and B alone. Wherein A, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of the following" and the like means any combination of these items, including any combination of single or plural items. For example, at least one of a, b and c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or plural.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided herein, if any of the functions is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present application essentially, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific embodiments of the present application. Any change or substitution readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A picture display method, characterized by comprising:
selecting a plurality of candidate pictures from the shot pictures;
screening a plurality of target pictures from the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures;
generating a picture set based on the plurality of target pictures;
responding to the triggering operation of a first control in a target application interface, and displaying a first page;
and displaying the picture set on the first page.
2. The method of claim 1, wherein selecting a plurality of candidate pictures from the shot pictures comprises:
detecting the current time;
and responding to the current time reaching a preset period node, and selecting a plurality of candidate pictures from the shot pictures.
3. The method of claim 2, wherein selecting a plurality of candidate pictures from the shot pictures in response to the current time reaching a preset period node comprises:
detecting a current device state in response to the current time reaching a preset period node;
and selecting a plurality of candidate pictures from the shot pictures in response to the device power exceeding the power threshold.
4. The method of claim 1, wherein screening a plurality of target pictures from the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures comprises:
calculating a first index value among the plurality of candidate pictures according to the color characteristics of the plurality of candidate pictures;
and screening a plurality of target pictures from the plurality of candidate pictures according to the first index value.
5. The method of claim 4, wherein calculating a first index value between the plurality of candidate pictures based on color characteristics of the plurality of candidate pictures comprises:
respectively generating chromaticity distribution diagrams corresponding to the candidate pictures;
and calculating a first index value among the plurality of candidate pictures according to the chromaticity distribution diagrams corresponding to the plurality of candidate pictures.
6. The method of claim 5, wherein calculating a first index value between the plurality of candidate pictures according to the chromaticity distribution diagrams corresponding to the plurality of candidate pictures comprises:
selecting a reference picture from the plurality of candidate pictures;
and respectively calculating first index values between the reference picture and the rest candidate pictures according to the chromaticity distribution diagrams corresponding to the reference picture and the rest candidate pictures.
7. The method of claim 6, wherein screening a plurality of target pictures from the plurality of candidate pictures based on the first index value comprises:
determining the picture to be removed according to the statistical characteristics of each first index value;
and removing the picture to be removed from the candidate pictures to obtain a plurality of target pictures.
8. The method of claim 7, wherein determining the picture to be removed according to the statistical characteristics of the respective first index values comprises:
respectively calculating the average value of the first index values;
determining a target distribution range of the first index values according to the average value of the first index values;
and determining a picture to be removed according to the target distribution range, wherein a first index value corresponding to the picture to be removed is located outside the target distribution range.
9. The method of claim 4, wherein prior to calculating the first index value between the plurality of candidate pictures based on the color characteristics of the plurality of candidate pictures, the method further comprises:
respectively calculating second index values of the plurality of candidate pictures according to the content characteristics of the plurality of candidate pictures;
and determining the index type of the first index value according to the second index value.
10. The method of claim 9, wherein the second index value comprises a hash value, and determining the index type of the first index value according to the second index value comprises:
determining that the first index value comprises at least one index type when the difference between the second index values of the plurality of candidate pictures is smaller than a set threshold;
and determining that the first index value comprises multiple index types when the difference between the second index values of the plurality of candidate pictures is larger than the set threshold.
11. The method of claim 10, wherein the at least one index type comprises a centroid; and the multiple index types comprise the centroid, a correlation coefficient, a Bhattacharyya distance, an earth mover's distance, and a JS divergence.
12. An electronic device, comprising:
one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the picture display method of any of claims 1-11.
13. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the picture display method of any one of claims 1 to 11.
CN202310558298.7A 2023-05-18 2023-05-18 Picture display method and device Pending CN116302293A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310558298.7A CN116302293A (en) 2023-05-18 2023-05-18 Picture display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310558298.7A CN116302293A (en) 2023-05-18 2023-05-18 Picture display method and device

Publications (1)

Publication Number Publication Date
CN116302293A true CN116302293A (en) 2023-06-23

Family

ID=86815257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310558298.7A Pending CN116302293A (en) 2023-05-18 2023-05-18 Picture display method and device

Country Status (1)

Country Link
CN (1) CN116302293A (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080089572A1 (en) * 2006-08-04 2008-04-17 Konica Minolta Medical & Graphic, Inc. Image display method and image display apparatus
US20100131901A1 (en) * 2008-11-27 2010-05-27 Sony Corporation Information processing apparatus, display control method and program
CN101789233A (en) * 2010-01-14 2010-07-28 宇龙计算机通信科技(深圳)有限公司 Image displaying method for mobile terminal and mobile terminal
CN104615769A (en) * 2015-02-15 2015-05-13 小米科技有限责任公司 Image classification method and device
US20170090693A1 (en) * 2015-09-25 2017-03-30 Lg Electronics Inc. Mobile terminal and method of controlling the same
CN107015998A (en) * 2016-01-28 2017-08-04 阿里巴巴集团控股有限公司 A kind of image processing method, device and intelligent terminal
CN105740454A (en) * 2016-02-04 2016-07-06 北京金山安全软件有限公司 Display method and device of picture folder and electronic equipment
CN106445995A (en) * 2016-07-18 2017-02-22 腾讯科技(深圳)有限公司 Picture classification method and apparatus
CN108334531A (en) * 2017-09-19 2018-07-27 平安普惠企业管理有限公司 Picture tone extracting method, equipment and computer readable storage medium
CN108121816A (en) * 2017-12-28 2018-06-05 广东欧珀移动通信有限公司 Picture classification method, device, storage medium and electronic equipment
US20210342050A1 (en) * 2019-01-15 2021-11-04 Guangdong Oppo Mobile Telecommunications Corp., Ltd. File processing method, terminal, and storage medium
CN110175259A (en) * 2019-05-30 2019-08-27 努比亚技术有限公司 Image display method, wearable device and computer readable storage medium
CN110377773A (en) * 2019-07-17 2019-10-25 Oppo广东移动通信有限公司 Image processing method, device, mobile terminal and storage medium
CN112445922A (en) * 2019-08-27 2021-03-05 华为技术有限公司 Picture processing method and device
CN112463275A (en) * 2020-11-23 2021-03-09 深圳传音控股股份有限公司 Data processing method, terminal and storage medium
CN113177131A (en) * 2021-04-09 2021-07-27 深圳时空引力科技有限公司 Picture processing method and device and storage medium
CN115525783A (en) * 2022-01-14 2022-12-27 荣耀终端有限公司 Picture display method and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
赵涓涓; 陈俊杰; 李元俊: "Clustering Algorithm Based on Web Page Structure and Dominant Color Tone" (基于Web页面结构和主色调的聚类算法), Computer Engineering (计算机工程), vol. 36, no. 3, 28 February 2010 (2010-02-28), pages 1 *

Similar Documents

Publication Publication Date Title
US9491366B2 (en) Electronic device and image composition method thereof
US11237702B2 (en) Carousel interface for post-capture processing in a messaging system
CN107613202B (en) Shooting method and mobile terminal
KR20160097974A (en) Method and electronic device for converting color of image
CN113763856B (en) Method and device for determining ambient illumination intensity and storage medium
US11756249B2 (en) Layering of post-capture processing in a messaging system
US11695718B2 (en) Post-capture processing in a messaging system
CN109639896A (en) Block object detecting method, device, storage medium and mobile terminal
US11750546B2 (en) Providing post-capture media overlays for post-capture processing in a messaging system
CN113938602B (en) Image processing method, electronic device, chip and readable storage medium
CN107172354A (en) Method for processing video frequency, device, electronic equipment and storage medium
KR20220118545A (en) Post-capture processing in messaging systems
CN108764139A (en) A kind of method for detecting human face, mobile terminal and computer readable storage medium
CN114926351B (en) Image processing method, electronic device, and computer storage medium
CN113452969A (en) Image processing method and device
CN114138215A (en) Display method and related equipment
EP4303815A1 (en) Image processing method, electronic device, storage medium, and program product
CN116302293A (en) Picture display method and device
CN113742430B (en) Method and system for determining number of triangle structures formed by nodes in graph data
CN113298753A (en) Sensitive muscle detection method, image processing method, device and equipment
CN112712377A (en) Product defect arrangement and collection management database platform system
CN117036206B (en) Method for determining image jagged degree and related electronic equipment
CN106056584B (en) A kind of front and back scape segmenting device and method
CN116363017B (en) Image processing method and device
CN113749614B (en) Skin detection method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination