CN117931356A - Display method and device - Google Patents

Display method and device

Info

Publication number
CN117931356A
Authority
CN
China
Prior art keywords
information
display
interface
input
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410142497.4A
Other languages
Chinese (zh)
Inventor
张良 (Zhang Liang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202410142497.4A
Publication of CN117931356A
Legal status: Pending (current)

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display method and device, belonging to the field of communication technology. The method may include: receiving a first input from a user, where the first input is used to set wallpaper of a first interface and the wallpaper includes a luminous object; in response to the first input, acquiring first display information of the luminous object in the first interface; generating light and shadow characteristic information according to the first display information and second display information of an icon in the first interface; and displaying a second interface, where the second interface is the first interface after adjustment by the light and shadow characteristic information.

Description

Display method and device
Technical Field
The application belongs to the technical field of communication, and particularly relates to a display method and device.
Background
At present, on the display interfaces of electronic devices such as mobile phones, computers, and smart watches, the displayed wallpaper and icons can each be set independently. As a result, the color tones of the wallpaper and the icons may be inharmonious, the color contrast between the wallpaper and the icons may not be distinct enough, or the display content of the wallpaper may be occluded by the icons. This leaves the display interface of the electronic device cluttered and inharmonious, so the display effect of the display interface is poor.
Disclosure of Invention
The embodiment of the application aims to provide a display method and a display device, which can improve the display effect of a display interface.
In a first aspect, an embodiment of the present application provides a display method, including:
receiving a first input from a user, where the first input is used to set wallpaper of a first interface, and the wallpaper includes a luminous object;
in response to the first input, acquiring first display information of the luminous object in the first interface;
generating light and shadow characteristic information according to the first display information and second display information of an icon in the first interface; and
displaying a second interface, where the second interface is the first interface after adjustment by the light and shadow characteristic information.
In a second aspect, an embodiment of the present application provides a display apparatus, including:
a receiving module, configured to receive a first input from a user, where the first input is used to set wallpaper of a first interface, and the wallpaper includes a luminous object;
an acquisition module, configured to acquire, in response to the first input, first display information of the luminous object in the first interface;
a generating module, configured to generate light and shadow characteristic information according to the first display information and second display information of an icon in the first interface; and
a display module, configured to display a second interface, where the second interface is the first interface after adjustment by the light and shadow characteristic information.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the display method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the display method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a display interface, the display interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the display method of the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the steps of the display method of the first aspect.
In the embodiment of the application, the wallpaper of the first interface in the electronic device can be set based on user input. When the wallpaper includes a luminous object, first display information of the luminous object in the first interface and second display information of an icon in the first interface are acquired, light and shadow characteristic information is generated according to the first display information and the second display information, and the first interface after adjustment by the light and shadow characteristic information is then displayed as the second interface. In this way, the first interface can be adjusted through the light and shadow characteristic information, so that the adjusted second interface simulates the effect of light emitted by a real luminous object illuminating an object. This increases the color contrast between the wallpaper and the icons in the first interface, effectively combines the wallpaper and icons displayed in the first interface through the physical element of the luminous object, and reproduces the varied illumination effects with which a natural light source illuminates objects, turning a static picture into a dynamic, natural, and realistic scene. The display interface of the electronic device thus becomes more harmonious, the tonal coordination between the wallpaper and the icons is increased, and the display effect of the display interface is improved.
Drawings
FIG. 1 is a flow chart of a display method according to some embodiments of the present application;
FIG. 2 (a) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 2 (b) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 2 (c) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 2 (d) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 2 (e) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 3 (a) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 3 (b) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 4 is an interface diagram of a display method according to some embodiments of the present application;
FIG. 5 (a) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 5 (b) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 5 (c) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 5 (d) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 6 is an interface diagram of a display method according to some embodiments of the present application;
FIG. 7 is an interface diagram of a display method according to some embodiments of the present application;
FIG. 8 is an interface diagram of a display method according to some embodiments of the present application;
FIG. 9 is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (a) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (b) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (c) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (d) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (e) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 10 (f) is an interface diagram of a display method according to some embodiments of the present application;
FIG. 11 is a schematic structural diagram of a display apparatus according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 13 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of such objects is not limited; for example, the first object may be one or more than one. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
Many people like to photograph physical light sources such as the sun, the moon, or stars and use the photographs as system desktop or lock-screen wallpaper on an electronic device. However, wallpaper belongs to the bottommost display layer in the electronic device, and is usually occluded and segmented by icons on the system desktop, or by notification information and other content on the lock screen. The effect of a light source illuminating objects in reality therefore cannot be fully conveyed, and the wallpaper's display effect is poor because it lacks the natural character and vividness of the physical world. In the related art, the display effect of wallpaper can be improved by setting transparent icons or by running a wallpaper application on the electronic device. With the former, the background color of the icons is set to be transparent, which makes the wallpaper content at the bottommost layer more prominent and avoids the content in the wallpaper being blocked by the icons. With the latter, a wallpaper application generally provides multiple display modes and adjustment parameters for users to choose from, to meet the needs of different users; however, running a wallpaper application occupies memory of the electronic device and increases power consumption, and adjusting the wallpaper effect requires a certain amount of skill and experience, which may be difficult for some users and increases the complexity of setting wallpaper.
In order to solve the problems in the related art, the embodiment of the application provides a display method and a display device. The following describes in detail a display method provided by the embodiment of the present application through specific embodiments and application scenarios thereof with reference to fig. 1 to 13.
First, a display method provided in an embodiment of the present application will be described in detail with reference to fig. 1.
Fig. 1 is a flowchart of a display method according to some embodiments of the present application.
It should be noted that, in the display method provided by the embodiment of the present application, the execution body may be an electronic device capable of setting wallpaper or locking a screen, such as a mobile phone, a tablet computer, a notebook computer, a palm computer, a wearable device, and the like. In some embodiments of the present application, an electronic device is taken as an execution body to execute a display method, which is described in the display method provided by the embodiments of the present application.
As shown in fig. 1, the display method provided in the embodiment of the present application may be applied to an electronic device. Based on this, the display method may include steps 110 to 140, as shown below.
Step 110: receive a first input from a user, where the first input is used to set wallpaper of a first interface, and the wallpaper includes a luminous object.
Step 120: in response to the first input, acquire first display information of the luminous object in the first interface.
Step 130: generate light and shadow characteristic information according to the first display information and second display information of an icon in the first interface.
Step 140: display a second interface, where the second interface is the first interface after adjustment by the light and shadow characteristic information.
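The four steps above can be sketched in code as follows. All function names, data structures, and the distance-based brightness formula are illustrative assumptions made for this sketch, not details taken from the patent:

```python
# Illustrative sketch of steps 110-140 (all names and the brightness
# formula are invented for illustration, not taken from the patent).

def get_first_display_info(luminous_object):
    # Step 120: display position, illumination intensity, and
    # illumination range of the luminous object in the first interface.
    return {
        "position": luminous_object["position"],
        "intensity": luminous_object.get("intensity", 1.0),
        "range": luminous_object.get("range", 100.0),
    }

def generate_light_shadow_info(first_info, icon_info):
    # Step 130: combine the light source's display information with the
    # icon's display information to derive light and shadow features.
    dx = icon_info["position"][0] - first_info["position"][0]
    dy = icon_info["position"][1] - first_info["position"][1]
    distance = (dx * dx + dy * dy) ** 0.5
    # Assumption: icons closer to the source are lit more strongly.
    brightness = first_info["intensity"] * max(0.0, 1.0 - distance / first_info["range"])
    return {"icon_brightness": brightness, "shadow_direction": (dx, dy)}

def build_second_interface(first_interface, shadow_info):
    # Step 140: the second interface is the first interface adjusted by
    # the light and shadow characteristic information.
    second = dict(first_interface)
    second["icon_brightness"] = shadow_info["icon_brightness"]
    return second
```

Under these assumptions, a sun at the origin with range 100 gives an icon at (30, 40) a simulated brightness of 0.5, since the icon sits at half the illumination range.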
In this way, the first interface can be adjusted through the light and shadow characteristic information, so that the adjusted second interface simulates the effect of light emitted by a real luminous object illuminating an object. This increases the color contrast between the wallpaper and the icons in the first interface, combines the wallpaper and icons displayed in the first interface through the physical element of the luminous object, and reproduces the varied illumination effects with which a natural light source illuminates objects, turning a static picture into a dynamic, natural, and realistic scene. The display interface of the electronic device thus becomes more harmonious, the tonal coordination between the wallpaper and the icons is increased, and the display effect of the display interface is improved.
The above steps are described in detail below.
Referring to step 110, in the embodiment of the present application, the first input may be an input used to trigger display of adjustment options and to trigger acquisition of display position information, or an input used to adjust the display area of the light-emitting object in the first interface and to trigger acquisition of display position information.
Illustratively, the first input includes, but is not limited to: a touch input to the electronic device made with a finger, a stylus, or another touch device; a voice command; a specific gesture; or other feasible inputs, which can be determined according to actual use requirements and are not limited by the embodiment of the application. The specific gesture in the embodiment of the application can be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, or a double-tap gesture; a click input or selection input can be a single-click input, a double-click input, any number of clicks, or a long-press or short-press input. For example, the first input may be a click input or a long-press input by the user on a light-emitting object displayed on the electronic device.
The wallpaper in the embodiment of the application can be wallpaper of the system desktop of the electronic device or wallpaper of the electronic device in the lock-screen state, and can be either dynamic or static wallpaper.
Based on this, since the user in the embodiment of the present application can adjust the display area of the light-emitting object in the first interface in different ways, step 110 is described in detail below with reference to these different ways.
In some embodiments of the present application, the first display information includes display position information, and the first input includes a first sub-input and a second sub-input. The first sub-input is an input used to trigger display of adjustment options, where an adjustment option is used to adjust the display position of the light-emitting object in the first interface; the second sub-input is an input used to trigger acquisition of the display position information. Based on this, step 110 may specifically include steps 1101 to 1103, as shown below.
Step 1101, a first sub-input of a user to a lighting object is received.
Illustratively, as shown in fig. 2 (a), the wallpaper includes a light object, the light object being sun 20, and the first sub-input may be a click input or a long press input, based on which a user may click on sun 20 displayed on the electronic device, such that the electronic device may receive a click input from the user on sun 20 displayed on the electronic device; or the user may press the sun 20 displayed on the electronic device for a long time so that the electronic device may receive a long press input from the user to the sun 20 displayed on the electronic device.
In step 1102, N adjustment options are displayed in response to the first sub-input, where each adjustment option in the N adjustment options corresponds to a preset display position in the first interface, and N is a positive integer.
Illustratively, as shown in fig. 2 (b), the adjustment options may relate to the display position of the sun in the first interface. Taking N = 3 as an example, 3 adjustment options are displayed, namely adjustment option 1 "left aligned", adjustment option 2 "centered", and adjustment option 3 "right aligned". Adjustment option 1 corresponds to the preset display position 21, adjustment option 2 corresponds to the preset display position 22, and adjustment option 3 corresponds to the preset display position 23.
Alternatively, as shown in fig. 2 (c), the correspondence between the adjustment options and the preset display positions is established using the time period of the light-emitting object as a bridge. Taking N = 3 as an example, 3 adjustment options are displayed, namely adjustment option 4 "morning", adjustment option 5 "noon", and adjustment option 6 "evening". Since the electronic device displays the sun rising from the left and setting on the right in the two-dimensional plane, the "morning" sun generally corresponds to the preset display position 24, the "noon" sun corresponds to the preset display position 25, and the "evening" sun corresponds to the preset display position 26.
Step 1103, receiving a second sub-input of the user to a target adjustment option of the N adjustment options, where the target adjustment option corresponds to a preset target display position.
Illustratively, taking the example shown in fig. 2 (b), if the user selects adjustment option 2 "centered" of the 3 adjustment options, that is, the electronic device receives the second sub-input of the user to the target adjustment option "centered", then the sun 20 may be displayed at the preset display position 22, as shown in fig. 2 (d).
Taking the example shown in fig. 2 (c) as an example, if the user selects the adjustment option 6 "evening" of the 3 adjustment options, that is, the electronic device receives the second sub-input of the user to the target adjustment option 6 "evening" of the 3 adjustment options, then, as shown in fig. 2 (e), the sun 20 may be displayed at the preset display position 26.
In this way, the display position of the light-emitting object can be adjusted according to the user's needs, so that display information such as illumination intensity information and illumination range information of the light-emitting object can be adjusted based on the target display position. Effects such as sunrise and sunset can thus be displayed without replacing the wallpaper, reducing the wallpaper-replacement operations required of the user.
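Steps 1101 to 1103 can be illustrated with a simple option-to-position table. The option labels follow fig. 2 (b), but the coordinates are invented for this sketch, since the patent only states that each option corresponds to a preset display position:

```python
# Hypothetical mapping from adjustment options to preset display
# positions (the normalized x, y coordinates are invented for this
# sketch; the patent does not specify concrete coordinates).
PRESET_POSITIONS = {
    "left aligned": (0.1, 0.2),   # cf. preset display position 21
    "centered": (0.5, 0.1),       # cf. preset display position 22
    "right aligned": (0.9, 0.2),  # cf. preset display position 23
}

def apply_target_option(option):
    # Step 1103: the selected target adjustment option determines the
    # preset target display position of the light-emitting object.
    return PRESET_POSITIONS[option]
```

Selecting "centered" then yields the coordinates associated with preset display position 22 in this sketch.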
In other embodiments of the present application, the user may adjust the display position of the light-emitting object at will according to his or her own needs. Based on this, the first display information includes display position information, and the first input includes a third sub-input and a fourth sub-input: the third sub-input is an input by which the user adjusts the display area of the light-emitting object in the first interface, and the fourth sub-input is an input used to trigger acquisition of the display position information. The first interface includes a first display area and a second display area, so that step 110 may specifically include steps 1104 to 1106, as described in detail below.
Step 1104, receiving a third sub-input from the user to the light object.
Illustratively, still referring to FIG. 2 (a), the wallpaper includes a light object, the light object being sun 20, the user being able to press the sun 20 displayed on the electronic device for a long time, such that the electronic device is able to receive a third sub-input of the user to the sun 20.
In step 1105, in response to the third sub-input, a movement control is displayed, the movement control being for changing a display area of the lighted object in the first interface according to the user input.
Illustratively, as shown in fig. 3 (a), a movement control 30 is displayed. The movement control 30 may be displayed in the first interface in the form of a copy of the light-emitting object's image, and the user may move it. Based on this, the electronic device can change the display area of the light-emitting object in the first interface, so as to adjust display information such as illumination intensity information and illumination range information based on the different display areas, making it convenient for the user to preview the display effect of the light-emitting object at different positions in the first interface.
At step 1106, a fourth sub-input to the movement control is received, the fourth sub-input for moving the lighted object from the first display area to the second display area.
As shown in fig. 3 (b), in the embodiment of the present application, taking the first display area as the initial display area 31 of the sun 20 as an example, the user may move the movement control 30 from the initial first display area 31 to the second display area 32, so that the electronic device may update the display position of the sun 20 at the first interface from the first display area to the second display area 32.
It should be noted that, in the above example, the first display area is described taking the initial display area as an example. In actual operation, the first display area 31 may be the initial display area of the sun in the wallpaper, or the display area of the sun after the user has moved it at least once but before it is moved to the second display area 32, that is, the position of the movement control 30 before the user actually performs the final move, as still shown in fig. 3 (b).
In this way, the user can conveniently adjust the display area of the light-emitting object according to actual needs, and the position of the light-emitting object can be adjusted without changing the wallpaper, so that display information such as illumination intensity information and illumination range information can be adjusted based on the target display position. This improves the convenience and enjoyment of setting wallpaper.
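Steps 1104 to 1106 can be sketched as a minimal state update; the class and its fields below are assumptions made for illustration, not structures from the patent:

```python
# Minimal sketch of the drag-to-move behaviour in steps 1104-1106
# (the class and its fields are invented for illustration).

class LightEmittingObject:
    def __init__(self, display_area):
        # display_area: (x, y) of the current display area.
        self.display_area = display_area

    def move_to(self, new_area):
        # Step 1106: the fourth sub-input moves the object from the
        # first display area to the second display area; the previous
        # area is returned so display info can be recomputed per area.
        previous = self.display_area
        self.display_area = new_area
        return previous, new_area
```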
In addition, after step 110, the embodiment of the present application needs to detect whether the wallpaper includes a light-emitting object. Based on this, the display method provided in the embodiment of the present application may further detect the light-emitting object through steps 1501 to 1502.
In step 1501, image features of wallpaper are extracted by an image feature extraction algorithm.
The image feature extraction algorithm in the embodiment of the application includes, but is not limited to, algorithms for image feature extraction such as the scale-invariant feature transform (SIFT) and the histogram of oriented gradients (HOG).
Illustratively, image features of wallpaper are extracted by SIFT to identify the location and shape of the sun based on these image features.
In step 1502, in the case that the image feature matches the preset feature, it is determined that the wallpaper includes a light-emitting object.
Here, since the light-emitting object in the embodiment of the present application may be an object that produces light, such as the sun, the moon, stars, bright lights, or lanterns, the preset features in the embodiment of the application may include features that characterize a light-emitting object, such as pixel values and shapes. In this way, the image features can be matched against the preset features, and when they match, the wallpaper is determined to include a light-emitting object.
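The patent names SIFT and HOG for feature extraction; as a much simpler stand-in, the sketch below detects a light-emitting object by thresholding pixel brightness, one of the "preset features" (pixel value) the text mentions. The threshold and minimum-size values are invented for the sketch:

```python
import numpy as np

def contains_luminous_object(image, brightness_threshold=0.9, min_pixels=20):
    # image: H x W array of luminance values in [0, 1].
    # A region of sufficiently many sufficiently bright pixels is taken
    # as evidence of a light-emitting object (sun, moon, lamp, etc.).
    # Both parameter values are illustrative assumptions.
    bright = image >= brightness_threshold
    return int(bright.sum()) >= min_pixels
```

A production detector would also check shape (e.g. a roughly circular blob for the sun), as the patent's mention of shape features suggests.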
In this way, after receiving the user's input setting the wallpaper, the electronic device can automatically detect whether a light-emitting object exists in the wallpaper. If no light-emitting object exists, the first interface including the wallpaper and icons is displayed directly; if a light-emitting object exists, step 120 can be executed to generate light and shadow characteristic information based on the first display information of the light-emitting object in the first interface and the second display information of the icon in the first interface, and the first interface is adjusted through the light and shadow characteristic information, so that the adjusted second interface can display the effect of light emitted by a real light-emitting object illuminating an object.
It should be noted that, in the embodiment of the present application, step 120 may be triggered automatically when a light-emitting object exists in the wallpaper, or the user may be prompted to choose whether to enable the second interface adjusted by the light and shadow characteristic information. If the user chooses to enable the second interface, step 120 is entered; if the user chooses not to, the first interface, which includes the wallpaper and icons, is displayed. A user who previously chose not to enable the second interface may also enable it later based on actual needs.
Regarding step 120, the first display information in the embodiment of the present application includes at least one of the following: display position information, illumination intensity information, and illumination range information. The steps for acquiring each kind of first display information are given below in connection with different embodiments.
In some embodiments of the present application, the display position information may be acquired by at least one of the following.
In a first manner, based on the content shown in fig. 2 (a) to fig. 2 (c) in step 110, an embodiment of the present application provides the following step for obtaining display position information; based on this, step 120 may specifically include:
in response to the second sub-input, determining preset coordinate information of the preset target display position as the display position information of the light-emitting object in the first interface.
For example, since the preset display position has fixed preset coordinate information, if the user selects a target adjustment option such as "centering", the preset coordinate information corresponding to the preset display position 22 may be determined as the display position information of the sun in the first interface.
Similarly, if the user selects the target adjustment option, such as "evening," the preset coordinate information corresponding to the preset display position 26 may be determined as the display position information of the sun in the first interface.
In a second manner, based on the content as shown in fig. 3 (a) to fig. 3 (b) in step 110, the embodiment of the present application provides the following steps for acquiring the display position information, and based on this, step 120 may specifically include step 1201 and step 1202.
Step 1201, in response to the fourth sub-input, compares the first pixel information of the first wallpaper with the second pixel information of the second wallpaper to obtain difference pixel information, where the first wallpaper is the wallpaper when the light-emitting object is in the first display area, and the second wallpaper is the wallpaper when the light-emitting object is in the second display area.
For example, the feature extraction algorithm may be used to extract the features of the first wallpaper and the second wallpaper, so as to obtain the first pixel information of the first wallpaper and the second pixel information of the second wallpaper, and after comparing the first pixel information and the second pixel information, obtain the difference pixel information between the first wallpaper and the second wallpaper.
In step 1202, coordinates of the difference pixel point information are determined as display position information of the light emitting object in the first interface.
For example, there may be a plurality of differential pixel point information, and at this time, the differential pixel point information may form at least one graphic, and coordinates of the graphic may be determined as display position information of the light emitting object in the first interface.
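The comparison of steps 1201–1202 can be sketched as follows; the grayscale-image representation, difference threshold, and bounding-box computation are illustrative assumptions, not the patent's specified implementation:

```python
# Sketch of steps 1201-1202: locate the light-emitting object by diffing
# the first and second wallpapers. Wallpapers are nested lists of
# grayscale pixel values; the threshold is an illustrative assumption.

def diff_pixels(wallpaper_a, wallpaper_b, threshold=30):
    """Return coordinates where the two wallpapers differ noticeably."""
    diff = []
    for y, (row_a, row_b) in enumerate(zip(wallpaper_a, wallpaper_b)):
        for x, (p_a, p_b) in enumerate(zip(row_a, row_b)):
            if abs(p_a - p_b) > threshold:
                diff.append((x, y))
    return diff

def difference_region(diff):
    """Bounding box of the graphic formed by the difference pixel points;
    its coordinates serve as the display position information."""
    xs = [x for x, _ in diff]
    ys = [y for _, y in diff]
    return (min(xs), min(ys), max(xs), max(ys))

# Tiny 4x4 example: the bright "sun" pixel moved from (0, 0) to (3, 3),
# so both the old and new locations appear in the difference set.
first = [[255, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
second = [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 255]]
diff = diff_pixels(first, second)
```

In practice the feature extraction mentioned above would feed this comparison; here raw pixel values stand in for extracted features.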
In other embodiments of the present application, the illumination intensity information may be obtained by at least one of the following.
In a first manner, the illumination time interval of the light-emitting object is calculated from the color of the light-emitting object in the wallpaper and the coverage of the light beam it emits, so as to acquire the illumination intensity information corresponding to that interval.
Based on this, this step 120 may specifically include step 1203 and step 1204.
Step 1203, in response to the first input, generating an illumination time interval of the light-emitting object according to the color information of the light-emitting object in the wallpaper and the coverage of its emitted light beam.
For example, when the light-emitting object is the sun, the color information of the sun in the wallpaper may vary: the morning sun is relatively soft in color, appearing yellow or orange; the evening sun is more vivid, appearing red or purple; the noon sun is generally white or light yellow with high brightness.
Further, as shown in fig. 4, according to the natural law of the sun rising in the east and setting in the west: if the coverage of the light beam emitted by the light-emitting object in the wallpaper is 40, the illumination time interval of the sun is between 6 a.m. and 9 a.m.; if the coverage is 41, the illumination time interval is around noon, for example between 10 a.m. and 2 p.m.; if the coverage is 42, the illumination time interval is between 4 p.m. and 6 p.m.
Therefore, the illumination time interval of the luminous object can be accurately locked based on the color information of the luminous object in the wallpaper and the coverage range of the emitted light beam.
In step 1204, according to the association information of the preset illumination time interval and the preset illumination intensity information, the illumination intensity information corresponding to the illumination time interval is obtained.
In addition, it should be noted that, in the embodiment of the present application, the color information may be obtained based on a color histogram; based on this, the display method may further include step 2101 and step 2102.
Step 2101, calculating pixel values of a luminous object according to a color histogram of wallpaper;
In step 2102, a pixel value of the light emitting object is determined as color information of the light emitting object in the first interface.
For example, a color-histogram-based method may be used to analyze the color information of the sun in the wallpaper. By judging the color information reflected by the sun's pixels, the sunlight can be classified as morning light, noon light, or sunset light: morning light is relatively soft and appears yellow or orange; noon light is white or light yellow with high brightness; sunset light is more vivid and appears red or purple.
Therefore, the color information of the relatively accurate luminous object in the first interface can be obtained.
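Steps 2101–2102 can be sketched as follows; the RGB thresholds used to separate the three light categories are illustrative assumptions, not values from the patent:

```python
# Sketch of steps 2101-2102: build a color histogram of the sun's
# pixels, take the most frequent value as the object's color, and map
# it to the light categories named in the text.

from collections import Counter

def color_histogram(pixels):
    """Count occurrences of each (r, g, b) value among the sun's pixels."""
    return Counter(pixels)

def dominant_color(pixels):
    """The most frequent pixel value is taken as the object's color."""
    return color_histogram(pixels).most_common(1)[0][0]

def classify_light(rgb):
    """Map a dominant color to morning/noon/sunset light.
    The channel thresholds are illustrative assumptions."""
    r, g, b = rgb
    if r > 200 and g > 200 and b > 180:
        return "noon light"        # white / light yellow, high brightness
    if r > 200 and b > 150 and g < 120:
        return "sunset light"      # red / purple, vivid
    return "morning light"         # yellow / orange, soft

sun_pixels = [(250, 180, 60), (250, 180, 60), (255, 255, 255)]
```

A usage example: `classify_light(dominant_color(sun_pixels))` yields the light category, which then feeds the interval classification above.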
In a second manner, differing from the first manner in which the illumination time interval is generated from the color information of the light-emitting object in the wallpaper and the coverage of its emitted light beam, an interval that includes the capture time of the image corresponding to the wallpaper may instead be determined as the illumination time interval of the light-emitting object.
For example, when the image corresponding to the wallpaper is shot, a shooting time such as 10 a.m. is recorded for the image; an interval spanning 5 minutes before and after 10 a.m., i.e., from 9:55 a.m. to 10:05 a.m., may then be determined as the illumination time interval of the light-emitting object.
Here, apart from the manner of determining the illumination time interval, the illumination intensity information corresponding to the interval may be acquired in the same way as described above.
In still other embodiments of the present application, the illumination range information may be obtained by at least one of the following.
In one mode, illumination range information is defined based on the vertical distance from a light-emitting object in the wallpaper to a reference plane. Based on this, the step 120 may specifically include a step 1205 and a step 1206.
In step 1205, in response to the first input, a reference plane corresponding to the light-emitting object is generated from a coverage area of the light-emitting object emitting a light beam in the wallpaper.
Illustratively, as shown in fig. 5 (a), the elevation angle of the sun 20 at noon is approximately 90 degrees. An object reference plane 50 is introduced as a relative reference system, and the height and angle of the sun relative to this plane are calculated to obtain the coverage of the light beam the sun emits with respect to this plane. Here, the reference plane may be the plane of any object in the wallpaper that can be illuminated, such as a road, a lawn, a water surface, a group of buildings, or a group of trees.
It should be noted that, if no suitable reference plane can be found in the image corresponding to the wallpaper, a plane located 1/3 of the screen height below the sun may be used as the reference plane by default.
In step 1206, illumination range information of the light-emitting object in the first interface is determined based on the distance of the light-emitting object from the reference plane.
Illustratively, the closer the light-emitting object is to the reference plane, the smaller the illumination range; the size of this range is described below in terms of the number of icons it covers in the first interface.
As shown in fig. 5 (b), when the display size of the sun is large and the sun is close to the reference plane, the illumination range information of the light-emitting object in the first interface may be the range occupied by the bottom three rows of the 3×6 icon grid (i.e., the gray-marked icons).
Or, as shown in fig. 5 (c), when the display size of the sun is smaller than in fig. 5 (b) and its height above the reference plane is greater than in fig. 5 (b), the illumination range information may be the range occupied by the bottom four rows of the 3×6 icon grid (i.e., the gray-marked icons).
Or, as shown in fig. 5 (d), when the display size of the sun is smaller than in fig. 5 (c) and its height above the reference plane is greater than in fig. 5 (c), the illumination range information may be the range occupied by the bottom five rows of the 3×6 icon grid (i.e., the gray-marked icons).
Here, it should be noted that, in the embodiment of the present application, the illumination range information may refer to a range in a two-dimensional plane, or may refer to the illumination angle and illumination range in a three-dimensional space.
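The height-to-rows relation illustrated in fig. 5 (b) to fig. 5 (d) can be sketched as a simple monotone mapping; the linear formula and its coefficients are illustrative assumptions chosen only to reproduce the 3-, 4-, and 5-row examples:

```python
# Sketch of steps 1205-1206: the higher the sun sits above the
# reference plane, the more icon rows (counted from the bottom) fall
# inside the illumination range.

GRID_ROWS = 6  # the 3x6 icon grid of the figures

def lit_rows(height_ratio, max_rows=GRID_ROWS):
    """Number of bottom icon rows inside the illumination range, given
    the sun's height above the reference plane as a fraction of the
    screen height. The linear mapping is an illustrative assumption."""
    rows = 2 + round(height_ratio * 6)
    return max(1, min(max_rows, rows))
```

For instance, a low sun (`height_ratio` ≈ 0.17) lights three rows as in fig. 5 (b), while a high sun lights more, clamped to the grid height.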
Referring to step 130, in the embodiment of the present application, the light and shadow characteristic information may be used to simulate the effect of the light-emitting object illuminating an object, i.e., it is information for adjusting the display effect of the icons in the first interface.
In some embodiments of the present application, the light and shadow characteristic information is light edge effect information, which may include the size and brightness information of the region outside an icon's edge. The first display information includes display position information and illumination range information; the second display information includes the arrangement position information of the icons in the first interface and the display size of the icons in the first interface.
Based on this, the step 130 may specifically include steps 1301 to 1303.
Step 1301, selecting a target icon from the icons according to the display position information and the illumination range information.
For example, referring to the contents shown in fig. 5 (b) to fig. 5 (d), the gray icons therein may be regarded as the target icons.
Step 1302, determining the size of the edge outer area of the target icon according to the display position information, the arrangement position information and the display size.
Specifically, step 1302 may include step 13021 and step 13022.
Step 13021 calculates a distance value from the display position information to a first display side of the target icon, which is the side of the target icon near the light-emitting object, based on the display position information and the arrangement position information.
Illustratively, taking the display position information of the sun and the arrangement position information of the icons in fig. 5 (b) as an example, a distance value from the sun 20 to the upper edge of each target icon is calculated: for example, the distance from the sun 20 to the first target icon in the fourth row is 5, to the second target icon in the fourth row is 4, and to the third target icon in the fourth row is 5; the distance to the first target icon in the fifth row is 10, to the second target icon in the fifth row is 8, and to the third target icon in the fifth row is 10.
Step 13022, setting the size of the edge outside area of a non-occluded target icon whose distance value is smaller than or equal to a preset threshold to a first size; setting the size of the edge outside area of a non-occluded target icon whose distance value is larger than the preset threshold to a second size; and setting the size of the edge outside area of an occluded target icon whose distance value is larger than the preset threshold to a third size; wherein the second size is between the first size and the third size, and the first size is greater than the third size.
By way of example, still taking fig. 5 (b) with a preset threshold of 6: since the third row of icons is illuminated by the sun, the fourth row of target icons may be regarded as non-occluded target icons. Based on this, as shown in fig. 6, the size of the edge outside area of the fourth-row target icons may be set to the first size, that of the fifth-row target icons to the second size, and that of the sixth-row target icons to the third size.
In step 1303, according to the size of the edge outside area of the target icon, brightness information corresponding to the size of the edge outside area of the target icon is generated.
Specifically, step 1303 may include:
According to the relation information of the size of the outer side area of the preset edge and the preset brightness information, respectively acquiring first brightness information corresponding to the first size, second brightness information corresponding to the second size and third brightness information corresponding to the third size; wherein the brightness value of the second brightness information is between the brightness value of the first brightness information and the brightness value of the third brightness information, and the brightness value of the first brightness information is larger than the brightness value of the third brightness information.
For example, as shown in fig. 6, the brightness value of the brightness information of the area outside the edge of the target icon of the fourth row may be set to be highest, that is, to be brightest, and the brightness of the area outside the edge of the target icon of the fifth row is lower than that of the fourth row, and finally, the brightness of the area outside the edge of the target icon of the sixth row is set to be darker than that of the area outside the edge of the target icon of the fifth row.
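Steps 1302–1303 combine into a pair of lookups: distance and occlusion select a size, and the size selects a brightness. A minimal sketch, in which the concrete size and brightness values are illustrative assumptions and only the ordering (first > second > third) follows the text:

```python
# Sketch of steps 1302-1303 (fig. 5(b) / fig. 6): assign each target
# icon an edge-outside-area size from its distance to the sun and its
# occlusion state, then derive brightness from the size.

FIRST, SECOND, THIRD = 3, 2, 1            # first > second > third size
BRIGHTNESS = {FIRST: 1.0, SECOND: 0.6, THIRD: 0.3}  # brighter for larger

def edge_size(distance, occluded, threshold=6):
    """Step 13022: size of the edge outside area for one target icon."""
    if not occluded and distance <= threshold:
        return FIRST                      # near and directly lit
    if not occluded:
        return SECOND                     # far but directly lit
    return THIRD                          # occluded by other icons

def edge_brightness(distance, occluded):
    """Step 1303: brightness follows from the size via preset relation."""
    return BRIGHTNESS[edge_size(distance, occluded)]

# Fourth row of fig. 5(b): distances 5, 4, 5 and not occluded,
# so every icon in the row receives the first (largest) size.
fourth_row = [edge_size(d, occluded=False) for d in (5, 4, 5)]
```

With the same threshold of 6, the fifth-row distances (10, 8, 10) yield the second size, and occluded sixth-row icons the third, matching fig. 6.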
Here, as shown in fig. 6, the description takes all 4 sides of an icon as an example; alternatively, the brightness information of the edge outside area of only one side, two sides, or any other subset of the 4 sides may be adjusted.
In the embodiment of the application, the brightness information of the outside area of the preset edge can be adjusted, and the color information of the outside area of the preset edge can be adjusted according to the illumination time interval of the sun.
The icons in the embodiment of the application can comprise at least one of the following icons: icons of applications, icons of operating systems, icons of gadgets of operating systems, icons of folders.
In other embodiments of the present application, the shadow characteristic information is halo effect information, which may include a halo region corresponding to the icon and halo information of the halo region, including halo intensity and halo pattern. Based on this, the first display information includes display position information, illumination range information, and illumination intensity information; the second display information includes arrangement position information of the icons in the first interface.
Based on this, the step 130 may specifically include steps 1304 to 1305.
In step 1304, a halo position at which the light-emitting object generates a halo is determined based on the display position information and the arrangement position information.
Illustratively, as shown in fig. 7, since the target icons start from the fourth row, they may still be determined in the manner described above, and the halo positions may be selected randomly among the fourth to sixth rows; here, the halo positions may be set at the positions of the second and third icons of the fourth row and the second and third icons of the fifth row.
Step 1305, acquiring the halation intensity corresponding to the illumination intensity information according to the association information of the preset illumination intensity information and the preset halation intensity information; and acquiring a halation pattern corresponding to the illumination range according to the preset illumination range information and the association information of the preset halation pattern.
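Step 1305 is two table lookups over preset association information. A minimal sketch, in which the table keys and values are illustrative assumptions:

```python
# Sketch of step 1305: look up halo intensity from the illumination
# intensity and halo pattern from the illumination range, using preset
# association tables (contents are illustrative assumptions).

HALO_INTENSITY = {"low": 0.2, "medium": 0.5, "high": 0.9}
HALO_PATTERN = {"narrow": "ring", "medium": "soft disc", "wide": "radial burst"}

def halo(illumination_intensity, illumination_range):
    """Return the halo information for the given first display information."""
    return {
        "intensity": HALO_INTENSITY[illumination_intensity],
        "pattern": HALO_PATTERN[illumination_range],
    }

# Example: strong light over a medium illumination range.
effect = halo("high", "medium")
```

The halo positions selected in step 1304 would then be rendered with this intensity and pattern.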
In other embodiments of the present application, the light and shadow characteristic information is beam effect information, which may include color information and transparency information of a beam display area corresponding to the icon. The first display information comprises display position information, illumination range information and illumination intensity information; the second display information includes arrangement position information of the icons in the first interface.
Based on this, the step 130 may specifically include steps 1306 through 1308.
Step 1306, generating illumination angle information of the luminous object according to the display position information and the illumination range information.
Illustratively, as shown in fig. 8, still taking fig. 5 (b) as an example, the illumination angle information may be, for example, 45 degrees relative to the reference plane, or perpendicular to it.
In step 1307, a beam display area emitted by the light-emitting object is determined from the first interface based on the illumination angle information and the arrangement position information.
For example, based on the illumination angle information and the arrangement position information, it is determined whether light passes through the gaps between icons. This may be determined by linear scanning: specifically, as shown in fig. 5 (a), a cone of rays is cast outward from the sun as the origin, each dotted line representing one ray; if a ray intersects an icon, that ray is blocked, while rays that intersect no icon pass through the icon gaps. Based on this, when it is determined that a light beam can be formed through a gap, the light beam display area can be established from the angle of the beam and the position of the gap. Specifically, because the beam forms over a long distance through a narrow gap, no scattering occurs; therefore, after the rays intersecting icons are removed, if the two sides of the resulting shape are parallel or converge, a beam effect is formed, and the area between the two sides is the beam range.
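The linear-scanning idea of step 1307 can be sketched as follows; the rectangle representation of icons, the ray-marching step size, and the sampled angles are illustrative assumptions:

```python
# Sketch of step 1307: cast rays from the sun and keep only those that
# pass through gaps between icons; the surviving rays bound the beam
# display area. Icons are axis-aligned rectangles (x0, y0, x1, y1).

import math

def ray_hits_icon(origin, angle_deg, icon, steps=100, step_len=2.0):
    """March along the ray and report whether it enters the icon rect."""
    ox, oy = origin
    dx = math.cos(math.radians(angle_deg))
    dy = math.sin(math.radians(angle_deg))
    x0, y0, x1, y1 = icon
    for i in range(1, steps + 1):
        px, py = ox + dx * step_len * i, oy + dy * step_len * i
        if x0 <= px <= x1 and y0 <= py <= y1:
            return True
    return False

def beam_rays(origin, angles, icons):
    """Angles whose rays intersect no icon pass through the gaps; the
    area they bound is the beam display range."""
    return [a for a in angles
            if not any(ray_hits_icon(origin, a, icon) for icon in icons)]

# Sun above a gap between two icons (gap between x=45 and x=55):
# only the ray pointing straight down through the gap survives.
sun = (50, 0)
icons = [(0, 40, 45, 60), (55, 40, 100, 60)]
passing = beam_rays(sun, angles=[80, 90, 100], icons=icons)
```

Step 1308 would then color the bounded area and set its transparency from the illumination range and intensity information.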
Step 1308, generating color information of the light beam display area and transparency information corresponding to the color information according to the illumination range information and the illumination intensity information.
In addition, the light and shadow characteristic information includes at least one of light edge effect information, halo effect information, and light beam effect information; the second interface includes a sub-interface. Based on this, the display method may further include steps 3101 to 3103 before step 130.
Step 3101, displaying at least one of the following options: a light edge effect option corresponding to the light edge effect information, a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect information.
Illustratively, as shown in fig. 9, a light edge effect option corresponding to the light edge effect information, a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect are displayed.
At step 3102, a third input of a target option from the user is received.
Illustratively, with continued reference to fig. 9, the user may select the halo-effect option, i.e., the electronic device may receive a third input from the user to the target option, i.e., the halo-effect option.
In step 3103, in response to the third input, a sub-interface corresponding to the target option is displayed, the sub-interface being an interface adjusted by the light and shadow characteristic information corresponding to the target option.
For example, the sub-interface corresponding to the target option may refer to fig. 7, i.e., in response to the third input, the display of fig. 7 may be switched from fig. 9.
Thus, the light and shadow characteristic information can be controlled through the options shown in fig. 9, allowing the user to enable or disable each effect at any time, which improves the customizability and playability of the effects.
Here, the target option may include at least one of 3 options as shown in fig. 9, so that the user determines whether to superimpose the display effect according to the options.
It should be noted that, the wallpaper is a dynamic wallpaper, the wallpaper includes a movable object, the light and shadow characteristic information includes first light and shadow characteristic information and second light and shadow characteristic information, the first light and shadow characteristic information is used for adjusting the display effect of the icon under the light-emitting object, and the second light and shadow characteristic information is used for adjusting the display effect of the movable object under the light-emitting object.
Illustratively, the wallpaper is a dynamic wallpaper including the sun and a movable object such as a cat. The first light and shadow characteristic information is used for adjusting the light edge, halo, and beam display effects of the icons, while the second light and shadow characteristic information is used for adjusting the display effect of the cat: the cat may exhibit different states under different illumination, for example playing when lit by morning sunlight and sleeping when lit by noon sunlight.
In step 140, the embodiment of the present application may generate the second interface in the following two manners, which is specifically described below.
In some embodiments of the present application, before step 140, the first interface corresponds to a wallpaper layer and an icon layer, the wallpaper layer is used to display wallpaper, the icon layer is used to display icons, and the icon layer is overlaid on the wallpaper layer, based on which the display method may include step 4101 and step 4102.
In step 4101, a lighting effect layer is generated based on the lighting characteristic information.
Illustratively, as shown in fig. 10 (a), the wallpaper layer is the bottommost of the three layers and is set as the light source layer. It displays the light-emitting object, such as the sun 20, and can present beautiful sunrise and sunset scenery; it can also be adjusted flexibly according to the user's requirements, since the position of the sun and the direction of its light determine the display effect of the icon layer and the lighting effect layer. As shown in fig. 10 (b), the icon layer is the intermediate layer; it contains the various icons and can occlude the wallpaper layer. As shown in fig. 10 (c), the lighting effect layer is the uppermost layer, displayed superimposed on the icon layer. Based on this, the size, brightness, and flashing effect of the edge outside areas of the icons on the icon layer can be calculated from the display position information of the light-emitting object on the wallpaper layer, the arrangement position information of the icons, and the illumination range information, and the lighting effect layer can be generated accordingly; it can display the halo effect shown in fig. 7 and the Tyndall effect, i.e., the light beam shown in fig. 8. Of course, the halo area and halo pattern can also be configured by the user. The purpose of the lighting effect layer is to enhance the sense of connection between the wallpaper layer and the icon layer, forming a 3D effect in which the three layers are linked by light and the visual effect of the whole wallpaper is enhanced.
In step 4102, a lighting effect layer is displayed on the icon layer in a superimposed manner, so as to obtain a second interface.
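Steps 4101–4102 can be sketched as a simple bottom-to-top compositor; the dictionary representation of layers is an illustrative assumption:

```python
# Sketch of steps 4101-4102: generate the lighting effect layer from
# the light and shadow characteristic information, then superimpose the
# three layers (wallpaper / icons / lighting effect) bottom-to-top to
# obtain the second interface.

def make_lighting_effect_layer(shadow_info):
    """Step 4101: derive the uppermost layer from the light/shadow data."""
    return {"kind": "lighting_effect", "effects": shadow_info}

def compose_second_interface(wallpaper_layer, icon_layer, effect_layer):
    """Step 4102: stack the layers bottom-to-top."""
    return [wallpaper_layer, icon_layer, effect_layer]

wallpaper = {"kind": "wallpaper", "light_source": "sun"}
icons = {"kind": "icons", "grid": (3, 6)}
effects = make_lighting_effect_layer(["light edge", "halo", "beam"])
second_interface = compose_second_interface(wallpaper, icons, effects)
```

A real implementation would hand these layers to the platform's compositor; the list here only captures the stacking order described in fig. 10.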
In some embodiments of the present application, before step 140, the display method may include step 4103, rendering the first interface according to the shadow characteristic information to obtain the second interface.
Illustratively, as shown in fig. 10 (d), 10 (e) and 10 (f), as the sun or an icon moves, enlarges, or shrinks, different golden-edge positions and shapes, together with a flashing effect, are displayed according to the position of the sun relative to the reference plane; the more directly the sun shines, the more pronounced the golden edge and the stronger the flashing. Specifically, first, a 3D model of the sun and the reference plane is established, with the reference plane held still as the reference coordinate system. After the 3D model is built, illumination and materials are set: a main light source (representing the sun) and one or more auxiliary light sources may be configured, and the illuminated icons may be given a golden material that produces reflections under illumination. Next comes animation design, i.e., the moving, enlarging, and shrinking animations of the sun and icons, rendered in real time as parameters change. Further, the golden-edge effect is produced and rendered: it can be achieved by adding a golden shade or map around the target model, whose position and shape are adjusted according to the target's 3D position and the angle of the sun. Finally, the flashing effect is produced and rendered: it can be realized by adjusting the highlight attribute of the target material; when the sun shines directly, the intensity and range of the highlight can be increased to produce the flashing effect.
Therefore, the display method provided by the embodiment of the application relies on the physical characteristics and optical characteristics of the screen of the electronic equipment, and the wallpaper layer is set as the light source layer to form light to the upper layer content so as to simulate the effect of illuminating the object in reality, thereby presenting the effect similar to the light receiving effect of the object in reality on the screen, providing a more flexible, universal and easy-to-use wallpaper display scheme, and being capable of presenting the beautiful wallpaper effect of sunrise and sunset in various scenes.
The following describes in detail a display method provided by the embodiment of the application by taking a light-emitting object as the sun.
Step 1: since the icons and the sun are both variable elements, the position and intensity of the sun on the screen, the size of the icons, and the distance between the icon positions and the sun can all change dynamically, and the illumination effect should change along with them so as to conform to natural physical characteristics.
Step 2: detect the changed object. Sliding the screen or dragging an icon changes the display position information of the icons in the first interface; meanwhile, the sun in the wallpaper follows the current time, as in a dynamic wallpaper showing the transition from day to sunset. Therefore, when either of these two main objects changes, the changed object is determined by comparing the parameters of the current wallpaper with the parameters of the wallpaper in the previous time period. If the changed object is an icon, step 3 is executed; if the changed object is the sun, step 4 is executed.
Step 3: an icon changes in two cases, namely a change in its display position information and a change in its display size. Both can occur through interactions such as sliding the screen or dragging the icon, so the second display information of the icons in the first interface can be calculated in real time from the changed information.
Step 4: the sun changes from rising to setting according to natural law, and the first display information of the sun in the first interface changes over time; therefore, the first display information of the light-emitting object in the first interface can be calculated in real time from the changed information.
Step 5: calculate the light and shadow characteristic information from the first display information and the second display information, and update the display effect of the first interface accordingly, for example updating the light edge effect of illuminated icons, the halo effect of illuminated icons, and the beam effect in the gaps between illuminated icons.
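The detection-and-dispatch logic of steps 2 to 5 can be sketched as follows; the parameter names and the string return values are illustrative assumptions standing in for the actual recomputation:

```python
# Sketch of steps 2-5: determine which variable object (icon or sun)
# changed by comparing wallpaper parameters across time periods, then
# dispatch to the corresponding recomputation.

def detect_change(prev_params, cur_params):
    """Step 2: compare current parameters with the previous period's."""
    changed = [k for k in cur_params if cur_params[k] != prev_params.get(k)]
    if "icon_positions" in changed or "icon_sizes" in changed:
        return "icon"
    if "sun_position" in changed or "sun_intensity" in changed:
        return "sun"
    return None

def update_interface(prev_params, cur_params):
    """Steps 3-5: recompute the changed side's display information,
    after which the shadow effects would be recalculated (step 5)."""
    changed = detect_change(prev_params, cur_params)
    if changed == "icon":
        return "recompute second display information"   # step 3
    if changed == "sun":
        return "recompute first display information"    # step 4
    return "no update needed"

prev = {"sun_position": (10, 5), "icon_positions": "grid-a"}
cur = {"sun_position": (12, 6), "icon_positions": "grid-a"}
```

Here the sun moved between periods, so the first display information is recomputed and the light edge, halo, and beam effects are refreshed.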
In this way, the embodiment of the present application can effectively combine the original wallpaper and the icons by using the sun as a shared physical element, and present various illumination effects by reproducing how objects are illuminated by the sun in nature, turning a static single picture into a dynamic, natural, and realistic scene, so that the display interface of the electronic device is more harmonious and its display effect is improved.
In the display method provided by the embodiment of the present application, the execution body may be a display device. In the embodiment of the present application, the display device provided by the embodiment of the present application is described by taking the case where the display device executes the display method as an example.
The application also provides a display device. This is described in detail with reference to fig. 11.
Fig. 11 is a schematic structural diagram of a display device according to an embodiment of the present application.
As shown in fig. 11, the display device 110 may be applied to an electronic apparatus, and the display device 110 may specifically include:
A receiving module 1101, configured to receive a first input of a user, where the first input is used to set wallpaper of a first interface, and the wallpaper includes a light-emitting object;
an obtaining module 1102, configured to obtain first display information of a light-emitting object in a first interface in response to a first input;
A generating module 1103, configured to generate light and shadow characteristic information according to the first display information and the second display information of the icon in the first interface;
The display module 1104 is configured to display the second interface, the second interface being the first interface after adjustment by the light and shadow characteristic information.
The display device 110 in the embodiment of the present application will be described in detail as follows.
In some embodiments of the present application, the display apparatus 110 in the embodiments of the present application may further include a first determination module; wherein,
When the first display information includes display position information, the first input includes a first sub-input and a second sub-input, the first sub-input being used for triggering the display of adjustment options, the adjustment options being used for adjusting the display position of the light-emitting object in the first interface, and the second sub-input being used for triggering acquisition of the display position information; in this case, the receiving module 1101 may be specifically configured to receive the first sub-input of the user on the light-emitting object;
the display module 1104 may be further configured to display N adjustment options in response to the first sub-input, where each adjustment option in the N adjustment options corresponds to a preset display position in the first interface, and N is a positive integer;
The receiving module 1101 may be specifically configured to receive a second sub-input of a target adjustment option from the N adjustment options by the user, where the target adjustment option corresponds to a preset target display position;
And the first determining module is used for responding to the second sub-input and determining preset coordinate information of a preset target display position as display position information of the luminous object in the first interface.
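The mapping performed by the first determining module can be pictured as a simple lookup from adjustment option to preset coordinates; the option names and coordinate values below are purely hypothetical:

```python
# Hypothetical mapping from the N adjustment options to preset display
# positions; the option names and coordinates are illustrative only.
PRESET_POSITIONS = {
    "top_left": (0, 0),
    "top_right": (1080, 0),
    "center": (540, 960),
}

def position_from_option(target_option):
    # First determining module: the preset coordinates of the chosen target
    # adjustment option become the light-emitting object's display position.
    return PRESET_POSITIONS[target_option]
```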
In some embodiments of the present application, the display device 110 in the embodiments of the present application may further include a comparing module and a second determining module; wherein,
The receiving module 1101 may be specifically configured to, in a case where the first display information includes display position information, the first input includes a third sub-input and a fourth sub-input, the third sub-input is used for the user to adjust a display area of the light-emitting object in the first interface, the fourth sub-input is used for triggering acquisition of the display position information, and the first interface includes a first display area and a second display area, receive the third sub-input of the user on the light-emitting object;
The display module 1104 may be further configured to display, in response to the third sub-input, a movement control, where the movement control is used for changing the display area of the light-emitting object in the first interface according to user input;
The receiving module 1101 may be specifically configured to receive a fourth sub-input to the movement control, where the fourth sub-input is used to move the light-emitting object from the first display area to the second display area;
The comparing module is configured to, in response to the fourth sub-input, compare first pixel point information of a first wallpaper with second pixel point information of a second wallpaper to obtain difference pixel point information, where the first wallpaper is the wallpaper when the light-emitting object is in the first display area, and the second wallpaper is the wallpaper when the light-emitting object is in the second display area;
And the second determining module is configured to determine the coordinates of the difference pixel point information as the display position information of the light-emitting object in the first interface.
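The pixel-difference comparison can be sketched as follows; the tiny 3x3 "wallpapers" and the centroid reduction at the end are illustrative assumptions, not the patent's actual procedure:

```python
# Hypothetical sketch: render the wallpaper with the light-emitting object in
# the first display area and in the second, and locate the object through the
# pixels that differ between the two renderings.
def diff_pixels(first_wallpaper, second_wallpaper):
    # Each wallpaper is a row-major grid of pixel values.
    return [(x, y)
            for y, (row_a, row_b) in enumerate(zip(first_wallpaper, second_wallpaper))
            for x, (a, b) in enumerate(zip(row_a, row_b))
            if a != b]

def position_from_diff(diff):
    # One plausible reduction: the centroid of the differing pixel coordinates.
    xs = [p[0] for p in diff]
    ys = [p[1] for p in diff]
    return (sum(xs) // len(xs), sum(ys) // len(ys))

wall_a = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]  # light object in the middle
wall_b = [[9, 0, 0], [0, 0, 0], [0, 0, 0]]  # light object moved to the corner
diff = diff_pixels(wall_a, wall_b)
```

A production implementation would diff full bitmaps (e.g., per-channel with a tolerance), but the principle of deriving position information from the differing pixels is the same.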
In some embodiments of the present application, the generating module 1103 may be specifically configured to, in a case where the first display information includes illumination intensity information, generate, in response to the first input, an illumination time interval of the light-emitting object according to the color information of the light-emitting object in the wallpaper and the coverage of the emitted light beam;
The obtaining module 1102 may be further configured to obtain, according to the association information of the preset illumination time interval and the preset illumination intensity information, illumination intensity information corresponding to the illumination time interval.
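The two-step lookup above — color plus beam coverage to a time interval, then a preset association to an intensity — might be sketched like this; the color categories, interval names, thresholds, and intensity values are all invented for illustration:

```python
# Hypothetical lookup chain: the light source's color suggests a time-of-day
# interval, and a preset association maps that interval to an illumination
# intensity. Categories, thresholds, and values are illustrative assumptions.
def interval_from_color(color, beam_coverage):
    # Warm, wide-reaching light reads as dusk; cool light reads as night.
    if color == "warm" and beam_coverage > 0.5:
        return "dusk"
    if color == "cool":
        return "night"
    return "noon"

# Preset association between illumination time intervals and intensities.
PRESET_INTENSITY = {"dusk": 0.4, "night": 0.2, "noon": 1.0}

def intensity_for(color, beam_coverage):
    return PRESET_INTENSITY[interval_from_color(color, beam_coverage)]
```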
In some embodiments of the present application, the display apparatus 110 in the embodiments of the present application may further include a third determining module; wherein,
The obtaining module 1102 may be specifically configured to, in a case where the first display information includes illumination range information, generate, in response to the first input, a reference plane corresponding to the light-emitting object according to the coverage of the light beam emitted by the light-emitting object in the wallpaper;
And determine the illumination range information of the light-emitting object in the first interface based on the distance between the light-emitting object and the reference plane.
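A minimal sketch of the reference-plane idea, under the assumption (mine, not the patent's) that the plane sits at the far edge of the beam's pixel coverage and that the range grows with the object's distance from it:

```python
# Hypothetical sketch: place the reference plane at the far edge of the beam
# coverage and derive the illumination range from the distance between the
# light-emitting object and that plane. All geometry here is illustrative.
def reference_plane_y(beam_pixels):
    # beam_pixels: (x, y) coordinates covered by the emitted light beam.
    return max(y for _, y in beam_pixels)

def illumination_range(light_pos, beam_pixels):
    # Range grows with the object's distance from the reference plane.
    return reference_plane_y(beam_pixels) - light_pos[1]
```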
In some embodiments of the present application, the generating module 1103 may be specifically configured to, in a case where the first display information includes display position information and illumination range information, the second display information includes arrangement position information of the icons in the first interface and display sizes of the icons in the first interface, and the light and shadow characteristic information includes size and brightness information of the area outside the edge of each icon, screen a target icon from the icons according to the display position information and the illumination range information;
Determine the size of the area outside the edge of the target icon according to the display position information, the arrangement position information, and the display size;
And generate, according to the size of the area outside the edge of the target icon, brightness information corresponding to that size.
In some embodiments of the present application, the generating module 1103 may be specifically configured to, in a case where the first display information includes display position information, illumination range information, and illumination intensity information, the second display information includes arrangement position information of the icons in the first interface, the light and shadow characteristic information includes a halo region corresponding to the icons and halo information of the halo region, and the halo information includes a halo intensity and a halo pattern, determine, according to the display position information and the arrangement position information, a halo position where the light-emitting object generates a halo;
Acquire the halo intensity corresponding to the illumination intensity information according to association information of preset illumination intensity information and preset halo intensity information; and acquire the halo pattern corresponding to the illumination range information according to association information of preset illumination range information and a preset halo pattern.
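The two preset associations for the halo effect might look like the tables below; the scaling factor, the range threshold, and the pattern names are invented for illustration:

```python
# Hypothetical preset association tables for the halo effect. The scaling
# factor, threshold, and pattern names are illustrative, not from the patent.
def halo_intensity(illumination_intensity):
    # Preset association: stronger illumination yields a stronger halo.
    return round(illumination_intensity * 0.6, 2)

PRESET_HALO_PATTERNS = {"narrow": "ring", "wide": "glow"}

def halo_pattern(illumination_range):
    # Preset association: a wide illumination range produces a diffuse glow,
    # a narrow range a crisp ring.
    return PRESET_HALO_PATTERNS["wide" if illumination_range > 200 else "narrow"]
```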
In some embodiments of the present application, the generating module 1103 may be specifically configured to, in a case where the first display information includes display position information, illumination range information, and illumination intensity information, the second display information includes arrangement position information of the icons in the first interface, and the light and shadow characteristic information includes color information and transparency information of a light beam display area corresponding to the icons, generate illumination angle information of the light-emitting object according to the display position information and the illumination range information;
defining a beam display area emitted by the light-emitting object from the first interface based on the illumination angle information and the arrangement position information;
And generating color information of the light beam display area and transparency information corresponding to the color information according to the illumination range information and the illumination intensity information.
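The beam-effect steps — an illumination angle from the light position, an in-beam test against the icon layout, and a color with intensity-dependent transparency — can be sketched as follows; the bearing test and the alpha formula are illustrative assumptions:

```python
# Hypothetical sketch of the beam-effect steps. The in-beam test and the
# transparency formula are illustrative assumptions, not the patented method.
import math

def illumination_angle(light_pos, target_pos):
    # Angle of the beam from the light source toward a point of interest.
    return math.degrees(math.atan2(target_pos[1] - light_pos[1],
                                   target_pos[0] - light_pos[0]))

def beam_covers(icon, light_pos, angle_deg, spread_deg):
    # The icon lies in the beam display area if its bearing from the light
    # source falls within the beam's angular spread.
    bearing = illumination_angle(light_pos, (icon["x"], icon["y"]))
    return abs(bearing - angle_deg) <= spread_deg / 2

def beam_color_and_alpha(base_color, intensity):
    # Stronger light makes the beam overlay more opaque (less transparent).
    return base_color, round(0.2 + 0.6 * intensity, 2)
```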
In some embodiments of the present application, the display module 1104 may be further configured to, in a case where the light and shadow characteristic information includes at least one of light edge effect information, halo effect information, and beam effect information, and the second interface includes a sub-interface, display at least one of the following options: a light edge effect option corresponding to the light edge effect information, a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect information;
The receiving module 1101 may also be configured to receive a third input of a target option from a user;
The display module 1104 may be further configured to display a sub-interface corresponding to the target option in response to the third input, where the sub-interface is an interface adjusted by the light and shadow characteristic information corresponding to the target option.
The display device in the embodiment of the present application may be an electronic device or a component in the electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like, which is not particularly limited in the embodiments of the present application.
The display device in the embodiment of the application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, and the embodiment of the present application is not limited specifically.
The display device provided by the embodiment of the present application can implement each process implemented by the embodiments of the display method shown in fig. 1 to 10 and achieve the same technical effects; to avoid repetition, a detailed description is omitted here.
Based on the above, the display device provided in the embodiment of the present application may set wallpaper of a first interface in an electronic device based on user input. When the wallpaper includes a light-emitting object, the device obtains first display information of the light-emitting object in the first interface and second display information of an icon in the first interface, generates light and shadow characteristic information according to the first display information and the second display information, and then displays, as a second interface, the first interface adjusted by the light and shadow characteristic information. In this way, the first interface can be adjusted through the light and shadow characteristic information, so that the adjusted second interface can simulate the effect of light emitted by a real light source illuminating an object. This increases the color contrast between the wallpaper and the icons in the first interface, combines the wallpaper and the icons displayed in the first interface through the physical boundary element of the light-emitting object, and reproduces the various illumination effects presented when a natural light source illuminates an object, turning a static single picture into a dynamic, natural, and real scene. The display interface of the electronic device thus becomes more harmonious, the tonal coordination between the wallpaper and the icons increases, and the display effect of the display interface is improved.
Optionally, as shown in fig. 12, the embodiment of the present application further provides an electronic device 120, including a processor 1201 and a memory 1202, where the memory 1202 stores a program or an instruction executable on the processor 1201. When executed by the processor 1201, the program or instruction implements each step of the display method embodiments and achieves the same technical effects; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 13 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
The electronic device 1300 includes, but is not limited to: radio frequency unit 1301, network module 1302, audio output unit 1303, input unit 1304, sensor 1305, display unit 1306, user input unit 1307, interface unit 1308, memory 1309, processor 1310, and the like.
Those skilled in the art will appreciate that the electronic device 1300 may also include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 1310 through a power management system, so as to perform functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not described in detail herein.
Wherein, in the embodiment of the present application, the user input unit 1307 is configured to receive a first input of a user, where the first input is used to set wallpaper of a first interface, and the wallpaper includes a light-emitting object;
a processor 1310 for obtaining first display information of a light emitting object in a first interface in response to a first input;
a processor 1310 for generating light and shadow characteristic information according to the first display information and the second display information of the icon in the first interface;
The display unit 1306 is configured to display a second interface, where the second interface is the first interface adjusted according to the light and shadow characteristic information.
The electronic device 1300 is described in detail below.
In some embodiments of the present application, the user input unit 1307 may be specifically configured to, in a case where the first display information includes display position information, the first input includes a first sub-input and a second sub-input, the first sub-input is used for triggering display of an adjustment option, the adjustment option is used for adjusting a display position of the light-emitting object in the first interface, and the second sub-input is used for triggering acquisition of the display position information, receive the first sub-input of the user on the light-emitting object;
The display unit 1306 is further configured to display N adjustment options in response to the first sub-input, where each adjustment option in the N adjustment options corresponds to a preset display position in the first interface, and N is a positive integer;
The user input unit 1307 may be specifically configured to receive a second sub-input of a target adjustment option from the N adjustment options, where the target adjustment option corresponds to a preset target display position;
The processor 1310 may be specifically configured to determine, in response to the second sub-input, preset coordinate information of a preset target display position as display position information of the light emitting object in the first interface.
In some embodiments of the present application, the user input unit 1307 may be specifically configured to, in a case where the first display information includes display position information, the first input includes a third sub-input and a fourth sub-input, the third sub-input is used for the user to adjust a display area of the light-emitting object in the first interface, the fourth sub-input is used for triggering acquisition of the display position information, and the first interface includes a first display area and a second display area, receive the third sub-input of the user on the light-emitting object;
The display unit 1306 may be further configured to display, in response to the third sub-input, a movement control, the movement control being a display area for changing the light-emitting object in the first interface according to the user input;
The user input unit 1307 may be specifically configured to receive a fourth sub-input to the movement control, the fourth sub-input being configured to move the light-emitting object from the first display area to the second display area;
The processor 1310 may be specifically configured to, in response to the fourth sub-input, compare first pixel information of the first wallpaper with second pixel information of the second wallpaper to obtain difference pixel information, where the first wallpaper is a wallpaper when the light-emitting object is in the first display area, and the second wallpaper is a wallpaper when the light-emitting object is in the second display area;
The coordinates of the difference pixel point information are determined as display position information of the light emitting object in the first interface.
In some embodiments of the present application, the processor 1310 may be specifically configured to, in a case where the first display information includes illumination intensity information, generate, in response to the first input, an illumination time interval of the light-emitting object according to the color information of the light-emitting object in the wallpaper and the coverage of the emitted light beam;
And acquiring the illumination intensity information corresponding to the illumination time interval according to the association information of the preset illumination time interval and the preset illumination intensity information.
In some embodiments of the present application, the processor 1310 may be specifically configured to, in response to the first input, generate a reference plane corresponding to the light-emitting object according to a coverage of the light beam emitted by the light-emitting object in the wallpaper, in a case where the first display information includes illumination range information;
And determine the illumination range information of the light-emitting object in the first interface based on the distance between the light-emitting object and the reference plane.
In some embodiments of the present application, the processor 1310 may be specifically configured to, in a case where the first display information includes display position information and illumination range information, the second display information includes arrangement position information of the icons in the first interface and display sizes of the icons in the first interface, and the light and shadow characteristic information includes size and brightness information of the area outside the edge of each icon, screen a target icon from the icons according to the display position information and the illumination range information;
Determine the size of the area outside the edge of the target icon according to the display position information, the arrangement position information, and the display size;
And generate, according to the size of the area outside the edge of the target icon, brightness information corresponding to that size.
In some embodiments of the present application, the processor 1310 may be specifically configured to, in a case where the first display information includes display position information, illumination range information, and illumination intensity information, the second display information includes arrangement position information of the icons in the first interface, the light and shadow characteristic information includes a halo region corresponding to the icons and halo information of the halo region, and the halo information includes a halo intensity and a halo pattern, determine, according to the display position information and the arrangement position information, a halo position where the light-emitting object generates a halo;
Acquire the halo intensity corresponding to the illumination intensity information according to association information of preset illumination intensity information and preset halo intensity information; and acquire the halo pattern corresponding to the illumination range information according to association information of preset illumination range information and a preset halo pattern.
In some embodiments of the present application, the processor 1310 may be specifically configured to, in a case where the first display information includes display position information, illumination range information, and illumination intensity information, the second display information includes arrangement position information of the icons in the first interface, and the light and shadow characteristic information includes color information and transparency information of a light beam display area corresponding to the icons, generate illumination angle information of the light-emitting object according to the display position information and the illumination range information;
defining a beam display area emitted by the light-emitting object from the first interface based on the illumination angle information and the arrangement position information;
And generating color information of the light beam display area and transparency information corresponding to the color information according to the illumination range information and the illumination intensity information.
In some embodiments of the present application, the display unit 1306 may be further configured to, in a case where the light and shadow characteristic information includes at least one of light edge effect information, halo effect information, and beam effect information, and the second interface includes a sub-interface, display at least one of the following options: a light edge effect option corresponding to the light edge effect information, a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect information;
The user input unit 1307 may also be configured to receive a third input from the user for a target option;
The display unit 1306 may be further configured to display, in response to the third input, a sub-interface corresponding to the target option, where the sub-interface is an interface adjusted by the light and shadow characteristic information corresponding to the target option.
It is to be appreciated that the input unit 1304 may include a graphics processing unit (GPU) 13041 and a microphone 13042; the graphics processor 13041 processes image data of still images or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 1306 may include a display panel, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1307 includes at least one of a touch panel 13071 and other input devices 13072. The touch panel 13071 is also referred to as a touch screen. The touch panel 13071 may include two parts: a touch detection device and a touch controller. Other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1309 may be used to store software programs and various data. The memory 1309 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions (such as a sound playing function and an image playing function) required for at least one function. Further, the memory 1309 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synch-link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1309 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1310 may include one or more processing units. Optionally, the processor 1310 integrates an application processor and a modem processor, where the application processor primarily handles operations related to the operating system, the user interface, and applications, and the modem processor primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1310.
The embodiment of the application also provides a readable storage medium, and the readable storage medium stores a program or an instruction, which when executed by a processor, implements each process of the above-mentioned display method embodiment, and can achieve the same technical effects, so that repetition is avoided, and no further description is provided here.
The processor is the processor in the electronic device in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In addition, the embodiment of the present application further provides a chip, where the chip includes a processor and a display interface, the display interface is coupled to the processor, and the processor is configured to run programs or instructions to implement each process of the above display method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the embodiments of the display method described above, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored on a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and including instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods of the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.

Claims (18)

1. A display method, comprising:
receiving a first input of a user, wherein the first input is used for setting wallpaper of a first interface, and the wallpaper comprises a luminous object;
responding to the first input, and acquiring first display information of the luminous object in the first interface;
Generating light and shadow characteristic information according to the first display information and the second display information of the icon in the first interface;
And displaying a second interface, wherein the second interface is the first interface adjusted according to the light and shadow characteristic information.
2. The method of claim 1, wherein the first display information comprises display position information, the first input comprises a first sub-input and a second sub-input, the first sub-input is used for triggering display of an adjustment option, the adjustment option is used for adjusting a display position of the light-emitting object in the first interface, and the second sub-input is used for triggering acquisition of the display position information; the receiving a first input from a user includes:
Receiving a first sub-input of a user to the light-emitting object;
Responding to the first sub-input, displaying N adjustment options, wherein each adjustment option in the N adjustment options corresponds to a preset display position in the first interface, and N is a positive integer;
Receiving a second sub-input of a user for a target adjustment option in the N adjustment options, wherein the target adjustment option corresponds to a preset target display position;
The acquiring, in response to the first input, first display information of the light-emitting object in the first interface includes:
And responding to the second sub-input, and determining preset coordinate information of the preset target display position as display position information of the luminous object in the first interface.
3. The method of claim 1, wherein the first display information comprises display position information, the first input comprising a third sub-input and a fourth sub-input, the third sub-input for a user to adjust a display area of the light-emitting object in the first interface; the fourth sub-input is used for triggering and acquiring the display position information; the first interface comprises a first display area and a second display area; the receiving a first input from a user includes:
Receiving a third sub-input of a user to the light-emitting object;
responsive to the third sub-input, displaying a movement control, where the movement control is used for changing the display area of the light-emitting object in the first interface according to user input;
receiving a fourth sub-input to the movement control, the fourth sub-input for moving the light-emitting object from the first display area to the second display area;
The acquiring, in response to the first input, first display information of the light-emitting object in the first interface includes:
Responding to the fourth sub-input, comparing first pixel point information of a first wallpaper with second pixel point information of a second wallpaper to obtain difference pixel point information, wherein the first wallpaper is the wallpaper when the light-emitting object is in the first display area, and the second wallpaper is the wallpaper when the light-emitting object is in the second display area;
and determining the coordinates of the difference pixel point information as display position information of the luminous object in the first interface.
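The difference-pixel comparison in claim 3 can be sketched as below, treating the two wallpapers as RGB arrays. The function name, the use of NumPy, and the reduction of the differing pixels to a centroid are illustrative assumptions; the claim itself only specifies taking the coordinates of the pixels that differ between the two wallpapers.

```python
import numpy as np

def locate_light_source(first_wallpaper: np.ndarray,
                        second_wallpaper: np.ndarray) -> tuple[float, float]:
    """Compare two renderings of the wallpaper (light-emitting object in the
    first vs. the second display area) and return the centroid of the pixels
    that changed, taken here as the object's display position."""
    # Pixels whose RGB values differ between the two wallpapers.
    diff_mask = np.any(first_wallpaper != second_wallpaper, axis=-1)
    ys, xs = np.nonzero(diff_mask)
    if xs.size == 0:
        raise ValueError("wallpapers are identical; no light-emitting object found")
    # Centroid of the difference region as (x, y) coordinates.
    return float(xs.mean()), float(ys.mean())
```

Note that moving the object changes pixels at both its old and its new location, so a practical implementation would further separate the two regions; the sketch keeps only the claimed comparison step.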
4. The method of claim 1, wherein the first display information comprises illumination intensity information; the acquiring, in response to the first input, first display information of the light-emitting object in the first interface comprises:
in response to the first input, generating an illumination time interval of the light-emitting object according to color information of the light-emitting object in the wallpaper and the coverage of the emitted light beam;
acquiring illumination intensity information corresponding to the illumination time interval according to association information between preset illumination time intervals and preset illumination intensity information.
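The association between preset illumination time intervals and preset illumination intensity information in claim 4 amounts to a lookup table. The interval boundaries and intensity values below are hypothetical placeholders, not values from the patent.

```python
# Hypothetical association between preset illumination time intervals
# (hours of the day) and preset illumination intensity levels (0.0 to 1.0).
PRESET_INTENSITY = [
    ((0, 6), 0.2),    # night: dim
    ((6, 12), 0.6),   # morning: medium
    ((12, 18), 1.0),  # afternoon: full
    ((18, 24), 0.4),  # evening: soft
]

def intensity_for_interval(hour: float) -> float:
    """Return the preset illumination intensity associated with the
    illumination time interval that contains `hour`."""
    for (start, end), intensity in PRESET_INTENSITY:
        if start <= hour < end:
            return intensity
    raise ValueError(f"hour {hour} outside 0-24")
```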
5. The method of claim 1, wherein the first display information comprises illumination range information; the acquiring, in response to the first input, first display information of the light-emitting object in the first interface comprises:
in response to the first input, generating a reference plane corresponding to the light-emitting object according to the coverage of the light beam emitted by the light-emitting object in the wallpaper;
determining the illumination range information of the light-emitting object in the first interface based on the distance between the light-emitting object and the reference plane.
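One way to read claim 5 is as a conical light model: the illuminated radius on the reference plane grows with the distance between the source and the plane. The half-angle parameter below is a hypothetical stand-in for the beam coverage extracted from the wallpaper; the claim does not prescribe this geometry.

```python
import math

def illumination_radius(distance_to_plane: float,
                        beam_half_angle_deg: float = 30.0) -> float:
    """Radius of the illuminated circle on the reference plane for a point
    source at `distance_to_plane`, assuming a conical beam whose half-angle
    stands in for the beam coverage taken from the wallpaper."""
    if distance_to_plane < 0:
        raise ValueError("distance must be non-negative")
    return distance_to_plane * math.tan(math.radians(beam_half_angle_deg))
```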
6. The method of claim 1, wherein the first display information comprises display position information and illumination range information; the second display information comprises arrangement position information of the icons in the first interface and display sizes of the icons in the first interface; the light and shadow characteristic information comprises size and brightness information of the area outside an icon's edge; the generating light and shadow characteristic information according to the first display information and second display information of the icons in the first interface comprises:
screening a target icon from the icons according to the display position information and the illumination range information;
determining the size of the area outside the edge of the target icon according to the display position information, the arrangement position information, and the display size;
generating brightness information corresponding to the size of the area outside the edge of the target icon according to that size.
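The screening and edge-region steps of claim 6 can be sketched as below. The `Icon` type, the circular illumination range, and the distance-based falloff formula are all illustrative assumptions; the claim only requires that the edge-region size follow from the positions and display size, and that brightness follow from that size.

```python
import math
from dataclasses import dataclass

@dataclass
class Icon:
    x: float          # arrangement position (centre) in the first interface
    y: float
    size: float       # display size (icon edge length)

def screen_target_icons(light_pos: tuple[float, float],
                        light_radius: float,
                        icons: list[Icon]) -> list[Icon]:
    """Keep only icons whose centre lies inside the illumination range."""
    lx, ly = light_pos
    return [ic for ic in icons
            if math.hypot(ic.x - lx, ic.y - ly) <= light_radius]

def edge_region(light_pos: tuple[float, float],
                icon: Icon,
                max_region: float = 20.0) -> tuple[float, float]:
    """Size of the lit region outside the icon's edge, shrinking with the
    distance to the light source, and a brightness proportional to that
    size. `max_region` is a hypothetical upper bound in pixels."""
    lx, ly = light_pos
    dist = math.hypot(icon.x - lx, icon.y - ly)
    # Closer icons get a wider and therefore brighter edge region.
    region = max_region * icon.size / (icon.size + dist)
    brightness = region / max_region          # normalised to 0..1
    return region, brightness
```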
7. The method of claim 1, wherein the first display information comprises display position information, illumination range information, and illumination intensity information; the second display information comprises arrangement position information of the icons in the first interface; the light and shadow characteristic information comprises a halo region corresponding to an icon and halo information of the halo region, the halo information comprising halo intensity information and halo style information;
the generating light and shadow characteristic information according to the first display information and second display information of the icons in the first interface comprises:
determining, according to the display position information and the arrangement position information, halo position information of the halo generated by the light-emitting object;
acquiring halo intensity information corresponding to the illumination intensity information according to association information between preset illumination intensity information and preset halo intensity information; and acquiring halo style information corresponding to the illumination range information according to association information between preset illumination range information and preset halo styles.
8. The method of claim 1, wherein the first display information comprises display position information, illumination range information, and illumination intensity information; the second display information comprises arrangement position information of the icons in the first interface; the light and shadow characteristic information comprises color information and transparency information of a light beam display area corresponding to an icon;
the generating light and shadow characteristic information according to the first display information and second display information of the icons in the first interface comprises:
generating illumination angle information of the light-emitting object according to the display position information and the illumination range information;
determining, from the first interface, the light beam display area of the light beam emitted by the light-emitting object based on the illumination angle information and the arrangement position information;
generating color information of the light beam display area and transparency information corresponding to the color information according to the illumination range information and the illumination intensity information.
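Determining a beam display area from an illumination angle, as in claim 8, can be sketched as below: grid cells within an angular window around the beam axis and within the beam's extent are collected, each with a transparency that fades with distance. The angular window, the linear alpha falloff, and the fixed intensity are illustrative assumptions, and bearing wrap-around near 180 degrees is ignored for simplicity.

```python
import math

def beam_pixels(light_pos: tuple[float, float],
                angle_deg: float,
                half_width_deg: float,
                extent: int,
                step: int = 1) -> list[tuple[int, int, float]]:
    """Grid cells of the first interface covered by the beam emitted from
    `light_pos` along `angle_deg`, within `half_width_deg` of the beam axis
    and closer than `extent`; each cell carries an alpha (transparency)
    that fades linearly with distance from the source."""
    lx, ly = light_pos
    cells = []
    for x in range(0, extent + 1, step):
        for y in range(0, extent + 1, step):
            dx, dy = x - lx, y - ly
            dist = math.hypot(dx, dy)
            if dist == 0 or dist >= extent:
                continue                      # skip the source cell and cells out of range
            bearing = math.degrees(math.atan2(dy, dx))
            if abs(bearing - angle_deg) <= half_width_deg:
                alpha = 1.0 - dist / extent   # transparency falls off with distance
                cells.append((x, y, alpha))
    return cells
```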
9. The method of claim 1, wherein the light and shadow characteristic information comprises at least one of: light edge effect information, halo effect information, and beam effect information; the second interface comprises a sub-interface; before the displaying a second interface, the method further comprises:
displaying at least one of the following options: a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect information;
receiving a third input of the user on a target option;
in response to the third input, displaying the sub-interface corresponding to the target option, wherein the sub-interface is an interface obtained after adjustment with the light and shadow characteristic information corresponding to the target option.
10. A display device, comprising:
a receiving module, configured to receive a first input of a user, wherein the first input is used to set wallpaper of a first interface, and the wallpaper comprises a light-emitting object;
an acquisition module, configured to acquire, in response to the first input, first display information of the light-emitting object in the first interface;
a generating module, configured to generate light and shadow characteristic information according to the first display information and second display information of icons in the first interface;
a display module, configured to display a second interface, wherein the second interface is the first interface adjusted according to the light and shadow characteristic information.
11. The apparatus of claim 10, wherein the display device further comprises a first determination module; wherein,
the receiving module is specifically configured to, in a case where the first display information comprises display position information, the first input comprises a first sub-input and a second sub-input, the first sub-input is used to trigger display of adjustment options, an adjustment option is used to adjust the display position of the light-emitting object in the first interface, and the second sub-input is used to trigger acquisition of the display position information, receive the first sub-input of the user on the light-emitting object;
the display module is further configured to display, in response to the first sub-input, N adjustment options, wherein each of the N adjustment options corresponds to a preset display position in the first interface, and N is a positive integer;
the receiving module is specifically configured to receive a second sub-input of the user on a target adjustment option in the N adjustment options, wherein the target adjustment option corresponds to a preset target display position;
the first determination module is configured to determine, in response to the second sub-input, preset coordinate information of the preset target display position as the display position information of the light-emitting object in the first interface.
12. The apparatus of claim 10, wherein the display device further comprises a comparison module and a second determination module; wherein,
the receiving module is specifically configured to, in a case where the first display information comprises display position information, the first input comprises a third sub-input and a fourth sub-input, the third sub-input is used by the user to adjust the display area of the light-emitting object in the first interface, the fourth sub-input is used to trigger acquisition of the display position information, and the first interface comprises a first display area and a second display area, receive the third sub-input of the user on the light-emitting object;
the display module is further configured to display, in response to the third sub-input, a movement control, wherein the movement control is used to change the display area of the light-emitting object in the first interface according to user input;
the receiving module is specifically configured to receive a fourth sub-input on the movement control, the fourth sub-input being used to move the light-emitting object from the first display area to the second display area;
the comparison module is configured to compare, in response to the fourth sub-input, first pixel information of first wallpaper with second pixel information of second wallpaper to obtain difference pixel information, wherein the first wallpaper is the wallpaper with the light-emitting object in the first display area, and the second wallpaper is the wallpaper with the light-emitting object in the second display area;
the second determination module is configured to determine coordinates of the difference pixel information as the display position information of the light-emitting object in the first interface.
13. The apparatus of claim 10, wherein the generating module is further configured to, in a case where the first display information comprises illumination intensity information, generate, in response to the first input, an illumination time interval of the light-emitting object according to color information of the light-emitting object in the wallpaper and the coverage of the emitted light beam;
the acquisition module is further configured to acquire illumination intensity information corresponding to the illumination time interval according to association information between preset illumination time intervals and preset illumination intensity information.
14. The apparatus of claim 10, wherein the display device further comprises a third determination module; wherein,
the generating module is further configured to, in a case where the first display information comprises illumination range information, generate, in response to the first input, a reference plane corresponding to the light-emitting object according to the coverage of the light beam emitted by the light-emitting object in the wallpaper;
the third determination module is configured to determine the illumination range information of the light-emitting object in the first interface based on the distance between the light-emitting object and the reference plane.
15. The apparatus of claim 10, wherein the generating module is specifically configured to, in a case where the first display information comprises display position information and illumination range information, the second display information comprises arrangement position information of the icons in the first interface and display sizes of the icons in the first interface, and the light and shadow characteristic information comprises size and brightness information of the area outside an icon's edge, screen a target icon from the icons according to the display position information and the illumination range information;
determine the size of the area outside the edge of the target icon according to the display position information, the arrangement position information, and the display size;
and generate brightness information corresponding to the size of the area outside the edge of the target icon according to that size.
16. The apparatus of claim 10, wherein the generating module is specifically configured to, in a case where the first display information comprises display position information, illumination range information, and illumination intensity information, the second display information comprises arrangement position information of the icons in the first interface, the light and shadow characteristic information comprises a halo region corresponding to an icon and halo information of the halo region, and the halo information comprises halo intensity information and halo style information, determine halo position information of the halo generated by the light-emitting object according to the display position information and the arrangement position information;
acquire halo intensity information corresponding to the illumination intensity information according to association information between preset illumination intensity information and preset halo intensity information; and acquire halo style information corresponding to the illumination range information according to association information between preset illumination range information and preset halo styles.
17. The apparatus of claim 10, wherein the generating module is specifically configured to, in a case where the first display information comprises display position information, illumination range information, and illumination intensity information, the second display information comprises arrangement position information of the icons in the first interface, and the light and shadow characteristic information comprises color information and transparency information of a light beam display area corresponding to an icon, generate illumination angle information of the light-emitting object according to the display position information and the illumination range information;
determine, from the first interface, the light beam display area of the light beam emitted by the light-emitting object based on the illumination angle information and the arrangement position information;
and generate color information of the light beam display area and transparency information corresponding to the color information according to the illumination range information and the illumination intensity information.
18. The apparatus of claim 10, wherein the display module is further configured to, in a case where the light and shadow characteristic information comprises at least one of light edge effect information, halo effect information, and beam effect information, and the second interface comprises a sub-interface, display at least one of the following options: a halo effect option corresponding to the halo effect information, and a beam effect option corresponding to the beam effect information;
the receiving module is further configured to receive a third input of the user on a target option;
the display module is further configured to display, in response to the third input, the sub-interface corresponding to the target option, wherein the sub-interface is an interface obtained after adjustment with the light and shadow characteristic information corresponding to the target option.
CN202410142497.4A: Display method and device
Filed 2024-01-31; priority date 2024-01-31
Published as CN117931356A on 2024-04-26 (status: pending)
Family ID: 90770198

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination