CN115878007A - Method and device for replacing screen wallpaper of electronic equipment and electronic equipment - Google Patents


Info

Publication number
CN115878007A
Authority
CN
China
Prior art keywords
electronic device
wallpaper
pattern
screen
color
Prior art date
Legal status
Pending
Application number
CN202111156115.6A
Other languages
Chinese (zh)
Inventor
杨婉艺
曹原
卞苏成
张乐韶
林尤辉
邵天雨
王翔
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202111156115.6A
Priority to PCT/CN2022/119922 (published as WO2023051320A1)
Publication of CN115878007A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation

Abstract

The embodiments of this application relate to the field of information technology and provide a method and an apparatus for replacing the screen wallpaper of an electronic device, and an electronic device. The method is applied to a first electronic device and includes the following steps: the first electronic device acquires a viewfinder image; the first electronic device identifies patterns and colors in the viewfinder image; the first electronic device generates target patterns based on the identified patterns and fills the target patterns with the identified colors to obtain a wallpaper set containing multiple screen wallpapers; and the first electronic device replaces the screen wallpaper of a target electronic device according to the wallpaper set, where the target electronic device includes the first electronic device and/or a second electronic device communicatively connected to the first electronic device. This method addresses the prior-art problem that replacing the screen wallpaper of an electronic device can only change the colors of built-in wallpaper templates, which makes it difficult to meet users' demand for diverse screen wallpapers.

Description

Method and device for replacing screen wallpaper of electronic equipment and electronic equipment
Technical Field
The embodiments of this application relate to the field of information technology, and in particular to a method and an apparatus for replacing the screen wallpaper of an electronic device, and to the electronic device itself.
Background
The screen wallpaper of an electronic device includes the screen background shown while the device is in use, as well as the image or pattern displayed in the standby or screen-off (always-on display) state. For example, the screen wallpaper may be a desktop image serving as the background of a smartphone screen, or the dial pattern of a smartwatch.
Typically, the screen wallpaper of an electronic device can be replaced. In some scenarios, the electronic device may identify one or more colors contained in an image and then fill the identified colors into a built-in wallpaper template, generating multiple screen wallpapers for the user to choose from. However, wallpaper generated in this way only changes the colors of the wallpaper templates built into the device. For example, for two different images, the screen wallpapers generated from them may differ only in color. Such a replacement method can hardly satisfy users' demand for diverse screen wallpapers.
Disclosure of Invention
The method and apparatus for replacing the screen wallpaper of an electronic device, and the electronic device, provided by the embodiments of this application solve the prior-art problem that replacing the screen wallpaper of an electronic device can only change the colors of built-in wallpaper templates, which makes it difficult to meet users' demand for diverse screen wallpapers.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a method for replacing the screen wallpaper of an electronic device is provided. The method is applied to a first electronic device and includes:
the first electronic device acquires a viewfinder image;
the first electronic device identifies patterns and colors in the viewfinder image;
the first electronic device generates target patterns based on the patterns in the viewfinder image and fills the target patterns with the identified colors to obtain a wallpaper set containing multiple screen wallpapers; and
the first electronic device replaces the screen wallpaper of a target electronic device according to the wallpaper set, where the target electronic device includes the first electronic device and/or a second electronic device communicatively connected to the first electronic device.
Compared with the prior art, the technical solution of the embodiments of this application has the following beneficial effects: after acquiring the viewfinder image, the first electronic device can identify patterns and colors in it and generate target patterns based on the identified patterns. These target patterns can form part of the patterns in the wallpapers generated later. The first electronic device then fills the target patterns with the colors identified from the viewfinder image; these colors can include colors sampled directly from the viewfinder image as well as new colors obtained by merging, optimizing, and re-matching the directly sampled colors. Since there can be multiple colors and multiple target patterns, filling the target patterns with the colors yields multiple screen wallpapers. These screen wallpapers can be used to replace the wallpaper currently used by the first electronic device and/or by a second electronic device communicatively connected to it. Because the patterns and colors in the replacement wallpapers are derived from the patterns and colors contained in the viewfinder image, the style and presentation of the replaced wallpapers across the electronic devices remain relatively unified. Compared with the prior-art approach of generating wallpaper only by recoloring built-in templates, the technical solution provided by the embodiments of this application markedly increases the richness of the generated wallpapers.
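To make the overall flow concrete, the following Python sketch crosses recognized patterns and identified colors with target screen specifications to assemble a wallpaper set. Everything here (the Wallpaper record, the example pattern names, hex colors, and screen sizes) is a hypothetical illustration of the shape of the pipeline, not the actual implementation described in this application.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Wallpaper:
        pattern: str             # identifier of a generated target pattern
        colors: List[str]        # hex colors used to fill that pattern
        screen: Tuple[int, int]  # (width, height) of the target screen

    def generate_wallpaper_set(patterns, colors, screen_specs):
        """Combine every target pattern with the identified colors for every screen spec."""
        return [
            Wallpaper(pattern=p, colors=list(colors), screen=spec)
            for spec in screen_specs
            for p in patterns
        ]

    # Two target patterns, two identified colors, two device screens -> four candidates.
    wallpaper_set = generate_wallpaper_set(
        patterns=["stripes", "kaleidoscope"],
        colors=["#3A6EA5", "#F2C14E"],
        screen_specs=[(1080, 2340), (466, 466)],
    )
    print(len(wallpaper_set))  # 4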
In a possible implementation of the first aspect, the viewfinder image may be an image, or part of an image, captured in real time by the user with the first electronic device, or an image, or part of an image, already stored on the first electronic device. The part of an image may be a new image obtained by the first electronic device cropping a region out of an existing image; like a complete image, this new image can also serve as the viewfinder image for generating screen wallpaper. When cropping a region from a complete image, the first electronic device may use a regular or an irregular capture. For example, it may crop with a regular shape such as a rectangle or a circle to obtain a regular image region, or crop with an arbitrary irregular shape to obtain an irregular image region. This increases the diversity of viewfinder image sources and indirectly increases the diversity of the screen wallpapers generated later.
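A regular and an irregular capture could look roughly like the sketch below, which uses the Pillow imaging library purely for illustration; the polygon coordinates and the stand-in photo are made up.

    from PIL import Image, ImageDraw

    def crop_rectangle(image, box):
        """Regular capture: cut a rectangular region to serve as the viewfinder image."""
        return image.crop(box)

    def crop_polygon(image, polygon):
        """Irregular capture: keep only the pixels inside an arbitrary polygon."""
        mask = Image.new("L", image.size, 0)
        ImageDraw.Draw(mask).polygon(polygon, fill=255)
        result = image.convert("RGBA")
        result.putalpha(mask)                 # pixels outside the polygon become transparent
        return result.crop(mask.getbbox())    # trim to the polygon's bounding box

    photo = Image.new("RGB", (800, 600), "teal")   # stand-in for a stored photo
    rect_view = crop_rectangle(photo, (100, 100, 500, 400))
    poly_view = crop_polygon(photo, [(50, 300), (400, 50), (750, 300), (400, 550)])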
In a possible implementation of the first aspect, the patterns the first electronic device identifies in the viewfinder image may include basic patterns or complex patterns. The basic patterns may include any one or more of line, circle, square, and rectangle patterns; the complex patterns may include scalable vector graphics, bitmaps, or patterns combined from the basic patterns above. When combining basic patterns, the first electronic device may follow certain composition rules, which may include plane-symmetry composition, stacking composition, translation composition, rotation composition, or kaleidoscope composition. The first electronic device may combine the basic patterns using a single composition rule or several of them.
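As an illustration of how basic patterns might be combined under such composition rules, the sketch below draws a simple base tile, applies a kaleidoscope-style mirroring, and then a translation (tiling). It is a minimal example built on Pillow; the specific shapes, sizes, and the choice and order of rules are assumptions, not the composition rules actually used by the first electronic device.

    from PIL import Image, ImageDraw, ImageOps

    def base_tile(size=128):
        """Draw a simple base pattern (a diagonal line plus a circle) as one tile."""
        tile = Image.new("RGB", (size, size), "white")
        draw = ImageDraw.Draw(tile)
        draw.line((0, 0, size, size), fill="black", width=6)
        draw.ellipse((size // 4, size // 4, 3 * size // 4, 3 * size // 4), outline="black", width=4)
        return tile

    def kaleidoscope(tile):
        """Kaleidoscope-style composition: mirror the tile into a 2x2 symmetric block."""
        w, h = tile.size
        block = Image.new("RGB", (2 * w, 2 * h))
        block.paste(tile, (0, 0))
        block.paste(ImageOps.mirror(tile), (w, 0))                 # left-right mirror
        block.paste(ImageOps.flip(tile), (0, h))                   # top-bottom mirror
        block.paste(ImageOps.flip(ImageOps.mirror(tile)), (w, h))  # both mirrors
        return block

    def translate(tile, nx=3, ny=3):
        """Translation composition: repeat the tile nx by ny times."""
        w, h = tile.size
        canvas = Image.new("RGB", (nx * w, ny * h))
        for i in range(nx):
            for j in range(ny):
                canvas.paste(tile, (i * w, j * h))
        return canvas

    # Combining two rules: kaleidoscope first, then translation.
    target_pattern = translate(kaleidoscope(base_tile()))
    target_pattern.save("target_pattern.png")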
In a possible implementation of the first aspect, a target pattern generated from the patterns in the viewfinder image may have corresponding influence factors, which characterize the type, number, and order of the composition rules used to combine patterns into that target pattern. These influence factors include at least one of the following: color sampling mode, pattern complexity, type of electronic device, type of screen wallpaper, and a random number.
In a possible implementation of the first aspect, the colors the first electronic device identifies in the viewfinder image may include the pattern color of each pattern in the viewfinder image and optimized colors obtained by processing the identified pattern colors. Processing the pattern colors may include merging similar pattern colors, adjusting color values such as hue, saturation, and brightness, generating matching colors, generating gradient versions of the colors, and so on. For example, the first electronic device may optimize the colors according to the type of electronic device the wallpaper will be applied to and that device's current screen background color, so that the optimized colors better suit the device and its background.
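One way such color processing could be sketched is with Python's standard colorsys module: merge colors whose hues are close, then nudge saturation and brightness so the result reads well on a screen. The thresholds and scaling factors below are arbitrary assumptions chosen only for illustration.

    import colorsys

    def to_hsv(rgb):
        """Convert an 8-bit RGB triple to HSV with components in [0, 1]."""
        return colorsys.rgb_to_hsv(*(c / 255 for c in rgb))

    def hue_distance(a, b):
        """Circular distance between two hues in [0, 1]."""
        d = abs(a - b)
        return min(d, 1 - d)

    def merge_similar(colors, hue_threshold=0.08):
        """Keep only one representative of each group of colors with similar hues."""
        kept = []
        for rgb in colors:
            h = to_hsv(rgb)[0]
            if all(hue_distance(h, to_hsv(m)[0]) > hue_threshold for m in kept):
                kept.append(rgb)
        return kept

    def optimize(rgb, s_scale=1.1, v_floor=0.35):
        """Raise saturation slightly and keep brightness above a floor."""
        h, s, v = to_hsv(rgb)
        s = min(1.0, s * s_scale)
        v = max(v_floor, v)
        return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

    sampled = [(58, 110, 165), (60, 115, 170), (242, 193, 78), (40, 40, 40)]
    palette = [optimize(c) for c in merge_similar(sampled)][:3]   # keep at most three colors
    print(palette)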
In one possible implementation manner of the first aspect, the number of colors recognized by the first electronic device from the viewfinder image may be no more than three. Therefore, after the first electronic device fills the target pattern with the colors, the number of the colors in the obtained screen wallpaper is not more than three, and the display effect is better.
In a possible implementation of the first aspect, multiple wallpaper templates may be built into the first electronic device. Each wallpaper template may have a preset template complexity, which may be determined from several factors of the template, for example the number of colors used, whether it contains gradient colors, and how rich its patterns are. While generating screen wallpaper, the first electronic device can determine the image complexity of the viewfinder image, obtain target wallpaper templates whose template complexity equals that image complexity, and use the patterns contained in the target wallpaper templates as the target patterns to be filled with color later. For example, template complexity and image complexity may be expressed as levels, such as levels 1 to 6. After the first electronic device determines that the image complexity of the viewfinder image belongs to a certain level, say level 3, it may obtain one or more wallpaper templates from those whose template complexity is level 3. The template complexity of a wallpaper template can be preset, while the image complexity of the viewfinder image can be computed in real time, after the viewfinder image is obtained, by a trained artificial-intelligence classification model.
In a possible implementation of the first aspect, the wallpaper templates built into the first electronic device may also have preset template categories, which characterize the types of patterns contained in the templates of each category. When generating target patterns from the patterns in the viewfinder image, the first electronic device may first determine the image category of those patterns, then find the template category that matches this image category, obtain a target wallpaper template from that category, and use the patterns it contains as target patterns. For example, the template categories may include a stripe category, a square category, a spot/circle category, an arc category, an irregular-pattern category, and so on. When the first electronic device recognizes that the patterns in the viewfinder image belong to a certain category, for example stripes, it may obtain one or more target wallpaper templates from the stripe category. If the recognized patterns belong to several categories, the first electronic device may select one pattern according to the number of patterns in each category or the position of each pattern in the viewfinder image, and use that pattern's category as the image category of the viewfinder image. In a possible implementation, a wallpaper template may belong to several categories at once, for example the stripe category plus the square category; when the first electronic device recognizes that the viewfinder image contains both stripes and squares, it may obtain the target wallpaper template from templates of the stripe-plus-square category. The category of the patterns in the viewfinder image may be recognized using artificial-intelligence image analysis techniques.
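A toy sketch of selecting built-in templates by both preset complexity and preset category might look like the following; the registry contents, level values, and category names are invented for illustration and do not reflect any actual built-in template set.

    # Hypothetical registry of built-in wallpaper templates with preset attributes.
    TEMPLATES = [
        {"name": "thin_stripes",   "complexity": 2, "categories": {"stripe"}},
        {"name": "stripe_squares", "complexity": 4, "categories": {"stripe", "square"}},
        {"name": "dot_grid",       "complexity": 3, "categories": {"spot"}},
        {"name": "nested_arcs",    "complexity": 5, "categories": {"arc"}},
    ]

    def pick_templates(image_complexity, image_categories):
        """Keep templates whose preset complexity equals the image complexity and whose
        category set matches the categories recognized in the viewfinder image."""
        wanted = set(image_categories)
        return [
            t for t in TEMPLATES
            if t["complexity"] == image_complexity and t["categories"] == wanted
        ]

    # A viewfinder image classified as complexity level 4 containing stripes and squares:
    print(pick_templates(4, ["stripe", "square"]))   # -> the stripe_squares template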
In a possible implementation of the first aspect, there may be multiple target patterns, and each target pattern may include multiple pattern regions, among them a primary region and a secondary region. Likewise, the colors the first electronic device identifies in the viewfinder image may include multiple colors, among them a primary color and secondary colors. The primary color can be used to fill the primary region and the secondary colors to fill the secondary regions, thereby obtaining the multiple screen wallpapers.
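Filling a primary region with the primary color and a secondary region with a secondary color could be sketched as below; treating the background as the primary region and a lower band as the secondary region is an arbitrary choice made only for illustration.

    from PIL import Image, ImageDraw

    def fill_template(size, primary, secondary):
        """Fill the primary region (background) with the primary color and a
        secondary region (a lower band) with the secondary color."""
        wallpaper = Image.new("RGB", size, primary)
        draw = ImageDraw.Draw(wallpaper)
        w, h = size
        draw.rectangle((0, 2 * h // 3, w, h), fill=secondary)
        return wallpaper

    wallpaper = fill_template((1080, 2340), primary=(58, 110, 165), secondary=(242, 193, 78))
    wallpaper.save("filled_wallpaper.png")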
In a possible implementation of the first aspect, after filling the target patterns with color to obtain a wallpaper set containing multiple screen wallpapers, the first electronic device may further apply a secondary transformation to those wallpapers using built-in transformation templates, obtaining transformed screen wallpapers. Each transformation template may have a corresponding template style that characterizes its specific look. After the secondary transformation, the transformed wallpaper can take on the same style as the transformation template, so that the wallpaper style is consistent with the template style.
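If the secondary transformation is taken to be as simple as blending a built-in style template over the generated wallpaper, a sketch could look like the following. This is only one plausible reading of "secondary transformation"; the file names and the blend ratio are assumptions.

    from PIL import Image

    def apply_transform_template(wallpaper, template, alpha=0.3):
        """Blend a transformation template over a wallpaper so the result takes on
        some of the template's style; alpha controls how strong the effect is."""
        template = template.convert("RGB").resize(wallpaper.size)
        return Image.blend(wallpaper.convert("RGB"), template, alpha)

    styled = apply_transform_template(
        Image.open("filled_wallpaper.png"),      # wallpaper generated earlier
        Image.open("transform_template.png"),    # hypothetical built-in style template
    )
    styled.save("styled_wallpaper.png")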
In a possible implementation of the first aspect, before replacing the screen wallpaper of the target electronic devices according to the wallpaper set, the first electronic device may determine the screen specification of each target electronic device. A screen specification may include the screen shape or the screen aspect ratio. For example, for a target electronic device in the wearable category, such as a smartwatch, the first electronic device may first determine whether its screen is round, square, or another shape; for a target electronic device in the mobile-phone or tablet category, the first electronic device may determine the length-to-width ratio of its screen. The first electronic device may then adjust the multiple screen wallpapers according to the screen specifications of the target electronic devices, so that the wallpaper specification of each adjusted wallpaper matches the screen specification of the corresponding device; the wallpaper specification may likewise include the wallpaper's shape or length-to-width ratio.
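Adjusting a generated wallpaper to different screen specifications, a rectangular aspect ratio for phones and tablets or a circular face for a watch, might be sketched as follows using Pillow; the "round" tag and the example sizes are illustrative assumptions.

    from PIL import Image, ImageDraw, ImageOps

    def fit_to_screen(wallpaper, spec):
        """Crop/scale a wallpaper to a (width, height) spec, or to a circle when spec == 'round'."""
        if spec == "round":
            side = min(wallpaper.size)
            square = ImageOps.fit(wallpaper, (side, side))          # center-crop to a square
            mask = Image.new("L", (side, side), 0)
            ImageDraw.Draw(mask).ellipse((0, 0, side, side), fill=255)
            square.putalpha(mask)                                   # keep only the circular area
            return square
        return ImageOps.fit(wallpaper, spec)                        # crop/scale to the target ratio

    source = Image.new("RGB", (2000, 2000), "navy")                 # stand-in generated wallpaper
    phone_wallpaper = fit_to_screen(source, (1080, 2340))
    watch_wallpaper = fit_to_screen(source, "round")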
In a possible implementation of the first aspect, there may be multiple second electronic devices. When the first electronic device replaces the screen wallpaper of the target electronic devices according to the wallpaper set, the wallpapers of the first electronic device and/or of the multiple second electronic devices are replaced. Specifically, the first electronic device may determine the screen wallpapers that match the screen specifications of the first electronic device and of each second electronic device; it may replace its own current wallpaper with the wallpaper matching its own screen specification; and it may send the wallpapers matching the screen specifications of the second electronic devices to the corresponding second electronic devices and instruct each of them to replace its current wallpaper with the received one. In this way, the style of the screen wallpapers used by the first electronic device and the second electronic devices after replacement remains relatively unified.
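Matching each device to a wallpaper of the right specification and then either applying it locally or sending it to the peer device could be sketched as below. The device list, spec tags, and the print statements standing in for the actual apply/send operations are all assumptions; the real transport (Bluetooth, Wi-Fi, and so on) is not modelled here.

    wallpaper_set = [
        {"spec": (1080, 2340), "file": "wp_phone.png"},
        {"spec": "round",      "file": "wp_watch.png"},
        {"spec": (2560, 1600), "file": "wp_tablet.png"},
    ]
    devices = [
        {"name": "phone",  "spec": (1080, 2340), "is_local": True},
        {"name": "watch",  "spec": "round",      "is_local": False},
        {"name": "tablet", "spec": (2560, 1600), "is_local": False},
    ]

    def dispatch(wallpapers, devices):
        for device in devices:
            match = next((w for w in wallpapers if w["spec"] == device["spec"]), None)
            if match is None:
                continue                                            # no wallpaper fits this screen
            if device["is_local"]:
                print(f"apply {match['file']} as the local wallpaper")
            else:
                print(f"send {match['file']} to {device['name']} and instruct it to apply it")

    dispatch(wallpaper_set, devices)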
In a second aspect, an apparatus for replacing the screen wallpaper of an electronic device is provided. The apparatus is applied to a first electronic device and includes an acquisition module, an identification module, a generation module, a filling module, and a replacement module, where:
the acquisition module is used for acquiring a viewfinder image;
the identification module is used for identifying patterns and colors in the viewfinder image;
the generation module is used for generating target patterns based on the patterns;
the filling module is used for filling the target patterns with the colors to obtain a wallpaper set containing multiple screen wallpapers; and
the replacement module is used for replacing the screen wallpaper of a target electronic device according to the wallpaper set, where the target electronic device includes the first electronic device and/or a second electronic device communicatively connected to the first electronic device.
In a third aspect, an electronic device is provided, which may be the first electronic device in any one of the above first aspects, and may include a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the computer program is executed by the processor, the processor implements the method for changing the wallpaper of the screen of the electronic device according to any one of the above first aspects.
In a fourth aspect, a computer-readable storage medium is provided, in which computer instructions are stored, and when the computer instructions are executed on an electronic device, the electronic device is enabled to implement the method for replacing the wallpaper of the screen of the electronic device according to any one of the first aspect. The electronic device may exemplarily be the first electronic device in any one of the above-described first aspects.
In a fifth aspect, a computer program product is provided, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the method for replacing the wallpaper on the screen of the electronic device according to any one of the first aspect.
In a sixth aspect, a chip is provided that includes a memory and a processor. Wherein the processor executes the computer program stored in the memory, and the method for replacing the wallpaper of the electronic device screen according to any one of the first aspect can be implemented.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
FIG. 1 is a schematic application scenario diagram of a method for replacing a wallpaper of a screen of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for changing wallpaper on a screen of an electronic device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a color processing flow according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating brightness and saturation relationship of an HSB color mode according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a similar color merging process provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a color optimization process provided in an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating a generation method of a target pattern according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a gradient color generation process provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a color matching process provided by an embodiment of the present application;
FIG. 11 is a schematic flow chart of a collision transfer technique according to an embodiment of the present application;
FIGS. 12a to 12d are schematic diagrams illustrating an operation flow of replacing wallpaper on a screen of an electronic device according to an embodiment of the present application;
FIGS. 13a to 13h are schematic diagrams of generating screen wallpapers applicable to a plurality of electronic devices according to an embodiment of the present application;
FIGS. 14a to 14b are schematic diagrams illustrating changing wallpaper of multiple electronic device screens according to an embodiment of the present application;
FIG. 15 is a block diagram of a structure of an apparatus for replacing wallpaper on a screen of an electronic device according to an embodiment of the present application.
Detailed Description
To describe the technical solutions of the embodiments of this application clearly, words such as "first" and "second" are used in the embodiments to distinguish between identical or similar items that have substantially the same functions and effects. For example, the first region and the second region merely distinguish different positions on the image; they do not limit the number of regions or any order of execution.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
The service scenarios described in the embodiments of this application are intended to illustrate the technical solutions more clearly and do not limit the technical solutions provided herein. A person skilled in the art will understand that, as new service scenarios emerge, the technical solutions provided in the embodiments of this application remain applicable to similar technical problems.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a alone, A and B together, and B alone, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
The steps involved in the method for replacing the screen wallpaper of an electronic device provided by the embodiments of this application are only examples; not every step is mandatory, and the content of each technical feature is optional and may be added or removed as needed during use.
Identical steps, or technical features with the same functions, in the embodiments of this application may be cross-referenced between different embodiments.
As shown in fig. 1, which schematically illustrates an application scenario of the method for replacing the screen wallpaper of an electronic device according to an embodiment of this application, the scenario may include a first electronic device 101 and a plurality of second electronic devices 102, namely 102a, 102b, and 102c in fig. 1. The first electronic device 101 and the second electronic devices 102 may establish connections with each other in various possible ways, and 102a, 102b, and 102c may connect to the first electronic device 101 in the same way or in different ways. For example, 102a, 102b, and 102c may each establish a Bluetooth connection with the first electronic device 101; or 102a may establish a Bluetooth connection with the first electronic device 101 while 102b and 102c establish wireless connections by joining the same wireless local area network as the first electronic device 101. The embodiment of this application does not limit the manner in which the first electronic device 101 and the second electronic devices 102 establish their connections.
Of course, the second electronic devices 102 shown in fig. 1 include only three different types of electronic devices, 102a, 102b, and 102c, as an example. In practical applications, the second electronic devices 102 may include more devices; for example, they may include other electronic devices besides 102a, 102b, and 102c. The embodiment of this application does not limit the number of second electronic devices 102.
In this embodiment, the first electronic device 101 may generate a wallpaper set based on one viewfinder image, and the wallpaper set may include multiple screen wallpapers. These screen wallpapers are generated based on the patterns and colors that the first electronic device 101 recognizes in, and derives from, the viewfinder image, so the multiple wallpapers remain relatively unified in style. Furthermore, different wallpapers in the set can be adapted to the first electronic device 101 and/or the second electronic devices 102. For example, some wallpapers in the set may match the specification of a mobile-phone screen, while others may match the screen of a wearable device or a tablet computer. The wallpapers in the set can therefore be used to replace the existing wallpapers of the first electronic device 101 and/or the second electronic devices 102, and the replaced wallpapers likewise remain relatively unified in style.
In the embodiment of the present application, the first electronic device 101 or the second electronic device 102 may be an electronic device with a screen display function, such as a mobile phone, a tablet computer, a wearable device, an in-vehicle device, a notebook computer, a Personal Computer (PC), a netbook, or a Personal Digital Assistant (PDA). The embodiment of the present application does not limit the specific type of the first electronic device 101 or the second electronic device 102.
The first electronic device 101 and the second electronic device 102 in the embodiment of the present application may be the same type of electronic device. For example, the first electronic device 101 and the second electronic device 102 are both mobile phones; alternatively, the first electronic device 101 and the second electronic device 102 are both wearable devices. The first electronic device 101 and the second electronic device 102 in the embodiment of the present application may also be different types of electronic devices. Moreover, since the number of the second electronic devices 102 may be multiple, when the first electronic device 101 and the multiple second electronic devices 102 belong to different types of electronic devices, each of the first electronic device 101 and the multiple second electronic devices 102 may be an electronic device of a completely different type or an electronic device of a partially different type. For example, the first electronic device 101 is a mobile phone, and the second electronic device 102 may include a plurality of electronic devices, such as a notebook computer, a tablet computer, a wearable device, etc., of types completely different from the first electronic device 101; alternatively, the first electronic device 101 is a mobile phone, and the second electronic device 102 may include a plurality of electronic devices, such as a mobile phone, a tablet computer, a wearable device, and the like, which are partially different from the type of the first electronic device 101.
Illustratively, referring to fig. 1, the first electronic device 101 may be a cellular phone and the second electronic device 102 may include a plurality of different types of electronic devices that establish a communication connection with the cellular phone. For example, three different types of electronic devices are shown in fig. 1. In fig. 1, 102a is a wearable device (smart watch), 102b is a tablet computer, and 102c is a notebook computer, which are completely different from the first electronic device 101 in type.
In implementing the method for replacing the screen wallpaper of an electronic device provided by the embodiments of this application, the mobile phone shown as 101 in fig. 1 may generate a wallpaper set based on one viewfinder image, and the wallpaper set may include multiple screen wallpapers, each generated based on the patterns and colors contained in the viewfinder image. Several wallpapers in the set may match the specification of the mobile phone, several may match the wearable device shown as 102a in fig. 1, several may match the tablet computer shown as 102b in fig. 1, and several may match the notebook computer shown as 102c in fig. 1. In this way, in response to a user operation or instruction, the mobile phone 101 may replace the wallpaper currently used by each device with one of the wallpapers adapted to that device. For example, the mobile phone 101 may replace its own current wallpaper with one of the wallpapers adapted to itself, replace the current wallpaper of the wearable device with one adapted to the wearable device, replace the current wallpaper of the tablet computer with one adapted to the tablet computer, and replace the current wallpaper of the notebook computer with one adapted to the notebook computer.
After generating a wallpaper set containing multiple screen wallpapers based on the viewfinder image, the mobile phone shown as 101 in fig. 1 may replace the screen wallpapers of only some of the devices, that is, itself and/or some of the communicatively connected second electronic devices (the wearable device 102a, the tablet computer 102b, and the notebook computer 102c in fig. 1), or it may replace the screen wallpapers of all of these devices; the embodiment of this application does not limit this.
Fig. 2 shows a schematic structural diagram of an electronic device 200. The structure of the first electronic device 101 and the second electronic device 102 described above may refer to the structure of the electronic device 200.
The electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, a key 290, a motor 291, an indicator 292, a camera 293, a display screen 294, and a Subscriber Identity Module (SIM) card interface 295, among others. Among them, the sensor module 280 may include a pressure sensor 280A, a gyro sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 200. In some embodiments of the present application, the electronic device 200 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units. For example, the processor 210 may include an Application Processor (AP), a modem processor, a Graphic Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments of the present application, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
In some embodiments of the present application, the processor 210 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose-input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments of the present application, processor 210 may include multiple sets of I2C buses. The processor 210 may be coupled to the touch sensor 280K, the charger, the flash, the camera 293, and the like through different I2C bus interfaces. For example, the processor 210 may be coupled to the touch sensor 280K through an I2C interface, such that the processor 210 and the touch sensor 280K communicate through an I2C bus interface to implement the touch function of the electronic device 200.
The I2S interface may be used for audio communication. In some embodiments of the present application, processor 210 may include multiple sets of I2S buses. Processor 210 may be coupled to audio module 270 via an I2S bus to enable communication between processor 210 and audio module 270. In some embodiments of the present application, the audio module 270 may transmit an audio signal to the wireless communication module 260 through an I2S interface, so as to implement a function of answering a call through a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments of the present application, audio module 270 and wireless communication module 260 may be coupled via a PCM bus interface. In some embodiments of the present application, the audio module 270 may also transmit an audio signal to the wireless communication module 260 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments of the present application, a UART interface is generally used to connect the processor 210 and the wireless communication module 260. For example, the processor 210 communicates with a bluetooth module in the wireless communication module 260 through a UART interface to implement a bluetooth function. In some embodiments of the present application, the audio module 270 may transmit an audio signal to the wireless communication module 260 through a UART interface, so as to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 210 with peripheral devices such as the display screen 294, the camera 293, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like.
In some embodiments of the present application, the processor 210 and the camera 293 communicate through a CSI interface to implement the shooting function of the electronic device 200. The processor 210 and the display screen 294 communicate through the DSI interface to implement a display function of the electronic device 200.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments of the present application, a GPIO interface may be used to connect processor 210 with camera 293, display screen 294, wireless communication module 260, audio module 270, sensor module 280, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 230 is an interface conforming to a USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 230 may be used to connect a charger to charge the electronic device 200, and may also be used to transmit data between the electronic device 200 and a peripheral device. The USB interface 230 may also be used to connect headphones through which audio is played. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 200. In other embodiments of the present application, the electronic device 200 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charge management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 240 may receive charging input from a wired charger via the USB interface 230. In some wireless charging embodiments, the charging management module 240 may receive the wireless charging input through a wireless charging coil of the electronic device 200. The charging management module 240 may also supply power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display 294, the camera 293, the wireless communication module 260, and the like. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc.
In some other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charging management module 240 may be disposed in the same device.
The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 200. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 250 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 250 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave.
In some embodiments of the present application, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments of the present application, at least some of the functional modules of the mobile communication module 250 may be disposed in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays an image or video through the display screen 294.
In some embodiments of the present application, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 250 or other functional modules, independent of the processor 210.
The wireless communication module 260 may provide a solution for wireless communication applied to the electronic device 200, including wireless local area network (WLAN), such as wireless fidelity (Wi-Fi) network, Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments of the present application, the antenna 1 of the electronic device 200 is coupled to the mobile communication module 250 and the antenna 2 is coupled to the wireless communication module 260, such that the electronic device 200 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division-synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 200 implements display functions through the GPU, the display screen 294, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments of the present application, the electronic device 200 may include 1 or N display screens 294, N being a positive integer greater than 1.
The electronic device 200 may implement a photographing function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, and the application processor, etc.
The ISP is used to process the data fed back by the camera 293. For example, when a user takes a picture, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, an optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and converting the electric signal into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments of the present application, the ISP may be provided in camera 293.
The camera 293 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format. In some embodiments of the present application, the electronic device 200 may include 1 or N cameras 293, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 200 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 200 may support one or more video codecs. In this way, the electronic device 200 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 200, for example, image recognition, face recognition, voice recognition, text understanding, and the like, may be implemented by the NPU.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 200. The external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 200, and the like.
In addition, the internal memory 221 may include a high-speed random access memory and may also include a nonvolatile memory. Such as at least one magnetic disk storage device, flash memory device, universal Flash Storage (UFS), etc.
The processor 210 executes various functional applications and data processing of the electronic device 200 by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
Electronic device 200 may implement audio functions through audio module 270, speaker 270A, receiver 270B, microphone 270C, headphone interface 270D, and an application processor, among other things. Such as music playing, recording, etc.
Audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. Audio module 270 may also be used to encode and decode audio signals. In some embodiments of the present application, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
The speaker 270A, also called a "horn", is used to convert an audio electrical signal into an acoustic signal. The electronic apparatus 200 can listen to music through the speaker 270A or listen to a hands-free call.
The receiver 270B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic apparatus 200 receives a call or voice information, it is possible to receive voice by placing the receiver 270B close to the human ear.
The microphone 270C, also referred to as a "mouthpiece" or "voice tube", converts sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 270C by speaking close to it. The electronic device 200 may be provided with at least one microphone 270C. In other embodiments, the electronic device 200 may be provided with two microphones 270C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 200 may further include three, four, or more microphones 270C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function.
The earphone interface 270D is used to connect a wired earphone. The earphone interface 270D may be the USB interface 230, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 280A is used for sensing a pressure signal, which can be converted into an electrical signal. In some embodiments, pressure sensor 280A may be disposed on display screen 294. The pressure sensor 280A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 280A, the capacitance between the electrodes changes. The electronic device 200 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 294, the electronic apparatus 200 detects the intensity of the touch operation based on the pressure sensor 280A. The electronic apparatus 200 may also calculate the touched position from the detection signal of the pressure sensor 280A.
In some embodiments of the present application, touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation having a touch operation intensity smaller than a first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 280B may be used to determine the motion pose of the electronic device 200. In some embodiments of the present application, the angular velocity of the electronic device 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 280B. The gyro sensor 280B may be used for anti-shake photography. Illustratively, when the shutter is pressed, the gyro sensor 280B detects the shake angle of the electronic device 200, calculates the distance that the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 200 through a reverse motion, thereby achieving anti-shake. The gyro sensor 280B may also be used for navigation and motion-sensing game scenarios.
The air pressure sensor 280C is used to measure air pressure. In some embodiments of the present application, the electronic device 200 calculates the altitude from the barometric pressure value measured by the air pressure sensor 280C, to assist positioning and navigation.
The magnetic sensor 280D includes a Hall sensor. The electronic device 200 may detect the opening and closing of a flip holster using the magnetic sensor 280D. In some embodiments of the present application, when the electronic device 200 is a flip phone, the electronic device 200 may detect the opening and closing of the flip cover according to the magnetic sensor 280D, and may further set features such as automatic unlocking upon opening, according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 280E may detect the magnitude of the acceleration of the electronic device 200 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 200 is stationary. It may also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and the like.
A distance sensor 280F for measuring distance. The electronic device 200 may measure the distance by infrared or laser. In some embodiments of the present application, such as shooting a scene, the electronic device 200 may utilize the distance sensor 280F to range to achieve fast focus.
The proximity light sensor 280G may include, for example, a Light Emitting Diode (LED) and a light detector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 200 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 200; when insufficient reflected light is detected, the electronic device 200 may determine that there is no object nearby. The electronic device 200 can use the proximity light sensor 280G to detect that the user is holding the electronic device 200 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 280G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 280L is used to sense the ambient light level. The electronic device 200 may adaptively adjust the brightness of the display screen 294 based on the perceived ambient light level. The ambient light sensor 280L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 280L may also cooperate with the proximity light sensor 280G to detect whether the electronic device 200 is in a pocket to prevent inadvertent touches.
The fingerprint sensor 280H is used to collect a fingerprint. The electronic device 200 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The temperature sensor 280J is used to detect temperature. In some embodiments of the present application, the electronic device 200 implements a temperature processing strategy using the temperature detected by the temperature sensor 280J. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold, the electronic device 200 reduces the performance of a processor located near the temperature sensor 280J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 200 heats the battery 242 when the temperature is below another threshold, to avoid an abnormal shutdown caused by the low temperature. In other embodiments, when the temperature is below a further threshold, the electronic device 200 boosts the output voltage of the battery 242 to avoid an abnormal shutdown due to low temperature.
The touch sensor 280K is also referred to as a "touch device". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, also called a "touch-control screen". The touch sensor 280K is used to detect a touch operation applied on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 294. In other embodiments, the touch sensor 280K may be disposed on a surface of the electronic device 200 at a location different from that of the display screen 294.
The bone conduction sensor 280M may acquire a vibration signal. In some embodiments of the present application, the bone conduction sensor 280M can acquire the vibration signal of the bone mass vibrated by the human voice. The bone conduction sensor 280M may also contact the pulse of the human body to receive a blood pressure pulsation signal.
In some embodiments of the present application, the bone conduction sensor 280M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 270 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 280M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 280M, so as to realize the heart rate detection function.
The keys 290 include a power-on key, a volume key, etc. The keys 290 may be mechanical keys or touch keys. The electronic apparatus 200 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 200.
The motor 291 may generate a vibration cue. The motor 291 can be used for both incoming call vibration prompting and touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 291 may also respond to different vibration feedback effects for touch operations on different areas of the display 294. Different application scenarios (e.g., time reminders, received messages, alarms, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 292 may be an indicator light, which may be used to indicate a charging status or a change in battery level, and may also be used to indicate a message, a missed call, a notification, and the like.
The SIM card interface 295 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic apparatus 200 by being inserted into the SIM card interface 295 or being pulled out from the SIM card interface 295. The electronic device 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 295 at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 295 may also be compatible with different types of SIM cards. The SIM card interface 295 may also be compatible with external memory cards. The electronic device 200 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments of the present application, the electronic device 200 employs an eSIM (i.e., an embedded SIM card). The eSIM card may be embedded in the electronic device 200 and cannot be separated from the electronic device 200.
The operating system loaded on the electronic device 200 may include, but is not limited to, the HarmonyOS (Hongmeng) system or other operating systems.
The following embodiments take the electronic device with the above hardware structure as an example, and describe the method for replacing the wallpaper of the electronic device provided in the embodiments of the present application.
As shown in fig. 3, a schematic flowchart of a method for replacing a wallpaper of an electronic device according to an embodiment of the present application is provided, where the method specifically includes:
S301, the first electronic device acquires a framing image.
In the embodiment of the application, the framing image may be a base image for generating the screen wallpaper, that is, the first electronic device may generate the screen wallpaper on the basis of a series of processing performed on the framing image. The framing image may be an image captured in real time by the first electronic device. For example, under the user's operation, the first electronic device may capture an image of an arbitrary object or scene with an imaging device such as a front camera or a rear camera, and use that image as the framing image. Alternatively, the framing image may be any image stored in an album or other storage unit of the first electronic device. For example, the first electronic device may select an image from an album as the framing image, which is not limited in this embodiment of the application.
In a possible implementation manner of the embodiment of the present application, the framing image may be a complete image or an image area in a complete image. For example, after an image is captured or selected, the first electronic device may call the image editing control to crop the image, and a certain image area obtained after cropping is used as a viewfinder image. The image region may include a regular image region or an irregular image region.
Note that the framing image in the embodiment of the present application may be an image in any file format. For example, the framing image may be an image in a format such as JPEG (Joint Photographic Experts Group), Scalable Vector Graphics (SVG), or bitmap, and the format of the framing image is not limited in the embodiment of the present application.
S302, the first electronic device identifies patterns and colors in the framing image.
In the embodiment of the present application, the pattern in the framing image may refer to a pattern of some basic shapes contained in the framing image, or a complex pattern combined from patterns based on some basic shapes. For example, the pattern in the framing image may be a basic pattern such as a circle, a square, a rectangle, or a line, or may be a complex pattern obtained by combining basic patterns such as circles, squares, rectangles, and lines. The complex patterns may also include patterns in formats such as SVG and bitmap.
The first electronic device may employ image detection techniques to identify patterns contained in the framing image. The number of patterns recognized by the first electronic device may be one, two, or more.
In this embodiment, the first electronic device may identify a color in the viewfinder image by using a color extraction algorithm, where the color may be a color of each pixel point in the viewfinder image.
Fig. 4 is a schematic diagram of a color processing flow provided in the embodiment of the present application. The first electronic device may perform the process illustrated in fig. 4 when recognizing colors in the through-image and processing the recognized colors. According to the flow shown in fig. 4, after the first electronic device acquires the through image, the first electronic device may first perform color analysis on the through image. Illustratively, the first electronic device may perform color analysis on the through-image using a median difference algorithm.
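The color analysis step can be pictured with a short sketch. The following Python snippet is a minimal illustration only: it assumes a median-cut style quantization (the embodiment names a "median difference algorithm" without publishing it), and the function name analyze_colors, the Pillow-based loading, and the 128x128 downsampling are all illustrative choices rather than part of the embodiment.

```python
# Hypothetical sketch of the color-analysis step in fig. 4; assumes a
# median-cut style quantization, not the embodiment's exact algorithm.
from PIL import Image

def analyze_colors(viewfinder_path, max_colors=8):
    img = Image.open(viewfinder_path).convert("RGB")
    img.thumbnail((128, 128))              # downsample to keep the analysis cheap
    boxes = [list(img.getdata())]          # one box holding all pixels
    while len(boxes) < max_colors:
        # split the box with the widest channel range at its median
        box = max(boxes, key=lambda b: max(max(c) - min(c) for c in zip(*b)))
        ranges = [max(c) - min(c) for c in zip(*box)]
        ch = ranges.index(max(ranges))
        box.sort(key=lambda px: px[ch])
        mid = len(box) // 2
        boxes.remove(box)
        boxes.extend([box[:mid], box[mid:]])
    # represent each box by its average color
    return [tuple(sum(c) // len(b) for c in zip(*b)) for b in boxes]
```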
It should be noted that, in the embodiment of the present application, the process in which the first electronic device identifies and processes the colors in the framing image may be implemented based on any type of color mode, for example HSB, HSL, RGB, or CMYK, which is not limited in this embodiment.
Generally, the colors of the pixels in the framing image are not completely the same, and the number of pixels contained in one image is very large. Thus, the first electronic device may recognize a large number of colors from the framing image.
In a possible implementation manner of the embodiment of the present application, after identifying colors in a viewfinder image, the first electronic device may further perform certain processing on the colors by using a color algorithm. Illustratively, as shown in fig. 4, the first electronic device may merge similar colors. For example, the first electronic device may determine the light yellow and the dark yellow that are recognized as similar colors, and combine the two colors to obtain yellow with a uniform color value.
As an example of the embodiment of the present application, fig. 5 is a schematic diagram of the relationship between lightness and saturation in the HSB color mode. In the HSB color mode, the letter H represents the hue of a color, the letter B represents its lightness, and the letter S represents its saturation. In fig. 5, the horizontal axis represents saturation (S) and the vertical axis represents lightness (B). As the saturation and lightness of a color of a given hue (H) increase, the effect exhibited by the color becomes more vivid. In fig. 5, the saturation and lightness of the colors in the first region 501 are greater than those of the colors in the second region 502. Thus, the first region 501 may be regarded as the bright color group, and the second region 502 may be regarded as the gray color group. When the first electronic device processes the colors in the framing image based on the HSB color mode, it can merge similar colors and optimize some colors according to the saturation, lightness, and other attributes of each color.
Fig. 6 is a schematic diagram of a similar-color merging process provided in the embodiment of the present application. When the first electronic device processes any two colors according to the flow shown in fig. 6, it may first examine the saturation or lightness of the two colors. For example, the first electronic device may first determine whether the saturation of each color is less than 15, or its lightness is less than 20. As can be seen from fig. 5, if the saturation of each of the two colors is less than 15, or the lightness of each of the two colors is less than 20, the first electronic device may determine that the two colors are both close to black or white, so the first electronic device may consider the two colors similar and merge them. Otherwise, the first electronic device may consider the colors to be chromatic colors. In this case, the first electronic device may continue by comparing the difference between the two hues. For example, as shown in fig. 6, the first electronic device may determine whether the hues of the two colors differ by no more than 20 degrees. If the hues differ by more than 20 degrees, the first electronic device may consider the two colors dissimilar. Otherwise, the first electronic device may continue to compare the saturation difference and the lightness difference of the two colors. For example, as shown in fig. 6, the first electronic device may determine whether the saturation difference and the lightness difference of the two colors are both less than 20. If so, the first electronic device may determine that the two colors are similar; otherwise, it may determine that they are not similar. For two similar colors, the first electronic device may merge them into one color. For the new color obtained by merging, the first electronic device may continue to process the new color and another color according to the flow shown in fig. 6 and determine whether the two belong to similar colors, until no two colors identified by the first electronic device belong to similar colors.
In a possible implementation manner of the embodiment of the application, when the first electronic device combines two similar colors into one color, based on one of the colors, the other color may be adjusted to be the same as the color. For example, if the first electronic device determines that the first color and the second color are similar according to the flow shown in fig. 6, the first electronic device may adjust the values of the hue, saturation, and lightness of the second color to be the same as the first color based on the first color; or the first electronic device may adjust the hue, saturation, lightness, and other values of the first color to be the same as those of the second color based on the second color.
In another possible implementation manner of the embodiment of the present application, when the first electronic device performs similar color merging, a target value may also be determined according to the hues of the similar first color and the second color, and the target value is used as the hue of the merged color. The target value may be an average value of the hues of the first color and the second color, or may be a value predetermined according to a hue interval in which the hues of the first color and the second color are located. The first electronic device may also determine the saturation and lightness of the combined color in a manner similar to that described above.
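A compact sketch of the decision flow of fig. 6, with colors expressed as (H, S, B) triples (H in degrees, S and B in 0-100). The helper names and the averaging merge rule are assumptions; as described above, the embodiment may instead keep one of the two colors unchanged or use a preset target value for the merged color.

```python
# Sketch of the similar-color test of fig. 6 and one possible merge rule.

def hue_diff(h1, h2):
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)                       # hue is circular

def are_similar(c1, c2):
    """c1, c2 are (H, S, B) with H in [0, 360), S and B in [0, 100]."""
    # both colors close to black or white -> treat as similar
    if (c1[1] < 15 or c1[2] < 20) and (c2[1] < 15 or c2[2] < 20):
        return True
    if hue_diff(c1[0], c2[0]) > 20:              # hues differ by more than 20 degrees
        return False
    return abs(c1[1] - c2[1]) < 20 and abs(c1[2] - c2[2]) < 20

def merge_colors(c1, c2):
    # illustrative choice: take the mean of the two colors; the embodiment may
    # instead keep one color unchanged or use a value preset per hue interval
    return tuple((a + b) / 2 for a, b in zip(c1, c2))
```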
As shown in fig. 4, after completing the similar-color merging, the first electronic device may select a certain number of colors from the colors obtained after recognition and merging, as the colors required for subsequently generating the wallpaper.
In the embodiment of the application, the first electronic device should not recognize too many kinds of colors from the framing image. For example, the first electronic device may recognize no more than three colors, i.e., the first electronic device may select at most three of the recognized colors as the colors to be used subsequently. If multiple kinds of colors are recognized, the first electronic device may further determine the primary and secondary relationship among them, so as to select a part of the colors to be applied in the subsequent process of generating the screen wallpaper. For example, in the case that multiple colors are recognized, the first electronic device may determine one of them as the primary color and the others as secondary colors; if there are several secondary colors, the first electronic device may further determine a first secondary color, a second secondary color, and so on. The first electronic device may determine the primary and secondary colors according to the coverage area of the colors in the framing image. For example, the first electronic device may determine the color with the largest coverage area in the framing image as the primary color and the other colors as secondary colors. Among the secondary colors, the first electronic device may determine the first secondary color, the second secondary color, and so on in the same manner. The first electronic device may determine the coverage area of each color in the framing image based on the unprocessed colors corresponding to that color. For example, if the yellow recognized by the first electronic device is a color obtained by merging and optimizing part of the light yellow and dark yellow in the framing image, then when determining the coverage area of this yellow in the framing image, the first electronic device may count the areas covered by the light yellow and the dark yellow together.
Alternatively, the first electronic device may determine the main color and the auxiliary color according to the position relationship of the colors in the through image, which is not limited in this embodiment of the present application.
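The coverage-based selection of a primary color and secondary colors can be sketched as follows; the mapping merged_of from each merged color to the raw colors it absorbed, and the limit of three colors, follow the description above, while the data structures and function names are hypothetical.

```python
# Sketch of picking a primary color and secondary colors by coverage area.
from collections import Counter

def rank_colors(pixel_colors, merged_of, max_colors=3):
    """pixel_colors: raw color per pixel; merged_of: merged color -> raw colors."""
    raw_counts = Counter(pixel_colors)               # raw color -> pixel count
    coverage = {
        merged: sum(raw_counts.get(raw, 0) for raw in raws)
        for merged, raws in merged_of.items()
    }
    ranked = sorted(coverage, key=coverage.get, reverse=True)[:max_colors]
    primary, secondaries = ranked[0], ranked[1:]     # first secondary, second secondary, ...
    return primary, secondaries
```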
In a possible implementation manner of the embodiment of the application, the first electronic device may further perform color optimization on some colors according to actual needs. For example, after recognizing some colors, the first electronic device may adjust the lightness and saturation of the colors to a preset value range, so that the effect exhibited by the colors is more attractive.
As an example of the embodiment of the present application, fig. 7 is a schematic diagram of a color optimization process provided in the embodiment of the present application. Fig. 7 shows the process by which the first electronic device optimizes some colors for a smart watch dial with a black background. As shown in fig. 7, for a certain color, the first electronic device may first determine whether the color is yellow. In general, yellow can be regarded as a color with a hue between 45 and 60, i.e., the hue H of yellow satisfies 45 < H < 60. If the color is yellow, that is, its hue is between 45 and 60, the first electronic device may, for the dial with a black background, adjust the hue, saturation, and lightness of the color to preset values. For example, the first electronic device may set the hue of the color to 45, its saturation to 90, and its lightness to 95, and then execute the color optimization algorithm based on the black background to complete the optimization of the color. As shown in fig. 7, if the color is not yellow, that is, its hue is not between 45 and 60, the first electronic device may first determine whether the saturation and lightness of the color fall within a preset interval. There may be several preset intervals, for example a first preset interval and a second preset interval, and the first electronic device may set these preset intervals specifically for the dial with a black background. For example, the first preset interval may refer to the interval with saturation between 65 and 100 and lightness between 85 and 100, and the second preset interval may refer to the interval with saturation between 0 and 40 and lightness between 40 and 70. That is, the first preset interval satisfies 65 < S < 100 and 85 < B < 100, and the second preset interval satisfies 0 < S < 40 and 40 < B < 70. If the saturation and lightness of the color satisfy the condition of any preset interval, the first electronic device may directly execute the color optimization algorithm based on the black background to optimize the color. If the saturation and lightness of the color do not satisfy the condition of any preset interval, the first electronic device may adjust the saturation and lightness of the color into a preset interval. Illustratively, if the lightness of the color is greater than 70, that is, B > 70, the first electronic device may adjust the saturation and lightness of the color into the first preset interval, so that the adjusted color satisfies 65 < S < 100 and 85 < B < 100. If the lightness of the color is not greater than 70, that is, B ≤ 70, the first electronic device may adjust the color into the preset interval closest to it according to the distance between the color's saturation and lightness and the two preset intervals, and then execute the color optimization algorithm based on the black background.
When calculating the distance between the color's saturation and lightness and the two preset intervals, the calculation can be performed with the help of the lightness-saturation relationship diagram of the HSB color mode shown in fig. 5. For example, the first electronic device may draw the two preset intervals in the coordinate system of fig. 5, where they appear as two rectangular areas. Then, the first electronic device may determine the position of the color in that coordinate system according to its saturation and lightness, and calculate the distance between this position and the two rectangular areas, so as to determine which preset interval the color's saturation and lightness are closest to.
It should be noted that fig. 7 is only an example of optimizing yellow for a black background. For backgrounds of other colors and for other colors, the first electronic device may perform targeted optimization according to a preset color optimization algorithm to improve the display effect of various colors on different backgrounds, which is not limited in this embodiment of the application.
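A sketch of the black-background branch of fig. 7. The interval constants and the yellow rule follow the values above; the distance-to-interval helper is an assumption, and the final "color optimization algorithm based on the black background" is not published, so it is represented here by simply returning the adjusted color.

```python
# Sketch of the black-background optimization flow of fig. 7 (HSB values:
# H in [0, 360), S and B in [0, 100]).

INTERVAL_1 = (65, 100, 85, 100)     # s_min, s_max, b_min, b_max
INTERVAL_2 = (0, 40, 40, 70)

def _clamp_into(s, b, iv):
    s_min, s_max, b_min, b_max = iv
    return min(max(s, s_min), s_max), min(max(b, b_min), b_max)

def _dist(s, b, iv):
    cs, cb = _clamp_into(s, b, iv)
    return ((s - cs) ** 2 + (b - cb) ** 2) ** 0.5   # distance to the rectangle in fig. 5

def optimize_for_black_background(h, s, b):
    if 45 < h < 60:                                 # yellow: use the preset values
        return 45, 90, 95
    in_1 = 65 < s < 100 and 85 < b < 100
    in_2 = 0 < s < 40 and 40 < b < 70
    if in_1 or in_2:
        return h, s, b                              # already inside a preset interval
    if b > 70:
        s, b = _clamp_into(s, b, INTERVAL_1)        # move into the first interval
    else:
        nearest = min((INTERVAL_1, INTERVAL_2), key=lambda iv: _dist(s, b, iv))
        s, b = _clamp_into(s, b, nearest)           # move into the nearest interval
    return h, s, b
```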
S303, the first electronic device generates a target pattern based on the pattern.
In the embodiment of the present application, the target pattern may be some new patterns obtained by the first electronic device after processing the pattern recognized in the through image. For example, the first electronic device may perform a deformation process on the pattern, perform a smoothing process on lines in the pattern, delete partial details in the pattern, connect lines at different positions in the pattern, and so on. These target patterns may be part of subsequently generated screen wallpaper.
In a possible implementation manner of the embodiment of the present application, the target pattern generated by the first electronic device may be obtained by combining the patterns in the framing image according to certain composition rules. For example, as shown in the schematic generation manner of the target pattern in fig. 8, the first electronic device may combine basic patterns identified in the framing image, such as circles, rectangles, scalable vector graphics, and bitmaps, according to composition rules to obtain the target pattern. The composition rules can include plane-symmetry composition, stacking composition, pattern translation, kaleidoscope composition, and the like. The first electronic device can select one or more composition rules to combine the patterns to obtain different target patterns, enriching the diversity of the subsequently generated screen wallpaper.
In the embodiment of the present application, the composition rule used by the first electronic device may also be influenced by different influence factors. As shown in fig. 8, these influencing factors may include photo capture mode, device type, wallpaper usage, random number, and so forth. The composition rules corresponding to these influence factors may be pre-built in the first electronic device, and in the process of generating the target pattern by the first electronic device, the first electronic device may select the composition rule corresponding to the current influence factor to combine the patterns. The composition rules shown in fig. 8 may be used alone or in combination. For example, the first electronic device may use any one composition rule alone or may use two or more composition rules in combination at the same time when generating the target pattern from the pattern in the through image. When the first electronic device uses two or more composition rules in combination, the usage sequence of each composition rule may be random, or may be preset according to other factors such as an influence factor, which is not limited in the embodiment of the present application.
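For illustration, two of the composition rules named above (plane-symmetry composition and pattern translation) might look like the following Pillow-based sketch; the actual composition rules built into the device and their mapping to influence factors are not published.

```python
# Illustrative sketch of two composition rules from fig. 8, using Pillow.
from PIL import Image, ImageOps

def mirror_compose(pattern: Image.Image) -> Image.Image:
    """Plane-symmetry composition: the pattern next to its mirrored copy."""
    w, h = pattern.size
    canvas = Image.new("RGBA", (2 * w, h))
    canvas.paste(pattern, (0, 0))
    canvas.paste(ImageOps.mirror(pattern), (w, 0))
    return canvas

def tile_compose(pattern: Image.Image, cols: int, rows: int) -> Image.Image:
    """Pattern-translation composition: repeat the pattern on a grid."""
    w, h = pattern.size
    canvas = Image.new("RGBA", (cols * w, rows * h))
    for r in range(rows):
        for c in range(cols):
            canvas.paste(pattern, (c * w, r * h))
    return canvas
```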
In a possible implementation manner of the embodiment of the present application, the target pattern generated by the first electronic device may also be obtained by combining a pattern recognized in the through-view image and a wallpaper template built in the first electronic device. Illustratively, the built-in wallpaper template may be a template drawn based on a pattern of different shape. After recognizing the pattern in the viewfinder image, the first electronic device may replace some or all of the patterns in the built-in wallpaper template with the pattern, so as to obtain a new wallpaper template containing the pattern in the viewfinder image. The first electronic device may take the pattern in the new wallpaper template as the target pattern.
In another possible implementation manner of the embodiment of the application, when the first electronic device generates the target pattern by using a built-in wallpaper template, the wallpaper template to be finally used may be determined according to the complexity of each wallpaper template. The complexity of a wallpaper template may be used to represent how complex the wallpaper template is. Illustratively, the complexity of the wallpaper template may be determined according to its amount of coloring. For example, a wallpaper template with a larger coloring amount has a higher complexity, and a wallpaper template with a smaller coloring amount has a relatively lower complexity. Alternatively, the complexity of the wallpaper template may be related to whether a gradient color is present in the template. For example, the complexity of a wallpaper template with a gradient may be higher than that of a wallpaper template without one. Still alternatively, the complexity of the wallpaper template may be related to the richness of the patterns it contains. For example, the more kinds of patterns a wallpaper template contains and the more complicated those patterns are, the higher the complexity of the template; conversely, the complexity of the wallpaper template is relatively low. It should be noted that the complexity of the wallpaper template can be represented in levels. For example, complexity levels 1-6 may be defined in advance. If the complexity level of a wallpaper template is level six, the wallpaper template has the highest complexity; if the complexity level is level one, the wallpaper template has the lowest complexity.
When determining the wallpaper template to be used according to complexity, the first electronic device may first identify the complexity of the framing image. For example, the first electronic device may recognize which of levels 1-6 the complexity of the framing image belongs to. Then, the first electronic device may select a wallpaper template with the same complexity as the framing image and take the pattern contained in that wallpaper template as the target pattern. It should be noted that the complexity of the framing image can be identified by a trained artificial intelligence (AI) algorithm model.
In another possible implementation manner of the embodiment of the application, when the first electronic device generates the target pattern by using a built-in wallpaper template, the wallpaper template to be finally used may be determined according to the patterns contained in each wallpaper template. For example, the wallpaper templates built into the first electronic device can be divided into different categories according to the patterns they contain, such as stripes, squares, spots/circles, arcs, and irregular patterns. Upon recognizing a pattern in the framing image, the first electronic device may select a wallpaper template from the category that is the same as or similar to that pattern. For example, when the first electronic device recognizes that the pattern contained in the framing image belongs to the spot/circle type, the first electronic device may select one or more wallpaper templates from the built-in wallpaper templates of the spot/circle type as the wallpaper templates to be used subsequently, and the patterns in the selected wallpaper templates are the target patterns. It should be noted that the first electronic device may recognize which category the pattern contained in the framing image belongs to by means of an AI image analysis technique.
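Template selection by complexity level or by pattern category can be sketched as a simple lookup; the TEMPLATES records, level values, and category labels below are hypothetical stand-ins for the device's built-in wallpaper templates and AI classifiers.

```python
# Sketch of choosing a built-in wallpaper template by complexity or category.
# Level 6 is the most complex and level 1 the least, as described above.

TEMPLATES = [
    {"id": "t1", "category": "stripe",      "complexity": 2},
    {"id": "t2", "category": "spot/circle", "complexity": 5},
    {"id": "t3", "category": "arc",         "complexity": 3},
]

def pick_by_complexity(viewfinder_level: int):
    return [t for t in TEMPLATES if t["complexity"] == viewfinder_level]

def pick_by_category(pattern_category: str):
    return [t for t in TEMPLATES if t["category"] == pattern_category]
```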
S304, the first electronic device fills the target pattern by adopting the identified color to obtain a wallpaper set containing a plurality of screen wallpapers.
In this embodiment of the application, the filling of the target pattern by the first electronic device using the identified color may refer to that the first electronic device fills different pattern areas of the target pattern using the identified color, so that the filled pattern areas show corresponding colors. These pattern areas of the target pattern may be determined by an algorithm built in the first electronic device when generating the target pattern. In general, the number of pattern regions included in the target pattern may be one, two, or more.
In a possible implementation manner of the embodiment of the application, the first electronic device may perform filling with different colors according to the primary and secondary relationships of the pattern area. For example, the first electronic device may determine a certain pattern region as a primary region and the other pattern regions as secondary regions according to an algorithm. If the secondary area includes a plurality of secondary areas, the first electronic device may further determine the plurality of secondary areas as the first secondary area, the second secondary area, and so on. The first electronic device may then fill the primary region with the identified primary color and the secondary region with the identified secondary color. For example, the first electronic device may fill a first secondary color in a first secondary region, a second secondary color in a second secondary region, and so on. Of course, the first electronic device may also randomly fill the recognized colors into each pattern region of the target pattern, which is not limited in this embodiment of the application.
In another possible implementation manner of the embodiment of the application, when the first electronic device fills the color into each pattern region, the first electronic device may further perform corresponding processing on the color. Illustratively, the first electronic device may fill in color using a solid color or a gradient color, or the like. For example, if the dominant color recognized by the first electronic device is yellow, and the first electronic device plans to fill the primary area in the target pattern with the yellow, the first electronic device may fill the primary area with pure yellow. Or, the first electronic device may also fill the main area with a gradually changing yellow color, so that the main area filled with the color shows a gradually changing effect.
Fig. 9 is a schematic diagram of a gradient color generation process provided in an embodiment of the present application. According to the generation process shown in fig. 9, when generating a gradient color with a gradient effect based on a certain color, the first electronic device may do so according to the saturation of the color. For example, the first electronic device may first determine the interval to which the saturation of the color belongs. For example, the saturation intervals may include the three intervals [0,30), [30,60), and [60,100]. For different saturation intervals, the corresponding gradient color generation processes are different. For example, for a color whose saturation belongs to [0,30), the first electronic device may gradually adjust its hue to 120 and its lightness to 40 when generating the gradient color; for a color whose saturation belongs to [30,60), the first electronic device may gradually adjust its hue to 30 and its lightness to 40; for a color whose saturation belongs to [60,100], the first electronic device may gradually adjust its hue to 30 and its lightness to 20. Of course, the gradient color generation process shown in fig. 9 is only one example of the embodiment of the present application; for different colors, the first electronic device may generate the gradient color in different ways, which is not limited in the embodiment of the present application.
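A sketch of the interval-dependent gradient of fig. 9; the number of interpolation steps and the linear easing are assumptions, while the saturation intervals and the hue/lightness targets follow the values above.

```python
# Sketch of gradient generation per fig. 9: the end color depends on which
# saturation interval the base color falls in. H in [0, 360), S and B in [0, 100].

def gradient_stops(h, s, b, steps=16):
    if s < 30:
        target_h, target_b = 120, 40
    elif s < 60:
        target_h, target_b = 30, 40
    else:
        target_h, target_b = 30, 20
    stops = []
    for i in range(steps):
        t = i / max(steps - 1, 1)                    # 0.0 .. 1.0
        stops.append((h + (target_h - h) * t, s, b + (target_b - b) * t))
    return stops
```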
When the first electronic device performs color filling on each pattern, the filling may be performed by a single processing method or by a combination of several methods. For example, the first electronic device may perform color filling on each pattern region using pure colors. Alternatively, the first electronic device may perform color filling on each pattern region using gradient colors. Alternatively, the first electronic device may fill part of the pattern regions with pure colors and another part with gradient colors. For example, the first electronic device may fill the main area with the pure primary color, and fill the other sub-areas with gradient colors of the corresponding colors. In this way, the display effect of the filled main area is more prominent, and the problem of the display effect of the sub-areas being too monotonous can also be avoided.
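A sketch of mixed solid and gradient filling: the primary region receives a solid fill of the primary color, and each secondary region receives a simple row-by-row gradient whose lightness eases towards 40. The rectangular regions, the colorsys-based HSB-to-RGB conversion, and the fixed lightness target are illustrative assumptions, not the embodiment's exact filling algorithm.

```python
# Sketch of filling pattern regions: solid primary fill plus gradient sub-areas.
import colorsys
from PIL import Image, ImageDraw

def hsb_to_rgb(h, s, b):
    r, g, bl = colorsys.hsv_to_rgb((h % 360) / 360, s / 100, b / 100)
    return int(r * 255), int(g * 255), int(bl * 255)

def fill_regions(size, primary_rect, primary_hsb, secondary):
    """secondary: list of ((x0, y0, x1, y1), (H, S, B)) pairs."""
    canvas = Image.new("RGB", size, (0, 0, 0))
    draw = ImageDraw.Draw(canvas)
    draw.rectangle(primary_rect, fill=hsb_to_rgb(*primary_hsb))      # solid primary fill
    for (x0, y0, x1, y1), (h, s, b) in secondary:
        rows = max(y1 - y0, 1)
        for i in range(rows):                                        # row-by-row gradient
            t = i / max(rows - 1, 1)
            draw.line([(x0, y0 + i), (x1, y0 + i)],
                      fill=hsb_to_rgb(h, s, b + (40 - b) * t))       # lightness eased to 40
    return canvas
```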
As a specific example of the embodiment of the present application, as shown in fig. 10, a schematic diagram of a color matching process provided by the embodiment of the present application is shown, and after the first electronic device performs the color matching process, the first electronic device may fill corresponding colors into the target patterns generated in the foregoing steps. It should be noted that, unlike the first electronic device described above, which directly fills the identified colors into the target pattern, the color matching process shown in fig. 10 further includes further processing of the colors during the process of filling the colors.
As shown in fig. 10, the first electronic device may first determine the number of identified colors when performing the color matching procedure, for example one color, two colors, or three or more colors. If three or more colors are identified, the first electronic device may fill each color into a different pattern area of the target pattern according to the procedure described above. If only one or two colors are identified, the first electronic device may determine whether a chromatic color exists among them. If a chromatic color is present, the first electronic device may perform the matching using a chromatic color scheme; if the one or two colors do not include a chromatic color, the first electronic device may perform the matching using an achromatic color scheme. Whether a chromatic color is present refers to whether the recognized colors include colors other than black, white, and gray.
The above-described achromatic color scheme and chromatic color scheme may be preset. For example, when executing the achromatic color scheme, the first electronic device may use two or three achromatic colors to match the target pattern. For example, as shown in fig. 10, when the recognized colors do not include a chromatic color, the first electronic device may set the saturation of the recognized color to 0 and set its lightness to two values, one between 30-60 and one between 60-90; alternatively, after doing so, the first electronic device may further add a color with lightness 100, so as to obtain three achromatic colors for the scheme.
As shown in fig. 10, when the first electronic device executes a chromatic color scheme, it may first determine whether the color is red, that is, whether the hue of the color is between 0-30 or 330-360. If the hue of the color is in the above range, it indicates that the color is red. At this time, when the first electronic device performs color matching, the saturation of the color may be decreased by 40 and the lightness thereof may be increased by 10, and then the first electronic device may fill a certain pattern region of the target pattern with the color obtained after the color matching. If the hue of the color is not in the above range, it indicates that the color is not red. At this time, the first electronic device may sequentially determine whether the color is yellow or green, and then adopt a different color scheme for the determination result. As shown in fig. 10, if the color is not red, the first electronic device can determine whether it is yellow, that is, whether the hue of the color is between 45 and 60. If the hue of the color is in the interval 45-60, it indicates that the color is yellow, the first electronic device may reduce the saturation of the yellow color by 40 to obtain a new color, and the first electronic device may fill a certain pattern area of the target pattern with the new color. If the color is not yellow, that is, the hue of the color is not in the interval 45-60, the first electronic device may determine whether the color is green, that is, whether the hue of the color is in the interval 75-219. If the hue of the color is in the interval of 75-219, it indicates that the color is green, and the first electronic device may randomly increase the hue of the color by a certain value on the basis of the original hue of the color. For example, randomly increasing by any one of values 60-90. If the color is not green, that is, the hue of the color is not in the interval of 75-219, the first electronic device may randomly increase the hue of the color by another value based on the original hue of the color. For example, randomly increased by any of 30-60. As shown in fig. 10, after the first electronic device randomly increases any one of the values of 30-60 or 60-90 to the hue of the color, the first electronic device may again perform a corresponding optimization algorithm to optimize the color according to the background color of the device to which the screen wallpaper is applied. Then, the first electronic device may fill a certain pattern area of the target pattern with the optimized color, and complete the entire color matching process.
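The branching of fig. 10 can be summarized in a short sketch. The chromatic test reuses the S < 15 / B < 20 thresholds from fig. 6 as an assumption, the achromatic lightness pair (45, 75) is an arbitrary pick from the 30-60 and 60-90 ranges, and the background-dependent optimization step is omitted here (see the black-background sketch above).

```python
# Sketch of the color-matching flow of fig. 10 for one or two identified colors.
import random

def is_chromatic(h, s, b):
    return s >= 15 and b >= 20                    # i.e. not black, white, or gray

def match_one(h, s, b):
    if 0 <= h <= 30 or 330 <= h <= 360:           # red
        return h, max(s - 40, 0), min(b + 10, 100)
    if 45 <= h <= 60:                             # yellow
        return h, max(s - 40, 0), b
    if 75 <= h <= 219:                            # green: shift hue by 60-90
        return (h + random.uniform(60, 90)) % 360, s, b
    return (h + random.uniform(30, 60)) % 360, s, b   # other hues: shift by 30-60

def match_colors(colors):
    """colors: list of (H, S, B), at most two entries handled here."""
    if not any(is_chromatic(*c) for c in colors):
        # achromatic scheme: saturation 0, one lightness in 30-60, one in 60-90,
        # optionally plus a lightness-100 color
        return [(0, 0, 45), (0, 0, 75), (0, 0, 100)]
    return [match_one(*c) if is_chromatic(*c) else c for c in colors]
```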
In the embodiment of the present application, since the number of the target patterns may be multiple, the pattern areas included in each target pattern may also be multiple, and the first electronic device may also fill the pattern areas with the recognized colors in multiple ways, the number of the screen wallpapers finally generated by the first electronic device may include multiple. Multiple screen wallpapers together form a wallpaper set.
In a possible implementation manner of the embodiment of the present application, a plurality of transformation templates may be built into the first electronic device, where each transformation template has a corresponding template style, and the template style may represent the specific style presented by the transformation template. Illustratively, the template style of the transformation template may include a paper-cut style, a vinyl-record style, and the like. The style presented by the paper-cut style transformation template can be the shape of a piece of paper-cut, and the style presented by the record style transformation template can be the shape of a vinyl record. After the first electronic device fills the target pattern with the recognized colors to obtain a wallpaper set including multiple pieces of screen wallpaper, the first electronic device may further perform a secondary transformation on the multiple pieces of screen wallpaper using a built-in transformation template. The style of the screen wallpaper obtained after the secondary transformation can be the same as the style of the transformation template, so that the wallpaper style of the screen wallpaper is consistent with the template style of the transformation template. Illustratively, if the first electronic device transforms the screen wallpaper according to the paper-cut template, the transformed screen wallpaper may be a paper-cut-shaped screen wallpaper.
S305, the first electronic device replaces the screen wallpaper of the target electronic device according to the wallpaper set.
In an embodiment of the present application, the target electronic device may include the first electronic device itself and/or a second electronic device communicatively connected to the first electronic device. The communication connection can comprise various connection modes including Bluetooth, NFC and Wi-Fi.
The first electronic device may then replace the screen wallpaper currently being used by the first electronic device and/or the second electronic device. Illustratively, the first electronic device may use any one piece of wallpaper in the wallpaper set to replace its own screen wallpaper without replacing the screen wallpaper of the second electronic device; or the first electronic device may use any one or more pieces of wallpaper in the wallpaper set to replace the screen wallpaper of one or more second electronic devices without replacing its own screen wallpaper; or the first electronic device may replace both its own screen wallpaper and the screen wallpaper of some or all of the second electronic devices.
It should be noted that, for the second electronic device, the communication connection with the first electronic device may be established before the first electronic device generates the wallpaper set based on the framing image. For example, if the wallpaper of the second electronic device is to be changed, the first electronic device may establish a communication connection with the second electronic device in advance, for example a Bluetooth connection. Then, the first electronic device generates a wallpaper set containing a plurality of pieces of screen wallpaper based on the framing image, where the specifications of some of the screen wallpapers in the wallpaper set can match the screen specification of the second electronic device. For example, the length and width of those screen wallpapers are equal to the length and width of the screen of the second electronic device, or their length-width ratio is equal to the length-width ratio of the screen of the second electronic device. The first electronic device may replace the screen wallpaper currently being used by the second electronic device with any one of those screen wallpapers.
The communication connection between the first electronic device and the second electronic device may also be established after the first electronic device generates the wallpaper set based on the framing image. For example, the first electronic device may generate a wallpaper set including a plurality of screen wallpapers based on the framing image, and the screen wallpapers in the wallpaper set may be of a plurality of specifications; for example, the length-width ratio of some of the screen wallpapers may be 4:3, that of others may be 16:9, and so on. Then, for a second electronic device whose screen wallpaper is to be changed, the first electronic device may establish a communication connection with the second electronic device, for example a Bluetooth connection. The first electronic device can select, from the wallpaper set, a screen wallpaper matching the actual specification of the screen of the second electronic device, and use it to replace the screen wallpaper currently being used by the second electronic device. For example, if the second electronic device is an electronic device with a screen length-width ratio of 16:9, the first electronic device may select a screen wallpaper with a length-width ratio of 16:9 for it. Or, if the second electronic device is a smart watch with a circular dial, the first electronic device may select one of the circular screen wallpapers and replace the dial of the smart watch with the selected screen wallpaper. Of course, in the case where the second electronic device is a smart watch or the like, the first electronic device may also select one or more of the screen wallpapers with a length-width ratio of 4:3 or 16:9 and adapt them to the dial of the smart watch.
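Matching a wallpaper from the generated set to a target device's screen specification can be sketched as below; the Wallpaper and ScreenSpec records and the aspect-ratio tolerance are hypothetical, and the fallback of adapting a 4:3 or 16:9 wallpaper for a circular dial is only noted in a comment.

```python
# Sketch of selecting a wallpaper from the set by the target screen specification.
from dataclasses import dataclass

@dataclass
class Wallpaper:
    width: int
    height: int
    circular: bool = False

@dataclass
class ScreenSpec:
    width: int
    height: int
    circular: bool = False

def pick_wallpaper(wallpapers, spec, tol=0.01):
    for wp in wallpapers:
        if spec.circular and wp.circular:
            return wp
        same_ratio = abs(wp.width / wp.height - spec.width / spec.height) <= tol
        if not spec.circular and same_ratio:
            return wp
    return None        # caller may fall back to adapting a 4:3 or 16:9 wallpaper
```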
In a possible implementation manner of the embodiment of the application, the first electronic device may establish a communication connection with the second electronic device by tapping the devices together (OneHop) and send the generated screen wallpaper to the second electronic device. OneHop is a technology for establishing a communication connection between two electronic devices based on NFC. After the NFC sensing areas of the first electronic device and the second electronic device come into contact, the two devices can be paired automatically, so that the electronic devices can be connected with a single tap. The data transmission established after pairing can be realized by means of WLAN technology. In the wireless transmission state, the file transmission speed can reach 30 MB/s, which greatly improves the efficiency of data transmission.
Fig. 11 is a schematic flow chart of the tap-to-transfer technique according to an embodiment of the present application. Fig. 11 includes a first electronic device and a second electronic device, both of which have various application programs and a tap-to-transfer component (OneHop Kit) installed. When a communication connection is established between the first electronic device and the second electronic device based on the OneHop technology, an application program in the first electronic device may first register a callback interface with the OneHop service after being started. Then, after the first electronic device touches the NFC sensing area of the second electronic device, that is, after the two devices are tapped together, the application program may process the tap event and establish data synchronization between the first electronic device and the second electronic device. The communication connection between the first electronic device and the second electronic device is thus established, and the first electronic device can use the generated screen wallpaper to replace the screen wallpaper currently being used by the second electronic device.
The above describes in detail a specific implementation process of the method for replacing the screen wallpaper of the electronic device according to the embodiment of the present application, and the following describes an implementation effect of replacing the screen wallpaper of the electronic device by using the method with reference to two specific examples.
Example 1
The present example is an example of applying a screen wallpaper generated by a first electronic device to a single second electronic device. Specifically, the present example describes applying a dial (screen wallpaper) generated on a mobile phone (first electronic device) to a smart watch (second electronic device) to achieve the effect of replacing the smart watch dial.
Fig. 12a to fig. 12d are schematic diagrams illustrating an operation flow of replacing the screen wallpaper of an electronic device according to an embodiment of the present application. Fig. 12a includes a cell phone 121 (first electronic device), a smart watch 122 (second electronic device) provided with a dial 122a, and an album 123. In this example, the cell phone 121 may photograph the album 123 to obtain a framing image, and the cell phone 121 may generate a new dial for the smart watch 122 based on the framing image. The user can carry out this process by operating on the cell phone 121. Fig. 12b shows that the cell phone 121 photographs the album 123, obtaining an image 124. The cell phone 121 may crop out a partial area of the image 124 as the framing image 124a, and may generate a new dial for the smart watch 122 based on the framing image 124a. For the process of generating a new dial based on the framing image 124a, reference may be made to the description in the foregoing embodiments, which is not repeated here. The new dials generated by the cell phone 121 based on the framing image 124a may include the dials 125a, 125b, and 125c shown in fig. 12c. The cell phone 121 may present the specific style of each dial to the user on its screen; for example, fig. 12c currently shows the dial 125b. With reference to fig. 12d, after the user selects the dial 125b, the cell phone 121 may replace the original dial 122a of the smart watch 122 with the dial 125b, so as to obtain the dial effect shown in fig. 12d.
Example 2
The present example shows screen wallpapers generated by a first electronic device being applied to the first electronic device itself and to a plurality of second electronic devices. Specifically, the screen wallpaper generated on the mobile phone (first electronic device) is applied to the mobile phone itself (first electronic device) and to a smart watch and an in-vehicle terminal (second electronic devices), so as to replace the mobile phone's own screen wallpaper, the smart watch dial, and the screen wallpaper of the in-vehicle terminal.
Fig. 13a to fig. 13h are schematic diagrams illustrating the generation of screen wallpapers applicable to a plurality of electronic devices according to an embodiment of the present application. As shown in fig. 13a, the viewfinder image 135 displayed on the screen of the mobile phone 131 (first electronic device) may be an image captured by the user in real time using the mobile phone 131, or may be an image selected from an album of the mobile phone 131. The mobile phone 131 may crop an image area 135a, shown in fig. 13b, from the image 135 and generate a target pattern based on the pattern in the image area 135a. In this example, after recognizing and processing the patterns in the image area 135a, the mobile phone 131 may generate a plurality of target patterns as shown in fig. 13c, where a total of 5 target patterns c.1-c.5 are shown. The mobile phone 131 may perform color filling, secondary transformation, and other processing on the 5 target patterns c.1 to c.5 to generate screen wallpapers adaptable to the mobile phone 131, the smart watch, and the in-vehicle terminal (the plurality of second electronic devices). For a smart watch, the wallpaper of its screen is generally called a watch dial.
First, taking the target pattern c.2 as an example, the process by which the mobile phone 131 generates the target pattern c.2 based on the pattern in the image area 135a of the viewfinder image 135 will be described.
As shown in fig. 13d, the mobile phone 131 recognizes the pattern 136, a pattern of lines, from the image area 135a. From the pattern 136, the mobile phone 131 can crop out the pattern 136a for processing. For example, as shown in fig. 13d, the mobile phone 131 may first copy the pattern 136a to obtain the pattern 136b, and then copy the pattern 136b again to obtain the pattern 136c. The mobile phone 131 may not only use the pattern 136c as the dial of the smart watch 132, but also generate, on the basis of the pattern 136c, a screen wallpaper 136d to serve as the lock-screen wallpaper of the mobile phone 131.
The mobile phone 131 may also first rotate the background of the pattern 136 to obtain the pattern 137 shown in fig. 13e, which is likewise a pattern of lines. Similar to the pattern 136, the mobile phone 131 may crop out the pattern 137a from the pattern 137, copy it to obtain the pattern 137b, and then copy the pattern 137b again to obtain the pattern 137c. The mobile phone 131 may not only use the pattern 137c as the dial of the smart watch 132, but also generate, on the basis of the pattern 137c, a screen wallpaper 137d to serve as the lock-screen wallpaper of the mobile phone 131.
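The crop-then-copy steps for patterns 136a to 136c (and 137a to 137c) can be sketched as follows with Pillow. The embodiment only states that the pattern is copied; whether the copies are mirrored or simply repeated is not specified, so mirroring here is one plausible reading, and the file names, crop box, and lock-screen resolution are placeholder assumptions.

```python
# Minimal sketch (not the patented implementation); file names, crop box and
# the lock-screen resolution are placeholders, and mirroring is an assumption.
from PIL import Image, ImageOps

pattern = Image.open("pattern_136.png").convert("RGB")   # pattern recognized from area 135a
strip = pattern.crop((0, 0, pattern.width // 2, pattern.height))      # "pattern 136a"

# First copy: place the strip next to a mirrored copy of itself ("pattern 136b").
doubled = Image.new("RGB", (strip.width * 2, strip.height))
doubled.paste(strip, (0, 0))
doubled.paste(ImageOps.mirror(strip), (strip.width, 0))

# Second copy: stack the result on a vertically flipped copy ("pattern 136c").
motif = Image.new("RGB", (doubled.width, doubled.height * 2))
motif.paste(doubled, (0, 0))
motif.paste(ImageOps.flip(doubled), (0, doubled.height))

motif.save("watch_dial.png")                              # usable as the watch dial
# Scaling the motif to the phone screen gives a lock-screen wallpaper ("136d").
motif.resize((1080, 2340)).save("lock_screen_wallpaper.png")
```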
If screen wallpapers for a plurality of second electronic devices are to be generated on the basis of the pattern 136, the mobile phone 131 may crop, from the pattern 136, a pattern area matching the second electronic device to which the wallpaper is to be applied. For example, the mobile phone 131 may process the pattern 136 differently for its own screen size and for that of the in-vehicle terminal 133. As shown in fig. 13f, for the mobile phone 131, a pattern area matching its screen size may be cropped directly from the pattern 136, and the pattern 136e is taken as the target pattern for generating the screen wallpaper of the mobile phone 131, thereby producing the screen wallpaper 136f. For the in-vehicle terminal 133, the mobile phone 131 may rotate the pattern 136 to obtain the pattern 138, then crop from the pattern 138 a pattern area 138a matching the screen size of the in-vehicle terminal 133, and take the pattern 138a as the target pattern for generating a screen wallpaper usable by the in-vehicle terminal 133, finally producing the screen wallpaper 138b as the screen wallpaper of the in-vehicle terminal 133.
In one possible implementation of this example, when generating a screen wallpaper based on the pattern 138, the mobile phone 131 may also crop out only a portion of the pattern area from the pattern 138. For example, as shown in fig. 13g, the mobile phone 131 may crop a pattern 139 from the pattern 138 and generate a wallpaper for the tablet computer 134 based on the pattern 139: the mobile phone 131 may copy the pattern 139 in a left-right and up-down symmetrical manner to obtain the pattern 139a, and then expand the pattern 139a by translation to obtain a screen wallpaper 139b applicable to the tablet computer 134.
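A minimal sketch of the left-right/up-down symmetric copy followed by translation expansion, assuming Pillow, a placeholder file name for pattern 139, and an illustrative tablet resolution:

```python
# Minimal sketch; the source file and the tablet resolution are placeholders.
from PIL import Image, ImageOps

tile = Image.open("pattern_139.png").convert("RGB")

# Left-right and up-down symmetric copies give a seamless 2x2 cell ("pattern 139a").
cell = Image.new("RGB", (tile.width * 2, tile.height * 2))
cell.paste(tile, (0, 0))
cell.paste(ImageOps.mirror(tile), (tile.width, 0))
cell.paste(ImageOps.flip(tile), (0, tile.height))
cell.paste(ImageOps.flip(ImageOps.mirror(tile)), (tile.width, tile.height))

# Translation expansion: repeat the cell until the tablet screen is covered ("139b").
target_w, target_h = 2560, 1600                 # assumed tablet resolution
wallpaper = Image.new("RGB", (target_w, target_h))
for x in range(0, target_w, cell.width):
    for y in range(0, target_h, cell.height):
        wallpaper.paste(cell, (x, y))
wallpaper.save("tablet_wallpaper.png")
```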
It should be noted that the foregoing examples only describe how the pattern is processed to obtain the target pattern, from which screen wallpapers applicable to multiple electronic devices are then generated. It should be appreciated that when generating wallpaper based on the target pattern, the target pattern needs to be filled with the colors identified from the viewfinder image. In this example, filling the target pattern with the identified colors may include monochrome filling, solid-color filling, and/or gradient filling.
Taking the target pattern c.3 in fig. 13c as an example, the effects obtained with monochrome filling, solid-color filling and gradient filling, respectively, can be as shown in fig. 13h. Here, h.1 shows the pattern regions of the target pattern c.3: the target pattern c.3 includes a pattern region 1, a pattern region 2, and a pattern region 3, and different recognized colors may be used to fill different pattern regions. For example, the pattern region 1 may be filled with a primary color, and the pattern regions 2 and 3 may be filled with a secondary color. In fig. 13h, h.2 shows the effect after the target pattern c.3 is filled in a monochrome manner, h.3 shows the effect after it is filled in a solid-color manner, and h.4 shows the effect after it is filled in a gradient manner.
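The three fill manners can be sketched as follows, assuming Pillow, placeholder region masks (only two regions are shown for brevity), and illustrative primary/secondary colors; reading "monochrome" as one color for all regions and "solid-color" as one flat color per region is one plausible interpretation of the embodiment, not a definition taken from it.

```python
# Minimal sketch; region masks, colors and the canvas size are placeholders.
from PIL import Image

SIZE = (400, 400)
PRIMARY, SECONDARY = (30, 60, 140), (230, 180, 60)   # colors identified from the image


def vertical_gradient(size, top, bottom):
    """Row-by-row linear interpolation between two identified colors."""
    img = Image.new("RGB", size)
    for y in range(size[1]):
        t = y / (size[1] - 1)
        row = tuple(round(a + (b - a) * t) for a, b in zip(top, bottom))
        img.paste(row, (0, y, size[0], y + 1))
    return img


# "L"-mode masks: white pixels mark the pattern region (placeholder files).
region_1 = Image.open("region1_mask.png").convert("L")   # primary region
region_2 = Image.open("region2_mask.png").convert("L")   # secondary region

# Monochrome filling: every region receives the same (primary) color.
mono = Image.new("RGB", SIZE, "white")
for mask in (region_1, region_2):
    mono.paste(PRIMARY, (0, 0, *SIZE), mask)

# Solid-color filling: each region receives its own flat color.
solid = Image.new("RGB", SIZE, "white")
solid.paste(PRIMARY, (0, 0, *SIZE), region_1)
solid.paste(SECONDARY, (0, 0, *SIZE), region_2)

# Gradient filling: the primary region gets a primary-to-secondary gradient.
grad = Image.new("RGB", SIZE, "white")
grad.paste(vertical_gradient(SIZE, PRIMARY, SECONDARY), mask=region_1)
grad.paste(SECONDARY, (0, 0, *SIZE), region_2)
```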
For the generated multiple screen wallpapers, the first electronic device may apply them to the first electronic device itself and to the plurality of second electronic devices, respectively. Fig. 14a to fig. 14b are schematic diagrams illustrating changing the wallpaper of multiple electronic devices according to an embodiment of the present application. Fig. 14a is an interface diagram showing the multiple screen wallpapers presented to the user after the first electronic device 141 has generated them. The interface shown in fig. 14a includes a dial 142 that can be used as the dial of a smart watch. The dial 142 may be generated by performing gradation processing on the pattern in the viewfinder image (processing method 144b in fig. 14a) and using a paper-cut style (dial style 145b in fig. 14a). The user can apply the dial 142 to a smart watch by clicking the control 146a "apply to watch". Alternatively, the user may click the control 146b "apply to multiple devices" to apply the dial 142 to the smart watch while also generating, based on the current viewfinder image, screen wallpapers applicable to multiple electronic devices such as the mobile phone and the in-vehicle terminal. The user may click the control 147a or 147c on the screen of the first electronic device 141 to view the specific pattern of the screen wallpaper generated for the mobile phone or the in-vehicle terminal.

Fig. 14b is a schematic diagram illustrating the effect of applying the generated screen wallpapers to a plurality of electronic devices. The replaced dial of the smart watch 148a is the dial 142 generated in fig. 14a. In fig. 14b, 148b and 148c are the lock-screen wallpaper and desktop wallpaper of the mobile phone, respectively, and may contain different patterns. For example, the lock-screen wallpaper 148b of the mobile phone may include the same pattern as the dial 142, while the pattern 142c in the desktop wallpaper 148c and the pattern 142d in the screen wallpaper 148d of the in-vehicle terminal may be obtained by performing gradation processing (the processing method 144b in fig. 14a) on the viewfinder image. If the user wishes to change the generated wallpaper, the user can select a new viewfinder image by clicking the control 143 "re-view" in fig. 14a, or can change the way in which the screen wallpaper is generated, for example by clicking a control such as 144a or 144c in fig. 14a to modify the pattern shape in the generated screen wallpaper. For the dial of the smart watch, the dial style can be modified by clicking a control such as 145a, 145c or 145d in fig. 14a, and a new dial is then regenerated.
According to the embodiment of the application, the pattern and the color in the viewfinder image are identified, and a new target pattern can be generated in a plurality of ways on the basis of the pattern in the original viewfinder image. By filling the target pattern with the recognized color, the screen wallpaper applicable to various types of electronic equipment can be obtained, and compared with the screen wallpaper obtained by only modifying the filling color of the built-in wallpaper template in the prior art, the method for changing the screen wallpaper of the electronic equipment provided by the embodiment of the application greatly enriches the diversity of the screen wallpaper.
In the embodiment of the present application, the electronic device may be divided into functional modules according to the above method examples; for example, one functional module may be provided for each function, or one or more functions may be integrated into a single functional module. An integrated module may be implemented in hardware or as a software functional module. It should be noted that the module division in this embodiment is schematic and is only one kind of logical function division; other division manners may be used in actual implementation. The following description takes the case in which one functional module is provided for each function as an example.
Corresponding to the foregoing embodiments, fig. 15 shows a block diagram of a device for replacing the screen wallpaper of an electronic device according to an embodiment of the present application. The device may be applied to the first electronic device in the foregoing embodiments and may specifically include the following modules (a minimal sketch of this module division is given after the list): an acquisition module 1501, an identification module 1502, a generation module 1503, a filling module 1504, and a replacement module 1505; wherein:
an acquisition module 1501 configured to acquire a viewfinder image;
an identification module 1502, configured to identify patterns and colors in the viewfinder image;
a generation module 1503, configured to generate a target pattern based on the pattern;
a filling module 1504, configured to fill the target pattern with the colors to obtain a wallpaper set including multiple screen wallpapers;
a replacement module 1505, configured to replace a screen wallpaper of a target electronic device according to the wallpaper set, where the target electronic device includes the first electronic device and/or a second electronic device communicatively connected to the first electronic device.
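The module division above can be summarized in a minimal sketch; the class and method names are illustrative only, and the bodies are placeholders rather than the patented implementation.

```python
# Illustrative skeleton only; method bodies are placeholders, not the patented logic.
class WallpaperReplacementDevice:
    """Mirrors the module division of fig. 15 (modules 1501-1505)."""

    def acquire(self):                                   # acquisition module 1501
        """Acquire a viewfinder image."""
        raise NotImplementedError

    def identify(self, viewfinder_image):                # identification module 1502
        """Identify patterns and colors in the viewfinder image."""
        raise NotImplementedError

    def generate(self, pattern):                         # generation module 1503
        """Generate a target pattern based on the identified pattern."""
        raise NotImplementedError

    def fill(self, target_pattern, colors):              # filling module 1504
        """Fill the target pattern with the colors; return a wallpaper set."""
        raise NotImplementedError

    def replace(self, wallpaper_set, target_devices):    # replacement module 1505
        """Replace the screen wallpaper of the target electronic device(s)."""
        raise NotImplementedError
```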
In the embodiment of the present application, the viewfinder image may include a complete image or an image area within the complete image, and the image area may be a regular image area or an irregular image area.
In the embodiment of the present application, the pattern recognized by the first electronic device from the viewfinder image may include a basic pattern or a complex pattern in the viewfinder image. The basic pattern may include at least one of the following: a line pattern, a circle pattern, a square pattern, or a rectangle pattern. The complex pattern may include at least one of the following: scalable vector graphics, bitmaps, or patterns based on combinations of the basic patterns.
In this embodiment, the target pattern may include a pattern obtained by the first electronic device combining the patterns identified from the viewfinder image according to a preset composition rule, where the preset composition rule includes at least one of the following: planar symmetric patterning, stacked patterning, translational patterning, rotational patterning, or kaleidoscope patterning.
In this embodiment, the color recognized by the first electronic device from the viewfinder image may include a pattern color in the viewfinder image or an optimized color obtained by processing the pattern color, and the processing of the pattern color by the first electronic device may include at least one of: merging similar pattern colors, and adjusting color values of the pattern colors, wherein the color values comprise hue, saturation and lightness.
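The two optimization operations, merging similar pattern colors and adjusting hue, saturation, and lightness, can be sketched as follows; the merge threshold, the adjustment amounts, and the sample colors are placeholder assumptions.

```python
# Minimal sketch; threshold, adjustment amounts and sample colors are placeholders.
import colorsys


def merge_similar(colors, threshold=30):
    """Greedily merge RGB colors whose channel-wise difference stays below threshold."""
    merged = []
    for color in colors:
        for i, kept in enumerate(merged):
            if max(abs(a - b) for a, b in zip(color, kept)) < threshold:
                merged[i] = tuple((a + b) // 2 for a, b in zip(color, kept))  # average
                break
        else:
            merged.append(color)
    return merged


def adjust_hsl(rgb, dh=0.0, ds=0.10, dl=0.05):
    """Shift hue and nudge saturation/lightness, clamping each value to [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(*(c / 255 for c in rgb))
    h = (h + dh) % 1.0
    s = min(1.0, max(0.0, s + ds))
    l = min(1.0, max(0.0, l + dl))
    return tuple(round(c * 255) for c in colorsys.hls_to_rgb(h, l, s))


palette = merge_similar([(30, 60, 140), (33, 62, 143), (230, 180, 60)])
optimized = [adjust_hsl(c) for c in palette]
print(optimized)   # e.g. two optimized colors instead of three raw pattern colors
```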
In this embodiment of the application, wallpaper templates may be built into the first electronic device, each wallpaper template having a preset template complexity, and the generation module 1503 may be configured to: determine the image complexity of the viewfinder image according to the pattern in the viewfinder image; and acquire a target wallpaper template whose template complexity is the same as the image complexity, and take the pattern contained in the target wallpaper template as the target pattern.
In this embodiment of the application, each wallpaper template may further have a preset template category, and the generation module 1503 may further be configured to: determine the image category of the pattern in the viewfinder image; and acquire a target wallpaper template whose template category is the same as the image category, and take the pattern contained in the target wallpaper template as the target pattern.
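Selecting a built-in wallpaper template whose preset complexity (and, optionally, category) matches the viewfinder image can be sketched as follows; the edge-density measure of image complexity, the template table, and the thresholds are illustrative assumptions, since the embodiment does not prescribe how complexity is computed.

```python
# Minimal sketch; the edge-density proxy, thresholds and template table are assumptions.
from typing import Optional

from PIL import Image, ImageFilter

TEMPLATES = [
    {"name": "simple_lines", "complexity": "low", "category": "line"},
    {"name": "dense_geometry", "complexity": "high", "category": "geometric"},
]


def image_complexity(img: Image.Image) -> str:
    """Rough proxy: the fraction of edge pixels in the viewfinder image."""
    edges = img.convert("L").filter(ImageFilter.FIND_EDGES)
    ratio = sum(1 for p in edges.getdata() if p > 40) / (img.width * img.height)
    return "high" if ratio > 0.15 else "low"


def pick_template(img: Image.Image, category: Optional[str] = None) -> dict:
    complexity = image_complexity(img)
    for template in TEMPLATES:
        if template["complexity"] == complexity and (
            category is None or template["category"] == category
        ):
            return template
    return TEMPLATES[0]   # fall back to the first built-in template


chosen = pick_template(Image.open("viewfinder.png"), category="line")
print(chosen["name"])     # the pattern of this template becomes the target pattern
```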
In the embodiment of the present application, there may be a plurality of target patterns, each of which may include a plurality of pattern regions, and the plurality of pattern regions may include a primary region and a secondary region; the colors recognized by the first electronic device from the viewfinder image may likewise be plural and may include a primary color and a secondary color, where the primary color may be used to fill the primary region and the secondary color may be used to fill the secondary region, so as to obtain multiple pieces of screen wallpaper.
In an embodiment of the present application, the apparatus may further include a secondary transformation module, which may be configured to: perform a secondary transformation on the multiple screen wallpapers by using a built-in transformation template to obtain the transformed multiple screen wallpapers.
In this embodiment of the present application, the transformation template may have a corresponding template style, and the secondary transformation module may specifically be configured to: transform the multiple screen wallpapers according to the template style, so that the wallpaper styles of the transformed screen wallpapers are consistent with the template style of the transformation template.
In an embodiment of the present application, the apparatus may further include an adjusting module, which may be configured to: determine the screen specification of each target electronic device, where the screen specification may include the screen shape or the length-width ratio of the target electronic device; and adjust the multiple screen wallpapers according to the screen specification, so that the adjusted wallpaper specification of each screen wallpaper matches the screen specification of the corresponding target electronic device, where the wallpaper specification may include the wallpaper shape or the length-width ratio of the screen wallpaper.
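Adapting a generated wallpaper to each target device's screen specification (length-width ratio, plus a circular mask for a round watch face) can be sketched as follows, assuming Pillow and illustrative screen resolutions:

```python
# Minimal sketch; device names and screen resolutions are illustrative placeholders.
from PIL import Image, ImageDraw, ImageOps

SCREENS = {"phone": (1080, 2340), "car_terminal": (1920, 720), "watch": (466, 466)}


def fit_to_screen(wallpaper: Image.Image, size, circular: bool = False) -> Image.Image:
    # Center-crop to the target length-width ratio, then resize to the target size.
    out = ImageOps.fit(wallpaper, size)
    if circular:                                   # round watch face
        mask = Image.new("L", size, 0)
        ImageDraw.Draw(mask).ellipse((0, 0, *size), fill=255)
        out.putalpha(mask)                         # transparent outside the circle
    return out


source = Image.open("generated_wallpaper.png").convert("RGB")
for device, size in SCREENS.items():
    adjusted = fit_to_screen(source, size, circular=(device == "watch"))
    adjusted.save(f"{device}_wallpaper.png")
```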
In the embodiment of the present application, there may be a plurality of second electronic devices; the replacement module 1505 may be configured to: respectively determine the screen wallpapers matching the screen specifications of the first electronic device and of the plurality of second electronic devices; replace the screen wallpaper currently used by the first electronic device with the screen wallpaper matching its own screen specification; and send the screen wallpapers matching the screen specifications of the plurality of second electronic devices to the corresponding second electronic devices, so as to instruct each second electronic device to replace its currently used screen wallpaper with the received screen wallpaper.
It should be noted that all relevant contents of each step related to the method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The present application further provides an electronic device, which may be the first electronic device or the second electronic device in the foregoing embodiments. The electronic device may include a memory, a processor, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the method for replacing the screen wallpaper of an electronic device in the foregoing embodiments is implemented.
The embodiment of the present application further provides a computer-readable storage medium, where computer instructions are stored in the computer-readable storage medium, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the above related method steps to implement the method for changing the wallpaper of the screen of the electronic device in the above embodiments.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps, so as to implement the method for replacing the wallpaper of the screen of the electronic device in the above embodiments.
The embodiment of the present application further provides a chip, which may be a general-purpose processor or a special-purpose processor. The chip includes a processor. The processor is used for supporting the electronic device to execute the relevant steps so as to realize the method for replacing the screen wallpaper of the electronic device in the embodiments.
Optionally, the chip further includes a transceiver, where the transceiver operates under the control of the processor and is configured to support the electronic device in performing the relevant steps, so as to implement the method for replacing the screen wallpaper of an electronic device in the foregoing embodiments.
Optionally, the chip may further include a storage medium.
It should be noted that the chip may be implemented by using the following circuits or devices: one or more field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gate logic, discrete hardware components, any other suitable circuitry, or any combination of circuits capable of performing the various functions described throughout this application.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the present application should be covered within the scope of the present application.

Claims (15)

1. A method for replacing screen wallpaper of an electronic device, applied to a first electronic device, the method comprising:
the first electronic device acquires a viewfinder image;
the first electronic device recognizes patterns and colors in the viewfinder image;
the first electronic device generates a target pattern based on the pattern, fills the target pattern with the color, and obtains a wallpaper set containing a plurality of screen wallpapers;
the first electronic device replaces screen wallpaper of a target electronic device according to the wallpaper set, wherein the target electronic device comprises the first electronic device and/or a second electronic device in communication connection with the first electronic device.
2. The method of claim 1, wherein the viewfinder image comprises a full image or an image area in the full image, the image area comprising a regular image area or an irregular image area.
3. The method of claim 1 or 2, wherein the pattern comprises a basic pattern or a complex pattern in the viewfinder image, the basic pattern comprising at least one of: a line pattern, a circle pattern, a square pattern, or a rectangle pattern, and the complex pattern comprising at least one of: scalable vector graphics, bitmaps, or patterns based on combinations of the basic patterns.
4. The method according to claim 3, wherein the target pattern comprises a pattern obtained by combining the patterns by the first electronic device according to a preset composition rule, and the preset composition rule comprises at least one of the following: planar symmetric patterning, stacked patterning, translational patterning, rotational patterning, or kaleidoscope patterning.
5. The method of any of claims 1-4, wherein the color comprises a pattern color in the viewfinder image or an optimized color resulting from processing the pattern color, the processing comprising at least one of: merging similar pattern colors, adjusting color values of the pattern colors, the color values including hue, saturation and lightness.
6. The method according to any one of claims 1-5, wherein wallpaper templates are built into the first electronic device, each wallpaper template having a preset template complexity, and the first electronic device generating a target pattern based on the pattern comprises:
the first electronic equipment determines the image complexity of the viewfinder image according to the pattern;
and the first electronic equipment acquires a target wallpaper template with the template complexity being the same as the image complexity, and takes a pattern contained in the target wallpaper template as the target pattern.
7. The method according to any one of claims 1-5, wherein wallpaper templates are built into the first electronic device, each wallpaper template having a preset template category, and the first electronic device generating a target pattern based on the pattern comprises:
the first electronic device determining an image category of the pattern;
and the first electronic equipment acquires a target wallpaper template with the template type being the same as the image type, and takes a pattern contained in the target wallpaper template as the target pattern.
8. The method according to any one of claims 1 to 7, wherein the number of the target patterns comprises a plurality, the plurality of target patterns respectively comprise a plurality of pattern regions, and the plurality of pattern regions comprise a primary region and a secondary region; the number of the colors comprises a plurality, the colors comprise a primary color and a secondary color, the primary color is used for filling the primary region, and the secondary color is used for filling the secondary region, so as to obtain the plurality of screen wallpapers.
9. The method according to any one of claims 1-7, wherein after the first electronic device generates a target pattern based on the pattern and fills the target pattern with the color, resulting in a wallpaper set comprising multiple screen wallpapers, further comprising:
and the first electronic device performs a secondary transformation on the plurality of screen wallpapers by using a built-in transformation template to obtain the plurality of transformed screen wallpapers.
10. The method as claimed in claim 9, wherein the transformation template has a corresponding template style, and the performing, by the first electronic device, a secondary transformation on the plurality of screen wallpapers by using a built-in transformation template to obtain a plurality of transformed screen wallpapers comprises:
and the first electronic device transforms the screen wallpapers according to the template style, so that the wallpaper styles of the transformed screen wallpapers are consistent with the template style of the transformation template.
11. The method according to any one of claims 1-10, further comprising, before the first electronic device replaces the screen wallpaper of the target electronic device according to the set of wallpapers:
the first electronic equipment determines the screen specification of each target electronic equipment, wherein the screen specification comprises the screen shape or the length-width ratio of the target electronic equipment;
the first electronic device adjusts the multiple pieces of screen wallpaper according to the screen specification, so that the adjusted wallpaper specification of the screen wallpaper is matched with the screen specification of each target electronic device, and the wallpaper specification comprises the wallpaper shape or the length-width ratio of the screen wallpaper.
12. The method of claim 11, wherein the number of second electronic devices comprises a plurality, and the first electronic device replacing the screen wallpaper of the target electronic device according to the wallpaper set comprises:
the first electronic device respectively determines screen wallpapers matching the screen specifications of the first electronic device and of the plurality of second electronic devices;
the first electronic device replaces its currently used screen wallpaper with the screen wallpaper matching its own screen specification; and
the first electronic device sends the screen wallpapers matching the screen specifications of the plurality of second electronic devices to the corresponding second electronic devices respectively, so as to instruct the plurality of second electronic devices to replace the screen wallpaper currently used by each second electronic device with the received screen wallpaper.
13. An electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for replacing screen wallpaper of an electronic device according to any one of claims 1 to 12.
14. A computer readable storage medium having stored thereon computer instructions, which, when run on an electronic device, cause the electronic device to implement the method of replacing a wallpaper on a screen of an electronic device as claimed in any one of claims 1 to 12.
15. A computer program product which, when run on a computer, causes the computer to carry out the method for replacing screen wallpaper of an electronic device according to any one of claims 1 to 12.
CN202111156115.6A 2021-09-29 2021-09-29 Method and device for replacing screen wallpaper of electronic equipment and electronic equipment Pending CN115878007A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111156115.6A CN115878007A (en) 2021-09-29 2021-09-29 Method and device for replacing screen wallpaper of electronic equipment and electronic equipment
PCT/CN2022/119922 WO2023051320A1 (en) 2021-09-29 2022-09-20 Method and apparatus for changing screen wallpaper of electronic device and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111156115.6A CN115878007A (en) 2021-09-29 2021-09-29 Method and device for replacing screen wallpaper of electronic equipment and electronic equipment

Publications (1)

Publication Number Publication Date
CN115878007A true CN115878007A (en) 2023-03-31

Family

ID=85756504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111156115.6A Pending CN115878007A (en) 2021-09-29 2021-09-29 Method and device for replacing screen wallpaper of electronic equipment and electronic equipment

Country Status (2)

Country Link
CN (1) CN115878007A (en)
WO (1) WO2023051320A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110304636A1 (en) * 2010-06-14 2011-12-15 Acer Incorporated Wallpaper image generation method and portable electric device thereof
EP2996037B1 (en) * 2014-03-28 2017-10-04 Huawei Device Co., Ltd. Method and apparatus for determining color of interface control, and terminal device
DK179412B1 (en) * 2017-05-12 2018-06-06 Apple Inc Context-Specific User Interfaces
WO2020242882A1 (en) * 2019-05-31 2020-12-03 Apple Inc. Device, method, and graphical user interface for updating a background for home and wake screen user interfaces

Also Published As

Publication number Publication date
WO2023051320A1 (en) 2023-04-06

Similar Documents

Publication Publication Date Title
CN110072070B (en) Multi-channel video recording method, equipment and medium
EP3968133A1 (en) Air-mouse mode implementation method and related device
CN111327814A (en) Image processing method and electronic equipment
WO2020134877A1 (en) Skin detection method and electronic device
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN110248037B (en) Identity document scanning method and device
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
CN113810600A (en) Terminal image processing method and device and terminal equipment
CN114466107A (en) Sound effect control method and device, electronic equipment and computer readable storage medium
CN114089932A (en) Multi-screen display method and device, terminal equipment and storage medium
CN110138999B (en) Certificate scanning method and device for mobile terminal
CN115129410A (en) Desktop wallpaper configuration method and device, electronic equipment and readable storage medium
CN113490291B (en) Data downloading method and device and terminal equipment
CN113535284A (en) Full-screen display method and device and electronic equipment
CN114500901A (en) Double-scene video recording method and device and electronic equipment
CN112272191B (en) Data transfer method and related device
CN112099741B (en) Display screen position identification method, electronic device and computer readable storage medium
CN114756184A (en) Collaborative display method, terminal device and computer-readable storage medium
CN113518189A (en) Shooting method, shooting system, electronic equipment and storage medium
CN114302063B (en) Shooting method and equipment
WO2023051320A1 (en) Method and apparatus for changing screen wallpaper of electronic device and electronic device
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN114661258A (en) Adaptive display method, electronic device, and storage medium
CN116782023A (en) Shooting method and electronic equipment
CN114844542A (en) Antenna selection method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination