CN111093025A - Image processing method and electronic equipment - Google Patents

Image processing method and electronic equipment

Info

Publication number
CN111093025A
Authority
CN
China
Prior art keywords
image
target
electronic device
module
determining
Prior art date
Legal status
Granted
Application number
CN201911398115.XA
Other languages
Chinese (zh)
Other versions
CN111093025B (en)
Inventor
泮婕
周珊
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911398115.XA
Publication of CN111093025A
Application granted
Publication of CN111093025B
Active legal status
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/62 Control of parameters via user interfaces
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Abstract

The embodiment of the invention provides an image processing method and electronic equipment, relates to the technical field of communication, and aims to solve the problem that the operation of the electronic device is complex and not convenient enough in the process of processing a food image. The method comprises the following steps: acquiring a first image, wherein the first image comprises a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determining a target recommendation object according to the first object; and replacing the second object with the target recommended object to obtain a target image.

Description

Image processing method and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image processing method and electronic equipment.
Background
With the development of communication technology, users have higher and higher requirements on convenience of operation of electronic equipment.
At present, the image processing functions that an electronic device provides to a user mainly target landscapes, people, and the like; for food, usually only a filter can be added. If the user needs to perform other processing on a food image, the user must have professional image processing skills to obtain a food image with a good effect, so the operation of the electronic device in the process of processing a food image is complex and not convenient enough.
Disclosure of Invention
The embodiment of the invention provides an image processing method and electronic equipment, and aims to solve the problem that the operation of the electronic device is complex and not convenient enough in the process of processing a food image.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, where the method includes: acquiring a first image, wherein the first image comprises a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determining a target recommendation object according to the first object; and replacing the second object with the target recommended object to obtain a target image.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes: the device comprises an acquisition module, a determination module and a replacement module; the acquisition module is used for acquiring a first image, wherein the first image comprises a first object of a food type and a second object of a preset type, and the preset type is different from the food type; the determining module is used for determining a target recommendation object according to the first object; the replacing module is used for replacing the second object with the target recommending object to obtain a target image.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, and when executed by the processor, the computer program implements the steps of the image processing method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image processing method according to the first aspect.
In the embodiment of the invention, the electronic equipment acquires a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determines a target recommendation object according to the first object; and replaces the second object with the target recommended object to obtain a target image. Since the preset type of object may include tableware, a dining table, a dining chair, a tablecloth, a curtain, other backgrounds, and the like, even when a user cannot perform fine plating, decoration, and the like because the scene is limited during processing of the food image, the tableware, dining table, dining chair, tablecloth, curtain, other backgrounds, and the like in the image can still be replaced without being restricted by the environmental materials at hand (such as the table, tablecloth, and tableware). Compared with the traditional image processing mode, the user therefore has more choices for beautifying the food image, which is also more interesting.
Drawings
FIG. 1 is a block diagram of a possible operating system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a possible structure of an electronic device according to an embodiment of the present invention;
fig. 4 is a second schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 5 is a hardware schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or", for example, A/B may mean A or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. "plurality" means two or more than two.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first image and the second image, etc. are for distinguishing different images, rather than for describing a particular order of the images.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The electronic device in the embodiment of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following describes a software environment to which the image processing method according to the embodiment of the present invention is applied, by taking the operating system shown in fig. 1 as an example.
Fig. 1 is a schematic diagram of a possible operating system according to an embodiment of the present invention. In fig. 1, the architecture of the operating system includes 4 layers, respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application layer comprises various application programs (including system application programs and third-party application programs) in an operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes a library (also referred to as a system library) and an operating system runtime environment. The library mainly provides various resources required by the operating system. The operating system runtime environment is used to provide a software environment for the operating system.
The kernel layer is the operating system layer of the operating system and belongs to the lowest layer of the operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the operating system based on the Linux kernel.
Taking the operating system in fig. 1 as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image processing method provided in the embodiment of the present invention based on the system architecture of the operating system shown in fig. 1, so that the image processing method may run based on the operating system shown in fig. 1. That is, the processor or the electronic device may implement the image processing method provided by the embodiment of the present invention by running the software program in the operating system.
An image processing method according to an embodiment of the present invention will be described with reference to fig. 2. Fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention, and as shown in fig. 2, the image processing method includes steps S201 to S203:
S201, the electronic equipment acquires a first image, wherein the first image comprises a first object of a food type and a second object of a preset type.
Wherein the preset type is different from the food type.
Illustratively, the preset type of object may include tableware, a table, a dining chair, a tablecloth, a curtain, other backgrounds, and the like.
It is to be understood that the first image may include a plurality of objects of different food types, and the first object is an object of one of the food types, which is not particularly limited in this embodiment of the present invention.
In this embodiment of the present invention, the food type may be a food type preset by the electronic device, or may also be a food type set by a user, which is not specifically limited in this embodiment of the present invention.
For example, the food types may include: fruits, snacks, dishes, staple foods, etc.
Optionally, in this embodiment of the present invention, the first image may be an image acquired in real time when the user opens a camera of the electronic device, or may be an image selected by the user from images that have already been captured.
S202, the electronic equipment determines a target recommendation object according to the first object.
S203, the electronic equipment replaces the second object with the target recommendation object to obtain a target image.
Optionally, the second object may be one object or a plurality of objects, which is not specifically limited in this embodiment of the present invention.
In the embodiment of the invention, the replacement may be performed directly after the shooting is finished, or may be performed according to the user's selection or a random selection.
For example, the user may replace at least one of the shape, color, and material of a plate with one input, and may also replace the material of the table, the color of the tablecloth, and other background elements, such as lamps, wallpaper color, and curtains, with one input.
It can be understood that the user may trigger the electronic device to replace different second objects in the first image as needed, and after the replacement is completed, the user may trigger the electronic device to save the target image obtained after the replacement.
According to the image processing method provided by the embodiment of the invention, the electronic equipment acquires a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determines a target recommendation object according to the first object; and replaces the second object with the target recommended object to obtain a target image. Since the preset type of object may include tableware, a dining table, a dining chair, a tablecloth, a curtain, other backgrounds, and the like, even when a user cannot perform fine plating, decoration, and the like because the scene is limited during processing of the food image, the user can still replace the tableware, dining table, dining chair, tablecloth, curtain, other backgrounds, and the like in the image in a targeted manner according to the user's needs, without being restricted by the environmental materials at hand (such as the table and tablecloth).
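To make the S201 to S203 flow easier to follow, the following Python sketch outlines the three steps. The detector, recommender and compositor objects, their method names, and the DetectedObject structure are hypothetical placeholders for illustration only; they are not interfaces defined by this disclosure.

# Minimal sketch of the S201-S203 flow. The helper objects and their methods
# are hypothetical placeholders for the detection, recommendation and
# compositing steps described in the text.
from dataclasses import dataclass
from typing import Any, List

@dataclass
class DetectedObject:
    label: str      # e.g. "fruit", "plate", "tablecloth"
    category: str   # "food" or a preset type such as "tableware" / "background"
    mask: Any       # pixel mask of the object

def process_food_image(first_image, detector, recommender, compositor):
    # S201: acquire the first image and locate the food object (first object)
    # and the preset-type objects (candidate second objects) it contains.
    objects: List[DetectedObject] = detector.detect(first_image)
    first_object = next(o for o in objects if o.category == "food")
    second_objects = [o for o in objects if o.category != "food"]

    # S202: determine a target recommended object according to the first object.
    target = recommender.recommend(first_object)

    # S203: replace the second object(s) with the target recommended object
    # to obtain the target image.
    target_image = first_image
    for second_object in second_objects:
        target_image = compositor.replace(target_image, second_object, target)
    return target_image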
Optionally, after the above S201, the image processing method according to the embodiment of the present invention may further include the following S204:
S204, the electronic equipment displays the M pieces of prompt information.
Each prompt message prompts that a third object in the first image can be replaced, M is a positive integer, and each third object is an object of one of preset types.
For example, the form of the prompt message may be a text form or a graphic form, which is not specifically limited in this embodiment of the present invention.
Further, the above S203 may be performed by the following S203a and S203b:
S203a, the electronic device receives a first input of a user.
The first input is input of selecting a target recommendation object by a user.
Optionally, the first input may be an input for triggering the electronic device to randomly select the replacement object, or an input for the user to select the replacement object by himself.
S203b, the electronic device responds to the first input, replaces the second object with the target recommendation object, and obtains a target image.
The second object is a third object prompted by the target prompt message, and the target prompt message is at least one prompt message in the M prompt messages.
Based on the scheme, the user can select the replaceable third object by referring to the M pieces of prompt information displayed by the electronic equipment, and the electronic equipment can replace the second object with the target recommended object selected by the user according to the first input of the user, so as to obtain the target image, which makes the replacement more flexible. Compared with the traditional image processing mode, the user has more choices for beautifying the food image, which is also more interesting.
Optionally, before the foregoing S204, the image processing method provided in the embodiment of the present invention may further include the following S205:
S205, the electronic equipment determines that M third objects are included in the first image.
For example, the electronic device may determine the food, tableware, background, and the like in the image based on a semantic segmentation network, or based on another classification model trained on massive data.
It should be noted that, the M objects may be related to the first object (for example, tableware on which the first object is placed), or may be unrelated to the first object (for example, wallpaper, a lamp, and the like), and this is not particularly limited in the embodiment of the present invention.
In particular, the electronic device may determine a third object associated with the first object based on the food information of the first object.
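A minimal sketch of such a segmentation step is given below, assuming a DeepLabV3 network fine-tuned on three food-scene classes (background, food, tableware); the checkpoint path, the class list, and the preprocessing values are assumptions for illustration, not details given by this disclosure.

# Sketch of segmenting a food photo into background / food / tableware regions.
# Assumes a DeepLabV3 model fine-tuned on these three classes; the checkpoint
# file and the class list are hypothetical.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

CLASSES = ["background", "food", "tableware"]   # assumed training classes

model = deeplabv3_resnet50(num_classes=len(CLASSES))
model.load_state_dict(torch.load("food_scene_segmenter.pth", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def segment(image_path: str) -> torch.Tensor:
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # (1, 3, H, W)
    with torch.no_grad():
        logits = model(batch)["out"]                # (1, len(CLASSES), H, W)
    return logits.argmax(dim=1).squeeze(0)          # per-pixel class index map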
Further, in the embodiment of the present invention, the above S204 may be specifically executed by the following S204a:
S204a, the electronic device displays a piece of prompt information on each of the M third objects in the first image.
For example, the electronic device may use different colors as the prompt information, and the electronic device may display a red prompt box on 3 plates in the first image and a green prompt box on the tablecloth area.
Optionally, in the embodiment of the present invention, the electronic device may also mark the food in the first image to prompt the user to replace the matching of different foods.
Specifically, after food identification is completed, the electronic device can acquire food information such as the color, shape and dish style of the food, then store the food information, and acquire recommended collocation information according to the food information.
Based on the scheme, after acquiring the first image, the electronic equipment can first determine the M third objects included in the first image, and then display prompt information on each of the M third objects, so that the user can see in the interface which objects can be replaced, which makes the user's processing of the food image simpler and more convenient.
Optionally, in the image processing method provided in the embodiment of the present invention, the above S202 may specifically be executed by S202a:
S202a, the electronic equipment determines the target recommendation object according to at least one of the color, the shape and the food type of the first object.
For example, if the color of the dish (the first object) is color 1, an object of color 2 may be determined as the target recommendation object, where the collocation of color 1 and color 2 ranks first in praise amount or is the most frequently recommended.
Based on the scheme, the electronic equipment can determine the target recommended object for the first object by combining at least one of the color, the shape, and the type of the first object, and can obtain the recommended object from multiple perspectives of the first object, so the way of obtaining the recommended object is more flexible and the user experience during processing of the food image is better.
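As an illustration of S202a, the toy sketch below picks the collocation partner whose pairing with the first object's color has the highest praise amount; the lookup table, the color names, and the praise counts are invented for this example and are not data from the disclosure.

# Toy illustration of S202a: choose the collocation partner whose pairing with
# the first object's colour has the highest praise amount.
COLLOCATION_PRAISE = {
    # (first object colour, recommended object colour): praise amount
    ("color 1", "color 2"): 1520,
    ("color 1", "white"): 980,
    ("color 3", "black"): 760,
}

def recommend_by_colour(first_object_colour):
    candidates = [(partner, praise)
                  for (colour, partner), praise in COLLOCATION_PRAISE.items()
                  if colour == first_object_colour]
    if not candidates:
        return None
    # The pairing ranked first by praise amount becomes the target recommendation.
    return max(candidates, key=lambda item: item[1])[0]

print(recommend_by_colour("color 1"))   # -> color 2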
Optionally, in the image processing method provided in the embodiment of the present invention, the above S202 may specifically be executed by S202b:
S202b, the electronic device determines a target recommendation object according to the first object and at least one of container collocation, background collocation and ornament collocation.
It can be understood that, in embodiments of the invention, the container is used to hold food, and may be, for example, a cup, a bowl, a dish, a pan, or the like.
For example, assuming that the first object is fruit and that the container most commonly used to hold the fruit is a transparent glass bowl, the bowl may be determined as the target recommended object. Assuming that the carving most commonly matched with the fruit is carving 1, carving 1 can be used as the target recommendation object.
Based on the scheme, the electronic equipment can determine the target recommendation object for the first object by combining the first object with at least one of container collocation, background collocation and ornament collocation, and can obtain the recommended object from the perspective of multiple collocations, so the way of obtaining the recommended object is more flexible and the user experience during processing of the food image is better.
Optionally, after the above S201, the image processing method according to the embodiment of the present invention may further include the following S206:
S206, the electronic equipment displays N types of replacement options.
Each type in the N types is one of preset types, each type in the N types corresponds to at least one replacement option, N is a positive integer, and N is less than or equal to M.
For example, for a plate in the tableware, the replacement options may include the shape of the plate, such as round, oval, square, or diamond, and may include the material of the plate, such as ceramic, wood, or metal. For the dining table and chairs, the replacement options may include materials such as wood, glass, or metal.
In an embodiment of the present invention, the first input may be a selection input of a user for a replacement option, for example, when the user clicks the replacement option 1, the electronic device displays the replacement option 1 at a corresponding position.
Optionally, the electronic device may display N types of replacement options when displaying M pieces of prompt information, or the electronic device may display N types of replacement options when a user selects an object corresponding to one piece of prompt information, which is not specifically limited in this embodiment of the present invention.
It should be noted that the N types of replacement options may respectively correspond to one piece of prompt information, for example, the first image includes 3 disks, and the electronic device may respectively display different replacement options for the 3 disks.
Optionally, the at least one option corresponding to each type may be selected by the electronic device according to a preset rule.
Specifically, the electronic device may determine the color, shape, dish style, and the like of the first object after acquiring the first image, and may then acquire replacement options complying with a preset rule according to the food information of the first object. For example, the plate in the plate-and-background collocation with the largest amount of praise may be used as the replacement option for the plate of the first object, and the background in that collocation may be used as the replacement option for the background of the first object.
Optionally, the at least one replacement option is the J options with the highest scores among the replacement options of one type, or the L options with the highest target frequency, where J and L are positive integers.
The target frequency may be a praise frequency, a repetition frequency, or the like.
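A small sketch of this selection follows, under the assumption that each replacement option carries a score and a praise frequency; the option names and numbers are illustrative only.

# Sketch of choosing which replacement options to show for one type: the J
# options with the highest score, or the L options with the highest target
# frequency (for example a praise frequency). The option data is illustrative.
def top_options(options, key, count):
    # options: list of dicts such as {"name": ..., "score": ..., "praise_freq": ...}
    return sorted(options, key=lambda option: option[key], reverse=True)[:count]

plate_options = [
    {"name": "round ceramic plate", "score": 4.8, "praise_freq": 310},
    {"name": "square wooden plate", "score": 4.5, "praise_freq": 520},
    {"name": "oval metal plate",    "score": 4.1, "praise_freq": 150},
]

J, L = 2, 1
by_score = top_options(plate_options, "score", J)            # top-J options by score
by_frequency = top_options(plate_options, "praise_freq", L)  # top-L options by praise frequency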
Optionally, after the above S206, in the image processing method according to the embodiment of the present invention, the above S204 may specifically include the following S204a:
S204a, the electronic device responds to the first input, and randomly replaces the collocation information of the first object based on the N types of replacement options.
The first input is an input of the user shaking the electronic device, for example, a gravity-sensing replacement or a shake-to-replace operation.
In an embodiment of the present invention, the collocation information of the first object may include a preset type of object for the first object.
Based on the scheme, the electronic equipment can display the N types of replacement options after acquiring the first image, so that the user can conveniently select and replace content in the first image as needed; the user operation is simple, and beautifying the food image becomes more interesting.
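The random, shake-triggered replacement described above could be sketched as follows; the option dictionary and the apply_replacement routine are hypothetical placeholders rather than components defined by this disclosure.

# Sketch of the shake-to-replace behaviour: when the shake (first input) is
# detected, randomly pick one replacement option per type and hand the result
# to a compositing routine.
import random

def random_collocation(replacement_options):
    # replacement_options: {"plate": [...], "tablecloth": [...], "background": [...]}
    return {object_type: random.choice(options)
            for object_type, options in replacement_options.items() if options}

def on_shake_detected(first_image, replacement_options, apply_replacement):
    chosen = random_collocation(replacement_options)
    return apply_replacement(first_image, chosen)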
Optionally, after S201, the image processing method provided in the embodiment of the present invention may further include the following S207:
S207, the electronic equipment displays K adding objects, wherein K is a positive integer.
Illustratively, the K adding objects may be decorations for food collocation, such as carvings, fruit, tableware, and the like.
In this embodiment of the present invention, the K adding objects may be K objects displayed at random, or the K objects with the highest occurrence frequency among the addable objects, or the K objects with the highest praise number among the addable objects, which is not specifically limited in this embodiment of the present invention.
Furthermore, in the embodiment of the present invention, after S207, S208 and S209 described below may be further included.
And S208, the electronic equipment receives a second input of the user.
The second input may be an input that triggers a random selection, or an input by which the user makes a manual selection, which is not specifically limited in the embodiment of the present invention.
S209, the electronic equipment responds to the second input and displays the first adding object in the first image.
For example, the user may drag the first addition object to the first position of the first image, and the electronic device may display the first addition object at the first position.
Based on the scheme, the electronic equipment can display some adding objects, such as ornaments, and the user can choose to add an ornament to the first image as needed, so that the edited first object has a better effect. The user does not need to actually perform operations such as carving in order to obtain a delicate and attractive image, which overcomes the inconvenience that the user cannot perform more refined plating and decoration because the scene is limited, so the user experience is better.
Optionally, in the embodiment of the present invention, before the electronic device saves the edited image (after replacement and addition), a filter may be added to the image.
For example, in an embodiment of the present invention, the electronic device may include an image processing model, and the image processing model may include a classification model and a scoring model. The classification model may identify the food in a food image (including the category, shape, color, and the like of the food) based on the food image, the tags (or labeling information) of each food in the image, and a semantic segmentation network. The scoring model may be based on scores given by professional collocation experts and scores given by a large number of users according to their preferences, both obtained from massive data, and the electronic device may display to the user the replacement options of the higher-scoring collocation combinations corresponding to each food object in the first image. Specifically, when the user selects a collocation, the first image may be divided into a food layer, a tableware layer, a background layer, and an ornament layer. The food layer is fixed; the tableware option and the background option may be selected by the user by sliding left and right, and the type and the placement posture of the ornament option may be selected by the user, which makes it convenient for the user to obtain a favorite collocation. After the user selects a background, such as a tablecloth, wallpaper, or curtain, the selected background may be fused with the food layer based on the Laplacian pyramid strategy, so that the transition area between the food and the background is smoother, which can improve the visual impression for the user.
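To make the Laplacian pyramid fusion step concrete, the following is a minimal OpenCV sketch of blending a newly selected background with the food layer; the mask handling, the number of pyramid levels, and the function names are illustrative assumptions rather than the disclosure's actual implementation.

# Minimal sketch of Laplacian pyramid blending between the food layer and a
# replaced background. food_img and background_img are float32 BGR images of
# the same size; food_mask is a float32 single-channel mask that is 1.0 where
# the food layer should be kept.
import cv2
import numpy as np

def laplacian_blend(food_img, background_img, food_mask, levels=5):
    # Build Gaussian pyramids for both images and for the mask.
    gp_food, gp_bg, gp_mask = [food_img], [background_img], [food_mask]
    for _ in range(levels):
        gp_food.append(cv2.pyrDown(gp_food[-1]))
        gp_bg.append(cv2.pyrDown(gp_bg[-1]))
        gp_mask.append(cv2.pyrDown(gp_mask[-1]))

    # Build Laplacian pyramids (per-level difference from the upsampled coarser level).
    def laplacian(gp):
        lp = []
        for i in range(levels):
            size = (gp[i].shape[1], gp[i].shape[0])
            lp.append(gp[i] - cv2.pyrUp(gp[i + 1], dstsize=size))
        lp.append(gp[levels])
        return lp

    lp_food, lp_bg = laplacian(gp_food), laplacian(gp_bg)

    # Blend each level with the smoothed mask, then collapse the pyramid.
    blended = [m[..., None] * f + (1 - m[..., None]) * b
               for f, b, m in zip(lp_food, lp_bg, gp_mask)]
    result = blended[-1]
    for layer in reversed(blended[:-1]):
        size = (layer.shape[1], layer.shape[0])
        result = cv2.pyrUp(result, dstsize=size) + layer
    return np.clip(result, 0, 255).astype(np.uint8)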
Fig. 3 is a schematic diagram of a possible structure of an electronic device according to an embodiment of the present invention, and as shown in fig. 3, the electronic device 300 includes: an acquisition module 301, a determination module 302 and a replacement module 303; an obtaining module 301, configured to obtain a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; a determining module 302, configured to determine a target recommended object according to a first object; and a replacing module 303, configured to replace the second object with the target recommended object to obtain a target image.
Optionally, with reference to fig. 3, as shown in fig. 4, the electronic device 300 further includes: a display module 304; a display module 304, configured to display M pieces of prompt information after the obtaining module 301 obtains the first image, where each piece of prompt information prompts that one third object in the first image is replaceable, M is a positive integer, and each third object is an object of one of preset types; the replacing module 303 is configured to receive a first input of a user, where the first input is an input of selecting a target recommended object by the user, and replace a second object with the target recommended object in response to the first input to obtain a target image; the second object is a third object prompted by the target prompt message, and the target prompt message is at least one prompt message in the M prompt messages.
Optionally, the determining module 302 is further configured to determine that the first image includes M third objects before the displaying module 304 displays the M pieces of prompt information; the display module 304 is specifically configured to display a piece of prompt information on each of the M third objects in the first image.
Optionally, the determining module 302 is specifically configured to: and determining the target recommended object according to at least one of the color, the shape and the food type of the first object.
Optionally, the determining module 302 is specifically configured to: and determining a target recommendation object according to the first object and at least one of the container collocation, the background collocation and the ornament collocation.
Optionally, the display module 304 is further configured to display N types of alternative options after the obtaining module 301 obtains the first image; each type in the N types is one type in preset types, each type corresponds to at least one replacement option, and N is a positive integer.
Optionally, the display module 304 is further configured to display K adding objects after the obtaining module 301 obtains the first image, where K is a positive integer; the receiving module 301 is further configured to receive a second input of the user; the display module 304 is specifically configured to display the first added object in the first image in response to the second input received by the receiving module 301.
Optionally, the at least one replacement option is the J options with the highest scores among the replacement options of one type, or the L options with the highest target frequency, where J and L are positive integers.
The electronic device 300 provided in the embodiment of the present invention can implement each process implemented by the electronic device in the above method embodiments, and is not described here again to avoid repetition.
The embodiment of the invention provides electronic equipment, which acquires a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determines a target recommendation object according to the first object; and replaces the second object with the target recommended object to obtain a target image. Since the preset type of object may include tableware, a dining table, a dining chair, a tablecloth, a curtain, other backgrounds, and the like, even when a user cannot perform fine plating, decoration, and the like because the scene is limited during processing of the food image, the tableware, dining table, dining chair, tablecloth, curtain, other backgrounds, and the like in the image can still be replaced without being restricted by the environmental materials at hand (such as the table, tablecloth, and tableware). Compared with the traditional image processing mode, the user therefore has more choices for beautifying the food image, which is also more interesting.
Fig. 5 is a hardware schematic diagram of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 5 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to obtain a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determining a target recommendation object according to the first object; and replacing the second object with the target recommendation object to obtain the target image.
The embodiment of the invention provides electronic equipment, which acquires a first image, where the first image includes a first object of a food type and a second object of a preset type, and the preset type is different from the food type; determines a target recommendation object according to the first object; and replaces the second object with the target recommended object to obtain a target image. Since the preset type of object may include tableware, a dining table, a dining chair, a tablecloth, a curtain, other backgrounds, and the like, even when a user cannot perform fine plating, decoration, and the like because the scene is limited during processing of the food image, the tableware, dining table, dining chair, tablecloth, curtain, other backgrounds, and the like in the image can still be replaced without being restricted by the environmental materials at hand (such as the table, tablecloth, and tableware). Compared with the traditional image processing mode, the user therefore has more choices for beautifying the food image, which is also more interesting.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and then output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 5, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power source 111 (such as a battery) for supplying power to each component, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an electronic device is further provided in an embodiment of the present invention, and with reference to fig. 5, the electronic device includes a processor 110, a memory 109, and a computer program that is stored in the memory 109 and is executable on the processor 110, and when the computer program is executed by the processor 110, the electronic device implements each process of the image processing method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image processing method applied to an electronic device, the method comprising:
acquiring a first image, wherein the first image comprises a first object of a food type and a second object of a preset type, and the preset type is different from the food type;
determining a target recommendation object according to the first object;
and replacing the second object with the target recommended object to obtain a target image.
2. The method of claim 1, wherein after the acquiring the first image, the method further comprises:
displaying M prompt messages, wherein each prompt message prompts that a third object in the first image can be replaced, M is a positive integer, and each third object is an object of one type in the preset types;
replacing the second object with the target recommendation object to obtain a target image, including:
receiving a first input of a user, wherein the first input is an input of the user for selecting the target recommendation object,
in response to the first input, replacing the second object with the target recommended object, and obtaining the target image;
the second object is a third object prompted by target prompt information, and the target prompt information is at least one prompt information in the M prompt information.
3. The method of claim 2, wherein prior to displaying the M prompt messages, the method further comprises:
determining that M third objects are included in the first image;
the displaying of the M prompt messages includes:
and respectively displaying prompt information on the M third objects in the first image.
4. The method of any one of claims 1 to 3, wherein determining a target recommended object based on the first object comprises:
determining the target recommended object according to at least one of color, shape and food type of the first object.
5. The method of any one of claims 1 to 3, wherein the determining a target recommendation object according to the first object comprises:
and determining the target recommendation object according to the first object and at least one of container collocation, background collocation and ornament collocation.
6. An electronic device, characterized in that the electronic device comprises: the device comprises an acquisition module, a determination module and a replacement module;
the acquisition module is used for acquiring a first image, wherein the first image comprises a first object of a food type and a second object of a preset type, and the preset type is different from the food type;
the determining module is used for determining a target recommendation object according to the first object;
the replacing module is used for replacing the second object with the target recommending object to obtain a target image.
7. The electronic device of claim 6, further comprising: a display module;
the display module is configured to display M pieces of prompt information after the acquisition module acquires the first image, where each piece of prompt information prompts that a third object in the first image is replaceable, M is a positive integer, and each third object is an object of one of the preset types;
the replacing module is used for receiving a first input of a user, wherein the first input is an input of selecting the target recommended object by the user, and the second object is replaced by the target recommended object in response to the first input to obtain the target image;
the second object is a third object prompted by target prompt information, and the target prompt information is at least one prompt information in the M prompt information.
8. The electronic device of claim 7,
the determining module is further configured to determine that the first image includes M third objects before the displaying module displays the M pieces of prompt information;
the display module is specifically configured to display a piece of prompt information on the M third objects in the first image, respectively.
9. The electronic device according to any one of claims 6 to 8, wherein the determining module is specifically configured to:
determining the target recommended object according to at least one of color, shape and food type of the first object.
10. The electronic device according to any one of claims 6 to 8, wherein the determining module is specifically configured to:
and determining the target recommendation object according to the first object and at least one of container collocation, background collocation and ornament collocation.
CN201911398115.XA 2019-12-30 2019-12-30 Image processing method and electronic equipment Active CN111093025B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911398115.XA CN111093025B (en) 2019-12-30 2019-12-30 Image processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911398115.XA CN111093025B (en) 2019-12-30 2019-12-30 Image processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111093025A 2020-05-01
CN111093025B 2021-07-30

Family

ID=70398593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911398115.XA Active CN111093025B (en) 2019-12-30 2019-12-30 Image processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111093025B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113709370A (en) * 2021-08-26 2021-11-26 维沃移动通信有限公司 Image generation method and device, electronic equipment and readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090136125A1 (en) * 2005-06-27 2009-05-28 Pioneer Corporation Image analysis device and image analysis method
US20110074819A1 (en) * 2009-09-29 2011-03-31 Fujifilm Corporation Image layout determining method, recording medium and information processing apparatus for the same
CN106779791A (en) * 2015-11-25 2017-05-31 阿里巴巴集团控股有限公司 A kind of generation method and device of object picture combination of arranging in pairs or groups
CN106998423A (en) * 2016-01-26 2017-08-01 宇龙计算机通信科技(深圳)有限公司 Image processing method and device
CN108121957A (en) * 2017-12-19 2018-06-05 北京麒麟合盛网络技术有限公司 The method for pushing and device of U.S. face material
CN108230283A (en) * 2018-01-19 2018-06-29 维沃移动通信有限公司 A kind of textures material recommends method and electronic equipment
CN108961302A (en) * 2018-07-16 2018-12-07 Oppo广东移动通信有限公司 Image processing method, device, mobile terminal and computer readable storage medium
CN110035227A (en) * 2019-03-25 2019-07-19 维沃移动通信有限公司 Special effect display methods and terminal device

Also Published As

Publication number Publication date
CN111093025B (en) 2021-07-30

Similar Documents

Publication Publication Date Title
CN106878390B (en) Electronic pet interaction control method and device and wearable equipment
CN109164949A (en) A kind of chat messages localization method and mobile terminal
CN109409244B (en) Output method of object placement scheme and mobile terminal
US20210096739A1 (en) Method For Editing Text And Mobile Terminal
CN111127595B (en) Image processing method and electronic equipment
CN108595089A (en) A kind of virtual key control method and mobile terminal
CN108388403B (en) Method and terminal for processing message
CN108897473A (en) A kind of interface display method and terminal
CN108920119A (en) A kind of sharing method and mobile terminal
CN109117239A (en) A kind of screen wallpaper display methods and mobile terminal
CN108415642A (en) A kind of display methods and mobile terminal
CN109710165A (en) A kind of drawing processing method and mobile terminal
CN108228033A (en) A kind of message display method and mobile terminal
CN108460817A (en) A kind of pattern splicing method and mobile terminal
CN109656636A (en) A kind of application starting method and device
CN110134306A (en) A kind of data sharing method, device and computer readable storage medium
CN110096203A (en) A kind of screenshot method and mobile terminal
CN108600544A (en) A kind of Single-hand control method and terminal
CN109413264A (en) A kind of background picture method of adjustment and terminal device
CN109660674B (en) Method for setting alarm clock and electronic equipment
CN108388354A (en) A kind of display methods and mobile terminal in input method candidate area domain
CN107944040A (en) The display methods and mobile terminal of a kind of lyrics
CN109164908B (en) Interface control method and mobile terminal
CN111093025B (en) Image processing method and electronic equipment
CN108346083B (en) Information processing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant