CN112711370B - Image editing method and device and projection equipment - Google Patents


Info

Publication number
CN112711370B
CN112711370B (application CN202110329990.3A)
Authority
CN
China
Prior art keywords
image
projection
mobile terminal
template
projection surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110329990.3A
Other languages
Chinese (zh)
Other versions
CN112711370A (en)
Inventor
李禹�
曹琦
何维
张聪
王骁逸
胡震宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202110329990.3A priority Critical patent/CN112711370B/en
Publication of CN112711370A publication Critical patent/CN112711370A/en
Application granted granted Critical
Publication of CN112711370B publication Critical patent/CN112711370B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/51Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54Browsing; Visualisation therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Abstract

The disclosure provides an image editing method, an image editing apparatus, and a projection device. In the method, after receiving a first image template and a first image component from a mobile terminal, the projection device generates a first image based on them and projects it onto a projection surface. It then adjusts the first image according to the captured projection-surface image to obtain a second image, and finally sends the second image template and second image component corresponding to the second image back to the mobile terminal, so that the mobile terminal can display the second image synchronously. In this way, the projection device adaptively optimizes the first image, edited by the user on the mobile terminal, into the second image according to the actual projection effect, and the second image is displayed on the mobile terminal in real time. The projection device thus performs human-like editing of the image while showing the edited content in real time, making the image editing process more reasonable and intelligent.

Description

Image editing method and device and projection equipment
Technical Field
The present disclosure relates to the field of projection display technologies, and in particular, to an image editing method and apparatus, and a projection device.
Background
With the development of smart-home technology, smart projection devices are on track to replace smart televisions, owing to advantages such as portability and a larger display area.
At present, when a user performs personalized editing of projection content on a mobile terminal paired with a smart projection device, the user completes the image editing in the control app on the mobile terminal and then synchronizes the result to the projection device for display. During this process the user cannot see the projection effect of the edited content in real time and can only edit the colors, positions, and other properties of the image template and image components from experience, which makes the operation cumbersome and the feedback insufficiently real-time.
Disclosure of Invention
The disclosure provides an image editing method, an image editing apparatus, and a projection device, intended to alleviate the technical problems that current image editing methods are cumbersome to operate and lack real-time feedback.
In order to solve the technical problem, the present disclosure provides the following technical solutions:
the present disclosure provides an image editing method, including:
receiving a first image template and a first image component from a mobile terminal;
generating a first image based on the first image template and the first image component, and projecting the first image onto a projection surface;
acquiring a projection-surface image of the projection surface and adjusting the first image according to it to obtain a second image;
and sending a second image template and a second image component corresponding to the second image to the mobile terminal, so that the mobile terminal displays the second image synchronously.
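The four steps above can be sketched as a hypothetical projector-side handler. All function and field names here are illustrative, not from the disclosure, and the surface-brightness rule is just one plausible instance of the adjustment step:

```python
from dataclasses import dataclass, replace

@dataclass
class Image:
    template: str   # background/filter identifier, e.g. "gray-template"
    component: str  # object drawn under the template, e.g. "vase"

def capture_surface_brightness() -> float:
    # Stand-in for a real sensor reading of the projection surface.
    return 0.9  # a light-coloured (white) wall

def adjust_for_surface(img: Image, brightness: float) -> Image:
    # On a light wall, swap a light template for a dark one so the
    # projected content keeps enough contrast.
    if brightness > 0.5 and img.template == "gray-template":
        return replace(img, template="dark-template")
    return img

def handle_edit(template: str, component: str) -> Image:
    first_image = Image(template, component)       # steps 1-2: generate/project
    brightness = capture_surface_brightness()      # step 3: acquire surface info
    second_image = adjust_for_surface(first_image, brightness)
    return second_image                            # step 4: send back to mobile

print(handle_edit("gray-template", "vase"))
# -> Image(template='dark-template', component='vase')
```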
Meanwhile, the present disclosure also provides an image editing apparatus, including:
a first receiving module, configured to receive a first image template and a first image component from the mobile terminal;
a projection module, configured to generate a first image based on the first image template and the first image component and to project the first image onto a projection surface;
an image adjusting module, configured to acquire a projection-surface image of the projection surface and adjust the first image according to it to obtain a second image;
and a sending module, configured to send a second image template and a second image component corresponding to the second image to the mobile terminal, so that the mobile terminal displays the second image synchronously.
Optionally, the image adjusting module comprises:
a first acquisition module, configured to acquire current projection parameters, the projection parameters comprising at least one of projection-surface style, color temperature, and brightness;
a first adjusting module, configured to adjust at least one of the first image template and the first image component according to the projection parameters to obtain a second image template and a second image component;
a first generation module to generate a second image based on the second image template and the second image component.
Optionally, the image adjusting module further comprises:
the second receiving module is used for receiving a first editing request from the mobile terminal;
a second adjusting module, configured to adjust at least one of the first image template and the first image component according to the first editing request, so as to obtain a second image template and a second image component;
a second generation module to generate a second image based on the second image template and the second image component.
Optionally, the image editing apparatus may further include:
a third receiving module, configured to receive a second editing request from the mobile terminal;
a first determining module, configured to determine an initial image template and an initial image component according to the second editing request;
and a third generation module, configured to generate an initial image based on the initial image template and the initial image component and to project the initial image onto the projection surface.
Optionally, the image editing apparatus may further include:
the connection module is used for establishing connection with the cloud end;
a downloading module, configured to download a preset image template and a preset image component from the cloud end;
and a fourth generation module, configured to generate a preset image based on the preset image template and the preset image component and to project the preset image onto the projection surface.
Optionally, the image editing apparatus may further include:
the resource request module is used for sending an image resource request to the mobile terminal;
a fourth receiving module, configured to receive the local image template and local image component returned by the mobile terminal in response to the image resource request;
and a fifth generation module, configured to generate a local image based on the local image template and the local image component and to project the local image onto the projection surface.
Optionally, the image editing apparatus may further include:
and the storage module is used for storing the second image to the database after receiving a storage instruction from the mobile terminal.
Optionally, the image editing apparatus may further include:
a second acquisition module, configured to acquire current projection parameters, the projection parameters comprising at least one of projection-surface style, color temperature, and brightness;
and the second determining module is used for determining a display image corresponding to the projection parameters in the database according to the projection parameters and projecting the display image to the projection surface.
Optionally, the image editing apparatus may further include:
a fifth receiving module, configured to receive a display request from the mobile terminal;
and the third determining module is used for determining a display image corresponding to the display request from the database according to the display request and projecting the display image to the projection surface.
Accordingly, the present disclosure also provides a projection device comprising a processor and a memory, wherein the memory is configured to store instructions and data, and the processor is configured to execute the instructions in the memory to perform the steps of the image editing method above.
Accordingly, the present disclosure also provides a computer-readable storage medium having stored therein a plurality of instructions adapted to be loaded by a processor to perform the steps of the above-mentioned image editing method.
Advantageous effects: the disclosure provides an image editing method, an image editing apparatus, and a projection device. In the method, after receiving a first image template and a first image component from a mobile terminal, the projection device generates a first image based on them and projects it onto a projection surface; it then acquires the projection-surface image and adjusts the first image according to it to obtain a second image; finally, it sends the second image template and second image component corresponding to the second image to the mobile terminal, so that the mobile terminal displays the second image synchronously. With this method, the projection device adaptively optimizes the first image, edited by the user on the mobile terminal, into the second image according to the projection effect, so the projection device performs human-like editing of the image. In addition to projecting the second image onto the projection surface, the projection device also sends it to the mobile terminal, so the mobile terminal and the projection terminal display the editing process synchronously in real time. This real-time interaction lets the user see the projection effect of the edited content immediately and edit the projected image according to that effect, which simplifies operation and improves reasonableness, flexibility, and intelligence.
Drawings
The technical solutions and other advantages of the present disclosure will become apparent from the following detailed description of specific embodiments of the present disclosure, which is to be read in connection with the accompanying drawings.
Fig. 1 is a schematic diagram of a system architecture provided by an embodiment of the present disclosure.
Fig. 2 is a schematic flowchart of an image editing method provided in an embodiment of the present disclosure.
Fig. 3 is a functional schematic diagram of a mobile terminal provided in an embodiment of the present disclosure.
Fig. 4 is another schematic flowchart of an image editing method provided in the embodiment of the present disclosure.
Fig. 5 is an interface schematic diagram of a mobile terminal and a projection terminal provided by an embodiment of the disclosure.
Fig. 6 is a schematic interface diagram of a mobile terminal and a projection terminal provided in an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of an image editing apparatus provided in an embodiment of the present disclosure.
Fig. 8 is a schematic structural diagram of a projection apparatus provided in an embodiment of the present disclosure.
Description of reference numerals:
101-a cloud server; 102-a projection terminal; 103-mobile terminal.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are merely some, not all, of the embodiments of the disclosure. All other embodiments obtained by a person of ordinary skill in the art from the embodiments disclosed herein without creative effort shall fall within the protection scope of the present disclosure.
The terms "first", "second", and the like in the description, the claims, and the drawings of the embodiments of the present disclosure are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. Data so labeled may be interchanged where appropriate, so that the embodiments described herein can be practiced in orders other than those illustrated. Furthermore, the terms "comprise" and "have", and any variations thereof, are intended to cover non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those expressly listed, and may include other steps or modules not expressly listed or inherent to it. The division into modules presented in the disclosed embodiments is merely a logical division and may be implemented differently in practice: multiple modules may be combined or integrated into another system, and some features may be omitted or not implemented. Couplings, direct couplings, or communicative connections between modules, whether shown or discussed, may be realized through interfaces, and indirect couplings or communicative connections between modules may be electrical or take other forms; the embodiments of the present disclosure are not limited in this respect. Moreover, modules or sub-modules described as separate components may or may not be physically separate, may or may not be physical modules, and may be distributed across a plurality of circuit modules; some or all of them may be selected according to actual needs to achieve the purpose of the embodiments of the present disclosure.
In the disclosed embodiments, the image template may be the background, a filter, or the like of an image, while an image component may be an image of an object such as a chair, a table, or a vase rendered under a given type of image template. Templates and components are in an inclusion relationship: one type of image template can contain multiple image components, and the same image component has different characteristics under different image templates.
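The inclusion relationship described above can be sketched as a small registry. The template names, component names, and asset file names below are invented for illustration; the disclosure does not specify any data structure:

```python
# One template owns several components, and the "same" component
# (e.g. a vase) looks different under different templates.
TEMPLATES = {
    "modern": {"chair": "chair-flat.png", "vase": "vase-flat.png"},
    "classic": {"chair": "chair-ornate.png", "vase": "vase-ornate.png"},
}

def components_of(template: str) -> list:
    """List the component names contained in a template."""
    return sorted(TEMPLATES[template])

def asset_for(template: str, component: str) -> str:
    """The component's appearance depends on which template contains it."""
    return TEMPLATES[template][component]

print(components_of("modern"))       # ['chair', 'vase']
print(asset_for("classic", "vase"))  # 'vase-ornate.png'
```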
In the embodiment of the present disclosure, the projection surface may be various wall surfaces, a projection screen, an electronic display screen, or the like.
Referring to fig. 1, fig. 1 is a schematic diagram of a system architecture provided in the embodiment of the present disclosure, the system at least includes a cloud server 101, a projection terminal 102, and a mobile terminal 103, where:
communication links are arranged among the cloud server 101, the projection terminal 102 and the mobile terminal 103 so as to realize information interaction. The type of communication link may include a wired, wireless communication link, or fiber optic cable, etc., and the disclosure is not limited thereto.
The cloud server 101 may be a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, Content Delivery Networks (CDNs), and big data and artificial intelligence platforms.
The projection terminal 102 may be a projection device having a projection function such as a projector, a micro-projector, or the like.
The mobile terminal 103 may be a laptop, a tablet computer, a smart phone, or other mobile communication device.
In the embodiments of the present disclosure, the projection terminal 102 and the mobile terminal 103 can download various image templates and image components from the cloud server 101, and can also upload images and user-defined image templates and components to it. The mobile terminal 103 can send its image editing interface to the projection terminal 102 in real time for display, and the projection terminal 102 can likewise send its image editing interface to the mobile terminal 103 in real time. Both terminals can adjust an image according to the current scene or the user's operation to obtain an image suited to the scene or meeting the user's requirements, and display it.
It should be noted that the system architecture shown in fig. 1 is only an example. The server, terminals, and scenario described in the embodiments of the present disclosure are intended to illustrate the technical solution more clearly and do not limit it; as those of ordinary skill in the art will appreciate, the technical solution provided by the embodiments of the present disclosure is equally applicable to similar technical problems as the system evolves and new service scenarios emerge.
With reference to the system architecture above, the image editing method of the present disclosure is described in detail below. Referring to fig. 2, a schematic flowchart of the image editing method provided in an embodiment of the present disclosure, the method includes at least the following steps:
step 201: a first image template and a first image component from a mobile terminal are received.
The first image template and the first image component are the image template and image component selected by the user while editing the current image on the mobile terminal. The current image may be art picture data, artistic visual work data, a digital work, multimedia content, and the like.
In one embodiment, before the step of receiving the first image template and the first image component from the mobile terminal, the method further comprises: establishing a connection with a cloud end; downloading a preset image template and a preset image component from the cloud end; and generating a preset image based on the preset image template and the preset image component and projecting the preset image onto the projection surface. Specifically, the cloud end refers to a cloud server that stores a large number of preset image templates of different styles and modular preset image components. The projection terminal connects to the cloud server over the network, and the user downloads the required image templates and image components on demand. As shown in mobile interface I of fig. 5, the image generated from the downloaded template and components is an editing interface in which the image work has not yet been edited: the preset image template is template 1, and the preset image components are component 1 (oval A), component 2 (heptagon B), component 3 (rhombus C), and so on. This editing interface is projected onto the projection surface, as shown in projection interface I of fig. 5, so that the user can see the editing interface directly on the projection surface; the content displayed on the projection surface is the same as that displayed on the mobile terminal.
Image templates and image components can be downloaded from the cloud end or obtained from the mobile terminal. In one embodiment, before the step of receiving the first image template and the first image component from the mobile terminal, the method further includes: sending an image resource request to the mobile terminal; receiving a local image template and local image components returned by the mobile terminal in response to the image resource request; and generating a local image based on the local image template and the local image components and projecting the local image onto the projection surface. Specifically, the mobile terminal may also connect to the cloud server over the network, download the required image template and image components, and store them locally. In addition, the user may design custom image templates and image components on the mobile terminal according to preference and store them in local storage. When the projection terminal sends an image resource request to the mobile terminal, the mobile terminal determines the corresponding local image template and local image components by parsing the request. As shown in mobile interface II of fig. 5, the local image is then an editing interface carrying the local image template and local image components: the local image template is the custom template in the image, and the local image components are component 1 (four-pointed star X), component 2 (five-pointed star Y), component 3 (triangle Z), and so on under that custom template. This editing interface is projected onto the projection surface, as shown in projection interface II of fig. 5, so that the user can see the editing interface, with its local image template and components, directly on the projection surface; at this point the content displayed on the projection surface matches that displayed on the mobile terminal.
In one embodiment, before the step of receiving the first image template and the first image component from the mobile terminal, the method further comprises: receiving a second editing request from the mobile terminal; determining an initial image template and an initial image component according to the second editing request; and generating an initial image based on the initial image template and the initial image component and projecting the initial image onto the projection surface. It should be noted that the user first sends an original image to be displayed to the projection terminal through the mobile terminal. Based on the display effect, the user can then actively send an editing request from the mobile terminal to the projection terminal. The projection terminal determines an initial image template and an initial image component from this editing request, adjusts the original image accordingly, and projects the adjusted work, namely the initial image, onto the projection surface for display. The user can then continue to adjust the image displayed on the projection surface through the mobile terminal; that is, the image on the projection surface is edited via the first image template and the first image component to obtain the first image. The user may adjust the current projection-surface image multiple times according to preference, and each adjustment is performed based on the current projection-surface image.
Step 202: and generating a first image based on the first image template and the first image assembly, and projecting the first image to the projection surface.
In one embodiment, the user connects the projection terminal to the mobile terminal for the first time and projects an image work to be displayed onto the projection surface. The projection terminal receives the first image template and the first image component and, on the basis of the original image, edits the original image with them to obtain a new image, namely the first image. Editing the original image may include changing its size. For example, suppose the original image is 100 × 100; adding a template (whether added by the user or adaptively chosen by the projection terminal according to the current projection-surface parameters) changes its size to 100 × 90, and a component such as a vase is added under that template; the edited 100 × 90 image containing the vase is then the first image. Simple dragging of components and layout adjustment are completed on the mobile terminal, and the template and components that are no longer being changed after editing there are determined to be the first image template and the first image component. After obtaining the first image, the projection terminal projects it onto the projection surface.
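The resizing example above (a 100 × 100 image becoming 100 × 90 with a vase component added) can be sketched on a bare pixel grid. The grid representation, fill values, and component placement below are illustrative, not the disclosure's actual image format:

```python
def resize_canvas(pixels, new_w, new_h, fill=0):
    """Crop or pad a row-major pixel grid to new_h rows by new_w columns."""
    out = [[fill] * new_w for _ in range(new_h)]
    for y in range(min(new_h, len(pixels))):
        for x in range(min(new_w, len(pixels[0]))):
            out[y][x] = pixels[y][x]
    return out

def paste_component(pixels, comp, top, left):
    """Overlay a component grid (e.g. a 'vase' sprite) onto the image."""
    for y, row in enumerate(comp):
        for x, value in enumerate(row):
            pixels[top + y][left + x] = value
    return pixels

original = [[1] * 100 for _ in range(100)]   # 100 x 100 original image
edited = resize_canvas(original, 100, 90)    # template change: now 100 x 90
vase = [[9] * 5 for _ in range(8)]           # 8 x 5 "vase" component
edited = paste_component(edited, vase, top=40, left=47)

print(len(edited), len(edited[0]))   # 90 100
print(edited[40][47])                # 9
```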
In an embodiment, the projection terminal and the mobile terminal are connected through Bluetooth, WiFi, or the like, and interface synchronization is performed through a customized private protocol. That is, the mobile terminal 103 encodes the current screen information, user operations, and the like and transmits them to the projection terminal 102; the projection terminal 102 receives the encoded information, decodes it according to the protocol, performs the corresponding operations according to the screen information and control instructions, and projects the result onto the projection surface for display.
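The encode-transmit-decode exchange above can be sketched as follows. This is a minimal illustration, not the actual private protocol: the wire format (a 4-byte big-endian length prefix plus a JSON payload) and all field names are assumptions made for the example.

```python
import json
import struct

# Hypothetical wire format for a screen-sync message: a 4-byte big-endian
# length prefix followed by a JSON payload carrying the screen state and an
# optional user operation. Field names are illustrative assumptions.
def encode_sync_message(screen_info, user_op=None):
    payload = json.dumps({"screen": screen_info, "op": user_op}).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_sync_message(data):
    (length,) = struct.unpack(">I", data[:4])
    return json.loads(data[4:4 + length].decode("utf-8"))

# Mobile terminal side: encode the current template state and a drag operation.
msg = encode_sync_message({"template": "template-1", "components": ["B"]},
                          {"type": "drag", "component": "B", "to": [40, 25]})
# Projection terminal side: decode and act on the received bytes.
decoded = decode_sync_message(msg)
```

A length prefix lets the receiver frame messages on a stream transport such as a Bluetooth or WiFi socket, which is one common way such a private protocol might be structured.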
Step 203: acquiring the projection surface image of the projection surface, and adjusting the first image accordingly to obtain a second image.
In one embodiment, the projection surface image refers to the image formed in the projection area, and mainly comprises the projected image that the mobile terminal sends to the projection terminal for projection onto the projection surface, and the non-projected portion of the projection area. The first image may be adjusted to obtain the second image in two ways: the user adjusts it independently through the mobile terminal, or the projection terminal obtains the projection parameters of the current projection surface through modules such as sensors and adaptively adjusts the projection surface image, so that the adjusted image displays better on the projection surface. Specifically, adjusting the first image may mean adjusting its size, color temperature, brightness, and so on, or adjusting the size, position, color temperature, or brightness of image components in the first image, or adding new components.
The step of adaptively adjusting the first image through the projection terminal to obtain the second image comprises: acquiring current projection parameters, wherein the projection parameters comprise at least one of projection surface style, color temperature, and brightness; adjusting at least one of the first image template and the first image component according to the projection parameters to obtain a second image template and a second image component; and generating a second image based on the second image template and the second image component. Specifically, if the projection surface is a white wall, as shown in fig. 6, in projection interface i the first image template is template 1 with a gray background, and the corresponding first image component is component 2, that is, the component labeled B in fig. 6. The projection terminal detects through its sensor that the current wall color is light, so the gray background template can be adaptively adjusted to a dark background such as black, corresponding to template 2 in projection interface ii in fig. 6, and the component labeled B is adjusted to component E under template 2. In this process, the projection terminal adjusts the first image template into the second image template, adjusts the first image component into the second image component, and adjusts the first image into the second image based on them. Adaptively adjusting the projection surface image according to the current projection parameters gives the image a better display effect and makes projection more flexible and intelligent.
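The core of the adaptive adjustment above is a contrast rule: a light surface gets a dark template, and a dark surface gets a light one. The sketch below illustrates that rule only; the brightness threshold and template names are assumptions for the example, not values from this disclosure.

```python
# Illustrative sketch of the adaptive template choice: pick a dark background
# template on a light projection surface and vice versa. The 0.6 threshold
# and template names are assumptions.
def choose_template(surface_brightness):
    """surface_brightness in [0, 1], as measured by the terminal's sensors."""
    if surface_brightness > 0.6:   # light surface, e.g. a white wall
        return "dark-template"     # like template 2 with its black background
    return "light-template"        # dark surface, e.g. dark wood

wall = choose_template(0.9)   # white wall
wood = choose_template(0.2)   # dark wooden surface
```

A real terminal would fold style and color temperature into the decision as well, but the principle of maximizing contrast between background and surface is the same.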
In one embodiment, the projection terminal determines through its camera, sensors, and the like that the current projection surface is dark wood, so it can adaptively match a light-colored image template and adjust the current projection surface image to a light background, allowing it to display clearly on the dark wooden surface.
The step of adjusting the first image through the mobile terminal to obtain the second image comprises: receiving a first editing request from the mobile terminal; adjusting at least one of the first image template and the first image component according to the first editing request to obtain a second image template and a second image component; and generating a second image based on the second image template and the second image component. The user edits the image through the mobile terminal in one of two situations. If the user is in the same room as the projection terminal and can directly see the editing interface on the projection surface, the image can be edited through the mobile terminal while watching the projection. If the user is not in the same room and cannot directly see the projection surface, the mobile terminal and projection terminal screens are nevertheless synchronized in real time, so the current editing interface can be seen and the image work edited through the mobile terminal. The specific editing process may be: the user edits and adjusts the projection surface image through the mobile terminal according to preference. As shown from mobile interface iii to mobile interface iv in fig. 6, the user adjusts the image in mobile interface iii by adding component 1 under template 2 (the image labeled D in the figure), obtaining the image in mobile interface iv. In this process the first image template and the first image component are adjusted into the second image template and the second image component: the first image template is a black background, and the second image template is also a black background; the first image component is the patterned heptagon B, and the second image component consists of the patterned heptagon B and the patterned ellipse D.
The first image template may be the same as or different from the second image template. Adjusting the first image component may mean adjusting the position of the original component, or adding a new component, in which case the new component and the original component together form the second image component.
Step 204: and sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
In one embodiment, as shown in fig. 6, from mobile interface ii to projection interface i: the mobile terminal can send its current interface to the projection terminal in real time, so that the editing process is displayed on the projection terminal side in real time. Similarly, as shown in fig. 6, from projection interface ii to mobile interface iii: the projection terminal can also send its current interface to the mobile terminal in real time, so that the mobile terminal side synchronously displays the current image.
In one embodiment, after the step of sending the second image template and second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image, the method further includes: after receiving a storage instruction from the mobile terminal, storing the second image in a database. Specifically, after the current image work has been edited, the projection terminal may store it in its database according to a storage instruction sent by the mobile terminal. In addition, the mobile terminal can also store the edited image in its local storage.
In an embodiment, after the second image is stored in the database, the projection terminal may adaptively select a suitable display image from the database and project it onto the projection surface. The specific steps include: acquiring current projection parameters, wherein the projection parameters comprise at least one of projection surface style, color temperature, and brightness; and determining, according to the projection parameters, the display image in the database corresponding to those parameters and projecting it onto the projection surface. Specifically, the projection terminal can adaptively select an image suitable for the current environment from its local database for display according to the current projection parameters. If the current projection surface is dark wood, a light natural-landscape picture can be selected for display; if the current projection surface is a light wall, a dark artwork picture can be selected for display. As shown in projection interface iv in fig. 6, the projection interface now shows an edited image instead of an editing interface.
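The adaptive selection above can be sketched as a lookup that prefers an image contrasting with the surface. The `tone` metadata field and the contrast rule are assumptions for illustration; the disclosure does not specify how stored works are indexed.

```python
# Minimal sketch of picking a display image by projection parameters: prefer
# an image whose tone contrasts with the surface tone. The "tone" field is a
# hypothetical piece of metadata attached to each stored work.
def pick_display_image(database, surface_tone):
    wanted = "light" if surface_tone == "dark" else "dark"
    for image in database:
        if image["tone"] == wanted:
            return image
    return database[0] if database else None  # fall back to any stored work

gallery = [{"name": "artwork", "tone": "dark"},
           {"name": "landscape", "tone": "light"}]
choice = pick_display_image(gallery, surface_tone="dark")  # dark wood surface
```

On the dark wooden surface this picks the light landscape, matching the behavior described in the paragraph above.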
In an embodiment, after the second image is stored in the database, the user may send an instruction through the mobile terminal to select a suitable display image from the database and project it onto the projection surface. The specific steps include: receiving a display request from the mobile terminal; and determining, according to the display request, the corresponding display image from the database and projecting it onto the projection surface. Specifically, the user can select the image to be displayed through the mobile terminal and send the choice to the projection terminal for display. As shown in projection interface iv in fig. 6, the projection interface now shows an edited image instead of an editing interface.
Based on the content of the foregoing embodiments, the functions of the mobile terminal 103 will be described in detail below, please refer to fig. 3, and fig. 3 is a functional diagram of the mobile terminal 103.
Step 301: the user turns on the mobile terminal side DIY software.
In one embodiment, the mobile terminal side may install DIY software capable of editing the image works, and the user performs editing operations on the image works through the DIY software.
Step 302: the APP at the mobile terminal side establishes connection with the cloud server through the network.
In an embodiment, the mobile terminal 103 may connect an APP on the mobile terminal side with the cloud server through a network, where the APP may be DIY software capable of editing image works.
Step 303: the APP at the mobile terminal downloads the image DIY templates and image DIY components from the cloud server.
In one embodiment, since the image templates and the image components are stored in the cloud server, the user downloads the required image templates and image components from the cloud server after the APP establishes a connection with the cloud server.
Step 304: and the mobile terminal side establishes connection with the projection terminal through Bluetooth/WiFi.
In an embodiment, the communication mode between the mobile terminal and the projection terminal may be communication through a bluetooth connection, or communication through a WiFi connection.
Step 305: and the mobile terminal side synchronously projects the APP image editing interface to the projection terminal side for displaying in real time through a wireless screen projection protocol.
In one embodiment, the mobile terminal encodes the current screen information, user operation steps, and the like through a self-defined private protocol and transmits the content to the projection terminal. After receiving the information, the projection terminal executes the corresponding operations according to the screen information and encoded control commands and then displays the result directly, realizing real-time synchronization between the mobile terminal and the projection terminal.
Step 306: and the user combines and edits the image template and the image component in real time through the mobile terminal APP, or uploads the template and the image component which are individually edited by the user through the mobile terminal APP.
In an embodiment, a user can edit the original image with the mobile terminal APP, that is, generate a personalized image work by combining and editing the image template and image components. Besides using the image templates and components downloaded from the cloud server, the user can also customize image templates and components with the APP: a personalized picture edited by the user can be uploaded as an image component into an existing image template, or a new image template can be created and the user-edited picture uploaded as a component into it.
Step 307: and displaying the edited content on the projection terminal in real time through a wireless projection screen.
In one embodiment, the process of editing the image works by the user through the mobile terminal can be displayed on the projection terminal in real time through the wireless screen projection.
Step 308: and after the editing is finished, storing the image in the mobile terminal/projection terminal, and displaying the image in real time at the projection terminal side.
In an embodiment, after the image work has been edited, it can be stored in the mobile terminal's local storage or uploaded to the cloud server for storage; a storage instruction can also be sent to the projection terminal, instructing it to store the image work and project it for real-time display.
In the embodiment of the disclosure, a user edits an image work with a mobile terminal, generating a personalized image work through simple dragging and layout adjustment based on componentized templates, which can be used for digital visual display in home scenarios. In addition, the mobile terminal 103 supports uploading pictures or photos personalized by the user and embedding them into the componentized templates, realizing user-defined image templates and components and further improving the degree of personalization. Finally, through a customized private wireless screen-casting protocol, display synchronization between the mobile terminal 103 software side and the projection terminal side is achieved, so the process of editing an image work can be displayed in real time, achieving what-you-see-is-what-you-get and improving the interest and flexibility of projection.
Specifically, referring to fig. 4, fig. 4 is another schematic flow chart of an image editing method according to an embodiment of the present disclosure, where the method at least includes the following steps:
step 401 to step 403: the mobile terminal 103 generates a downloading request and sends the downloading request to the cloud server 101; the projection terminal 102 generates a downloading request and sends the downloading request to the cloud server 101; the cloud server 101 processes the download request.
In one embodiment, a user may select a desired image template and image component by operating the mobile terminal 103 or the projection terminal 102, generate a corresponding download request, and then send the download request to the cloud server 101. The cloud server 101 receives and processes download requests from the mobile terminal 103 and the projection terminal 102, respectively.
Step 404 to step 405: the mobile terminal 103 receives and stores a preset image template and a preset image component sent from the cloud server 101; the projection terminal 102 receives and stores the preset image template and the preset image component sent from the cloud server 101, generates a preset image, and projects the preset image onto a projection surface.
In one embodiment, the cloud server 101 receives the download requests from the mobile terminal 103 and the projection terminal 102 respectively, parses them, and determines the corresponding preset image template and preset image components in its database. A preset image is generated based on the preset image template and components and sent to the projection terminal 102 for projection. As shown in fig. 5, mobile interface i is the preset image, that is, an editing interface where the image work has not yet been edited; the preset image template is template 1, and the preset image components are component 1 (ellipse A), component 2 (heptagon B), component 3 (diamond C), and so on. The editing interface is projected onto the projection surface, as shown in projection interface i in fig. 5, so the user can visually see the editing interface from the projection surface, and the content displayed on the projection surface is the same as that displayed on the mobile terminal.
Step 406: the mobile terminal 103 creates a local image template and a local image component.
In one embodiment, a user may create custom image templates and components via mobile terminal 103. As shown in mobile interface i in fig. 5, the user can create a new image template according to preference by clicking the "+" in the template column. A newly created template has no components under it, so the user can then create a new image component by clicking the "+" in the component column; as shown in mobile interface ii in fig. 5, a component created under the custom template belongs to that custom template. The user can also create a new image component by clicking the "+" in the component column under an existing template in mobile interface i, for example under template 1, in which case the new component belongs to template 1. It should be noted that creating a new image template or image component can be accomplished by the user uploading a personalized picture of their own editing.
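The template-component hierarchy described above, in which every component belongs to exactly one template, can be modeled as below. This is a hypothetical in-memory sketch; the class and field names are assumptions, not structures defined by this disclosure.

```python
from dataclasses import dataclass, field

# Hypothetical model of the hierarchy: a template owns a background and a
# list of components; each component belongs to the template it was created
# under, mirroring the "+" actions in the mobile interface.
@dataclass
class ImageComponent:
    name: str
    shape: str
    position: tuple = (0, 0)

@dataclass
class ImageTemplate:
    name: str
    background: str = "white"
    components: list = field(default_factory=list)

    def add_component(self, component):
        self.components.append(component)

custom = ImageTemplate("custom-template")  # clicking "+" in the template column
custom.add_component(ImageComponent("X", "four-pointed star"))  # "+" in the component column
```

Creating a component under template 1 instead would simply mean calling `add_component` on a different `ImageTemplate` instance, which is how the ownership rule in the paragraph above falls out of the model.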
Step 407 to step 409: the projection terminal 102 generates an image resource request and sends it to the mobile terminal 103; after receiving the request, the mobile terminal 103 processes it, determines the local image template and local image components stored locally on the mobile terminal 103, generates a local image based on them, and then projects the local image onto the projection surface.
In one embodiment, besides downloading image templates and components from the cloud server 101, the projection terminal 102 may request user-created image templates and components from the mobile terminal 103. As shown in fig. 5, the local image here is an editing interface with a local image template and local image components: the local image template is the custom template in the figure, and the local image components are component 1 (four-pointed star X), component 2 (five-pointed star Y), component 3 (triangle Z), and so on under that custom template. The editing interface is projected onto the projection surface, as shown in projection interface ii in fig. 5, so the user can visually see the editing interface, the local image template, and the local image components from the projection surface; at this time the content displayed on the projection surface is consistent with that displayed on the mobile terminal.
Step 410 to step 413: the mobile terminal 103 generates a second editing request and sends the second editing request to the projection terminal 102, the projection terminal 102 processes the second editing request from the mobile terminal 103 and obtains the current projection parameters, and then the initial image template is determined according to the second editing request or the current projection parameters.
In an embodiment, a user sends an original image to be displayed to a projection terminal for displaying through a mobile terminal 103, the user can actively send an editing request to the projection terminal 102 through the mobile terminal 103 according to a display effect, and the projection terminal 102 can determine an initial image template according to the editing request sent by the user and adjust the original image.
In one embodiment, in addition to the user-autonomous adjustment of the image template, the projection terminal 102 may adaptively select an appropriate initial image template. After the original image is projected onto the projection surface, the projection terminal can acquire projection parameters of the current projection surface, such as the style, color temperature, brightness and the like of the projection surface, according to modules such as a sensor and the like, then adaptively match a most appropriate initial image template according to the projection parameters of the current projection surface, and adjust the original image through the initial image template, such as filter changing, color tone adjustment, color temperature adjustment and the like, so that the adjusted image can be better displayed on the projection surface.
Step 414 to step 416: the mobile terminal 103 generates a first image template and a first image component and sends the first image template and the first image component to the projection terminal 102, and the projection terminal 102 receives the first image template and the first image component from the mobile terminal 103, generates a first image based on the first image template and the first image component, and projects the first image onto a projection surface.
In an embodiment, as shown in fig. 6, a user performs an image editing operation on the mobile terminal 103, selecting template 1 and component 2 under template 1 (the heptagon image labeled B). The interface in the mobile terminal 103 is then shown as mobile interface ii: through template 1, the image work changes from the white image shown in mobile interface i to the gray image shown in mobile interface ii, and by adding component 2 the image work gains the white heptagon B on the gray background. Besides directly dragging component 2 under template 1 onto the gray image, the user can also change the color, size, and so on of the component. Here the first image template corresponds to template 1, and the first image component corresponds to component 2 under template 1.
In one embodiment, the first image template and first image component are sent to the projection terminal 102, and the first image is generated based on them. As shown in fig. 6, projection interface i on the projection terminal side is the interface after receiving the first image template and first image component transmitted by the mobile terminal 103; it is substantially the same as the image editing interface on the mobile terminal 103 side, except that the image size is adjusted to fit the current projection surface.
Step 417 to step 420: the mobile terminal 103 generates a first editing request and sends the first editing request to the projection terminal 102, the projection terminal 102 processes the first editing request from the mobile terminal 103, then adjusts the first image according to the first editing request or the current projection parameter to obtain a second image, then sends a second image template and a second image component corresponding to the second image to the mobile terminal 103, and the mobile terminal 103 receives the second image template and the second image component from the projection terminal 102.
In an embodiment, the projection terminal 102 obtains the projection parameters of the current projection surface, such as its style, color temperature, and brightness, through modules such as sensors, and then adaptively matches the most suitable image template according to those parameters. If the projection surface is light-colored, as shown in fig. 6, the projection terminal 102 selects template 2 and adjusts projection interface i through template 2, for example by changing the filter or adjusting the tone and color temperature: the gray image in projection interface i is adjusted to the black image in projection interface ii, and the original heptagon B is adjusted to the patterned heptagon E, so that the adjusted image displays better on the projection surface. Projection interface ii is then sent to the mobile terminal 103, yielding an interface such as mobile interface iii in fig. 6. Here the first image refers to the image work formed by the gray image and the white heptagon B in projection interface i, and the second image refers to the image work formed by the black image and the patterned heptagon E; the second image template is the black image background, and the second image component is the patterned heptagon E.
In one embodiment, as shown in fig. 6, the user performs an image editing operation on the mobile terminal 103, generating a first editing request corresponding to template 2 and component 1 under template 2 (the patterned ellipse labeled D). The interface in the mobile terminal 103 is then shown as mobile interface iv: through template 2, the image work goes from the black image shown in mobile interface iii to the black image shown in mobile interface iv, that is, the template and image background are unchanged, and by adding component 1 the image work gains the patterned ellipse D alongside the patterned heptagon E on the black background. Besides directly dragging component 1 under template 2 onto the black image, the user can also change the color, size, and so on of the component. Here the first image refers to the image work consisting of the black image and the patterned heptagon E, and the second image refers to the image work consisting of the black image, the patterned heptagon E, and the patterned ellipse D; the second image template is the black image background, and the second image component is the set of components formed by the heptagon E and the ellipse D.
It should be noted that adjusting the first image component into the second image component may be a simple drag of the corresponding component, that is, only the position of the first image component is changed, or a new component may be added on the basis of the first image component, the two together forming the second image component.
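The two adjustment paths just described, a drag that changes only position and an addition that extends the component set, can be sketched with plain dictionaries. The field names and coordinates are assumptions for illustration.

```python
# Sketch of the two component-adjustment paths: a drag rewrites only the
# position of the named component; an add appends a new component. Both
# return a new component list rather than mutating the input.
def drag_component(components, name, new_position):
    return [dict(c, position=new_position) if c["name"] == name else c
            for c in components]

def add_component(components, new_component):
    return components + [new_component]

# First image component: the patterned heptagon E on the black background.
first = [{"name": "E", "shape": "heptagon", "position": (40, 25)}]
# Second image component: E dragged to a new spot plus the new ellipse D.
second = add_component(drag_component(first, "E", (50, 30)),
                       {"name": "D", "shape": "ellipse", "position": (20, 60)})
```

Returning new lists rather than mutating in place also makes it simple for a terminal to keep the previous state around, which suits the iterative edit-and-sync loop the disclosure describes.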
Step 421 to step 422: the mobile terminal 103 generates a storage instruction and sends the storage instruction to the projection terminal 102, and the projection terminal 102 processes the storage instruction and stores the second image in the database.
In one embodiment, the user performs a storage operation on the edited image work through the mobile terminal 103 and sends a storage instruction to the projection terminal, so that the projection terminal stores the edited image work in its local database.
In one embodiment, mobile terminal 103 may also store the edited image work in a local database. In addition, the mobile terminal 103 can also upload the edited image works to the cloud server 101 for storage.
Step 423 to step 425: the mobile terminal generates a display request and sends the display request to the projection terminal 102, the projection terminal 102 processes the display request, and finally determines a corresponding display image in the database according to the display request or the current projection surface parameter and projects the display image to the projection surface.
In one embodiment, the user may select an image in the local database for projection presentation via the mobile terminal 103.
In an embodiment, since the local database of the projection terminal 102 stores image works, the projection terminal 102 can also select the most suitable image work to display according to the projection parameters of the current projection plane.
It should be noted that throughout the image editing process, whether performed on the mobile terminal 103 or the projection terminal 102, the image editing interface and the edited image work can be displayed synchronously in real time on both the mobile terminal 103 and the projection terminal 102. Editing and display form a double-ended loop: the user edits the image work through the mobile terminal, and the projection terminal shows the current edit by projecting the current interface; the projection terminal adaptively adjusts the image on the projection surface, and the adjusted image and the adjustment process can also be sent to the mobile terminal, which can continue editing on that basis until editing is finished, improving the flexibility and intelligence of projection.
Based on the content of the above embodiments, the embodiments of the present disclosure provide an image editing apparatus, which may be disposed in a projection terminal. The image editing apparatus is configured to execute the image editing method provided in the foregoing method embodiment, and specifically, referring to fig. 7, the apparatus includes:
a first receiving module 701, configured to receive a first image template and a first image component from a mobile terminal;
a projection module 702, configured to generate a first image based on the first image template and a first image component, and project the first image onto a projection surface;
the image adjusting module 703 is configured to acquire the projection surface image of the projection surface and adjust the first image accordingly to obtain a second image;
a sending module 704, configured to send a second image template and a second image component corresponding to the second image to the mobile terminal, so that the mobile terminal synchronously displays the second image.
In one embodiment, the image adjustment module 703 includes:
the first acquisition module is used for acquiring current projection parameters, and the projection parameters comprise at least one of projection surface style, color temperature and brightness;
the first adjusting module is used for adjusting at least one of the first image template and the first image component according to the projection parameters to obtain a second image template and a second image component;
a first generation module to generate a second image based on the second image template and the second image component.
In one embodiment, the image adjustment module 703 further comprises:
the second receiving module is used for receiving a first editing request from the mobile terminal;
a second adjusting module, configured to adjust at least one of the first image template and the first image component according to the first editing request, so as to obtain a second image template and a second image component;
a second generation module to generate a second image based on the second image template and the second image component.
In one embodiment, the image editing apparatus may further include:
a third receiving module, configured to receive a second editing request from the mobile terminal;
the first determining module is used for determining an initial image template and an initial image assembly according to the second editing request;
and the third generation module is used for generating an initial image based on the initial image template and the initial image assembly and projecting the initial image to the projection surface.
In one embodiment, the image editing apparatus may further include:
the connection module is used for establishing connection with the cloud end;
the downloading module is used for downloading a preset image template and a preset image assembly from the cloud end;
and the fourth generation module is used for generating a preset image based on the preset image template and a preset image assembly and projecting the preset image to the projection surface.
In one embodiment, the image editing apparatus may further include:
the resource request module is used for sending an image resource request to the mobile terminal;
the fourth receiving module is used for receiving the local image template and the local image assembly returned by the mobile terminal based on the image resource request;
and the fifth generation module is used for generating a local image based on the local image template and the local image component and projecting the local image to the projection surface.
In one embodiment, the image editing apparatus may further include:
and the storage module is used for storing the second image to the database after receiving a storage instruction from the mobile terminal.
In one embodiment, the image editing apparatus may further include:
the second acquisition module is used for acquiring current projection parameters, and the projection parameters comprise at least one of projection surface style, color temperature and brightness;
and the second determining module is used for determining, in the database, a display image corresponding to the projection parameters and projecting the display image onto the projection surface.
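As an illustrative sketch of this parameter-matched lookup (the database layout and the distance metric are assumptions, not from the disclosure), a stored display image could be selected by comparing the projection parameters saved with each image against the current ones:

```python
def select_display_image(database, current_params):
    """Pick the stored image whose saved projection parameters best match
    the current ones, using a simple L1 distance (purely illustrative)."""
    def distance(saved):
        keys = ("color_temperature", "brightness")
        return sum(abs(saved.get(k, 0) - current_params.get(k, 0)) for k in keys)

    return min(database, key=lambda entry: distance(entry["params"]))
```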
In one embodiment, the image editing apparatus may further include:
a fifth receiving module, configured to receive a display request from the mobile terminal;
and the third determining module is used for determining a display image corresponding to the display request from the database according to the display request and projecting the display image to the projection surface.
The image editing apparatus of the embodiment of the present disclosure may be configured to execute the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects thereof are similar, and are not described herein again.
Unlike the prior art, the image editing apparatus provided by the present disclosure is provided with an image adjusting module, through which the first image is adjusted and optimized into the second image according to the projection surface image of the projection surface. The first image may be adjusted either in response to an editing request sent by the mobile terminal or adaptively, by acquiring the projection parameters of the projection surface. Through this image editing apparatus, real-time two-way interaction between the mobile terminal and the projection terminal is realized, so that both ends can display the image editing process in real time, improving the interest, flexibility, and intelligence of the projection process.
Correspondingly, an embodiment of the present disclosure also provides a projection device. As shown in fig. 8, the projection device may include components such as a processor 801 with one or more processing cores, a Wireless Fidelity (WiFi) module 802, a memory 803 with one or more computer-readable storage media, an audio circuit 804, a display unit 805, an input unit 806, a sensor 807, a power supply 808, and a Radio Frequency (RF) circuit 809. Those skilled in the art will appreciate that the projection device configuration illustrated in fig. 8 does not constitute a limitation of the projection device, which may include more or fewer components than illustrated, combine some components, or arrange the components differently. Wherein:
the processor 801 is a control center of the projection apparatus, connects various parts of the entire projection apparatus using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 803 and calling data stored in the memory 803, thereby monitoring the projection apparatus as a whole. In one embodiment, processor 801 may include one or more processing cores; preferably, the processor 801 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 801.
WiFi is a short-range wireless transmission technology. Through the WiFi module 802, the terminal can help a user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 8 shows the WiFi module 802, it is understood that the module is not an essential part of the terminal and may be omitted as needed without changing the essence of the invention.
The memory 803 may be used to store software programs and modules, and the processor 801 executes various functional applications and data processing by running the computer programs and modules stored in the memory 803. The memory 803 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the projection device (such as audio data and image data). Further, the memory 803 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 803 may also include a memory controller to provide the processor 801 and the input unit 806 with access to the memory 803.
The audio circuit 804, which includes a speaker and a microphone, may provide an audio interface between a user and the projection device. The audio circuit 804 may transmit an electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 804 and converted into audio data. The audio data is then processed by the processor 801 and sent, for example, to another terminal via the RF circuit 809, or output to the memory 803 for further processing. The audio circuit 804 may also include an earbud jack to allow peripheral headphones to communicate with the terminal.
The display unit 805 may be used to display information input by or provided to the user, as well as the various graphical user interfaces of the projection device, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 805 may include a display panel, and in one embodiment, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, a touch-sensitive surface may overlay the display panel; when a touch operation is detected on or near the touch-sensitive surface, it is transmitted to the processor 801 to determine the type of touch event, and the processor 801 then provides a corresponding visual output on the display panel according to the type of touch event. Although in fig. 8 the touch-sensitive surface and the display panel are shown as two separate components implementing the input and output functions, in some embodiments they may be integrated to implement both.
The input unit 806 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In one particular embodiment, the input unit 806 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or touch pad, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predetermined program. In one embodiment, the touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 801, and can also receive and execute commands sent by the processor 801. The touch-sensitive surface may be implemented using resistive, capacitive, infrared, surface acoustic wave, and other types. Besides the touch-sensitive surface, the input unit 806 may include other input devices, including but not limited to one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The projection device may also include at least one sensor 807, such as a light sensor and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the projected image according to the brightness of ambient light. Other sensors such as a hygrometer, a thermometer, and an infrared sensor may also be configured for the projection device, and are not described herein.
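As a hedged illustration of how an ambient light sensor reading might drive the brightness of the projected image (the lux anchors and the linear ramp below are assumptions for illustration, not part of the disclosure):

```python
def projected_brightness(ambient_lux, min_level=0.2, max_level=1.0):
    """Map an ambient-light sensor reading (lux) to a normalized projector
    brightness level, ramping linearly between two illustrative anchors."""
    lo, hi = 50.0, 500.0  # assumed dark-room and bright-room lux anchors
    if ambient_lux <= lo:
        return min_level
    if ambient_lux >= hi:
        return max_level
    t = (ambient_lux - lo) / (hi - lo)
    return min_level + t * (max_level - min_level)
```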
The projection device also includes a power supply 808 (e.g., a battery) for powering the various components. Preferably, the power supply 808 is logically coupled to the processor 801 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 808 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
The RF circuit 809 may be used for receiving and transmitting signals during message transmission or communication; in particular, it receives downlink information from a base station and delivers it to the one or more processors 801 for processing, and transmits uplink data to the base station. Generally, the RF circuit 809 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 809 can also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Message Service (SMS), and the like.
Although not shown, the projection device may further include a bluetooth module, etc., which will not be described herein. Specifically, in this embodiment, the processor 801 in the projection device loads an executable file corresponding to a process of one or more application programs into the memory 803 according to the following instructions, and the processor 801 runs the application programs stored in the memory 803, so as to implement the following functions:
receiving a first image template and a first image assembly from a mobile terminal;
generating a first image based on the first image template and a first image assembly, and projecting the first image to a projection surface;
acquiring a projection surface image of the projection surface, and adjusting the first image according to the projection surface image to obtain a second image;
and sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description, and are not described herein again.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present disclosure provide a computer-readable storage medium storing a plurality of instructions that can be loaded by a processor to implement the following functions:
receiving a first image template and a first image assembly from a mobile terminal;
generating a first image based on the first image template and a first image assembly, and projecting the first image to a projection surface;
acquiring a projection surface image of the projection surface, and adjusting the first image according to the projection surface image to obtain a second image;
and sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the computer-readable storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the computer-readable storage medium can execute the steps in any method provided by the embodiments of the present disclosure, the beneficial effects that can be achieved by any method provided by the embodiments of the present disclosure can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
Meanwhile, the disclosed embodiments provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations described above. For example, the following functions are implemented:
receiving a first image template and a first image assembly from a mobile terminal;
generating a first image based on the first image template and a first image assembly, and projecting the first image to a projection surface;
acquiring a projection surface image of the projection surface, and adjusting the first image according to the projection surface image to obtain a second image;
and sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
The image editing method, the image editing apparatus, the projection device, and the computer-readable storage medium provided by the embodiments of the present disclosure are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present disclosure, and the description of the embodiments is only intended to help understand the method of the present disclosure and its core ideas. Meanwhile, those skilled in the art may, following the ideas of the present disclosure, make changes to the specific embodiments and the application scope. In summary, the content of this description should not be construed as limiting the present disclosure.

Claims (10)

1. An image editing method, comprising:
receiving a first image template and a first image assembly from a mobile terminal;
generating a first image based on the first image template and a first image assembly, and projecting the first image to a projection surface;
acquiring a projection surface image of the projection surface, and adaptively adjusting the first image according to projection parameters to obtain a second image, wherein the projection surface image comprises a non-projection image in the projection surface and a display image of the first image in the projection surface;
and sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
2. The image editing method according to claim 1, wherein the step of acquiring a projection surface image of the projection surface and adaptively adjusting the first image according to projection parameters to obtain a second image comprises:
acquiring current projection parameters, wherein the projection parameters comprise at least one of projection surface style, color temperature and brightness;
according to the projection parameters, at least one of the first image template and the first image assembly is adjusted in a self-adaptive mode to obtain a second image template and a second image assembly;
generating a second image based on the second image template and the second image component.
3. The image editing method according to claim 1, further comprising, before the step of receiving the first image template and the first image component from the mobile terminal:
receiving a second editing request from the mobile terminal;
determining an initial image template and an initial image assembly according to the second editing request;
and generating an initial image based on the initial image template and an initial image component, and projecting the initial image to the projection surface.
4. The image editing method according to any one of claims 1 to 3, further comprising, before the step of receiving the first image template and the first image component from the mobile terminal:
establishing connection with a cloud end;
downloading a preset image template and a preset image assembly from the cloud;
and generating a preset image based on the preset image template and a preset image assembly, and projecting the preset image to the projection surface.
5. The image editing method according to any one of claims 1 to 3, further comprising, before the step of receiving the first image template and the first image component from the mobile terminal:
sending an image resource request to the mobile terminal;
receiving a local image template and a local image assembly returned by the mobile terminal based on the image resource request;
and generating a local image based on the local image template and a local image component, and projecting the local image to the projection surface.
6. The image editing method according to claim 1, further comprising, after the step of sending the second image template and the second image component corresponding to the second image to the mobile terminal:
and after receiving a storage instruction from the mobile terminal, storing the second image in a database.
7. The image editing method according to claim 6, further comprising, after the step of storing the second image in a database:
acquiring current projection parameters, wherein the projection parameters comprise at least one of projection surface style, color temperature and brightness;
and determining a display image corresponding to the projection parameters in the database according to the projection parameters and projecting the display image to the projection surface.
8. The image editing method according to claim 6, further comprising, after the step of storing the second image in a database:
receiving a display request from the mobile terminal;
and determining a display image corresponding to the display request from the database according to the display request and projecting the display image to the projection surface.
9. An image editing apparatus characterized by comprising:
the first receiving module is used for receiving a first image template and a first image assembly from the mobile terminal;
the projection module is used for generating a first image based on the first image template and the first image assembly and projecting the first image to a projection surface;
the image adjusting module is used for acquiring a projection surface image of the projection surface and adaptively adjusting the first image according to projection parameters to obtain a second image, wherein the projection surface image comprises a non-projection image in the projection surface and a display image of the first image in the projection surface;
and the sending module is used for sending a second image template and a second image component corresponding to the second image to the mobile terminal so that the mobile terminal synchronously displays the second image.
10. A projection device, comprising: a processor and a memory, the memory for storing instructions and data, the processor being configured to execute the instructions in the memory to perform the steps in the image editing method of any of claims 1 to 8.
CN202110329990.3A 2021-03-29 2021-03-29 Image editing method and device and projection equipment Active CN112711370B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110329990.3A CN112711370B (en) 2021-03-29 2021-03-29 Image editing method and device and projection equipment


Publications (2)

Publication Number Publication Date
CN112711370A 2021-04-27
CN112711370B 2021-07-02

Family

ID=75550398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110329990.3A Active CN112711370B (en) 2021-03-29 2021-03-29 Image editing method and device and projection equipment

Country Status (1)

Country Link
CN (1) CN112711370B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106507076A (en) * 2016-11-25 2017-03-15 重庆杰夫与友文化创意有限公司 A kind of projecting method, apparatus and system
CN110928126A (en) * 2019-12-09 2020-03-27 四川长虹电器股份有限公司 Projection equipment capable of automatically adjusting brightness
CN110996086A (en) * 2019-12-26 2020-04-10 成都极米科技股份有限公司 Projection brightness adjusting method and related device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004198858A (en) * 2002-12-20 2004-07-15 Casio Comput Co Ltd Projector
CN101324804B (en) * 2007-06-14 2014-11-19 深圳市巨龙科教高技术股份有限公司 Large-screen interactive electronic white board and interactive method thereof
JP2011141411A (en) * 2010-01-07 2011-07-21 Seiko Epson Corp Projector and method of controlling the same
US9383888B2 (en) * 2010-12-15 2016-07-05 Microsoft Technology Licensing, Llc Optimized joint document review
CN107589876B (en) * 2017-09-27 2020-06-30 深圳如果技术有限公司 Projection system and method


Also Published As

Publication number Publication date
CN112711370A (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN107369197B (en) Picture processing method, device and equipment
CN110990741B (en) Page display method and device, electronic equipment, server and storage medium
EP4130963A1 (en) Object dragging method and device
CN106610826B (en) Method and device for manufacturing online scene application
CN110795666B (en) Webpage generation method, device, terminal and storage medium
AU2010287251B2 (en) Method for providing control widget and device using the same
US20130263011A1 (en) Control of computing devices and user interfaces
CN108566332B (en) Instant messaging information processing method, device and storage medium
US10972360B2 (en) Dynamic design of a lighting configuration
CN107332976B (en) Karaoke method, device, equipment and system
CN106780684B (en) Animation effect realization method and device
CN105446726A (en) Method and device for generating webpage
CN105978766A (en) Device, system and method for operating electric appliance through employing mobile terminal
CN107436712B (en) Method, device and terminal for setting skin for calling menu
CN107666406B (en) Intelligent card display method and device
TW201321999A (en) Electronic device and method for collaborating editing by a plurality of mobile devices
CN110458921B (en) Image processing method, device, terminal and storage medium
CN111124412A (en) Game page drawing method, device, equipment and storage medium
CN109614173B (en) Skin changing method and device
CN106789556B (en) Expression generation method and device
CN103279285A (en) Scenery color picking based theme switching method and mobile terminal
CN111128252B (en) Data processing method and related equipment
CN112711370B (en) Image editing method and device and projection equipment
CN116594616A (en) Component configuration method and device and computer readable storage medium
CN105320532B (en) Method, device and terminal for displaying interactive interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant