CN115562543A - Control method of electronic equipment and electronic equipment - Google Patents

Control method of electronic equipment and electronic equipment

Info

Publication number
CN115562543A
Authority
CN
China
Prior art keywords
interface
image
control
displaying
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210023806.7A
Other languages
Chinese (zh)
Other versions
CN115562543B (en)
Inventor
鲍新彤
王龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210023806.7A
Priority to CN202310965343.0A
Publication of CN115562543A
Application granted
Publication of CN115562543B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/12 - Digital output to print unit, e.g. line printer, chain printer
    • G06F 3/1201 - Dedicated interfaces to print systems
    • G06F 3/1202 - Dedicated interfaces to print systems specifically adapted to achieve a particular effect
    • G06F 3/1203 - Improving or facilitating administration, e.g. print management
    • G06F 3/1205 - Improving or facilitating administration, e.g. print management resulting in increased flexibility in print job configuration, e.g. job settings, print requirements, job tickets
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/12 - Digital output to print unit, e.g. line printer, chain printer
    • G06F 3/1201 - Dedicated interfaces to print systems
    • G06F 3/1223 - Dedicated interfaces to print systems specifically adapted to use a particular technique
    • G06F 3/1237 - Print job management
    • G06F 3/1253 - Configuration of print job parameters, e.g. using UI at the client
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a control method of an electronic device and an electronic device. The method can provide an entertaining printing function, helps improve the degree of intelligence of the electronic device during printing, and ultimately improves the user experience. The method includes: displaying a first interface, where the first interface includes a first image; in response to a first operation on the first interface, displaying a second interface, where the second interface includes a second image and a first control, and the second image is different from the first image; in response to a second operation on the first control, displaying a third interface, where the third interface includes a third image; in response to a third operation on the third interface, displaying a fourth interface, where the fourth interface includes a fourth image and a second control, and the fourth image is different from the third image; and in response to a fourth operation on the second control, displaying a fifth interface, where the fifth interface includes the second image and the fourth image.

Description

Control method of electronic equipment and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a control method for an electronic device and an electronic device.
Background
The printing services that current electronic devices can provide are relatively limited and can hardly meet users' demands.
Disclosure of Invention
The present application provides a control method of an electronic device and an electronic device, which can provide an entertaining printing function, help improve the degree of intelligence of the electronic device during printing, and ultimately improve the user experience.
In a first aspect, an embodiment of the present application provides a control method of an electronic device. The method may be applied to the electronic device and includes: displaying a first interface, where the first interface includes a first image, and the first image is used to display the upper line of a couplet; the user edits the upper line in the first interface, and in response to a first operation on the first interface, a second interface is displayed, where the second interface includes a second image and a first control, the second image corresponds to the upper line of the couplet and is different from the first image, and the second image may be the edited upper line; in response to a second operation on the first control, a third interface is displayed, where the third interface includes a third image, and the third image corresponds to the lower line of the couplet; the user edits the lower line in the third interface, and in response to a third operation on the third interface, a fourth interface is displayed, where the fourth interface includes a fourth image and a second control, the fourth image corresponds to the lower line of the couplet and is different from the third image, and the fourth image may be the edited lower line; in response to a fourth operation on the second control, a fifth interface is displayed, where the fifth interface includes the second image and the fourth image; the fifth interface may be a preview interface, so the edited upper line and lower line can be previewed together in one interface.
In this way, the electronic device in this embodiment provides a couplet editing function: the user can view and edit the upper line and the lower line of the couplet separately and then preview them together, which makes it convenient to edit and preview couplets on the electronic device and improves the user experience.
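For illustration only, the following plain-Java sketch models the order of the interfaces described in the first aspect. The class, enum, and method names are assumptions made for this example; they only show the sequence of interfaces and are not part of the claimed method.

```java
// A minimal sketch of the interface flow: edit upper line -> confirm -> edit lower line -> confirm -> preview.
public class CoupletEditingFlow {

    enum Screen {
        FIRST_INTERFACE,   // shows the first image (upper line to be edited)
        SECOND_INTERFACE,  // shows the second image (edited upper line) and the first control
        THIRD_INTERFACE,   // shows the third image (lower line to be edited)
        FOURTH_INTERFACE,  // shows the fourth image (edited lower line) and the second control
        FIFTH_INTERFACE    // preview: second image and fourth image shown together
    }

    private Screen current = Screen.FIRST_INTERFACE;

    // Each "operation" in the claim advances the flow to the next interface.
    public Screen onOperation() {
        switch (current) {
            case FIRST_INTERFACE:  current = Screen.SECOND_INTERFACE; break; // first operation
            case SECOND_INTERFACE: current = Screen.THIRD_INTERFACE;  break; // second operation on the first control
            case THIRD_INTERFACE:  current = Screen.FOURTH_INTERFACE; break; // third operation
            case FOURTH_INTERFACE: current = Screen.FIFTH_INTERFACE;  break; // fourth operation on the second control
            default: break; // the fifth interface is the preview; the flow ends here
        }
        return current;
    }

    public static void main(String[] args) {
        CoupletEditingFlow flow = new CoupletEditingFlow();
        for (int i = 0; i < 4; i++) {
            System.out.println(flow.onOperation());
        }
    }
}
```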
In one possible embodiment, the first interface further includes a third control; in response to a fifth operation on the third control, a sixth interface is displayed, where the sixth interface includes a fifth image, and the fifth image corresponds to the horizontal scroll of the couplet; the user can edit the horizontal scroll in the sixth interface, and in response to a sixth operation on the sixth interface, a seventh interface is displayed, where the seventh interface includes a sixth image and a second control, the sixth image corresponds to the horizontal scroll of the couplet, and the sixth image is different from the fifth image; the fifth interface further includes the sixth image.
In one possible embodiment, the first interface further includes a fourth control, which may be a writing control. In response to a seventh operation on the fourth control, an eighth interface on which the user can write is displayed, where the eighth interface includes a seventh image, a first writing area, a fifth control, and a sixth control, and the content of the seventh image is the same as that of the first image. In response to an eighth operation on the first writing area, an eighth image is displayed in the first writing area according to the touch trajectory of the eighth operation; in response to a ninth operation on the fifth control, a ninth image corresponding to the eighth image is displayed at a first position of the seventh image; and in response to a tenth operation on the sixth control, a print instruction is sent to the printing device.
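As an illustration of how a writing area can record a touch trajectory on Android, the following sketch shows a custom View that accumulates touch points into a Path and redraws it. It is a minimal example under the assumption that the writing area is implemented as a View; the class name and stroke settings are placeholders, not the patent's implementation.

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Path;
import android.view.MotionEvent;
import android.view.View;

public class WritingAreaView extends View {
    private final Path strokePath = new Path();
    private final Paint strokePaint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public WritingAreaView(Context context) {
        super(context);
        strokePaint.setStyle(Paint.Style.STROKE);
        strokePaint.setStrokeWidth(12f);
        strokePaint.setColor(Color.BLACK);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                strokePath.moveTo(event.getX(), event.getY()); // start of the touch trajectory
                return true;
            case MotionEvent.ACTION_MOVE:
                strokePath.lineTo(event.getX(), event.getY()); // extend the trajectory
                invalidate();
                return true;
            case MotionEvent.ACTION_UP:
                invalidate();
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        canvas.drawPath(strokePath, strokePaint); // render the accumulated touch trajectory
    }
}
```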
In one possible embodiment, before the print instruction is sent to the printing device, the method further includes: displaying prompt information, where the prompt information includes size information of the printing paper that matches the first image.
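A minimal sketch of one way such a paper-size suggestion could be derived is shown below, assuming the match is made by comparing aspect ratios; the candidate sizes, class name, and selection rule are illustrative assumptions only.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class PaperSizeHint {

    // Candidate sizes in millimetres; the list is an illustrative assumption.
    record Paper(String name, double widthMm, double heightMm) {}

    static final List<Paper> CANDIDATES = Arrays.asList(
            new Paper("A4", 210, 297),
            new Paper("A3", 297, 420),
            new Paper("4x6 in photo", 102, 152));

    // Pick the paper whose aspect ratio is closest to the image's aspect ratio.
    static Paper matchFor(int imageWidthPx, int imageHeightPx) {
        double imageRatio = (double) imageHeightPx / imageWidthPx;
        return CANDIDATES.stream()
                .min(Comparator.comparingDouble(
                        p -> Math.abs(Math.log(p.heightMm() / p.widthMm()) - Math.log(imageRatio))))
                .orElseThrow();
    }

    public static void main(String[] args) {
        Paper hint = matchFor(3000, 4000);
        System.out.println("Suggested printing paper: " + hint.name());
    }
}
```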
In one possible implementation, after the eighth image is displayed in the first writing area according to the touch trajectory of the eighth operation, a second writing area is displayed in response to the ninth operation on the fifth control, and the user can write the next character in the second writing area; in response to an eleventh operation on the second writing area, a tenth image is displayed in the second writing area according to the touch trajectory of the eleventh operation; and in response to a twelfth operation on the fifth control, an eleventh image corresponding to the tenth image is displayed at a second position of the seventh image, where the second position is different from the first position.
In a possible implementation, the eighth interface includes a seventh control, and the user can modify a character already displayed on the couplet by triggering the seventh control. After the ninth image corresponding to the eighth image is displayed at the first position of the seventh image in response to the ninth operation on the fifth control, a third writing area is displayed in response to a thirteenth operation on the seventh control; in response to a fourteenth operation on the third writing area, a twelfth image is displayed in the third writing area according to the touch trajectory of the fourteenth operation; and in response to a fifteenth operation on the fifth control, a thirteenth image corresponding to the twelfth image is displayed at the first position of the seventh image.
In one possible embodiment, the second image in the second interface is smaller than the first image in the first interface. When the first writing area is displayed in the second interface, the first image is reduced and displayed as the second image so that the first writing area does not block the image.
In one possible embodiment, the first interface further includes a first option, where the first option corresponds to a background image; for example, the first option may be a background option. The electronic device receives a seventeenth operation of the user on the first option and displays, in the first image, the background image selected by the seventeenth operation.
In one possible embodiment, the first interface further includes a ninth control that can be triggered by the user. In response to an eighteenth operation on the ninth control, a ninth interface is displayed, where the ninth interface includes a fourteenth image, a second option, a third option, and a fourth option; the second option corresponds to text, the third option corresponds to a font, and the fourth option corresponds to a color. The electronic device receives a nineteenth operation of the user on the second option and displays, in the fourteenth image, the text corresponding to the nineteenth operation; receives a twentieth operation of the user on the third option and displays the text in the fourteenth image in the font corresponding to the twentieth operation; and receives a twenty-first operation of the user on the fourth option and displays the text in the fourteenth image in the color corresponding to the twenty-first operation.
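The following sketch illustrates how selected text, font, and color options could be rendered onto an image using standard Android drawing APIs; the class name, method signature, and placement of the text are assumptions for this example, not the patent's implementation.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Typeface;

public final class CoupletTextRenderer {
    private CoupletTextRenderer() {}

    // Draws the chosen text onto a copy of the base image using the chosen typeface and color,
    // mirroring the "text / font / color" options described above.
    public static Bitmap render(Bitmap baseImage, String text, Typeface font, int color, float textSizePx) {
        Bitmap result = baseImage.copy(Bitmap.Config.ARGB_8888, true); // mutable copy
        Canvas canvas = new Canvas(result);
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setTypeface(font);
        paint.setColor(color);
        paint.setTextSize(textSizePx);
        paint.setTextAlign(Paint.Align.CENTER);
        // Draw the text horizontally centered near the top of the image (illustrative placement).
        canvas.drawText(text, result.getWidth() / 2f, textSizePx * 1.5f, paint);
        return result;
    }
}
```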
In one possible embodiment, the third interface further includes a tenth control that can be triggered by the user. In response to a twenty-second operation on the tenth control, a tenth interface is displayed, where the tenth interface includes a fifteenth image, a text input area, and a text box; a twenty-third operation on the text input area is received, and the text content corresponding to the twenty-third operation is displayed in the text box; and in response to a twenty-fourth operation of the user on the text box, the text content is displayed on the fifteenth image.
In a second aspect, embodiments of the present application provide an electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of controlling the electronic device according to any one of the first aspect.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium includes a stored program, where the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the control method for an electronic apparatus according to any one of the first aspect.
In a fourth aspect, the present application provides a computer program product containing executable instructions that, when executed on a computer, cause the computer to perform the method for controlling an electronic device according to any one of the first aspect.
In a fifth aspect, an embodiment of the present application further provides a chip, where the chip is coupled to a memory and is configured to execute a computer program stored in the memory, to perform the method in any one of the possible implementations of any one of the above aspects.
It can be understood that the electronic device of the second aspect, the computer-readable storage medium of the third aspect, the computer program product of the fourth aspect, and the chip of the fifth aspect are all configured to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not described herein again.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 3a is a schematic diagram of a first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 3b is a schematic diagram of a second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 3c is a schematic diagram of a third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 3d is a schematic diagram of a fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 4a is a schematic diagram of a fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 4b is a schematic diagram of a sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 4c is a schematic diagram of a seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 4d is a schematic diagram of an eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 4e is a schematic diagram of a ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5a is a schematic diagram of a tenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5b is a schematic diagram of an eleventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5c is a schematic diagram of a twelfth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5d is a schematic diagram of a thirteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5e is a schematic diagram of a fourteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5f is a schematic diagram of a fifteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 5g is a schematic diagram of a sixteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 6a is a schematic diagram of a seventeenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 6b is a schematic diagram of an eighteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 6c is a schematic diagram of a nineteenth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7a is a schematic diagram of a twentieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7b is a schematic diagram of a twenty-first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7c is a schematic diagram of a twenty-second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7d is a schematic diagram of a twenty-third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7e is a schematic diagram of a twenty-fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7f is a schematic diagram of a twenty-fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 7g is a schematic diagram of a twenty-sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 8a is a schematic diagram of a twenty-seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 8b is a schematic diagram of a twenty-eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 8c is a schematic diagram of a twenty-ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 8d is a schematic diagram of a thirtieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 9a is a schematic diagram of a thirty-first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 9b is a schematic diagram of a thirty-second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 9c is a schematic diagram of a thirty-third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 9d is a schematic diagram of a thirty-fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 9e is a schematic diagram of a thirty-fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10a is a schematic diagram of a thirty-sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10b is a schematic diagram of a thirty-seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10c is a schematic diagram of a thirty-eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10d is a schematic diagram of a thirty-ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10e is a schematic diagram of a fortieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 10f is a schematic diagram of a forty-first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 11a is a schematic diagram of a forty-second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 11b is a schematic diagram of a forty-third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 12a is a schematic diagram of a forty-fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 12b is a schematic diagram of a forty-fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 13a is a schematic diagram of a forty-sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 13b is a schematic diagram of a forty-seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 14a is a schematic diagram of a forty-eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 14b is a schematic diagram of a forty-ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 14c is a schematic diagram of a fiftieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 14d is a schematic diagram of a fifty-first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 15 is a schematic diagram of a fifty-second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 16a is a schematic diagram of a fifty-third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 16b is a schematic diagram of a fifty-fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 16c is a schematic diagram of a fifty-fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17a is a schematic diagram of a fifty-sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17b is a schematic diagram of a fifty-seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17c is a schematic diagram of a fifty-eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17d is a schematic diagram of a fifty-ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17e is a schematic diagram of a sixtieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17f is a schematic diagram of a sixty-first graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17g is a schematic diagram of a sixty-second graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17h is a schematic diagram of a sixty-third graphical user interface of an electronic device according to an embodiment of the present application;
fig. 17i is a schematic diagram of a sixty-fourth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18a is a schematic diagram of a sixty-fifth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18b is a schematic diagram of a sixty-sixth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18c is a schematic diagram of a sixty-seventh graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18d is a schematic diagram of a sixty-eighth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18e is a schematic diagram of a sixty-ninth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 18f is a schematic diagram of a seventieth graphical user interface of an electronic device according to an embodiment of the present application;
fig. 19 is a flowchart of a control method of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: only A exists, both A and B exist, or only B exists. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
In the following, the terms "first", "second", and "third" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first", "second", or "third" may explicitly or implicitly include one or more such features.
The control method provided by the embodiments of the present application can be applied to terminal devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, and personal digital assistants (PDAs); the embodiments of the present application do not limit the specific type of the terminal device in any way.
For example, fig. 1 is a schematic structural diagram of an example of a terminal device 100 provided in the embodiment of the present application. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a hierarchical architecture as an example, and exemplarily illustrates a software structure of the terminal device 100.
Fig. 2 is a block diagram of a software configuration of the terminal device 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages. As shown in fig. 2, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library includes two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The kernel layer is a layer between hardware and software. The kernel layer at least includes a display driver, a camera driver, an audio driver, and a sensor driver.
The hardware layer may include various sensors, such as the sensors described in fig. 1 that are involved in the embodiments of the present application: acceleration sensors, gyroscope sensors, touch sensors, and the like.
With reference to the electronic device described in fig. 1 and fig. 2, in the embodiment of the present application, physical components related to the electronic device 100 mainly include hardware components such as a sensor, a Decision Support System (DSS) display chip, a touch display screen, and a fingerprint identification module; the screen management module, the display driver, the fingerprint driver, the false touch prevention kernel software layer and the like; application framework layer functions such as false touch input prevention, screen control, off-screen display (AOD) service, power management and the like; and application layer services such as special adaptation application (camera), three-party application, system hibernation, AOD, etc.
The workflow of the software and hardware of the electronic device 100 is exemplarily described below with reference to the control method according to the embodiments of the present application. The control method provided by the embodiments of the present application is mainly implemented through cooperation between the touch panel (TP) module, one or more of the physical components, and the layers of the software architecture of the electronic device 100.
The TP module receives the user's touch operation on the touch display screen and transmits it to the physical state recognition module of the system library, and the physical state recognition module monitors and recognizes the user's touch operation. The physical state recognition module then transmits the touch operation to the state machine management module of the electronic device, and the state machine management module controls the window management system of the FWK layer, so as to control a series of actions and displays of the electronic device.
In addition, the implementation of the whole process also requires the cooperation of other multiple modules and sensors, for example, a skin module of an application layer for controlling a touch screen display interface, and the like, which is not described in detail herein.
For ease of understanding, in the following embodiments of the present application, an electronic device having the structures shown in fig. 1 and fig. 2 is taken as an example, and the control method of the electronic device provided in the present application is specifically described with reference to the drawings and application scenarios. First, a printing scenario is taken as an example.
The electronic device has a function of generating a print job from a print object, where the print object may be an image, a file, a memo, or the like. Taking the case where the print object is an image as an example, the electronic device is installed with an album application for browsing images. The user can enter the album interface by clicking the icon of the album application; the album interface displays images according to a preset size and a preset arrangement rule, and an image in the album interface can be selected through a long press or another operation. Fig. 3a is a schematic diagram of selecting images in the album interface of the electronic device. After selecting image A, image B, and image C in the album interface, the user enters the interface 30a shown in fig. 3a. A sharing control 31 is provided in the interface 30a; the user clicks the sharing control 31, and the electronic device displays the interface 30b shown in fig. 3b. In the interface 30b, a print control 32 and some other controls 33 are provided, and the user triggers the print control 32 to enter a print job for the image object.
Taking the example that the print object is a file, the user enters the interface 30c shown in fig. 3c, displays the file manager in the interface 30c, selects the print object 36 through long press or other operations in the file manager, and enters the interface 30d shown in fig. 3d after triggering the sharing function of the print object 36. The interface 30d provides a job preview area 34, and the user can enter a print job corresponding to the document by triggering the print control 35 for the print object 36 in the job preview area 34.
The above-described manner of entering the print job is merely an example, and the present application does not limit the manner of entering the print job. The electronic device may also provide other ways to enter the print job, which are not described in detail herein.
As an example, on the basis of fig. 3b, after entering the print job corresponding to the image, an interface 40a as shown in fig. 4a can be obtained, where the interface 40a at least includes a preview area 41 and a control area 42.
The preview area 41 is used to display a preview result corresponding to the print object, i.e., a preview object. The preview area 41 includes a first preview object 411 that is fully displayed and a second preview object 412 that is not fully displayed, and the preview area 41 can be slid in the direction in which the preview objects are arranged to view the full preview objects. For example, the user may display a third preview object not displayed in interface 40a within preview area 41 by sliding to the left within preview area 41. Each preview object includes a selection control 413, and after entering the print job, the selection control of each preview object is in a selected state, and the user can cancel the selection of the preview object by clicking the selection control 413.
The control area 42 is used for displaying control controls, which may specifically include a start printing control 421 and a printing parameter control, and the printing parameter control may include: a printer selection control 422, a copy count selection control 423, and a range selection control 424, and the like. When the print parameters need to be adjusted, the control can be clicked to trigger the display of a control window on the interface 40a for selection by the user.
As an example, the control controls displayed in the control area 42 in the interface 40a may be common control controls, and in the printing scene of the image, may be a printer selection control 422, a copy number selection control 423, a range selection control 424, or other control controls. After the user confirms the print object and the print parameters, the start print control 421 can be clicked to trigger the electronic device to send a print instruction to the printing device.
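For reference, a hedged sketch of handing an image print job to the Android print framework is shown below; it uses the androidx.print.PrintHelper API, and the class name and job name are placeholders rather than the patent's actual implementation of sending a print instruction.

```java
import android.content.Context;
import android.graphics.Bitmap;
import androidx.print.PrintHelper;

public final class ImagePrintJobSender {
    private ImagePrintJobSender() {}

    // Hands the selected image to the system print framework, which forwards
    // the job to the currently connected printing device.
    public static void print(Context context, Bitmap selectedImage) {
        PrintHelper helper = new PrintHelper(context);
        helper.setScaleMode(PrintHelper.SCALE_MODE_FIT); // keep the whole image on the page
        helper.printBitmap("image_print_job", selectedImage);
    }
}
```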
Above the control area 42, there is a mark 43, which mark 43 is used to indicate that the control area 42 may provide other control controls, which are not shown, in addition to the control controls currently provided. As further shown in fig. 4b in conjunction with fig. 4a, the user slides a single point up within the interface 40a. In response to this sliding operation, the interface 40a changes to an interface 40c as shown in fig. 4 c. In the interface 40c, the area occupied by the control area 42 in the whole display interface becomes large, and the control controls hidden in the interface 40a are displayed: paper type selection control 425, paper size selection control 426, print mode selection control 427, and color mode selection control 428. When the user clicks the four control controls, the corresponding control windows can be displayed.
In 40c, the area of the preview area 41 in the display interface becomes smaller due to the enlargement of the control area 42. The preview object in the preview area 41 becomes correspondingly smaller. The preview area 41 displays a complete first preview object 411, a complete second preview object 412 and a non-complete third preview object 414. Similarly, a slide may be made in the direction in which the preview objects are arranged to display the complete third preview object in the preview area 41.
In the example of fig. 4c, control area 42 provides seven control controls. As an alternative example, to provide more control controls, a scroll bar may also be added in the control area 42 of 40c. As shown in FIG. 4d, in the interface 40d, the control area 42 displays seven control controls and a scroll bar 44, and the user pulls down on the scroll bar 44 to display hidden control controls such as: print direction, margin, etc.
As an example, the user clicks the copy number selection control 423 in the interface 40c, the display interface is 50a as shown in fig. 5a, a copy number selection window 51 is provided in the interface 50a, a copy number to be printed can be selected in the copy number selection window 51, then the determination control 511 is clicked, the copy number selection window 51 disappears, and the electronic device returns to the interface 40c.
As an example, the user clicks the paper type selection control 425 in the interface 40c, the interface 50b shown in fig. 5b is displayed, a paper selection window 52 is provided in the interface 50b, and the paper type to be printed can be selected in the paper selection window 52, for example, the user selects plain paper, the paper selection window 52 disappears, and the electronic device returns to the interface 40c, and the paper type is plain paper. As another example, the user has clicked the cancel control 521 and the electronic device returns to the interface 40c with the paper type unchanged from before the paper selection window 52 pops up.
As an example, the user clicks the printing mode selection control 427 in the interface 40c to display the interface 50c as shown in fig. 5c, a printing mode selection window 53 is provided in the interface 50c, and a printing mode required including double-sided printing or single-sided printing can be selected in the printing mode selection window 53, for example, in 50c, the user selects single-sided printing, the paper selection window 53 disappears, the electronic device returns to the interface 40c, and the printing mode is single-sided printing. For another example, the user clicks the cancel control 531, and the electronic device returns to the interface 40c, and the print mode is unchanged from that before the print mode selection window 53 pops up.
As an example, when the first preview object, the second preview object, and the third preview object have all been selected, the user clicks the range selection control 424 in the interface 40c, and the interface 50d shown in fig. 5d is displayed. A first range selection window 54 is provided in the interface 50d, and the range to be printed can be selected in the first range selection window 54; the first range selection window can provide an option to print all pages, or options to select the start page and the end page respectively. For example, in the interface 50d, the user selects printing all pages. After selecting the print range, the user clicks the determination control 541 in the interface 50d, the first range selection window 54 disappears, and the electronic device returns to the interface 40c.
As another example, the first preview object and the third preview object have been selected, and the second preview object has not been selected. In this scenario, the user clicks the range selection control 424 in the interface 40c, and the interface 50e shown in fig. 5e is displayed. The interface 50e provides a second range selection window 55 different from the first range selection window 54; the second range selection window 55 provides an all-print option 551, a custom print option 552, and a continuous print option 553. Optionally, the user selects the all-print option 551 and clicks the determination control 554, the second range selection window 55 disappears, and the electronic device returns to the interface 40c. Optionally, the user selects the custom print option 552 and clicks the determination control 554, the second range selection window 55 disappears, and the electronic device returns to the interface 40c; since the user selected the first preview object and the third preview object before entering the second range selection window 55, the print range corresponding to the custom print option 552 is the images corresponding to the first preview object and the third preview object. Optionally, the user selects the continuous print option 553, selects the start page in the start-page selection area and the end page in the end-page selection area, and after the user clicks the determination control 554, the second range selection window 55 disappears and the electronic device returns to the interface 40c.
As an example, the user clicks on the paper size selection control 426 in the interface 40c, the interface shown in fig. 5f is displayed, a paper size selection window 56 is provided in the interface 50f, and the paper size provided in the paper size selection window 56 is the paper size that can be provided by the printing apparatus currently connected to the electronic apparatus. For example, the user selects A4, the paper selection window 56 disappears, and the electronic device returns to the interface 40c with a paper size of A4. As another example, the user clicks the cancel control 561 and the electronic device returns to the interface 40c and the paper size does not change from before the paper size selection window 56 pops up.
As an example, the user clicks the color mode selection control 428 in the interface 40c, and the interface 50g shown in fig. 5g is displayed. A color mode selection window 57 is provided in the interface 50g, and the color mode selection window 57 provides both color and black-and-white mode options. Optionally, the user selects the black-and-white mode; the preview objects in the preview area 41 are then displayed in the black-and-white mode, the color mode selection window 57 disappears, and the electronic device returns to the interface 40c. Optionally, the user clicks the cancel control 571, the color mode selection window 57 disappears, and the electronic device returns to the interface 40c, with the color mode unchanged from before the color mode selection window 57 popped up.
Continuing with FIG. 4c, the control area 42 has a mark 45 on the top, the mark 45 being used to indicate that the control area 42 can be retracted downward. As an example, as shown in FIG. 4e, the user slides a single point down in the interface 40c, the control area 42 shrinks downward, the control controls within the control area 42 do not change in size and decrease in number, and the electronic device displays the interface shown in FIG. 4 a.
In the solution provided in the above embodiment, both the control area and the preview area are expandable or collapsible areas, and when the user expands or collapses the control area, the size of the preview objects in the preview area changes accordingly: when the control area is collapsed, the preview objects are larger; when the control area is expanded, the preview objects are smaller, and the preview objects may also change according to the controls in the control area. Based on this solution, complete controls are provided while the control area is prevented from blocking the preview objects.
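One possible way to realize such an expandable control area on Android is a bottom sheet whose slide progress drives the preview size, as in the following sketch; the use of BottomSheetBehavior, the class name, and the scaling factors are assumptions for illustration, not the implementation described in the patent.

```java
import android.view.View;
import android.widget.ImageView;
import androidx.annotation.NonNull;
import com.google.android.material.bottomsheet.BottomSheetBehavior;

public final class ControlAreaBinder {
    private ControlAreaBinder() {}

    // Shrinks the preview image as the control area (a bottom sheet) is dragged out,
    // so the expanded controls do not cover the preview object.
    public static void bind(View controlArea, ImageView previewImage) {
        BottomSheetBehavior<View> behavior = BottomSheetBehavior.from(controlArea);
        behavior.addBottomSheetCallback(new BottomSheetBehavior.BottomSheetCallback() {
            @Override
            public void onStateChanged(@NonNull View bottomSheet, int newState) {
                // Intermediate resizing is handled in onSlide; nothing to do here.
            }

            @Override
            public void onSlide(@NonNull View bottomSheet, float slideOffset) {
                // slideOffset goes from 0 (collapsed) to 1 (expanded);
                // scale the preview between 100% and 60% of its original size.
                float scale = 1f - 0.4f * Math.max(0f, slideOffset);
                previewImage.setScaleX(scale);
                previewImage.setScaleY(scale);
            }
        });
    }
}
```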
In the case where there is no operation conflict, the single-point upward slide and the single-point downward slide may be generated at any position on the interface. For example, in the interface 40b of fig. 4b, there is only one single-point upward-slide operation, namely the operation of pulling out the control area 42, so the operation can be generated at any position of the interface 40a. Similarly, in fig. 4e, there is only one single-point downward-slide operation, namely the operation of retracting the control area 42, so the operation can be generated at any position of the interface 40c.
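A minimal sketch of detecting a single-point upward or downward slide anywhere on the interface is shown below, using Android's GestureDetector; the distance threshold, class name, and callback interface are illustrative assumptions.

```java
import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class SwipeDirectionDetector {
    public interface Listener {
        void onSwipeUp();    // e.g. expand the control area
        void onSwipeDown();  // e.g. collapse the control area
    }

    private static final float MIN_DISTANCE_PX = 100f; // assumed threshold
    private final GestureDetector detector;

    public SwipeDirectionDetector(Context context, Listener listener) {
        detector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent down, MotionEvent up, float velocityX, float velocityY) {
                float dy = up.getY() - down.getY();
                if (dy < -MIN_DISTANCE_PX) {
                    listener.onSwipeUp();
                    return true;
                }
                if (dy > MIN_DISTANCE_PX) {
                    listener.onSwipeDown();
                    return true;
                }
                return false;
            }
        });
    }

    // Forward touch events from the interface's root view; any position qualifies.
    public boolean onTouchEvent(MotionEvent event) {
        return detector.onTouchEvent(event);
    }
}
```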
Take a second printing scenario as an example. Still taking the example of selecting three images as print objects in the album interface 30a shown in fig. 3a, in this example, after entering a print job, the electronic device displays another print job interface 60a as shown in fig. 6a. The interface 60a includes at least a control area 61 and a preview area 62. Three of the more frequently used control controls are provided in the control area 61, and a plurality of preview objects are displayed in the preview area 62. The preview area 62 does not fully display all preview objects, and the user can view all preview objects by sliding in the direction in which the preview objects are arranged within the preview area 62.
An indicator 621 is included below the control area 61, the indicator 621 being used to indicate that the control area 61 allows retraction upward. An identifier 622 is included below preview area 62, which identifier 622 is used to indicate that there is additional content below preview area 62 that can be dragged out. As an example, a user slides a single point up interface 60a, and in response to the above-described operation, the electronic device displays interface 60b as shown in FIG. 6 b. In the interface 60b, the control area 61 is collapsed upward, the preview area 62 is moved upward, and the background selection area 63 is displayed.
Note that, after moving up preview area 62, background selection area 63 is not the only content that can be displayed below preview area 62, and other content may be displayed below preview area 62 after moving up, and only background selection area 63 will be described here as an example.
The background selection area 63 provides a plurality of backgrounds, which are preset, and the user can slide in the direction of the arrangement of the backgrounds to browse more backgrounds that are allowed to be selected. As an example, the user selects background 631 and the currently selected first preview object 411 in preview area 62 is displayed over background 631.
Also displayed in the interface 60b, above the background selection area 63, is an indicator 623, which is used to indicate that the background selection area 63 can be slid up to display more content. The user performs a single-point upward-slide operation in the interface 60b to drag out the position selection area 65, and the electronic device displays the interface 60c as shown in fig. 6c. The position selection area 65 is used to provide position options for the preview objects on the background. In the example of the interface 60c, the centering option 651 is selected, and the first preview object is displayed centered over the selected background 631.
As an example, the user performs a single point slide down operation in the interface 60c, and the display interface of the electronic device is changed from 60c to 60b. The user performs a single point of the slide down operation on the interface 60b, and the display interface of the electronic device is changed from 60b to 60a.
In this example, at least the three states shown in 60a, 60b, and 60c can be displayed in the print job. Among the three states, the position and size of the preview area change differently according to the user's operations, so that when the user calls out more content, the preview object is not covered, and when some content is collapsed, the preview object can occupy a larger area for display. This realizes a variable layout under different control requirements, prevents the preview object from being covered, and makes it convenient for the user to view the preview effect.
The following describes the scheme of the present application by taking a short video scene as an example. The electronic equipment is provided with the short video application, and a user starts the short video application on the electronic equipment through preset operation. When the electronic device plays a short video, the electronic device displays an interface 70a as shown in fig. 7a, and the interface 70a includes a video playing area 71, an information display area 72, and a comment area 73. The video playing area 71 is used for displaying a short video of the current page; the information display area 72 is used for displaying video information, such as video names, video introduction information, topics carried by videos, and the like; the comment area 73 is used to display comment information of the current video. The video playing area 71 and the comment area 73 may be displayed in different areas on the same layer, and the information display area 72 may be overlaid on the video playing area 71. In other examples, the information display area 72 may be on the same layer as the video playing area 71 and the comment area 73. The interface 70a also provides a plurality of operation controls 74, and the operation controls 74 are all displayed on the topmost layer and partially overlap with the video playing area 71, the information display area 72 and the comment area 73.
The proportion in which the short video is displayed in the video playing area 71 is not limited. When the aspect ratio of the short video is the same as that of the video playing area 71, the short video can fill the video playing area 71; when the aspect ratios are different, the video may be displayed centered in the video playing area 71.
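The centered display for mismatched aspect ratios can be computed with straightforward scale-to-fit arithmetic, as in this small plain-Java sketch; the method name and return convention are assumptions for this example.

```java
public final class VideoLayout {
    private VideoLayout() {}

    // Returns {left, top, width, height} of the video inside the playing area:
    // fill the area when the aspect ratios match, otherwise scale to fit and center.
    public static int[] centeredRect(int areaW, int areaH, int videoW, int videoH) {
        double scale = Math.min((double) areaW / videoW, (double) areaH / videoH);
        int w = (int) Math.round(videoW * scale);
        int h = (int) Math.round(videoH * scale);
        int left = (areaW - w) / 2;
        int top = (areaH - h) / 2;
        return new int[] {left, top, w, h};
    }

    public static void main(String[] args) {
        // A 16:9 video inside a 9:16 playing area is centered vertically with bars above and below.
        int[] r = centeredRect(1080, 1920, 1920, 1080);
        System.out.printf("left=%d top=%d w=%d h=%d%n", r[0], r[1], r[2], r[3]);
    }
}
```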
The comment area 73 may display the top N comments of the currently played short video ranked by the number of likes, or the top N comments ranked by the number of replies. When a short video has no comments, the comment area 73 may be hidden, or may display information prompting the user to comment, for example: "Come and say what you think!"
Continuing with fig. 7a, an indicator 75 is displayed above the comment area 73, and the indicator 75 is used to indicate that there is additional content below the comment area. The content may belong to the comment area 73, or may be another functional area, for example, recommendation information for recommending other popular videos of the current creator. The following description takes the expanded content being comment information as an example.
As an example, the user performs a single-point upward-slide operation in the interface 70a, and the operation is generated in the video playing area 71 (see fig. 7b in particular); this triggers the short video in the video playing area 71 to slide up, and the next recommended short video is played.
As an example, the user performs a single-point swipe operation in the interface 70a as shown in fig. 7c, with the down point of the operation falling in the comment area 73. In response to this operation, the electronic device displays an interface 70d as shown in fig. 7d. In the interface 70d, the comment area 73 is pulled out, the video playing area 71 moves upward and is reduced, the operation controls 74 are hidden, and the short video in the video playing area 71 is scaled down proportionally.
When the video has many comments, the pulled-out comment area 73 cannot display all of them at once. The pulled-out comment area 73 therefore includes a scroll bar 731, and all the comments of the current short video can be viewed by dragging the scroll bar 731.
As an example, in combination with fig. 7e, the down point of a single-point slide-down operation falls in the video playing area 71 and the up point falls in the comment area 73; in response to this operation, the pulled-out comment area 73 is retracted, and the electronic device displays the interface 70a shown in fig. 7a. As another example, in combination with fig. 7f, when the user performs a single-point slide-down operation in the interface 70e, the comment content in the comment area 73 scrolls; when the comment content in the comment area 73 has been pulled to the top, the pulled-out comment area 73 is retracted and the electronic device displays the interface 70a shown in fig. 7a. As yet another example, in response to a single-point slide-down operation in which both the down point and the up point fall within the video playing area 71, the previous short video is displayed in the video playing area 71.
As an example, in combination with fig. 7g, the user performs a single-point swipe operation 77 in the interface 70e, with both the down point and the up point falling in the video playing area 71. In response to this operation, the next short video is displayed in the video playing area 71. When the video playing area 71 switches from the current video to the next short video, the comment area 73 is automatically retracted, and the layout of the interface becomes the same as that of the interface 70a.
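Purely as an illustrative sketch of the gesture handling described in the above examples (the enum values and function name are assumptions, not part of this application), the mapping from slide direction and down/up points to interface behavior could look like this:

enum class Region { VIDEO_AREA, COMMENT_AREA }
enum class Direction { UP, DOWN }
enum class Action { COLLAPSE_COMMENTS, SCROLL_COMMENTS, PREVIOUS_VIDEO, NEXT_VIDEO, NONE }

fun dispatchSlide(dir: Direction, downIn: Region, upIn: Region, commentsExpanded: Boolean): Action = when {
    dir == Direction.DOWN && commentsExpanded &&
        downIn == Region.VIDEO_AREA && upIn == Region.COMMENT_AREA -> Action.COLLAPSE_COMMENTS // fig. 7e
    dir == Direction.DOWN && commentsExpanded &&
        downIn == Region.COMMENT_AREA -> Action.SCROLL_COMMENTS                                // fig. 7f
    dir == Direction.DOWN && downIn == Region.VIDEO_AREA && upIn == Region.VIDEO_AREA ->
        Action.PREVIOUS_VIDEO
    dir == Direction.UP && downIn == Region.VIDEO_AREA && upIn == Region.VIDEO_AREA ->
        Action.NEXT_VIDEO                                                                      // fig. 7g
    else -> Action.NONE
}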
It should be noted that the above operations are only used as an example to describe a part of the operations allowed to be performed in the interface 70d, and other operation manners or combinations of operation manners may also be used to perform the above operations, which is not described herein again.
The above embodiments describe layout changes of a task interface that includes a preview area in three different scenarios. It can be understood that, in addition to the above three scenarios, interface layout schemes that, in other scenarios, adapt the size and position of the preview area to different operations in order to balance control and preview all fall within the protection scope of the present solution.
Still in the printing scenario, the present solution may have other embodiments.
Fig. 8d is a schematic diagram of another entry for entering a print job. In the electronic device, the user enters the interface 800d shown in fig. 8d through Settings - More connections, and triggers the print control 85 in the interface 800d to enter the interface 800a shown in fig. 8a. Two entries are provided in the interface 800a: the first entry is the printer connection entry 81, and the other entry is the print entry 82 for entering a print job. The user touches the print entry 82, and the electronic device displays a print job interface 800b as shown in fig. 8b. Two types of print functions are provided in the interface 800b, namely a first type of print function 83 and a second type of print function 84. In the interface 800b, the first type of print function 83 includes: a document print function 831, a large image print function 832, a picture print function 833, and an album print function 834.
In the interface 800b shown in fig. 8b, the print function area 83 further includes an identification control 835, and the identification control 835 is used to indicate that other functions not shown in the interface 800b can also be provided. The user clicks the identification control 835, and the electronic device displays the interface 800c as shown in fig. 8c. In the interface 800c, further printing functions that the electronic device is capable of providing are displayed, including: a scan function 835, a tile print function 836, a region cropping print function 837, a custom print function 838, and an identification card print function 839. In some examples, more or fewer functions than those in the interface 800c may be provided.
As an example, the user triggers the document print function 831, and the electronic device displays the interface 900a as shown in fig. 9a. The interface 900a includes a preview box 91 and a control area 92. A document addition control 911 is displayed at a preset position in the preview box 91. The user clicks the document addition control 911, and the electronic device displays a document selection interface 900e as shown in fig. 9e. The user selects a document to be printed in the document selection interface 900e and clicks the finish control 96; the electronic device then enters a print interface having the same layout as the interface 40a and displays the document to be printed selected by the user in the preview area 41.
As an example, the user triggers the picture print function 833, and the electronic device displays the interface 900c as shown in fig. 9c. The interface 900c includes a preview box 93 and a control area 94. A picture addition control 931 is displayed at a preset position in the preview box 93. When the user clicks the picture addition control 931, the electronic device displays a picture selection interface 900d as shown in fig. 9d. The user selects a picture to be printed in the picture selection interface 900d and clicks the completion control 9; the electronic device then enters a print interface having the same layout as the interface 40a and displays the picture to be printed selected by the user in the preview area 41.
In the above embodiment, a document can be printed by the document print function 831 or in the manner of fig. 3c. The two provide the same document printing service; the difference is that when a document is printed by the document print function 831, the print job is entered first and then the document to be printed is selected, whereas when a document is printed in the manner of fig. 3c, the document to be printed is selected first and then the print job is entered.
Similarly, a picture can be printed by the picture print function 833 or in the manner of fig. 3b. The picture printing services provided by the two are the same; the difference is that when a picture is printed by the picture print function 833, the print job is entered first and then the picture to be printed is selected, whereas when a picture is printed in the manner of fig. 3b, the picture to be printed is selected first and then the print job is entered.
As an example, the user triggers the scan function 835, and the electronic device displays the interface 100a as shown in fig. 10a. The interface 100a includes a preview area 11 and a control area 12. When the user clicks the start-scan control 121, the electronic device executes the scan task and displays an interface 100b as shown in fig. 10b. In the interface 100b, the controls in the control area 12 cannot be triggered, and the identifier 111 identifies the number of pages scanned so far; for example, 1/1 indicates that the page currently being scanned is the first page and that one page is scanned in total. After the current page is scanned, the electronic device displays an interface 100c as shown in fig. 10c. The scan result, a thumbnail 124 of the scan result, a rescan control 122, and a continue-scan control 123 are displayed in the preview area 11 of the interface 100c.
The user clicks the rescan control 122 in the interface 100c, and the electronic device displays the interface 100d as shown in fig. 10d. Prompt information 13 is displayed in the interface 100d to remind the user that rescanning will not retain the existing scan result, and to ask the user whether to continue. If the user selects "continue", the electronic device clears the scanned content in the preview area 11 and enters the interface 100b to rescan; if the user selects "cancel", the electronic device displays the interface 100c.
The user clicks the continue scanning control 123 in the interface 100c, and the electronic device enters an interface similar to the interface 100b to continue scanning, wherein the identifier 111 is 2/2. After the current scan is completed, the electronic device displays an interface 100f shown in fig. 10f, and a thumbnail 124 of the first page scan result and a thumbnail 125 of the second page scan result are displayed in the interface 100 f.
In the interface 100c, the user may also click the control 14, and the electronic device displays the interface 100e as shown in fig. 10e. A control window 15 is displayed in the interface 100e. If the user selects saving as a picture, the scan result is saved to the electronic device in picture format; if the user selects saving as PDF, the scan result is saved to the electronic device in PDF format.
As an example, the user triggers the album print function 834, and the electronic device displays the interface 110a as shown in fig. 11a. The interface 110a includes a preview box 21 and a control area 22. The user clicks the add-picture control 211, and the electronic device displays an image selection interface. The user selects an image to be printed in the image selection interface, and the electronic device displays an interface 110b as shown in fig. 11b. In the interface 110b, the image to be printed is displayed in the preview box 21 in a "fit" display manner. When the image to be printed does not fill the preview box 21, the user can click the fill control 23, so that the image to be printed fills the preview box 21 and the excess portion is cropped.
In the interface 110a, the control area 22 further includes a printer selection control, a copy count selection control, and a paper size selection control. A mark 24 is displayed above the control area 22, and the mark 24 indicates that the control area 22 can provide more controls; the controls hidden in the control area 22 in the interface 110a can be pulled out through a preset operation. After the hidden controls are pulled out of the current control area 22, the electronic device displays an interface 110b as shown in fig. 11b. In the interface 110b, the size of each control is unchanged, the control area 22 is enlarged, and correspondingly the preview box 21 and the preview object in it are reduced. The manner in which the interface layout changes when the controls hidden in the control area 22 are pulled out or retracted may be similar to the manner in which the preview area adaptively changes with the control area in fig. 4a to 4e, and is not described here again.
In the interface 110b shown in fig. 11b, the print parameters can be adjusted through the paper size selection control 221 and the paper orientation selection control 222. After the user triggers the start-print control 223, and where the printer supports it, the electronic device controls the printer to judge whether the paper actually loaded conforms to the set print parameters. If it conforms, the electronic device controls the printer to print; if it does not conform, the electronic device displays prompt information: Printing with the current paper size may differ from the preview effect or may fail. Continue? When the user confirms to continue printing, the electronic device controls the printer to print.
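A minimal sketch of the confirmation flow just described, assuming placeholder callbacks rather than any actual printer API:

// paperMatches: whether the loaded paper conforms to the set print parameters.
// askUser: shows the prompt and returns true if the user chooses to continue.
fun onStartPrint(paperMatches: Boolean, askUser: (String) -> Boolean, startPrint: () -> Unit) {
    if (paperMatches) {
        startPrint()
        return
    }
    val goOn = askUser("Printing with the current paper size may differ from the preview effect or may fail. Continue?")
    if (goOn) startPrint()
}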
As an example, the user triggers the identification card print function 839, and the electronic device displays an interface 120a as shown in fig. 12a. The interface 120a includes a scan position prompt box 30 and prompt information 31. The user clicks "got it" in the prompt information 31 to scan the identity card. The identity card has two sides; after one side is scanned, the electronic device prompts the user to scan the other side of the identity card. After both sides of the identity card are scanned, the electronic device displays an interface 120b as shown in fig. 12b. In the interface 120b, front and back images of the scanned identity card are displayed in the preview area 302.
The interface 120b also includes a save control 304, a retake control 331, and a go-to-print control 332. The user triggers the save control 304, and the electronic device saves the scan result of the identity card in the preview area 302. The user triggers the retake control 331, and the electronic device returns to scan the identity card again. The user triggers the go-to-print control 332, and the electronic device enters the print job for the identity card. The print job for the identity card is similar to that for a picture; the controls in the control area may be the same, and details are not repeated here.
As an example, the user triggers the large image print function 832. Large image printing enlarges an image to be printed and divides it into a plurality of parts, each part being printed as a single page; the magnification (that is, the number of divided parts) and the division manner can be selected. As shown in fig. 13a, the interface 130a includes a preview area 401, a magnification selection control 402, and a print control 403. The dotted lines in the preview area 401 indicate the manner in which the image to be printed is divided. In this example, the magnification is four times.
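For illustration, the division into parts could be sketched as follows (a uniform n x n split is assumed here; the actual division manner is user-selectable, and the Android Bitmap class is used only as an example):

import android.graphics.Bitmap

// Splits the source image into an n x n grid; each cell is printed as a single page.
// A magnification of four times corresponds to n = 2, i.e. four parts.
fun splitForLargePrint(src: Bitmap, n: Int): List<Bitmap> {
    val cellW = src.width / n
    val cellH = src.height / n
    val pages = mutableListOf<Bitmap>()
    for (row in 0 until n) {
        for (col in 0 until n) {
            pages += Bitmap.createBitmap(src, col * cellW, row * cellH, cellW, cellH)
        }
    }
    return pages
}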
The user triggers the print control 403 in the interface 130a and the electronic device displays the interface 130b as shown in fig. 13b, which includes a preview area 404 and a control area 405 in the interface 130 b. Each portion of the segmented image to be printed is previewed as a preview object in the preview area 404. The control area 405 further includes more control elements that can be pulled out, and a manner of changing the interface layout caused by pulling out or retracting the control element hidden in the control area 405 may be similar to a manner of adaptively changing the preview area according to a change of the control area in fig. 4a to 4e, and is not described herein again.
As an alternative example, the user clicks the tile print function 836, and the electronic device may display the interface 140a as shown in fig. 14a. The interface 140a includes a preview area 141 and a control area 142, and an add-picture control 143 is provided in the control area 142. The user triggers the add-picture control 143, the electronic device may display an image selection interface, and after the user selects three images and confirms in the image selection interface, the electronic device enters the interface 140b shown in fig. 14b. In the interface 140b, the selected three images are displayed as preview objects in the preview area 141. Optionally, if the user selects ten images, the first nine images form a first tiled page according to a 3x3 layout, and the last image together with eight blank cells forms a second tiled page.
A layout style selection control 144 is provided in the control region 142, and upon triggering the layout style selection control 144, the electronic device displays an interface 140c as shown in fig. 14c, where a plurality of selectable layout styles are provided, for example: 3x3, 2x2, 2x3, etc. The user selects the 2x2 layout. The electronic device displays an interface 140d as shown in fig. 14 d. In the interface 140d, the selected three images are displayed in a 2 × 2 layout.
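A rough sketch (assumed names only) of how selected images can be grouped into tiled pages under a given rows x cols layout, padding the last page with blanks as in the ten-image example above:

// e.g. groupIntoPages(tenImages, 3, 3, blankCell) -> one full 3x3 page plus a page
// holding the tenth image and eight blank cells.
fun <T> groupIntoPages(images: List<T>, rows: Int, cols: Int, blank: T): List<List<T>> {
    val perPage = rows * cols
    return images.chunked(perPage).map { page ->
        page + List(perPage - page.size) { blank }  // pad the final page with blanks
    }
}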
Similarly, in a scenario of tile printing, the control area 142 may further include more control elements that can be pulled out, and a manner of an interface layout change caused by pulling out or retracting the control element hidden in the control area 142 may be similar to a manner of an adaptive change of the preview area according to a change of the control area in fig. 4a to 4e, and is not described herein again.
As an alternative example, the user clicks the custom print function 838, and the electronic device displays an image selection interface. After the user selects an image and confirms in the image selection interface, the electronic device displays the interface 150 shown in fig. 15. The interface 150 includes a preview area 151, a paper size selection control 152, and a print control 153. The user clicks the paper size selection control 152, and the electronic device displays a paper size selection window in which the user can customize the paper size used for printing. After the user determines the paper size, the electronic device or the printer judges whether the printer supports the paper size selected by the user. If not, the paper size selection window pops up again; if so, the paper size selection window is closed. After the user clicks the print control 153, the electronic device prompts the size of paper to be loaded, which is the paper size set by the user in the paper size selection window.
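As an illustrative sketch of the paper-size check described above (data types and callbacks are assumptions, not an actual printer interface):

data class PaperSize(val widthMm: Int, val heightMm: Int)

// If the printer does not support the chosen size, the selection window pops up again;
// otherwise the window is closed and the chosen size is used for the later prompt.
fun confirmCustomSize(
    chosen: PaperSize,
    supportedSizes: List<PaperSize>,
    reopenSizeWindow: () -> Unit,
    closeSizeWindow: () -> Unit
) {
    if (supportedSizes.contains(chosen)) closeSizeWindow() else reopenSizeWindow()
}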
As an alternative example, the user clicks on the region cropping print function 837, the electronic device displays an image selection interface, and after the user selects and confirms a plurality of images on the image selection interface, the electronic device displays the interface 160a as shown in fig. 16 a. The interface 160a includes a first preview area 161, an auxiliary function area 162, a second preview area 163, and a print control 164.
Thumbnails of the plurality of images selected by the user are displayed in the second preview area 163, and the user clicks a thumbnail 165 to select the image corresponding to that thumbnail. The first preview area 161 displays the image selected by the user in the second preview area 163. A cropping grid is also displayed in the first preview area 161, and the user crops the preview object in the first preview area 161 by adjusting the cropping grid. The user adjusts the cropping grid to the state shown in the interface 160b of fig. 16b; in response to this cropping operation, the electronic device displays the cropping effect in the first preview area 161, as shown in the interface 160c of fig. 16c.
The auxiliary function area 162 includes reset, apply-to-all, and extract-title options. The user triggers reset in the interface 160c, and the electronic device returns to the interface 160a. The user triggers apply-to-all in the interface 160c, the electronic device applies the cropping effect of the first preview area 161 to the plurality of images selected in the image selection interface, and displays the prompt information: the cropping effect has been applied to all images. The user triggers extract-title in the interface 160c, and the electronic device displays a watermark location selection window in which options for the watermark are provided, namely "top", "bottom", and "off". The user triggers the print control 164, and the electronic device displays a print interface similar to that of fig. 4a, in which the preview object is the cropped image.
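A hedged sketch of the apply-to-all behavior: the crop chosen on the previewed image is stored as fractions of the image size so that it can be re-applied to images of different resolutions. The names are illustrative, and the Android Bitmap class is assumed only for the example:

import android.graphics.Bitmap

data class CropFraction(val left: Float, val top: Float, val width: Float, val height: Float)

// Applies the same relative crop rectangle to every selected image.
fun applyCropToAll(images: List<Bitmap>, crop: CropFraction): List<Bitmap> =
    images.map { bmp ->
        Bitmap.createBitmap(
            bmp,
            (bmp.width * crop.left).toInt(),
            (bmp.height * crop.top).toInt(),
            (bmp.width * crop.width).toInt().coerceAtLeast(1),
            (bmp.height * crop.height).toInt().coerceAtLeast(1)
        )
    }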
In the interface 800b shown in fig. 8b, the second type of print function 84 includes a couplet print function 841 and a red envelope print function 842. The second type of print function 84 can also access externally provided print materials to provide more printing functions; the couplet print function 841 is taken as an example below.
As an example, the user triggers the couplet print function 841 in the second type of print function 84 in the interface 800b shown in fig. 8b. The electronic device displays an interface 170a as shown in fig. 17a. The interface 170a includes: a first control area 171, a preview area 172, and a second control area 173. The first control area 171 includes controls for selecting the upper link, the lower link, and the horizontal batch of the couplet; the preview area 172 is used to display the couplet being edited; the second control area 173 is used to provide various editing items (for example, background, text, and handwriting) and the content under each editing item. When the electronic device enters the couplet print function, the "upper link" in the first control area 171, the background control 173b in the second control area 173, and the background option 173a are selected by default.
As an example, the user clicks the text option 173c in the second control area 173, and the electronic device displays the interface 170b as shown in fig. 17b. In the second control area 173 of the interface 170b, selectable text content, selectable fonts, and selectable colors are displayed. In this example, the content 173e, the font 173d, and the color 173f are selected. Each provided content includes an upper link, a lower link, and a horizontal batch, so that after one content is selected, the upper link, the lower link, and the horizontal batch can all be determined from the selected content. For example, if the content 173e is selected, the upper link is "tiger jumping the shenzhou song to flourish", the lower link is "phoenix dance to welcome new spring nine days", and the horizontal batch is "lucky in tiger year". In this example, the second control area 173 partially occludes the preview area 172.
As another example, the user clicks the text option 173c in the second control area 173, and the electronic device displays an interface 170c as shown in fig. 17c. In this example, in response to the operation of clicking the text option 173c, the preview area 172 is adaptively reduced, and the second control area 173 does not occlude the preview area 172. The other controls in the interface 170c have the same functions as those in the interface 170b and are not described here again.
The user clicks the control 176 in the interface 170c, and the interface 170c may pop up a "preview" option and a "save" option. If the user clicks the "save" option, the currently edited couplet is saved to the album as three images corresponding to the upper link, the lower link, and the horizontal batch respectively. If the user clicks the "preview" option, the electronic device displays an interface 170d as shown in fig. 17d. In the interface 170d, a preview effect of the whole couplet (including the upper link, the lower link, and the horizontal batch) is displayed.
As shown in fig. 17b, the control 173g in the second control area 173 may also be clicked, and the electronic device enters the interface 170e as shown in fig. 17e. In the interface 170e, a virtual keyboard 174 and a custom content window 177 are displayed, and the user operates the virtual keyboard 174 to input custom text into the custom content window 177. The user may also select the font and color of the custom text in the second control area 173.
As shown in fig. 17b, the user may also select the controls 173h and 173i to enter the handwriting function. As an example, the electronic device displays an interface 170f as shown in fig. 17f. In the interface 170f, an editing area 178 is displayed, and the editing area 178 further includes a control 178a, a control 178b, a control 178c, and a control 178d. The user writes in the editing area 178, and after writing one unit of writing (for example, one character), can click the control 178d to write the next unit of writing of the couplet; at this time, the unit just written is displayed in the couplet in the preview area 172. When handwriting already exists in the preview area 172, the control 178c may be clicked to rewrite the handwriting of the previous unit. During writing, the delete control 178a may be clicked to delete the handwriting currently being edited, or the close control 178b may be clicked to exit editing.
Optionally, in the upper link in the preview area 172, each character has a corresponding position and size. In fig. 17f, the size and position of the first character of the upper link are shown as a virtual box 172a (172a is for illustration only and may not be displayed in the interface 170f). After the user writes the first character and clicks the control 178d, the handwriting just written may be reduced or enlarged so that it conforms to the size of 172a, and the scaled handwriting is then displayed at the position of 172a. As shown in the interface 170i of fig. 17i, after the user clicks the control 178d, the written "tiger" character is scaled and displayed at the position of 172a, and the user can write the next character of the couplet in the editing area 178.
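For illustration only (assuming the Android Bitmap class and made-up names), scaling the written character into its slot 172a might be sketched as:

import android.graphics.Bitmap

data class CharSlot(val left: Int, val top: Int, val width: Int, val height: Int)

// Reduces or enlarges the handwriting so that it conforms to the slot size; the scaled
// bitmap is then drawn at the slot position in the couplet.
fun placeHandwriting(handwriting: Bitmap, slot: CharSlot): Pair<Bitmap, CharSlot> {
    val scaled = Bitmap.createScaledBitmap(handwriting, slot.width, slot.height, true)
    return scaled to slot
}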
The editing area 178 also includes prompt information, displayed in the form x/y, where y represents the total number of characters in the link currently being written and x indicates that the character currently being written is the x-th character in that link. In this example, the prompt information is 1/7, that is, the upper link has 7 characters and the first character is currently being written. In this example, the second control area 173 partially occludes the preview area 172.
When the character currently being written is the first character, the prompt information is 1/7 and the control 178c may be grayed out. When the character currently being written is the last character, the prompt information is 7/7 and the control 178d changes from "next" to "done". At this point the user clicks the control 178d and the editing area 178 closes.
As another example, after entering the handwriting function, the electronic device displays an interface 170g as shown in fig. 17g. In this example, in response to the user selecting the controls 173h and 173i, the preview area 172 is adaptively reduced, and the second control area 173 does not occlude the preview area 172. After the editing area 178 is closed, the preview area 172 returns to its original size.
As shown in the interface 170h of fig. 17h, the user clicks the print control 175, and prompt information 179 is displayed to tell the user the paper size required for printing the upper link. After putting paper of that size in the printer, the user chooses to start printing; the electronic device sends a print instruction to the printing device, and the printing device prints the upper link. The paper that the user puts in the printer may be plain paper with no color.
After the printing device finishes printing the upper link, it returns information about the completed print to the electronic device, and the electronic device automatically jumps to the editing interface for the lower link. Similarly, after finishing editing the lower link, the user can print it by clicking the print control 175; after the printing device finishes printing the lower link, it returns the print information to the electronic device, and the electronic device automatically jumps to the editing interface for the horizontal batch. After finishing editing the horizontal batch, the user clicks the print control 175 to print the horizontal batch.
Optionally, the size of the paper that the user puts in the printing device may not match the paper size in the prompt information 179. When the printing device detects that the size of the current paper is inconsistent with the paper size required for printing, it may send corresponding information to the electronic device, and the electronic device may display prompt information prompting the user to replace the paper.
Red envelope printing is explained next. In the interface 800b shown in fig. 8b, the user clicks the red envelope print function 842 and enters the interface 180a shown in fig. 18a. The interface 180a includes a first control area 181, a preview area 182, and a second control area 183. A number of selectable themes are provided in the first control area 181, for example: national tide, new spring, year of the tiger, and so on, and each theme has a corresponding background. In this example, the national tide theme is selected, and the background in the second control area 183 is the background corresponding to the national tide theme. A thumbnail of the red envelope to be printed is displayed in the preview area 182. The dotted lines there indicate folding and cutting traces, that is, the printed paper is cut and folded along the dotted lines to obtain the red envelope; the folding and cutting traces may be traces on the printed paper itself, or traces formed on the paper by double-sided printing.
The user may also add text to the red envelope. The user clicks the "text" control in the second control area 183 of the interface 180a to enter the interface 180b shown in fig. 18b. A text input box 184 is displayed in the interface 180b; the user clicks the text input box 184, and the electronic device displays the interface 180c as shown in fig. 18c. In the interface 180c, a text input box 186, a virtual keyboard 185, and the second control area 183 are displayed. The user selects the running-script font in the second control area 183 and inputs "Happy New year!". After the user clicks the confirmation control, the entered text "Happy New year!" is displayed in the running-script font in the preview area 182; the user may move the position of the text input box 186, or move the position of the entered text.
The user may also add a sticker to the red envelope. The user clicks the "sticker" control in the second control area 183 of the interface 180a to enter the interface 180d shown in fig. 18d. Selectable sticker pictures are displayed in the interface 180d; in this example, the user selects the sticker 183a, and the sticker 187 is displayed in the preview area 182. The user may also move the position of the sticker 187 in the preview area 182. The user may also click the control 183b, and in response the stickers in the preview area 182 are cleared.
Still taking the interface 180d as an example, the user may pop up a "preview" option and a "save" option by clicking the control 188. If the user clicks the "save" option, the currently edited red envelope may be saved to the album as an image. If the user clicks the "preview" option, the electronic device enters the interface 180e shown in fig. 18e. In the interface 180e, a preview effect 188a of the printed red envelope is displayed. The dotted lines from the preview area 182 may also be displayed in the preview effect 188a to facilitate cutting and folding by the user. Double-sided printing may also be performed, with the front side printed as shown in the preview effect 188a and the back side printed with the dotted lines shown in the preview area 182.
Still taking the interface 180d as an example, the user clicks the control 189, and the electronic device enters the interface 180f shown in fig. 18f. In the interface 180f, prompt information 182a is displayed to tell the user the paper size required for printing the red envelope. After preparing paper of that size, the user chooses to start printing; the electronic device sends a print instruction to the printing device, and the printing device prints the red envelope.
Referring to fig. 19, a flowchart of a method for controlling an electronic device according to an embodiment of the present application is shown. The method can be applied to the electronic device shown in fig. 1, as shown in fig. 19, which mainly includes the following steps.
And S191, displaying a first interface, wherein the first interface comprises a first image, and the first image corresponds to the upper link in the couplet.
The first interface may be the interface 170a shown in fig. 17a, and the first image may be an upper link in the preview area 172.
S192, in response to a first operation on the first interface, displaying a second interface, wherein the second interface includes: a second image and a first control, the second image corresponds to the upper link in the couplet, and the second image is different from the first image.
The first operation may be an editing operation of the user on the upper link in the first interface, the second image may be the edited upper link, and the first control may be the "lower link" control in the first control area 171. The user can edit the lower link by clicking the "lower link" control.
S193, in response to a second operation on the first control, displaying a third interface, wherein the third interface includes a third image, and the third image corresponds to the lower link in the couplet.
The third interface is an interface for editing the lower link, and the third image can be used to display the lower link.
S194, in response to a third operation on the third interface, displaying a fourth interface, wherein the fourth interface includes: a fourth image and a second control, the fourth image corresponds to the lower link in the couplet, and the fourth image is different from the third image.
The third operation may be an editing operation of the user on the lower link in the third interface, the fourth image may be the edited lower link, the second control may be a preview control, and the preview control may be displayed directly in the third interface or displayed in a window provided by another control in the third interface.
S195, in response to the fourth operation on the second control, displaying a fifth interface, the fifth interface including: a second image and a fourth image.
The fifth interface may be a preview interface, and in the preview interface, the edited upper link and the edited lower link are previewed in the same interface.
The first interface to the fifth interface may further include a print control. The user clicks the print control, and the electronic device sends a print instruction to the printing device to instruct the printing device to print.
In this embodiment, the electronic device provides an editing function for the couplet. The user can separately view and edit the upper link and the lower link of the couplet, and preview them together, so that the couplet can be conveniently edited and previewed on the electronic device, providing a better user experience.
In one implementation, the first interface further includes a third control. In response to a fifth operation on the third control, a sixth interface is displayed, where the sixth interface includes a fifth image, and the fifth image corresponds to the horizontal batch in the couplet; in response to a sixth operation on the sixth interface, a seventh interface is displayed, where the seventh interface includes: a sixth image and the second control, the sixth image corresponds to the horizontal batch in the couplet, and the sixth image is different from the fifth image; the fifth interface further includes the sixth image.
The third control may be the "horizontal batch" control shown in the interface 170a, and the user can enter the sixth interface to edit the horizontal batch by clicking the "horizontal batch" control. The sixth operation is an operation of editing the horizontal batch, and the sixth image may be the edited horizontal batch. In this example, after the user clicks the preview control, the upper link, the lower link, and the horizontal batch may be displayed together in the preview interface.
In one implementation, the first interface further includes a fourth control. In response to a seventh operation on the fourth control, an eighth interface is displayed, where the eighth interface includes a seventh image and a first writing area, and the content of the seventh image is the same as that of the first image; in response to an eighth operation acting on the first writing area, an eighth image is displayed in the first writing area according to the touch trajectory of the eighth operation; in response to a ninth operation acting on a fifth control, a ninth image corresponding to the eighth image is displayed at a first position of the seventh image; and in response to a tenth operation acting on a sixth control, a print instruction is sent to the printing device.
Still taking fig. 17b as an example, the fourth control may be control 173h. The operation of the user clicking the fourth control may be a seventh operation, and the eighth interface may be an interface 170f shown in fig. 17f, or an interface 170g shown in fig. 17 g. Taking 170f as an example, the seventh image may be the image within the preview area 172, the first writing area may be 178, the fifth control may be control 178d, and the sixth control may be a print control. The ninth operation acting on the fifth control may be an operation of clicking the control 178d, and in response to the ninth operation, the electronic device may display an interface 170i as shown in fig. 17 i. The ninth image corresponding to the eighth image may be an image obtained by scaling the eighth image.
In the above steps, the user can call up the writing area, write characters in the writing area, and have the written characters displayed in the upper link. Similar editing can also be performed on the lower link and the horizontal batch, and details are not repeated here or below.
In this embodiment, the electronic device provides a function of editing the print content before printing. The user can edit the print content accordingly, and during editing can obtain the print content by writing on the electronic device, which makes the print function richer and more engaging. The solution in this embodiment can be applied to couplet printing, so that the user can handwrite a couplet on the electronic device and print it, giving a better user experience.
In one implementation, before sending a print instruction to a printing device, prompt information is displayed, the prompt information including size information of a printing sheet matching the first image.
As shown in the interface 170h of fig. 17h, after the user clicks the print control 175, the electronic device displays the prompt information 179. After putting paper into the printing device, the user clicks to start printing, and the printing device then executes the print task.
In one implementation, a second writing area is displayed in response to the ninth operation acting on the fifth control. After the eighth image is displayed in the first writing area according to the touch trajectory of the eighth operation (in response to the eighth operation acting on the first writing area), a tenth image is displayed in the second writing area according to the touch trajectory of an eleventh operation, in response to the eleventh operation acting on the second writing area; and in response to a twelfth operation acting on the fifth control, an eleventh image corresponding to the tenth image is displayed at a second position of the seventh image, where the second position is different from the first position.
Still taking the interface 170f shown in fig. 17f as an example, after the user clicks the control 178d, an empty writing area, that is, the second writing area, is displayed, the user writes in the second writing area, and after the user clicks the control 178d again, the content written in the second writing area is displayed at a preset position in the seventh image. That is, when the user writes the couplet, each character in the couplet has a preset position and size, and each time the user clicks the control 178d, the handwriting in the writing area will be displayed as the next character at the corresponding position in the couplet.
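A minimal sketch, under the assumption that the couplet strip is divided vertically into equal character slots, of how the preset position and size of the i-th character could be computed (all names are illustrative, not part of this application):

data class Slot(val left: Int, val top: Int, val width: Int, val height: Int)

// index is zero-based; total is the number of characters in the link (e.g. 7).
fun slotForCharacter(stripLeft: Int, stripTop: Int, stripWidth: Int, stripHeight: Int,
                     index: Int, total: Int): Slot {
    val slotHeight = stripHeight / total
    return Slot(stripLeft, stripTop + index * slotHeight, stripWidth, slotHeight)
}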
In one implementation, the eighth interface includes a seventh control, and after displaying a ninth image corresponding to the eighth image at the first position of the seventh image in response to a ninth operation acting on the fifth control, a third writing area is displayed in response to a thirteenth operation acting on the seventh control; in response to a fourteenth operation applied to the third writing area, displaying a twelfth image in the third writing area according to the touch trajectory of the fourteenth operation; and displaying a thirteenth image corresponding to the twelfth image at the first position of the seventh image in response to a fifteenth operation acting on the fifth control.
Still taking the interface 170f shown in fig. 17f as an example, the seventh control may be the control 178c. The user clicks 178c and may edit the last word. After the user has finished writing, the user clicks on control 178d, and the last character is replaced with the most recently written handwriting. Through the above-described embodiment, it is possible to rewrite the characters that have been displayed in the couplet. The thirteenth image corresponding to the twelfth image may be an image obtained by scaling the twelfth image.
In one implementation, the seventh image is smaller than the first image.
As shown in the interface 170g of fig. 17g, when the first writing area is displayed in the eighth interface, the first image is reduced to the seventh image in order to avoid the first writing area from blocking the image.
In one implementation, the first interface further includes an eighth control. In response to a sixteenth operation acting on the eighth control, a first option is displayed in the first interface, where the first option corresponds to a background image; a seventeenth operation of the user on the first option is received, and the background image selected by the seventeenth operation is displayed in the first image.
As shown in the interface 170a of fig. 17a, the eighth control may be the background control 173b, and the second control area 173 includes a plurality of first options corresponding to backgrounds. The user selects 173a, and the background corresponding to 173a is displayed in the first image.
In one implementation, the first interface further includes: a ninth control that, in response to an eighteenth operation on the ninth control, displays a ninth interface, the ninth interface including: the fourteenth image, the second option, the third option and the fourth option, wherein the second option corresponds to the text, the third option corresponds to the font, and the fourth option corresponds to the color; receiving a nineteenth operation of the user on the second option, and displaying a text corresponding to the nineteenth operation in the fourteenth image; receiving a twentieth operation of the user on the third option, and displaying the text in a font corresponding to the twentieth operation in the fourteenth image; and receiving a twenty-first operation of the user on the fourth option, and displaying the text in a color corresponding to the twenty-first operation in the fourteenth image.
As shown in the interface 170a of fig. 17a, the ninth control may be the text option 173c, the ninth interface may be the interface 170b shown in fig. 17b, the second option may be a content option, the third option may be a font option, and the fourth option may be a color option.
In one implementation, the third interface further includes a tenth control. In response to a twenty-second operation on the tenth control, a tenth interface is displayed, where the tenth interface includes: a fifteenth image, a text input area, and a text box; a twenty-third operation acting on the text input area is received, and text content corresponding to the twenty-third operation is displayed in the text box; in response to a twenty-fourth operation of the user on the text box, the text content is displayed on the fifteenth image.
As in the interface 170b shown in fig. 17b, the tenth control may be the control 173g. The user clicks the control 173g, and the tenth interface is displayed; the tenth interface may be the interface 170e shown in fig. 17e, the fifteenth image may be the image in the preview area 172, the text input area may be the virtual keyboard 174, and the text box may be the custom content window 177.
Embodiments of the present application further provide an electronic device, including a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the above related method steps to implement the method in the above embodiments.
An embodiment of the present application further provides a computer-readable storage medium, where a computer instruction is stored in the computer-readable storage medium, and when the computer instruction runs on a terminal device, the terminal device is caused to execute the relevant method steps to implement the method in the foregoing embodiment.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the method in the above embodiment.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the method in the above method embodiments.
In addition, the electronic device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are all used for executing the corresponding method provided above, and therefore, the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be discarded or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially or partially implemented in the form of a software product, which is stored in a storage medium and includes several instructions to enable a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and all the changes or substitutions should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A method of controlling an electronic device, comprising:
displaying a first interface, wherein the first interface comprises a first image, and the first image corresponds to an upper link in the couplet;
in response to a first operation on the first interface, displaying a second interface, the second interface comprising: a second image and a first control, the second image corresponding to an upper link in the couplet, the second image being different from the first image;
in response to a second operation on the first control, displaying a third interface, the third interface including a third image, the third image corresponding to a lower link in the couplet;
in response to a third operation on the third interface, displaying a fourth interface, the fourth interface comprising: a fourth image and a second control, the fourth image corresponding to the lower link in the couplet, the fourth image being different from the third image;
in response to a fourth operation on the second control, displaying a fifth interface, the fifth interface comprising: the second image and the fourth image.
2. The method of claim 1, wherein the first interface further comprises: a third control, the method further comprising:
in response to a fifth operation on the third control, displaying a sixth interface, the sixth interface comprising a fifth image, the fifth image corresponding to a horizontal batch in the couplet;
in response to a sixth operation on the sixth interface, displaying a seventh interface, the seventh interface comprising: a sixth image and the second control, the sixth image corresponding to the horizontal batch in the couplet, the sixth image being different from the fifth image;
wherein the fifth interface further comprises the sixth image.
3. The method of claim 1, wherein the first interface further comprises: a fourth control; the method further comprises the following steps:
in response to a seventh operation on the fourth control, displaying an eighth interface, the eighth interface comprising: a seventh image and a first writing area, the content of the seventh image being the same as that of the first image;
responding to an eighth operation acting on the first writing area, and displaying an eighth image in the first writing area according to a touch track of the eighth operation;
displaying a ninth image corresponding to the eighth image at a first position of the seventh image in response to a ninth operation acting on the fifth control;
and responding to a tenth operation acted on the sixth control, and sending a printing instruction to the printing device.
4. The method of claim 3, wherein prior to sending the print instruction to the printing device, the method further comprises:
and displaying prompt information, wherein the prompt information comprises the size information of the printing paper matched with the first image.
5. The method of claim 3, wherein a second writing area is displayed in response to the ninth operation acting on the fifth control, and after the eighth image is displayed in the first writing area according to the touch trajectory of the eighth operation, the method further comprises:
in response to an eleventh operation on the second writing area, displaying a tenth image in the second writing area according to a touch trajectory of the eleventh operation;
displaying an eleventh image corresponding to the tenth image at a second position of the seventh image in response to a twelfth operation applied to the fifth control, wherein the second position is different from the first position.
6. The method of claim 3, wherein the eighth interface includes a seventh control, and wherein, after displaying a ninth image corresponding to the eighth image at the first location of the seventh image in response to a ninth operation performed on the fifth control, the method further comprises:
displaying a third writing area in response to a thirteenth operation acting on the seventh control;
in response to a fourteenth operation applied to the third writing area, displaying a twelfth image in the third writing area according to a touch trajectory of the fourteenth operation;
and displaying a thirteenth image corresponding to the twelfth image at the first position of the seventh image in response to a fifteenth operation acting on the fifth control.
7. The method of claim 3, wherein the seventh image is smaller than the first image.
8. The method of claim 1, wherein the first interface further comprises: an eighth control, the method further comprising:
in response to a sixteenth operation on the eighth control, displaying a first option in the first interface, wherein the first option corresponds to a background image;
and receiving a seventeenth operation of the first option by the user, and displaying the background image selected by the seventeenth operation in the first image.
9. The method of claim 1, wherein the first interface further comprises: a ninth control, the method further comprising:
in response to an eighteenth operation on the ninth control, displaying a ninth interface, the ninth interface comprising: a fourteenth image, a second option, a third option and a fourth option, wherein the second option corresponds to a text, the third option corresponds to a font, and the fourth option corresponds to a color;
receiving a nineteenth operation of the user on the second option, and displaying a text corresponding to the nineteenth operation in the fourteenth image;
receiving a twentieth operation of the user on the third option, and displaying text in a font corresponding to the twentieth operation in the fourteenth image;
receiving a twenty-first operation of the user on the fourth option, and displaying text in a color corresponding to the twenty-first operation in the fourteenth image.
10. The method of claim 9, wherein the third interface further comprises: a tenth control, the method further comprising:
in response to a twenty-second operation on the tenth control, displaying a tenth interface comprising: a fifteenth image, a text input area, and a text box;
receiving a twenty-third operation acting on the text input area, and displaying text content corresponding to the twenty-third operation in the text box;
displaying the text content on the fifteenth image in response to a twenty-fourth operation by the user at the text box.
11. An electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-10.
12. A computer-readable storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the method of any one of claims 1-10.
13. A computer program product containing executable instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1 to 10.
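Claims 5 and 6 describe a handwriting flow: a writing area captures the touch trajectory of a user operation, shows it as an image, and an image corresponding to that handwriting is then placed at a chosen position on the seventh image. A minimal Kotlin sketch of how such a flow could be built on the Android graphics API is given below; the names WritingArea and stampOnto, the stroke width, and the point-list input are illustrative assumptions, not details taken from the patent.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Paint
import android.graphics.Path
import android.graphics.PointF

// Illustrative only: collects a touch trajectory and renders it as handwriting.
class WritingArea(private val widthPx: Int, private val heightPx: Int) {
    private val path = Path()
    private val strokePaint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        style = Paint.Style.STROKE
        strokeWidth = 8f
    }

    // Append the touch trajectory of one writing operation (e.g. points sampled in onTouchEvent).
    fun appendTrajectory(points: List<PointF>) {
        if (points.isEmpty()) return
        path.moveTo(points.first().x, points.first().y)
        for (p in points.drop(1)) path.lineTo(p.x, p.y)
    }

    // Render the collected handwriting into its own bitmap.
    fun render(): Bitmap {
        val bitmap = Bitmap.createBitmap(widthPx, heightPx, Bitmap.Config.ARGB_8888)
        Canvas(bitmap).drawPath(path, strokePaint)
        return bitmap
    }
}

// Place the rendered handwriting onto a base image at the chosen position.
fun stampOnto(base: Bitmap, handwriting: Bitmap, position: PointF): Bitmap {
    val result = base.copy(Bitmap.Config.ARGB_8888, true)
    Canvas(result).drawBitmap(handwriting, position.x, position.y, null)
    return result
}
```

Calling stampOnto again with a different position would correspond to the second placement described at the end of claim 5.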
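Claim 8 lets the user pick a background option and have the selected background shown in the first image. The sketch below illustrates one way a chosen background could be composited behind existing content; the function name applyBackground and the scale-to-fill behaviour are assumptions made for illustration, not the patented implementation.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas

// Illustrative only: draw the selected background behind the existing content.
fun applyBackground(content: Bitmap, background: Bitmap): Bitmap {
    val result = Bitmap.createBitmap(content.width, content.height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(result)
    // Scale the chosen background to fill the image, then draw the content over it.
    val scaled = Bitmap.createScaledBitmap(background, content.width, content.height, true)
    canvas.drawBitmap(scaled, 0f, 0f, null)
    canvas.drawBitmap(content, 0f, 0f, null)
    return result
}
```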
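Claim 9 offers separate options for the text, its font and its color, with the fourteenth image updating as each choice is made. One way to model this is to keep the three choices in a single value object and redraw the preview whenever any of them changes, as sketched below; TextStyleOptions, the default font and size, and the draw position are illustrative assumptions.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Typeface

// Illustrative only: the three user-selectable options described in claim 9.
data class TextStyleOptions(
    val text: String,                 // second option: the text itself
    val fontFamily: String = "serif", // third option: the font
    val color: Int = Color.BLACK      // fourth option: the color
)

// Redraw the preview image with the currently selected options.
fun applyTextOptions(base: Bitmap, options: TextStyleOptions, x: Float, y: Float): Bitmap {
    val result = base.copy(Bitmap.Config.ARGB_8888, true)
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = options.color
        textSize = 64f
        typeface = Typeface.create(options.fontFamily, Typeface.NORMAL)
    }
    Canvas(result).drawText(options.text, x, y, paint)
    return result
}
```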
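Claim 10 stages the typed text in a text box first and only displays it on the fifteenth image when the user operates on the text box. A small sketch of that two-step commit follows; the TextBoxStage name, the fixed paint settings, and the commit position are assumptions for illustration only.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

// Illustrative only: text typed in the input area is staged in a text box
// and committed onto the image only when the user confirms at the text box.
class TextBoxStage(private var image: Bitmap) {
    private var pendingText: String = ""
    private val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.BLACK
        textSize = 64f
    }

    // Called as the user types in the text input area; the text box mirrors the input.
    fun onTextInput(text: String) {
        pendingText = text
    }

    // Called when the user confirms at the text box; the staged text is drawn onto the image.
    fun onConfirm(x: Float, y: Float): Bitmap {
        val result = image.copy(Bitmap.Config.ARGB_8888, true)
        Canvas(result).drawText(pendingText, x, y, paint)
        image = result
        return image
    }
}
```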
CN202210023806.7A 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment Active CN115562543B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210023806.7A CN115562543B (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment
CN202310965343.0A CN117170557A (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210023806.7A CN115562543B (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310965343.0A Division CN117170557A (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment

Publications (2)

Publication Number Publication Date
CN115562543A (en) 2023-01-03
CN115562543B (en) 2023-08-11

Family

ID=84737576

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310965343.0A Pending CN117170557A (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment
CN202210023806.7A Active CN115562543B (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310965343.0A Pending CN117170557A (en) 2022-01-10 2022-01-10 Control method of electronic equipment and electronic equipment

Country Status (1)

Country Link
CN (2) CN117170557A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334701A (en) * 2007-12-04 2008-12-31 哈尔滨工业大学深圳研究生院 Method for directly writing handwriting information
US20150052480A1 (en) * 2013-08-14 2015-02-19 Samsung Electronics Co., Ltd Printing control method, and apparatus and computer-readable recording medium thereof
CN108228571A (en) * 2018-02-01 2018-06-29 北京百度网讯科技有限公司 Generation method, device, storage medium and the terminal device of distich

Also Published As

Publication number Publication date
CN115562543B (en) 2023-08-11
CN117170557A (en) 2023-12-05

Similar Documents

Publication Publication Date Title
JP4821529B2 (en) Image display apparatus and program
CN102436358B (en) Signal conditioning package and output control method
US7464336B2 (en) Information processing apparatus, image edit apparatus, and control methods thereof, and computer program and computer-readable storage medium
CA2537359A1 (en) Method and data structure for user interface customization
JPH11328380A (en) Image processor, method for image processing and computer-readable recording medium where program allowing computer to implement same method is recorded
US9158488B2 (en) Data processing apparatus and data processing method for generating data to be edited using a print driver
JP2014063246A (en) Electronic album creation device, electronic album creation method, and program
US20070188774A1 (en) Display of thumbnails of image data
EP1887455A2 (en) Transmit data creation apparatus and transmit data creation program recorded in computer-readable recording medium
JP3922102B2 (en) List display of multiple images
CN114201097A (en) Interaction method among multiple application programs
JP4813561B2 (en) Portable terminal, display method, display format determination program, and computer-readable recording medium
JP2003308310A (en) Display method, display, display program, and recording medium recorded with display program
CN113591432A (en) Labeling method and electronic equipment
CN115562543B (en) Control method of electronic equipment and electronic equipment
WO2023130920A1 (en) Control method for electronic device and electronic device
JP5127554B2 (en) Keyword setting method, program and apparatus
WO2010143500A1 (en) Document browsing device, document display method, and document display program
JP6353261B2 (en) Information processing apparatus, control method, and program
JP6701268B2 (en) Information processing apparatus, control method, and program
JP2007104128A (en) Image processing apparatus, mobile phone, image processing method, and image processing program
JP2007074136A (en) Layout editing device, method, and program, and server
JP2002152496A (en) Image processing device, method thereof, and recording medium therefor
JP2006236277A (en) Content display device, content display program, and recording medium recording content display program
JP2001243489A (en) Method and device for composing and editing image and recording medium with recorded image composing and editing program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant