CN108604158A - Self-customization method of terminal application operation area and terminal

Self-customization method of terminal application operation area and terminal

Info

Publication number
CN108604158A
Authority
CN
China
Prior art keywords: area, user, layer, region, touch screen
Legal status: Pending
Application number
CN201780009054.6A
Other languages
Chinese (zh)
Inventor
罗亮
罗翔
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Publication of CN108604158A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The present invention discloses a self-customization method for a terminal application operating area, and a terminal. The method includes: receiving a first operation instruction input by a user, where the first operation instruction is a selection operation by which the user determines a first area to be operated on the touch screen; displaying a first layer on the top layer of the touch screen, where the first layer includes a second area corresponding to the position of the first area; constructing a position mapping relationship between the second area and the first area; and, when a second operation instruction input by the user for adjusting the second area is received, adjusting the second area according to the position mapping relationship. This allows the user to customize an operating area whose size and position are convenient to use, which solves two problems: the content of an operating area on a terminal touch screen is often laid out so compactly that misoperation easily occurs, and an operating area fixed on the right side of the touch screen is inconvenient for left-handed users.

Description

Self-customization method of terminal application operation area and terminal
This application claims priority to Chinese patent application No. 201611090591.1, filed with the Chinese Patent Office on 12/01/2016 and entitled "Method and apparatus for customizing terminal application operating region by user", the entire contents of which are incorporated herein by reference. For brevity, the contents of that prior application are not repeated in this document.
Technical Field
The invention relates to the technical field of mobile terminals, in particular to a self-customizing method of a terminal application operation area and a terminal.
Background
Because a terminal is limited by the size of its screen, when an application needs to present more content, the corresponding operation area on the screen is arranged compactly, which easily causes misoperation on a less sensitive screen or for a user with thick fingers. In addition, when the operation area is fixed on the right side of the screen, it may be inconvenient for a left-handed user to operate.
For the above problems, the current solution is mainly a system-level magnifier function provided on the terminal, which implements display zooming of an interface region of a specified size through a shading program (shader), thereby achieving magnified display in a specified area. This scheme has two drawbacks. First, after the magnified area is moved, the content inside it changes along with the movement of the area. Second, the scheme only magnifies an area of the screen display and cannot be bound to a specific interface of an application; when the application switches from one interface to another, the content of the magnified area changes to that of the other interface.
Disclosure of Invention
The embodiments of the present invention provide a self-customizing method for a terminal application operation area, and a terminal, to solve the problems that the compact content layout of an operation area on a terminal touch screen easily causes user misoperation, and that an operation area fixed on the right side of the touch screen is inconvenient for a left-handed user to operate.
In a first aspect, an embodiment of the present invention provides a method for self-customizing a terminal application operating area, where the execution body is a terminal capable of self-customizing a terminal application operating area. The method comprises the following steps:
receiving a first operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user on a touch screen;
displaying a first image layer on the top layer of the touch screen, wherein the first image layer comprises a second area, and the second area is an area corresponding to the position of the first area;
constructing a position mapping relation between the second area and the first area;
when a second operation instruction which is input by the user and used for adjusting the second area is received, adjusting the second area according to the position mapping relation, and mapping the operation of the user on the second area to the position corresponding to the first area according to the position mapping relation.
In a possible implementation manner, the first layer is a transparent layer.
In a possible implementation manner, before displaying the first layer on the top layer of the touch screen, the method further includes:
and displaying a second layer, wherein the second layer is positioned above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is positioned.
In a possible implementation manner, the second region is obtained by copying the first region, and the second region is an opaque region.
In a possible implementation manner, the second area includes an operation key, the operation key is an opaque area, and an area of the second area except for the operation key is a transparent area.
In a possible implementation manner, the constructing a position mapping relationship between the second region and the first region specifically includes:
acquiring coordinates of touch points in the first area, scaling parameters of the touch points in the second area relative to the touch points in the first area, and offset variables of the touch points in the second area relative to the touch points in the first area;
and performing matrix operation on the coordinates of the touch points in the first area, the scaling parameters and the offset variable to obtain the coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
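As a minimal sketch of such a mapping relation, assuming the second area is an axis-aligned, scaled and shifted copy of the first area (the Kotlin class and function names below are illustrative, not taken from the patent):

```kotlin
// Illustrative sketch only: position mapping between the first area and the second area,
// expressed with per-axis scaling parameters (sx, sy) and offset variables (dx, dy).
data class PositionMapping(val sx: Float, val sy: Float, val dx: Float, val dy: Float) {

    // Matrix-style forward mapping: a touch point (x, y) in the first area
    // corresponds to (sx * x + dx, sy * y + dy) in the second area.
    fun firstToSecond(x: Float, y: Float): Pair<Float, Float> = Pair(sx * x + dx, sy * y + dy)

    // Inverse mapping: a touch point in the second area is mapped back to the first area.
    fun secondToFirst(x: Float, y: Float): Pair<Float, Float> = Pair((x - dx) / sx, (y - dy) / sy)
}

fun main() {
    // Example: the second area is the first area enlarged 1.5x and shifted by (-200, -300).
    val mapping = PositionMapping(sx = 1.5f, sy = 1.5f, dx = -200f, dy = -300f)
    val (x, y) = mapping.secondToFirst(400f, 600f)
    println("A tap at (400, 600) in the second area maps to ($x, $y) in the first area")
}
```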
In a second aspect, an embodiment of the present invention provides a terminal, configured to execute the method for self-customizing a terminal application operating area provided in the first aspect of the embodiment of the present invention, where the terminal includes: a touch screen and a processor;
the touch screen is used for receiving a first operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user on the touch screen; receiving a second operation instruction which is input by the user and is used for adjusting the second area;
the processor is used for controlling the touch screen to display a first image layer, wherein the first image layer comprises the second area, and the second area is an area corresponding to the position of the first area; and constructing a position mapping relation between the second area and the first area, adjusting the second area according to the position mapping relation, and mapping the operation of the user on the second area to the corresponding position of the first area according to the position mapping relation.
In a possible implementation manner, the first layer is a full-screen transparent layer.
In a possible implementation manner, before displaying the first layer, the touch screen is further configured to display a second layer, where the second layer is located above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is located.
In a possible implementation manner, the second region is obtained by copying the first region, and the second region is an opaque region.
In a possible implementation manner, the second area includes an operation key, the operation key is an opaque area, and an area of the second area except for the operation key is a transparent area.
In a possible implementation manner, the processor constructs a position mapping relationship between the second region and the first region, including:
acquiring coordinates of touch points in the first area, scaling parameters of the touch points in the second area relative to the touch points in the first area, and offset variables of the touch points in the second area relative to the touch points in the first area;
and performing matrix operation on the coordinates of the touch points in the first area, the scaling parameters and the offset variable to obtain the coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
In a third aspect, an embodiment of the present invention further provides another method for self-customizing a terminal application operating area, where the method includes:
acquiring input of a user for selecting a first area on a touch screen;
generating a second region;
acquiring input of a user for adjusting the second area on the touch screen;
adjusting the second area;
acquiring input of a user for operating the second area on a touch screen;
and mapping the operation of the user on the second area to the position corresponding to the first area.
In one possible implementation, the user adjusting the second area includes zooming in and/or moving the second area.
In one possible implementation, the user's operation on the second area includes clicking on the second area.
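Taken together, the steps of this aspect can be sketched as follows; the controller class and its method names are illustrative assumptions made for this sketch, not part of the claimed method:

```kotlin
// Illustrative sketch only: the select / generate / adjust / operate / map flow of this aspect.
class OperatingAreaCustomizer {
    data class Rect(var left: Float, var top: Float, var right: Float, var bottom: Float)

    private var firstArea: Rect? = null    // area the user selected on the touch screen
    private var secondArea: Rect? = null   // generated copy that the user may zoom and move

    // The user selects the first area; a second area is generated over it.
    fun onFirstAreaSelected(selection: Rect) {
        firstArea = selection
        secondArea = selection.copy()
    }

    // The user zooms and/or moves the second area; the second area is adjusted accordingly.
    fun onSecondAreaAdjusted(adjusted: Rect) {
        secondArea = adjusted
    }

    // The user clicks inside the second area; the click is mapped back to the first area.
    fun onSecondAreaClicked(x: Float, y: Float): Pair<Float, Float>? {
        val first = firstArea ?: return null
        val second = secondArea ?: return null
        val sx = (second.right - second.left) / (first.right - first.left)
        val sy = (second.bottom - second.top) / (first.bottom - first.top)
        return Pair(first.left + (x - second.left) / sx, first.top + (y - second.top) / sy)
    }
}
```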
In a fourth aspect, an embodiment of the present invention further provides another embodiment of a terminal, configured to execute the method for self-customizing a terminal application operating area provided in the third aspect of the embodiment of the present invention, where the terminal includes: a processor, a touch screen;
the processor generates a second area on the touch screen according to the input of a user for selecting a first area on the touch screen; adjusting the second area according to the input of the user for adjusting the second area on the touch screen; and mapping the operation of the user on the second area to the position corresponding to the first area according to the input of the user on the touch screen for operating the second area.
In one possible implementation, the user adjusting the second area includes zooming and/or moving the second area.
In one possible implementation, the user's operation on the second area includes clicking on the second area.
In a fifth aspect, an embodiment of the present invention further provides another terminal, where the terminal is capable of self-customizing a terminal application operating area, and the terminal includes:
the input and output unit is used for receiving a first operation instruction and a second operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user in the touch screen; the second operation instruction is the adjustment operation of the user on the second area.
The image layer generating unit is used for displaying a first image layer on the top layer of the touch screen, wherein the first image layer comprises a second area, and the second area is an area corresponding to the position of the first area.
The mapping unit is used for constructing a position mapping relation between the second area and the first area;
and the processing unit is used for adjusting the second area according to the position mapping relation when receiving a second operation instruction input by the user, and mapping the operation of the user on the second area to the position corresponding to the first area according to the position mapping relation.
In a possible implementation manner, the first layer is a transparent layer.
In a possible implementation manner, before the layer generating unit displays the first layer on the touch screen, the layer generating unit is further configured to:
and displaying a second layer, wherein the second layer is positioned above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is positioned.
In a possible implementation manner, the second area is obtained by copying the first area by the layer generating unit, and the second area is an opaque area.
In a possible implementation manner, the second area includes an operation key, the operation key is an opaque area, and an area of the second area except for the operation key is a transparent area.
In a possible implementation manner, the constructing, by the mapping unit, a position mapping relationship between the second region and the first region includes:
the mapping unit acquires coordinates of touch points in the first area, scaling parameters of the touch points in the second area relative to the touch points in the first area, and offset variables of the touch points in the second area relative to the touch points in the first area;
and the mapping unit performs matrix operation on the coordinates of the touch points in the first area, the scaling parameters and the offset variable to obtain the coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
Compared with the prior art, in the embodiment of the invention, the transparent first layer is generated on the top layer of the touch screen according to the first area selected by the user, and the second area which is covered on the first area and has a position mapping relation with the first area is generated at the position corresponding to the first area in the first layer. The user can customize the operation area with convenient size and position by zooming and/or moving the second area, thereby solving the problems of misoperation or inconvenient operation caused by the factors of too small size and/or too compact layout of keys or contents in the terminal application operation area, unmovable position and the like.
Drawings
Fig. 1 is a schematic diagram illustrating a self-customizing process of a playing control area of a video player according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for self-customizing a terminal application operating area according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of an improvement to the self-customization process of fig. 1 according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the second layer generated when the self-customization process of fig. 1 is modified according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the self-customization process of fig. 1 after another modification according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of an improvement to the second region in the self-customization process of fig. 1 according to an embodiment of the present invention;
fig. 7 is a flowchart of another method for self-customizing a terminal application operating area according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of another terminal provided in an embodiment of the present invention;
fig. 10 is a schematic structural diagram of another terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention are further described in detail with reference to the accompanying drawings and embodiments.
Fig. 1 is a schematic diagram of a self-customization process of a playing control area of a video player according to an embodiment of the present invention. The present invention takes a video player on a mobile phone as an example. As shown in fig. 1-1, a touch screen 101 of the mobile phone includes a video display area 102 and a playing control area 103; the playing control area 103 is the area in the lower right corner of the touch screen 101 where the back key, the forward key, the play/pause key and the on/off key "■" are located. After the user selects the playing control area 103 with two fingers, a full-screen transparent first layer 104 is generated on the top layer of the touch screen according to the playing control area 103 selected by the user, an opaque playing control area 105 corresponding to the position of the playing control area 103 is generated in the first layer 104, and a position mapping relationship between the playing control area 105 and the playing control area 103 is constructed. As shown in fig. 1-3, the first layer 104 is overlaid on the playing control area 103, and the playing control area 105 covers the playing control area 103. As shown in fig. 1-4, the user can enlarge the playing control area 105 to a desired scale and, as shown in fig. 1-5, move the playing control area 105 to a desired position; when the user clicks in the playing control area 105, the click point is mapped to the corresponding position in the playing control area 103 according to the position mapping relationship between the playing control area 105 and the playing control area 103, so that the playing control area becomes convenient to operate.
Fig. 2 is a flowchart of a method for self-customizing a terminal application operating area according to an embodiment of the present invention. As shown in fig. 2, the execution subject of the self-customizing method for the playing control area of the video player according to the embodiment of the present invention is a terminal capable of self-customizing the terminal application operation area. The method comprises the following steps:
step 201: receiving a first operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user in a touch screen.
Specifically, the first operation instruction input by the user may be an operation of selecting a first area needing to be operated on a touch screen of the mobile phone terminal by the user; the first area may be a rectangular play control area 103 of the video player.
Before selecting the playing control area 103 of the video player, the user can start the adjustment function of the video playing control area with a gesture or with a combination of the power and volume keys, then uses two fingers to select the upper-left corner and the lower-right corner of the rectangular area to be zoomed, and moves these two corners so that the rectangle encloses the back key, the forward key, the play/pause key and the on/off key "■", thereby determining the shape, size and position of the playing control area 103 shown in fig. 1-1, that is, completing the input of the first operation instruction.
Step 202: displaying a first image layer on the top layer of the touch screen, wherein the first image layer comprises a second area, and the second area is an area corresponding to the first area.
Specifically, as shown in fig. 1-2, the second area may be the play control area 105. After the play control area 103 is determined, a full-screen transparent layer is generated on the top layer of the touch screen; this layer is the first layer 104. The play control area 103 is then copied into the first layer 104 to obtain a play control area 105 corresponding to the position of the play control area 103. In the first layer 104, the play control area 105 is an opaque area, and the areas other than the play control area 105 are transparent areas. The first layer 104 is overlaid on the original layer, and the play control area 105 covers the play control area 103.
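A framework-agnostic sketch of this layer arrangement follows; the Region and OverlayLayer types and the two-finger helper are illustrative assumptions for this sketch, not any particular platform API:

```kotlin
import kotlin.math.abs

// Illustrative sketch only: a full-screen transparent layer holding one opaque copied region.
data class Region(val left: Int, val top: Int, val width: Int, val height: Int)

// Build the selected area from the two finger positions (upper-left and lower-right corners).
fun regionFromTwoFingers(x1: Int, y1: Int, x2: Int, y2: Int): Region =
    Region(minOf(x1, x2), minOf(y1, y2), abs(x2 - x1), abs(y2 - y1))

class OverlayLayer(val screenWidth: Int, val screenHeight: Int) {
    // The copied play control area (second area); everything else in the layer stays transparent.
    var opaqueRegion: Region? = null

    // Copy the first area into the layer at the same position as an opaque block.
    fun copyFirstArea(firstArea: Region) {
        opaqueRegion = firstArea.copy()
    }

    // Points outside the copied region remain transparent, so the underlying UI stays visible.
    fun isTransparentAt(x: Int, y: Int): Boolean {
        val r = opaqueRegion ?: return true
        return x !in r.left until (r.left + r.width) || y !in r.top until (r.top + r.height)
    }
}
```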
Step 203: and constructing a position mapping relation between the second area and the first area.
Specifically, the position mapping relationship between the second area and the first area is the position mapping relationship between the playback control area 105 and the playback control area 103, that is, the mapping relationship between a touch point (X, Y) in the playback control area 105 and the corresponding touch point (x, y) in the playback control area 103. As shown in formula (i), this mapping relationship is determined by the coordinates (x, y) of the touch point in the playback control area 103, the scaling parameters sx, sy of the touch point in the playback control area 105 relative to the touch point in the playback control area 103, and the offset variables dx, dy of the touch point in the playback control area 105 relative to the touch point in the playback control area 103, expressed as a matrix:
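The matrix expression of formula (i), as reconstructed here from the variable definitions that follow, corresponds to the affine transform:

$$
\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix}
=
\begin{bmatrix} s_x & 0 & d_x \\ 0 & s_y & d_y \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix},
\qquad\text{i.e.}\qquad
X = s_x\,x + d_x,\quad Y = s_y\,y + d_y
\tag{i}
$$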
where X and Y are the coordinates of the touch point in the playback control area 105, x and y are the coordinates of the corresponding touch point in the playback control area 103, sx and sy are the scaling ratios in the x and y directions, respectively, and dx and dy are the offsets in the x and y directions, respectively.
Step 204: when a second operation instruction which is input by a user and used for adjusting the second area is received, adjusting the second area according to the position mapping relation, and mapping the operation of the user on the second area to the position corresponding to the first area according to the position mapping relation.
Specifically, the second operation instruction may be an operation of zooming and moving the second area by the user. The adjusting of the second area is to determine the size and/or position of the second area according to the zooming and/or moving operation of the user on the second area. Specifically, as shown in fig. 1 to 4, after the play control area 105 is obtained, when the user zooms the play control area 105 according to the required size of the user, the zoom ratio parameter sx, sy input by the user is obtained, and the size of the play control area 105 is determined according to the formula (i) in step 203. As shown in fig. 1-5, when the user moves the playing control area 105 according to the required position, the offset variables dx, dy input by the user are obtained, and the position of the playing control area 105 is determined according to the formula (i) in step 203. When the user stops zooming and/or moving the playing control area 105, the size and the position of the current playing control area 105 on the touch screen, and the mapping relation data of the playing control area 105 and the playing control area 103 at this time are saved. When the user performs a click operation in the play control area 105, the click position of the user is mapped to a corresponding position in the play control area 103 according to the position mapping relationship between the play control area 105 and the play control area 103, and finally the user operates the play control area 103.
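A minimal sketch of step 204 follows, assuming (as one reading of the scheme) that a click in the play control area 105 is mapped back to the play control area 103 by inverting formula (i); all names below are illustrative, not taken from the patent:

```kotlin
// Illustrative sketch only: adjusting the second area and mapping clicks back to the first area.
class SecondAreaController {
    // Parameters of formula (i): a point (x, y) in area 103 maps to (sx * x + dx, sy * y + dy) in area 105.
    private var sx = 1f
    private var sy = 1f
    private var dx = 0f
    private var dy = 0f

    // Second operation instruction, zooming: the user sets a new scale for area 105.
    fun onZoom(newSx: Float, newSy: Float) { sx = newSx; sy = newSy }

    // Second operation instruction, moving: the user drags area 105 by (offsetX, offsetY).
    fun onMove(offsetX: Float, offsetY: Float) { dx += offsetX; dy += offsetY }

    // A click at (clickX, clickY) inside area 105 is mapped back to area 103 by inverting
    // formula (i), so the click finally acts on the original control in area 103.
    fun onClick(clickX: Float, clickY: Float): Pair<Float, Float> =
        Pair((clickX - dx) / sx, (clickY - dy) / sy)

    // When the user stops adjusting or exits, the current mapping data is saved so the
    // same size and position can be restored the next time the adjustment function is started.
    fun saveState(store: MutableMap<String, Float>) {
        store["sx"] = sx; store["sy"] = sy; store["dx"] = dx; store["dy"] = dy
    }
}
```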
When the user exits the self-control interface of the playing control area, the position of the current playing control area 105 on the touch screen and the mapping relation data of the playing control area 105 and the playing control area 103 are automatically saved, so that the default position of the playing control area 105 is determined when the user starts the adjustment function of the video playing control area next time.
According to the embodiment of the invention, a full-screen transparent first layer 104 is generated on the top layer of the touch screen according to the play control area 103 selected by a user, and the play control area 103 is copied to the position of the first layer 104 corresponding to the play control area 103, so that an opaque play control area 105 which is covered on the play control area 103 and has a position mapping relation with the play control area 103 is obtained. The user can customize the size and position of the play control area 105 by zooming and/or moving the play control area, thereby solving the problem of misoperation or inconvenient operation caused by too small and/or dense keys and unmovable position in the play control area 103.
Optionally, step 202 in this embodiment of the present invention further includes, before displaying the first layer on the top layer of the touch screen, a step of displaying a second layer by the terminal. The second layer is located above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is located.
Specifically, in one example, fig. 3 is a schematic diagram illustrating the effect of an improvement to the self-customization process of fig. 1 according to the embodiment of the present invention. In step 202 of the method in the above embodiment, before the first layer 104 is generated, a second layer 106 as shown in fig. 4 is generated over the playing control area 103 according to the size of the playing control area 103; the area of the second layer 106 is at least as large as the playing control area 103 and is used to cover it. Specifically, the color values (RGB) at the border of the first area are sampled at multiple points; if the sampled border color values are all the same, the second layer is filled with that solid color, and if the color values differ, the second layer is filled with a gradient obtained by interpolating the sampled colors.
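A small sketch of this border-sampling rule, under the assumption that the gradient fill simply interpolates between the first and last sampled border colours; all names are illustrative:

```kotlin
// Illustrative sketch only: choose the fill of the second layer from sampled border colours.
data class Rgb(val r: Int, val g: Int, val b: Int)

fun chooseSecondLayerFill(borderSamples: List<Rgb>): List<Rgb> {
    require(borderSamples.isNotEmpty()) { "at least one border sample is needed" }
    return if (borderSamples.all { it == borderSamples.first() }) {
        // All sampled border colours are identical: fill the second layer with that solid colour.
        listOf(borderSamples.first())
    } else {
        // Border colours differ: build a simple gradient by interpolating between two samples.
        val a = borderSamples.first()
        val b = borderSamples.last()
        (0..10).map { i ->
            val t = i / 10f
            Rgb(
                (a.r + (b.r - a.r) * t).toInt(),
                (a.g + (b.g - a.g) * t).toInt(),
                (a.b + (b.b - a.b) * t).toInt()
            )
        }
    }
}
```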
In the embodiment of the present invention, by generating the second layer 106, when the user zooms and/or moves the play control area 105 to another position, the second layer 106 covers the play control area 103, so that the content on the touch screen looks cleaner and the user's visual experience is improved.
Optionally, the second area generated in step 202 in the embodiment of the present invention includes an operation key, where the operation key is an opaque area, and an area of the second area except for the operation key is a transparent area.
Specifically, in one example, fig. 5 is a schematic diagram illustrating the effect of another modification of the self-customization process of fig. 1 according to the embodiment of the present invention. In step 202 of the above embodiment, when the playing control area 105 is generated, each key in the playing control area 105 is retained as an opaque area and the remaining areas are transparent areas, so as to obtain the playing control area 107 shown in fig. 6.
In the embodiment of the present invention, the play control area 105 in the above embodiment is improved to obtain the play control area 107, in which each key is an opaque area and the remaining areas are transparent areas, so that when the user enlarges or moves the play control area 107 to another position, as little content on the touch screen as possible is covered, avoiding interference with the user's viewing of the video content and with the user experience.
Fig. 7 is a flowchart of another method for self-customizing a terminal application operating area according to an embodiment of the present invention. As shown in fig. 7, the execution body of the method may be a mobile phone, a tablet computer, a computer with a touch display screen, or a television with its own processing capability and a touch display screen. The method comprises the following specific steps:
step 301: an input of a user selecting a first area on a touch screen is acquired.
Specifically, before selecting the first area to be operated, the user can start the self-customizing function of the terminal application operation area with a gesture or with a combination of the power and volume keys. After the self-customizing function of the terminal application operation area is started, the user selects an area to be operated on the touch screen, namely the first area, for example by using two fingers to select the upper-left corner and the lower-right corner of the rectangular area to be zoomed on the touch screen.
Step 302: a second region is generated.
Specifically, after a user selects a first operation area, a second area is generated at a position on the touch screen corresponding to the first area, the second area is an opaque block, the second area covers the first area, and content displayed in the second area is the same as content in the first area.
Step 303: Acquiring the input of the user for adjusting the second area on the touch screen.
Specifically, after the second area is obtained, a touch screen signal input of a zooming and/or moving operation of the user on the second area is received.
Step 304: Adjusting the second area.
Specifically, according to the zoom and/or move operation of the user on the second area on the touch screen, the second area is adjusted accordingly to obtain an area whose size and position are suitable for the user's operation, for example, enlarging the second area according to the user's enlargement operation on it.
Step 305: Acquiring the input of the user for operating the second area on the touch screen.
Specifically, the operation of the user on the second area includes clicking. The user can click on the adjusted area.
Step 306: Mapping the operation of the user on the second area to the position corresponding to the first area.
Specifically, for example, when the user performs a click operation on the content in the second area, the click position of the user in the second area may be mapped to the corresponding position in the first area, so as to implement the click operation on the first area by the user.
Optionally, the user's operation on the second area may further include closing and hiding: the user can close or hide the second area as needed, for example to reduce the content shown on the display interface and make the interface cleaner and more attractive.
Optionally, the user's operation on the second area may further include minimization: the user can minimize the second area as needed.
In the embodiment of the invention, an operation area that can be adjusted by zooming, moving and the like, namely a second area, is generated according to the user's selection of a first area on the touch screen (for example, an area that is too small, whose content is too dense, or whose position is unsuitable). The user enlarges the second area to a size suitable for operation and/or moves it to a position suitable for operation and then performs an operation such as clicking; when the user clicks on the second area, the click is mapped to the corresponding position of the first operation area, so that the user's operation on the first area is achieved and misoperation on the first area is reduced.
Fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present invention. As shown in fig. 8, the terminal is configured to execute the self-customizing method of the terminal application operation area shown in fig. 2 according to the embodiment of the present invention, and the terminal may be a mobile phone, a tablet computer, a computer with a touch display screen, a television with its own processing capability and a touch display screen, or the like.
The terminal provided by the embodiment of the invention takes a mobile phone as an example, and the mobile phone comprises: touch screen 410, processor 420, other input devices 430, transceiver circuitry 440, sensors 450, audio circuitry 460, I/O subsystem 470, memory 480, and power supply 490. Those skilled in the art will appreciate that the handset configuration shown in fig. 8 is not limiting; the handset may include more or fewer components than those shown, combine certain components, split certain components, or use a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 8:
the touch screen 410 may be used to display information input by or provided to the user and various menus of the cellular phone and may also accept user input. The touch screen 410 may include a display panel 411, and a touch panel 412. The Display panel 411 may be configured by a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like; the touch panel 412, also referred to as a touch screen, a touch sensitive screen, etc., may collect contact or non-contact operations (e.g., operations performed by a user on or near the touch panel 412 using any suitable object or accessory such as a finger or a stylus, and may also include body sensing operations; including single-point control operations, multi-point control operations, etc.) on or near the touch panel 412, and drive the corresponding connection device according to a preset program.
Alternatively, the touch panel 412 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction and gesture of a user, detects signals brought by touch operation and transmits the signals to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into information that can be processed by the processor, sends the information to the processor 420, and receives and executes commands sent by the processor 420.
In addition, the touch panel 412 may be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave, and the touch panel 412 may also be implemented by any technology developed in the future.
Further, the touch panel 412 may cover the display panel 411. A user may operate on or near the touch panel 412 covering the display panel 411 according to the content displayed on the display panel 411 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, and the like). After detecting an operation on or near it, the touch panel 412 transmits the operation to the processor 420 through the I/O subsystem 470 to determine the user input, and the processor 420 then provides a corresponding visual output on the display panel 411 through the I/O subsystem 470 according to the user input. Although in fig. 8 the touch panel 412 and the display panel 411 are shown as two separate components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 412 and the display panel 411 may be integrated to implement the input and output functions of the mobile phone.
In this embodiment of the present invention, the mobile phone touch screen 410 is specifically configured to receive a first operation instruction input by a user through the display panel 411, where the first operation instruction is a selection operation of determining, by the user, a first area to be operated in the touch screen; and receiving a second operation instruction which is input by the user and is used for adjusting the second area.
Specifically, when the user uses two fingers on the display panel 411 to select the area in which an operation area is needed, the display panel 411 receives the touch signal generated by the user operation as the first operation instruction and transmits the first operation instruction to the memory 480 and the processor 420 via the I/O subsystem 470.
Further, when the user performs operations such as zooming and moving on the second area, the display panel 411 receives a touch signal generated by the user operation as a second operation instruction, and transmits the second operation instruction to the memory 480 and the processor 420 via the I/O subsystem.
The processor 420 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 480 and calling data stored in the memory 480, thereby integrally monitoring the mobile phone. Alternatively, processor 420 may include one or more processing units; processor 420 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications.
In an embodiment of the present invention, the processor 420 may be configured to control the touch screen 410 to perform corresponding display. Specifically, the processor 420 is configured to determine, according to the received first operation instruction, the shape, size, and position of the first area, generate a first layer on the top layer of the touch screen 410, and copy the first area to the first layer, so as to obtain a second area corresponding to the position of the first area; and to construct a position mapping relationship between the second area and the first area, adjust the second area according to the position mapping relationship and the second operation instruction, and map the operation of the user on the second area to the corresponding position of the first area.
Further, the first operation instruction received by the processor 420 is a selection operation performed by a user when determining the shape, size and position of the first area.
Further, the first layer is a full-screen transparent layer.
Further, the second area generated in the first layer by the processor 420 is an opaque area.
Further, the processor 420 constructs a position mapping relationship between the second region and the first region, including:
the processor 420 obtains coordinates of the touch point in the first area, a scaling parameter of the touch point in the second area relative to the touch point in the first area, and an offset variable of the touch point in the second area relative to the touch point in the first area;
the processor 420 performs matrix operation on the coordinates of the touch point in the first area, the scaling parameter, and the offset variable to obtain the coordinates of the touch point in the second area corresponding to the position of the touch point in the first area.
In the embodiment of the present invention, the processor 420 determines the shape, size, and position of the first area according to a first operation instruction input by a user when determining the first area to be operated, generates a full-screen transparent first layer on a top layer of the touch screen, and copies the first area to a position of the first layer corresponding to the first area, so as to obtain an opaque second area which is covered on the first area and has a position mapping relationship with the first area. The user can customize the operation area with convenient size and position by zooming and/or moving the second area, thereby solving the problems of misoperation or inconvenient operation caused by undersize and/or dense operation keys and unmovable position in the first area.
Optionally, before displaying the first layer on the top of the touch screen, the processor 420 is further configured to:
the processor 420 displays a second layer above the first area. And sampling the background color of the top layer where the first region is located by the color of the second layer.
In this embodiment of the present invention, the processor 420 generates the second layer, so that when the user zooms out or moves the second area to another position, the second layer covers the first area, so that the content on the touch screen is more concise, and the user visual experience is increased.
Optionally, the second area generated by the processor 420 may also be an area including an operation key, but the operation key is an opaque area, and an area of the second area other than the operation key is a transparent area.
In the embodiment of the present invention, the processor 420 generates the second area in which each key is an opaque area and the rest areas are transparent areas, so that when the user enlarges or moves the second area to another position, the content on the touch screen is covered as little as possible, and the influence on the content on the touch screen of the user and the user experience are avoided.
Other input devices 430 may be used to receive entered numeric or character information and generate key signal inputs relating to user settings and function controls of the handset. In particular, other input devices 430 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen), and the like. The other input device 430 is connected to an other input device controller 471 of the I/O subsystem 470 and is in signal communication with the processor 420 under control of the other device input controller 471.
The transceiver circuit 440 may be used for receiving and transmitting signals during a message transmission or a call. The transceiver circuit 440 in the embodiment of the present invention may be a Radio Frequency (RF) circuit. Specifically, the information received by the terminal is transmitted to the processor for processing. Generally, the transceiver circuit includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. Additionally, the transceiver circuit 440 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), NFC, etc.
Although not shown, the mobile phone may further include a camera, a bluetooth module, an infrared module, and the like, which are not described herein again.
In an embodiment of the present invention, the sensor 450 is embodied as a fingerprint sensor or a retinal recognizer. The fingerprint sensor is configured to collect fingerprint characteristic information of a user, transmit a collection result to the processor 420 through the I/O subsystem 470, and then the processor 420 records the collection result and provides a corresponding visual output on the display panel 411 through the I/O subsystem 470; the retina recognizer is used for collecting the retina feature information of the user, and transmitting the collected result to the processor 420 through the I/O subsystem 470, and then the processor 420 records the collected result and provides a corresponding visual output on the display panel 411 through the I/O subsystem 470.
The audio circuit 460, speaker 461, microphone 462 may provide an audio interface between a user and a cell phone. The audio circuit 460 may transmit the converted signal of the received audio data to the speaker 461, and convert the signal into a sound signal by the speaker 461 for output; on the other hand, the microphone 462 converts the collected sound signals into signals, which are received by the audio circuit 460 and converted into audio data, which are then output to the RF circuit 440 for transmission to, for example, another cell phone, or to the memory 480 for further processing.
The I/O subsystem 470 controls input and output of external devices, which may include other device input controllers 471, sensor controllers 472, and display controllers 473. Optionally, one or more other input control device controllers 471 receive signals from and/or transmit signals to other input devices 430, and the other input devices 430 may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or is an extension of a touch-sensitive surface formed by a touch screen). It is noted that other input control device controllers 471 can be coupled to any one or more of the devices described above. The display controller 473 in the I/O subsystem 470 receives signals from the touch screen 410 and/or sends signals to the touch screen 410. After the touch screen 410 detects the user input, the display controller 473 converts the detected user input into an interaction with the user interface object displayed on the touch screen 410, i.e., implements a human-machine interaction. The sensor controller 472 may receive signals from the one or more sensors 450 and/or send signals to the one or more sensors 450.
The memory 480 may be used to store software programs and modules, the processor 420 may execute various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 480, and the memory 480 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, memory 480 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Specifically, in the embodiment of the present invention, the memory 480 stores the first operation instruction, the second operation instruction, and the coordinates of each touch point in the first area received by the touch screen 410, and stores the position mapping relationship between the second area and the first area when the user performs an adjustment operation on the second area each time, so that the processor 420 can customize the application operation area on the touch screen 410.
In the embodiment of the present invention, the mobile phone further includes a power supply 490 (such as a battery) for supplying power to each component, and the power supply may be logically connected to the processor 420 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
Fig. 9 is a schematic structural diagram of another terminal according to an embodiment of the present invention. As shown in fig. 9, based on the terminal shown in fig. 8, the embodiment of the present invention improves the function of the processor 420 in the terminal shown in fig. 8, to obtain another terminal, where the terminal is configured to execute the method shown in fig. 7 provided by the embodiment of the present invention, and the terminal includes:
a processor 510 and a touch screen 520. The processor 510 generates a second region on the touch screen 520 according to an input of the user selecting a first region on the touch screen 520; adjusts the second area according to the input of the user for adjusting the second area on the touch screen 520; and maps the operation of the user on the second area to the position corresponding to the first area according to the input of the user for operating the second area on the touch screen 520.
Further, the user adjusting the second area includes zooming and/or moving the second area.
Further, the user's operation on the second area includes clicking on the second area.
In the embodiment of the present invention, the processor 510 generates an operation area that can be adjusted by zooming, moving and the like, namely a second area, according to the user's selection of a first area on the touch screen 520 (for example, an area that is too small, whose content is too dense, or whose position is unsuitable). The user enlarges the second area to a size suitable for operation and/or moves it to a position suitable for operation and then performs an operation such as clicking; the processor 510 maps the user's click on the second area to the first operation area, thereby realizing the user's operation on the first area and reducing or avoiding misoperation on the first area.
In addition, as shown in fig. 10, an embodiment of the present invention further provides another terminal, where the terminal is capable of executing the self-customizing method of a terminal application operating area (for example, the method shown in fig. 2). The terminal includes: an input/output unit 610, a layer generation unit 620, a mapping unit 630, and a processing unit 640. The functions of the individual units are specifically as follows:
the input/output unit 610 may be configured to receive a first operation instruction and a second operation instruction input by a user, where the first operation instruction is a selection operation of a first area to be operated in a touch screen determined by the user; the second operation instruction is the adjustment operation of the user on the second area.
The layer generating unit 620 may be configured to display a first layer on a top layer of a touch screen, where the first layer includes a second area, and the second area is an area corresponding to a position of the first area.
The mapping unit 630 may be configured to construct a position mapping relationship between the second area and the first area;
the processing unit 640 may be configured to, when a second operation instruction for adjusting the second area is received, adjust the second area according to the position mapping relationship, and map, according to the position mapping relationship, an operation of the user on the second area to a corresponding position of the first area.
Further, the selection operation of the user on the first region is a selection operation performed when the user uses two fingers to determine the shape, size, and position of the first region.
Further, the first layer displayed on the top layer of the touch screen by the layer generating unit 620 is a full-screen transparent layer.
Further, the layer generation unit 620 generates a second region in the first layer as an opaque region.
Further, the mapping unit 630 constructs a position mapping relationship between the second region and the first region, including:
the mapping unit 630 obtains coordinates of a touch point in the first area, a scaling parameter of the touch point in the second area relative to the touch point in the first area, and an offset variable of the touch point in the second area relative to the touch point in the first area;
the mapping unit 630 performs matrix operation on the coordinates of the touch points in the first area, the scaling parameters, and the offset variables to obtain coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
Further, the processing unit 640 determines the shape, size and position of the first area according to the received first operation instruction.
In the embodiment of the invention, according to a first operation instruction input by a user in determining a first area to be operated, the shape, size and position of the first area are determined, a full-screen transparent first layer is generated on a top layer of a touch screen, and the first area is copied to the position of the first layer corresponding to the first area, so that an opaque second area which is covered on the first area and has a position mapping relation with the first area is obtained. The user can customize the operation area with convenient size and position by zooming and/or moving the second area, thereby solving the problems of misoperation or inconvenient operation caused by undersize and/or dense operation keys and unmovable position in the first area.
Optionally, before the layer generating unit 620 displays the first layer on the top layer of the touch screen, the layer generating unit is further configured to:
the layer generating unit 620 displays a second layer above the first area. And the color of the second layer is the same as the background color of the top layer where the first region is located.
In this embodiment of the present invention, the layer generating unit 620 generates the second layer, so that when the user zooms out or moves the second area to another position, the second layer covers the first area, so that the content on the touch screen is simpler and the visual experience of the user is increased.
Optionally, the second area generated by the image layer generating unit 620 may also be an area including an operation key, but the operation key is an opaque area, and an area of the second area except for the operation key is a transparent area.
In this embodiment of the present invention, the layer generating unit 620 generates the second area in which each key is an opaque area and the rest areas are transparent areas, so that when the user enlarges or moves the second area to another position, the content on the touch screen is covered as little as possible, and the content on the touch screen of the user is prevented from being influenced, thereby avoiding influencing the user experience.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, a software module executed by a processor, or a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (18)

  1. A self-customizing method of a terminal application operation area is characterized by comprising the following steps:
    receiving a first operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user on a touch screen;
    displaying a first layer on the top layer of the touch screen, wherein the first layer comprises a second area, and the second area is an area corresponding to the position of the first area;
    constructing a position mapping relation between the second area and the first area;
    when a second operation instruction which is input by the user and used for adjusting the second area is received, adjusting the second area according to the position mapping relation, and mapping the operation of the user on the second area to the position corresponding to the first area according to the position mapping relation.
  2. The method of claim 1, wherein the first layer is a transparent layer.
  3. The method according to claim 1, wherein before displaying the first layer on top of the touch screen, the method further comprises:
    displaying a second layer, wherein the second layer is positioned above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is positioned.
  4. The method of claim 1, wherein the second region is a duplicate of the first region, and wherein the second region is an opaque region.
  5. The method of claim 1, wherein the second region comprises an operation key, the operation key is an opaque region, and a region of the second region other than the operation key is a transparent region.
  6. The method according to claim 1, wherein the constructing the position mapping relationship between the second region and the first region specifically includes:
    acquiring coordinates of touch points in the first area, scaling parameters of the touch points in the second area relative to the touch points in the first area, and offset variables of the touch points in the second area relative to the touch points in the first area;
    and performing a matrix operation on the coordinates of the touch points in the first area, the scaling parameters, and the offset variables to obtain the coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
  7. A terminal, characterized in that the terminal comprises:
    the touch screen is used for receiving a first operation instruction input by a user, wherein the first operation instruction is a selection operation of a first area to be operated determined by the user on the touch screen; receiving a second operation instruction which is input by the user and is used for adjusting the second area;
    the processor is used for controlling the touch screen to display a first layer, wherein the first layer comprises the second area, and the second area is an area corresponding to the position of the first area; and constructing a position mapping relation between the second area and the first area, adjusting the second area according to the position mapping relation, and mapping the operation of the user on the second area to the corresponding position of the first area according to the position mapping relation.
  8. A terminal according to claim 7, characterised in that the first layer is a transparent layer.
  9. The terminal of claim 7, wherein, before displaying the first layer, the touch screen is further configured to:
    display a second layer, wherein the second layer is positioned above the first area, and the color of the second layer is the same as the background color of the top layer where the first area is positioned.
  10. A terminal as claimed in claim 7, wherein the second region is a duplicate of the first region, the second region being an opaque region.
  11. The terminal according to claim 7, wherein the second region includes an operation key, the operation key is an opaque region, and a region of the second region other than the operation key is a transparent region.
  12. The terminal of claim 7, wherein the processor constructs a position mapping relationship between the second region and the first region, comprising:
    acquiring coordinates of touch points in the first area, scaling parameters of the touch points in the second area relative to the touch points in the first area, and offset variables of the touch points in the second area relative to the touch points in the first area;
    and performing a matrix operation on the coordinates of the touch points in the first area, the scaling parameters, and the offset variables to obtain the coordinates of the touch points in the second area corresponding to the positions of the touch points in the first area.
  13. A self-customizing method of a terminal application operation area is characterized by comprising the following steps:
    acquiring input of a user for selecting a first area on a touch screen;
    generating a second region;
    acquiring input of a user for adjusting the second area on the touch screen;
    adjusting the second area;
    acquiring input of a user for operating the second area on a touch screen;
    and mapping the operation of the user on the second area to the position corresponding to the first area.
  14. The method of claim 13, wherein the user adjusting the second area comprises zooming and/or moving the second area.
  15. The method of claim 13, wherein the user action on the second area comprises clicking on the second area.
  16. A terminal, characterized in that the terminal comprises: a processor, a touch screen;
    the processor generates a second area on the touch screen according to the input of a user for selecting a first area on the touch screen; adjusts the second area according to the input of the user for adjusting the second area on the touch screen; and maps the operation of the user on the second area to the position corresponding to the first area according to the input of the user on the touch screen for operating the second area.
  17. The terminal of claim 16, wherein the user adjusting the second area comprises zooming and/or moving the second area.
  18. The terminal of claim 16, wherein the user action on the second area comprises clicking on the second area.
CN201780009054.6A 2016-12-01 2017-03-30 A kind of customed method and terminal of terminal applies operating space Pending CN108604158A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201611090591.1 2016-12-01
CN201611090591 2016-12-01
PCT/CN2017/078850 WO2018098953A1 (en) 2016-12-01 2017-03-30 Self-customizing method and terminal for terminal application operation area

Publications (1)

Publication Number Publication Date
CN108604158A true CN108604158A (en) 2018-09-28

Family

ID=62241168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780009054.6A Pending CN108604158A (en) 2016-12-01 2017-03-30 A kind of customed method and terminal of terminal applies operating space

Country Status (2)

Country Link
CN (1) CN108604158A (en)
WO (1) WO2018098953A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110968247B (en) * 2019-09-18 2022-05-10 华为技术有限公司 Electronic equipment control method and electronic equipment
CN112799580A (en) * 2021-01-29 2021-05-14 联想(北京)有限公司 Display control method and electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101634935A (en) * 2009-08-18 2010-01-27 深圳市同洲电子股份有限公司 Method and device for inputting characters by touch screen
CN102662566A (en) * 2012-03-21 2012-09-12 中兴通讯股份有限公司 Magnifying display method and terminal for screen content
CN103399706A (en) * 2013-07-25 2013-11-20 北京小米科技有限责任公司 Page interaction method, device and terminal
CN103472996A (en) * 2013-09-17 2013-12-25 深圳市佳创软件有限公司 Method and device for receiving touch in mobile device
CN103558957A (en) * 2013-10-17 2014-02-05 深圳市欧珀通信软件有限公司 Method and device for screen operation of mobile terminal

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110168487A (en) * 2017-11-07 2019-08-23 华为技术有限公司 A kind of method of toch control and device
US20200257445A1 (en) 2017-11-07 2020-08-13 Huawei Technologies Co., Ltd. Touch control method and apparatus
US11188225B2 (en) 2017-11-07 2021-11-30 Huawei Technologies Co., Ltd. Touch control method and apparatus
US11526274B2 (en) 2017-11-07 2022-12-13 Huawei Technologies Co., Ltd. Touch control method and apparatus
US11809705B2 (en) 2017-11-07 2023-11-07 Huawei Technologies Co., Ltd. Touch control method and apparatus
WO2021169569A1 (en) * 2020-02-26 2021-09-02 京东方科技集团股份有限公司 Touch-control display system and control method therefor
CN111444494A (en) * 2020-03-26 2020-07-24 维沃移动通信有限公司 Verification method, electronic device and computer readable storage medium
CN113535042A (en) * 2020-09-18 2021-10-22 厦门市和家健脑智能科技有限公司 Method and device for generating image based on old people cognitive recognition
CN113535042B (en) * 2020-09-18 2023-09-22 厦门市和家健脑智能科技有限公司 Method and device for generating cognitive recognition image based on old people

Also Published As

Publication number Publication date
WO2018098953A1 (en) 2018-06-07

Similar Documents

Publication Publication Date Title
CN108604158A (en) A kind of customed method and terminal of terminal applies operating space
CN111240789B (en) Widget processing method and related device
CN108271419B (en) Color temperature adjusting method and device
KR101484529B1 (en) Touchscreen apparatus user interface processing method and touchscreen apparatus
CN104932809B (en) Apparatus and method for controlling display panel
WO2018082269A1 (en) Menu display method and terminal
CN107077239B (en) Method for adjusting photographing focal length of mobile terminal through touch pad and mobile terminal
EP3098526A1 (en) Customized control method and system for air conditioner operation mode
CN105518605A (en) Touch operation method and apparatus for terminal
EP3287883B1 (en) Screen activation method, device and electronic equipment
WO2012095058A1 (en) Touch screen and input control method
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
JP2015007949A (en) Display device, display controlling method, and computer program
CN111050073B (en) Focusing method and electronic equipment
US20150248213A1 (en) Method to enable hard keys of a device from the screen
CN107992263A (en) A kind of information sharing method and mobile terminal
WO2020001193A1 (en) Gesture recognition method and apparatus, readable storage medium and mobile terminal
EP3379392A1 (en) Force touch method and electronic device
CN108563383A (en) A kind of image viewing method and mobile terminal
CN110795189A (en) Application starting method and electronic equipment
CN109542307B (en) Image processing method, device and computer readable storage medium
CN107797723B (en) Display style switching method and terminal
WO2018039914A1 (en) Method for copying data, and user terminal
CN109491741B (en) Method and terminal for switching background skin
CN107924274A (en) Information terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20180928)