CN110941375A - Method and device for locally amplifying image and storage medium

Method and device for locally amplifying image and storage medium

Info

Publication number: CN110941375A
Application number: CN201911175802.5A
Authority: CN (China)
Prior art keywords: image, area, viewing, viewed, local
Legal status: Granted; currently Active
Other languages: Chinese (zh)
Other versions: CN110941375B
Inventor: 袁佳平
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; granted and published as CN110941375B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on GUI based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 Interaction techniques based on GUI for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on GUI using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on GUI using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on GUI using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

The disclosure provides a method, a device, and a storage medium for locally magnifying an image, and belongs to the technical field of image magnification processing. The method comprises the following steps: acquiring a first local area to be magnified on a viewing image and a magnified image corresponding to the viewing image; determining a second local area in the magnified image corresponding to the first local area; overlaying a second area image within the second local area on a first area image within the first local area to obtain a viewing image overlaid with the second area image; and displaying the viewing image overlaid with the second area image. With the method and the device, the flexibility with which a user views locally magnified images can be improved.

Description

Method and device for locally amplifying image and storage medium
Technical Field
The present disclosure relates to the field of image enlargement processing technologies, and in particular, to a method and an apparatus for locally enlarging an image, and a storage medium.
Background
A user can view images on a terminal. To observe the details of an image more clearly, a local area of the image can be enlarged, and the terminal can display a locally enlarged image corresponding to that local area.
A terminal in the related art can store a viewing image together with locally enlarged images corresponding to a plurality of local areas of the viewing image. When a user clicks one of these local areas in the viewing image, the terminal can display the corresponding locally enlarged image if it is stored; if the locally enlarged image corresponding to that local area is not stored in the terminal, the terminal cannot display it.
The user can therefore only view the pre-stored locally enlarged images and cannot view a locally enlarged image corresponding to an arbitrary region of the image, so the flexibility of viewing locally enlarged images is poor.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device and a storage medium for locally amplifying an image, which can be used for solving the problems in the related art. The technical scheme is as follows:
in one aspect, a method for locally magnifying an image is provided, the method comprising:
acquiring a first local area to be amplified on a viewing image and an amplified image corresponding to the viewing image;
determining a second local region in the magnified image corresponding to the first local region;
covering a second area image in the second local area on a first area image in the first local area to obtain a viewing image covered with the second area image;
displaying the viewing image overlaid with the second area image.
In another aspect, a method of locally magnifying an image is provided, the method comprising:
opening a target application program when an operation of opening the target application program is detected;
when an operation instruction for displaying a viewing image is detected, displaying the viewing image through an image viewing tool of the target application program;
displaying, when a zoom-in operation triggered within a first partial area on the viewing image is detected, a viewing image overlaid with a second area image, the second area image being the image content within a second partial area of the magnified image corresponding to the first partial area;
the magnified image is another image with the same display content as the viewed image but higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple.
In another aspect, there is provided an apparatus for locally magnifying an image, the apparatus including:
the acquisition module is used for acquiring a first local area to be amplified on a viewing image and an amplified image corresponding to the viewing image;
a first determining module for determining a second local region in the magnified image corresponding to the first local region;
a second determining module, configured to overlay a second area image in the second local area on a first area image in the first local area to obtain a viewing image overlaid with the second area image;
and the display module is used for displaying the viewing image covered with the second area image.
In another aspect, there is provided an apparatus for locally magnifying an image, the apparatus including:
an opening module, which is used for opening a target application program when an operation of opening the target application program is detected;
the first display module is used for displaying the viewed image through an image viewing tool of the target application program when an operation instruction for displaying the viewed image is detected;
a second display module, configured to display, when a zoom-in operation triggered in a first partial area on the viewing image is detected, the viewing image overlaid with a second area image, the second area image being the image content in a second partial area of the magnified image corresponding to the first partial area;
the magnified image is another image with the same display content as the viewed image but higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction is stored in the memory, and the at least one instruction is loaded and executed by the processor to implement the method for locally magnifying an image.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction is stored, and the at least one instruction is loaded and executed by a processor to implement the method for locally magnifying an image described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
in the embodiment of the disclosure, when a user locally magnifies any local area, such as a first local area, on a viewed image, a terminal may first obtain the first local area on the viewed image and a magnified image corresponding to the viewed image; then, a second partial region in the magnified image corresponding to the first partial region may be determined; thereafter, the second area image in the second partial area may be overlaid on the first area image in the first partial area, and the viewing image overlaid with the second area image may be obtained and displayed. Therefore, any local area in the image can be amplified by using the method, and the flexibility of viewing the locally amplified image by a user can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for locally magnifying an image according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an embodiment of the present disclosure for partially magnifying an image;
FIG. 3 is a schematic view of a scene with an image partially enlarged according to an embodiment of the present disclosure;
FIG. 4 is a schematic view of a scene with an image partially enlarged according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a first local area of a viewed image provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a first local area of a viewed image provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a first local area of a viewed image provided by an embodiment of the present disclosure;
FIG. 8 is a schematic flow chart diagram illustrating a method for determining a viewed image overlaid with a second area image provided by an embodiment of the present disclosure;
FIG. 9 is a schematic flow chart diagram of a method for determining a viewed image overlaid with a second area image provided by an embodiment of the present disclosure;
fig. 10 is a schematic flowchart of a method for locally magnifying an image according to an embodiment of the present disclosure;
FIG. 11 is a scene schematic diagram illustrating a game scene image being partially enlarged in an online game application according to an embodiment of the disclosure;
FIG. 12 is a scene schematic diagram illustrating a game scene image being partially enlarged in an online game application according to an embodiment of the disclosure;
FIG. 13 is a scene diagram illustrating a game scene image being partially enlarged in an online game application according to an embodiment of the disclosure;
FIG. 14 is a scene schematic diagram of a game scene image being partially enlarged in an online game application according to an embodiment of the disclosure;
FIG. 15 is a scene schematic diagram illustrating a game scene image being partially enlarged in an online game application according to an embodiment of the disclosure;
fig. 16 is a schematic structural diagram of an apparatus for locally enlarging an image according to an embodiment of the present disclosure;
fig. 17 is a schematic structural diagram of an apparatus for locally enlarging an image according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
The embodiments of the disclosure provide a method for locally magnifying an image, which can be implemented by a terminal with a display function. The terminal may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, or the like. In the process of using the terminal, a user can use various application programs according to different requirements; for example, the terminal may be provided with an application program for viewing images. The image may be an image stored locally on the terminal, or an image acquired by the terminal from a server through an application program; for example, the application program may be an online game application, and the image may be a game scene image acquired by the terminal from the background server of the online game application.
As shown in fig. 1, the processing flow of the method may include the following steps:
in step 101, the terminal acquires a first local area to be enlarged on a viewing image and an enlarged image corresponding to the viewing image.
The viewed image may be an image without any enlargement processing, or may be an enlarged image.
The enlarged image corresponding to the viewing image is an image that is enlarged on the basis of the viewing image. Accordingly, the enlarged image may be obtained by enlarging an image that has not undergone any enlargement processing, for example by a target multiple; or it may be obtained by further enlarging an already enlarged image, again for example by the target multiple. For convenience of description, in this embodiment an image without any enlargement is taken as the viewing image.
In some examples, the magnified image is another image with the same display content as the viewed image but a higher resolution. For example, the magnified image may have the same display content and the same size as the viewed image but a resolution higher than that of the viewed image; as another example, the magnified image may have the same display content as the viewing image but a larger size and a higher resolution than the viewing image.
Therefore, the enlarged image may be an image in which the viewing image is enlarged in size, an image in which the viewing image is enlarged in resolution, or an image in which the viewing image is enlarged in both size and resolution. For convenience of description, the description and the related drawings illustrate enlargement in size; enlargement in resolution is similar and is therefore not described again.
Because the displayed content of the viewed image and the enlarged image is the same, and only the size differs, or only the resolution differs, or both the size and the resolution differ, the enlarged image may also be referred to as a clone of the viewed image.
In one example, when a user intends to view a certain image (which may be referred to as a viewing image), the viewing image may be obtained from a local terminal or a server and displayed on a display interface of the terminal. If the user intends to view a magnified detail at a location (which may be noted as a first location) on the display interface, the first location on the display interface may be clicked. When the terminal detects that the amplification operation is triggered at the first position on the display interface, the terminal can acquire a first local area to be amplified on the view image based on the first position.
The first position may be a coordinate position of a click position of the user on the display interface, the first local area is a local area in the viewing image determined according to the first position, and the shape of the first local area may be a circular area, a rectangular area, a polygonal area, or the like.
In one example, when the terminal detects that the zooming-in operation is triggered at the first position on the display interface, not only the first local area to be zoomed in on the viewing image but also a zoomed-in image corresponding to the viewing image can be acquired. The terminal may first obtain the first local area, and then obtain the magnified image corresponding to the viewed image, or first obtain the magnified image corresponding to the viewed image, and then obtain the first local area, or simultaneously obtain the magnified image and the first local area.
The enlarged image may be an image which is stored in the terminal in advance and is enlarged by n times relative to the viewed image, where n is a numerical value greater than 1. Alternatively, the magnified image is another image of the same display content as the viewed image, but of higher resolution.
In step 102, the terminal determines a second partial region in the magnified image corresponding to the first partial region.
The second local area is a local area in the enlarged image, and the first local area and the second local area are different in size but have the same image content in the area.
In one example, the terminal may determine the second partial region in the magnified image corresponding to the first partial region based on the magnification ratio of the magnified image relative to the viewed image and the position of the first partial region in the viewed image.
In step 103, the terminal overlays the second area image in the second local area on the first area image in the first local area to obtain a viewing image overlaid with the second area image.
In one example, after the terminal determines the second local area in the magnified image, the second area image within the second local area may be overlaid on the first area image within the first local area, resulting in a viewed image overlaid with the second area image.
In step 104, the terminal displays the viewing image overlaid with the second area image.
In one example, after the terminal obtains the viewing image overlaid with the second area image, the viewing image overlaid with the second area image may be displayed. In this way, on the display interface of the terminal, the second area image can be displayed at the first local area on the viewing image, and other areas which are not covered by the second area image still display the image with the same scale as the viewing image. Furthermore, local enlargement of a local region of the viewed image can be achieved.
In one example, reference may be made to fig. 2, where a in fig. 2 represents the viewing image and a' represents the enlarged image corresponding to the viewing image. When the user intends to view an enlarged version of an arbitrary local area on the viewing image a, for example the local area A, the terminal may acquire the local area A on the viewing image a, the enlarged image a' corresponding to the viewing image a, and the local area A' on the enlarged image a' corresponding to the local area A. Then, the terminal may overlay the local area image A' of the enlarged image a' on the viewing image a so that A' covers the local area image A, resulting in the viewing image a overlaid with the local area image A'. Thereafter, the viewing image a overlaid with the local area image A' may be displayed on the display interface of the terminal, as shown on the right side of the arrow in fig. 2.
In one possible application scenario, as shown in FIG. 3, after a user opens a viewing image, the user can click on the location of the "mushroom" as indicated by the arrow in FIG. 3 on the display interface. The local area where the "mushroom" is located as indicated by an arrow in fig. 4 may be displayed on the display interface of the terminal to be enlarged, whereas the area image in the other area except the local area where the "mushroom" is located on the viewing image in fig. 4 is not enlarged.
Therefore, when the user uses the method to magnify the local area of the image, the user can click any position on the viewed image on the display interface, and the terminal can perform local magnification processing on the local area clicked by the corresponding user in the viewed image according to the method, so that the flexibility of viewing the locally magnified image by the user can be improved.
As described above, when the terminal detects that the touch zoom-in operation is triggered at the first position on the display interface, the first local area to be zoomed in on the view image may be acquired based on the first position. The process of the terminal specifically acquiring the first local area from the first position may be as follows:
when a user clicks a first position on the display interface, the terminal can detect that the first position on the display interface triggers touch amplification operation, and the terminal can determine a second position corresponding to the first position on the viewed image based on the first position on the display interface. In both cases, the first local area may be determined as follows:
if the second position is far away from the edge line of the viewed image, that is, if the distance between the second position and the edge line of the enlarged image is greater than or equal to the target value, a circular region determined by taking the second position as a center of a circle and the target value as a radius on the viewed image is determined as a first local region to be enlarged on the viewed image.
If the second position is closer to the edge line of the viewed image, that is, if the distance between the second position and the edge line of the enlarged image is smaller than the target value, on the viewed image, a region surrounded by the edge line of the enlarged image and an arc region determined by taking the second position as a center of a circle and taking the target value as a radius is determined as a first local region to be enlarged on the viewed image.
The target value may be a value preset by a technician, or may be a value determined by the terminal according to the size of the viewed image, for example, the target value may be a quarter of the length of the short side of the viewed image.
In one example, as shown in FIG. 5, the second location is further from the edge line of the viewed image, and the first local area is a circular area as shown in FIG. 5. As shown in fig. 6, the second position is closer to an edge line of the viewed image, and the first local area is an area surrounded by the edge line of the viewed image and the arc area, as shown in fig. 6. As shown in fig. 7, the second position is closer to both edge lines of the viewed image, and the first local area is an area surrounded by the two closer edge lines of the viewed image and the arc-shaped area as shown in fig. 7.
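As a concrete illustration of the two cases above, the following TypeScript sketch maps a click position to the circle describing the first local area and reports whether the circle has to be clipped by the image edges. It is only a minimal sketch; the names are illustrative, and the target value follows the example given above (a quarter of the viewed image's short side):
// Hypothetical sketch: derive the first local area from a click on the display interface.
interface Rect { x: number; y: number; width: number; height: number; }
interface LocalArea { cx: number; cy: number; radius: number; clippedByEdge: boolean; }
function firstLocalArea(clickX: number, clickY: number, image: Rect): LocalArea {
  // Target value: a quarter of the viewed image's short side, as in the example above.
  const targetValue = Math.min(image.width, image.height) / 4;
  // Second position: the click position expressed in the viewed image's own coordinates.
  const cx = clickX - image.x;
  const cy = clickY - image.y;
  // Distance from the second position to the nearest edge line of the viewed image.
  const edgeDistance = Math.min(cx, cy, image.width - cx, image.height - cy);
  return {
    cx,
    cy,
    radius: targetValue,
    // Fig. 5: full circle; figs. 6 and 7: circle clipped by one or two edge lines.
    clippedByEdge: edgeDistance < targetValue,
  };
}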
As described above, when the terminal detects that the touch zoom-in operation is triggered at the first position on the display interface, not only the first local area on the viewed image but also a zoomed-in image corresponding to the viewed image can be obtained.
In one example, the magnified image may be an image obtained by copying and rearranging pixels of the viewed image by the terminal. For example, the terminal may copy and arrange the pixel points of the viewed image in advance based on the magnification ratio of the magnified image relative to the viewed image, and store the data corresponding to the pixel points of the processed viewed image. Therefore, when the amplified image is obtained, the copying and arrangement of the pixel points are not needed, and the time can be saved.
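The description does not spell out how the pixel points are copied and arranged; one plausible reading is plain pixel replication by an integer factor, sketched below in TypeScript. The function name and the RGBA buffer layout are assumptions made for illustration only:
// Hypothetical sketch: enlarge an RGBA pixel buffer n times by replicating each pixel point.
function replicatePixels(src: Uint8ClampedArray, width: number, height: number, n: number): Uint8ClampedArray {
  const dst = new Uint8ClampedArray(width * n * height * n * 4);
  for (let y = 0; y < height * n; y++) {
    for (let x = 0; x < width * n; x++) {
      // Each destination pixel copies the source pixel it scales down to.
      const srcIdx = (Math.floor(y / n) * width + Math.floor(x / n)) * 4;
      const dstIdx = (y * width * n + x) * 4;
      for (let c = 0; c < 4; c++) dst[dstIdx + c] = src[srcIdx + c];
    }
  }
  return dst;
}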
In one example, the magnified image may also be an image, larger in size and higher in resolution than the viewing image, drawn by a technician with drawing software based on the viewing image. For example, the technician may use the drawing software to draw, in accordance with the image content displayed by the viewing image and based on the magnification ratio of the magnified image relative to the viewing image, a magnified image whose size is larger than that of the viewing image. For instance, if the magnified image is magnified three times relative to the viewing image, the technician may draw, through the drawing software, a magnified image three times the size of the viewing image.
To further improve the definition of the magnified image, and thus make the second region image in the second local region clearer, the magnified image may correspondingly be drawn with a resolution higher than that of the viewed image. In this way, when the viewing image overlaid with the second area image is displayed, the displayed second area image is not only larger in size but also higher in definition.
The above is a process in which the terminal acquires the first partial region on the viewing image and the enlarged image corresponding to the viewing image, and after the terminal acquires the first partial region and the enlarged image, the terminal may further determine the second partial region corresponding to the first partial region in the enlarged image based on an enlargement ratio of the enlarged image with respect to the viewing image. Specifically, the following may be mentioned:
for example, referring again to FIG. 2, the view image a is rectangular in shape, and may have the upper left corner of the rectangle as the origin of coordinates, with the location information of the first center point of the first local area A on the view image a noted (α).
For example, if the magnification ratio of the enlarged image with respect to the viewed image is 3, then, referring again to fig. 2 and still taking the upper left corner of the enlarged image as the origin of coordinates, the point with position information (3α, 3β) in the enlarged image a' may be determined as the second center point in the enlarged image a' corresponding to the first center point of the first partial region A.
Finally, the terminal may determine, in the enlarged image, a second partial region having the second center point as the center point and the same region shape as that of the first partial region, based on the position information of the second center point in the enlarged image, the enlargement scale, and the region shape of the first partial region.
For example, referring again to fig. 2, the first local region is a circular region centered at (α, β) with radius r; correspondingly, the second local region is a circular region centered at (3α, 3β) with radius 3r. As another example, as shown in figs. 6 and 7, if the first local region is a region surrounded by an arc and an edge line of the viewed image, the second local region is likewise a region surrounded by an arc and an edge line of the enlarged image, where the size of the second local region is three times the size of the first local region.
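Expressed as code, the mapping from the first local area to the second local area simply scales the center coordinates and the radius by the magnification ratio. The short TypeScript sketch below restates the (α, β) → (3α, 3β), r → 3r relationship from the example; the type and function names are illustrative:
// Hypothetical sketch: map the first local area on the viewed image to the
// second local area on the magnified image (both use the top-left corner as origin).
interface CircularArea { cx: number; cy: number; radius: number; }
function secondLocalArea(first: CircularArea, magnification: number): CircularArea {
  return {
    cx: first.cx * magnification,          // α becomes n·α
    cy: first.cy * magnification,          // β becomes n·β
    radius: first.radius * magnification,  // r becomes n·r
  };
}
// With magnification 3: center (α, β), radius r maps to center (3α, 3β), radius 3r.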
The above is the process in which the terminal determines the first partial area on the viewed image and the second partial area on the enlarged image, and after the terminal determines the first partial area and the second partial area, the second partial area image in the second partial area can be overlaid on the first partial area image in the first partial area to obtain the viewed image overlaid with the second partial area image. The specific process may be as follows:
the terminal may obtain the viewing image covered with the second area image through a masking technique, that is, a technique of displaying a partial area of the masked layer through the masked layer. The principle of masking is generally as follows:
The mask layer and the layer to be masked are two separate layers. The size of the mask layer is smaller than that of the layer to be masked, and the transparency of the mask layer is greater than zero, that is, the transparency of every pixel point of the mask layer is greater than zero. In application, the mask layer covers the layer to be masked; the image of the mask layer itself is not displayed, the image of the masked layer within the area covered by the mask layer is displayed, and the image in the areas not covered by the mask layer is not displayed. By this principle, a mask layer whose size is smaller than that of the enlarged image and whose transparency is greater than zero may be overlaid on the enlarged image. The enlarged image then displays only the image within the area covered by the mask layer and does not display the image in the other areas.
The specific implementation process may be executed according to the process shown in fig. 8:
in step 1031, the terminal obtains a mask layer, which matches the second local area and has a transparency greater than zero.
In one example, the mask layer is matched to the second partial region, i.e. the size of the mask layer is matched to the size of the second partial region and the shape of the mask layer is matched to the shape of the second partial region. For example, if the second partial region is a circular region, the mask layer may be the same circular region as the circular region; whereas if the second partial area is an area surrounded by an arc-shaped area and an edge line of the enlarged image, the mask layer may still be a circular area corresponding to the arc-shaped area. That is, the mask layer is a complete circular area regardless of whether the second partial area is a complete circular area.
Wherein, the implementation code for applying the circular mask layer may be: bg2.mask = maskSp;
The above codes for implementing the processes are only used as an example, and do not specifically limit the embodiments.
In step 1032, the terminal overlays a mask layer on the magnified image to obtain a second area image within the second local area.
In one example, after the terminal acquires the mask layer, the mask layer may be aligned with the second partial area of the magnified image and overlaid on the magnified image, and according to the principle of the masking technique, the second area image in the second partial area may be obtained. If the second partial area on the magnified image is a complete circular area, a mask layer is placed over the second partial area of the magnified image with the circular edge of the mask layer aligned with the circular edge of the second partial area. If the second partial area on the magnified image is an area surrounded by the arc-shaped area and the edge line of the magnified image, a circular mask layer is covered on the second partial area of the magnified image, and the circular edge of the mask layer is aligned with the circular edge of the arc-shaped area of the second partial area.
It can be seen from the above that the resulting second area image is also a magnified image covered with a mask layer, wherein the mask layer is aligned with the second partial area of the magnified image.
In step 1033, the terminal overlays the second area image on the first area image of the first partial area, resulting in a viewed image overlaid with the second area image.
In one example, after the terminal aligns the mask layer with the second partial area and covers the enlarged image, an image of the second area in the second partial area may be obtained, and then the image of the second area may be covered on the image of the first area in the first partial area on the viewing image, so that the viewing image covered with the image of the second area may be obtained.
Therefore, the step of determining the viewing image covered with the second area image is to cover the acquired mask layer on the magnified image by aligning the acquired mask layer with the second local area of the magnified image to obtain a second area image of the second local area; the magnified image, overlaid with a mask layer, is then overlaid on the viewed image, with the mask layer aligned with the first local area of the viewed image. Since only the portion of the enlarged image covered by the mask layer can be displayed and the mask layer does not affect the lowermost viewed image, the viewed image covered with the second area image can be obtained.
In one example, obtaining the viewing image overlaid with the second area image may also be performed according to the flow shown in fig. 9:
in step 1031', the terminal overlays the enlarged image on the viewing image in such a manner that the first partial region corresponds to the second partial region.
In one example, the terminal may align the second local area with the first local area, overlaying the magnified image on the viewed image. For example, if the first and second local areas are both circular areas, the center of the second local area may be aligned with the center of the first local area, and the magnified image may be overlaid on the viewed image. For another example, if the first and second partial regions are both regions including an arc region and an edge line, the terminal may align a center point of the arc region of the second partial region with a center point of the arc region of the first partial region, align an edge line of the second partial region with an edge line of the first partial region, and then overlay the enlarged image on the viewing image.
Wherein, the implementation code for overlaying the magnified image magnified by three times on the viewed image may be:
let bg2 = new Sprite();
Laya.stage.addChild(bg2);
bg2.graphics.drawTexture(bgRes);
bg2.scale(3, 3);
The above code for implementing the process is only an example and does not specifically limit this embodiment.
In step 1032', the terminal obtains a mask layer, the mask layer is matched with the second local area and the transparency of the mask layer is greater than zero.
Wherein, after the terminal overlays the enlarged image on the viewed image, the mask layer stored in advance can be obtained. As already mentioned above, the mask layer may be a circular area matching the second partial area, i.e. the circular area of the mask layer may be adapted to the circular area of the second partial area, or the circular area of the mask layer may be adapted to the arc-shaped area of the second partial area.
In step 1033', the terminal overlays a mask layer over the magnified image resulting in a viewed image overlaid with the second area image.
In one example, after the terminal overlays the magnified image on the viewed image, a mask layer may be overlaid at a second partial area of the magnified image, resulting in the viewed image overlaid with the second partial area image.
It can be seen that the process of determining the viewing image overlaid with the second area image is to overlay a magnified image on the viewing image first, wherein the second partial area is aligned with the first partial area; then, a mask layer is covered on the enlarged image, wherein the mask layer is aligned with the second partial region. Since only the portion of the enlarged image covered by the mask layer can be displayed and the mask layer does not affect the lowermost viewed image, the viewed image covered with the second area image can be obtained.
Based on the above, the viewing image covered with the second area image obtained by the flow shown in fig. 8 and by the flow shown in fig. 9 may include three layers: a view image at the lowermost layer, a magnified image overlaid on the view image, and a mask layer overlaid on the magnified image. The mask layer only generates a mask effect on the layer below the mask layer, so that an image displayed on a display interface of the terminal is a viewed image covered with a second area image, the second area image is an area image in the magnified image, and an area not covered by the second area image on the viewed image is still the area image in the viewed image, so that the viewed image can be locally magnified.
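Putting the three layers together, a condensed sketch, again assuming the LayaAir-style API used in the patent's own snippets and using hypothetical resource names, might look as follows; it is an illustration of the layering described above, not the patent's actual implementation:
// Hypothetical sketch of the three-layer arrangement: viewed image at the bottom,
// three-times magnified clone above it, circular mask attached to the clone.
const n = 3;                          // target magnification
Laya.init(750, 1334);                 // assumed stage size
let bg = new Laya.Sprite();           // viewed image (lowest layer)
bg.loadImage("viewImage.png");        // assumed resource name
Laya.stage.addChild(bg);
let bg2 = new Laya.Sprite();          // magnified image (clone of the viewed image)
bg2.loadImage("viewImage.png");
bg2.scale(n, n);
Laya.stage.addChild(bg2);
let maskSp = new Laya.Sprite();       // mask layer matched to the second local area
maskSp.graphics.drawCircle(0, 0, 100, "#ffffff");
bg2.mask = maskSp;
// Align the layers for a touch at (x, y) on the viewed image: the mask follows the
// touch position and the clone is shifted so that the magnified neighbourhood of the
// touched point lands under the mask (compare the MOUSE_MOVE snippet later on).
function alignAt(x: number, y: number): void {
  maskSp.pos(x, y);
  bg2.pos(-(n - 1) * x, -(n - 1) * y);
}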
In an exemplary application, the user may also slide on the display interface, and during the sliding process, the local area to be enlarged on the viewing image is dynamically changed, so that the first local area to be enlarged on the viewing image is consistent with the operation position of the user on the display interface.
When the user slides on the display interface, the process of the terminal acquiring and viewing the first local area to be enlarged on the image can be as follows:
when the terminal detects the sliding and zooming operation from the starting position to the ending position on the display interface, a first local area to be zoomed on the viewing image is obtained based on a first position, and the first position is a real-time sliding position between the starting position and the ending position on the display interface.
The starting position is the position where the user starts sliding on the display interface, and when the user starts sliding, the first position is the starting position; the end position is the position where the user ends the sliding on the display interface, and when the sliding ends, the first position is the end position.
The user may slide on the display interface by a finger or a mouse, and the implementation manner of sliding on the display interface is not limited in this embodiment.
In one example, when the user performs a slide zoom operation on the display interface, the terminal may detect, in real time, a first position touched by the user on the display interface and then determine, based on the detected first position, a first partial region to be zoomed on the viewing image. The manner in which the terminal determines the first local area based on the first position is described above, and reference may be made to the above description, and details are not described here again.
In a possible application, when the user slides to a position B during the sliding on the display interface, the area corresponding to the position B on the magnified image is a B area image, and the viewing image overlaid with the B area image may be displayed on the display interface according to the above method. When the user continues to slide from the position B to a position C, where the area corresponding to the position C on the magnified image is a C area image, the viewing image overlaid with the C area image may be displayed on the display interface according to the above method. During the slide from the position B to the position C, the part where the B area image and the C area image overlap remains magnified, while the part of the B area image that does not overlap the C area image switches back to the same display scale as the viewed image, that is, back to the effect before magnification.
In this way, when the user performs the sliding and zooming operation on the display interface, the area of the display interface for locally zooming the viewing image also dynamically changes, so that the area image of the zoomed-in image overlaid on the viewing image is consistent with the real-time sliding position of the user at the current operation position on the display interface.
During the sliding zoom operation, the terminal keeps the local area to be magnified on the viewed image consistent with the user's current operation position on the display interface by moving the magnified image and the mask layer, where the code for moving the magnified image and the mask layer may be:
Laya.stage.on(Laya.Event.MOUSE_MOVE, this, () => {
bg2.x = -Laya.stage.mouseX * 2;
bg2.y = -Laya.stage.mouseY * 2;
maskSp.x = Laya.stage.mouseX;
maskSp.y = Laya.stage.mouseY;
});
The above code for implementing the process is only an example and does not specifically limit this embodiment.
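For reference, the factor of 2 in the snippet above is the magnification ratio minus one: with the clone enlarged three times, moving it by minus twice the pointer coordinates while the mask layer follows the pointer keeps the magnified content shown under the mask aligned with the position currently being touched on the viewing image.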
Based on the above, one scene in which the viewed image is locally enlarged by the method may be that after the user opens the target application installed on the terminal, the viewed image to be viewed may be searched in the target application, and a zoom view of the viewed image may be displayed on the display interface of the terminal. After the user clicks the zoom image of the viewed image, the terminal can display the complete viewed image on the display interface through the image viewing tool of the target application program.
Or another possible scenario may be that after the user opens the target application program, thumbnails of multiple images may be displayed on a display interface of the terminal, when the user intends to view one of the images, the image that the user intends to view may be recorded as a view image, and the user may click the thumbnail corresponding to the view image. Then, the terminal can display the complete viewing image on the display interface through the image viewing tool of the target application program.
As shown in the flowchart of fig. 10, after step S1, that is, after the viewing image is displayed on the display interface of the terminal, the user may perform a local zoom-in operation on the display interface. The user can directly execute the operation of local amplification on the display interface, and can also execute the operation of local amplification after starting the local amplification function. For example, in the latter case, a function icon corresponding to the local zoom-in function is set on the display interface on which the viewing image is displayed, and after clicking the function icon, the user may start the local zoom-in function, and then the user may perform an operation of local zoom-in on the display interface.
As shown in the flowchart of fig. 10, when the terminal detects that the zoom-in operation is triggered on the display interface, the terminal may then perform step S2 to determine a first partial region to be zoomed in on the viewing image, then the terminal may perform steps S3 and S4 to obtain a zoomed-in image corresponding to the viewing image and determine a second partial region corresponding to the first partial region on the zoomed-in image, and then the terminal may perform step S5 to obtain a mask layer adapted to the second partial region.
In this embodiment, the order of step S3 and step S5 is not limited: they may be executed simultaneously, or step S5 may be executed before or after step S3.
Then, in step S6, the terminal may obtain a viewing image overlaid with a mask layer and a magnified image, wherein the mask layer is located on the magnified image and corresponds to the second partial region, the magnified image is located on the viewing image, and the first partial region corresponds to the second partial region. Further, according to the masking technique principle, in step S7, a viewing image overlaid with the second area image within the second partial area, that is, a viewing image having a partial enlargement effect may be displayed on the display interface.
Then, when the terminal detects that the sliding operation is triggered on the display interface, the first local area which needs to be enlarged on the view image is updated correspondingly, and then the second local area which corresponds to the first local area on the enlarged image is also updated. In step S8, the terminal may update the second partial area by moving the magnified image and the mask layer according to the updated condition of the first partial area, so that the viewing image may be overlaid with the area image in the updated second partial area, and in step S9, the viewing image overlaid with the area image in the updated second partial area may be displayed on the display interface.
Therefore, locally magnifying the viewed image with this method is similar to holding a magnifying glass over the viewed image to look at the area image within a local region, and the visual effect of viewing the image through a hand-held magnifier can be achieved.
Based on the above, one possible application scenario may be that the viewing image may be a game scenario image in an online game application, and a user may perform a local magnification process on the game scenario image in the online game application. For example, a plurality of applications may be installed on a terminal of a user, where the applications may include network game applications, and as shown in fig. 11, a plurality of application icons are displayed on a desktop of the terminal, and an application icon clicked by the user is an application icon corresponding to a certain network game application.
After clicking an application icon corresponding to a certain network game application program, the user enters a start interface of the network game application program as shown in fig. 12, and after clicking a start game button, the user can enter a mode selection interface of the game as shown in fig. 13. Then, after selecting a certain mode, the user can enter a game interface as shown in fig. 14, where fig. 14 is a game scene image of a certain frame in the network game application. The user can click any position on the game scene image to perform the local amplification processing, for example, as shown in fig. 14, the position pointed by the finger of the user is the position that the user intends to perform the local amplification processing. After the user clicks the position for performing the partial amplification, as shown in fig. 15, a game scene image after the partial amplification is performed at the position clicked by the user may be displayed on the display interface of the terminal.
The scene graphs of the game scene images shown in fig. 11 to 15, which are partially enlarged, are merely an example, and are not particularly limited.
In the embodiment of the disclosure, when a user locally magnifies any local area, such as a first local area, on a viewed image, a terminal may first obtain the first local area on the viewed image and a magnified image corresponding to the viewed image; then, a second partial region in the magnified image corresponding to the first partial region may be determined; thereafter, the second area image in the second partial area may be overlaid on the first area image in the first partial area, and the viewing image overlaid with the second area image may be obtained and displayed. Therefore, any local area in the image can be amplified by using the method, and the flexibility of viewing the locally amplified image by a user can be improved.
The embodiment of the disclosure also provides a method for locally amplifying an image, which can be implemented by a terminal with a display function. The method can comprise the following steps: when the terminal detects the operation of opening the target application program, opening the target application program; when the terminal detects an operation instruction for displaying a viewing image, displaying the viewing image through an image viewing tool of the target application program; when a zoom-in operation triggered within a first partial area on the viewing image is detected, the viewing image overlaid with a second area image that is image content within a second partial area corresponding to the first partial area in the zoomed-in image is displayed.
The magnified image is another image with the same display content as the viewed image but higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple.
The target application program is an application program installed on the terminal and selected by the user; it may be, for example, a media application or a browsing application installed on the terminal.
The image viewing tool may be a plug-in in the target application program used to display images.
In one example, the target application may be a browsing application, and accordingly, the user may double-click to open the target application on an icon corresponding to the target application on the desktop of the terminal. After the user opens the target application program, a viewing image to be viewed can be searched in the target application program, and a zoom map of the viewing image can be displayed on a display interface of the terminal. After the user clicks the zoom image of the viewed image, the terminal can display the complete viewed image on the display interface through the image viewing tool of the target application program. Then, the user may locally enlarge the viewed image according to the above-mentioned method on the display interface displaying the viewed image, and the specific process may refer to the above-mentioned embodiment.
Or, the target application may be a media application, and accordingly, the user may double-click to open the target application on an icon corresponding to the target application on the desktop of the terminal. After the user opens the target application program, thumbnails of a plurality of images can be displayed on a display interface of the terminal, when the user intends to view one of the images, the image that the user intends to view can be recorded as a view image, and the user can click the thumbnail corresponding to the view image. Then, the terminal can display the complete viewing image on the display interface through the image viewing tool of the target application program. Then, the user may locally enlarge the viewed image according to the above-mentioned method on the display interface displaying the viewed image, and the specific process may refer to the above-mentioned embodiment.
Based on the same technical concept, the embodiment of the present disclosure also provides an apparatus for locally magnifying an image, as shown in fig. 16, the apparatus including:
an obtaining module 910, configured to obtain a first local area to be enlarged on a viewing image and an enlarged image corresponding to the viewing image;
a first determining module 920, configured to determine a second local area in the enlarged image corresponding to the first local area;
a second determining module 930, configured to overlay a second area image in the second local area on a first area image in the first local area to obtain a viewing image overlaid with the second area image;
a display module 940, configured to display the viewing image covered with the second area image.
Optionally, the second determining module 930 is specifically configured to:
obtaining a mask layer, wherein the mask layer is matched with the second local area and the transparency of the mask layer is larger than zero;
covering the mask layer on the amplified image to obtain a second area image in the second local area;
and overlaying the second area image on the first area image of the first local area to obtain a viewing image overlaid with the second area image.
Optionally, the second determining module 930 is specifically configured to:
overlaying the magnified image on the viewed image in a manner that the first local area corresponds to the second local area;
obtaining a mask layer, wherein the mask layer is matched with the second local area and the transparency of the mask layer is larger than zero;
and covering the mask layer on the amplified image to obtain a viewing image covered with the second area image.
Optionally, the obtaining module 910 is specifically configured to, when it is detected that a touch zoom-in operation is triggered at a first position on a display interface, obtain a first local area to be zoomed in on the viewing image based on the first position.
Optionally, the obtaining module 910 is specifically configured to:
when the touch amplification operation is triggered at a first position on a display interface, determining a second position corresponding to the first position on the viewing image based on the first position on the display interface;
if the distance between the second position and the edge line of the viewed image is larger than or equal to a target value, determining a circular area determined on the viewed image by taking the second position as the center and the target value as the radius as the first local area to be enlarged on the viewed image;
if the distance between the second position and the edge line of the viewed image is smaller than the target value, determining a region on the viewed image surrounded by the edge line of the viewed image and an arc-shaped region determined by taking the second position as the center and the target value as the radius as the first local region to be enlarged on the viewed image.
Optionally, the obtaining module 910 is specifically configured to, when a sliding zoom-in operation from a start position to an end position on a display interface is detected, obtain a first local area to be zoomed in on a viewing image based on a first position, where the first position is a real-time sliding position between the start position and the end position on the display interface.
Optionally, the first determining module 920 is specifically configured to:
determining a second partial region in the magnified image corresponding to the first partial region based on a magnification ratio of the magnified image relative to the viewed image and a position of the first partial region in the viewed image.
Optionally, the first determining module 920 is specifically configured to:
determine position information of a first center point of the first local area in the viewed image;
determine, in the magnified image, position information of a second center point corresponding to the first center point, based on the magnification ratio of the magnified image relative to the viewed image and the position information of the first center point in the viewed image;
determine, based on the position information of the second center point in the magnified image, the magnification ratio, and the region shape of the first local area, a second local area in the magnified image that takes the second center point as its center point and has the same region shape as the first local area, as sketched below.
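The center-point mapping itself reduces to scaling coordinates by the magnification ratio; a minimal sketch follows (the function name map_first_to_second_local_area is hypothetical, and a circular region shape is assumed).

    def map_first_to_second_local_area(cx, cy, r, scale):
        # The second center point is the first center point scaled by the
        # magnification ratio; the region keeps its shape and its radius grows
        # by the same ratio.
        return cx * scale, cy * scale, r * scale

    # Example: a circle of radius 40 centered at (120, 80) on the viewed image
    # maps to a circle of radius 80 centered at (240, 160) in a 2x magnified image.
    print(map_first_to_second_local_area(120, 80, 40, 2.0))  # (240.0, 160.0, 80.0)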
Based on the same technical concept, the embodiment of the present disclosure further provides an apparatus for locally magnifying an image, where the apparatus includes:
an opening module, used for opening a target application program when an operation of opening the target application program is detected;
the first display module is used for displaying the viewed image through an image viewing tool of the target application program when an operation instruction for displaying the viewed image is detected;
a second display module, configured to display, when a zoom-in operation triggered in a first partial area on the viewing image is detected, the viewing image overlaid with a second area image that is image content in a second partial area corresponding to the first partial area in the magnified image;
the magnified image is another image with the same display content as the viewed image but a higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple, as sketched below.
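Both ways of obtaining the magnified image can be sketched as follows (assuming Pillow; the helper name get_magnified_image, the target_multiple default, and the high_res_path parameter are illustrative assumptions, not part of the disclosure).

    from typing import Optional

    from PIL import Image

    def get_magnified_image(viewed, target_multiple=2.0, high_res_path: Optional[str] = None):
        # Option 1: a pre-drawn image with the same display content but a
        # higher resolution, e.g. exported separately at a larger size.
        if high_res_path is not None:
            return Image.open(high_res_path)
        # Option 2: magnify the viewed image itself by the target multiple.
        size = (int(viewed.width * target_multiple), int(viewed.height * target_multiple))
        return viewed.resize(size, Image.LANCZOS)

A pre-drawn higher-resolution image can preserve detail that simple upscaling cannot recover, at the cost of storing or transferring the extra asset.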
In the embodiment of the disclosure, when a user locally magnifies any local area on a viewed image, such as a first local area, the device may first obtain the first local area on the viewed image and a magnified image corresponding to the viewed image; it may then determine a second local area in the magnified image corresponding to the first local area; thereafter, it may overlay the second area image in the second local area on the first area image in the first local area, and obtain and display the viewed image overlaid with the second area image. Any local area of the image can therefore be magnified with the device, which improves the flexibility with which a user views the locally magnified image.
It should be noted that the division into the functional modules described above is only an example of how the device for locally magnifying an image of the above embodiment performs local magnification; in practical applications, the functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the device for locally magnifying an image and the method for locally magnifying an image provided by the above embodiments belong to the same concept; the specific implementation process is detailed in the method embodiments and is not repeated here.
Fig. 17 shows a block diagram of a terminal 1000 according to an exemplary embodiment of the present application. The terminal 1000 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, terminal 1000 can include: a processor 1001 and a memory 1002.
Processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1002 is used to store at least one instruction for execution by the processor 1001 to implement the method of locally magnifying an image provided by the method embodiments herein.
In some embodiments, terminal 1000 can also optionally include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch screen display 1005, camera 1006, audio circuitry 1007, positioning components 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1001 and the memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, it can also capture touch signals on or over its surface. The touch signal may be input to the processor 1001 as a control signal for processing. In this case, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1005, providing the front panel of terminal 1000; in other embodiments, there may be at least two display screens 1005, respectively disposed on different surfaces of terminal 1000 or in a folded design; in still other embodiments, the display screen 1005 may be a flexible display disposed on a curved or folded surface of terminal 1000. The display screen 1005 may even be arranged as a non-rectangular irregular figure, that is, an irregularly shaped screen. The display screen 1005 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, the front camera is disposed at the front panel of the terminal, and the rear camera is disposed at the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 1006 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For stereo sound collection or noise reduction purposes, multiple microphones can be provided, each at a different location of terminal 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of terminal 1000 for navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 1009 is used to supply power to various components in terminal 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1000 can also include one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyro sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
Acceleration sensor 1011 can detect acceleration magnitudes on three coordinate axes of a coordinate system established with terminal 1000. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1001 may control the touch display screen 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to acquire a 3D motion of the user on the terminal 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1013 may be disposed on a side frame of terminal 1000 and/or on a lower layer of touch display 1005. When pressure sensor 1013 is disposed on a side frame of terminal 1000, a user's grip signal on terminal 1000 can be detected, and processor 1001 performs left-right hand recognition or shortcut operation according to the grip signal collected by pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the touch display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the user according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. Fingerprint sensor 1014 can be disposed on the front, back, or side of terminal 1000. When a physical key or vendor Logo is provided on terminal 1000, fingerprint sensor 1014 can be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display screen 1005 according to the intensity of the ambient light collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is turned down. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
The proximity sensor 1016, also known as a distance sensor, is typically disposed on the front panel of terminal 1000. The proximity sensor 1016 is used to collect the distance between the user and the front face of terminal 1000. In one embodiment, when the proximity sensor 1016 detects that the distance between the user and the front face of terminal 1000 gradually decreases, the processor 1001 controls the touch display screen 1005 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1016 detects that the distance between the user and the front face of terminal 1000 gradually increases, the processor 1001 controls the touch display screen 1005 to switch from the off-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 17 is not intended to be limiting and that terminal 1000 can include more or fewer components than shown, or some components can be combined, or a different arrangement of components can be employed.
Yet another embodiment of the present disclosure provides a computer-readable storage medium, in which instructions, when executed by a processor of a terminal, enable the terminal to perform the above-described method of locally magnifying an image.
The above description is intended to be exemplary only and not to limit the present disclosure, and any modifications, equivalents, improvements, etc. made within the spirit and scope of the present disclosure are intended to be included therein.

Claims (15)

1. A method of locally magnifying an image, the method comprising:
acquiring a first local area to be amplified on a viewing image and an amplified image corresponding to the viewing image;
determining a second local region in the magnified image corresponding to the first local region;
covering a second area image in the second local area on a first area image in the first local area to obtain a viewing image covered with the second area image;
displaying the viewing image overlaid with the second area image.
2. The method of claim 1, wherein the covering a second area image in the second local area on a first area image in the first local area to obtain a viewing image covered with the second area image comprises:
obtaining a mask layer, wherein the mask layer is matched with the second local area and the transparency of the mask layer is larger than zero;
covering the mask layer on the amplified image to obtain a second area image in the second local area;
and overlaying the second area image on the first area image of the first local area to obtain a viewing image overlaid with the second area image.
3. The method of claim 1, wherein the covering a second area image in the second local area on a first area image in the first local area to obtain a viewing image covered with the second area image comprises:
overlaying the magnified image on the viewed image in a manner that the first local area corresponds to the second local area;
obtaining a mask layer, wherein the mask layer is matched with the second local area and the transparency of the mask layer is larger than zero;
and covering the mask layer on the amplified image to obtain a viewing image covered with the second area image.
4. The method of any of claims 1 to 3, wherein the obtaining a first local area to be magnified on the viewing image comprises:
when it is detected that a touch zoom-in operation is triggered at a first position on a display interface, acquiring a first local area to be magnified on the viewing image based on the first position.
5. The method according to claim 4, wherein when it is detected that a touch zoom-in operation is triggered at a first position on a display interface, acquiring a first local area to be magnified on the viewing image based on the first position comprises:
when the touch zoom-in operation is triggered at the first position on the display interface, determining, based on the first position, a second position on the viewing image corresponding to the first position;
if the distance between the second position and the edge line of the viewed image is greater than or equal to a target value, determining the circular area on the viewed image whose center is the second position and whose radius is the target value as the first local area to be magnified on the viewed image;
if the distance between the second position and the edge line of the viewed image is smaller than the target value, determining the region on the viewed image enclosed by the edge line of the viewed image and the arc centered on the second position with the target value as its radius as the first local area to be magnified on the viewed image.
6. The method of any of claims 1 to 3, wherein the obtaining a first local area to be magnified on the viewing image comprises:
when a sliding zoom-in operation from a start position to an end position on a display interface is detected, acquiring a first local area to be magnified on the viewing image based on a first position, where the first position is a real-time sliding position between the start position and the end position on the display interface.
7. The method of any of claims 1 to 3, wherein said determining a second local region in the magnified image corresponding to the first local region comprises:
determining a second partial region in the magnified image corresponding to the first partial region based on a magnification ratio of the magnified image relative to the viewed image and a position of the first partial region in the viewed image.
8. The method of claim 7, wherein determining a second partial region in the magnified image that corresponds to the first partial region based on a magnification scale of the magnified image relative to the viewed image and a location of the first partial region in the viewed image comprises:
determining position information of a first center point of the first local area in the viewed image;
determining, in the enlarged image, position information of a second center point corresponding to the first center point based on an enlargement ratio of the enlarged image with respect to the viewed image and position information of the first center point in the viewed image;
determining a second local region in the enlarged image, which has the second center point as a center point and has a region shape identical to that of the first local region, based on the position information of the second center point in the enlarged image, the enlargement ratio, and the region shape of the first local region.
9. The method of claim 1,
the magnified image is an image with a resolution greater than that of the viewed image, which is pre-drawn according to the viewed image;
or,
the magnified image is an image obtained by magnifying the viewed image by a target magnification.
10. A method of locally magnifying an image, the method comprising:
opening a target application program when an operation of opening the target application program is detected;
when an operation instruction for displaying a viewing image is detected, displaying the viewing image through an image viewing tool of the target application program;
displaying, when a zoom-in operation triggered within a first partial area on the viewing image is detected, a viewing image overlaid with a second area image that is image content within a second partial area corresponding to the first partial area in the magnified image;
the magnified image is another image with the same display content as the viewed image but higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple.
11. An apparatus for locally magnifying an image, the apparatus comprising:
the acquisition module is used for acquiring a first local area to be amplified on a viewing image and an amplified image corresponding to the viewing image;
a first determining module for determining a second local region in the magnified image corresponding to the first local region;
a second determining module, configured to overlay a second area image in the second local area on a first area image in the first local area to obtain a viewing image overlaid with the second area image;
and the display module is used for displaying the viewing image covered with the second area image.
12. The apparatus of claim 11, wherein the second determining module is specifically configured to:
obtaining a mask layer, wherein the mask layer is matched with the second local area and the transparency of the mask layer is larger than zero;
covering the mask layer on the amplified image to obtain a second area image in the second local area;
and overlaying the second area image on the first area image of the first local area to obtain a viewing image overlaid with the second area image.
13. An apparatus for locally magnifying an image, the apparatus comprising:
an opening module, used for opening a target application program when an operation of opening the target application program is detected;
the first display module is used for displaying the viewed image through an image viewing tool of the target application program when an operation instruction for displaying the viewed image is detected;
a second display module, configured to display, when a zoom-in operation triggered in a first partial area on the viewing image is detected, the viewing image overlaid with a second area image that is image content in a second partial area corresponding to the first partial area in the magnified image;
the magnified image is another image with the same display content as the viewed image but higher resolution, or the magnified image is an image obtained by magnifying the viewed image according to a target multiple.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, the at least one instruction being loaded and executed by the processor to implement a method of locally magnifying an image as claimed in any one of claims 1 to 9.
15. A computer-readable storage medium having stored thereon at least one instruction which is loaded and executed by a processor to implement a method of locally magnifying an image according to any one of claims 1 to 9.
CN201911175802.5A 2019-11-26 2019-11-26 Method, device and storage medium for locally amplifying image Active CN110941375B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911175802.5A CN110941375B (en) 2019-11-26 2019-11-26 Method, device and storage medium for locally amplifying image

Publications (2)

Publication Number Publication Date
CN110941375A (en) 2020-03-31
CN110941375B CN110941375B (en) 2023-09-05

Family

ID=69908697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911175802.5A Active CN110941375B (en) 2019-11-26 2019-11-26 Method, device and storage medium for locally amplifying image

Country Status (1)

Country Link
CN (1) CN110941375B (en)

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101510008A (en) * 2008-01-14 2009-08-19 捷讯研究有限公司 Using a shape-changing display as an adaptive lens for selectively magnifying information displayed onscreen
CN102129338A (en) * 2010-01-12 2011-07-20 宏碁股份有限公司 Image amplification method and computer system
CN102208171A (en) * 2010-03-31 2011-10-05 安凯(广州)微电子技术有限公司 Local detail playing method on portable high-definition video player
CN102339469A (en) * 2010-07-21 2012-02-01 腾讯科技(深圳)有限公司 Image processing method and device
CN102157011A (en) * 2010-12-10 2011-08-17 北京大学 Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
US20120288215A1 (en) * 2011-05-13 2012-11-15 Altek Corporation Image processing device and processing method thereof
CN106056038A (en) * 2012-03-12 2016-10-26 佳能株式会社 Image display apparatus and image display method
CN102662566A (en) * 2012-03-21 2012-09-12 中兴通讯股份有限公司 Magnifying display method and terminal for screen content
JP2014067208A (en) * 2012-09-26 2014-04-17 Sharp Corp Image display system and control method therefor
CN103019537A (en) * 2012-11-19 2013-04-03 广东欧珀移动通信有限公司 Image preview method and image preview device
CN103076952A (en) * 2012-12-21 2013-05-01 天津三星光电子有限公司 Amplifying method and amplifying device for display contents in display interface of intelligent camera
CN103226806A (en) * 2013-04-03 2013-07-31 广东欧珀移动通信有限公司 Method and camera system for enlarging picture partially
JP2015060144A (en) * 2013-09-20 2015-03-30 株式会社Nttドコモ Map image display device
CN104077387A (en) * 2014-06-27 2014-10-01 北京奇虎科技有限公司 Webpage content display method and browser device
CN104036453A (en) * 2014-07-03 2014-09-10 上海斐讯数据通信技术有限公司 Image local deformation method and image local deformation system and mobile phone with image local deformation method
CN104750389A (en) * 2015-03-31 2015-07-01 努比亚技术有限公司 Picture displaying method and device
CN105120366A (en) * 2015-08-17 2015-12-02 宁波菊风系统软件有限公司 A presentation method for an image local enlarging function in video call
CN105159581A (en) * 2015-08-27 2015-12-16 广东欧珀移动通信有限公司 Picture browsing method and mobile terminal
CN105155153A (en) * 2015-08-28 2015-12-16 深圳思瑞普科技有限公司 Processing method for displaying local patterns in computerized embroidery machine
CN106055247A (en) * 2016-05-25 2016-10-26 努比亚技术有限公司 Picture display device, method and mobile terminal
CN106126088A (en) * 2016-06-15 2016-11-16 青岛海信移动通信技术股份有限公司 A kind of picture amplifies the method and device of display
CN106909274A (en) * 2017-02-27 2017-06-30 努比亚技术有限公司 A kind of method for displaying image and device
CN109062405A (en) * 2018-07-23 2018-12-21 努比亚技术有限公司 Mobile terminal display methods, mobile terminal and computer readable storage medium
CN109324736A (en) * 2018-08-31 2019-02-12 阿里巴巴集团控股有限公司 The exchange method and device of partial enlargement picture
CN109901768A (en) * 2019-02-27 2019-06-18 深圳市沃特沃德股份有限公司 Amplification display method, device, storage medium and the computer equipment of interface image

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541845A (en) * 2020-04-30 2020-08-14 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN111541845B (en) * 2020-04-30 2022-06-24 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN111489295A (en) * 2020-06-29 2020-08-04 平安国际智慧城市科技股份有限公司 Image processing method, electronic device, and storage medium
CN111489295B (en) * 2020-06-29 2020-11-17 平安国际智慧城市科技股份有限公司 Image processing method, electronic device, and storage medium
CN113364976A (en) * 2021-05-10 2021-09-07 荣耀终端有限公司 Image display method and electronic equipment
CN113364976B (en) * 2021-05-10 2022-07-15 荣耀终端有限公司 Image display method and electronic equipment
WO2022237287A1 (en) * 2021-05-10 2022-11-17 荣耀终端有限公司 Image display method and electronic device
CN114356191A (en) * 2022-03-04 2022-04-15 北京阿丘科技有限公司 Circuit board image display method, device, equipment and storage medium
CN114356191B (en) * 2022-03-04 2022-07-12 北京阿丘科技有限公司 Circuit board image display method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110941375B (en) 2023-09-05

Similar Documents

Publication Publication Date Title
CN111464749B (en) Method, device, equipment and storage medium for image synthesis
CN108449641B (en) Method, device, computer equipment and storage medium for playing media stream
CN110308956B (en) Application interface display method and device and mobile terminal
CN109862412B (en) Method and device for video co-shooting and storage medium
CN110941375B (en) Method, device and storage medium for locally amplifying image
CN108965922B (en) Video cover generation method and device and storage medium
CN109922356B (en) Video recommendation method and device and computer-readable storage medium
CN109886208B (en) Object detection method and device, computer equipment and storage medium
CN110225390B (en) Video preview method, device, terminal and computer readable storage medium
CN109302632B (en) Method, device, terminal and storage medium for acquiring live video picture
CN111754386B (en) Image area shielding method, device, equipment and storage medium
CN110839174A (en) Image processing method and device, computer equipment and storage medium
CN112667835A (en) Work processing method and device, electronic equipment and storage medium
CN112565806A (en) Virtual gift presenting method, device, computer equipment and medium
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN110677713B (en) Video image processing method and device and storage medium
CN111385525B (en) Video monitoring method, device, terminal and system
CN108965769B (en) Video display method and device
CN112396076A (en) License plate image generation method and device and computer storage medium
CN110992268B (en) Background setting method, device, terminal and storage medium
CN113160031A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111370096A (en) Interactive interface display method, device, equipment and storage medium
CN111327819A (en) Method, device, electronic equipment and medium for selecting image
CN111860064A (en) Target detection method, device and equipment based on video and storage medium
CN113467682B (en) Method, device, terminal and storage medium for controlling movement of map covering

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40021087

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant