CN112308780A - Image processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112308780A
Authority
CN
China
Prior art keywords
image
resolution
super
display
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011189933.1A
Other languages
Chinese (zh)
Inventor
熊一能
符乐安
唐志新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202011189933.1A priority Critical patent/CN112308780A/en
Publication of CN112308780A publication Critical patent/CN112308780A/en
Priority to PCT/CN2021/116137 priority patent/WO2022088970A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G06T 3/40 Scaling the whole image or part thereof
    • G06T 3/4046 Scaling the whole image or part thereof using neural networks
    • G06T 3/4053 Super resolution, i.e. output image resolution higher than sensor resolution
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on GUIs using icons
    • G06F 3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The present disclosure relates to an image processing method, apparatus, device and storage medium. The method comprises: receiving an enlargement instruction for a displayed image; acquiring, according to the enlargement instruction, the image region designated for enlargement on the displayed image as a target image; performing super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image; and displaying the super-resolution magnified image overlaid on the displayed image. The technical solution provided by the embodiments of the present disclosure is applicable to processing pictures or videos. It allows a user to select a local region of the image displayed by a terminal and performs super-resolution enlargement and display based on the region the user selected, which satisfies the user's need to magnify a local region of an image at super resolution and improves the user experience.

Description

Image processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing an image.
Background
A video application provided in the related art can offer users a video browsing service, but, constrained by the network environment, the size of the video data and the frame-level video coding scheme, it cannot display an arbitrary region of a video image clearly to the user. The video display effect is therefore poor and the user experience suffers.
Disclosure of Invention
To solve, or at least partially solve, the above technical problem, the present disclosure provides an image processing method, apparatus, device and storage medium.
A first aspect of the embodiments of the present disclosure provides an image processing method, the method comprising:
receiving an enlargement instruction for a displayed image;
acquiring, according to the enlargement instruction, the image region designated for enlargement on the displayed image as a target image;
performing super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image;
and displaying the super-resolution magnified image overlaid on the displayed image.
A second aspect of the embodiments of the present disclosure provides an image processing apparatus, comprising:
a receiving module, configured to receive an enlargement instruction for a displayed image;
a first acquisition module, configured to acquire, according to the enlargement instruction, the image region designated for enlargement on the displayed image as a target image;
a super-resolution enlargement module, configured to perform super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image;
and a display module, configured to display the super-resolution magnified image overlaid on the displayed image.
A third aspect of the embodiments of the present disclosure provides a terminal device, comprising a memory and a processor, wherein the memory stores a computer program which, when executed by the processor, implements the method of the first aspect above.
A fourth aspect of the embodiments of the present disclosure provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method of the first aspect above.
Compared with the prior art, the technical solution provided by the embodiments of the present disclosure has the following advantages:
The technical solution provided by the embodiments of the present disclosure allows a user to select a local region of the image displayed by a terminal, and performs super-resolution enlargement and display based on the region the user selected. This satisfies the user's need to magnify a local region of an image at super resolution and improves the user experience.
Compared with full-scene, full-time video super-resolution techniques, letting the user select the target image autonomously reduces the area that must be enlarged and lowers the computational load, so terminal power consumption need not be a major concern. Moreover, this approach delivers a stronger perceived improvement in sharpness without degrading service performance.
The technical solution provided by the embodiments of the present disclosure is applicable to processing pictures or videos. Applied to a video playback scenario (such as short video, long video or live streaming), it fills the current gap in which local regions of a video are not enlarged at all.
Applied to a mobile terminal (such as a mobile phone), the technical solution provided by the embodiments of the present disclosure can make full use of the computing power of the phone's embedded neural processing unit (NPU), which helps bring large, high-performing high-definition enhancement algorithms to mobile devices.
According to the technical solution provided by the embodiments of the present disclosure, if the super-resolution magnified image contains a pre-calibrated object, the calibration information of the object is displayed on the terminal's display device together with the super-resolution magnified image of the object. Because the calibration information may contain information not included in the magnified image, it essentially supplements the information related to the calibrated object that cannot be obtained from the magnified image alone. The user can therefore learn the related content quickly and conveniently without searching the web for the calibrated object, which simplifies the user's operations and improves the user's perception. This approach is also conducive to deriving new business models.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, a person skilled in the art can derive other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a method for processing an image according to an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of performing step 120 according to an embodiment of the present disclosure;
Fig. 3 is a schematic diagram of the displayed image before the image processing method provided by the present disclosure is performed, according to an embodiment of the present disclosure;
Figs. 4-6 are schematic diagrams of the displayed image after the image processing method provided by the present disclosure is performed, according to an embodiment of the present disclosure;
Fig. 7 is a flowchart of another image processing method provided by an embodiment of the present disclosure;
fig. 8 is a block diagram of a device for processing an image according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a terminal device in an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present disclosure; the present disclosure may, however, be practiced in ways other than those described herein. Obviously, the embodiments described in the specification are only some, not all, of the embodiments of the present disclosure.
Fig. 1 is a flowchart of an image processing method provided by an embodiment of the present disclosure; the method may be performed by a terminal device. The terminal device may be understood as a device with image processing capabilities, such as a mobile phone, a tablet computer, a notebook computer, a desktop computer or a smart television. In some embodiments, the terminal device is equipped with an image display device (such as, but not limited to, a display screen); the terminal device displays images through the display screen and uses the method of this embodiment to display a local region of an image (i.e., the target image) enlarged at super resolution. In other embodiments, the terminal device may additionally be equipped with a camera; after the camera captures an image, the terminal device can use the method of this embodiment to display a local region of that image enlarged at super resolution.
The method provided by the present application is referred to as an image processing method, where the image may be a picture or a video. If the image is a video, it may specifically be a short video, a long video or a live stream.
As shown in fig. 1, the method provided by this embodiment includes the following steps:
Step 110: receive an enlargement instruction for the displayed image.
The enlargement instruction instructs the terminal to enlarge a certain local region (i.e., the target image) of the image it is displaying. The instruction therefore plays two roles: starting the enlargement function and determining the target image.
Optionally, the enlargement instruction is triggered by the user's operation of selecting an image area on the displayed image.
There are various ways for the user to select an image area on the displayed image, and the present application does not limit them. For example, a virtual key that starts the enlargement function may be displayed in advance on the terminal's screen; when the terminal recognizes that the user has tapped the virtual key, it enters an image-area selection interface. Alternatively, a particular touch gesture (for example, drawing a specific figure with a finger, such as an "L", "W" or "O") may be preset to bring the terminal into the image-area selection interface; when the terminal recognizes that the user has input the preset gesture, it enters the selection interface.
After entering the target-image selection interface, the user selects the image area to be enlarged with a finger or with a first icon provided on the display interface (such as a brush tool or a magnifier tool). Specifically, the user may tap the displayed image with a finger, and the image area within a preset range centred on the tapped position is taken as the target image; or the user may drag the first icon to a position on the displayed image, and the image area within a preset range centred on that position is taken as the target image; or the user may draw a closed area (such as a circle or a quadrilateral) on the displayed image with a finger or with the first icon, and the image enclosed by the drawn area is taken as the target image.
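As an illustration only (the function name, parameter names and pixel half-size are hypothetical, not from the patent), the tap-centred selection described above can be sketched as a clamped square region around the tap:

```python
def region_from_tap(tap_x, tap_y, img_w, img_h, half_size=50):
    """Map a tap position to a candidate target-image region: a square of
    side 2 * half_size centred on the tap, clamped to the image bounds.
    half_size (in pixels) stands in for the patent's 'preset range'."""
    left = max(0, tap_x - half_size)
    top = max(0, tap_y - half_size)
    right = min(img_w, tap_x + half_size)
    bottom = min(img_h, tap_y + half_size)
    return left, top, right, bottom
```

Clamping ensures a tap near an edge of the displayed image still yields a valid region.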
Optionally, the virtual key that brings the terminal into the image-area selection interface and the first icon used to select the image area may be the same UI icon, for example a magnifier icon.
Alternatively, starting the enlargement function and selecting the image area may be completed in a single operation. For example, it may be preset that when a certain touch gesture is received (such as drawing an "L", "W" or "O" on the screen), the terminal automatically starts the enlargement function and determines the image area to be enlarged from the touch trajectory of that gesture.
Step 120: acquire, according to the enlargement instruction, the image region designated for enlargement on the displayed image as the target image.
There are various ways to implement this step, and the present disclosure does not limit them. In practice, different implementations of this step may be configured for different input modes of the enlargement instruction.
For example, if selecting the image area through the first icon provided on the display interface comprises designating a position of the first icon on the displayed image, this step may be implemented by: acquiring the position of the first icon on the displayed image; and, taking that position as the centre, acquiring the image area within a preset range around it as the target image.
Fig. 2 is a schematic diagram of performing step 120 according to an embodiment of the present disclosure. Referring to Fig. 2, assume that A is the image currently displayed by the terminal and that the preset range is defined as the region obtained by taking the designated position of the first icon on the displayed image as the centre of a circle with a radius of 1 cm. Continuing with Fig. 2, if the user drags the magnifier tool to point B, the designated position of the magnifier tool on the displayed image is point B. When this step is executed, the position data of point B is obtained first, the preset range is then determined with point B as the centre and 1 cm as the radius, and the image area within that range is taken as the target image.
Note that the circular preset range with a 1 cm radius in the above example is only one specific example of the present disclosure. The present disclosure does not limit the shape of the preset range; optionally, it may be a polygon, an ellipse, or the like. Nor does the present application limit the size of the preset range.
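The circular preset range of the Fig. 2 example can be sketched as follows; the screen density used to convert the 1 cm radius into pixels is an assumption, as are the function and parameter names:

```python
def preset_range_pixels(center, radius_cm, pixels_per_cm):
    """Return the set of pixel coordinates inside the circular preset
    range: centre at the designated icon position, radius given in cm
    and converted to pixels via the (assumed) screen density."""
    cx, cy = center
    r = radius_cm * pixels_per_cm
    r_int = int(r)
    return {
        (x, y)
        for x in range(cx - r_int, cx + r_int + 1)
        for y in range(cy - r_int, cy + r_int + 1)
        if (x - cx) ** 2 + (y - cy) ** 2 <= r * r  # inside the circle
    }
```

A polygonal or elliptical preset range would only change the membership predicate, not the overall structure.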
Optionally, if selecting the target image through the first icon on the display interface comprises delimiting, through the first icon, the position of the target image's region on the displayed image, this step may be implemented by: acquiring the position of the region delimited by the first icon on the displayed image; and acquiring the image at that region as the target image.
For example, suppose the user drags the magnifier tool so that the drag trajectory forms a closed area, i.e., the position of the target image's region on the displayed image is delimited by the magnifier tool. When this step is executed, the position of the area enclosed by the closed trajectory is acquired, and the image within that area is taken as the target image.
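A minimal sketch of the closed-trajectory case, assuming the enclosed region is approximated by the trajectory's axis-aligned bounding box (a simplification for illustration; the patent does not prescribe this):

```python
def region_from_trajectory(points):
    """Treat the drag trajectory of the first icon, a closed loop of
    (x, y) samples, as delimiting the target image, and return the
    axis-aligned bounding box (left, top, right, bottom) it encloses."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return min(xs), min(ys), max(xs), max(ys)
```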
Step 130: perform super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image.
There are various ways to implement this step. For example, super-resolution enlargement may be performed on the target image based on a preset model to obtain a super-resolution magnified image at the target resolution.
Optionally, the preset model comprises an adversarial network model.
For example, an image super-resolution technique based on deep convolutional neural networks may be combined with a generative adversarial network (GAN) to make the super-resolution magnified image more perceptually realistic. The user can then clearly see, for instance, a previously blurred licence plate and its characters, a product's brand and details, a person's face, or the craters on the surface of the moon, achieving a striking magnification effect.
A typical adversarial network model consists of two parts: an image generator and an image discriminator. The generator produces an image of higher resolution from the original resolution of the input image, and the discriminator judges whether the resolution of the newly generated image reaches a preset standard. If it does not, training of the generator continues until the generator can produce images exceeding the preset standard, at which point the adversarial network model of this embodiment is obtained.
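The generator/discriminator loop described above might be sketched as follows. This is a schematic skeleton with placeholder callables, not a real GAN implementation; all names are illustrative assumptions:

```python
def train_sr_generator(generator, discriminator, update_generator,
                       samples, quality_threshold, max_rounds=100):
    """Sketch of the adversarial loop: the generator upscales each
    sample, the discriminator scores the result, and the generator is
    updated until every output meets the preset quality standard."""
    for round_no in range(max_rounds):
        scores = [discriminator(generator(s)) for s in samples]
        if min(scores) >= quality_threshold:
            return round_no  # generator now meets the preset standard
        update_generator(scores)  # otherwise keep training the generator
    return max_rounds
```

In a real system the generator and discriminator would be convolutional networks trained jointly, with the discriminator itself also being updated each round.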
Step 140: display the super-resolution magnified image overlaid on the displayed image.
When this step is executed, the present disclosure does not limit where the super-resolution magnified image is overlaid relative to the originally displayed image.
Fig. 3 is a schematic diagram of the displayed image before the image processing method provided by the present disclosure is performed, and Figs. 4-6 are schematic diagrams of the displayed image after the method is performed, according to an embodiment of the present disclosure. Referring to Fig. 3, before the method is performed, image A is displayed on the terminal; super-resolution enlargement is then performed on the target image B1, yielding its super-resolution magnified image B2. When step 140 is executed, optionally, as shown in Fig. 4, the magnified image B2 is placed over the original target image B1 with the layer of B2 above the layer of B1, so that B2 occludes B1. Alternatively, as shown in Fig. 5, B2 is placed in a region away from the original target image B1, so that B2 does not occlude B1. Alternatively, as shown in Fig. 6, B2 completely covers the original image A, so that the terminal's display device shows only B2 and no longer shows A.
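The layered placement described above (the magnified layer sitting above the base layer) can be sketched as a simple compositing routine over 2-D pixel grids; the function name and the pixel model are illustrative assumptions:

```python
def overlay(base, patch, top, left):
    """Composite the magnified patch over the displayed image, with the
    patch layer above the base layer. Images are modelled as 2-D lists
    of pixel values; patch pixels falling outside the base are clipped."""
    out = [row[:] for row in base]  # copy so the original image is kept
    h, w = len(out), len(out[0])
    for i, prow in enumerate(patch):
        for j, p in enumerate(prow):
            if 0 <= top + i < h and 0 <= left + j < w:
                out[top + i][left + j] = p
    return out
```

Choosing (top, left) equal to the target image's position gives the Fig. 4 placement; choosing a position away from it gives the Fig. 5 placement.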
The essence of the above technical solution is to allow the user to select a local region of the image displayed by the terminal and to perform super-resolution enlargement and display based on the region the user selected. This satisfies the user's need to magnify a local region of an image at super resolution and improves the user experience.
In addition, compared with full-scene, full-time video super-resolution techniques, letting the user select the target image autonomously reduces the area that must be enlarged and lowers the computational load, so terminal power consumption need not be a major concern. Moreover, this approach delivers a stronger perceived improvement in sharpness without degrading service performance.
Furthermore, the technical solution is applicable to processing pictures or videos. Applied to a video playback scenario (such as short video, long video or live streaming), it fills the current gap in which local regions of a video are not enlarged at all.
Applied to a mobile terminal (such as a mobile phone), the technical solution can make full use of the computing power of the phone's embedded neural processing unit (NPU), which helps bring large, high-performing high-definition enhancement algorithms to mobile devices.
Fig. 7 is a flowchart of another image processing method provided by an embodiment of the present disclosure; it is a specific example of Fig. 1. Referring to Fig. 7, the image processing method includes:
Step 210: receive an enlargement instruction for the displayed image, then perform step 220.
Step 220: acquire, according to the enlargement instruction, the image region designated for enlargement on the displayed image as the target image, then perform step 230.
Step 230: perform super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image, then perform step 240.
Step 240: display the super-resolution magnified image overlaid on the displayed image, then perform step 250.
Step 250: judge whether the super-resolution magnified image includes a pre-calibrated object. If yes, perform step 260; if not, end.
In practice, an image often includes at least one object, and some or all of the objects in the image may be calibrated in advance. Calibration means establishing an association between a selected object (i.e., the calibrated object) and its calibration information, where the calibration information is information related to that object.
For example, suppose the image shows a girl carrying a pink bag and wearing black shoes. The pink bag alone may be calibrated in advance, the black shoes alone may be calibrated, or both may be calibrated at the same time.
Further, assuming only the pink bag is calibrated, the calibration information is attribute information reflecting the pink bag. For example, the calibration information may be set to a purchase link for the bag, its selling price, multi-angle pictures of it, its item number, and so on.
Note that, in practice, the calibration information of an object may partially overlap with, or be entirely different from, what the calibrated object shows in the super-resolution magnified image. For example, if the bag's brand logo is clearly visible in the magnified image, the calibration information associated with the bag may or may not include that logo.
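One way to model the association between a calibrated object and its calibration information is a simple keyed store; every name, field and value below is a hypothetical illustration, not part of the patent:

```python
# Hypothetical calibration store: the association, established in
# advance, between a calibrated object and its calibration information.
CALIBRATIONS = {
    "pink_bag": {
        "purchase_link": "https://example.com/item/pink-bag",  # placeholder
        "price": "199.00",
        "item_number": "PB-001",
    },
}

def calibration_info(object_id):
    """Return the pre-stored calibration information for a calibrated
    object, or None if the object was never calibrated."""
    return CALIBRATIONS.get(object_id)
```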
Note that calibrating the objects in the image must be performed before this step. The entity that calibrates the objects in the image and the entity that performs this step may be the same or different.
There are many practical scenarios that call for calibrating objects in an image, and the present application does not limit them. For example, when a merchant produces a promotional video for a new product, the merchant calibrates the goods shown in the video.
This step can be implemented in various ways; for example, image recognition is performed on the super-resolution magnified image, and whether it includes a pre-calibrated object is judged from the recognition result.
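Assuming an image-recognition step that returns object labels, the judgement of step 250 reduces to filtering those labels against the set of pre-calibrated identifiers; names here are illustrative:

```python
def calibrated_objects_in(recognized_labels, calibrated_ids):
    """Step 250 as a filter: image recognition yields the labels present
    in the magnified image; keep those that were calibrated in advance."""
    return [label for label in recognized_labels if label in calibrated_ids]
```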
Step 260: acquire the calibration information of the object and display it overlaid on the displayed image.
When this step is executed, the present disclosure does not limit where the calibration information is displayed. Optionally, the calibration information and the super-resolution magnified image are displayed so as not to occlude each other.
According to the above technical solution, if the super-resolution magnified image includes a pre-calibrated object, the calibration information of the object is displayed on the terminal's display device together with the super-resolution magnified image of the object. Because the calibration information may contain information not included in the magnified image, it essentially supplements the information related to the calibrated object that cannot be obtained from the magnified image alone. The user can therefore learn the related content quickly and conveniently without searching the web for the calibrated object, which simplifies the user's operations and improves the user's perception. This approach is also conducive to deriving new business models.
Fig. 8 is a block diagram of an image processing apparatus provided by an embodiment of the present disclosure. Referring to Fig. 8, the image processing apparatus includes:
a receiving module 310, configured to receive an enlargement instruction for a displayed image;
a first acquisition module 320, configured to acquire, according to the enlargement instruction, the image region designated for enlargement on the displayed image as a target image;
a super-resolution enlargement module 330, configured to perform super-resolution enlargement on the target image to obtain a super-resolution magnified image of the target image;
and a display module 340, configured to display the super-resolution magnified image overlaid on the displayed image.
Further, the enlargement instruction is triggered by the user's operation of selecting an image area on the displayed image.
Further, the user's operation of selecting an image area includes:
selecting the image area through a first icon provided on the display interface.
Further, the selecting of an image area through a first icon provided on the display interface includes:
designating a position of the first icon on the display image;
the first obtaining module includes:
the first obtaining sub-module is used for obtaining the position of the first icon on the display image; and taking the position as a center, and acquiring an image area in a preset range around the position as a target image.
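The behaviour of the first obtaining sub-module above (take the icon position as the centre and acquire the image area within a preset range around it) can be sketched as follows. This is an illustrative sketch only: the function name, the size of the preset range, and the clamping policy at the image border are assumptions, not part of the disclosure.

```python
import numpy as np

def crop_around_icon(display_image, icon_pos, preset_range=64):
    """Take icon_pos (x, y) as the centre and return the image area within
    the preset range around it as the target image, clamping the window so
    the crop never extends beyond the display image."""
    h, w = display_image.shape[:2]
    half = preset_range // 2
    # Shift the window back inside the frame when the icon sits near an edge.
    x0 = min(max(icon_pos[0] - half, 0), max(w - preset_range, 0))
    y0 = min(max(icon_pos[1] - half, 0), max(h - preset_range, 0))
    return display_image[y0:y0 + preset_range, x0:x0 + preset_range]

frame = np.zeros((480, 640), dtype=np.uint8)
# Icon near the top-left corner: the window is clamped, not truncated.
target = crop_around_icon(frame, icon_pos=(5, 5), preset_range=64)
```

The clamping choice means the target image always has the full preset size, which keeps the downstream super-resolution input shape fixed.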
Further, the selecting of an image area through a first icon provided on the display interface includes:
a region is defined on the display image through the first icon to serve as an image region of a target image;
the first obtaining module includes:
the second obtaining submodule is used for obtaining the position of the area defined by the first icon on the display image; and acquiring the image at the position as a target image.
Further, the super-resolution magnification module is configured to:
perform super-resolution magnification processing on the target image based on a preset adversarial network model, to obtain a super-resolution magnified image at the target resolution.
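The disclosure does not spell out the network internals. As background, many adversarially trained super-resolution generators (e.g. SRGAN-style models) end with a sub-pixel "pixel shuffle" layer that rearranges channel depth into spatial resolution. The following is a minimal NumPy sketch of that rearrangement step alone, not of the trained adversarial model itself; all names are illustrative assumptions.

```python
import numpy as np

def pixel_shuffle(feature_map, scale):
    """Rearrange a (C*scale**2, H, W) feature map into (C, H*scale, W*scale),
    the depth-to-space step used by sub-pixel super-resolution layers."""
    c2, h, w = feature_map.shape
    c = c2 // (scale * scale)
    # Split the channel axis into (C, scale, scale), then interleave the two
    # scale axes with the spatial axes.
    x = feature_map.reshape(c, scale, scale, h, w)
    x = x.transpose(0, 3, 1, 4, 2)  # (C, H, scale, W, scale)
    return x.reshape(c, h * scale, w * scale)

features = np.random.rand(4, 8, 8).astype(np.float32)  # C=1, scale=2
sr = pixel_shuffle(features, scale=2)                   # shape (1, 16, 16)
```

In a full model, a convolutional generator would produce `features` from the target image, and the adversarial training objective would shape the recovered detail; this sketch only shows the resolution-increasing rearrangement.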
Further, the apparatus further comprises:
the determining module is used for determining, after the super-resolution magnification module performs super-resolution magnification processing on the target image to obtain the super-resolution magnified image of the target image, whether the super-resolution magnified image includes a pre-calibrated object;
the second obtaining module is used for obtaining the calibration information of the object when the super-resolution magnified image includes the pre-calibrated object;
the display module is further configured to display the calibration information of the object on the display image in an overlaid manner.
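Purely as an illustration (the disclosure specifies neither the recognition model nor the calibration data source, and every name below is an assumption), the determining module and the second obtaining module can be pictured as a detector plus a calibration lookup:

```python
# Hypothetical calibration store mapping a detected object label to
# pre-calibrated information; in a real system this might be a remote service.
CALIBRATION_DB = {
    "landmark_tower": "Built in 1998; 368 m tall; observation deck on floor 88.",
}

def detect_object(sr_image):
    """Stub for the pre-calibrated-object detector; a real system would run
    a recognition model on the super-resolution magnified image."""
    return "landmark_tower"

def get_calibration_info(sr_image):
    """Return calibration info only when the magnified image contains a
    pre-calibrated object; None otherwise."""
    label = detect_object(sr_image)
    return CALIBRATION_DB.get(label)

info = get_calibration_info(sr_image=None)
```

The lookup returning `None` corresponds to the branch where no pre-calibrated object is found and nothing extra is overlaid.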
Since the image processing apparatus provided in the embodiment of the present disclosure can be configured to execute any of the image processing methods provided in the embodiments of the present disclosure, it has the same or corresponding beneficial effects as those methods; details are not repeated here.
The embodiment of the present disclosure further provides a terminal device, which includes a processor and a memory, where the memory stores a computer program, and when the computer program is executed by the processor, the method of any one of the above-mentioned embodiments in fig. 1 to 7 may be implemented.
For example, fig. 9 is a schematic structural diagram of a terminal device in the embodiment of the present disclosure. Referring now specifically to fig. 9, a schematic diagram of a terminal device 1000 suitable for implementing embodiments of the present disclosure is shown. The terminal apparatus 1000 in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), and the like, and a fixed terminal such as a digital TV, a desktop computer, and the like. The terminal device shown in fig. 9 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 9, terminal apparatus 1000 can include a processing device (e.g., central processing unit, graphics processor, etc.) 1001 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 1002 or a program loaded from a storage device 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for the operation of the terminal apparatus 1000 are also stored. The processing device 1001, the ROM 1002, and the RAM 1003 are connected to each other by a bus 1004. An input/output (I/O) interface 1005 is also connected to bus 1004.
Generally, the following devices may be connected to the I/O interface 1005: input devices 1006 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 1007 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 1008 including, for example, magnetic tape, hard disk, and the like; and a communication device 1009. The communication means 1009 may allow the terminal apparatus 1000 to perform wireless or wired communication with other apparatuses to exchange data. While fig. 9 illustrates a terminal apparatus 1000 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 1009, or installed from the storage means 1008, or installed from the ROM 1002. The computer program, when executed by the processing device 1001, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be included in the terminal device; or may exist separately without being assembled into the terminal device.
The computer readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to:
receiving a magnification instruction for a display image;
acquiring, according to the magnification instruction, an image area designated to be magnified on the display image as a target image;
performing super-resolution magnification processing on the target image to obtain a super-resolution magnified image of the target image;
and displaying the super-resolution magnified image on the display image in an overlapping manner.
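The final step above, displaying the magnified image "in an overlapping manner", amounts to compositing the magnified patch over the display frame at a chosen position. A minimal sketch, where the overlay position and the function names are illustrative assumptions:

```python
import numpy as np

def overlay(display_image, sr_patch, top_left):
    """Return a copy of display_image with the super-resolution magnified
    patch drawn over it, with its top-left corner at top_left (y, x)."""
    out = display_image.copy()
    y, x = top_left
    ph, pw = sr_patch.shape[:2]
    # Replace the covered region; the original frame is left untouched.
    out[y:y + ph, x:x + pw] = sr_patch
    return out

frame = np.zeros((480, 640), dtype=np.uint8)
patch = np.full((64, 64), 255, dtype=np.uint8)  # stand-in magnified patch
composited = overlay(frame, patch, top_left=(10, 10))
```

Compositing into a copy keeps the underlying display image available, so the overlay can be dismissed without re-decoding the original frame.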
Computer program code for carrying out operations for the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
An embodiment of the present disclosure further provides a computer-readable storage medium, where a computer program is stored in the storage medium, and when the computer program is executed by a processor, the method in any of the embodiments of fig. 1 to 7 may be implemented, and an execution manner and beneficial effects of the method are similar, and are not described herein again.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. A method of processing an image, comprising:
receiving a magnification instruction for a display image;
acquiring, according to the magnification instruction, an image area designated to be magnified on the display image as a target image;
performing super-resolution magnification processing on the target image to obtain a super-resolution magnified image of the target image;
and displaying the super-resolution magnified image on the display image in an overlapping manner.
2. The method according to claim 1, wherein the magnification instruction is triggered by an operation of a user selecting the image area on the display image.
3. The method of claim 2, wherein the user selecting the image region comprises:
the image area is selected through a first icon provided on the display interface.
4. The method of claim 3, wherein selecting the image area via a first icon provided on the display interface comprises:
designating a position of the first icon on the display image;
the acquiring of the image area specified to be enlarged on the display image as the target image includes:
acquiring the position of the first icon on the display image;
and taking the position as a center, and acquiring an image area in a preset range around the position as a target image.
5. The method of claim 3, wherein selecting the image area via a first icon provided on the display interface comprises:
a region is defined on the display image through the first icon to serve as an image region of a target image;
the acquiring of the image area specified to be enlarged on the display image as the target image includes:
acquiring the position of a region defined by the first icon on the display image;
and acquiring the image at the position as a target image.
6. The method according to any one of claims 1-5, wherein the performing super-resolution magnification processing on the target image to obtain a super-resolution magnified image of the target image comprises:
and performing super-resolution magnification processing on the target image based on a preset adversarial network model to obtain a super-resolution magnified image at the target resolution.
7. The method according to claim 1, wherein after performing super-resolution magnification processing on the target image to obtain a super-resolution magnified image of the target image, the method further comprises:
determining whether the super-resolution magnified image includes a pre-calibrated object;
if so, acquiring the calibration information of the object, and displaying the calibration information of the object on the display image in an overlapping manner.
8. An image processing apparatus characterized by comprising:
a receiving module, configured to receive a magnification instruction for a display image;
a first obtaining module, configured to acquire, according to the magnification instruction, an image area designated to be magnified on the display image as a target image;
a super-resolution magnification module, configured to perform super-resolution magnification processing on the target image to obtain a super-resolution magnified image of the target image;
and a display module, configured to display the super-resolution magnified image on the display image in an overlapping manner.
9. The apparatus according to claim 8, wherein the magnification instruction is triggered by an operation of a user selecting the image area on the display image.
10. The apparatus of claim 9, wherein the operation of the user selecting the image region comprises:
the image area is selected through a first icon provided on the display interface.
11. The apparatus of claim 10, wherein selecting the image area through a first icon provided on the display interface comprises:
designating a position of the first icon on the display image;
the first obtaining module includes:
the first obtaining sub-module is used for obtaining the position of the first icon on the display image; and taking the position as a center, and acquiring an image area in a preset range around the position as a target image.
12. The apparatus of claim 10, wherein selecting the image area through a first icon provided on the display interface comprises:
a region is defined on the display image through the first icon to serve as an image region of a target image;
the first obtaining module includes:
the second obtaining submodule is used for obtaining the position of the area defined by the first icon on the display image; and acquiring the image at the position as a target image.
13. The apparatus according to any of claims 8-12, wherein the super-resolution magnification module is configured to:
and performing super-resolution magnification processing on the target image based on a preset adversarial network model to obtain a super-resolution magnified image at the target resolution.
14. The apparatus of claim 8, further comprising:
a determining module, configured to determine, after the super-resolution magnification module performs super-resolution magnification processing on the target image to obtain the super-resolution magnified image of the target image, whether the super-resolution magnified image includes a pre-calibrated object;
a second obtaining module, configured to obtain calibration information of the object when the super-resolution magnified image includes the pre-calibrated object;
the display module is further configured to display the calibration information of the object on the display image in an overlaid manner.
15. A terminal device, comprising:
a memory and a processor;
wherein the memory has stored therein a computer program which, when executed by the processor, performs the method of any one of claims 1-7.
16. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202011189933.1A 2020-10-30 2020-10-30 Image processing method, device, equipment and storage medium Pending CN112308780A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011189933.1A CN112308780A (en) 2020-10-30 2020-10-30 Image processing method, device, equipment and storage medium
PCT/CN2021/116137 WO2022088970A1 (en) 2020-10-30 2021-09-02 Image processing method and apparatus, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011189933.1A CN112308780A (en) 2020-10-30 2020-10-30 Image processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112308780A true CN112308780A (en) 2021-02-02

Family

ID=74334148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011189933.1A Pending CN112308780A (en) 2020-10-30 2020-10-30 Image processing method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112308780A (en)
WO (1) WO2022088970A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022088970A1 (en) * 2020-10-30 2022-05-05 北京字跳网络技术有限公司 Image processing method and apparatus, device, and storage medium
CN114598823A (en) * 2022-03-11 2022-06-07 北京字跳网络技术有限公司 Special effect video generation method and device, electronic equipment and storage medium
TWI825951B (en) * 2022-08-26 2023-12-11 瑞昱半導體股份有限公司 Display device and image display method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117834830A (en) * 2022-09-27 2024-04-05 万有引力(宁波)电子科技有限公司 Image processor, processing method, storage medium, and augmented reality display device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6816626B1 (en) * 2001-04-27 2004-11-09 Cisco Technology, Inc. Bandwidth conserving near-end picture-in-picture videotelephony
KR101009881B1 (en) * 2008-07-30 2011-01-19 삼성전자주식회사 Apparatus and method for zoom display of target area from reproducing image
CN104268827B (en) * 2014-09-24 2019-06-04 三星电子(中国)研发中心 The method and apparatus of video image regional area amplification
CN110362250B (en) * 2018-04-09 2021-03-23 杭州海康威视数字技术股份有限公司 Image local amplification method and device and display equipment
CN111741274B (en) * 2020-08-25 2020-12-29 北京中联合超高清协同技术中心有限公司 Ultrahigh-definition video monitoring method supporting local amplification and roaming of picture
CN112308780A (en) * 2020-10-30 2021-02-02 北京字跳网络技术有限公司 Image processing method, device, equipment and storage medium


Also Published As

Publication number Publication date
WO2022088970A1 (en) 2022-05-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination