CN108921212B - Image matching method, mobile terminal and computer readable storage medium - Google Patents


Info

Publication number
CN108921212B
Authority
CN
China
Prior art keywords
matching
image
block
points
determining
Prior art date
Legal status
Active
Application number
CN201810680702.7A
Other languages
Chinese (zh)
Other versions
CN108921212A (en)
Inventor
王乔明
Current Assignee
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nubia Technology Co Ltd
Priority to CN201810680702.7A
Publication of CN108921212A
Application granted
Publication of CN108921212B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/20 - Image preprocessing
    • G06V 10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/267 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Abstract

The invention discloses an image matching method, which comprises the steps of: performing blocking processing on a first image; determining a feature point from each block of the first image after the blocking processing; performing bidirectional matching on the determined feature points in each block of the first image; determining the matching parameters of each block according to the result of the bidirectional matching; and performing matching processing block by block through the matching parameters of each block. The invention also discloses a mobile terminal and a computer readable storage medium. By performing bidirectional matching on the images, the invention solves the problem in the related art of low image matching accuracy for images most of whose area is flat and of pure color, and improves both the matching accuracy and the user experience.

Description

Image matching method, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of mobile communications technologies, and in particular, to an image matching method, a mobile terminal, and a computer-readable storage medium.
Background
Currently, matching technology is widely applied to medical, biological, information processing and other fields, and has become an indispensable technology in image processing application.
Image matching technology has become an extremely important technology in the field of modern information processing, and has wide and practical applications in many fields, such as: pattern recognition, automatic navigation, medical diagnosis, computer vision, image three-dimensional reconstruction, remote sensing image processing and the like. Inaccurate image matching is a bottleneck problem in these application fields, and many important computer vision studies are developed on the premise of supposing that the matching problem is solved.
Image matching is a common fundamental problem in the field of image processing, where one or more transformations are found in a transformation space to make two or more images of the same scene from different times, different sensors, or different perspectives spatially consistent.
Due to variations in shooting time, angle and environment, the use of different sensors and the defects of those sensors, the captured image is affected not only by noise but also by severe gray-scale distortion and geometric distortion. Under these conditions, achieving high precision, high matching accuracy, high speed, strong robustness, anti-interference capability and parallel implementation of the matching algorithm has become the goal pursued in the field.
Image matching is frequently used in the fusion processing of multiple images, for example in multi-frame noise reduction and panoramic stitching, and the accuracy of image matching directly affects the fusion effect. Taking the matching of two images A and B as an example, a common image matching algorithm first takes points uniformly in image A, then finds the corresponding points in image B by an optical flow method, and uses the displacement between the corresponding points as the local registration parameter of the two images near those points.
For the problem in the related art of low image matching accuracy for images most of whose area is flat and of pure color, no solution has been provided so far.
Disclosure of Invention
The invention mainly aims to provide an image matching method, a mobile terminal and a computer readable storage medium, so as to solve the problem in the related art of low image matching accuracy for images most of whose area is flat and of pure color.
In order to achieve the above object, an embodiment of the present invention provides an image matching method, including:
the first image is subjected to blocking processing;
determining a feature point from each block of the first image after the block processing;
performing bidirectional matching on the determined feature points in each block of the first image;
determining the matching parameters of each block according to the result of the bidirectional matching;
and performing matching processing according to the blocks through the matching parameters of each block.
Optionally, determining a feature point from each block of the first image after the block processing includes:
determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or,
and determining the characteristic point with the maximum gradient from each block of the first image after the blocking processing.
Optionally, performing bidirectional matching on the determined feature points in each block of the first image includes:
and carrying out bidirectional matching on the determined feature points in each block of the first image through an optical flow method.
Optionally, determining the matching parameter of each block according to the result of the bidirectional matching includes:
and determining the matching parameters of each block according to the result of the bidirectional matching by a linear interpolation method.
Optionally, performing bidirectional matching on the determined feature points in each block includes:
matching the determined feature points of each block of the first image into a second image;
and matching the matched feature points in the second image to the first image to obtain a matching result.
Optionally, determining the matching parameter of each block according to the result of the bidirectional matching includes:
and obtaining the matching parameters of each block according to the feature points of each block of the first image and the matching result.
Optionally, obtaining a matching parameter of each block according to the feature point of each block of the first image and the matching result includes:
judging whether the difference value of the characteristic point of each block of the first image and the matching result is smaller than a preset threshold value or not;
determining the characteristic points whose difference value from the matching result is smaller than the preset threshold value as accurately matched points, and determining the characteristic points whose difference value from the matching result is larger than or equal to the preset threshold value as inaccurately matched points;
and obtaining the matching parameters of the points with inaccurate matching through a linear interpolation method so as to determine the matching parameters of each block in the first image.
According to another aspect of the embodiments of the present invention, there is provided a mobile terminal, including: a processor, a memory, and a communication bus, wherein,
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is used for executing the image matching program stored in the memory so as to realize the following steps:
the first image is subjected to blocking processing;
determining a feature point from each block of the first image after the block processing;
performing bidirectional matching on the determined feature points in each block of the first image;
determining the matching parameters of each block according to the result of the bidirectional matching;
and performing matching processing according to the blocks through the matching parameters of each block.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or,
and determining the characteristic point with the maximum gradient from each block of the first image after the blocking processing.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
and carrying out bidirectional matching on the determined feature points in each block of the first image through an optical flow method.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
and determining the matching parameters of each block according to the result of the bidirectional matching by a linear interpolation method.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
matching the determined feature points of each block of the first image into a second image;
and matching the matched feature points in the second image to the first image to obtain a matching result.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
and obtaining the matching parameters of each block according to the feature points of each block of the first image and the matching result.
Optionally, the processor is further configured to execute an image matching program to implement the following steps:
judging whether the difference value of the characteristic point of each block of the first image and the matching result is smaller than a preset threshold value or not;
determining the characteristic points whose difference value from the matching result is smaller than the preset threshold value as accurately matched points, and determining the characteristic points whose difference value from the matching result is larger than or equal to the preset threshold value as inaccurately matched points;
and obtaining the matching parameters of the points with inaccurate matching through a linear interpolation method so as to determine the matching parameters of each block in the first image.
According to another aspect of embodiments of the present invention, there is also provided a computer-readable storage medium storing one or more programs, which are executable by one or more processors, to implement the steps of the above-described image matching method.
According to the invention, the first image is subjected to blocking processing; a feature point is determined from each block of the first image after the blocking processing; bidirectional matching is performed on the determined feature points in each block of the first image; the matching parameters of each block are determined according to the result of the bidirectional matching; and matching processing is performed block by block through the matching parameters of each block. This solves the problem in the related art of low image matching accuracy for images most of whose area is flat and of pure color; by performing bidirectional matching on the images, the matching accuracy is improved and the user experience is improved.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a flow chart of an image matching method according to an embodiment of the invention;
fig. 4 is a first schematic diagram of panoramic image acquisition in a mobile terminal according to an embodiment of the present invention;
fig. 5 is a second schematic diagram of panoramic image acquisition in a mobile terminal according to an embodiment of the present invention;
FIG. 6 is a schematic illustration of the principles of rotation and translation according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a mobile terminal according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
The following description will be given by way of example of a mobile terminal, and it will be understood by those skilled in the art that the construction according to the embodiment of the present invention can be applied to a fixed type terminal, in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi belongs to short-distance wireless transmission technology, and the mobile terminal can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 102, and provides wireless broadband internet access for the user. Although fig. 1 shows the WiFi module 102, it is understood that it does not belong to the essential constitution of the mobile terminal, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, the Graphics processor 1041 Processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphic processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) via the microphone 1042 in a phone call mode, a recording mode, a voice recognition mode, or the like, and may be capable of processing such sounds into audio data. The processed audio (voice) data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 101 in case of a phone call mode. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present invention, a communication network system on which the mobile terminal of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present invention, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving gateway) 2034, a PGW (PDN gateway) 2035, and a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME2031 is a control node that handles signaling between the UE201 and the EPC203, and provides bearer and connection management. HSS2032 is used to provide registers to manage functions such as home location register (not shown) and holds subscriber specific information about service characteristics, data rates, etc. All user data may be sent through SGW2034, PGW2035 may provide IP address assignment for UE201 and other functions, and PCRF2036 is a policy and charging control policy decision point for traffic data flow and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present invention is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, the present invention provides various embodiments of the method.
Example 1
Based on the above mobile terminal, an embodiment of the present invention provides an image matching method, and fig. 3 is a flowchart of the image matching method according to the embodiment of the present invention, as shown in fig. 3, the method includes the following steps:
step S301, a first image is subjected to blocking processing;
step S302, determining a characteristic point from each block of the first image after the block processing;
step S303, performing bidirectional matching on the determined feature points in each block of the first image;
step S304, determining the matching parameters of each block according to the result of the bidirectional matching;
step S305, performing matching processing on a block-by-block basis by using the matching parameter of each block.
Through the above steps, the first image is subjected to blocking processing; a feature point is determined from each block of the first image after the blocking processing; bidirectional matching is performed on the determined feature points in each block of the first image; the matching parameters of each block are determined according to the result of the bidirectional matching; and matching processing is performed block by block through the matching parameters of each block. This solves the problem in the related art of low image matching accuracy for images most of whose area is flat and of pure color; by performing bidirectional matching on the images, the matching accuracy is improved and the user experience is improved.
Further, before the step S301, the method further includes: and acquiring a first image and a second image to be matched.
The acquired first image and second image both contain the same target, for example the same animal or object. The two images containing the same target can be obtained directly by shooting, obtained from a storage space, or obtained by capturing frames of a video. If the first image and the second image are color images, the first image and the second image may further be grayed respectively to obtain a grayed first image and a grayed second image.
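For illustration only, a minimal sketch of acquiring and graying the two images to be matched, written in Python with OpenCV; the file names are hypothetical and the images could equally come from the camera preview or from captured video frames.

```python
import cv2

# Hypothetical file names; any two images containing the same target will do.
img_a = cv2.imread("first_image.jpg")   # first image (color)
img_b = cv2.imread("second_image.jpg")  # second image (color)

# If the inputs are color images, gray them before matching, as described above.
gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
```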
When a user shoots a panoramic image through a mobile terminal, the mobile terminal enters a shooting interface after receiving a trigger for shooting. Fig. 4 is a first schematic diagram of panoramic image acquisition in a mobile terminal according to an embodiment of the present invention. As shown in fig. 4, an image acquisition window is displayed on the display interface, the acquired picture is displayed, and the mobile terminal acquires an image.
And the user enters a shooting interface of the mobile terminal through the touch instruction to select a shooting mode. Correspondingly, the mobile terminal acquires a current photographing mode, wherein the current photographing mode is a panoramic photographing mode.
The panoramic shooting mode may be entered through an instruction triggered by the user performing a corresponding mode selection/switching operation; for example, the user triggers the desired mode by operating a virtual key or a physical key of the mobile terminal, and after receiving the instruction triggered by the user's mode selection/switching operation, the mobile terminal obtains the current shooting mode according to the received instruction. In another possible implementation manner, the current shooting mode may be a shooting mode automatically adjusted by the mobile terminal to meet the current shooting requirement; if it is detected during the shooting preview that the image in the preview window moves a predetermined distance in a certain direction, the current shooting mode is automatically adjusted to the panoramic shooting mode.
The user takes a picture through the touch instruction, and the mobile terminal obtains a shooting instruction, where the shooting instruction may be an instruction triggered by the user by executing a corresponding shooting operation in a panoramic shooting mode, for example, a shooting instruction triggered by the user pressing a virtual shutter key on a display screen of the mobile terminal or a physical key on the side of the mobile terminal. In this embodiment, the instruction is used to instruct the mobile terminal to shoot the panoramic image in the panoramic shooting mode. Fig. 5 is a schematic diagram of panoramic image capture in the mobile terminal according to the embodiment of the present invention, and as shown in fig. 5, after receiving a photographing instruction from a user, all process images of photographing are displayed on a display interface, and after receiving a finishing instruction of finishing photographing by the user, a panoramic image is displayed.
After receiving the photographing instruction of the user, the mobile terminal performs corresponding matching processing on the image in the mobile terminal in the background, and details are described below by taking matching of the first image and the second image as an example.
The step S302 further includes: acquiring feature points in the first image; the acquired feature points in the first image comprise pixel points on the same target contained in the first image and the second image.
In this embodiment, acquisition of feature points is not limited to the first image; the feature points may instead be acquired from the second image. As long as one image is arbitrarily selected from the two images containing the same target and feature points are acquired from the selected image, it is not necessary to acquire feature points from the other image.
The feature points acquired from the first image in the present embodiment do not include feature descriptions of the feature points, but only the feature points are acquired.
The above step S303 further includes: and determining the position of the pixel point matched with the feature point in the second image by using an optical flow method. After the feature points are obtained from the first image, feature point matching is required to be performed, that is, pixel points which exist in the second image and are matched with the feature points in the first image need to be determined, so that matching between a target contained in the first image and the same target contained in the second image is realized. And matching the feature points matched in the second image into the first image by using an optical flow method, and determining the position of the pixel points matched with the feature points matched in the second image in the first image. After the matched feature points are obtained from the second image, feature point matching needs to be performed again, that is, pixel points in the first image, which are matched with the feature points in the second image, need to be determined, so that matching between a target included in the second image and the same target included in the first image is achieved. And the bidirectional matching of the determined feature points in each block of the first image is realized.
In this embodiment, the optical flow method is used to determine the position of the pixel point matched with the feature point in the second image, so as to match the target included in the first image with the same target included in the second image, and match the feature point matched in the second image into the first image, so as to implement bidirectional matching.
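The following minimal sketch illustrates the forward and backward matching described above. It is an assumption of one possible implementation: OpenCV's pyramidal Lucas-Kanade optical flow is used as the optical flow method, and `gray_a`, `gray_b` and `pts_a` (the per-block feature points of the first image, as an N x 1 x 2 float32 array) are assumed to exist.

```python
import cv2

def bidirectional_flow(gray_a, gray_b, pts_a):
    """Match pts_a from image A into image B, then match the results back into A."""
    # Forward pass A -> B: where does each feature point of A land in B?
    pts_b, st_fwd, _ = cv2.calcOpticalFlowPyrLK(gray_a, gray_b, pts_a, None)
    # Backward pass B -> A: track the matched points back into A.
    pts_a_back, st_bwd, _ = cv2.calcOpticalFlowPyrLK(gray_b, gray_a, pts_b, None)
    # A point is usable only if both flow passes reported success.
    valid = (st_fwd.ravel() == 1) & (st_bwd.ravel() == 1)
    return pts_b, pts_a_back, valid
```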
The method disclosed in the embodiment can also be applied to scenes such as image splicing, target tracking, target detection and the like.
According to the technical scheme, the first image and the second image to be matched are obtained, the feature points are only obtained from the first image, the feature points are not required to be obtained from the second image, and the pixel points matched with the feature points in the first image in the second image can be determined through an optical flow method, so that the image matching is completed. In this embodiment, feature points do not need to be acquired from both the first image and the second image, and for any feature point in the first image, when determining a pixel point in the second image that matches the feature point, a pixel point that matches the feature point may be directly determined in the second image by using an optical flow method, and it is not necessary to traverse all feature points acquired in the second image to determine a pixel point that matches the feature point in the first image among all feature points in the second image.
In addition, the accuracy of image matching is higher through bidirectional matching. Specifically, a first image is subjected to block processing, a feature point is determined from each block of the first image after the block processing, the determined feature point of each block of the first image is matched into a second image, the matched feature point in the second image is matched into the first image to obtain a matching result, a matching parameter of each block is obtained according to the feature point of each block of the first image and the matching result, and the matching is carried out according to the matching parameter of each block.
Further, obtaining the matching parameter of each block according to the feature point of each block of the first image and the matching result may specifically include: judging whether the difference value between the feature point of each block of the first image and the matching result is smaller than a preset threshold value; determining the feature points whose difference value from the matching result is smaller than the preset threshold value as accurately matched points, and determining the feature points whose difference value from the matching result is larger than or equal to the preset threshold value as inaccurately matched points; since the inaccurately matched points are rejected, the matching parameters of the inaccurately matched points are obtained by a linear interpolation method, so as to determine the matching parameters of each block in the first image.
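A hedged sketch of this evaluation step: the threshold `C` (named `c` below) is a hypothetical preset value, and the arrays are the ones returned by the bidirectional matching sketch above. Points whose round-trip error stays below the threshold are treated as accurately matched; the rest are rejected and later filled in by interpolation.

```python
import numpy as np

def classify_matches(pts_a, pts_a_back, valid, c=2.0):
    """Flag feature points whose backward match returns close to the original position."""
    diff = np.abs(pts_a.reshape(-1, 2) - pts_a_back.reshape(-1, 2))
    # Accurate if |x2 - x0| < C and |y2 - y0| < C (and both flow passes succeeded).
    accurate = valid & (diff[:, 0] < c) & (diff[:, 1] < c)
    return accurate
```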
Optionally, determining a feature point from each block of the first image after the block processing may be determined according to a gray scale, or may be determined according to a gradient, and selecting a feature point with a maximum gray scale or a maximum gradient, which may specifically include: determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or determining the characteristic point with the maximum gradient from each block of the first image after the block processing. That is, the point with the maximum gray level, that is, the maximum brightness, in each block is selected as the feature point, or the point with the maximum gradient, that is, the maximum difference from the surrounding points, is selected as the feature point.
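As an illustrative sketch of this per-block selection (the block size and the choice between gray level and gradient are assumptions, not values prescribed by the embodiment), one feature point per block could be picked as follows:

```python
import numpy as np
import cv2

def feature_point_per_block(gray, block=64, use_gradient=False):
    """Return one feature point (x, y) per block: the max-gray or max-gradient pixel."""
    if use_gradient:
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
        score = cv2.magnitude(gx, gy)   # gradient magnitude per pixel
    else:
        score = gray.astype(np.float32)  # gray level (brightness) per pixel
    h, w = gray.shape
    points = []
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            patch = score[y0:y0 + block, x0:x0 + block]
            dy, dx = np.unravel_index(np.argmax(patch), patch.shape)
            points.append((x0 + dx, y0 + dy))
    return np.array(points, dtype=np.float32).reshape(-1, 1, 2)
```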
Specifically, the feature points may be obtained based on the accelerated segmentation test features, and the method may include the following steps:
selecting a pixel point in the first image; since the first image includes the same target as the second image, selecting a pixel in the first image in this step may specifically be selecting a pixel in the region where the target included in the first image is located.
Taking the pixel point in the first image as the circle center, a circle is made in the first image according to a preset radius; according to the Bresenham algorithm, a circle is drawn in the first image with the selected pixel point as the circle center and the preset radius as the radius. The preset radius can be adjusted according to actual needs. In this embodiment, the preset radius is a distance of 3 pixels.
Selecting a specific number of pixel points located on the circumference of the circle; the specific number of pixel points are selected on the circumference of the circle generated with the selected pixel point as the circle center and the preset radius as the radius, and the selected pixel points are located at different positions on the circumference. For example, if the specific number is 12, 12 pixel points located at different positions on the circumference are selected. The selection can be uniform, that is, the circumferential length of the interval between every two adjacent pixel points is the same.
Comparing the gray level of each pixel point on the circumference with the gray level of the circle center one by one; the 12 pixel points selected at different positions on the circumference are numbered from 1 to 12 in the clockwise direction.
The gray level of each pixel point on the circumference and the gray level of the pixel point serving as the circle center are acquired, and the gray level of the pixel point corresponding to each serial number is compared with the gray level of the pixel point serving as the circle center in ascending order of the serial numbers.
Judging whether the gray levels of at least two pixel points among the pixel points on the circumference are all larger than or all smaller than the gray level of the circle center;
If the gray levels of at least two pixel points among the pixel points on the circumference are all larger than or all smaller than the gray level of the circle center, the circle center is determined to be a feature point in the first image.
Whether the gray levels of at least two pixel points among the pixel points on the circumference are all larger than or all smaller than the gray level of the circle center is judged according to the comparison result;
If it is judged that the gray levels of at least two pixel points among the pixel points on the circumference are all larger than or all smaller than the gray level of the circle center, this indicates that enough pixel points in the neighborhood of the circle-center pixel point differ sufficiently from it, and the circle-center pixel point is therefore a feature point.
If there are no at least two pixel points on the circumference whose gray levels are all larger than the gray level of the circle center, and no at least two pixel points on the circumference whose gray levels are all smaller than the gray level of the circle center, the pixel point serving as the circle center is not a feature point.
In this embodiment, in order to enlarge the difference between the pixel point serving as the circle center and the other pixel points in its neighborhood, a gray threshold may be set; when comparing the gray level of each pixel point on the circumference with the gray level of the pixel point serving as the circle center, the gray level of each pixel point on the circumference is compared with the sum of the gray level of the circle-center pixel point and the gray threshold, and with the difference obtained by subtracting the gray threshold from the gray level of the circle-center pixel point.
Judging whether the gray levels of at least two pixel points in the pixel points on the circumference are both larger than the sum of the gray levels of the pixel points as the circle center and the gray level threshold value, or whether the gray levels of at least two pixel points in the pixel points on the circumference are both smaller than the difference of the gray levels of the pixel points as the circle center minus the gray level threshold value;
If it is judged that the gray levels of at least two of the pixel points on the circumference are larger than the sum of the gray level of the circle-center pixel point and the gray threshold, this indicates that at least two pixel points on the circumference are brighter than the circle-center pixel point by more than the gray threshold; if it is judged that the gray levels of at least two of the pixel points on the circumference are smaller than the difference obtained by subtracting the gray threshold from the gray level of the circle-center pixel point, this indicates that at least two pixel points on the circumference are darker than the circle-center pixel point by more than the gray threshold; in either case, the pixel point serving as the circle center is determined to be a feature point.
And determining the circle center as a feature point in the first image.
It should be noted that the step of determining the feature point with the maximum gradient from each block of the first image after the blocking processing is similar to the step of determining the feature point with the maximum gray level from each block of the first image after the blocking processing, and details are not repeated here.
Through the above steps, the purpose of acquiring feature points by using FAST (features from accelerated segment test) is achieved. It can be understood that performing the above steps once only determines whether a single pixel point is a feature point, rather than obtaining all feature points; therefore, in this embodiment the above steps may be performed multiple times to obtain multiple feature points.
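For reference, OpenCV ships a detector based on the same accelerated segment test idea; the sketch below is a convenience rather than the literal per-block procedure above, the threshold value is an assumption, and `gray_a` is the grayed first image from the earlier sketch.

```python
import cv2

# FAST detector based on the accelerated segment test; the threshold is the gray
# difference a circumference pixel must exceed relative to the candidate center.
fast = cv2.FastFeatureDetector_create(threshold=20, nonmaxSuppression=True)
keypoints = fast.detect(gray_a, None)
print(len(keypoints), "candidate feature points")
```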
And then, determining the position of the pixel point matched with the feature point in the second image by using an optical flow method, and matching the feature point matched in the second image into the first image again to finish bidirectional matching.
Optionally, performing bidirectional matching on the determined feature points in each block of the first image includes: performing bidirectional matching on the determined feature points in each block of the first image by an optical flow method. Optical flow is a simple and practical expression of image motion, and is generally defined as the apparent motion of the image brightness pattern in an image sequence, i.e., the projection onto the imaging plane of the visual sensor of the motion velocity of a point on the surface of a spatial object. This definition considers that optical flow represents only a geometric variation. In 1998, Negahdaripour redefined optical flow as a comprehensive representation of the geometric and radiometric changes of dynamic images. Optical flow uses the temporal variation and correlation of pixel intensity data in an image sequence to determine the "motion" of the respective pixel positions, i.e., it studies the relationship between the temporal variation of the image gray scale and the structure and motion of objects in the scene. In general, optical flow is caused by the relative motion resulting from camera motion, object motion in the scene, or both. The basic principle of detecting moving objects by the optical flow method is as follows: each pixel point in the image is assigned a velocity vector, so that an image motion field is formed; at a specific moment of the motion, the points on the image correspond one to one to the points on the three-dimensional object, the correspondence being given by the projection relation, and the image can be analyzed dynamically according to the velocity vector characteristics of each pixel point. If there is no moving object in the image, the optical flow vector varies continuously over the entire image area. When a moving object exists in the image, the target and the image background move relatively, and the velocity vector formed by the moving object differs from the velocity vector of the neighboring background, so that the moving object and its position are detected. The main drawback of detecting moving objects by the optical flow method is that most optical flow computations are time-consuming, with poor real-time performance and practicability. The optical flow method, however, has the advantage that the optical flow carries not only the motion information of a moving object but also rich information about the three-dimensional structure of the scene, and it can detect a moving object without any prior knowledge of the scene. Optical flow computation methods can be broadly divided into three categories: matching-based methods, frequency-domain methods and gradient methods.
Optionally, determining the matching parameter of each block according to the result of the bidirectional matching includes: and determining the matching parameters of each block according to the result of the bidirectional matching by a linear interpolation method.
The embodiment of the invention provides an improved bidirectional image matching method, which takes the matching of two images A and B as an example for description and mainly comprises the following steps:
1. Image A is divided into W × H blocks. The size of each block can be set flexibly; the width and the height preferably do not exceed 512 pixels;
2. A feature point is found within each block separately. The feature point may be the point with the maximum feature value in the local region, such as the maximum gray level or the maximum gradient;
3. All feature points in A are matched from A -> B using a registration algorithm such as an optical flow method; for example, a certain feature point (x0, y0) in A has a corresponding point (x1, y1) in B;
4. The points in image B matched in the previous step are matched from B -> A using a registration algorithm such as an optical flow method, obtaining the point (x2, y2) in image A corresponding to a certain point (x1, y1) in image B;
5. The two matching results are evaluated, and all points that are matched accurately are found. Taking the feature point in the example above, theoretically, if the point is matched accurately, x2 = x0 and y2 = y0 should hold; in practical implementation, |x2 - x0| < C and |y2 - y0| < C can be regarded as indicating an accurately matched point;
6. The matching parameters of all blocks are obtained by a linear interpolation method. Through the evaluation and screening of step 5, the matching results of some blocks may be rejected, so interpolation is used to obtain the matching parameters of these rejected blocks, as shown in the sketch after this list;
7. Matching is performed on a block-by-block basis using the matching parameters of each block.
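A minimal sketch of step 6, under the assumptions that `block_centers` holds one (x, y) per block, `disp` holds the per-block displacement found by the forward pass, `accurate` is the boolean mask from the evaluation in step 5, and at least a few blocks were accepted. `scipy.interpolate.griddata` is used here purely as one convenient way to perform the linear interpolation; nearest-neighbour filling for blocks outside the convex hull of the accepted points is an additional assumption.

```python
import numpy as np
from scipy.interpolate import griddata

def fill_block_parameters(block_centers, disp, accurate):
    """Replace the displacement of rejected blocks by linear interpolation
    from the accurately matched blocks (step 6 of the algorithm above)."""
    good_xy = block_centers[accurate]
    good_disp = disp[accurate]
    # Interpolate the x and y displacement components separately.
    filled = np.column_stack([
        griddata(good_xy, good_disp[:, k], block_centers, method="linear")
        for k in range(2)
    ])
    # Blocks outside the convex hull of the good points come back as NaN;
    # fall back to nearest-neighbour values there.
    nearest = np.column_stack([
        griddata(good_xy, good_disp[:, k], block_centers, method="nearest")
        for k in range(2)
    ])
    mask = np.isnan(filled)
    filled[mask] = nearest[mask]
    return filled
```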
In the image matching in the embodiment of the invention, corresponding (homonymous) points are identified between two or more images through a certain matching algorithm; for example, in two-dimensional image matching, the correlation coefficients of windows of the same size in the target area and the search area are compared, and the center point of the window corresponding to the maximum correlation coefficient in the search area is taken as the corresponding point. In essence this is the problem of searching for the best match under a matching criterion based on primitive similarity.
The embodiments of the present invention use a bidirectional matching algorithm to select the initial points, because the results obtained by such bidirectional matching are reliable for those matches that are correct. If the bidirectional matching algorithm obtains only a few matching points, it can be concluded that the two corresponding pictures are not necessarily related, or that one of the pictures is so severely distorted that the basic premise of bidirectional matching is lost.
According to the image matching method provided by the embodiment of the invention, the key to matching two frames of images is to evaluate the similarity of their content. The image matching method of this embodiment integrates the advantages of both approaches: it extracts similar objects in the two frames of images and makes full use of the position and layout information of the objects to evaluate the similarity of the two frames of images. The method compares the content similarity of any two frames of images through three links: extraction of the overlap region, reconstruction of the overlap region by sub-block decomposition matching, and similarity analysis of the reconstructed overlap region. The matching parameters of each block are obtained through bidirectional matching, so that the matching parameters of each block are more accurate, and matching is performed block by block through the matching parameters of each block, thereby improving the matching accuracy.
Specifically, the overlap region between the images is extracted. For the two frames of images to be matched, even if they contain the same image content, they are in most cases taken from different positions and perspectives, so that only a partial area of the two images contains overlapping image content. Therefore, the aim of image content identification and matching is to extract the overlap region and evaluate the similarity of the two frames of images by analyzing the image content in that region. To achieve this, the two frames of images to be matched, photographed at different positions and coordinates, need to be translated and rotated to the same photographing position through an affine transformation model. Through this adjustment, the same objects photographed in both images can be brought to approximately the same positions. Fig. 6 is a schematic diagram of the principles of rotation and translation according to an embodiment of the present invention. As shown in fig. 6, the triangles and parallelograms in the original images are adjusted to the same positions. Fig. 6 (a) shows the content of the frame A image, fig. 6 (b) shows the content of the frame B image, and fig. 6 (c) shows the adjusted result.
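A sketch of bringing the two frames to approximately the same shooting position with an affine (rotation plus translation plus scale) model. Using OpenCV's `estimateAffinePartial2D` on already-matched point pairs is an implementation assumption here, not the method prescribed by the embodiment.

```python
import cv2

def align_frames(pts_a, pts_b, img_b):
    """Estimate a similarity transform from matched point pairs and warp frame B onto frame A."""
    m, inliers = cv2.estimateAffinePartial2D(pts_b, pts_a, method=cv2.RANSAC)
    h, w = img_b.shape[:2]
    # Translate and rotate frame B so that the common objects land at roughly the
    # same coordinates as in frame A (cf. fig. 6 (c)).
    return cv2.warpAffine(img_b, m, (w, h))
```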
Overlap region reconstruction based on sub-block decomposition matching: for the two matched frames of images, the overlap regions have the property that the objects in the overlap regions of the two frames are the same and their coordinates are also the same. Therefore, for frames A and B, the similarity of the two frames of images can be evaluated by calculating the correlation coefficient of their overlap regions and comparing each pixel point in the overlap regions; if the two frames have the same image content, the evaluated similarity will be very high.
However, due to limitations in the accuracy of the SURF algorithm, the actual heights of the feature points and other factors, the extraction result of the feature points contains errors, and the overlap region cannot be extracted accurately. The main causes of error include: first, there is a translation error for the objects in the overlap regions of the two frames of images; second, there is also a rotation error between the two regions. Therefore, if only a common matching method (such as the correlation coefficient) is used, the similarity of the overlap regions of the two images cannot be analyzed accurately, which seriously affects the matching accuracy.
To solve this problem, an overlapping area reconstruction method based on sub-block decomposition matching is designed: the overlapping area in frame A is reconstructed from small blocks and then matched against the overlapping area in frame B. The reconstruction process is as follows: the overlapping area of frame A is divided into a number of sub-blocks; each sub-block is matched against the overlapping area of frame B by the SAD (Sum of Absolute Differences) method to find its most suitable position in frame B, and the corresponding area of frame B is reconstructed from these sub-blocks. If frame A is similar to frame B, the reconstructed overlapping area will be very similar to the overlapping area of frame B; otherwise, only a few sub-blocks can be matched into frame B.
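A simplified sketch of this reconstruction step follows, assuming grayscale NumPy arrays; the block size, the search radius, the optional cost threshold and the function names are illustrative assumptions rather than values fixed by the patent:

    import numpy as np

    def sad(block, candidate):
        # Sum of Absolute Differences between two equally sized blocks.
        return np.abs(block.astype(np.int32) - candidate.astype(np.int32)).sum()

    def reconstruct_overlap(overlap_a, overlap_b, block=16, radius=8, max_cost=None):
        # Divide frame A's overlapping area into sub-blocks, search for each
        # sub-block's best SAD position in frame B's overlapping area near its
        # original coordinates, and paste the sub-block at that position to
        # rebuild ("reconstruct") the area from frame A content.
        recon = np.zeros_like(overlap_b)
        h, w = overlap_a.shape
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                sub = overlap_a[y:y + block, x:x + block]
                best_cost, best_yx = None, (y, x)
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        yy, xx = y + dy, x + dx
                        if (0 <= yy <= overlap_b.shape[0] - block and
                                0 <= xx <= overlap_b.shape[1] - block):
                            cost = sad(sub, overlap_b[yy:yy + block, xx:xx + block])
                            if best_cost is None or cost < best_cost:
                                best_cost, best_yx = cost, (yy, xx)
                yy, xx = best_yx
                # Paste only sub-blocks whose best match is good enough.
                if max_cost is None or best_cost <= max_cost:
                    recon[yy:yy + block, xx:xx + block] = sub
        return recon

With a suitable max_cost threshold, sub-blocks that cannot be matched into frame B are simply left out, so a dissimilar frame pair yields a sparsely reconstructed area, while similar frames produce a reconstruction that closely resembles frame B's overlapping area.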
Similarity analysis of the reconstructed overlapping area: evaluating the similarity of frames A and B can be accomplished by comparing the similarity S_AB between the reconstructed overlapping area of frame A and the overlapping area of frame B.
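Combining the two sketches above, the final score S_AB could, under the same assumptions, be obtained as follows (variable names are illustrative):

    # overlap_a, overlap_b: grayscale overlapping areas of frames A and B
    recon = reconstruct_overlap(overlap_a, overlap_b)
    s_ab = correlation_coefficient(recon, overlap_b)  # close to 1 => the frames match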
Example 2
According to another aspect of the embodiments of the present invention, there is also provided a mobile terminal, and fig. 7 is a schematic diagram of a mobile terminal according to an embodiment of the present invention, as shown in fig. 7, the mobile terminal includes: a processor 110, a memory 109, and a communication bus, wherein,
the communication bus is used for realizing connection communication between the processor 110 and the memory 109;
the processor 110 is configured to execute the image matching program stored in the memory 109 to implement the following steps:
the first image is subjected to blocking processing;
determining a feature point from each block of the first image after the block processing;
performing bidirectional matching on the determined feature points in each block of the first image;
determining the matching parameters of each block according to the result of the bidirectional matching;
and performing matching processing according to the blocks through the matching parameters of each block.
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or,
and determining the characteristic point with the maximum gradient from each block of the first image after the blocking processing.
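A minimal sketch of this per-block feature point selection, assuming a grayscale NumPy array and an evenly divided block grid (the grid size and the function name are illustrative assumptions):

    import numpy as np

    def block_feature_points(image, rows=4, cols=4):
        # For each block, pick the pixel with the maximum gradient magnitude
        # as that block's feature point. (For the maximum-gray-level variant,
        # take the argmax of the block's pixel values instead of the gradient.)
        img = image.astype(np.float32)
        gy, gx = np.gradient(img)
        grad = np.hypot(gx, gy)
        h, w = img.shape
        bh, bw = h // rows, w // cols
        points = []
        for r in range(rows):
            for c in range(cols):
                blk = grad[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
                dy, dx = np.unravel_index(np.argmax(blk), blk.shape)
                points.append((c * bw + dx, r * bh + dy))  # (x, y) image coordinates
        return np.float32(points)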
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
and carrying out bidirectional matching on the determined feature points in each block of the first image through an optical flow method.
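A minimal sketch of such bidirectional (forward-backward) matching, assuming OpenCV's pyramidal Lucas-Kanade optical flow; the function name and the absence of tuned flow parameters are illustrative simplifications:

    import cv2
    import numpy as np

    def bidirectional_match(first_img, second_img, pts_first):
        # Forward pass: trace the feature points of the first image into the
        # second image; backward pass: trace the matched points back into the
        # first image. Both passes use pyramidal Lucas-Kanade optical flow.
        pts = pts_first.reshape(-1, 1, 2).astype(np.float32)
        fwd, st_f, _ = cv2.calcOpticalFlowPyrLK(first_img, second_img, pts, None)
        back, st_b, _ = cv2.calcOpticalFlowPyrLK(second_img, first_img, fwd, None)
        valid = (st_f & st_b).ravel().astype(bool)
        return fwd.reshape(-1, 2), back.reshape(-1, 2), valid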
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
and determining the matching parameters of each block according to the result of the bidirectional matching by a linear interpolation method.
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
matching the determined feature points of each block of the first image into a second image;
and matching the matched feature points in the second image to the first image to obtain a matching result.
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
and obtaining the matching parameters of each block according to the feature points of each block of the first image and the matching result.
Optionally, the processor 110 is further configured to execute an image matching program to implement the following steps:
judging whether the difference between the feature point of each block of the first image and the matching result is smaller than a preset threshold;
determining the feature points whose difference from the matching result is smaller than the preset threshold as accurately matched points, and determining the feature points whose difference from the matching result is greater than or equal to the preset threshold as inaccurately matched points;
and obtaining the matching parameters of the inaccurately matched points through a linear interpolation method, so as to determine the matching parameters of each block in the first image.
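A minimal sketch of this forward-backward check and interpolation step follows; the threshold value, the use of per-block translation vectors as the matching parameters, and the assumption that at least one block is accurately matched are illustrative assumptions, not requirements of the patent:

    import numpy as np

    def block_matching_parameters(pts_first, fwd_pts, back_pts, threshold=1.0):
        # pts_first: original feature points of each block in the first image
        # fwd_pts:   their matched positions in the second image (forward pass)
        # back_pts:  the forward matches traced back into the first image
        diff = np.linalg.norm(pts_first - back_pts, axis=1)   # forward-backward error
        accurate = diff < threshold
        params = (fwd_pts - pts_first).astype(np.float32)     # per-block translation
        idx = np.arange(len(params))
        # Replace the parameters of inaccurately matched blocks by linear
        # interpolation over the accurately matched ones (x and y separately).
        for dim in range(2):
            params[~accurate, dim] = np.interp(idx[~accurate], idx[accurate],
                                               params[accurate, dim])
        return params, accurate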
Example 3
According to another aspect of embodiments of the present invention, there is also provided a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the following steps of the above-mentioned image matching method:
S11, the first image is subjected to blocking processing;
S12, determining a feature point from each block of the first image after the blocking processing;
S13, performing bidirectional matching on the determined feature points in each block of the first image;
S14, determining the matching parameters of each block according to the result of the bidirectional matching;
and S15, performing matching processing according to the blocks through the matching parameters of each block.
According to the embodiment of the invention, the first image is subjected to blocking processing; a feature point is determined from each block of the first image after the blocking processing; bidirectional matching is performed on the determined feature points in each block of the first image; the matching parameters of each block are determined according to the result of the bidirectional matching; and matching processing is performed according to the blocks through the matching parameters of each block. This solves the problem in the related art of low image matching accuracy for images in which most areas are flat and of pure color, and, by performing bidirectional matching on the images, improves the matching accuracy and the user experience.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (7)

1. An image matching method, comprising:
the first image is subjected to blocking processing;
determining a feature point from each block of the first image after the block processing;
performing bidirectional matching on the determined feature points in each block of the first image;
determining the matching parameters of each block according to the result of the bidirectional matching;
matching processing is carried out according to the blocks through the matching parameters of each block;
wherein the bi-directionally matching the determined feature points in each block of the first image comprises:
matching the determined feature points of each block of the first image to a second image;
matching the matched feature points in the second image to the first image to obtain a matching result; the characteristic points are pixel points of the same target contained in the first image and the second image;
wherein, determining the matching parameters of each block according to the result of the bidirectional matching comprises:
obtaining a matching parameter of each block according to the feature point of each block of the first image and the matching result, wherein obtaining the matching parameter of each block comprises:
judging whether the difference between the feature point of each block of the first image and the matching result is smaller than a preset threshold;
determining the feature points whose difference from the matching result is smaller than the preset threshold as accurately matched points, and determining the feature points whose difference from the matching result is greater than or equal to the preset threshold as inaccurately matched points;
and obtaining the matching parameters of the inaccurately matched points through a linear interpolation method, so as to determine the matching parameters of each block in the first image.
2. The method of claim 1, wherein determining feature points from each block of the first image after the block processing comprises:
determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or,
and determining the characteristic point with the maximum gradient from each block of the first image after the blocking processing.
3. The method of claim 1, wherein bi-directionally matching the determined feature points in each block of the first image comprises:
and carrying out bidirectional matching on the determined feature points in each block of the first image through an optical flow method.
4. The method of claim 1, wherein determining the matching parameters for each block according to the result of the bi-directional matching comprises:
and determining the matching parameters of each block according to the result of the bidirectional matching by a linear interpolation method.
5. A mobile terminal, characterized in that the mobile terminal comprises: a processor, a memory, and a communication bus, wherein,
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is used for executing the image matching program stored in the memory so as to realize the following steps:
the first image is subjected to blocking processing;
determining a feature point from each block of the first image after the block processing;
performing bidirectional matching on the determined feature points in each block of the first image;
determining the matching parameters of each block according to the result of the bidirectional matching;
matching processing is carried out according to the blocks through the matching parameters of each block;
wherein the processor is further configured to execute an image matching procedure to implement the steps of:
matching the determined feature points of each block of the first image to a second image;
matching the matched feature points in the second image to the first image to obtain a matching result; the characteristic points are pixel points of the same target contained in the first image and the second image;
wherein, determining the matching parameters of each block according to the result of the bidirectional matching comprises:
obtaining a matching parameter of each block according to the feature point of each block of the first image and the matching result, wherein obtaining the matching parameter of each block comprises:
judging whether the difference between the feature point of each block of the first image and the matching result is smaller than a preset threshold;
determining the feature points whose difference from the matching result is smaller than the preset threshold as accurately matched points, and determining the feature points whose difference from the matching result is greater than or equal to the preset threshold as inaccurately matched points;
and obtaining the matching parameters of the inaccurately matched points through a linear interpolation method, so as to determine the matching parameters of each block in the first image.
6. The mobile terminal of claim 5, wherein the processor is further configured to execute an image matching procedure to implement the steps of:
determining a characteristic point with the maximum gray level from each block of the first image after blocking processing; and/or,
and determining the characteristic point with the maximum gradient from each block of the first image after the blocking processing.
7. A computer readable storage medium, characterized in that the computer readable storage medium stores one or more programs which are executable by one or more processors to implement the steps of the image matching method of any one of claims 1-4.
CN201810680702.7A 2018-06-27 2018-06-27 Image matching method, mobile terminal and computer readable storage medium Active CN108921212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810680702.7A CN108921212B (en) 2018-06-27 2018-06-27 Image matching method, mobile terminal and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108921212A CN108921212A (en) 2018-11-30
CN108921212B true CN108921212B (en) 2021-11-19

Family

ID=64424037

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810680702.7A Active CN108921212B (en) 2018-06-27 2018-06-27 Image matching method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108921212B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109816619B (en) * 2019-01-28 2024-06-21 努比亚技术有限公司 Image fusion method, device, terminal and computer readable storage medium
CN110110615B (en) * 2019-04-19 2021-11-23 苏州极目机器人科技有限公司 Image information processing method and image information processing system
CN110334729B (en) * 2019-05-09 2023-03-24 四川大学华西医院 Automatic alarm notification system for inspection instrument
CN111814535B (en) * 2020-05-20 2024-05-07 平安科技(深圳)有限公司 Palm print image recognition method, device, equipment and computer readable storage medium
CN111950510B (en) * 2020-08-26 2023-10-03 上海申瑞继保电气有限公司 High-voltage switch switching indication board image recognition method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104978742A (en) * 2015-06-12 2015-10-14 北京邮电大学 Image registration method and image registration device based on cascade structure
CN105654479A (en) * 2015-12-28 2016-06-08 北京邮电大学 Multispectral image registering method and multispectral image registering device
CN106791845A (en) * 2017-01-17 2017-05-31 湖南优象科技有限公司 A kind of quick parallax method of estimation for multi-view image coding
CN106851049A (en) * 2017-03-06 2017-06-13 济南中维世纪科技有限公司 A kind of scene alteration detection method and device based on video analysis

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101308946B1 (en) * 2012-02-02 2013-09-24 한국과학기술연구원 Method for reconstructing three dimensional facial shape

Also Published As

Publication number Publication date
CN108921212A (en) 2018-11-30

Similar Documents

Publication Publication Date Title
CN108921212B (en) Image matching method, mobile terminal and computer readable storage medium
CN110072061B (en) Interactive shooting method, mobile terminal and storage medium
CN107707821B (en) Distortion parameter modeling method and device, correction method, terminal and storage medium
CN110086993B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN111885307B (en) Depth-of-field shooting method and device and computer readable storage medium
CN108171744A (en) Determining method, mobile terminal and the storage medium of disparity map in a kind of binocular virtualization
CN107172349B (en) Mobile terminal shooting method, mobile terminal and computer readable storage medium
CN112995467A (en) Image processing method, mobile terminal and storage medium
CN112188082A (en) High dynamic range image shooting method, shooting device, terminal and storage medium
CN111866388B (en) Multiple exposure shooting method, equipment and computer readable storage medium
CN107689029A (en) Image processing method, mobile terminal and computer-readable recording medium
CN112184722A (en) Image processing method, terminal and computer storage medium
CN109510941B (en) Shooting processing method and device and computer readable storage medium
CN108848321B (en) Exposure optimization method, device and computer-readable storage medium
CN107395971B (en) Image acquisition method, image acquisition equipment and computer-readable storage medium
CN111010514B (en) Image processing method and electronic equipment
CN112149431A (en) Translation method, electronic device and readable storage medium
CN107493431A (en) A kind of image taking synthetic method, terminal and computer-readable recording medium
CN111614902A (en) Video shooting method and device and computer readable storage medium
CN108495033B (en) Photographing regulation and control method and device and computer readable storage medium
CN107257430B (en) A kind of camera control method, terminal and computer readable storage medium
CN112532838B (en) Image processing method, mobile terminal and computer storage medium
CN111866389B (en) Video tracking shooting method and device and computer readable storage medium
CN108764261A (en) A kind of image processing method, mobile terminal and storage medium
CN113572916A (en) Shooting method, terminal device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant