CN114510192B - Image processing method and related device - Google Patents

Image processing method and related device

Info

Publication number
CN114510192B
CN114510192B
Authority
CN
China
Prior art keywords
touch
candidate
reliability
quadrangle
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210113847.5A
Other languages
Chinese (zh)
Other versions
CN114510192A
Inventor
顾磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210113847.5A
Publication of CN114510192A
Application granted
Publication of CN114510192B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an image processing method and a related device, which are applied to electronic equipment, wherein the method comprises the following steps: acquiring a target image, wherein the target image comprises at least one target object with a quadrilateral peripheral outline; performing line segment detection on the target image to obtain a plurality of line segments; determining at least one candidate quadrangle according to the plurality of line segments; acquiring a first touch parameter aiming at the target image; performing reliability calculation on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, wherein each reliability is used for representing how strongly the user intends to select the corresponding candidate quadrangle; sorting the at least one candidate quadrangle according to the at least one reliability; and displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result. By adopting the embodiment of the application, the quadrangle required by the user can be accurately and quickly identified.

Description

Image processing method and related device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and related apparatus.
Background
In daily life and work, users often scan large numbers of documents, business cards, posters and the like when storing files, and save the scanned files on electronic equipment (such as mobile phones and tablet computers).
In general, documents, business cards, posters and the like have quadrilateral outlines. In a captured image, however, the objects themselves or background line textures may interfere, so false detection may occur and an incorrect quadrilateral may be detected. How to quickly identify the quadrilateral required by the user is therefore a problem to be solved.
Disclosure of Invention
The embodiment of the application provides an image processing method and a related device, which can quickly identify a quadrangle required by a user.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring a target image, wherein the target image comprises at least one target object with a quadrilateral peripheral outline;
performing line segment detection on the target image to obtain a plurality of line segments;
determining at least one candidate quadrangle according to the plurality of line segments;
acquiring a first touch parameter aiming at the target image;
performing reliability calculation on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, wherein each reliability is used for representing how strongly the user intends to select the corresponding candidate quadrangle;
sorting the at least one candidate quadrangle according to the at least one reliability;
and displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including a first acquisition unit, a detection unit, a determination unit, a second acquisition unit, a calculation unit, a sorting unit and a display unit, wherein,
the first acquisition unit is used for acquiring a target image, and the target image comprises at least one target object with a quadrangular peripheral outline;
the detection unit is used for carrying out line segment detection on the target image to obtain a plurality of line segments;
the determining unit is used for determining at least one candidate quadrangle according to the plurality of line segments;
the second acquisition unit is used for acquiring a first touch parameter aiming at the target image;
the computing unit is configured to perform reliability computation on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter, so as to obtain at least one reliability, where each reliability is used to represent how strongly the user intends to select the corresponding candidate quadrilateral;
the sorting unit is used for sorting the at least one candidate quadrangle according to the at least one reliability;
and the display unit is used for displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory, the memory being for storing one or more programs configured for execution by the processor, the programs comprising instructions for performing some or all of the steps described in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
By implementing the embodiment of the application, the following beneficial effects are achieved:
It can be seen that, in the image processing method and related apparatus described in the embodiments of the present application, a target image is obtained, the target image including at least one target object whose peripheral outline is a quadrangle; line segment detection is performed on the target image to obtain a plurality of line segments; at least one candidate quadrangle is determined according to the plurality of line segments; a first touch parameter for the target image is obtained; reliability calculation is performed on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, each reliability indicating how strongly the user intends to select the corresponding candidate quadrangle; the at least one candidate quadrangle is sorted according to the at least one reliability; and an image of the area where the at least one candidate quadrangle is located is displayed according to the sorting result. In this way, the quadrangle required by the user can be identified accurately and quickly.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic software structure of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 3B is a schematic illustration of a user touch operation provided in an embodiment of the present application;
fig. 3C is a schematic illustration of a touch point provided in an embodiment of the present application;
fig. 3D is a schematic illustration of a touch track provided in an embodiment of the present application;
fig. 3E is a schematic illustration of a touch area according to an embodiment of the present disclosure;
fig. 3F is a schematic diagram illustrating a relationship between a touch parameter and a quadrilateral according to an embodiment of the present disclosure;
FIG. 3G is a schematic diagram illustrating a relationship between another touch parameter and a quadrilateral according to an embodiment of the present disclosure;
FIG. 3H is a schematic diagram illustrating a relationship between another touch parameter and a quadrilateral according to an embodiment of the present disclosure;
FIG. 3I is a schematic illustration of reliability calculation according to an embodiment of the present disclosure;
FIG. 3J is a flowchart illustrating another image processing method according to an embodiment of the present disclosure;
FIG. 3K is a flowchart illustrating another image processing method according to an embodiment of the present disclosure;
FIG. 3L is a flowchart illustrating another image processing method according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 5 is a functional unit block diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
For a better understanding of aspects of embodiments of the present application, related terms and concepts that may be related to embodiments of the present application are described below.
The terms first, second and the like in the description and in the claims of the present application and in the above-described figures, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
In particular, in the embodiment of the present application, the electronic device may include various devices with computer functions, for example, a handheld device (smart phone, tablet computer, etc.), an in-vehicle device (navigator, auxiliary reversing system, automobile data recorder, in-vehicle refrigerator, etc.), a wearable device (smart bracelet, wireless earphone, smart watch, smart glasses, etc.), a computing device or other processing device connected to a wireless modem, and various forms of User Equipment (UE), a Mobile Station (MS), a virtual reality/augmented reality device, a terminal device (terminal device), etc., where the electronic device may also be a base Station or a server.
The electronic device may further include an intelligent home device, where the intelligent home device may be at least one of: the intelligent sound box, the intelligent camera, the intelligent electric cooker, the intelligent wheelchair, the intelligent massage chair, the intelligent furniture, the intelligent dish washer, the intelligent television, the intelligent refrigerator, the intelligent electric fan, the intelligent warmer, the intelligent clothes hanger, the intelligent lamp, the intelligent router, the intelligent switch board, the intelligent humidifier, the intelligent air conditioner, the intelligent door, the intelligent window, the intelligent cooking bench, the intelligent disinfection cabinet, the intelligent toilet, the sweeping robot and the like are not limited herein.
In the first part, the software and hardware operation environment of the technical scheme disclosed in the application is introduced as follows.
As shown, fig. 1 shows a schematic structural diagram of an electronic device 100. Electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor AP, a modem processor, a graphics processor GPU, an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor NPU, etc. Wherein the different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 101 may also include one or more processors 110. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. In other embodiments, memory may also be provided in the processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it may be called directly from memory. This avoids repeated accesses and reduces the latency of the processor 110, thereby improving the efficiency of the electronic device 101 in processing data or executing instructions. The processor may also include an image processor, which may be an image preprocessor (preprocess image signal processor, pre-ISP), which may be understood as a simplified ISP, which may also perform some image processing operations, e.g. may obtain image statistics.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM card interfaces, and/or USB interfaces, among others. The USB interface 130 is an interface conforming to the USB standard, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 101, or may be used to transfer data between the electronic device 101 and a peripheral device. The USB interface 130 may also be used to connect headphones through which audio is played.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle times, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G/6G, etc. applied on the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature, etc. of the photographed scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or more cameras 193.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 101 to execute the methods of displaying page elements, and various applications, data processing, and the like provided in some embodiments of the present application by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area can store an operating system; the storage program area may also store one or more applications (such as gallery, contacts, etc.), etc. The storage data area may store data created during use of the electronic device 101 (e.g., photos, contacts, etc.), and so on. In addition, the internal memory 121 may include high-speed random access memory, and may also include nonvolatile memory, such as one or more disk storage units, flash memory units, universal flash memory (universal flash storage, UFS), and the like. In some embodiments, processor 110 may cause electronic device 101 to perform the methods of displaying page elements provided in embodiments of the present application, as well as other applications and data processing, by executing instructions stored in internal memory 121, and/or instructions stored in a memory provided in processor 110. The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. The capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., X, Y and Z axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used for recognizing the attitude of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and other applications.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, electronic device 100 performs a reduction in the performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid the low temperature causing the electronic device 100 to be abnormally shut down. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch-control screen". The touch sensor 180K is used for detecting a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display 194.
By way of example, fig. 2 shows a block diagram of the software architecture of the electronic device 100. The layered architecture divides the software into several layers, each with distinct roles and responsibilities. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, the Android runtime and system libraries, and a kernel layer. The application layer may include a series of application packages.
As shown in fig. 2, the application layer may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
In the second section, the image processing method and the related apparatus disclosed in the embodiments of the present application are described below.
Referring to fig. 3A, fig. 3A is a flowchart of an image processing method according to an embodiment of the present application, and as shown in the drawing, the image processing method includes:
301. and obtaining a target image, wherein the target image comprises at least one target object with a quadrilateral peripheral outline.
In this embodiment of the present application, the target object may be an object whose peripheral outline is quadrangular or approximately quadrangular, for example, the target object may include at least one of the following: business cards, posters, books, packaging boxes, documents (e.g., identification cards, drivers' licenses, social security cards, etc.), wallets, bus cards, bank cards, greeting cards, toy cards, poker cards, magic prop cards, commemorative cards, amulet cards, graffiti, hand drawings, photographs, boxes, bread, etc., without limitation. In a specific implementation, the target image may be a scanned image, a pre-stored image, or a currently captured image. Of course, the peripheral outline of the quadrangle of the target object in the target image may be a complete quadrangle, or may be a partially incomplete quadrangle, for example, an identity card with one corner cut off. The number of the target objects can be one or more, and the peripheral outline of each target object is quadrilateral.
In a specific implementation, the target object may be photographed to obtain a target image. Since other backgrounds may exist in the target image, there may be more than one quadrilateral object; for example, a quadrilateral pattern on wallpaper may also be detected as a target object. As shown in fig. 3B, in the target image, the user performs a touch operation through a gesture to select the quadrangle that the user requires: the quadrangle selected by the user is the target quadrangle, and the quadrangles not selected by the user are non-target quadrangles.
The target image can be one or more frames of images, and when the target image is a plurality of frames, batch scanning of objects and storage functions can be realized.
302. And carrying out line segment detection on the target image to obtain a plurality of line segments.
In a specific implementation, a line segment detection algorithm may be used to perform line segment detection on the target image to obtain a plurality of line segments, where the line segment detection algorithm may include at least one of the following: hough transform, fast line detection algorithm (fast line detection), etc., without limitation.
In a specific implementation, a line segment detection algorithm can be adopted to detect line segments in the target image, and the detected line segments can then be preprocessed to screen out some line segments with poor quality. For example, preprocessing may screen out segments whose width is lower than a preset width or whose length is smaller than a preset length, remove noise segments, or fuse overlapping segments; after preprocessing, a plurality of line segments are obtained. The preset width and the preset length can be preset or defaulted by the system.
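As a non-limiting illustration, the following Python sketch shows one way this step could be implemented, assuming OpenCV with the probabilistic Hough transform as the line segment detector and a simple minimum-length screen as the preprocessing; the embodiment does not mandate any particular library, detector, or threshold values.

```python
import cv2
import numpy as np

def detect_segments(image_bgr, min_length=40.0):
    """Detect line segments in the target image and screen out low-quality ones."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    raw = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                          minLineLength=20, maxLineGap=5)
    if raw is None:
        return []

    segments = []
    for x1, y1, x2, y2 in raw.reshape(-1, 4):
        # Screen out segments shorter than the preset length (simple noise removal).
        if np.hypot(x2 - x1, y2 - y1) >= min_length:
            segments.append(((float(x1), float(y1)), (float(x2), float(y2))))
    return segments
```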
303. At least one candidate quadrilateral is determined from the plurality of line segments.
In a specific implementation, the arrangement and combination may be performed according to a plurality of line segments, so that at least one candidate quadrangle may be obtained, for example, four line segments may form a closed area, and if the closed area is a quadrangle, the closed area may be regarded as one candidate quadrangle.
Specifically, different manners may be adopted according to practical applications; for example, the detected line segments may be enumerated, and all permutations and combinations of four line segments may be evaluated to obtain all possible candidate quadrilaterals.
Of course, if some line segments cannot form a closed quadrilateral in the permutation and combination process, the edges (line segments) of the non-closed quadrilateral may also be subjected to an extension process to obtain the closed quadrilateral. In a specific implementation, the length ratio between the length of the line segment before extension and the length of the line segment after extension is in a set range interval, and the set range interval can be preset or defaulted by a system.
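The combination step might look like the sketch below, which enumerates four-segment combinations, extends each segment to an infinite line, intersects adjacent lines to obtain four corners, and keeps only convex results; the corner-ordering heuristics and the extension-ratio check of a real implementation are simplified here, and the brute-force enumeration is for illustration only.

```python
from itertools import combinations

def line_intersection(s1, s2):
    """Intersection of the infinite lines through two segments, or None if (nearly) parallel."""
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        return None
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / d
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / d
    return (px, py)

def is_convex(corners):
    """Check that the cross products of consecutive edges all share the same sign."""
    signs = []
    for i in range(4):
        ax, ay = corners[i]
        bx, by = corners[(i + 1) % 4]
        cx, cy = corners[(i + 2) % 4]
        signs.append((bx - ax) * (cy - by) - (by - ay) * (cx - bx) > 0)
    return all(signs) or not any(signs)

def candidate_quads(segments):
    """Enumerate 4-segment combinations (O(N^4), illustration only) and keep the ones
    that close into a convex quadrilateral after extending the segments to full lines."""
    quads = []
    for combo in combinations(segments, 4):
        corners = []
        for i in range(4):
            p = line_intersection(combo[i], combo[(i + 1) % 4])
            if p is None:
                break
            corners.append(p)
        if len(corners) == 4 and is_convex(corners):
            quads.append(corners)
    return quads
```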
304. And acquiring a first touch parameter aiming at the target image.
In a specific implementation, the first touch parameter may include at least one of the following: the touch position on the touch display screen, the touch area on the touch display screen, the number of touch points on the touch display screen, the touch force on the touch display screen, the touch duration on the touch display screen, the number of touches on the touch display screen, the touch track on the touch display screen, the touch pattern on the touch display screen, and the like, which are not limited herein. In this embodiment, the touch operation may be completed in step 304 or before step 304, and the first touch parameter may be the touch parameters within a preset time range, where the preset time range may be preset or set by default, for example, the last 1 second. As shown in fig. 3C, after the user touches the touch panel, at least one touch point may be generated, and each touch point may correspond to one coordinate point. As shown in fig. 3D, after the user touches the target image, a corresponding touch track may be generated; there may be one or more touch tracks, for example, when a plurality of fingers touch the screen, each finger may correspond to one touch track. As shown in fig. 3E, in the target image, a touch area may be formed by the touch operation of the user, and the touch area may be a closed area.
In a specific implementation, a target image can be displayed on a touch display screen of the electronic device, and a user can perform touch operation on an area of the target image in the touch display screen, so that a first touch parameter can be obtained.
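A minimal sketch of how the first touch parameter might be gathered is given below; the field names and the one-second window are illustrative assumptions, since the embodiment only requires that recent touch points, tracks, and areas be available to the later reliability calculation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class FirstTouchParameter:
    """Illustrative container for the first touch parameter (field names are assumptions)."""
    touch_points: List[Point] = field(default_factory=list)        # isolated taps, one coordinate each
    touch_tracks: List[List[Point]] = field(default_factory=list)  # one polyline per finger
    touch_area: List[Point] = field(default_factory=list)          # outline of a closed touch region

def filter_recent(events, now, window_s=1.0):
    """Keep only the touch events whose timestamps fall within the preset time range."""
    return [e for e in events if now - e["timestamp"] <= window_s]
```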
305. And performing reliability calculation on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, wherein each reliability is used for representing how strongly the user intends to select the corresponding candidate quadrangle.
In a specific implementation, a candidate quadrilateral is, to a great extent, likely to be the outline of a target object, and the first touch parameter reflects to a certain extent which target object the user intends to select. Therefore, reliability calculation can be performed on each candidate quadrilateral in the at least one candidate quadrilateral using the first touch parameter, so as to obtain at least one reliability. For example, the touch parameter may include a touch position, which may correspond to a coordinate; the average distance between that coordinate and the four vertices of each candidate quadrilateral may be determined, and the reliability corresponding to each average distance may be determined according to a preset mapping relationship between distance and reliability, i.e., the reliability corresponding to each candidate quadrilateral can be determined. The reliability reflects how strongly the user intends to select the candidate quadrilateral; for example, the stronger the intention, the greater the reliability.
For example, as shown in fig. 3F, in the target image, the touch points generated by the touch operation may fall into different candidate quadrilaterals, and the corresponding reliability may then be determined according to the number of touch points in each candidate quadrilateral; for example, the more touch points, the greater the reliability. As shown in fig. 3G, in the target image, the touch track generated by the touch operation may fall into different candidate quadrilaterals, so the total track length within each candidate quadrilateral may be counted; for example, the longer the total length, the greater the reliability. As shown in fig. 3H, in the target image, the touch area generated by the touch operation may also fall into different candidate quadrilaterals, so the size of the touch area within each candidate quadrilateral may be counted; for example, the larger the area, the greater the reliability. Of course, it is also possible to judge whether a quadrilateral is the one required by the user based on the degree of deviation between its four corners and a right angle: in general, the real object selected by the user is a rectangle whose four corners are 90 degrees, and if the angles of a candidate quadrilateral deviate more from 90 degrees, the candidate quadrilateral is less likely to be the quadrilateral selected by the user and its reliability is lower. In addition, the distance between the touch center and the center (middle position) of the candidate quadrilateral may be used to determine the reliability; for example, the smaller the distance, the greater the reliability.
Specifically, for the reliability calculation, the information of each dimension can be combined to calculate the comprehensive reliability of the quadrangle. The combination strategy may be to add the evaluation functions directly, to assign a weight to each function and compute a weighted sum, or to normalize the functions and then add them. As shown in fig. 3I, each dimension may correspond to one evaluation parameter: the area ratio may be understood as the ratio between the area of the touch region falling into the quadrangle and the total touch area, the right-angle term may be understood as how close the angles of the quadrangle are to a right angle (90 degrees), and the center position may be understood as the positional relationship between the touch center and the center of the quadrangle. Further, based on each evaluation function, the input coordinate information and the quadrangle information can be combined to calculate the influence of the gesture information (touch parameter) given by the user on the reliability evaluation of the quadrangle. In general, the more of the user's gesture information appears within the range of a quadrangle, the higher the reliability given by the evaluation functions, and the higher the overall reliability of that quadrangle.
Further, each evaluation function can be calculated, for example, as follows: the number of user touch points appearing in the quadrangle can be counted, or the total length of the part of the user touch track falling in the quadrangle can be calculated, or the total area of the part of the user gesture region falling in the quadrangle can be calculated, or the ratio of that total area to the area of the quadrangle can be calculated. The evaluation functions of the dimensions can then be weighted to obtain a comprehensive evaluation function, as illustrated by the sketch below. Because the multiple dimensions fully express the user's intention, the user's intention can be grasped accurately and the user experience is improved.
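The per-dimension evaluation functions could be sketched as follows, assuming each candidate quadrangle is represented by its four corner coordinates; the ray-casting point-in-polygon test, the midpoint approximation for track length, and the particular right-angle score are illustrative choices rather than formulas prescribed by the embodiment.

```python
import numpy as np

def point_in_quad(pt, quad):
    """Ray-casting point-in-polygon test; quad is a list of four (x, y) corners."""
    x, y = pt
    inside = False
    for i in range(4):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % 4]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def eval_touch_points(points, quad):
    """Evaluation function: number of user touch points falling inside the quadrangle."""
    return sum(point_in_quad(p, quad) for p in points)

def eval_track_length(track, quad):
    """Evaluation function: total length of the parts of a touch track inside the quadrangle,
    approximated by summing the lengths of track segments whose midpoints lie inside."""
    total = 0.0
    for a, b in zip(track, track[1:]):
        mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        if point_in_quad(mid, quad):
            total += float(np.hypot(b[0] - a[0], b[1] - a[1]))
    return total

def eval_right_angles(quad):
    """Evaluation function: corners closer to 90 degrees give a score nearer to 1."""
    deviations = []
    for i in range(4):
        a, b, c = np.array(quad[i - 1]), np.array(quad[i]), np.array(quad[(i + 1) % 4])
        v1, v2 = a - b, c - b
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
        deviations.append(abs(angle - 90.0))
    return 1.0 / (1.0 + float(np.mean(deviations)))
```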
Optionally, in the step 305, reliability calculation is performed on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter to obtain at least one reliability, which may include the following steps:
51. determining touch parameters of at least one dimension according to the first touch parameters;
52. acquiring a weight corresponding to each dimension touch parameter in the touch parameters of at least one dimension;
53. and determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability.
In a specific implementation, at least one dimension of touch parameters may be determined according to the first touch parameters, where the at least one dimension may include one or more touch parameters, and the one dimension of touch parameters may include any one of the following: the touch position of the touch display screen, the touch area of the touch display screen, the number of touch points of the touch display screen, the touch force of the touch display screen, the touch duration of the touch display screen, the number of touches of the touch display screen, the touch track of the touch display screen, the touch pattern of the touch display screen, and the like are not limited herein.
Further, a weight corresponding to each dimension of the touch parameters of at least one dimension can be obtained, at least one weight can be preset or defaulted, different dimensions can correspond to different weights, the sum of the at least one weight can be smaller than or equal to 1, further, the reliability corresponding to each candidate quadrangle in at least one candidate quadrangle can be determined according to the touch parameters of at least one dimension and the at least one weight, at least one reliability is obtained, wherein for any candidate quadrangle, each dimension can correspond to one reference reliability, weighting operation is carried out on the reference reliability of all dimensions, the final reliability of the candidate quadrangle can be obtained, and as the selection intention of a user is reflected through a plurality of dimensions, the selection of the user can be accurately grasped, and the user experience is improved.
Further, optionally, step 53 determines the reliability of each candidate quadrilateral in the at least one candidate quadrilateral according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability, which includes:
531. Determining the reference reliability between each touch parameter in the touch parameters of at least one dimension and a candidate quadrangle i, so as to obtain at least one reference reliability, wherein the candidate quadrangle i is any quadrangle in the at least one candidate quadrangle;
532. and carrying out weighting operation according to the at least one reference reliability and the at least one weight to obtain the reliability corresponding to the candidate quadrangle i.
Any candidate quadrangle corresponds to one reliability; take candidate quadrangle i, which is any quadrangle in the at least one candidate quadrangle, as an example. The reference reliability between each touch parameter of the at least one dimension and candidate quadrangle i is determined, yielding at least one reference reliability. A weighting operation is then carried out according to the at least one reference reliability and the at least one weight, that is, each reference reliability is multiplied by its corresponding weight and the products are summed, so that the reliability corresponding to candidate quadrangle i is obtained.
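A minimal sketch of this weighting operation, assuming the per-dimension reference reliabilities have already been computed (all names below are illustrative):

```python
def combine_reliability(reference_reliabilities, weights):
    """Weighted sum of per-dimension reference reliabilities for one candidate.

    reference_reliabilities -- one value in [0, 1] per dimension
    weights                 -- one weight per dimension, summing to at most 1
    """
    assert len(reference_reliabilities) == len(weights)
    return sum(r * w for r, w in zip(reference_reliabilities, weights))

# Example: three dimensions (touch points, touch track, gesture area).
reliability_i = combine_reliability([0.8, 0.6, 0.9], [0.5, 0.3, 0.2])
print(reliability_i)  # 0.76
```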
Optionally, in step 531, determining the reference reliability between each touch parameter of the at least one dimension touch parameters and the candidate quadrangle i to obtain at least one reference reliability may be implemented as follows:
when the touch parameters are touch points, determining the number of touch points of the touch points falling into the candidate quadrangle i, and determining the reference reliability according to the number of touch points;
or,
when the touch parameters are touch tracks, determining track lengths of the touch tracks falling into the candidate quadrilaterals i, and determining the reference reliability according to the track lengths;
or,
and when the touch parameter is a touch area, determining the touch area of the touch area falling into the candidate quadrangle i, and determining the reference reliability according to the touch area.
In a specific implementation, when the touch parameter is a touch point, the number of touch points of the touch point falling into the candidate quadrangle i can be determined, a mapping relation between the number of touch points and the reliability can be preset, and then the reference reliability corresponding to the number of touch points is determined based on the mapping relation. For another example, when the touch parameter is a touch track, the track length of the touch track falling into the candidate quadrangle i may be determined, and the reference reliability corresponding to the track length may be determined according to the mapping relationship between the preset length and the reliability. For another example, when the touch parameter is a touch area, the touch area of the touch area falling into the candidate quadrangle i may be determined, and the reference reliability corresponding to the touch area may be determined according to the mapping relationship between the preset area and the reliability.
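The mapping from a raw measurement to a reference reliability is left open in the description; one possible sketch, assuming a simple preset lookup for the touch-point case and a ray-casting point-in-quadrangle test (both assumptions, not the patent's own choice), is:

```python
def point_in_quad(point, quad_corners):
    """Ray-casting point-in-polygon test for a quadrangle."""
    x, y = point
    inside = False
    n = len(quad_corners)
    for i in range(n):
        x1, y1 = quad_corners[i]
        x2, y2 = quad_corners[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

def reliability_from_point_count(touch_points, quad_corners):
    """Map the number of touch points inside the quadrangle to a reference reliability."""
    count = sum(point_in_quad(p, quad_corners) for p in touch_points)
    # Preset mapping (assumed values): more confirming touches -> higher reliability.
    if count == 0:
        return 0.0
    if count == 1:
        return 0.5
    if count == 2:
        return 0.8
    return 1.0
```

Analogous preset lookups could map the track length or the overlapping touch area to a reference reliability.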
306. And sequencing the at least one candidate quadrangle according to the at least one reliability.
In a specific implementation, the reliability reflects the user's selection intention, and further, different reliability reflects different intentions of the user, so that at least one candidate quadrangle may be ranked according to at least one reliability, for example, the at least one candidate quadrangle may be ranked according to the order of the reliability from high to low.
307. And displaying the image of the area where the at least one candidate quadrangle is located according to the sequencing result.
In a specific implementation, in the embodiment of the application, the candidate quadrangle with the maximum reliability can be displayed according to the sorting result; alternatively, the region image corresponding to that candidate quadrangle can first be subjected to perspective transformation, and the region image after perspective transformation is then displayed.
In a specific implementation, as shown in fig. 3J, line segment detection is performed on the target image, candidate quadrangles are found based on the detection result, reliability calculation is performed on all candidate quadrangles based on the first touch parameter, all candidate quadrangles are ordered according to the reliability, and the quadrangle with the largest reliability is selected from the ordered candidates and output. The user interface can display the preview picture currently captured by the camera, display the best quadrangle detected in the current picture, respond to user interaction by updating and displaying the touch points or the touch point track, and can further process the area image corresponding to the best quadrangle and output the processed image.
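The overall flow described above could be orchestrated roughly as follows. This is a sketch under the assumption that line detection is done with OpenCV's HoughLinesP; build_candidate_quads and compute_reliability are hypothetical helpers standing in for the candidate-quadrangle construction and the reliability calculation discussed above.

```python
import cv2
import numpy as np

def detect_best_quad(target_image_bgr, first_touch_params):
    """Sketch of the pipeline: line segments -> candidate quads -> best quad."""
    gray = cv2.cvtColor(target_image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Line segment detection (one possible choice of detector and parameters).
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=10)
    segments = [] if lines is None else lines[:, 0, :]  # rows of (x1, y1, x2, y2)

    # Hypothetical helpers: build candidate quadrangles and score each one
    # against the first touch parameter.
    candidates = build_candidate_quads(segments)
    scored = [(compute_reliability(q, first_touch_params), q) for q in candidates]

    # Sort by reliability, highest first (step 306); step 307 displays the top result.
    scored.sort(key=lambda item: item[0], reverse=True)
    return scored[0][1] if scored else None
```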
In the embodiment of the application, multidimensional touch parameters input by the user in the document-scanning preview user interface, such as multi-point touch coordinates, the touch track, the touch area, and the like, can indicate the position or range of the quadrangle the user expects. When searching for the optimal quadrangle, input information such as the contact coordinates or the track is included in the total reliability calculation of each quadrangle, so that the result of the quadrangle detection reflects the user's intention more accurately. This assists the automatic quadrangle detection in achieving a better effect and avoids false detection in ambiguous or complex scenes.
Optionally, the step 307 of displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result may include the following steps:
71. selecting a preset number of candidate quadrilaterals in the at least one candidate quadrilaterals according to the sorting result to obtain at least one reference candidate quadrilaterals;
72. screening the at least one reference candidate quadrangle to obtain a target quadrangle;
73. performing perspective transformation on the image of the area where the target quadrangle is positioned to obtain a target area image;
74. And displaying the target area image.
In a specific implementation, the preset number may be preset or default to the system, for example, the preset number may be a fixed value, or may be a specified percentage of the total number of at least one candidate quadrangle, and the specified percentage may be preset or default to the system.
Specifically, a preset number of top-ranked candidate quadrangles can be selected from the at least one candidate quadrangle according to the ranking result to obtain at least one reference candidate quadrangle. The at least one reference candidate quadrangle can then be screened to obtain a target quadrangle, i.e. the quadrangle the user most desires. Perspective transformation is then carried out on the image of the area where the target quadrangle is located to obtain a target area image, that is, the originally quadrilateral area image is converted into a rectangular area image. Finally, the target area image can be displayed. Of course, the edges of the target area image can also receive special treatment, for example being processed into a waveform shape, so that the display effect the user wants is obtained and the user experience is improved.
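A minimal sketch of the perspective transformation step with OpenCV, assuming the four corners of the target quadrangle are already ordered top-left, top-right, bottom-right, bottom-left (the corner-ordering helper is omitted):

```python
import cv2
import numpy as np

def warp_quad_to_rect(image_bgr, quad_corners, out_w=800, out_h=1100):
    """Map the quadrilateral region onto an upright rectangular target area image."""
    src = np.array(quad_corners, dtype=np.float32)          # TL, TR, BR, BL
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image_bgr, matrix, (out_w, out_h))
```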
Optionally, step 72 above filters the at least one reference candidate quadrangle to obtain a target quadrangle, and may include the following steps:
A721, acquiring a second touch parameter;
and A722, selecting the target quadrangle from the at least one reference candidate quadrangle according to the second touch parameter.
The second touch parameter may be the same as or different from the first touch parameter. The second touch parameter may include at least one of the following, without limitation: the touch position of the touch display screen, the touch area of the touch display screen, the number of touch points of the touch display screen, the touch force of the touch display screen, the touch duration of the touch display screen, the number of touches of the touch display screen, the touch track of the touch display screen, or the touch pattern of the touch display screen. In the process of executing step 72, the candidate quadrangle with the highest reliability may be selected as the target quadrangle based on the reliability calculation principle described above. As shown in fig. 3K, a user touch operation may generate the second touch parameter, which is then used to carry out the quadrangle screening.
Of course, the target quadrangle may also be determined in another way: all of the at least one reference candidate quadrangle are displayed on the display screen, and the user selects the candidate quadrangle to be kept through a touch operation, i.e. whichever candidate quadrangle the user touches is retained.
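A short sketch of that tap-to-select behaviour, reusing the point_in_quad helper sketched earlier (names assumed, not from the patent):

```python
def select_quad_by_tap(tap_point, reference_quads):
    """Return the first displayed reference candidate quadrangle containing the tap."""
    for quad in reference_quads:
        if point_in_quad(tap_point, quad):
            return quad
    return None  # the tap landed outside every displayed candidate
```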
Optionally, step 72 above filters the at least one reference candidate quadrangle to obtain a target quadrangle, and may include the following steps:
b721, acquiring eyeball fixation parameters of a user;
and B722, selecting the target quadrangle from the at least one reference candidate quadrangle according to the eyeball fixation parameter of the user.
Wherein, the user eye gaze parameters may include at least one of the following, without limitation: the gaze location, the gaze duration, and so on. The gaze location may correspond to a coordinate, and the closer this coordinate is to a reference candidate quadrangle, the stronger the user's will to select that reference candidate quadrangle. In this embodiment of the present application, as shown in fig. 3L, a sight line heat map may also be constructed based on the eyeball position, the gaze location, and the center of each reference candidate quadrangle. That is, a three-dimensional coordinate system may be constructed and these points mapped into it; a first distance between the eyeball position and the gaze location is determined, as well as a second distance between the eyeball position and the center of each reference candidate quadrangle; then the absolute value of the difference between the first distance and each second distance is determined, and the reference candidate quadrangle corresponding to the minimum absolute value is selected as the target quadrangle. In this way, based on eyeball tracking technology, the user's selection intention can be identified, so as to quickly identify the quadrangle required by the user.
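A minimal sketch of that distance comparison with NumPy, assuming the eyeball position, the gaze location, and the quadrangle centers are already expressed in one common three-dimensional coordinate system (names illustrative):

```python
import numpy as np

def select_quad_by_gaze(eye_pos, gaze_pos, quad_centers):
    """Pick the quadrangle whose center-to-eye distance best matches the eye-to-gaze distance."""
    eye = np.asarray(eye_pos, dtype=float)
    first_dist = np.linalg.norm(np.asarray(gaze_pos, dtype=float) - eye)
    second_dists = [np.linalg.norm(np.asarray(c, dtype=float) - eye) for c in quad_centers]
    diffs = [abs(first_dist - d) for d in second_dists]
    return int(np.argmin(diffs))  # index of the selected reference candidate quadrangle
```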
It can be seen that, in the image processing method described in the embodiment of the present application: a target image containing at least one target object whose peripheral outline is a quadrangle is obtained; line segment detection is performed on the target image to obtain a plurality of line segments; at least one candidate quadrangle is determined according to the plurality of line segments; a first touch parameter for the target image is obtained; reliability calculation is performed on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, each reliability indicating the strength of will that the corresponding candidate quadrangle be selected; the at least one candidate quadrangle is ranked according to the at least one reliability; and the image of the area where the at least one candidate quadrangle is located is displayed according to the ranking result.
In accordance with the above-described embodiments, referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 4, the electronic device includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and in the embodiment of the present application, the programs include instructions for performing the following steps:
Acquiring a target image, wherein the target image comprises at least one target object with a quadrilateral peripheral outline;
performing line segment detection on the target image to obtain a plurality of line segments;
determining at least one candidate quadrangle according to the plurality of line segments;
acquiring a first touch parameter aiming at the target image;
performing reliability calculation on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, wherein each reliability is used for representing the strength of will of the corresponding candidate quadrangle to be selected;
sorting the at least one candidate quadrangle according to the at least one reliability;
and displaying the image of the area where the at least one candidate quadrangle is located according to the sequencing result.
Optionally, in the aspect that the reliability calculation is performed on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter to obtain at least one reliability, the program includes instructions for executing the following steps:
determining touch parameters of at least one dimension according to the first touch parameters;
acquiring a weight corresponding to each dimension touch parameter in the touch parameters of at least one dimension;
And determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability.
Optionally, in determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, obtaining the at least one reliability, the program includes instructions for executing the following steps:
determining the reference reliability between each touch parameter in the touch parameters of at least one dimension and a candidate quadrangle i, so as to obtain at least one reference reliability, wherein the candidate quadrangle i is any quadrangle in the at least one candidate quadrangle;
and carrying out weighting operation according to the at least one reference reliability and the at least one weight to obtain the reliability corresponding to the candidate quadrangle i.
Optionally, in determining the reference reliability between each of the touch parameters of the at least one dimension and the candidate quadrangle i, obtaining at least one reference reliability, the program includes instructions for executing the following steps:
When the touch parameters are touch points, determining the number of touch points of the touch points falling into the candidate quadrangle i, and determining the reference reliability according to the number of touch points;
or,
when the touch parameters are touch tracks, determining track lengths of the touch tracks falling into the candidate quadrilaterals i, and determining the reference reliability according to the track lengths;
or,
and when the touch parameter is a touch area, determining the touch area of the touch area falling into the candidate quadrangle i, and determining the reference reliability according to the touch area.
Optionally, in the aspect of displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result, the program includes instructions for executing the following steps:
selecting a preset number of candidate quadrilaterals in the at least one candidate quadrilaterals according to the sorting result to obtain at least one reference candidate quadrilaterals;
screening the at least one reference candidate quadrangle to obtain a target quadrangle;
performing perspective transformation on the image of the area where the target quadrangle is positioned to obtain a target area image;
and displaying the target area image.
Optionally, in the aspect of screening the at least one reference candidate quadrangle to obtain a target quadrangle, the program includes instructions for performing the following steps:
acquiring a second touch parameter;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the second touch parameter.
Optionally, in the aspect of screening the at least one reference candidate quadrangle to obtain a target quadrangle, the program includes instructions for performing the following steps:
acquiring eyeball fixation parameters of a user;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the eyeball fixation parameters of the user.
It can be seen that, in the electronic device described in the embodiment of the present application: a target image containing at least one target object whose peripheral outline is a quadrangle is obtained; line segment detection is performed on the target image to obtain a plurality of line segments; at least one candidate quadrangle is determined according to the plurality of line segments; a first touch parameter for the target image is obtained; reliability calculation is performed on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, each reliability representing the strength of will that the corresponding candidate quadrangle be selected; the at least one candidate quadrangle is ranked according to the at least one reliability; and an image of the area where the at least one candidate quadrangle is located is displayed according to the ranking result.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional units of the electronic device according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 5 is a functional unit block diagram of an image processing apparatus 500 according to an embodiment of the present application. The image processing apparatus 500 is applied to an electronic device, and the apparatus 500 includes: a first acquisition unit 501, a detection unit 502, a determination unit 503, a second acquisition unit 504, a calculation unit 505, a sorting unit 506, and a display unit 507, wherein,
the first obtaining unit 501 is configured to obtain a target image, where the target image includes at least one target object whose peripheral contour is quadrilateral;
the detecting unit 502 is configured to perform line segment detection on the target image to obtain a plurality of line segments;
the determining unit 503 is configured to determine at least one candidate quadrangle according to the plurality of line segments;
the second obtaining unit 504 is configured to obtain a first touch parameter for the target image;
the calculating unit 505 is configured to calculate, according to the first touch parameter, a reliability of each candidate quadrilateral in the at least one candidate quadrilateral, so as to obtain at least one reliability, where each reliability is used to represent a strength of will of the corresponding candidate quadrilateral to be selected;
the sorting unit 506 is configured to sort the at least one candidate quadrangle according to the at least one reliability;
And the display unit 507 is configured to display, according to the sorting result, an image of an area where the at least one candidate quadrangle is located.
Optionally, in the aspect that the reliability calculation is performed on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter to obtain at least one reliability, the calculating unit 505 is specifically configured to:
determining touch parameters of at least one dimension according to the first touch parameters;
acquiring a weight corresponding to each dimension touch parameter in the touch parameters of at least one dimension;
and determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability.
Optionally, in determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, the calculating unit 505 is specifically configured to:
determining the reference reliability between each touch parameter in the touch parameters of at least one dimension and a candidate quadrangle i, so as to obtain at least one reference reliability, wherein the candidate quadrangle i is any quadrangle in the at least one candidate quadrangle;
And carrying out weighting operation according to the at least one reference reliability and the at least one weight to obtain the reliability corresponding to the candidate quadrangle i.
Optionally, in determining the reference reliability between each touch parameter of the at least one dimension touch parameters and the candidate quadrangle i, the calculating unit 505 is specifically configured to:
when the touch parameters are touch points, determining the number of touch points of the touch points falling into the candidate quadrangle i, and determining the reference reliability according to the number of touch points;
or,
when the touch parameters are touch tracks, determining track lengths of the touch tracks falling into the candidate quadrilaterals i, and determining the reference reliability according to the track lengths;
or,
and when the touch parameter is a touch area, determining the touch area of the touch area falling into the candidate quadrangle i, and determining the reference reliability according to the touch area.
Optionally, in the aspect of displaying the image of the area where the at least one candidate quadrangle is located according to the sorting result, the displaying unit 507 is specifically configured to:
Selecting a preset number of candidate quadrilaterals in the at least one candidate quadrilaterals according to the sorting result to obtain at least one reference candidate quadrilaterals;
screening the at least one reference candidate quadrangle to obtain a target quadrangle;
performing perspective transformation on the image of the area where the target quadrangle is positioned to obtain a target area image;
and displaying the target area image.
Optionally, in the aspect of screening the at least one reference candidate quadrangle to obtain a target quadrangle, the display unit 507 is specifically configured to:
acquiring a second touch parameter;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the second touch parameter.
Optionally, in the aspect of screening the at least one reference candidate quadrangle to obtain a target quadrangle, the display unit 507 is specifically configured to:
acquiring eyeball fixation parameters of a user;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the eyeball fixation parameters of the user.
It can be seen that, in the image processing apparatus described in the embodiments of the present application: a target image containing at least one target object whose peripheral outline is a quadrangle is obtained; line segment detection is performed on the target image to obtain a plurality of line segments; at least one candidate quadrangle is determined according to the plurality of line segments; a first touch parameter for the target image is obtained; reliability calculation is performed on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, each reliability indicating the strength of will that the corresponding candidate quadrangle be selected; the at least one candidate quadrangle is ranked according to the at least one reliability; and the image of the area where the at least one candidate quadrangle is located is displayed according to the ranking result.
It should be noted that the electronic device described in the embodiments of the present application is presented in the form of functional units. The term "unit" as used herein should be understood in the broadest possible sense, and the objects used to implement the functions described by the various "units" may be, for example, an integrated circuit ASIC, a single circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
The first obtaining unit 501, the detecting unit 502, the determining unit 503, the second obtaining unit 504, the calculating unit 505, the sorting unit 506, and the display unit 507 may be implemented by a processor, which may be an artificial intelligence chip, an NPU, a CPU, a GPU, or the like, without limitation; the second obtaining unit 504 and the display unit 507 may include a touch display screen. The functions or steps of any of the above methods can be implemented based on the above unit modules.
The present embodiment also provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute any of the methods of the above embodiments of the present application.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described relevant steps to implement any of the methods of the above-described embodiments.
In addition, embodiments of the present application also provide an image processing apparatus, which may be a chip, a component, or a module, in particular, the apparatus may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions that, when the device is operated, are executable by the processor to cause the chip to perform any one of the method embodiments described above.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. An image processing method, the method comprising:
acquiring a target image, wherein the target image comprises at least one target object with a quadrilateral peripheral outline;
performing line segment detection on the target image to obtain a plurality of line segments;
determining at least one candidate quadrangle according to the plurality of line segments;
acquiring a first touch parameter aiming at the target image;
performing reliability calculation on each candidate quadrangle in the at least one candidate quadrangle according to the first touch parameter to obtain at least one reliability, wherein each reliability is used for representing the strength of will of the corresponding candidate quadrangle to be selected;
sorting the at least one candidate quadrangle according to the at least one reliability;
displaying the images of the area where the at least one candidate quadrangle is located according to the sequencing result;
the performing reliability calculation on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter to obtain at least one reliability includes:
determining touch parameters of at least one dimension according to the first touch parameters;
acquiring a weight corresponding to each dimension touch parameter in the touch parameters of at least one dimension;
And determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability.
2. The method of claim 1, wherein determining the reliability of each candidate quadrilateral in the at least one candidate quadrilateral according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, and obtaining the at least one reliability comprises:
determining the reference reliability between each touch parameter in the touch parameters of at least one dimension and a candidate quadrangle i, so as to obtain at least one reference reliability, wherein the candidate quadrangle i is any candidate quadrangle in the at least one candidate quadrangle;
and carrying out weighting operation according to the at least one reference reliability and the at least one weight to obtain the reliability corresponding to the candidate quadrangle i.
3. The method according to claim 2, wherein determining the reference reliability between each of the touch parameters of the at least one dimension and the candidate quadrangle i, to obtain at least one reference reliability, comprises:
When the touch parameters are touch points, determining the number of touch points of the touch points falling into the candidate quadrangle i, and determining the reference reliability according to the number of touch points;
or,
when the touch parameters are touch tracks, determining track lengths of the touch tracks falling into the candidate quadrilaterals i, and determining the reference reliability according to the track lengths;
or,
and when the touch parameter is a touch area, determining the touch area of the touch area falling into the candidate quadrangle i, and determining the reference reliability according to the touch area.
4. A method according to any one of claims 1-3, wherein displaying the image of the region in which the at least one candidate quadrangle is located according to the ranking result comprises:
selecting a preset number of candidate quadrilaterals in the at least one candidate quadrilaterals according to the sorting result to obtain at least one reference candidate quadrilaterals;
screening the at least one reference candidate quadrangle to obtain a target quadrangle;
performing perspective transformation on the image of the area where the target quadrangle is positioned to obtain a target area image;
and displaying the target area image.
5. The method of claim 4, wherein the screening the at least one reference candidate quadrilateral to obtain a target quadrilateral comprises:
acquiring a second touch parameter;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the second touch parameter.
6. The method of claim 4, wherein the screening the at least one reference candidate quadrilateral to obtain a target quadrilateral comprises:
acquiring eyeball fixation parameters of a user;
and selecting the target quadrangle from the at least one reference candidate quadrangle according to the eyeball fixation parameters of the user.
7. An image processing apparatus, characterized in that the apparatus comprises: the device comprises a first acquisition unit, a detection unit, a determination unit, a second acquisition unit, a calculation unit, a sequencing unit and a display unit, wherein,
the first acquisition unit is used for acquiring a target image, and the target image comprises at least one target object with a quadrangular peripheral outline;
the detection unit is used for carrying out line segment detection on the target image to obtain a plurality of line segments;
the determining unit is used for determining at least one candidate quadrangle according to the plurality of line segments;
The second acquisition unit is used for acquiring a first touch parameter aiming at the target image;
the computing unit is configured to perform reliability computation on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter, so as to obtain at least one reliability, where each reliability is used to represent a strength of will of the corresponding candidate quadrilateral to be selected;
the sorting unit is used for sorting the at least one candidate quadrangle according to the at least one reliability;
the display unit is used for displaying the image of the area where the at least one candidate quadrangle is located according to the sequencing result;
the performing reliability calculation on each candidate quadrilateral in the at least one candidate quadrilateral according to the first touch parameter to obtain at least one reliability includes:
determining touch parameters of at least one dimension according to the first touch parameters;
acquiring a weight corresponding to each dimension touch parameter in the touch parameters of at least one dimension;
and determining the reliability corresponding to each candidate quadrangle in the at least one candidate quadrangle according to the touch parameter of the at least one dimension and the weight corresponding to the touch parameter of each dimension, so as to obtain the at least one reliability.
8. An electronic device comprising a processor, a memory for storing one or more programs and configured for execution by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-6.
9. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-6.
CN202210113847.5A 2022-01-30 2022-01-30 Image processing method and related device Active CN114510192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210113847.5A CN114510192B (en) 2022-01-30 2022-01-30 Image processing method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210113847.5A CN114510192B (en) 2022-01-30 2022-01-30 Image processing method and related device

Publications (2)

Publication Number Publication Date
CN114510192A CN114510192A (en) 2022-05-17
CN114510192B true CN114510192B (en) 2024-04-09

Family

ID=81552429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210113847.5A Active CN114510192B (en) 2022-01-30 2022-01-30 Image processing method and related device

Country Status (1)

Country Link
CN (1) CN114510192B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729719A (en) * 2008-10-29 2010-06-09 夏普株式会社 Image processing apparatus, image forming apparatus, image reading apparatus, image processing method
CN103048846A (en) * 2011-10-14 2013-04-17 佳能株式会社 Focus adjustment apparatus and method for controlling the same
KR20160023754A (en) * 2016-02-15 2016-03-03 이성호 Split screen displaying method and device and computer-readable storage medium with program therefor
CN111523353A (en) * 2019-02-02 2020-08-11 顾泽苍 Method for processing machine understanding radar data
CN111553846A (en) * 2020-05-12 2020-08-18 Oppo广东移动通信有限公司 Super-resolution processing method and device
CN113177229A (en) * 2021-04-29 2021-07-27 Oppo广东移动通信有限公司 Data processing method and related device
CN113452980A (en) * 2020-03-24 2021-09-28 Oppo广东移动通信有限公司 Image processing method, terminal and storage medium

Also Published As

Publication number Publication date
CN114510192A (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN112717370B (en) Control method and electronic equipment
CN111738122B (en) Image processing method and related device
CN111768416B (en) Photo cropping method and device
CN111553846B (en) Super-resolution processing method and device
CN111882642B (en) Texture filling method and device for three-dimensional model
CN111768352B (en) Image processing method and device
CN111400605A (en) Recommendation method and device based on eyeball tracking
CN111612723B (en) Image restoration method and device
CN115437601B (en) Image ordering method, electronic device, program product and medium
CN114244655B (en) Signal processing method and related device
CN114465852B (en) Signal processing method and related device
CN114510192B (en) Image processing method and related device
CN115390738A (en) Scroll screen opening and closing method and related product
CN111836226B (en) Data transmission control method, device and storage medium
CN114399622A (en) Image processing method and related device
CN117769696A (en) Display method, electronic device, storage medium, and program product
CN111459271B (en) Gaze offset error determination method and device
CN115175164B (en) Communication control method and related device
CN114596819B (en) Brightness adjusting method and related device
CN117950882A (en) Image transmission control method and related device
CN114422686B (en) Parameter adjustment method and related device
CN113311380B (en) Calibration method, device and storage medium
CN116048681B (en) Icon display method, electronic device, and computer-readable storage medium
CN115933952B (en) Touch sampling rate adjusting method and related device
CN116343247B (en) Form image correction method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant