CN111562877A - Image processing method and device


Info

Publication number: CN111562877A
Authority: CN (China)
Prior art keywords: edge line, edge, target area, obtaining, operation data
Legal status: Pending
Application number: CN202010363713.XA
Other languages: Chinese (zh)
Inventors: 彭方振, 马戈芳, 武亚强
Current Assignee: Lenovo Beijing Ltd
Original Assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN202010363713.XA
Publication of CN111562877A

Classifications

    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06T 7/13: Image analysis; segmentation; edge detection
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V 30/32: Character recognition; digital ink

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method and device. The method comprises: obtaining operation data; obtaining an adjustment instruction based on the operation data; and adjusting the position of at least one edge line of a target area according to the adjustment instruction to obtain an adjusted target area, wherein the target area is an area in an image preview area output on an electronic device. The position of an edge line of the target area obtained through edge detection can therefore be adjusted using operation data in the image preview area, and the adjusted target area is formed once the edge line is moved to the corresponding position, which reduces edge detection errors in the acquired image and improves edge detection accuracy.

Description

Image processing method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
With the rapid development of electronic technology, image frame detection and correction techniques are increasingly widely applied in the field of intelligent education.
For example, the edges of the text area in an acquired image of a test question or test paper are detected by an edge detection algorithm, the detected edges form an irregular quadrilateral area, and the imaged irregular quadrilateral is then rectified into a rectangle through an image transformation. On one hand, this makes the image more visually pleasing; on the other hand, it improves the accuracy and efficiency of subsequent character recognition.
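As a concrete illustration of this conventional pipeline (not part of the present application), the following Python sketch uses OpenCV to detect the document's quadrilateral outline and warp it into a rectangle; the function names, thresholds and output size are assumptions chosen for the example.

    import cv2
    import numpy as np

    def detect_document_quad(image):
        # detect edges, then keep the largest 4-sided contour as the document outline
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in sorted(contours, key=cv2.contourArea, reverse=True):
            approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
            if len(approx) == 4:
                return approx.reshape(4, 2).astype(np.float32)
        return None

    def warp_to_rectangle(image, quad, width=800, height=1100):
        # rectify the irregular quadrilateral into a rectangle via a perspective warp;
        # assumes quad is ordered top-left, top-right, bottom-right, bottom-left
        dst = np.float32([[0, 0], [width, 0], [width, height], [0, height]])
        m = cv2.getPerspectiveTransform(quad, dst)
        return cv2.warpPerspective(image, m, (width, height))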
In actual use, shadows in the acquired image or lines in the image content may introduce errors into the detected edges, making the edge detection inaccurate.
Disclosure of Invention
In view of the above, the present application provides an image processing method and apparatus, comprising:
An image processing method comprising:
obtaining operation data;
obtaining an adjustment instruction based on the operation data;
and adjusting the position of at least one edge line of a target area according to the adjustment instruction to obtain an adjusted target area, wherein the target area is an area in an image preview area output on an electronic device.
In the above method, preferably, adjusting the position of at least one edge line of the target area according to the adjustment instruction includes:
determining at least one first edge line among the edge lines of the target area according to the adjustment instruction;
determining, in the edge line set corresponding to the first edge line, a second edge line that at least meets an adjustment condition, wherein the edge line set comprises at least one candidate edge line corresponding to the first edge line;
and replacing the first edge line corresponding to the second edge line in the target area with the second edge line to obtain the adjusted target area.
The above method preferably further comprises:
if the edge line set corresponding to the first edge line is an empty set, adjusting the detection parameters of the edge detection and performing edge detection again to obtain candidate edge lines for the first edge line.
In the above method, preferably, the adjustment condition includes:
the second edge line is the edge line in the edge line set whose edge confidence is second only to that of the first edge line.
In the above method, preferably, the adjustment instruction at least characterizes: the first edge line to be replaced.
In the above method, preferably, the adjustment instruction further characterizes: the relative positional relationship between the edge line replacing the first edge line and the first edge line.
In the above method, preferably, the adjustment instruction further characterizes: a position adjustment direction of the first edge line;
wherein the edge line set corresponding to the first edge line is obtained as follows:
obtaining, from the candidate edge lines corresponding to the first edge line, the candidate edge lines corresponding to the position adjustment direction to form the edge line set of the first edge line.
In the above method, preferably, obtaining operation data includes:
obtaining motion parameters of the electronic device as the operation data.
In the above method, preferably, obtaining an adjustment instruction based on the operation data includes:
obtaining the moving direction of the electronic device according to the motion parameters of the electronic device in the operation data;
and obtaining, according to the moving direction, an adjustment instruction that at least characterizes: the first edge line to be replaced.
In the above method, preferably, obtaining operation data includes:
acquiring, through an image acquisition device, gesture data of a user operation body in the image preview area;
and performing gesture recognition on the gesture data to obtain the operation data.
In the above method, preferably, obtaining an adjustment instruction based on the operation data includes:
obtaining, according to the gesture trajectory of the user operation body in the operation data, an adjustment instruction that at least characterizes: the first edge line to be replaced.
An image processing apparatus comprising:
an operation obtaining unit for obtaining operation data;
an instruction obtaining unit for obtaining an adjustment instruction based on the operation data;
and an edge line adjusting unit for adjusting the position of at least one edge line of a target area according to the adjustment instruction to obtain an adjusted target area, wherein the target area is an area in an image preview area output on an electronic device.
As can be seen from the above technical solutions, in the image processing method and apparatus disclosed in the present application, after operation data for a target area in an image preview area output on an electronic device is obtained, an adjustment instruction for adjusting the position of at least one edge line of the target area can be obtained based on the operation data, so that the position-adjusted edge lines form an adjusted target area. The position of an edge line of the target area obtained through edge detection can therefore be adjusted using operation data in the image preview area, and the adjusted target area is formed once the edge lines are moved to the corresponding positions, which reduces edge detection errors in the acquired image and improves edge detection accuracy.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram illustrating an example of detection errors in conventional edge detection;
FIG. 2 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 3 is a partial flowchart of the first embodiment of the present application;
FIGS. 4-14 are exemplary diagrams of edge detection according to embodiments of the present application;
FIG. 15 is a schematic structural diagram of an image processing apparatus according to the second embodiment of the present application;
FIG. 16 is a schematic structural diagram of an electronic device according to the third embodiment of the present application;
FIGS. 17 and 18 are schematic diagrams of gestures involved in practical applications of the embodiments of the present application;
FIG. 19 is a flowchart of an embodiment of the present application in a specific implementation.
Detailed Description
As shown in fig. 1, after edge detection is performed on the previewed image area in a mobile phone's camera preview area, four edge lines 1-4 of a "report" are obtained. However, because the ambient light around the "report" is unevenly distributed, part of the "report" in the camera preview area is in shadow, and the detected edge line 1 is not an accurate edge line. As another example, after edge detection is performed on the previewed image area, four edge lines 1-4 of a "notebook" are obtained, but because of lines on the "notebook" cover, a cover line in the image preview area is mistakenly recognized as edge line 4 of the "notebook", so the detected edge line 4 is not an accurate edge line. As a result, the image area obtained after the imaged irregular quadrilateral is rectified into a rectangle through an image transformation is incomplete.
In view of the above drawbacks, the inventors of the present application have, through further study, proposed a technical solution capable of correcting edge lines in edge detection. The solution may be applied to an electronic device equipped with an image acquisition device, such as a mobile phone, pad or computer, to perform image processing, and specifically comprises the following steps:
After operation data for the target area in the image preview area output on the electronic device is obtained, an adjustment instruction for adjusting the position of at least one edge line of the target area may be obtained based on the operation data, so that the position-adjusted edge lines form the adjusted target area.
In this scheme, the position of an edge line of the target area obtained through edge detection can therefore be adjusted in the image preview area using the operation data, and the adjusted target area is formed once the edge lines are moved to the corresponding positions, which reduces edge detection errors in the acquired image and improves edge detection accuracy.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 2, a flowchart of an implementation of an image processing method provided in an embodiment of the present application is shown. The method may be applied to an electronic device capable of image acquisition and edge detection, such as a mobile phone, pad, computer or server equipped with an image acquisition device such as a camera. The technical scheme in this embodiment mainly serves to reduce edge detection errors in the image and improve edge detection accuracy.
In a specific implementation, the method in this embodiment may be implemented by the following steps:
step 201: operational data is obtained.
The operation data is data from an operation performed by a user on an edge line of the target area in the image preview area, when the image preview area is output on the electronic device and contains a target area obtained through edge detection. Examples include shaking the electronic device, sliding on the electronic device's display screen, and making a gesture in the capture area of the image acquisition device. It should be noted that the operation data in this embodiment is different from operation data of a user operating on an image after edge detection has been performed on an already-acquired image.
Step 202: an adjustment instruction is obtained based on the operation data.
The adjustment instruction may include one or more adjustment parameters. The adjustment parameters may characterize the first edge line to be adjusted and the manner of adjusting it, such as moving the first edge line to a suitable position or replacing it with an edge line at a corresponding position, and may also characterize the relative positional relationship between the second edge line replacing the first edge line and the first edge line.
In addition, after the first edge line is adjusted, it may be necessary in this embodiment to adaptively adjust other edge lines of the target area besides the first edge line, so that these other edge lines form the adjusted target area together with the adjusted first edge line. The adjustment parameters in this embodiment can therefore also characterize at least one third edge line to be adjusted and the manner of adjusting it, where these parameters relate to the adjusted first edge line; for example, they may characterize the third edge line to be extended or shortened, the direction of extension or shortening, the endpoint to be moved, the new endpoint position, or the new edge line length.
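For illustration only, the adjustment instruction described above could be held in a structure like the following Python sketch; all type and field names are assumptions, not the patent's data format.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class EdgeAdjustment:
        edge: str                        # which first edge line, e.g. "left"
        direction: Optional[str] = None  # position adjustment direction, e.g. "left"
        relation: Optional[str] = None   # relative position of the replacement, e.g. "outside"

    @dataclass
    class AdjustInstruction:
        # first edge lines to replace/move, plus any third edge lines to extend or shorten
        first_edges: List[EdgeAdjustment] = field(default_factory=list)
        third_edges: List[str] = field(default_factory=list)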
Step 203: the position of at least one edge line of the target area is adjusted according to the adjustment instruction to obtain the adjusted target area.
For example, in this embodiment, the first edge line characterized by the adjustment instruction may be replaced in the target area by the corresponding second edge line, or moved to the position of the corresponding second edge line. Furthermore, any third edge line characterized by the adjustment instruction can be adaptively adjusted along with it according to the instruction.
As can be seen from the above technical solutions, in the image processing method provided in this embodiment of the present invention, after operation data for a target area in the image preview area output on the electronic device is obtained, an adjustment instruction for adjusting the position of at least one edge line of the target area can be obtained based on the operation data, so that the position-adjusted edge lines form the adjusted target area. In this embodiment, the position of an edge line of the target area obtained through edge detection can therefore be adjusted in the image preview area using the operation data, and the adjusted target area is formed once the edge lines are moved to the corresponding positions, which reduces edge detection errors in the acquired image and further improves edge detection accuracy.
In one implementation, when the position of at least one edge line of the target area is adjusted according to the adjustment instruction, step 203 may be specifically implemented in the following manner, as shown in fig. 3:
Step 301: determining at least one first edge line among the edge lines of the target area according to the adjustment instruction.
As shown in fig. 4, in this embodiment, the first edge line a corresponding to the adjustment instruction is determined among the four edge lines of the target area according to the adjustment parameters in the adjustment instruction.
Step 302: and determining a second edge line at least meeting the adjusting condition in the edge line set corresponding to the first edge line.
And the edge line set corresponding to the first edge line comprises at least one alternative edge line corresponding to the first edge line.
The first edge line and the candidate edge lines in the edge line set corresponding to the first edge line are both edge lines recognized when the edge detection is performed in the image preview area, the edge confidence degrees of these recognized edge lines are different, and the first edge line is the edge line with the highest edge confidence degree among these recognized edge lines, that is, the edge confidence degree of the first edge line is higher than that of any one of the candidate edge lines in the edge line set corresponding to the first edge line. Here, the edge confidence may be represented by a probability value recognized as an edge line.
The candidate edge lines in the edge line set corresponding to the first edge line may be all of the edge lines identified by edge detection in the image preview area; or they may be the identified edge lines that correspond to the operation data, such as the candidate edge lines corresponding to the position adjustment direction of the first edge line characterized in the adjustment instruction; or they may be the identified edge lines that correspond to the position adjustment direction of the first edge line and whose edge confidence is greater than or equal to a confidence threshold, such as 0.6.
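A minimal sketch of forming such an edge line set, assuming each candidate for a roughly vertical edge is an (x position, confidence) pair; the 0.6 threshold comes from the text, the data layout does not.

    CONF_THRESHOLD = 0.6

    def edge_line_set(candidates, first_edge_x, direction):
        # keep candidates on the requested side of the first edge line
        if direction == "left":
            kept = [c for c in candidates if c[0] < first_edge_x]
        else:  # "right"
            kept = [c for c in candidates if c[0] > first_edge_x]
        # and drop those below the confidence threshold
        return [c for c in kept if c[1] >= CONF_THRESHOLD]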
In a specific implementation, each edge line of the target region may have its own edge line set. Taking a target region with four edge lines as an example, the upper, lower, left and right edge lines of the target region in fig. 4 each have a corresponding edge line set, and each set may contain one or more candidate edge lines or be empty. For example, the set for the left edge line contains two candidate edge lines b and c, the set for the right edge line contains a candidate edge line d, the set for the upper edge line contains one candidate edge line e, and the set for the lower edge line is empty.
Based on this, in this embodiment the second edge line satisfying the adjustment condition may be determined directly in the edge line set corresponding to the first edge line, for example among the two candidate edge lines b and c.
Alternatively, two edge lines of the target region that are at least approximately parallel may share one edge line set. Taking a target region with four edge lines as an example, among the upper, lower, left and right edge lines in fig. 4, the left and right edge lines correspond to one edge line set and the upper and lower edge lines correspond to another; each set may contain one or more candidate edge lines or be empty. For example, the set shared by the left and right edge lines contains three candidate edge lines b, c and d, and the set shared by the upper and lower edge lines contains one candidate edge line e.
Based on this, in this embodiment, after all the candidate edge lines corresponding to the first edge line are determined, the candidate edge lines whose distance from the first edge line is less than a distance threshold are first selected to form an edge line set, and the second edge line satisfying the adjustment condition is then determined among them. For example, from the three candidate edge lines b, c and d, the candidate edge lines b and c that are closer to the first edge line a are selected to form the edge line set, and the second edge line is then determined from b and c. Alternatively, after all the candidate edge lines corresponding to the first edge line are determined, the candidate edge lines corresponding to the position adjustment direction characterized in the adjustment instruction are selected to form the edge line set, and the second edge line is then determined among them. For example, according to an adjustment instruction for moving the first edge line to the left, determined from operation data of shaking the mobile phone to the left, the candidate edge lines b and c on the left side of the first edge line (whose edge confidence may also be required to exceed the confidence threshold) are selected from the three candidate edge lines b, c and d to form the edge line set, and the second edge line satisfying the adjustment condition is then determined from b and c.
The adjustment condition may be: the second edge line is the edge line in the edge line set whose edge confidence is second only to that of the first edge line. Alternatively, the adjustment condition may be: the second edge line is the edge line in the edge line set whose position is closest to the first edge line. Or the adjustment condition may be: the second edge line is the edge line in the edge line set whose relative positional relationship with the first edge line corresponds to the operation data.
Accordingly, in this embodiment, the candidate edge line in the edge line set corresponding to the first edge line whose edge confidence is second only to that of the first edge line may be determined to be the second edge line; or the candidate edge line whose position is closest to the first edge line; or the candidate edge line that corresponds to the direction in the operation data and is closest to the first edge line.
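Two of these adjustment conditions, sketched over the same assumed (position, confidence) pairs; because the first edge line holds the highest confidence overall, the best candidate in the set is the line whose confidence is second only to it.

    def second_by_confidence(edge_set):
        # edge line whose confidence is second only to the first edge line's
        return max(edge_set, key=lambda c: c[1])

    def nearest_by_position(edge_set, first_edge_x):
        # edge line whose position is closest to the first edge line
        return min(edge_set, key=lambda c: abs(c[0] - first_edge_x))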
As shown in fig. 4, there are two candidate edge lines b and c in the edge line set corresponding to the first edge line a. In this embodiment, the candidate edge line c, whose edge confidence is second only to that of the first edge line a, may be selected as the second edge line; alternatively, the candidate edge line b, whose position is closest to the first edge line a, may be selected as the second edge line.
Step 303: and replacing the first edge line corresponding to the second edge line in the target area by the second edge line to obtain the adjusted target area.
The second edge line replacing the first edge line may be understood as the adjusted first edge line, and at this time, the second edge line and other edge lines in the target area form the adjusted target area.
It should be noted that, after replacing the first edge line a with the second edge line b in the present embodiment, there may be a case where the second edge line b is not connected to other edge lines of the target area, as shown in fig. 5. Therefore, in this embodiment, after the second edge line is replaced by the first edge line, other edge lines of the target area may be adaptively adjusted according to the adjustment instruction. As shown in fig. 6, the edge line f and the edge line g connected to the end points of the original first edge line a in the target area are extended to a length that can be end-connected to the second edge line b replacing the first edge line, thereby forming an adjusted target area.
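The adaptive extension of fig. 6 amounts to moving each dangling endpoint to the intersection of the adjacent edge line with the replacement line. A plain-geometry sketch, assuming lines are given as endpoint pairs (nothing here is the patent's API):

    def intersect(p1, p2, p3, p4):
        # intersection of the infinite lines p1-p2 and p3-p4; None if parallel
        d = (p1[0] - p2[0]) * (p3[1] - p4[1]) - (p1[1] - p2[1]) * (p3[0] - p4[0])
        if d == 0:
            return None
        a = p1[0] * p2[1] - p1[1] * p2[0]
        b = p3[0] * p4[1] - p3[1] * p4[0]
        return ((a * (p3[0] - p4[0]) - (p1[0] - p2[0]) * b) / d,
                (a * (p3[1] - p4[1]) - (p1[1] - p2[1]) * b) / d)

    def extend_to_meet(fixed_end, dangling_end, replacement):
        # replace the dangling endpoint of an adjacent edge line (e.g. f or g)
        # with its intersection with the replacement line b, reclosing the area
        corner = intersect(fixed_end, dangling_end, replacement[0], replacement[1])
        return (fixed_end, corner if corner is not None else dangling_end)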
In one case, the method further comprises: if the edge line set corresponding to the first edge line is an empty set, adjusting the detection parameters of the edge detection and performing edge detection again to obtain candidate edge lines for the first edge line.
Specifically, during the position adjustment of an edge line of the target area, if the edge line set corresponding to the first edge line is empty, the detection parameters of the edge detection in the image preview area may be adjusted, for example by reducing the detection accuracy so that lines are more readily recognized as edges, allowing candidate edge lines to be associated with the first edge line and avoiding the empty-set situation. For example, if edge detection on the image preview area identifies only the first edge line a for the left edge of the target area, with no candidate edge lines on either side of it, the detection parameters may be adjusted so that not only the first edge line a but also its candidate edge lines b and c are detected. On this basis, an adjustment instruction can again be obtained from the operation data; after the first edge line a is determined according to the instruction, the candidate edge line c, whose edge confidence is second only to that of a, may be selected from the edge line set composed of b and c as the second edge line, or the candidate edge line b, whose position is closest to a, may be selected. Finally, the first edge line a in the target area is replaced with the candidate edge line b or c to obtain the adjusted target area.
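A hedged sketch of this fallback, using Canny thresholds to stand in for the unspecified "detection parameters" and a caller-supplied extract_candidates routine (both assumptions):

    import cv2

    def detect_with_fallback(gray, extract_candidates, lo=100, hi=200):
        # halve the thresholds (i.e. reduce detection accuracy) until the
        # candidate edge line set is no longer empty
        while lo >= 10:
            candidates = extract_candidates(cv2.Canny(gray, lo, hi))
            if candidates:
                return candidates
            lo, hi = lo // 2, hi // 2
        return []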
In one implementation, the operation data obtained in step 201 may specifically be motion data of the electronic device, that is, the user moves the electronic device by holding it or in another manner. As shown in fig. 7, the user holds the mobile phone and shakes it to the left; in this embodiment, the motion data of this leftward shake is obtained and subsequently mapped to the user's operation of adjusting the left edge line of the target area.
Based on this, when the adjustment instruction is obtained based on the operation data in step 202, the following method may be used:
first, the moving direction of the electronic device is obtained according to the motion parameters of the electronic device in the operation data; as shown in fig. 7, the leftward moving direction of the mobile phone is obtained;
then, an adjustment instruction is obtained according to the moving direction. The adjustment parameters contained in the instruction can at least characterize the first edge line to be replaced, and the relative position of that edge line in the target area corresponds to the moving direction of the electronic device. In addition, the adjustment instruction may further characterize the relative positional relationship between the edge line replacing the first edge line and the first edge line, which likewise corresponds to the moving direction of the electronic device.
For example, in step 202 of this embodiment, the first edge line to be replaced is first determined, according to the leftward moving direction of the mobile phone, to be the first edge line a on the left side of the target area in the image preview area output on the phone, as shown in fig. 7; in addition, the edge line replacing the first edge line a may be determined, from the same moving direction, to lie on the left side of a and closest to a, thereby yielding the adjustment instruction. On this basis, in step 203 the first edge line a may be determined according to the adjustment instruction, and the second edge line b that replaces it is then determined: for example, b is the edge line in the edge line set corresponding to a whose edge confidence is second only to that of a; or b is the edge line in that set closest to a; or, according to the relative positional relationship characterized in the adjustment instruction, b is the edge line in that set that lies on the left side of a and closest to a. Finally, the second edge line b replaces the first edge line a as a new edge line of the target area, forming the adjusted target area together with the other edge lines. When the second edge line b lies on the left side of the first edge line a, the adjustment is equivalent to enlarging the target region to the left.
It should be noted that the area of the target region can likewise be expanded or reduced when the user holds the mobile phone and shakes it rightward, upward or downward; the specific implementation is analogous to the leftward expansion described above and is not repeated here.
As another example, according to the forward moving direction of the mobile phone, that is, the user holding the phone and moving it forward, the first edge lines to be replaced are determined to be the four edge lines of the target area in the image preview area output on the phone, and the edge lines respectively replacing those four first edge lines may be determined to lie outside the corresponding first edge lines, thereby yielding the adjustment instruction. On this basis, in step 203 the four first edge lines may be determined according to the adjustment instruction, and a second edge line replacing each is then determined: for example, each second edge line is the edge line in the set corresponding to its first edge line whose edge confidence is second only to that of the first edge line; or the edge line in that set closest to the first edge line; or, according to the relative positional relationship characterized in the adjustment instruction, the nearest edge line outside the first edge line. Finally, each second edge line replaces the corresponding first edge line as a new edge line of the target area, forming the adjusted target area together with the other edge lines. When each second edge line lies outside the corresponding first edge line, the adjustment is equivalent to expanding the target region outward on all four sides, as shown in fig. 8.
Similarly, according to the backward moving direction of the mobile phone, that is, the user moving the phone in the opposite direction, the first edge lines to be replaced are determined to be the four edge lines of the target area, and the edge lines replacing them are determined to lie inside the corresponding first edge lines, thereby yielding the adjustment instruction. In step 203, the four first edge lines are determined according to the adjustment instruction, and a second edge line replacing each is then determined in the same way as above, except that, per the relative positional relationship characterized in the instruction, each second edge line is the nearest edge line inside its first edge line. When each second edge line lies inside the corresponding first edge line, the adjustment is equivalent to shrinking the target region inward, as shown in fig. 9.
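The shake-to-instruction mapping just described, condensed into an illustrative sketch; the direction strings and the dictionary returned are assumptions, not the patent's format.

    def instruction_from_shake(direction):
        sides = {"left": "left", "right": "right", "up": "top", "down": "bottom"}
        if direction in sides:
            # adjust the edge line on that side; its replacement lies toward the shake
            return {"first_edges": [sides[direction]], "relation": direction}
        if direction == "forward":   # expand all four edges outward (fig. 8)
            return {"first_edges": ["left", "right", "top", "bottom"], "relation": "outside"}
        if direction == "backward":  # shrink all four edges inward (fig. 9)
            return {"first_edges": ["left", "right", "top", "bottom"], "relation": "inside"}
        return None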
As shown in fig. 10, after edge detection in the image preview area of the mobile phone, four edge lines 1-4 of the "report" are obtained, but because the ambient light is unevenly distributed, part of the "report" is in shadow, and the detected edge line 1 is not accurate. The user can therefore hold the phone and shake it upward; the phone determines from this upward-shake operation data that edge line 1 is to be replaced, and determines the edge line 5 that replaces it. Edge line 5 may be the edge line with the highest edge confidence among the candidate edge lines for edge line 1 detected in the earlier edge detection, or the candidate closest to edge line 1, or the candidate located above and closest to edge line 1. Edge line 5 then replaces edge line 1 in the image preview area, edge lines 2 and 4 are extended adaptively, and edge line 5, the extended edge lines 2 and 4 and the original edge line 3 form a new "report" region, which can then be further processed, for example by character recognition.
In one implementation, the operation data obtained in step 201 may specifically be obtained as follows: gesture data of the user operation body in the image preview area is acquired through the image acquisition device, and gesture recognition is performed on the gesture data to obtain the operation data.
As shown in fig. 11, the user operation body is located in the capture area of the image acquisition device, so a preview image of the operation body exists in the image preview area. In this embodiment, gesture data of the operation body in the image preview area, such as one or more frames of gesture images, can thus be obtained, and gesture recognition is then performed to obtain operation data containing a gesture trajectory.
On this basis, when the adjustment instruction is obtained in step 202, it may be obtained from the gesture trajectory of the user operation body in the operation data, such as the position where the operation body appears in the image preview area and/or its moving direction there. The adjustment parameters contained in the instruction can at least characterize the first edge line to be replaced, whose relative position in the target area corresponds to the moving direction of the operation body in the image preview area, or to the position where the operation body appears. In addition, the instruction may further characterize the relative positional relationship between the edge line replacing the first edge line and the first edge line, which corresponds to the moving direction of the operation body in the image preview area.
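One way (assumed, not specified by the application) to reduce a gesture trajectory of frame-by-frame hand positions to a dominant direction:

    def dominant_direction(points):
        # points: (x, y) positions of the operation body in successive preview frames
        dx = points[-1][0] - points[0][0]
        dy = points[-1][1] - points[0][1]
        if abs(dx) >= abs(dy):
            return "left" if dx < 0 else "right"
        return "up" if dy < 0 else "down"  # image y grows downward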
For example, in step 202 of this embodiment, according to the leftward moving direction of the user operation body in the image preview area, the first edge line to be replaced is first determined to be the first edge line a on the left side of the target area in the image preview area output on the phone, as shown in fig. 11; in addition, the edge line replacing the first edge line a may be determined, from the same moving direction, to lie on the left side of a and closest to a, thereby yielding the adjustment instruction. On this basis, in step 203 the first edge line a may be determined according to the adjustment instruction, and the second edge line b replacing it is then determined in the same ways as described for the shake gesture: by edge confidence second only to a's, by closest position, or by the relative positional relationship characterized in the instruction. Finally, the second edge line b replaces the first edge line a, forming the adjusted target area together with the other edge lines; when b lies on the left side of a, the adjustment is equivalent to enlarging the target region to the left.
It should be noted that the area of the target region can likewise be expanded or reduced rightward, upward or downward when the user operation body moves in those directions in the image preview area; the specific implementation is analogous to the leftward expansion described above and is not repeated here.
As shown in fig. 12, after edge detection in the image preview area of the mobile phone, four edge lines 1-4 of the "notebook" are obtained, but because of lines on the "notebook" cover, a cover line in the image preview area is mistakenly recognized as an edge line, and the detected edge line 4 is not accurate. The user can therefore move a finger to the left near edge line 4 within the capture area of the phone's camera; the phone determines from the finger position and/or the leftward movement that edge line 4 is to be replaced, and determines the edge line 5 that replaces it. Edge line 5 may be the edge line with the highest edge confidence among the candidate edge lines for edge line 4 detected in the earlier edge detection, or the candidate closest to edge line 4, or the candidate located on the left side of and closest to edge line 4. Edge line 5 then replaces edge line 4 in the image preview area, edge lines 1 and 3 are extended adaptively, and edge line 5, the extended edge lines 1 and 3 and the original edge line 2 form a new "notebook" region, which can then be further processed, for example by character recognition.
In one implementation, the operation data obtained in step 201 may specifically be operation data of a user operation body on the touch screen where the image preview area is located. As shown in fig. 13, operation data containing the sliding direction and/or tap position of the operation body on the touch screen can be obtained from operations such as tapping or sliding performed on the touch screen that outputs the image preview area.
On this basis, when the adjustment instruction is obtained in step 202, it may be obtained from the operation trajectory of the user operation body in the operation data, such as the tap position of the operation body on the touch screen and/or its sliding direction there. The adjustment parameters contained in the instruction at least characterize the first edge line to be replaced, whose relative position in the target area corresponds to the sliding direction of the operation body on the touch screen, or to the tap position of the operation body on the touch screen. In addition, the instruction may further characterize the relative positional relationship between the edge line replacing the first edge line and the first edge line, which corresponds to the sliding direction of the operation body on the touch screen.
For example, in step 202 of this embodiment, according to the leftward sliding direction of the user operation body on the touch screen, the first edge line to be replaced is first determined to be the first edge line a on the left side of the target area in the image preview area output on the phone, as shown in fig. 13; in addition, the edge line replacing the first edge line a may be determined, from the same sliding direction, to lie on the left side of a and closest to a, thereby yielding the adjustment instruction. On this basis, in step 203 the first edge line a may be determined according to the adjustment instruction, and the second edge line b replacing it is then determined in the same ways as above: by edge confidence second only to a's, by closest position, or by the relative positional relationship characterized in the instruction. Finally, the second edge line b replaces the first edge line a, forming the adjusted target area together with the other edge lines; when b lies on the left side of a, the adjustment is equivalent to enlarging the target region to the left.
It should be noted that the area of the target region can likewise be expanded or reduced rightward, upward or downward when the user operation body slides in those directions on the touch screen; the specific implementation is analogous to the leftward expansion described above and is not repeated here.
As shown in fig. 14, after edge detection in the image preview area of the mobile phone, four edge lines 1-4 of the "notebook" are obtained, but because of lines on the "notebook" cover, a cover line in the image preview area is mistakenly recognized as an edge line, and the detected edge line 4 is not accurate. The user can therefore slide a finger to the left on the phone's touch screen near edge line 4; the phone determines from the finger position and/or the leftward slide that edge line 4 is to be replaced, and determines the edge line 5 that replaces it. Edge line 5 may be the edge line with the highest edge confidence among the candidate edge lines for edge line 4 detected in the earlier edge detection, or the candidate closest to edge line 4, or the candidate located on the left side of and closest to edge line 4. Edge line 5 then replaces edge line 4 in the image preview area, edge lines 1 and 3 are extended adaptively, and edge line 5, the extended edge lines 1 and 3 and the original edge line 2 form a new "notebook" region, which can then be further processed, for example by character recognition.
Referring to fig. 15, a schematic structural diagram of an image processing apparatus according to the second embodiment of the present disclosure is shown. The apparatus may be configured in an electronic device capable of image acquisition and edge detection, such as a mobile phone, pad, computer or server equipped with an image acquisition device such as a camera. The technical scheme in this embodiment mainly serves to reduce edge detection errors in the image and improve edge detection accuracy.
In a specific implementation, the apparatus in this embodiment may include the following structure:
an operation obtaining unit 1501 for obtaining operation data;
an instruction obtaining unit 1502 configured to obtain an adjustment instruction based on the operation data;
an edge line adjusting unit 1503, configured to perform position adjustment on at least one edge line of the target region according to the adjustment instruction, to obtain an adjusted target region, where the target region is a region in an image preview region output on the electronic device.
As can be seen from the above technical solutions, in the image processing apparatus according to the second embodiment of the present invention, after operation data for a target area in the image preview area output on the electronic device is obtained, an adjustment instruction for adjusting the position of at least one edge line of the target area is obtained based on the operation data, so that the position-adjusted edge lines form the adjusted target area. In this embodiment, the position of an edge line of the target area obtained through edge detection can therefore be adjusted using operation data in the image preview area, and the adjusted target area is formed once the edge lines are moved to the corresponding positions.
In one implementation, when performing position adjustment on at least one edge line of the target region according to the adjustment instruction, the edge line adjustment unit 1503 may perform:
determining at least one first edge line among the edge lines of the target area according to the adjustment instruction; determining, in the edge line set corresponding to the first edge line, a second edge line that at least meets an adjustment condition, wherein the edge line set comprises at least one candidate edge line corresponding to the first edge line; and replacing the first edge line corresponding to the second edge line in the target area with the second edge line to obtain the adjusted target area.
In an alternative, the adjustment condition includes: the second edge line is the edge line in the edge line set whose edge confidence is second only to that of the first edge line.
In one implementation, the operation obtaining unit 1501 is specifically configured to obtain motion parameters of the electronic device as the operation data.
Based on this, the instruction obtaining unit 1502 is specifically configured to obtain the moving direction of the electronic device according to the motion parameters of the electronic device in the operation data, and to obtain, according to the moving direction, an adjustment instruction that at least characterizes the first edge line to be replaced.
In one implementation, the operation obtaining unit 1501 is specifically configured to acquire, through an image acquisition device, gesture data of a user operation body in the image preview area, and to perform gesture recognition on the gesture data to obtain the operation data.
Based on this, the instruction obtaining unit 1502 is specifically configured to obtain, according to the gesture trajectory of the user operation body in the operation data, an adjustment instruction that at least characterizes the first edge line to be replaced.
In one implementation, the operation obtaining unit 1501 is specifically configured to obtain operation data of a user operation body on the touch screen where the image preview area is located.
Based on this, the instruction obtaining unit 1502 is specifically configured to obtain, according to the operation trajectory of the user operation body in the operation data, an adjustment instruction that at least characterizes the first edge line to be replaced.
It should be noted that, for the specific implementation of each unit in the present embodiment, reference may be made to the corresponding content in the foregoing, and details are not described here.
Referring to fig. 16, a schematic structural diagram of an electronic device according to the third embodiment of the present disclosure is shown. The electronic device may be one capable of image acquisition and edge detection, such as a mobile phone, pad, computer or server equipped with an image acquisition device such as a camera. The technical scheme in this embodiment mainly serves to reduce edge detection errors in the image and improve edge detection accuracy.
Specifically, the electronic device in this embodiment may include the following structure:
an image acquisition device 1601 for acquiring an image;
an output component 1602 for outputting an image preview area of the image acquisition device 1601, the image preview area containing a target area obtained by edge detection, the target area having a plurality of edge lines; the output component 1602 may be, for example, a display screen or a touch screen of the electronic device;

an input component 1603 for obtaining operation data;

the input component 1603 may be implemented by a gyroscope, a touch screen, an image acquisition device, or the like. When the operation data collected by the input component 1603 is gesture data, such as a gesture image of a user operation body in the image preview area, the input component 1603 and the image acquisition device 1601 are the same component. When the operation data collected by the input component 1603 is operation data of a user operation body on the touch screen, the input component 1603 and the output component 1602 are the same component; and

a processor 1604 for obtaining an adjustment instruction based on the operation data, adjusting the position of at least one edge line of the target area according to the adjustment instruction to obtain an adjusted target area, and outputting the adjusted target area to the output component 1602.
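As a non-authoritative sketch of how these components could be wired together (the class name and the two helper functions are hypothetical placeholders, not part of the patent):

```python
class EdgeAdjustingDevice:
    """Illustrative composition of the components numbered above."""

    def __init__(self, camera, display, input_component, detector):
        self.camera = camera          # image acquisition device 1601
        self.display = display        # output component 1602
        self.input = input_component  # input component 1603
        self.detector = detector      # edge detection backend

    def preview_step(self):
        frame = self.camera.capture()
        target_area = self.detector.detect(frame)   # edge lines of the area
        self.display.show(frame, target_area)       # image preview area
        operation_data = self.input.read()          # shake / gesture / touch
        if operation_data is not None:              # processor 1604
            instruction = build_instruction(operation_data)
            target_area = apply_instruction(target_area, instruction)
            self.display.show(frame, target_area)   # adjusted target area
        return target_area

def build_instruction(operation_data):
    """Hypothetical helper: derive the adjustment instruction from the raw
    operation data (see the motion and track sketches above)."""
    ...

def apply_instruction(target_area, instruction):
    """Hypothetical helper: replace the first edge line named by the
    instruction with a candidate, as in adjust_target_area above."""
    ...
```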
As can be seen from the foregoing technical solutions, in the electronic device provided in the third embodiment of the present application, after operation data for the target area in the image preview area output on the electronic device is obtained, an adjustment instruction for adjusting the position of at least one edge line of the target area can be obtained based on that operation data, so that the repositioned edge line forms the adjusted target area. In this embodiment, therefore, the position of an edge line of the target area obtained through edge detection can be adjusted using operation data in the image preview area, and the adjusted target area is formed once the edge line has been moved to the corresponding position.
The following describes a specific example of the implementation of the present application on a mobile phone:
Since most edge detection errors are caused by the influence of lighting or interfering lines, easy-to-operate gestures are designed in the present application: shaking the phone in one of six directions (up, down, left, right, forward, or backward), as shown by the shake gesture in fig. 17; defining up, down, left, right, zoom-in, and zoom-out gestures in front of the phone's preview interface; or defining up, down, left, right, zoom-in, and zoom-out gestures on the phone's touch screen, as shown by the touch screen gesture in fig. 18.
Based on this, as shown in the flow in fig. 19, after image-text edge detection is performed on the mobile phone and a corresponding adjustment gesture is detected, the gesture direction is determined, such as up, down, left, right, enlarging by one layer, or shrinking by one layer, and a corresponding adjustment instruction is derived for an edge line of the detected area, such as the edge line to be replaced and the adjustment direction. The phone then optimizes the result based on the detected edge lines and the adjustment instruction: it searches for the nearest edge in the gesture (input) direction, automatically moves the corresponding edge line up, down, left, or right, or enlarges or shrinks it by one layer, generating a new edge polyline, and displays the latest edge result on the preview interface. In this way, the edge lines in the edge detection result are corrected automatically (and the correction can, of course, be performed in a loop; for the correction effect, refer to the earlier figures such as fig. 10 and fig. 12). Subsequent processing such as trapezoidal correction is then performed; when no adjustment gesture is made, trapezoidal correction can be performed directly. This improves the accuracy of edge detection and reduces the operational complexity of correcting edge lines manually.
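The search for the nearest edge in the gesture direction described in this flow could look like the following sketch; representing candidate edges as line segments and comparing midpoints along the gesture axis are simplifying assumptions:

```python
from typing import List, Optional, Tuple

Edge = Tuple[Tuple[float, float], Tuple[float, float]]

def midpoint(edge: Edge) -> Tuple[float, float]:
    (x1, y1), (x2, y2) = edge
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def nearest_edge_in_direction(current: Edge, candidates: List[Edge],
                              direction: str) -> Optional[Edge]:
    """Among the detected candidate edges, return the one closest to the
    current edge line on the side the gesture points toward."""
    sign = {"up": -1, "down": 1, "left": -1, "right": 1}[direction]
    axis = 1 if direction in ("up", "down") else 0   # compare y or x

    def offset(edge: Edge) -> float:
        # Signed displacement of the candidate from the current edge,
        # measured between midpoints along the gesture axis.
        return midpoint(edge)[axis] - midpoint(current)[axis]

    on_side = [e for e in candidates if sign * offset(e) > 0]
    return min(on_side, key=lambda e: abs(offset(e))) if on_side else None
```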
Therefore, in this embodiment, edge adjustment can be achieved by shaking the mobile phone or by triggering a gesture, which is simple to implement and easy to use, so that a high-precision, manually assisted edge detection effect can be achieved through simple user interaction.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the others, and for the parts that are the same or similar, the embodiments may be referred to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is brief; for relevant details, refer to the description of the method.
Those skilled in the art will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An image processing method comprising:
obtaining operation data;
obtaining an adjustment instruction based on the operation data;
and adjusting the position of at least one edge line of a target area according to the adjustment instruction to obtain an adjusted target area, wherein the target area is an area in an image preview area output on an electronic device.
2. The method of claim 1, wherein adjusting the position of at least one edge line of the target area according to the adjustment instruction comprises:
determining at least one first edge line among the edge lines of the target area according to the adjustment instruction;

determining, in an edge line set corresponding to the first edge line, a second edge line that at least meets an adjustment condition, wherein the edge line set comprises at least one candidate edge line corresponding to the first edge line;

and replacing, in the target area, the first edge line corresponding to the second edge line with the second edge line to obtain the adjusted target area.
3. The method of claim 2, further comprising:
if the edge line set corresponding to the first edge line is an empty set, adjusting detection parameters of the edge detection and performing edge detection again to obtain candidate edge lines for the first edge line.
4. The method of claim 2, wherein the adjustment condition comprises:

the second edge line is the edge line in the edge line set whose confidence is second only to that of the first edge line.
5. The method of claim 2, wherein the adjustment instruction at least characterizes: the first edge line to be replaced.
6. The method of claim 5, wherein the adjustment instruction further characterizes: a relative positional relationship between the edge line that replaces the first edge line and the first edge line.
7. The method of claim 5, wherein the adjustment instruction further characterizes: a position adjustment direction of the first edge line;

wherein the edge line set corresponding to the first edge line is obtained by:

obtaining, from the candidate edge lines corresponding to the first edge line, the candidate edge lines corresponding to the position adjustment direction to form the edge line set of the first edge line.
8. The method of claim 5, wherein obtaining operation data comprises:

obtaining motion parameters of the electronic device as the operation data;

wherein obtaining an adjustment instruction based on the operation data comprises:

obtaining the moving direction of the electronic device according to the motion parameters of the electronic device in the operation data;

and obtaining the adjustment instruction according to the moving direction.
9. The method of claim 5, wherein obtaining operation data comprises:

acquiring, through an image acquisition device, gesture data of a user operation body in the image preview area;

performing gesture recognition on the gesture data to obtain the operation data;

wherein obtaining an adjustment instruction based on the operation data comprises:

obtaining the adjustment instruction according to the gesture track of the user operation body in the operation data.
10. An image processing apparatus comprising:

an operation obtaining unit for obtaining operation data;

an instruction obtaining unit for obtaining an adjustment instruction based on the operation data;

and an edge line adjusting unit for adjusting the position of at least one edge line of a target area according to the adjustment instruction to obtain an adjusted target area, wherein the target area is an area in an image preview area output on an electronic device.
CN202010363713.XA 2020-04-30 2020-04-30 Image processing method and device Pending CN111562877A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010363713.XA CN111562877A (en) 2020-04-30 2020-04-30 Image processing method and device

Publications (1)

Publication Number Publication Date
CN111562877A true CN111562877A (en) 2020-08-21

Family

ID=72070693

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010363713.XA Pending CN111562877A (en) 2020-04-30 2020-04-30 Image processing method and device

Country Status (1)

Country Link
CN (1) CN111562877A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1760822A (en) * 2004-10-15 2006-04-19 日本电气株式会社 Portable terminal and display control method thereof
US20070285681A1 (en) * 2006-06-07 2007-12-13 Sony Corporation Operation screen generating apparatus, printing apparatus, imaging apparatus, operation screen generating method, and computer program
CN104508614A (en) * 2012-08-01 2015-04-08 索尼公司 Display control device, display control method, and program
CN103809965A (en) * 2012-11-05 2014-05-21 株式会社三丰 Edge measurement video tool and interface including automatic parameter set alternatives
CN104914982A (en) * 2014-03-12 2015-09-16 联想(北京)有限公司 Control method and device of electronic equipment
CN108255300A (en) * 2014-03-12 2018-07-06 联想(北京)有限公司 The control method and device of a kind of electronic equipment
CN104504684A (en) * 2014-12-03 2015-04-08 小米科技有限责任公司 Edge extraction method and device
JP6485732B2 (en) * 2014-12-10 2019-03-20 株式会社リコー Information providing apparatus, information providing method, and information providing control program
CN105827894A (en) * 2015-01-28 2016-08-03 佳能株式会社 Information processing apparatus and information processing method
CN104808789A (en) * 2015-04-03 2015-07-29 孙建康 Non-contact control device for mobile terminal
CN107037953A (en) * 2015-09-07 2017-08-11 Lg电子株式会社 Display device and the method for controlling it
CN110674665A (en) * 2018-07-03 2020-01-10 杭州海康威视系统技术有限公司 Image processing method and device, forest fire prevention system and electronic equipment

Similar Documents

Publication Publication Date Title
US9807306B2 (en) Apparatus and method for photographing image in camera device and portable terminal having camera
CN107659769B (en) A kind of image pickup method, first terminal and second terminal
US9560271B2 (en) Removing unwanted objects from photographed image
JP4575829B2 (en) Display screen position analysis device and display screen position analysis program
CN111062312A (en) Gesture recognition method, gesture control method, device, medium and terminal device
US10291843B2 (en) Information processing apparatus having camera function and producing guide display to capture character recognizable image, control method thereof, and storage medium
CN105827952A (en) Photographing method for removing specified object and mobile terminal
US9807299B2 (en) Image capture methods and systems with positioning and angling assistance
US10990226B2 (en) Inputting information using a virtual canvas
JP6478654B2 (en) Imaging apparatus and control method thereof
US9535604B2 (en) Display device, method for controlling display, and recording medium
CN112492215B (en) Shooting control method and device and electronic equipment
US10586099B2 (en) Information processing apparatus for tracking processing
US10643095B2 (en) Information processing apparatus, program, and information processing method
CN112099689A (en) Interface display method and device, electronic equipment and computer readable storage medium
CN112584043A (en) Auxiliary focusing method and device, electronic equipment and storage medium
KR101503017B1 (en) Motion detecting method and apparatus
US10872263B2 (en) Information processing apparatus, information processing method and storage medium
CN110443772B (en) Picture processing method and device, computer equipment and storage medium
EP2498256A2 (en) Reproduction processing apparatus, imaging apparatus, reproduction processing method, and program
JP2013080266A (en) Input device
JP7028729B2 (en) Object tracking device, object tracking system, and object tracking method
US20130097543A1 (en) Capture-and-paste method for electronic device
CN111562877A (en) Image processing method and device
JP2017162179A (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination