CN117499549A - Scanning method and electronic equipment - Google Patents

Scanning method and electronic equipment

Info

Publication number
CN117499549A
CN117499549A (application CN202311785086.9A)
Authority
CN
China
Prior art keywords
electronic device
camera
electronic equipment
included angle
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311785086.9A
Other languages
Chinese (zh)
Inventor
王光旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202311785086.9A
Publication of CN117499549A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/04: Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N1/19: Scanning arrangements using multi-element arrays
    • H04N1/195: Scanning arrangements using multi-element arrays, the array comprising a two-dimensional array or a combination of two-dimensional arrays
    • H04N1/19594: Scanning arrangements using multi-element arrays, the array comprising a two-dimensional array or a combination of two-dimensional arrays, using a television camera or a still video camera
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The embodiments of this application disclose a scanning method and an electronic device, relating to the field of electronic devices. The method spares the user from having to adjust the position of the electronic device or of the photographed object when the device is not parallel to the plane of the photographed object, thereby reducing user operations. The specific scheme is as follows: determine a first included angle, which is the angle between the plane of the photographed object and the electronic device; based on the first included angle, determine a target angle by which the camera of the electronic device needs to rotate; control the camera to rotate by the target angle, after which the plane of the photographed object is parallel to the camera; receive the user's trigger operation on a confirm-scan control; in response to that operation, scan the photographed object through the camera rotated by the target angle to acquire scanning information of the photographed object; and obtain, from the scanning information, the electronic scan corresponding to the photographed object.

Description

Scanning method and electronic equipment
Technical Field
The present disclosure relates to the field of electronic devices, and in particular, to a scanning method and an electronic device.
Background
With the development of electronic device technology, more and more electronic devices offer a document scanning function. When this function is used, the electronic device can scan a photographed object such as a paper document, extract the information in it (such as text or pictures), and convert it into the corresponding electronic scan, which the user can then conveniently view and edit.
When the electronic device scans a paper photographed object with the document scanning function, the device needs to be kept roughly parallel to the plane of the photographed object so that it can obtain a clear electronic scan.
However, when the electronic device is not parallel to the plane of the photographed object, the electronic scan it obtains may be unclear. The user then has to adjust the position of the device or of the object until the two are parallel, which makes the operation cumbersome.
Disclosure of Invention
The embodiments of the present application provide a scanning method and an electronic device that spare the user from having to adjust the position of the electronic device or of the photographed object when the device is not parallel to the plane of the photographed object, thereby reducing user operations.
In order to achieve the above purpose, the embodiment of the application adopts the following technical scheme:
In a first aspect, an embodiment of the present application provides a scanning method applied to an electronic device that includes a document scanning function. The scanning method may include: receiving a user's operation to open the document scanning function; in response to that operation, displaying a scanning interface that includes a confirm-scan control; determining a first included angle, which is the angle between the plane of the photographed object and the electronic device; determining, based on the first included angle, a target angle by which the camera of the electronic device needs to rotate; controlling the camera to rotate by the target angle, after which the plane of the photographed object is parallel to the camera; receiving the user's trigger operation on the confirm-scan control; in response to that operation, scanning the photographed object through the camera rotated by the target angle to acquire scanning information of the photographed object; and obtaining, from the scanning information, the electronic scan corresponding to the photographed object.
Based on the method of the first aspect, before scanning the photographed object, the electronic device can determine in real time the target angle by which the camera needs to rotate and rotate the camera accordingly. After the rotation, the plane of the photographed object is parallel to the camera, so the entire photographed object falls within the depth of field range corresponding to the rotated camera. When the electronic device then scans the photographed object through the rotated camera, the whole object lies within that depth of field range, so a clear electronic scan can be obtained. The user therefore no longer needs to adjust the position of the electronic device or of the photographed object when the two are not parallel, which reduces user operations and improves the user experience.
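As a sketch, the sequence above can be expressed end to end in Python; the four callables are hypothetical stand-ins for device-specific sensor and camera APIs, and none of their names come from the patent:

```python
def scan_document(measure_first_angle, target_angle_for, rotate_camera, scan):
    """Illustrative end-to-end flow of the claimed method.

    The four callables are hypothetical stand-ins for device-specific
    sensor and camera APIs; the patent does not name any of them.
    """
    first_angle = measure_first_angle()    # angle between object plane and device
    theta = target_angle_for(first_angle)  # target rotation for the camera
    rotate_camera(theta)                   # camera is now parallel to the plane
    info = scan()                          # scan with the rotated camera
    return {"type": "electronic_scan", "content": info}

# Demo with trivial stand-ins for the hardware-facing callables:
result = scan_document(lambda: 12.0, lambda a: a, lambda t: None,
                       lambda: "page text")
print(result["content"])  # page text
```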
With reference to the first aspect, in another possible implementation manner, determining the first included angle may include: acquiring the distances between the electronic device and a plurality of detection points in the plane of the photographed object; and determining the first included angle from those distances.
Based on this implementation, the electronic device may determine the included angle between the plane of the photographed object and the electronic device from the distances between the device and a plurality of detection points in that plane. The target angle by which the camera of the electronic device needs to rotate, that is, the angle by which the camera needs to tilt, can then be determined from this included angle, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
With reference to the first aspect, in another possible implementation manner, the plurality of detection points may include a first target detection point and a second target detection point, and determining the first included angle from the distances between the electronic device and the plurality of detection points may include: determining a second included angle from the first and second target detection points, the second included angle being the angle between a first connecting line and a second connecting line, where the first connecting line joins the first target detection point and the electronic device and the second connecting line joins the second target detection point and the electronic device; and determining the first included angle from a first distance, a second distance, and the second included angle, where the first distance is the distance between the first target detection point and the electronic device and the second distance is the distance between the second target detection point and the electronic device.
Based on this implementation, the electronic device may determine the included angle between the plane of the photographed object and the electronic device from the distances to the first and second target detection points and the angle between the first and second connecting lines. The target angle by which the camera needs to rotate, that is, the angle by which the camera needs to tilt, can then be determined from this included angle, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
With reference to the first aspect, in another possible implementation manner, when the display screen of the electronic device is in a portrait display state, the first target detection point may be the detection point corresponding to the upper edge or the lower edge of the photographed object, and the second target detection point may be the detection point corresponding to the foot of the perpendicular from the electronic device to the plane of the photographed object.
Based on this implementation, the electronic device can accurately determine the included angle between the plane of the photographed object and the electronic device from the detection point at the upper or lower edge of the photographed object and the detection point at the foot of the device's perpendicular to that plane. The target angle by which the camera needs to rotate, that is, the angle by which the camera needs to tilt, can then be accurately determined, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
With reference to the first aspect, in another possible implementation manner, determining the first included angle from the first distance, the second distance, and the second included angle may include: determining the first included angle based on a first preset formula; in the first preset formula, α is the first included angle, H1 is the first distance, H2 is the second distance, and β is the second included angle.
Based on this implementation, the electronic device can accurately determine the included angle between the plane of the photographed object and the electronic device through the first preset formula. The target angle by which the camera needs to rotate, that is, the angle by which the camera needs to tilt, can then be accurately determined from this included angle, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
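The first preset formula itself appears only as an image in the source text and is not reproduced above. The sketch below derives one geometrically consistent candidate under an assumed setup (device at the origin, second detection point taken on the optical axis); both the arctangent form and the function name are assumptions, not the patent's exact formula. The symbols α, H1, H2, and β follow the surrounding definitions:

```python
import math

def first_included_angle(h1: float, h2: float, beta_deg: float) -> float:
    """Angle (degrees) between the object plane and the device plane.

    h1: first distance, to the first target detection point (an edge of
        the photographed object)
    h2: second distance, to the second target detection point (assumed
        here to lie on the camera's optical axis)
    beta_deg: second included angle, between the two lines of sight
    """
    beta = math.radians(beta_deg)
    # With the device at the origin and its optical axis through the
    # second detection point, the tilt of the line joining the two
    # detection points relative to the device plane is the first angle.
    alpha = math.atan2(h1 * math.cos(beta) - h2, h1 * math.sin(beta))
    return math.degrees(alpha)

# When h1*cos(beta) equals h2 the two points are level with the device
# plane, so the planes are parallel and the angle is 0:
print(first_included_angle(20.0, 10.0, 60.0))  # 0.0
```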
With reference to the first aspect, in another possible implementation manner, determining the first included angle may include: acquiring the phase difference between adjacent photodiode (PD) points of the electronic device; and determining the first included angle from that phase difference and a preset correspondence.
Based on this implementation, the electronic device can determine the included angle between the plane of the photographed object and the electronic device from the phase difference between adjacent PD points. The target angle by which the camera needs to rotate, that is, the angle by which the camera needs to tilt, can then be determined from this included angle, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
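A minimal sketch of the phase-difference lookup, assuming the preset correspondence is stored as an interpolation table; the breakpoint values below are invented for illustration and are not from the patent:

```python
import bisect

# Hypothetical correspondence between the phase difference of adjacent
# PD (photodiode) points and the first included angle, in degrees.
PHASE_DIFFS = [0.0, 0.5, 1.0, 1.5, 2.0]    # measured phase difference
ANGLES_DEG = [0.0, 5.0, 10.0, 15.0, 20.0]  # corresponding first angle

def angle_from_phase_difference(pd: float) -> float:
    """Linearly interpolate the preset phase-difference -> angle table."""
    if pd <= PHASE_DIFFS[0]:
        return ANGLES_DEG[0]
    if pd >= PHASE_DIFFS[-1]:
        return ANGLES_DEG[-1]
    i = bisect.bisect_right(PHASE_DIFFS, pd)
    x0, x1 = PHASE_DIFFS[i - 1], PHASE_DIFFS[i]
    y0, y1 = ANGLES_DEG[i - 1], ANGLES_DEG[i]
    return y0 + (y1 - y0) * (pd - x0) / (x1 - x0)

print(angle_from_phase_difference(0.75))  # midway between 5 and 10 -> 7.5
```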
With reference to the first aspect, in another possible implementation manner, determining, based on the first included angle, the target angle by which the camera of the electronic device needs to rotate may include: determining the object distance between the photographed object and the electronic device and the focal length corresponding to the camera; and determining the target angle from the object distance, the focal length, and the first included angle.
Based on this implementation, the electronic device can accurately determine the target angle by which its camera needs to rotate from the object distance between the photographed object and the electronic device, the focal length corresponding to the camera, and the included angle between the plane of the photographed object and the electronic device. In other words, the angle by which the camera needs to tilt can be accurately determined, so that after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
With reference to the first aspect, in another possible implementation manner, determining the target angle from the object distance between the photographed object and the electronic device, the focal length corresponding to the camera, and the first included angle may include: determining the target angle based on a second preset formula; the second preset formula relates the target angle θ to the first included angle μ, the object distance between the photographed object and the electronic device, and the focal length f corresponding to the camera of the electronic device.
Based on this implementation, the electronic device can, through the second preset formula, accurately determine the target angle by which its camera needs to rotate, that is, the angle by which the camera needs to tilt, from the object distance, the focal length, and the included angle between the plane of the photographed object and the electronic device, so that after the camera rotates by the target angle the plane of the photographed object is parallel to the camera.
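The second preset formula likewise survives only as an image in the source. As an illustrative assumption, a Scheimpflug-style relation tan(θ) = f·tan(μ)/(u − f) is sketched below; the symbol u for the object distance is our own choice, and the specific form is not taken from the patent:

```python
import math

def target_rotation_angle(mu_deg: float, u: float, f: float) -> float:
    """Camera tilt (degrees) needed so the focal plane meets the object plane.

    mu_deg: first included angle between the object plane and the device
    u: object distance; f: focal length (same units as u)
    Uses the assumed Scheimpflug-style relation
    tan(theta) = f * tan(mu) / (u - f).
    """
    mu = math.radians(mu_deg)
    theta = math.atan2(f * math.tan(mu), u - f)
    return math.degrees(theta)

# With a parallel object plane (mu = 0) no rotation is needed:
print(target_rotation_angle(0.0, 300.0, 26.0))  # 0.0
```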
In a second aspect, an embodiment of the present application provides a scanning apparatus that may be applied to an electronic device to implement the method of the first aspect. The functions of the scanning apparatus may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions above, for example, a receiving module, a display module, a determining module, a control module, and an acquiring module.
The receiving module may be used to receive a user's operation to open the document scanning function.
The display module may be used to display, in response to the user's operation to open the document scanning function, a scanning interface that includes a confirm-scan control.
The determining module may be used to determine a first included angle, which is the angle between the plane of the photographed object and the electronic device.
The determining module is further configured to determine, based on the first included angle, a target angle by which the camera of the electronic device needs to rotate.
The control module may be used to control the camera to rotate by the target angle; after the camera rotates by the target angle, the plane of the photographed object is parallel to the camera.
The receiving module is further used to receive the user's trigger operation on the confirm-scan control.
The acquiring module may be used to scan, in response to the user's trigger operation on the confirm-scan control, the photographed object through the camera rotated by the target angle, and acquire scanning information of the photographed object.
The determining module may further be used to obtain, from the scanning information, the electronic scan corresponding to the photographed object.
In a third aspect, a scanning apparatus is provided that has the functionality to implement the method of the first aspect. The functionality may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a fourth aspect, there is provided a scanning apparatus comprising: a processor and a memory; the memory is configured to store computer-executable instructions that, when executed by the scanning device, cause the scanning device to perform the scanning method of any of the first aspects.
In a fifth aspect, a photographing apparatus is provided, comprising a processor; the processor is configured to couple to a memory and, after reading the instructions in the memory, execute the scanning method according to any one of the first aspect.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by an electronic device, cause the electronic device to implement the scanning method as described in the first aspect or any one of the possible implementations of the first aspect.
In a seventh aspect, embodiments of the present application provide a computer program product comprising computer readable code which, when run in an electronic device, causes the electronic device to implement a scanning method as in the first aspect or any one of the possible implementations of the first aspect.
In an eighth aspect, an apparatus is provided (for example, the apparatus may be a chip system) comprising a processor for supporting an electronic device in implementing the functions of the first aspect. In one possible design, the apparatus further includes a memory for storing the program instructions and data necessary for the electronic device. When the apparatus is a chip system, it may consist of a chip, or may include the chip together with other discrete devices.
It should be appreciated that, for the advantageous effects of the second to eighth aspects, reference may be made to the description of the first aspect; details are not repeated here.
Drawings
FIG. 1 is a first schematic diagram of a display interface of a document scanning function of an electronic device;
FIG. 2 is a second schematic diagram of a display interface of a document scanning function of an electronic device;
FIG. 3 is a schematic diagram of a scanning method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of a scanning method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an included angle between a photographed object and an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a photographed object and a camera of an electronic device according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a depth of field range of a camera of an electronic device according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a scanning apparatus according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone.
With the development of electronic device technology, more and more electronic devices have a document scanning function, that is, a document mode. When the user uses this function and the electronic device opens the document scanning mode, the device can scan the photographed object, extract the information in it, and convert it into the corresponding electronic scan. After obtaining the electronic scan corresponding to the photographed object, the electronic device can display it so that the user can conveniently view and edit it.
The photographed object may be a document, a picture, a table, or the like. That is, it may be a paper document, picture, or form, or a document, image, or form displayed by another electronic device. The information in the photographed object is the text information, image information, form information, and the like that the document, image, or form contains.
When the electronic device scans the photographed object with the document scanning function, the device needs to be kept roughly parallel to the plane of the photographed object in order to obtain a clear electronic scan. When the two are kept relatively parallel, the depth of field range corresponding to the camera of the electronic device is large and the photographed object does not exceed it, so the electronic device can obtain a clear electronic scan through this larger depth of field range.
Depth of field (DOF) refers to the range of distances in front of the camera (i.e., the lens) of the electronic device within which a scene remains in acceptably sharp focus. The depth of field is affected by a number of factors, including the focal length, the aperture, and the distance from the camera to the photographed object. A smaller aperture and a shorter focal length generally produce a larger depth of field, so that both near and far scenes remain sharp, while a larger aperture and a longer focal length produce a shallow depth of field, in which only scenes within a specific distance range remain sharp and other areas are blurred.
When the photographed object lies entirely within the depth of field range corresponding to the camera, the electronic device can obtain a clear electronic scan; when part of the photographed object lies outside the depth of field range, the electronic device cannot obtain a clear electronic scan of that part of the content.
That is, when the electronic device and the plane of the photographed object are kept relatively parallel, the depth of field range corresponding to the camera is large, and the photographed object can lie entirely within it. When the photographed object is scanned (which may also be called photographed) within this large depth of field range, the electronic device can obtain a clear electronic scan (i.e., the electronic version of the photographed object) through the camera.
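The relationship between aperture, focal length, and depth of field described above can be made concrete with the standard thin-lens depth-of-field formulas; the focal length, f-numbers, and circle-of-confusion value below are assumed example figures, not taken from the patent:

```python
def depth_of_field(f_mm: float, n: float, u_mm: float, c_mm: float = 0.005):
    """Near/far limits of acceptable focus (standard thin-lens formulas).

    f_mm: focal length, n: f-number (aperture), u_mm: focus distance,
    c_mm: circle of confusion (sensor-dependent; 0.005 mm is an assumed
    value for a small sensor).
    """
    h = f_mm * f_mm / (n * c_mm) + f_mm  # hyperfocal distance
    near = h * u_mm / (h + (u_mm - f_mm))
    # Beyond the hyperfocal distance everything to infinity is sharp.
    far = h * u_mm / (h - (u_mm - f_mm)) if u_mm < h else float("inf")
    return near, far

# A smaller aperture (larger f-number) widens the in-focus range:
near1, far1 = depth_of_field(26.0, 1.8, 300.0)
near2, far2 = depth_of_field(26.0, 8.0, 300.0)
assert (far2 - near2) > (far1 - near1)
```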
For example, take a paper document as the photographed object: when an electronic device (such as a mobile phone) uses the document scanning function to scan it while the device and the plane of the document are kept relatively parallel, the device can obtain a clear electronic document (i.e., the electronic scan corresponding to the photographed object). This case is described schematically below.
After the user opens the document scanning mode on the electronic device, the electronic device may display a document scanning interface 101, as shown in (a) of FIG. 1, and the document scanning interface 101 may include a scanning control 102. Thereafter, the user may scan (i.e., photograph) the paper document 103 using the electronic device. To obtain a clear electronic version document, the user can keep the electronic device relatively parallel to the plane of the paper document 103, so that the depth of field range corresponding to the camera is large and the paper document 103 lies entirely within it.
After keeping the electronic device relatively parallel to the plane of the paper document 103, the user triggers the scan control 102 so that the electronic device scans the paper document 103. That is, when the electronic device receives the user's trigger operation on the scan control 102, such as a click, it may scan the paper document 103 in response to obtain the document information (such as text information) corresponding to the paper document 103. The electronic device may then convert that document information into an electronic version of the document.
When the electronic device is kept relatively parallel to the plane of the paper document 103, the depth of field range corresponding to the camera is large. Since the paper document 103 lies entirely within this large depth of field range, the electronic device can obtain through the camera the clear electronic version document 104 shown in (b) of FIG. 1.
However, when the electronic device is not parallel to the plane in which the photographed object lies, the electronic version of the photographed object obtained by the electronic device is unclear. As a result, the user needs to adjust the position of the electronic device or of the photographed object so that the electronic device is parallel to the plane in which the photographed object lies, which makes the operation troublesome for the user.
That is, when the electronic device is not parallel to the plane in which the photographed object lies (i.e., the electronic device forms a certain included angle with that plane), part of the photographed object may exceed the depth of field range corresponding to the camera; for example, the portion of the photographed object near the upper side or near the lower side may exceed that range. For the portion of the photographed object that exceeds the depth of field range, the electronic device cannot obtain a clear corresponding electronic scanning piece.
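The depth of field range described above can be estimated with the standard thin-lens near/far-limit formulas. The sketch below is illustrative only: the focal length, f-number, circle of confusion, page size, and distances are assumed values typical of a phone camera, not values taken from this application. It shows how tilting the device moves the top and bottom of a page outside the in-focus band.

```python
import math

def dof_limits(u_mm, f_mm=5.0, N=1.8, c_mm=0.003):
    """Near/far depth-of-field limits for focus distance u_mm (thin-lens model).

    f_mm: focal length, N: f-number, c_mm: circle of confusion --
    all assumed values typical of a phone camera module.
    """
    H = f_mm ** 2 / (N * c_mm) + f_mm          # hyperfocal distance
    near = u_mm * (H - f_mm) / (H + u_mm - 2 * f_mm)
    far = u_mm * (H - f_mm) / (H - u_mm) if u_mm < H else math.inf
    return near, far

# Focused at 300 mm: the in-focus band is [near, far] around the page.
near, far = dof_limits(300.0)

# If the phone is tilted by 30 degrees, the top and bottom of a 280 mm-tall
# page sit at different distances from the camera; the parts outside
# [near, far] render blurred.
tilt = math.radians(30.0)
top_dist = 300.0 - 140.0 * math.sin(tilt)
bottom_dist = 300.0 + 140.0 * math.sin(tilt)
print(near <= top_dist <= far, near <= bottom_dist <= far)
```

With these assumed numbers both page ends fall outside the band, which is the blur the application attributes to the included angle.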
For example, consider the case where an electronic device (such as a mobile phone) uses a document scanning function to scan the photographed object while the electronic device is not parallel to the plane in which the photographed object lies (i.e., the electronic device forms a certain included angle with that plane); the electronic device then cannot obtain a clear electronic version document. This case is described schematically below.
After a user opens a document scanning mode on the electronic device, the electronic device may display a document scanning interface 201, as shown in fig. 2 (a), and the document scanning interface 201 may include a scanning control 202. Thereafter, the user may scan (i.e., photograph) the paper document 203 using the electronic device.
When the electronic device is at an angle to the plane of the paper document 203, the user may trigger the scan control 202 so that the electronic device may scan the paper document 203. That is, when the electronic device receives a triggering operation, such as a clicking operation, of the scanning control 202 by the user, the electronic device may scan the paper document 203 in response to obtain document information (such as text information) corresponding to the paper document 203. After the electronic device obtains the document information corresponding to the paper document 203, the electronic device may perform format conversion on the document information corresponding to the paper document 203, such as converting to an electronic version of the document. Thereafter, as shown in (b) of fig. 2, the electronic device may display an electronic version document 204 corresponding to the paper document 203.
When the electronic device forms a certain included angle with the plane of the paper document 203, part of the paper document 203 exceeds the depth of field range corresponding to the camera; for example, the upper part or the lower part of the paper document 203 exceeds that range. For the part of the content that exceeds the depth of field range, the electronic device cannot obtain a clear corresponding electronic scanning piece; that is, as shown in (b) of fig. 2, the electronic version document 204 obtained by the electronic device has low definition.
When the definition of the electronic version document 204 obtained by the electronic device is low, the user needs to adjust the position of the paper document 203 or of the electronic device (such as a mobile phone) so that the plane of the paper document 203 is relatively parallel to the electronic device; only then can the paper document 203 lie entirely within the depth of field range corresponding to the camera, and only then can the electronic device obtain a clear electronic scanning piece. This makes the operation troublesome for the user.
In view of the above problems, an embodiment of the present application provides a scanning method applied to an electronic device. The scanning method may include: after the electronic device receives the user's operation of opening a document scanning application, the electronic device may acquire the included angle between the plane in which the photographed object lies and the electronic device. The electronic device may then determine, from this included angle, the target angle by which the camera needs to rotate, and rotate the camera by the target angle, so that the photographed object can lie within the depth of field range corresponding to the camera after the camera has rotated by the target angle. After the electronic device receives the user's triggering operation on the scanning control, the electronic device may scan the photographed object through the camera rotated by the target angle, thereby obtaining the electronic scanning piece corresponding to the photographed object.
According to this scheme, before the electronic device scans the photographed object, the electronic device can determine in real time the target angle by which the camera needs to rotate and rotate the camera by that angle, so that the photographed object lies within the depth of field range corresponding to the rotated camera. Thus, when the electronic device scans the photographed object through the rotated camera, the photographed object can lie entirely within that depth of field range, so that a clear electronic scanning piece can be obtained. Therefore, even when the electronic device is not parallel to the plane in which the photographed object lies, the user does not need to adjust the position of the electronic device or of the photographed object (the photographed object can still lie entirely within the depth of field range corresponding to the camera), which avoids troublesome operation and improves the user experience.
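The flow described above can be sketched schematically. Every class, method, and constant below is hypothetical: the application text describes these device capabilities only abstractly, and the 0.1 scale factor merely stands in for whatever angle mapping the method actually uses.

```python
class ScannerDevice:
    """Minimal stand-in for the electronic device; all names are hypothetical."""

    def __init__(self, included_angle_deg):
        self.included_angle_deg = included_angle_deg
        self.camera_tilt_deg = 0.0

    def angle_to_subject_plane(self):
        # In the application this comes from the device's sensors.
        return self.included_angle_deg

    def target_camera_tilt(self, alpha_deg):
        # Placeholder mapping: the application derives this via an optical
        # relation between the included angle and the lens tilt.
        return alpha_deg * 0.1

    def rotate_camera(self, theta_deg):
        self.camera_tilt_deg = theta_deg        # motor drives the lens tilt


def scan_document(device):
    """Schematic pipeline of the scanning method."""
    alpha = device.angle_to_subject_plane()     # acquire the included angle
    theta = device.target_camera_tilt(alpha)    # compute the target angle
    device.rotate_camera(theta)                 # subject now inside the DoF
    return {"camera_tilt_deg": device.camera_tilt_deg}


print(scan_document(ScannerDevice(30.0)))
```

The point of the sketch is only the ordering: angle acquisition and lens rotation happen before the capture that produces the electronic scanning piece.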
The principle of the scanning method provided in the embodiment of the present application is schematically described below with reference to fig. 3. As shown in fig. 3, the scanning method provided in the embodiment of the present application may include the following procedure.
The electronic device first enters a document mode. That is, after the electronic device receives the user's operation of opening a document scanning application, the electronic device may display a document scanning interface and enter the document mode.
Then, the electronic device may acquire the included angle between the scanned plane and the device, i.e., the electronic device may determine the included angle between the plane in which the photographed object lies and the electronic device. It should be noted that, in its initial state, the camera of the electronic device is parallel to the electronic device, so the included angle between the plane of the photographed object and the electronic device is also the included angle between the camera of the electronic device and the plane of the photographed object.
After the electronic device acquires the included angle between the scanned plane and the device, the electronic device may determine in real time, from the acquired included angle, the target angle θ by which the camera needs to rotate. That is, the electronic device may calculate the target angle θ from the acquired included angle according to the Scheimpflug principle.
In addition, when calculating the target tilt angle of the camera, the Scheimpflug principle can be understood as using the camera's optical axis, the center of the field of view, and the position information of the target object in the image to calculate the tilt angle of the camera relative to the horizontal plane or the vertical plane.
Specifically, the Scheimpflug principle can be used in camera calibration to determine the position and attitude of the optical elements inside the camera, to perform distortion correction, and so on. Treating the camera as an ideal pinhole model, the position and attitude information of the target object in the image can be calculated by using the geometric relationships in the Scheimpflug principle.
When calculating the target tilt angle of the camera, the position and attitude information of the target object in the image first needs to be determined, and then the tilt angle of the camera relative to the horizontal plane or the vertical plane is calculated by using the geometric relationships in the Scheimpflug principle. This tilt angle may be used to further analyze the position and attitude information of the target object, or to achieve more accurate image processing and recognition tasks.
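As a concrete, simplified illustration: under a thin-lens, two-dimensional reading of the Scheimpflug condition, the subject plane, the lens plane, and the image plane must meet in a common line. With image distance v behind the lens, subject distance u in front of it, and included angle α between the subject plane and the sensor plane, that intersection condition gives tan(θ) = v·tan(α) / (u + v) for the lens tilt θ. This derivation and all the numbers below are the editor's assumption, not a formula stated in the application.

```python
import math

def lens_tilt_deg(alpha_deg, u_mm, v_mm):
    """Lens tilt satisfying the Scheimpflug condition (thin-lens, 2-D model).

    alpha_deg: included angle between the subject plane and the sensor plane
    u_mm: subject distance in front of the lens
    v_mm: image distance behind the lens (roughly the focal length)
    """
    alpha = math.radians(alpha_deg)
    theta = math.atan(v_mm * math.tan(alpha) / (u_mm + v_mm))
    return math.degrees(theta)

# A phone tilted 30 degrees over a page 300 mm away, with ~5 mm image
# distance, needs only a fraction of a degree of lens tilt.
print(round(lens_tilt_deg(30.0, 300.0, 5.0), 3))
```

The small result reflects the short image distance of a phone camera: a large device tilt maps to a small lens tilt, which is consistent with the limited mechanical travel a camera-module motor can provide.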
After the electronic device determines the target angle θ by which the camera needs to rotate, the electronic device may drive, through the motor corresponding to the camera (i.e., the lens), the lens to tilt by the target angle. In this way, the photographed object can lie entirely within the depth of field range corresponding to the camera after the camera has rotated by the target angle.
Then, the electronic device may scan (which may also be referred to as photograph) the photographed object through the camera rotated by the target angle, so that the electronic device completes the document scanning. That is, the electronic device can scan the photographed object through the rotated camera to obtain the information of the photographed object, and perform format conversion to obtain the electronic scanning piece corresponding to the photographed object.
When the electronic device scans the photographed object through the camera rotated by the target angle, the photographed object can lie within the depth of field range corresponding to the rotated camera, so that a clear electronic scanning piece corresponding to the photographed object can be obtained. Therefore, even when the electronic device is not parallel to the plane in which the photographed object lies, the user does not need to adjust the position of the electronic device or of the photographed object (the photographed object can still lie entirely within the depth of field range corresponding to the camera), which avoids troublesome operation and improves the user experience.
The scanning method provided in the embodiment of the present application is described below.
The scanning method provided by the embodiment of the application can be applied to the electronic equipment. In some embodiments, the electronic device may be a cell phone, tablet, handheld computer, personal computer (personal computer, PC), cellular telephone, personal digital assistant (personal digital assistant, PDA), or the like, including an adjustable angle camera, and including a document scanning application. The embodiment of the present application does not limit the specific form of the electronic device herein.
By way of example, taking an electronic device as a mobile phone, fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 4, the electronic device may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (universal serial bus, USB) interface 430, a charge management module 440, a power management module 441, a battery 442, an antenna 1, an antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, an earphone interface 470D, a sensor module 480, keys 490, a motor 491, an indicator 492, a camera 493, a display screen 494, and a subscriber identity module (subscriber identification module, SIM) card interface 495, etc. Among other things, the sensor module 480 may include a pressure sensor 480A, a gyroscope sensor 480B, an air pressure sensor 480C, a magnetic sensor 480D, an acceleration sensor 480E, a distance sensor 480F, a proximity light sensor 480G, a fingerprint sensor 480H, a temperature sensor 480J, a touch sensor 480K, an ambient light sensor 480L, a bone conduction sensor 480M, and the like.
It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 410 may include one or more processing units, such as: the processor 410 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 410 for storing instructions and data. In some embodiments, the memory in the processor 410 is a cache memory. The memory may hold instructions or data that the processor 410 has just used or recycled. If the processor 410 needs to reuse the instruction or data, it may be called directly from the memory. Repeated accesses are avoided, reducing the latency of the processor 410 and thus improving the efficiency of the system.
In some embodiments, processor 410 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 450 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on an electronic device. The mobile communication module 450 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 450 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 450 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 450 may be disposed in the processor 410. In some embodiments, at least some of the functional modules of the mobile communication module 450 may be disposed in the same device as at least some of the modules of the processor 410.
The wireless communication module 460 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device. The wireless communication module 460 may be one or more devices that integrate at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 410. The wireless communication module 460 may also receive a signal to be transmitted from the processor 410, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 450 of the electronic device are coupled, and antenna 2 and wireless communication module 460 are coupled, such that the electronic device may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others.
The electronic device implements display functions through the GPU, the display screen 494, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 494 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 410 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 494 is used to display images, videos, and the like. The display screen 494 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 494, N being a positive integer greater than 1.
The electronic device may implement shooting functions through the ISP, the camera 493, the video codec, the GPU, the display screen 494, the application processor, and the like. In some embodiments, the electronic device may include 1 or N cameras 493, N being a positive integer greater than 1.
The internal memory 421 may be used to store computer-executable program code that includes instructions. The processor 410 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 421. The internal memory 421 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) and the like established during use of the electronic device. In addition, the internal memory 421 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The acceleration sensor 480E may periodically collect acceleration data of the electronic device at a certain frequency. For example, the magnitude of acceleration of the electronic device in various directions (typically XYZ three axes) may be collected.
It will be understood, of course, that the above illustration of fig. 4 is merely exemplary of the case where the electronic device is in the form of a cellular phone. If the electronic device is a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch, a smart bracelet), etc., the electronic device may include fewer structures than those shown in fig. 4, or may include more structures than those shown in fig. 4, which is not limited herein.
It will be appreciated that in general, implementation of electronic device functions requires software in addition to hardware support.
The software system of the electronic device may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, the Android system with a layered architecture is taken as an example, and the software structure of the electronic equipment is illustrated.
Fig. 5 is a software structural block diagram of an electronic device provided in an embodiment of the present application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer (also referred to as the system framework java layer), the Android runtime (also referred to as the Native layer), and a system library (also referred to as the kernel layer).
The application layer may include a series of application packages. As shown in FIG. 5, the application layer package may include a photo application (e.g., camera), gallery, calendar, phone, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
As shown in FIG. 5, the application layer may include a document scanning application. Document scanning applications may be used to convert paper documents, pictures, forms, etc. into digital documents, pictures, forms (i.e., electronic version of documents, pictures, forms) for storage, transmission, and editing. The document scanning application may also be used to convert documents, pictures, forms, etc. displayed on other electronic devices into digital documents, pictures, forms (i.e., electronic version of documents, pictures, forms, which may also be referred to as electronic scan pieces) for storage, transmission, and editing.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is for providing communication functions of the electronic device. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages, which automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
As shown in fig. 5, the application framework layer may also include an angle adjustment module.
The angle adjustment module may be used to execute the scanning method provided in the embodiments of the present application. The angle adjusting module can be used for acquiring the included angle between the plane where the shot object is located and the electronic equipment.
The angle adjustment module is also used for determining a target angle required to rotate by the camera according to the included angle between the plane where the shot object is and the electronic equipment.
The angle adjusting module can also be used for sending the target angle to the drive corresponding to the camera, so that the drive corresponding to the camera can drive the camera to rotate the target angle.
Note that the drive corresponding to the camera (i.e., the motor drive) may include electromagnetic driving, piezoelectric driving, shape memory alloy (shape memory alloy, SMA) driving, and the like. The embodiment of the present application does not limit the specific type of the drive corresponding to the camera, as long as it can drive the camera to rotate by the target angle.
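A minimal sketch of how the angle adjustment module might hand the target angle to such a drive. The interface, the mechanical travel limit, and the actuator step size are all invented for illustration; real camera-module drivers expose none of these names.

```python
class CameraTiltDrive:
    """Illustrative motor-drive stub (hypothetical interface)."""

    MAX_TILT_DEG = 10.0   # assumed mechanical travel limit of the module
    STEP_DEG = 0.05       # assumed smallest actuator increment

    def __init__(self):
        self.tilt_deg = 0.0

    def rotate_to(self, target_deg):
        # Clamp to the mechanical range, then quantize to whole actuator steps.
        clamped = max(-self.MAX_TILT_DEG, min(self.MAX_TILT_DEG, target_deg))
        steps = round(clamped / self.STEP_DEG)
        self.tilt_deg = steps * self.STEP_DEG
        return self.tilt_deg


drive = CameraTiltDrive()
print(drive.rotate_to(0.542))   # quantized to the nearest actuator step
```

Clamping and quantization are why the achieved tilt can differ slightly from the computed target angle θ; the angle adjustment module only needs the drive to land close enough that the subject stays within the depth of field.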
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The methods in the following embodiments may be implemented in an electronic device having the above-described hardware structure or software structure.
The scanning method provided in the embodiment of the present application is described in detail below with reference to fig. 6, and the scanning method may be applied to an electronic device (such as a mobile phone). As shown in fig. 6, the scanning method may include S601 to S608 described below.
S601, the electronic equipment receives an opening operation of a document scanning function by a user.
When a user needs to scan a photographed object by using an electronic device, so as to obtain the electronic scanning piece corresponding to the photographed object (i.e., an electronic version of the photographed object), the user may trigger the electronic device to open a scanning function in the electronic device (in the embodiment of the present application, the scanning function may also be referred to as a document scanning function).
The document scanning function may be a built-in application program of the electronic device, or may be a third-party application program. This is not limited in the embodiments of the present application.
The document scanning function may be a function in a separate application or other application, and may be referred to as a scanning mode when the document scanning function may be a function in other application. For example, the document scanning function may be a function of a camera application of the electronic device.
That is, the opening operation of the document scanning function by the user may be the opening operation of the application program having the document scanning function by the user or the opening operation of the scanning mode in other application programs.
The photographed object may be a document, a picture, a table, or the like, and the specific type of the photographed object is not limited in the embodiment of the present application.
The photographed object may be a paper document, picture, or form, or may be a document, image, or form displayed by another electronic device. When the photographed object is a document, the information in the photographed object is the text information, image information, form information, or the like in the document; when it is an image, the information is the text information, image information, form information, or the like in the image; and when it is a form, the information is the text information, image information, form information, or the like in the form. In the embodiments of the present application, a paper document is taken as the example of the photographed object; that is, the photographed object may include information such as the text information, image information, or table information in the paper document.
S602, in response to the opening operation of the user on the document scanning function, the electronic device displays a scanning interface, where the scanning interface includes a scanning control.
When the electronic device receives an opening operation of the document scanning function by a user, the electronic device may display a document scanning interface in response. The document scanning interface may include a scanning control. The scan control may be used to trigger the electronic device to scan the photographed object (in the embodiment of the present application, the scan control may also be referred to as a determining scan control).
It should be noted that, when the electronic device displays the scan interface, the electronic device may start the camera included in the electronic device. Therefore, a user can scan the shot object by using the camera of the electronic device.
After the electronic device starts the camera of the electronic device and the user uses the camera of the electronic device to aim at the shot object, the electronic device can also display a preview picture of the shot object in the displayed scanning interface.
S603, the electronic equipment determines an included angle between the plane where the shot object is located and the electronic equipment.
After the electronic device displays the scanning interface including the scanning control, the electronic device may determine the included angle between the plane where the photographed object is located and the electronic device, that is, the included angle between the photographed object and the electronic device (in the embodiment of the present application, this included angle may also be referred to as a first included angle). In this way, the electronic device can determine, according to the included angle between the plane where the photographed object is located and the electronic device, the target angle by which the camera of the electronic device needs to rotate.
It should be noted that, because the camera of the electronic device is initially parallel to the electronic device, the included angle between the plane where the photographed object is located and the electronic device is also the included angle between the camera of the electronic device and that plane.
In some examples, the electronic device determining the included angle between the plane where the photographed object is located and the electronic device may include: the electronic device acquires, through a direct time-of-flight (direct time of flight, DTOF) device included in the electronic device, the distances between the electronic device and a plurality of detection points on the plane where the photographed object is located. Then, the electronic device may determine the included angle between the electronic device and the plane where the photographed object is located according to these distances.
The DTOF device on the electronic device may detect distance and position information from the photographed object (i.e., the plane where the photographed object is located) to the electronic device by using 3D sensing technology. The DTOF device can emit short laser pulses on the order of nanoseconds or even picoseconds toward a plurality of detection points on the plane where the photographed object is located, and detect the time from emission to return of each pulse. The electronic device can then calculate the distance between each detection point and the electronic device from that emit-to-return time and the speed of light. In this way, the electronic device can determine the included angle between the electronic device and the plane where the photographed object is located according to the distances between the electronic device and the plurality of detection points. That is, the electronic device determining the included angle may include: the electronic device first acquires the distances between the electronic device and a plurality of detection points on the plane where the photographed object is located, and then determines the first included angle according to those distances.
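As a rough illustration (not the patent's implementation), the emit-to-return time t of each laser pulse maps to a one-way distance d = c·t/2. A minimal Python sketch, with hypothetical pulse times for two detection points:

```python
# Sketch: converting DTOF round-trip times to per-detection-point
# distances, using d = c * t / 2 (half the round trip).
C = 299_792_458.0  # speed of light, in m/s

def tof_to_distance(round_trip_time_s):
    """Distance to one detection point from the laser's emit-to-return time."""
    return C * round_trip_time_s / 2.0

# Hypothetical emit-to-return times for two detection points, in nanoseconds.
times_ns = [2.0, 2.4]
distances_m = [tof_to_distance(t * 1e-9) for t in times_ns]
# A 2 ns round trip corresponds to roughly 0.30 m.
```

A longer round-trip time yields a proportionally larger distance, which is what lets the device compare detection points on a tilted plane.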
In some examples, when the user holds the electronic device vertically (i.e., the display screen of the electronic device is in portrait display), the plurality of detection points on the plane where the photographed object is located may include a detection point corresponding to the upper edge or the lower edge of the photographed object (in the embodiment of the present application, this detection point may also be referred to as a first target detection point), and a detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located (in the embodiment of the present application, this detection point may also be referred to as a second target detection point).
When the user holds the electronic device horizontally (i.e., the display screen of the electronic device is in landscape display), the plurality of detection points may include a detection point corresponding to the left edge or the right edge of the photographed object, and the detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located.
The embodiment of the present application does not limit the specific manner in which the user holds the electronic device vertically or horizontally. The following description takes the case where the user holds the electronic device vertically as an example. In other words, the plurality of detection points include the detection point corresponding to the upper edge of the photographed object and the detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located.
That is, through the DTOF device, the electronic device can detect the distance between the electronic device and the detection point corresponding to the upper edge of the photographed object, and the distance between the electronic device and the detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located. The electronic device can also obtain, through measurement, the included angle between the connection line from the electronic device to the detection point corresponding to the upper edge of the photographed object (in the embodiment of the present application, this connection line may also be referred to as a first connection line) and the connection line from the electronic device to the detection point corresponding to the vertical line of the electronic device on that plane (in the embodiment of the present application, this connection line may also be referred to as a second connection line). In the embodiment of the present application, the included angle between the first connection line and the second connection line may also be referred to as a second included angle.
Then, the electronic device may determine the included angle between the electronic device and the plane where the photographed object is located according to the distance between the electronic device and the detection point corresponding to the upper edge of the photographed object (in the embodiment of the present application, this distance may also be referred to as a second distance), the distance between the electronic device and the detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located (in the embodiment of the present application, this distance may also be referred to as a first distance), and the second included angle.
That is, the electronic device may determine the first included angle according to the first distance, the second distance, and the second included angle. When the first distance and the second distance are determined, both should be measured to the same target position on the electronic device: the second distance is the distance between the detection point corresponding to the upper edge of the photographed object and that target position, and the first distance is the distance between the detection point corresponding to the vertical line of the electronic device on the plane where the photographed object is located and the same target position. The specific target position on the electronic device may be set according to the actual situation, which is not limited in the embodiment of the present application.
The following schematically describes, with reference to fig. 7, the process in which the electronic device determines the included angle between the electronic device and the plane where the photographed object is located according to the first distance, the second distance, and the second included angle.
As shown in fig. 7, after the electronic device turns on the document scanning function, the electronic device may detect, through the DTOF device, the distance H2 between the electronic device and the detection point 702 corresponding to the upper edge of the plane where the photographed object is located, and the distance H1 between the electronic device and the detection point 701 corresponding to the vertical line of the electronic device on that plane. The electronic device can also obtain, through measurement, the included angle φ2 between the connection line of length H2 and the connection line of length H1 (i.e., the second included angle).
Then, the electronic device may determine the included angle φ1 between the electronic device and the plane where the photographed object is located (i.e., the first included angle) according to H1, H2, and φ2.
The relationship among H1, H2, φ2, and φ1 can be expressed by the following formula one (in the embodiment of the present application, formula one may also be referred to as a first preset formula). In the triangle formed by the two connection lines and the plane where the photographed object is located, the connection line of length H1 meets the plane at detection point 701 at an angle of 90° + φ1, from which:
Formula one: tan φ1 = (H2 × cos φ2 − H1) / (H2 × sin φ2)
wherein φ1 is the included angle between the electronic device and the plane where the photographed object is located, H2 is the distance between the electronic device and the detection point corresponding to the upper edge of the plane where the photographed object is located, H1 is the distance between the electronic device and the detection point corresponding to the vertical line of the electronic device on that plane, and φ2 is the included angle between the connection line of length H2 and the connection line of length H1.
That is, when the electronic device determines the first included angle according to the first distance, the second distance, and the second included angle, the method may include: the electronic equipment determines a first included angle based on a first preset formula according to the first distance, the second distance and the second included angle.
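A minimal sketch of this computation, assuming the relation tan φ1 = (H2·cos φ2 − H1)/(H2·sin φ2), which follows from the triangle geometry described above (the patent's own formula is not reproduced here, so this expression is a reconstruction):

```python
import math

# Sketch: recover the first included angle from H1 (first distance, along
# the device's vertical line to the plane), H2 (second distance, to the
# upper-edge detection point), and the second included angle between the
# two connection lines.
def first_angle(h1, h2, angle2_rad):
    """Angle between the electronic device and the plane, in radians."""
    return math.atan2(h2 * math.cos(angle2_rad) - h1,
                      h2 * math.sin(angle2_rad))

# Sanity check: when H1 == H2 * cos(angle2), the plane is parallel to the
# device and the first included angle is zero.
parallel = first_angle(math.cos(0.3), 1.0, 0.3)
```

`math.atan2` is used so the sign of the numerator (whether the upper edge is nearer or farther than H1's projection would predict) gives the sign of the tilt.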
In other examples, the determining, by the electronic device, an angle between the plane in which the object is photographed and the electronic device may include determining, by the electronic device, an angle between the electronic device and the plane in which the object is photographed based on a phase difference between adjacent Photodiode (PD) points.
A photodiode in an electronic device is a device capable of converting an optical signal into an electrical signal. In the camera of an electronic device, PD points are often used for phase detection: the difference in phase between adjacent PD points reflects the focus state of the corresponding region of the scene. The electronic device may include a plurality of PD points.
It should be noted that the electronic device may pre-store a correspondence relationship (in the embodiment of the present application, also referred to as a preset correspondence relationship) between different included angles between the electronic device and the plane where the photographed object is located, and the phase difference between adjacent PD points of the electronic device. The phase difference between adjacent PD points is the difference in phase between different PD points, and can be used to measure the image sharpness of the corresponding region of the scene. When the electronic device includes a plurality of PD points, the adjacent PD points may be any two adjacent PD points among the plurality of PD points, or two preset adjacent PD points. That is, when the distance between the electronic device and the plane where the photographed object is located is fixed, different included angles between them produce different phase differences between the adjacent PD points.
After the electronic device turns on the document scanning function, the electronic device may acquire a phase difference between adjacent PD points. And then, the electronic equipment can determine the included angle between the electronic equipment and the plane of the shot object according to the phase difference between the adjacent PD points and the corresponding relation between different included angles between the electronic equipment and the plane of the shot object and the phase difference between the adjacent PD points of the electronic equipment.
That is, when the electronic device determines the first included angle (i.e., the included angle between the plane in which the photographed object is located and the electronic device), it may include: the electronic device acquires a phase difference between adjacent photodiode PD points of the electronic device. Then, the electronic device may determine the first included angle according to the phase difference between the adjacent PD points and a preset correspondence.
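A hypothetical sketch of the preset-correspondence lookup; the table values, granularity, and the use of linear interpolation below are all invented for illustration, since the patent does not specify how the correspondence is stored:

```python
# Hypothetical preset correspondence: (phase difference, angle in degrees)
# pairs, as might be measured during factory calibration at a fixed
# device-to-plane distance.
PRESET_TABLE = [
    (0.0, 0.0), (1.0, 10.0), (2.0, 20.0), (3.0, 30.0),
]

def angle_from_phase_diff(pd):
    """Look up the first included angle for a measured phase difference,
    interpolating linearly between calibration points and clamping at
    the ends of the table."""
    pts = PRESET_TABLE
    if pd <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if pd <= x1:
            return y0 + (y1 - y0) * (pd - x0) / (x1 - x0)
    return pts[-1][1]
```

In practice the table would be per-device calibration data; the lookup itself is just interpolation over that data.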
It should be noted that the embodiment of the present application does not limit the specific manner of determining the included angle between the electronic device and the plane where the photographed object is located; any manner capable of determining this included angle may be used.
S604, the electronic device determines, according to the included angle between the plane where the photographed object is located and the electronic device, the target angle by which the camera of the electronic device needs to rotate.
After the electronic device determines the included angle between the plane where the photographed object is located and the electronic device, the electronic device can determine, according to this included angle, the target angle by which the camera of the electronic device needs to rotate, so that the electronic device can drive the camera to rotate (which may also be referred to as tilt) by the target angle. The target angle is the angle such that, after the camera of the electronic device rotates by it, the camera of the electronic device can be parallel to the plane where the photographed object is located.
In some examples, the electronic device determining, according to the included angle between the plane where the photographed object is located and the electronic device, the target angle by which the camera of the electronic device needs to rotate may include: the electronic device determines the target angle according to this included angle through the Scheimpflug principle.
In some examples, when the electronic device determines the target angle through the Scheimpflug principle according to the included angle between the plane where the photographed object is located and the electronic device, the electronic device may first acquire the object distance between the photographed object and the electronic device and the focal length corresponding to the camera of the electronic device. Then, the electronic device can determine, through the Scheimpflug principle, the target angle by which the camera needs to rotate according to the included angle, the object distance, and the focal length.
The object distance between the photographed object and the electronic device may be the distance between the intersection point of the plane where the photographed object is located and the optical axis corresponding to the camera of the electronic device, and the intersection point of that optical axis and the camera (i.e., the lens group; the plane of the lens group may also be referred to as the main plane). The main plane of the camera of the electronic device refers to the shooting plane of the camera, that is, the front face of the camera; it faces the main shooting direction of the camera and is also the plane where the shot picture is located.
The focal length corresponding to the camera of the electronic device may be the distance between the intersection point of the optical axis corresponding to the camera and the camera, and the intersection point of that optical axis and the sensor used for imaging in the electronic device (the plane of which may also be referred to as the image plane). The sensor used for imaging in an electronic device is generally referred to as an image sensor, a device capable of converting light into an electrical signal.
That is, the determining, by the electronic device, the target angle at which the camera of the electronic device needs to rotate based on the first included angle may include: the electronic device determines an object distance between the shot object and the electronic device and a focal length corresponding to a camera of the electronic device. And then, the electronic equipment can determine the target angle required to rotate by the camera of the electronic equipment according to the object distance between the shot object and the electronic equipment, the focal length corresponding to the camera of the electronic equipment and the first included angle.
For example, as shown in (a) of fig. 8, when the plane where the photographed object is located is parallel to the electronic device, the intersection point between the optical axis corresponding to the camera (i.e., the lens group) of the electronic device and the plane where the photographed object is located (i.e., the scanning plane) is intersection point 801, the intersection point between that optical axis and the camera (i.e., the main plane) is intersection point 802, and the intersection point between that optical axis and the sensor used for imaging (i.e., the image plane) is intersection point 803.
The object distance between the photographed object and the electronic device is the distance between the intersection point of the plane where the photographed object is located and the optical axis corresponding to the camera of the electronic device and the intersection point of the optical axis corresponding to the camera of the electronic device and the camera. That is, the object distance between the subject and the electronic device is the distance between the intersection 801 and the intersection 802.
The focal length corresponding to the camera of the electronic device is the distance between the intersection point of the optical axis corresponding to the camera and the camera, and the intersection point of that optical axis and the sensor used for imaging in the electronic device. That is, it is the distance between intersection point 802 and intersection point 803.
As another example, in conjunction with (b) in fig. 8, when the plane where the photographed object is located is not parallel to the electronic device (i.e., the plane forms a certain included angle with the electronic device), if the camera of the electronic device rotates by the target angle, the intersection point between the optical axis corresponding to the camera (i.e., the lens group) and the plane where the photographed object is located (i.e., the scanning plane) is intersection point 804, the intersection point between that optical axis and the camera (i.e., the main plane) is intersection point 805, and the intersection point between that optical axis and the sensor used for imaging (i.e., the image plane) is intersection point 806.
The object distance between the photographed object and the electronic device is the distance between the intersection point of the plane where the photographed object is located and the optical axis corresponding to the camera of the electronic device and the intersection point of the optical axis corresponding to the camera of the electronic device and the camera. That is, the object distance between the subject and the electronic device is the distance between the intersection 804 and the intersection 805.
The focal length corresponding to the camera of the electronic device is the distance between the intersection point of the optical axis corresponding to the camera and the camera, and the intersection point of that optical axis and the sensor used for imaging in the electronic device. That is, it is the distance between intersection point 805 and intersection point 806.
In some examples, when the electronic device determines, through the Scheimpflug principle, the target angle θ by which the camera of the electronic device needs to rotate according to the included angle φ1 between the plane where the photographed object is located and the electronic device, the relationship among θ, μ, f, and φ1 may be expressed by the following formula two (in the embodiment of the present application, formula two may also be referred to as a second preset formula). That is, the electronic device may determine the target angle θ according to φ1, μ, and f through formula two.
Formula two: tan θ = (f × tan φ1) / μ
wherein φ1 is the included angle between the electronic device and the plane where the photographed object is located, μ is the object distance between the photographed object and the electronic device, f is the focal length corresponding to the camera of the electronic device, and θ is the target angle by which the camera of the electronic device needs to rotate.
That is, the determining, by the electronic device, the target angle required to rotate the camera of the electronic device according to the object distance between the photographed object and the electronic device, the focal length corresponding to the camera of the electronic device, and the first included angle may include: the electronic equipment determines a target angle based on a second preset formula according to the object distance between the shot object and the electronic equipment, the focal length corresponding to the camera of the electronic equipment and the first included angle.
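A worked sketch of this step, assuming the Scheimpflug-style relation tan θ = f·tan φ1/μ (reconstructed from the quantities named above, not quoted from the patent's formula image; object distance and focal length must share units):

```python
import math

# Sketch: lens tilt needed so the plane of sharp focus lies on the
# tilted object plane, given the first included angle phi1, the object
# distance mu, and the focal length f (same length units for mu and f).
def target_angle(phi1_rad, mu, f):
    """Target rotation angle of the camera, in radians."""
    return math.atan(f * math.tan(phi1_rad) / mu)

# Illustrative numbers: a plane tilted 30 degrees, scanned from 300 mm
# with a 6 mm lens, needs well under one degree of lens tilt.
theta = target_angle(math.radians(30.0), 300.0, 6.0)
```

Because f is much smaller than μ for a phone camera, the required tilt θ is far smaller than the plane's tilt φ1, which is why a small camera-module rotation suffices.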
In some examples, the object distance between the photographed object and the electronic device may be obtained by a sensor in the electronic device that measures the distance. In the embodiment of the present application, the method for acquiring the object distance between the object to be photographed and the electronic device is not limited.
In some examples, the focal length corresponding to the camera of the electronic device is typically a fixed value, and thus the focal length corresponding to the camera of the electronic device may be pre-stored in the electronic device.
S605, the electronic equipment rotates the camera by a target angle.
After the electronic device determines the target angle by which the camera needs to rotate according to the included angle between the plane where the photographed object is located and the electronic device, the electronic device can rotate the camera by the target angle. In this way, the depth of field range corresponding to the camera of the electronic device changes (i.e., the focus corresponding to the camera changes in real time), so that the photographed object can fall entirely within the depth of field range corresponding to the camera after it has rotated by the target angle. Because the electronic device rotates the camera by the target angle, the camera of the electronic device can be in a position parallel to the plane where the photographed object is located; after the depth of field range of the camera changes accordingly, the photographed object can fall entirely within the changed depth of field range.
Therefore, when the electronic device scans the photographed object through the camera after it has rotated by the target angle, the photographed object can be entirely within the corresponding depth of field range, so that a clear electronic scan piece can be obtained.
It should be noted that the depth of field corresponding to the camera includes a front depth of field and a rear depth of field. The change of the depth of field range corresponding to the camera of the electronic device means that the ranges of the front depth of field and the rear depth of field corresponding to the camera change. Because these ranges change, the photographed object can be within the front and rear depth of field ranges corresponding to the camera after it has rotated by the target angle.
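For context only, the front and rear depth of field can be estimated with the standard thin-lens formulas (these formulas are general photographic optics, not taken from the patent), given aperture number N, circle of confusion c, focal length f, and object distance u:

```python
# Sketch using the standard thin-lens depth-of-field formulas:
#   front = N*c*u^2 / (f^2 + N*c*u)
#   rear  = N*c*u^2 / (f^2 - N*c*u)   (infinite past the hyperfocal point)
# All lengths in the same units (here millimetres).
def depth_of_field(f, N, c, u):
    """Return (front, rear) depth of field for a focused distance u."""
    front = (N * c * u * u) / (f * f + N * c * u)
    denom = f * f - N * c * u
    rear = (N * c * u * u) / denom if denom > 0 else float("inf")
    return front, rear

# Example: f = 6 mm, N = 2, c = 0.005 mm, u = 300 mm.
front, rear = depth_of_field(6.0, 2.0, 0.005, 300.0)
```

Stopping down (a larger N) enlarges both ranges, which is consistent with the aperture reduction mentioned later in this section as a way to further improve definition.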
In some examples, when the electronic device rotates the camera by the target angle, the electronic device may send the target angle to the drive corresponding to the camera, so that the drive corresponding to the camera can drive the camera to rotate by the target angle.
It should be noted that the drive (i.e., motor drive) corresponding to the camera may include an electromagnetic drive, a piezoelectric drive, an SMA (shape memory alloy) drive, and the like. The embodiment of the present application does not limit the specific type of the drive corresponding to the camera, as long as it can drive the camera to rotate by the target angle.
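A hypothetical sketch of S605 as software flow; the class and function names below are invented for illustration, since the patent does not specify the driver interface:

```python
# Hypothetical glue code for S605: the application hands the computed
# target angle to a (made-up) motor-driver interface.
class CameraDriver:
    """Stand-in for the electromagnetic/piezoelectric/SMA motor drive."""
    def __init__(self):
        self.current_tilt_deg = 0.0

    def rotate(self, angle_deg):
        # A real drive would actuate the motor; here we only track the tilt.
        self.current_tilt_deg += angle_deg
        return self.current_tilt_deg

def apply_target_angle(driver, target_angle_deg):
    """Send the computed target angle to the camera's drive."""
    return driver.rotate(target_angle_deg)

driver = CameraDriver()
tilt = apply_target_angle(driver, 0.66)
```

The point of the indirection is that the scanning logic only computes an angle; which actuator technology realizes the rotation is left to the drive.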
S606, the electronic equipment receives the triggering operation of the user on the scanning control.
After the electronic device rotates the camera by the target angle, the electronic device can receive the triggering operation of the scanning control by the user. Therefore, the electronic equipment can scan the shot object through the camera after rotating the target angle, so that the information of the shot object can be obtained, and the electronic scanning piece of the shot object is obtained.
S607, in response to the triggering operation of the user on the scanning control, the electronic device scans the photographed object through the camera after it has rotated by the target angle, so as to acquire the information of the photographed object.
When the electronic device receives a triggering operation, such as a clicking operation, of the scanning control by a user, the electronic device can scan the shot object through the camera after rotating the target angle, so that the electronic device can acquire the information of the shot object. The electronic device can obtain the electronic scanning piece of the shot object according to the information of the shot object.
When the shot object is a document, the information in the shot object is the text information, image information, or form information in the document; when the shot object is an image, the information is the text information, image information, or form information in the image; and when the shot object is a form, the information is the text information, image information, or form information in the form.
In some examples, after the electronic device rotates the camera by the target angle, the depth of field range corresponding to the camera of the electronic device changes (that is, the focal plane corresponding to the camera of the electronic device changes). Because the camera of the electronic device may now be parallel to the plane where the shot object is located, the shot object can fall entirely within the changed depth of field range corresponding to the camera after it is rotated by the target angle. Since the shot object can be entirely within this depth of field range, a clear electronic scan piece can be obtained; the electronic device can also reduce the f-number of the camera, thereby further improving the definition of the obtained electronic scan piece.
For example, as shown in fig. 9 (a), when the plane in which the shot object is located (i.e., the focal plane in the drawing) is parallel to the electronic device (i.e., the image plane in the drawing), the depth of field range corresponding to the camera of the electronic device may include a front depth of field and a rear depth of field. The front depth of field is the portion of the depth of field range near the image plane, and the rear depth of field is the portion far away from the image plane. When the focal plane is located within the front and rear depth of field ranges, the image of the shot object formed on the image plane corresponding to the sensor used for imaging in the electronic device is clearer.
As another example, as shown in fig. 9 (b), when the plane where the shot object is located (i.e., the focal plane in the drawing) forms a certain angle with the electronic device (i.e., the image plane in the drawing), the depth of field range corresponding to the camera of the electronic device may include a front depth of field and a rear depth of field. The front depth of field is the portion of the depth of field range near the image plane, and the rear depth of field is the portion far away from the image plane. When the focal plane is located within the front and rear depth of field ranges, the image of the shot object formed on the image plane corresponding to the sensor used for imaging in the electronic device is clearer. Moreover, because the depth of field range corresponding to the camera becomes wedge-shaped in this case, a smaller lens f-number (larger aperture) than that required when parallel to the scanning plane can be used to achieve higher definition in the document scanning scene. That is, the electronic device can reduce the f-number of the camera, thereby further improving the definition of the obtained electronic scan piece.
And S608, the electronic equipment obtains an electronic scanning piece corresponding to the shot object according to the information of the shot object.
After the electronic device obtains the information of the shot object, the electronic device can obtain the electronic scanning piece corresponding to the shot object according to the information of the shot object.
For example, after the electronic device acquires the information in the shot object, the electronic device may convert the information into the electronic scan piece corresponding to the shot object. After obtaining the electronic scan piece, the electronic device can display it, so that the user can conveniently view and edit it.
After the electronic device obtains the electronic scan piece corresponding to the object, the electronic device may display the electronic scan piece corresponding to the object, so that the user may view the electronic scan piece corresponding to the object, or may edit the electronic scan piece corresponding to the object through the electronic device.
According to the above scheme, before the electronic device scans the shot object, the electronic device can determine in real time the target angle by which the camera needs to rotate and rotate the camera by the target angle, so that the shot object can be within the depth of field range corresponding to the camera after it is rotated by the target angle. Therefore, when the electronic device scans the shot object through the camera rotated by the target angle, the shot object can be entirely within the corresponding depth of field range, so that a clear electronic scan piece can be obtained. As a result, when the electronic device and the plane where the shot object is located are not parallel, the user does not need to adjust the position of the electronic device or of the shot object to make the two parallel (so that the shot object can be entirely within the depth of field range corresponding to the camera); the trouble of user operation is avoided, and the user experience is improved.
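The S601-S608 flow described above can be summarized as a small pipeline sketch. All of the hooks (rangefinder, planner, motor, camera) are hypothetical stand-ins for device-specific components, not APIs from this application:

```python
from dataclasses import dataclass

@dataclass
class ScanPipeline:
    """Illustrative sketch of the S601-S608 scanning flow."""
    rangefinder: callable   # returns the first included angle (degrees)
    planner: callable       # maps that angle to the target rotation
    motor: callable         # drives the camera by an angle
    camera: callable        # returns the raw scan information

    def scan(self):
        phi = self.rangefinder()      # S603: determine the first included angle
        theta = self.planner(phi)     # S604: determine the target angle
        self.motor(theta)             # S605: rotate the camera
        return self.camera()          # S607: scan and acquire the information
```

A caller would wire in the device's own angle estimation, motor driver, and capture routines; the class only fixes the order of the steps.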
Corresponding to the method in the foregoing embodiment, the embodiment of the present application further provides a scanning device. The scanning device may be applied to an electronic apparatus for implementing the method in the foregoing embodiment. The functions of the scanning device can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
For example, fig. 10 shows a schematic structural diagram of a scanning device 10. As shown in fig. 10, the scanning device 10 may include: a receiving module 1001, a display module 1002, a determining module 1003, a control module 1004, an acquiring module 1005, and the like.
The receiving module 1001 may be configured to receive an opening operation of the document scanning function by a user.
The display module 1002 may be configured to display a scan interface in response to a user opening a document scanning function, the scan interface including a determination scan control.
The determining module 1003 may be configured to determine a first included angle, where the first included angle is an included angle between a plane where the object is photographed and the electronic device.
The determining module 1003 may be further configured to determine a target angle required to rotate the camera of the electronic device based on the first included angle.
The control module 1004 may be configured to control the camera to rotate by the target angle; after the camera rotates by the target angle, the plane where the photographed object is located is in a parallel state with the camera.
The receiving module 1001 may be further configured to receive a triggering operation of determining a scan control by a user.
The acquiring module 1005 may be configured to, in response to the triggering operation of the user on the determined scan control, scan the photographed object through the camera rotated by the target angle, and acquire the scan information of the photographed object.
The determining module 1003 may be further configured to obtain an electronic scan piece corresponding to the object according to the information of the object.
In another possible implementation manner, the acquiring module 1005 may be further configured to acquire distances between the electronic device and a plurality of detection points in a plane of the object to be photographed.
The determining module 1003 may be further configured to determine the first included angle according to a distance between the electronic device and a plurality of detection points in a plane where the object is photographed.
In another possible implementation manner, the determining module 1003 may be further configured to determine a second included angle according to the first target detection point and the second target detection point, where the second included angle is an included angle between the first connection line and the second connection line; the first connecting line is a connecting line between the first target detection point and the electronic equipment, and the second connecting line is a connecting line between the second target detection point and the electronic equipment.
The determining module 1003 may be further configured to determine a first included angle according to the first distance, the second distance, and the second included angle; the first distance is the distance between the first target detection point and the electronic equipment, and the second distance is the distance between the second target detection point and the electronic equipment.
In another possible implementation manner, when the display screen of the electronic device is in a portrait display state, the first target detection point is a detection point corresponding to the upper edge of the photographed object or a detection point corresponding to the lower edge of the photographed object, and the second target detection point is a detection point corresponding to a vertical line of the electronic device on a plane where the photographed object is located.
In another possible implementation manner, the determining module 1003 may be further configured to determine the first included angle based on a first preset formula according to the first distance, the second distance, and the second included angle.
The first preset formula is:
wherein, in the first preset formula, H1 is the first distance, H2 is the second distance, and the remaining symbols denote the first included angle and the second included angle.
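The first preset formula itself appears only as an image in the published text and is not reproduced here. The sketch below instead derives one plausible relation from the geometry described above, assuming the second target detection point is the foot of the electronic device's perpendicular on the object plane and applying the law of sines in the triangle formed by the device and the two detection points; the function name and the derivation are illustrative assumptions, not the patent's formula.

```python
import math

def first_included_angle(h1, h2, gamma_deg):
    """Estimate the tilt between the object plane and the device plane.

    h1: distance to an edge detection point (first distance),
    h2: distance along the device's perpendicular to the object plane
        (second distance),
    gamma_deg: angle between the two lines of sight (second included angle).

    In the triangle (device, point 1, point 2), the law of sines gives
    tan(beta) = h1*sin(gamma) / (h2 - h1*cos(gamma)) for the angle beta
    between the object plane and the perpendicular line of sight; the
    plane tilt is then |90 - beta| degrees.
    """
    gamma = math.radians(gamma_deg)
    beta = math.atan2(h1 * math.sin(gamma), h2 - h1 * math.cos(gamma))
    return abs(90.0 - math.degrees(beta))
```

For a parallel plane, h1 = h2 / cos(gamma) holds exactly and the function returns a tilt of zero.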
In another possible implementation, the acquiring module 1005 may also be configured to acquire a phase difference between adjacent photodiode (PD) points of the electronic device.
The determining module 1003 may be further configured to determine the first included angle according to a phase difference between adjacent PD points and a preset correspondence.
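The "preset correspondence" between PD phase difference and tilt angle can be sketched as a calibration table with linear interpolation. The table values below are made-up placeholders; a real device would use factory-calibrated data:

```python
# Hypothetical calibration table: phase difference between adjacent
# photodiode (PD) points -> first included angle in degrees.
# The values are illustrative placeholders, not real calibration data.
PHASE_TO_ANGLE = [(0.0, 0.0), (0.5, 5.0), (1.0, 11.0), (2.0, 24.0)]

def angle_from_phase_difference(phase):
    """Look up the first included angle by linear interpolation."""
    pts = PHASE_TO_ANGLE
    if phase <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if phase <= x1:
            t = (phase - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]  # clamp beyond the calibrated range
```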
In another possible implementation manner, the determining module 1003 may be further configured to determine an object distance between the photographed object and the electronic device, and a focal length corresponding to a camera of the electronic device.
The determining module 1003 may be further configured to determine a target angle required to rotate the camera of the electronic device according to an object distance between the object and the electronic device, a focal length corresponding to the camera of the electronic device, and the first included angle.
In another possible implementation manner, the determining module 1003 may be further configured to determine the target angle based on a second preset formula according to an object distance between the photographed object and the electronic device, a focal length corresponding to a camera of the electronic device, and the first included angle.
The second preset formula is:
wherein, in the second preset formula, μ is the object distance between the photographed object and the electronic device, f is the focal length corresponding to the camera of the electronic device, θ is the target angle, and the remaining symbol denotes the first included angle.
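The second preset formula likewise appears only as an image in the published text. As an illustrative stand-in, the sketch below uses the Scheimpflug "hinge rule" from tilt-lens photography, sin θ = f·tan φ / μ, which relates a lens tilt θ to an object-plane tilt φ at object distance μ and focal length f; this is an assumption about the kind of relation involved, not the patent's actual formula.

```python
import math

def target_angle(mu_mm, f_mm, phi_deg):
    """Lens tilt needed to lay the plane of sharp focus onto a plane
    tilted by phi_deg, per the hinge rule sin(theta) = f*tan(phi)/mu.

    mu_mm: object distance, f_mm: focal length, phi_deg: first
    included angle.  Illustrative stand-in, not the patent's formula.
    """
    s = f_mm * math.tan(math.radians(phi_deg)) / mu_mm
    if not -1.0 <= s <= 1.0:
        raise ValueError("tilt out of range for this geometry")
    return math.degrees(math.asin(s))
```

Because the focal length is small relative to the object distance, the required tilt is typically much smaller than the document tilt itself, which is what makes a compact rotating camera module practical.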
It should be understood that the division of units or modules (hereinafter referred to as units) in the above apparatus is merely a division of logical functions; in actual implementation, they may be fully or partially integrated into one physical entity or may be physically separated. The units in the apparatus may all be implemented in the form of software invoked by a processing element, or all in hardware; alternatively, some units may be implemented in software invoked by a processing element and others in hardware.
For example, each unit may be a processing element that is set up separately, may be implemented as integrated in a certain chip of the apparatus, or may be stored in a memory in the form of a program, and the functions of the unit may be called and executed by a certain processing element of the apparatus. Furthermore, all or part of these units may be integrated together or may be implemented independently. The processing element described herein, which may also be referred to as a processor, may be an integrated circuit with signal processing capabilities. In implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in a processor element or in the form of software called by a processing element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
For another example, when the units in the apparatus are implemented in the form of a program scheduled by a processing element, the processing element may be a general-purpose processor, such as a CPU or another processor that can invoke a program. For another example, the units may be integrated together and implemented in the form of a system-on-chip (SoC).
In one implementation, the means for implementing each corresponding step in the above method may be implemented in the form of a processing element scheduling a program. For example, the apparatus may comprise a processing element and a storage element, the processing element invoking a program stored in the storage element to perform the method described in the above method embodiments. The storage element may be a storage element on the same chip as the processing element, i.e., an on-chip storage element.
In another implementation, the program for performing the above method may be in a storage element on a different chip from the processing element, i.e., an off-chip storage element. In this case, the processing element invokes or loads the program from the off-chip storage element onto the on-chip storage element, so as to invoke and execute the method described in the above method embodiments.
For example, embodiments of the present application may also provide an apparatus, such as: an electronic device may include: a processor, a memory for storing instructions executable by the processor. The processor is configured to execute the above-described instructions, causing the electronic device to implement the scanning method as described in the previous embodiments. The memory may be located within the electronic device or may be located external to the electronic device. And the processor includes one or more.
In yet another implementation, the unit implementing each step in the above method may be configured as one or more processing elements, and these processing elements may be disposed on the electronic device described above. The processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, the embodiment of the application also provides a chip, and the chip can be applied to the electronic equipment. The chip includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a circuit; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuit to implement the methods described in the method embodiments above.
Embodiments of the present application also provide a computer program product comprising computer instructions which, when run on the electronic device described above, cause the electronic device to implement the method described in the foregoing method embodiments.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the related art or all or part of the technical solution may be embodied in the form of a software product, for example: and (5) program. The software product is stored in a program product, such as a computer readable storage medium, comprising instructions for causing a device (which may be a single-chip microcomputer, chip or the like) or processor (processor) to perform all or part of the steps of the methods described in the various embodiments of the application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
For example, embodiments of the present application may also provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by an electronic device, cause the electronic device to implement a scanning method as described in the foregoing method embodiments.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A scanning method, characterized by being applied to an electronic device, the electronic device including a document scanning function, the method comprising:
receiving an opening operation of a user on the document scanning function;
responsive to a user opening operation of the document scanning function, displaying a scanning interface, the scanning interface including a determination scanning control;
determining a first included angle, wherein the first included angle is an included angle between a plane where a shot object is located and the electronic equipment;
determining a target angle required to rotate by a camera of the electronic equipment based on the first included angle;
controlling the camera to rotate the target angle; after the camera rotates the target angle, the plane where the shot object is positioned is in a parallel state with the camera;
receiving triggering operation of a user on the determined scanning control;
responding to the triggering operation of the user on the determined scanning control, scanning the shot object through the camera after rotating the target angle, and acquiring the scanning information of the shot object;
and obtaining an electronic scanning piece corresponding to the shot object according to the information of the shot object.
2. The method of claim 1, wherein the determining the first included angle comprises:
acquiring distances between the electronic equipment and a plurality of detection points in a plane where the shot object is located;
and determining the first included angle according to the distance between the electronic equipment and a plurality of detection points in the plane where the shot object is located.
3. The method of claim 2, wherein the plurality of detection points includes a first target detection point and a second target detection point, the determining the first included angle based on a distance between the electronic device and the plurality of detection points in a plane in which the photographed object is located, comprising:
determining a second included angle according to the first target detection point and the second target detection point, wherein the second included angle is an included angle between a first connecting line and a second connecting line; the first connecting line is the line connecting the first target detection point and the electronic equipment, and the second connecting line is the line connecting the second target detection point and the electronic equipment;
determining the first included angle according to the first distance, the second distance and the second included angle; the first distance is the distance between the first target detection point and the electronic equipment, and the second distance is the distance between the second target detection point and the electronic equipment.
4. The method according to claim 3, wherein when the display screen of the electronic device is in a portrait display state, the first target detection point is a detection point corresponding to an upper edge of the object to be photographed or a detection point corresponding to a lower edge of the object to be photographed, and the second target detection point is a detection point corresponding to a vertical line of the electronic device on a plane where the object to be photographed is located.
5. The method of claim 3 or 4, wherein determining the first included angle based on the first distance, the second distance, and the second included angle comprises:
determining the first included angle based on a first preset formula according to the first distance, the second distance and the second included angle;
the first preset formula is:
wherein, in the first preset formula, H1 is the first distance, H2 is the second distance, and the remaining symbols denote the first included angle and the second included angle.
6. The method of claim 1, wherein the determining the first included angle comprises:
acquiring a phase difference between PD points of adjacent photodiodes of the electronic equipment;
and determining the first included angle according to the phase difference between the PD points of the adjacent photodiodes and a preset corresponding relation.
7. The method of claim 1, wherein determining a target angle at which a camera of the electronic device is required to rotate based on the first included angle comprises:
determining an object distance between the shot object and the electronic equipment and a focal length corresponding to a camera of the electronic equipment;
and determining a target angle required to rotate the camera of the electronic equipment according to the object distance between the shot object and the electronic equipment, the focal length corresponding to the camera of the electronic equipment and the first included angle.
8. The method of claim 7, wherein determining the target angle at which the camera of the electronic device is required to rotate according to the object distance between the photographed object and the electronic device, the focal length corresponding to the camera of the electronic device, and the first angle comprises:
determining the target angle based on a second preset formula according to the object distance between the shot object and the electronic equipment, the focal length corresponding to the camera of the electronic equipment and the first included angle;
the second preset formula is:
wherein, in the second preset formula, μ is the object distance between the shot object and the electronic equipment, f is the focal length corresponding to the camera of the electronic equipment, θ is the target angle, and the remaining symbol denotes the first included angle.
9. An electronic device comprising a processor, a memory for storing instructions executable by the processor; the processor is configured to, when executing the instructions, cause the electronic device to implement the method of any one of claims 1 to 8.
10. A computer readable storage medium having computer program instructions stored thereon; characterized in that,
the computer program instructions, when executed by an electronic device, cause the electronic device to implement the method of any one of claims 1 to 8.
CN202311785086.9A 2023-12-25 2023-12-25 Scanning method and electronic equipment Pending CN117499549A (en)

Publications (1)

Publication Number Publication Date
CN117499549A true CN117499549A (en) 2024-02-02

Family

ID=89678496


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104902190A (en) * 2015-06-24 2015-09-09 联想(北京)有限公司 Control method, photographic device and electronic device
US20190215461A1 (en) * 2016-09-30 2019-07-11 Optim Corporation System, method, and program for adjusting angle of camera
CN110769162A (en) * 2019-11-28 2020-02-07 维沃移动通信有限公司 Electronic equipment and focusing method
CN112272267A (en) * 2020-10-22 2021-01-26 Oppo广东移动通信有限公司 Shooting control method, shooting control device and electronic equipment
CN115442487A (en) * 2021-06-06 2022-12-06 上海钛仕科技有限公司 File scanning method
CN115550517A (en) * 2021-06-15 2022-12-30 展讯半导体(南京)有限公司 Scanning control method, system, electronic device and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination