CN114982214A - Electronic device, method of controlling electronic device, and computer-readable storage medium - Google Patents


Info

Publication number
CN114982214A
Authority
CN
China
Prior art keywords
signal processor
image signal
page
image
depth information
Prior art date
Legal status
Pending
Application number
CN202080093632.0A
Other languages
Chinese (zh)
Inventor
青山千秋
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of CN114982214A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387: Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/3877: Image rotation
    • H04N1/3878: Skew detection or correction

Abstract

An electronic device according to an embodiment of the present disclosure includes: a camera module that photographs a subject to obtain a main camera image; a distance sensor module that emits pulsed light toward the subject and detects the pulsed light reflected from the subject, thereby acquiring time-of-flight (ToF) depth information; and an image signal processor that controls the camera module and the distance sensor module and acquires a camera image based on the main camera image and the ToF depth information.

Description

Electronic device, method of controlling electronic device, and computer-readable storage medium
Technical Field
The present disclosure relates to an electronic device, a method of controlling the electronic device, and a computer-readable storage medium.
Background
Devices that capture an image of a book by pressing it against a flat glass plate and scanning it from below with a line sensor have recently become popular.
Some devices can photograph a book from above while it lies open, but they are more expensive and less common than the former type.
Users who digitize large numbers of books often take the approach of cutting a book apart into individual pages and scanning the pages continuously.
On the other hand, there is also a conventional technique for reading a sheet placed on a flat surface with a smartphone camera. This technique is implemented by free applications and is widely used. However, when the sheet is curved, its surface appears curved in the captured image.
Therefore, there is a need to acquire an image of a page surface flattened into a plane from an image of the curved surface of a page spread open in a book, using the camera of an electronic apparatus such as a smartphone.
Disclosure of Invention
The present disclosure is directed to solving at least one of the above technical problems. Accordingly, the present disclosure provides an electronic device and a method of controlling the electronic device.
According to the present disclosure, an electronic device may include:
a camera module which takes a picture of a subject to obtain a main camera image;
a distance sensor module acquiring distance depth information of an object by using light; and
an image signal processor that controls the camera module and the distance sensor module and acquires a camera image based on the main camera image and the distance depth information, wherein
the image signal processor controls the camera module and the distance sensor module to acquire a main camera image including a curved page of the book in an open state, and distance depth information of the curved page,
the image signal processor estimates a curve corresponding to the position of the curved page based on the main camera image and the distance depth information, and
the image signal processor obtains an image of the page surface projectively transformed into a plane by projectively transforming the curved page in the main camera image into the plane based on the estimated curve.
In some embodiments, the distance sensor module emits pulsed light toward the object and detects the pulsed light reflected from the object, thereby acquiring time-of-flight (ToF) depth information as the distance depth information.
In some embodiments, the image signal processor obtains the main camera image including the curved page by photographing, with the camera module, the page that has been opened and curved, and
the image signal processor acquires the ToF depth information of the curved page through the distance sensor module.
In some embodiments, the image signal processor estimates, based on the main camera image and the ToF depth information, a curve corresponding to the position of the curved page in a plane perpendicular to the fold direction at the position of the open fold of the page.
In some embodiments, the image signal processor obtains a main camera image including the curved page of another page of the open book, and ToF depth information of that curved page,
wherein the image signal processor estimates another curve corresponding to the position of the curved page of the other page based on the main camera image and the ToF depth information,
wherein the image signal processor obtains an image of the surface of the other page projectively transformed into a plane by projectively transforming the curved page of the other page in the main camera image into the plane based on the estimated other curve, and
wherein the image signal processor synthesizes the acquired planar images of the two page surfaces, thereby acquiring an image of the surfaces of the two pages of the open book flattened into a plane.
In some embodiments, the image signal processor sets a fold position specification frame, which is specified by the user, on a page of the open book in an image captured by the camera module, and
the image signal processor acquires a main camera image including the curved page, on which the fold position specification frame is set, by photographing the opened and curved page with the camera module while the book is open.
In some embodiments, further comprising:
a display module that displays predetermined information;
an input module that receives an operation of a user; and
a main processor for controlling the display module and the input module,
wherein the image signal processor displays the fold position specification frame on the display module together with the main camera image captured by the camera module, and
wherein the image signal processor sets the fold position specification frame at a position specified by the user on the curved page of the main camera image, in response to a user operation concerning the fold position specification frame that is input to the input module.
In some embodiments, the image signal processor obtains reference point cloud data of the curved page based on the main camera image and the ToF depth information,
wherein the image signal processor calculates a normal vector by applying principal component analysis to the reference point cloud data, and
the image signal processor performs a projective transformation based on the calculated normal vector, projectively transforming the reference point cloud data into first point cloud data corresponding to a main camera image taken from the depth direction.
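For illustration only, the normal-vector step above can be sketched numerically: the page's dominant plane normal is taken as the smallest-variance principal component of the point cloud, and the cloud is rotated so that this normal aligns with the depth axis. The function name `front_view_transform` and the convention that x is the depth axis are assumptions, not details from the disclosure.

```python
import numpy as np

def front_view_transform(points: np.ndarray) -> np.ndarray:
    """Rotate an (N, 3) point cloud so its dominant plane faces the camera.

    The plane normal is estimated by principal component analysis as the
    eigenvector of the covariance matrix with the smallest eigenvalue;
    the cloud is then rotated so this normal aligns with the depth (x) axis.
    """
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))  # ascending eigenvalues
    normal = eigvecs[:, 0]                                 # least-variance direction
    target = np.array([1.0, 0.0, 0.0])                     # assumed depth axis
    if normal @ target < 0:
        normal = -normal
    v = np.cross(normal, target)
    s, c = np.linalg.norm(v), float(normal @ target)
    if s < 1e-12:                                          # already front-facing
        return centered
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    rot = np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)    # Rodrigues formula
    return centered @ rot.T
```

After this rotation, points lying on a tilted plane end up with (near-)constant depth coordinates, which is the "taken from the front" view the disclosure describes.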
In some embodiments, the image signal processor scans the first point cloud data along a plurality of lines in a direction perpendicular to the longitudinal direction of the fold position specification frame,
wherein the image signal processor calculates the slope of the valley of the first point cloud data by applying a least squares method to the scanned first point cloud data,
wherein the image signal processor estimates the fold position based on the calculated slope of the valley, and
the image signal processor obtains second point cloud data by performing a first rotation on the first point cloud data so that the estimated fold position is parallel to a preset reference direction.
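A hedged sketch of this valley estimation, assuming the first point cloud data has been resampled into a regular (rows, cols) depth grid where each row is one scan line: `estimate_crease_slope` is a hypothetical name, and the convention that larger depth values lie farther from the camera (so the fold is the per-line maximum) is an assumption.

```python
import numpy as np

def estimate_crease_slope(depth: np.ndarray) -> float:
    """Slope of the fold (valley) line in a (rows, cols) depth grid.

    Each row is one scan line; the deepest sample per line marks the
    valley, and a least-squares line through those minima gives the
    slope used to rotate the cloud so the fold becomes parallel to the
    preset reference direction.
    """
    rows = np.arange(depth.shape[0])
    valley_cols = depth.argmax(axis=1)            # farthest point per scan line
    slope, _intercept = np.polyfit(rows, valley_cols, 1)  # least-squares line
    return float(slope)
```

The first rotation would then turn the cloud by minus the arctangent of this slope so the estimated fold runs along the reference (z) axis.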
In some embodiments, the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to the reference direction,
wherein the image signal processor calculates the slope, in the reference direction, of the ridge of the scanned second point cloud data by applying a least squares method to the scanned second point cloud data,
and the image signal processor acquires third point cloud data by performing a second rotation on the second point cloud data so that the calculated slope of the ridge is parallel to a preset reference plane.
In some embodiments, the image signal processor scans the third point cloud data along a plurality of lines in the reference direction,
wherein the image signal processor calculates average values of the third point cloud data in the depth direction in the vicinity of the plurality of lines, and
wherein the image signal processor approximates the calculated average values, in the plane spanned by the direction perpendicular to the reference direction and the depth direction, with a curve given by a polynomial of fourth order or higher.
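The fourth-order approximation step can be illustrated with NumPy's `polyfit`; the synthetic depth profile below is invented purely for demonstration and is not data from the disclosure.

```python
import numpy as np

# Synthetic averaged depth profile across the page: y is the position
# perpendicular to the fold, x_avg the averaged depth per scan line.
y = np.linspace(-1.0, 1.0, 41)
x_avg = 0.3 * y**4 - 0.5 * y**2 + 0.6

coeffs = np.polyfit(y, x_avg, 4)   # fourth-order least-squares fit
curve = np.poly1d(coeffs)

# Residual of the fitted curve on the sampled profile.
max_err = float(np.max(np.abs(curve(y) - x_avg)))
```

Since the sample profile is itself a quartic, the fitted curve reproduces it to numerical precision; on real averaged ToF data the residual measures how well the page cross-section is captured.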
In some embodiments, the image signal processor divides the third point cloud data into a plurality of rectangular regions, one for each section obtained by dividing the curve, and
wherein the image signal processor obtains an image of the page surface projectively transformed into a plane by projectively transforming the curved page in the main camera image into the plane, based on the relationship between the coordinates of the plurality of rectangular regions and the coordinates of the projection space of the reference point cloud data.
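The per-region projective transformation can be sketched as a standard four-point homography computed with the direct linear transform; the solver below and the corner coordinates used with it are illustrative assumptions, since the disclosure does not name a specific algorithm.

```python
import numpy as np

def homography(src, dst) -> np.ndarray:
    """3x3 homography mapping four source corners onto four destination
    corners, computed by the direct linear transform (SVD null space)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_h(h: np.ndarray, pt):
    """Apply a homography to a 2-D point via homogeneous coordinates."""
    p = h @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

Warping each curved rectangular region of the captured page onto its flattened counterpart with one such homography per region yields the planar page image.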
In some embodiments, the detection resolution of the distance sensor module is lower than the detection resolution of the camera module.
According to the present disclosure, a method for controlling an electronic device, the electronic device comprising: a camera module that takes a picture of a subject to obtain a main camera image; a distance sensor module acquiring distance depth information of an object by using light; and an image signal processor controlling the camera module and the distance sensor module to acquire a camera image based on the main camera image and the distance depth information,
the method comprises the following steps:
controlling, by an image signal processor, a camera module and a distance sensor module to acquire a main camera image and distance depth information, the main camera image including a curved page of a page in a state in which a book is open, the distance depth information being distance depth information of the curved page;
estimating, by an image signal processor, a curve corresponding to a position of a curved page based on a main camera image and distance depth information; and
an image of the surface of the page that has been projectively transformed into a plane is obtained by projectively transforming, by the image signal processor, the curved page of the page in the main camera image into a plane based on the estimated curve.
In some embodiments, wherein the distance depth information is time-of-flight (ToF) depth information.
According to the present disclosure, a computer-readable storage medium has a computer program stored thereon, wherein the computer program, when executed by a processor, implements a method for controlling an electronic device, the method comprising:
controlling, by an image signal processor, a camera module and a distance sensor module to acquire a main camera image and distance depth information, the main camera image including a curved page of a page in a state in which the book is open, the distance depth information being distance depth information of the curved page;
estimating, by an image signal processor, a curve corresponding to a position of a curved page based on a main camera image and distance depth information; and
an image of the surface of the page that has been projectively transformed into a plane is obtained by projectively transforming the curved page in the main camera image into a plane based on the estimated curve by the image signal processor.
In some embodiments, wherein the distance depth information is time-of-flight (ToF) depth information.
Drawings
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following description, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a diagram showing an example of an arrangement of an electronic apparatus 100 and an object 101 according to an embodiment of the present disclosure;
fig. 2 is a diagram showing an example of the configuration of the electronic apparatus 100 shown in fig. 1;
fig. 3 is a diagram showing an example of an overall flow in which the electronic apparatus 100 shown in figs. 1 and 2 acquires an image obtained by transforming the surface of a page of the book 101 in an open state into a plane;
fig. 4 is a diagram showing a specific example of step S1 shown in fig. 3, the step S1 being for acquiring a main camera image and ToF depth information;
fig. 5 is a diagram showing an example of display on the display module 45 of the electronic apparatus 100 when the fold position is designated;
fig. 6 is a diagram showing a specific example of step S2 shown in fig. 3 for executing the top view process;
fig. 7A is a diagram showing an example of reference point cloud data P of an image of the surface of a curved page imaged from an oblique direction with a book open;
fig. 7B is a diagram showing an example of first point cloud data P1 obtained by projective transformation as if taken from the depth direction (from the front);
fig. 8A is a diagram showing an example of scanning the first point cloud data P1 along a plurality of lines L1 in a direction perpendicular to the longitudinal direction D1 of the fold position specification frame;
fig. 8B is a diagram showing an example of second point cloud data P2 obtained by performing a first rotation R on the first point cloud data so that the estimated fold position D2 is parallel to the preset reference direction (z-axis direction);
fig. 9A is a diagram showing an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (z-axis direction);
fig. 9B is a diagram showing an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2;
fig. 9C is a diagram showing an example of third point cloud data P3 obtained by performing a second rotation Q on the second point cloud data P2 so that the calculated slope of the ridge N is parallel to a preset reference plane (z-y plane);
fig. 10A is a diagram showing an example of scanning along a plurality of lines L3 in the reference direction (z-axis direction) with respect to the third point cloud data P3;
fig. 10B is a diagram showing an example of the average value A of the third point cloud data P3 in the depth direction (x-axis direction) in the vicinity of the plurality of lines L3;
fig. 10C is a diagram showing an example of a curve E obtained by approximating the calculated average value A with a polynomial of fourth order or higher, in the direction perpendicular to the reference direction (y-axis direction) and in the depth direction (x-axis direction);
fig. 11 is a diagram showing a specific example of step S25 for executing the division process shown in fig. 6;
fig. 12A is a diagram showing an example in which the curve E is divided by points such that the error between the points of the curve E approximated by a polynomial falls within an allowable range;
fig. 12B is a diagram showing an example in which the curve E is divided by points such that the error between the points of the curve E approximated by a polynomial falls within an allowable range, fig. 12B being continued from fig. 12A;
fig. 12C is a diagram showing an example in which the curve E is divided by points such that the error between the points of the curve E approximated by a polynomial falls within an allowable range, fig. 12C being continued from fig. 12B;
fig. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in fig. 3;
fig. 14A is a diagram showing an example of dividing the third point cloud data P3 into a plurality of rectangular regions for each section obtained by dividing the curve E;
fig. 14B is a diagram showing an example of coordinates obtained by inverse-transforming the third point cloud data P3 into the projection space of the point cloud data P1;
fig. 15A is a diagram showing an example of a plurality of rectangular regions J into which the third point cloud data P3 is divided for each section obtained by dividing the curve E;
fig. 15B is a diagram showing an example of a plurality of rectangular regions G obtained in a two-dimensional space (u, v) from a plurality of rectangular regions J in a three-dimensional space (x, y, z);
fig. 16 is a diagram showing an example of a relationship between coordinates when point cloud data is expanded on a plane and coordinates on a captured image;
fig. 17 is a diagram showing an example of an image obtained by performing projective transformation for each corresponding rectangular region so that the surface of a curved page in a main camera image becomes a plane; and
fig. 18 is a diagram showing an example of images of the surfaces of two pages 200 and 201 that have been projected and transformed into planes.
Detailed Description
Embodiments of the present disclosure will be described in detail, and examples of the embodiments will be shown in the accompanying drawings. Throughout the specification, the same or similar elements and elements having the same or similar functions are denoted by the same reference numerals. The embodiments described herein with reference to the drawings are illustrative for the purpose of illustrating the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a diagram showing an example of an arrangement of an electronic apparatus 100 and an object 101 according to an embodiment of the present disclosure. Fig. 2 is a diagram showing an example of the configuration of the electronic apparatus 100 shown in fig. 1.
As shown in figs. 1 and 2, for example, the electronic apparatus 100 includes a camera module 10, a distance sensor module 20, and an image signal processor 30; the image signal processor 30 controls the camera module 10 and the distance sensor module 20 and processes camera image data acquired from the camera module 10. Reference numeral 101 in fig. 1 denotes the object, which is a book, for example.
As shown in fig. 2, the camera module 10 includes, for example: a main lens 10a that can be focused on a subject, a main image sensor 10b that detects an image input via the main lens 10a, and a main image sensor driver 10c that drives the main image sensor 10b.
In addition, as shown in fig. 2, the camera module 10 includes, for example: a gyro sensor 10d that detects the angular velocity and acceleration of the camera module 10, a focus & OIS actuator 10f that actuates the main lens 10a, and a focus & OIS driver 10e that drives the focus & OIS actuator 10f.
For example, the camera module 10 acquires a main camera image of the subject 101 (fig. 1).
As shown in fig. 2, the distance sensor module 20 includes, for example, a ToF lens 20a, a distance sensor 20b that detects reflected light input via the ToF lens 20a, a distance sensor driver 20c that drives the distance sensor 20b, and a projector 20d that outputs pulsed light.
The distance sensor module 20 acquires distance depth information of the object 101 by using light. Specifically, for example, the distance sensor module 20 acquires time-of-flight (ToF) depth information (ToF depth value) as the distance depth information by emitting pulsed light to the object 101 and detecting reflected light from the object 101.
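As a simple illustration of the ToF principle described above (not code from the disclosure), depth follows from the pulse round-trip time at the speed of light, d = c * t / 2; `tof_depth` is a hypothetical helper name.

```python
# Depth from time-of-flight: the pulse travels to the object and back,
# so the one-way distance is half the round trip at the speed of light.
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_seconds: float) -> float:
    """Depth in metres for a measured pulse round-trip time."""
    return C * round_trip_seconds / 2.0
```

For example, a round trip of 2 ns corresponds to a depth of roughly 0.3 m, which is a plausible distance between a handheld smartphone and an open book.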
The detection resolution of the distance sensor module 20 is lower than that of the camera module 10.
For example, the image signal processor 30 controls the camera module 10 and the distance sensor module 20, and acquires a camera image based on the main camera image obtained by the camera module 10 and the ToF depth information obtained by the distance sensor module 20.
In addition, as shown in fig. 2, for example, the electronic apparatus 100 includes a Global Navigation Satellite System (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
For example, the GNSS module 40 may measure a current location of the electronic device 100.
For example, the wireless communication module 41 performs wireless communication with the internet.
For example, the CODEC 42 bidirectionally performs encoding and decoding using a predetermined encoding/decoding method, as shown in fig. 2.
For example, the speaker 43 outputs sound based on the sound data decoded by the CODEC 42.
For example, the microphone 44 outputs sound data to the CODEC 42 based on the input sound.
The display module 45 displays predefined information.
The input module 46 receives an input of a user (operation by the user).
For example, the IMU 47 detects angular velocity and acceleration of the electronic device 100.
The main processor 48 controls the Global Navigation Satellite System (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
The memory 49 stores a program and data necessary for the image signal processor 30 to control the camera module 10 and the distance sensor module 20, acquired image data, and a program and data necessary for the main processor 48 to control the electronic apparatus 100.
For example, the memory 49 includes a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by the main processor 48, implements a method for controlling the electronic device 100. For example, the method includes: controlling, by the image signal processor 30, the camera module 10 and the distance sensor module 20 to acquire a main camera image including a curved page of the open book, and distance depth information of the curved page; estimating, by the image signal processor 30, a curve corresponding to the position of the curved page based on the main camera image and the distance depth information; and obtaining, by the image signal processor 30, an image of the page surface projectively transformed into a plane by projectively transforming the curved page in the main camera image into the plane based on the estimated curve.
The electronic apparatus 100 having the above-described configuration is a mobile phone such as a smartphone in the present embodiment, but may be other types of electronic apparatuses (e.g., a tablet computer, a PDA, or the like) including a camera module.
Next, an example of a method of controlling the electronic apparatus 100 having the above-described configuration and functions will now be described. In particular, the following describes an example of a flow in which the electronic apparatus 100 uses its camera to acquire an image of a page surface spread flat in a plane, based on an image of the curved page of an open book.
Fig. 3 is a diagram showing an example of an overall flow of acquiring an image, which is obtained by transforming a curved page of a page of the book 100 in an open state into a plane, for the electronic apparatus 100 shown in fig. 1 and 2.
First, as shown in fig. 3, the image signal processor 30 controls the camera module 10 and the distance sensor module 20 to capture a main camera image including a curved page of the open book, together with ToF depth information of the curved page (step S1).
Next, as shown in fig. 3, the image signal processor 30 performs a top view process (step S2). The top view process estimates the fold position of the page based on the main camera image and the ToF depth information, and estimates a curve, perpendicular to the fold position, corresponding to the position of the curved page.
Next, as shown in fig. 3, the image signal processor 30 performs an image correction process (step S3). The image correction process projectively transforms the curved page in the main camera image into a plane based on the estimated curve. Thereby, the image signal processor 30 acquires an image of the page surface projectively transformed into a plane.
Next, as shown in fig. 3, the image signal processor 30 determines whether additional photographing is necessary, for example of another curved page of the open book (step S4).
If additional photographing is required, the image signal processor 30 returns to step S1 and captures a main camera image including the curved page of another page of the open book, together with ToF depth information of that curved page.
In step S2, the image signal processor 30 estimates another curve corresponding to the position of the curved page of the other page based on the main camera image and the ToF depth information.
Then, in step S3, the image signal processor 30 projectively transforms the curved page of the other page in the main camera image into a plane based on the estimated other curve. Thereby, the image signal processor 30 acquires an image of the surface of that page transformed into the plane.
If the image signal processor 30 determines in step S4 that additional photographing is not required, the image signal processor 30 synthesizes the acquired planar images of the two page surfaces. Thereby, the image signal processor 30 acquires an image of the surfaces of the two pages of the open book flattened into a plane (step S5).
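The flow of steps S1 to S5 can be sketched as follows; every function here is a simplified, hypothetical stub standing in for the processing described above, not an API from the disclosure.

```python
# Hypothetical stubs for the steps in fig. 3: capture (S1), top view
# process (S2), image correction (S3), repeated per page (S4), and
# synthesis of the two flattened pages (S5).

def capture_main_image_and_tof(page):
    """Step S1: capture a main camera image and ToF depth (stubbed)."""
    return f"image-{page}", f"tof-{page}"

def estimate_page_curve(image, tof):
    """Step S2: estimate the curve of the bent page (stubbed)."""
    return f"curve({image},{tof})"

def flatten_page(image, curve):
    """Step S3: projectively transform the page onto a plane (stubbed)."""
    return f"flat-{image}"

def scan_open_book(num_pages: int = 2) -> str:
    flat_pages = []
    for page in range(num_pages):      # step S4: repeat for the other page
        image, tof = capture_main_image_and_tof(page)
        curve = estimate_page_curve(image, tof)
        flat_pages.append(flatten_page(image, curve))
    return " + ".join(flat_pages)      # step S5: synthesize the spread
```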
The present disclosure assumes the following preconditions (a) to (d). (a) The book curves in almost the same way from top to bottom (along the fold direction). (b) The binding edge (fold position) of the book runs roughly vertically through the center, the left end, or the right end of the screen. (c) The detection resolution of the distance sensor module 20 is lower than that of the camera module 10. (d) The distance to the surface of the book can be detected by the distance sensor module 20.
Here, an example of a flow for acquiring the main camera image and the ToF depth information shown in fig. 3 will be described below.
Fig. 4 is a diagram showing a specific example of step S1 for acquiring the main camera image and the ToF depth information shown in fig. 3. Fig. 5 is a diagram showing an example of display on the display module 45 of the electronic apparatus 100 when the fold position is designated.
As shown in fig. 4, the image signal processor 30 sets a fold position specification frame 45a, which is specified by the user, on a page of the open book in an image captured by the camera module 10 (step S11).
The image signal processor 30 displays the fold position specification frame 45a on the display module 45 together with the main camera image captured by the camera module 10.
The image signal processor 30 sets the fold position specification frame 45a at a position specified by the user on the curved page of the main camera image, in response to a user operation concerning the fold position specification frame 45a that is input to the input module 46.
For example, as shown in fig. 5, the fold position specification frame 45a is initially displayed at the center of the display module 45 or at the previously used position. When the user wants to specify another position, the user touches the desired position. Alternatively, nothing may be displayed at first, and the fold position specification frame 45a may then be displayed at the position the user touches on the display module 45.
In addition, when the user operates the frame direction specification button B1, the fold position specification frame 45a rotates between the vertical and horizontal orientations in the display module 45. The input module 46 includes the frame direction specification button B1.
In addition, for example, as shown in fig. 5, the fold position specification frame 45a has a mark (an arrow in this example) indicating the vertical direction.
For example, when the page is displayed upside down in the display module 45, the frame is rotated to the opposite direction by the user's operation of the frame direction specification button B1.
For example, when the user touches and drags an end of the fold position specification frame 45a, the frame may rotate about its center position.
For example, when the user touches the display module 45 with two fingers, the fold position specification frame 45a is set along the direction connecting the touched points in the display module 45.
Then, as shown in fig. 4, the image signal processor 30 captures a main camera image including the opened and curved page, on which the fold position specification frame 45a has been set, by photographing the curved page with the camera module 10 while the book is open (step S12).
For example, as shown in fig. 5, when the user operates the shutter button of the input module 46, the camera module 10 photographs the opened and curved page of the book displayed on the display module 45.
The image signal processor 30 acquires a main camera image including the opened and curved page by photographing the curved page with the camera module 10 while the book 101 is open.
The image signal processor 30 acquires ToF depth information of the curved page by irradiating the curved page with pulsed light from the distance sensor module 20.
Next, an example of a flow for executing the top view process shown in fig. 3 will be described below.
Fig. 6 is a diagram showing a specific example of step S2 for executing the top view process shown in fig. 3.
First, as shown in fig. 6, the image signal processor 30 performs projective transformation of the point cloud data into data as if captured from the front (step S21).
Then, the image signal processor 30 estimates the fold position of the projection-transformed point cloud data, and rotates the point cloud data so that the estimated fold position is parallel to a preset reference direction (step S22).
Then, as shown in fig. 6, the image signal processor 30 estimates the inclination of the ridge of the rotated point cloud data and rotates the data so that the ridge is parallel to a preset reference plane (step S23).
Then, the image signal processor 30 estimates a curve corresponding to the position of the curved page of the sheet (step S24).
In this way, as shown in fig. 6, in the top view processing, the image signal processor 30 estimates a curve corresponding to the position of the curved page in a plane (XY plane) perpendicular to the folding direction (z-axis direction) of the open folding position of the page based on the main camera image and the ToF depth information (steps S22 to S24).
Fig. 7A is a diagram showing an example of reference point cloud data P of an image of a curved page imaged from an oblique direction with a book open. Fig. 7B is a diagram showing an example of first point cloud data P1 obtained by projective transformation as if taken from the depth direction (from the front).
As shown in fig. 7A, the image signal processor 30 acquires reference point cloud data of a curved page of a book page based on the main camera image and ToF depth information.
Then, the image signal processor 30 calculates a normal vector by applying principal component analysis to the reference point cloud data.
Then, as shown in fig. 7B, the image signal processor 30 performs a projective transformation T of the reference point cloud data P into the first point cloud data P1 of the main camera image taken from the depth direction (front) based on the calculated normal vector.
In this way, principal component analysis is applied to the reference point cloud data P taken from an oblique angle to obtain a normal vector, and the projective transformation T yields data as if taken from the front (y-z plane).
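The normal estimation described above can be sketched as follows. This is a simplified stand-in, not the patent's implementation: the plane normal is taken from a principal component analysis (smallest-variance direction), and a rigid rotation, rather than the full projective transformation T, brings the cloud face-on to the depth (x) axis. The function name is hypothetical.

```python
import numpy as np

def front_align(points):
    """Rotate a point cloud so its best-fit plane faces the viewer.

    points: (N, 3) array with x as the depth axis. Returns the
    transformed cloud and the rotation used — a rotation standing in
    for the projective transformation T in the text.
    """
    centered = points - points.mean(axis=0)
    # Principal component analysis: the singular vector with the
    # smallest singular value is the best-fit plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    if normal[0] < 0:               # orient the normal toward the camera (+x)
        normal = -normal
    target = np.array([1.0, 0.0, 0.0])   # depth direction
    v = np.cross(normal, target)
    c = normal @ target
    # Rodrigues' formula for the rotation taking `normal` onto `target`.
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    rot = np.eye(3) + vx + vx @ vx / (1.0 + c)
    return centered @ rot.T, rot
```

After alignment, a planar cloud has essentially zero spread along the depth axis, which is the precondition for the line-scan steps that follow.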
Fig. 8A is a diagram showing an example of scanning along a plurality of lines L1 with respect to the first dot cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the fold position specification frame. Fig. 8B is a diagram showing an example of second point cloud data P2 obtained by performing a first rotation R of the first point cloud data such that the estimated crease position D2 is parallel to the preset reference direction (z-axis direction).
As shown in fig. 8A, the image signal processor 30 scans the first point cloud data P1 along a plurality of lines L1 in a direction (the short direction) perpendicular to the longitudinal direction D1 of the fold position specification frame 45a.
Then, the image signal processor 30 calculates the slope of the valley of the first point cloud data P1 by applying the least square method to the scanned first point cloud data P1.
As shown in fig. 8A, the image signal processor 30 estimates a crease position (valley position) M based on the calculated slope of the valley.
Then, as shown in fig. 8B, the image signal processor 30 acquires second point cloud data P2 by performing a first rotation R on the first point cloud data P1 such that the estimated fold position M is parallel to a preset reference direction (z-axis direction).
Further, when the image signal processor 30 scans the first point cloud data P1 along the plurality of lines L1, the image signal processor 30 extracts data within a predetermined range, which is data for the least square method, from the first point cloud data P1.
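The scan-and-fit sequence above (estimate a valley sample per scan line, then fit the crease by least squares) can be sketched as follows, under simplifying assumptions: the deepest point in each scan band is taken as the valley sample, and a straight line is fitted with `np.polyfit`. The function name and the band-scanning scheme are illustrative, not the patent's exact procedure.

```python
import numpy as np

def estimate_crease(points, n_lines=20):
    """Estimate the crease (valley) line of a front-facing page cloud.

    points: (N, 3) array with x = depth, y = across the fold,
    z = along the fold. The cloud is scanned in bands along z; the
    deepest point of each band is taken as a valley sample, and a line
    y = slope * z + intercept is fitted by least squares.
    """
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_lines + 1)
    ys, zs = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = points[(z >= lo) & (z < hi)]
        if len(band) == 0:
            continue
        deepest = band[np.argmax(band[:, 0])]  # valley = largest depth
        ys.append(deepest[1])
        zs.append(deepest[2])
    slope, intercept = np.polyfit(zs, ys, 1)   # least-squares line fit
    return slope, intercept
```

Rotating the cloud about the depth axis by -arctan(slope) then makes the estimated crease parallel to the z-axis, corresponding to the first rotation R in fig. 8B.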
Fig. 9A is a diagram showing an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (z-axis direction). Fig. 9B is a diagram showing an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2. Fig. 9C is a diagram showing an example of third point cloud data P3 obtained by subjecting the second point cloud data P2 to a second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (z-y plane).
For example, as shown in fig. 9A, the image signal processor 30 scans the second point cloud data P2 along a plurality of lines L2 in a direction perpendicular to the reference direction (z-axis direction).
Then, for example, as shown in fig. 9B, the image signal processor 30 calculates the slope of the ridge N in the reference direction of the scanned second point cloud data by applying the least square method to the scanned second point cloud data.
Then, for example, as shown in fig. 9C, the image signal processor 30 acquires third point cloud data P3 by performing a second rotation Q of the second point cloud data P2 such that the calculated slope of the ridge N is parallel to a preset reference plane (z-y plane).
The coordinate transformation from the original reference point cloud data P to the third point cloud data P3 is "QRT".
Fig. 10A is a diagram showing an example of scanning with respect to the third point cloud data P3 along a plurality of lines L3 in the reference direction (z-axis direction). Fig. 10B is a diagram showing an example of the average value a in the depth direction (x-axis direction) of the third point cloud data P3 in the vicinity of the plurality of lines L3. Fig. 10C is a diagram showing an example of a curve E obtained by approximating the average value a calculated with a polynomial of fourth order or higher in the direction perpendicular to the reference direction (y-axis direction) and the depth direction (x-axis direction).
For example, as shown in fig. 10A, the image signal processor 30 scans the third point cloud data P3 along a plurality of lines L3 in the reference direction (z).
Then, for example, as shown in fig. 10B, the image signal processor 30 calculates an average value a of the third point cloud data P3 in the depth direction (x-axis direction) in the vicinity of the plurality of lines L3.
Then, for example, as shown in fig. 10C, the image signal processor 30 calculates a curve E approximated by a fourth-order or higher polynomial (1) in the direction perpendicular to the reference direction (y-axis direction) and the depth direction (x-axis direction) based on the average value a.
x = a₄y⁴ + a₃y³ + a₂y² + a₁y + b (1)
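The scan, average, and fit steps of figs. 10A to 10C can be sketched with NumPy's least-squares polynomial fit. The bin-averaging scheme (standing in for "the vicinity of the lines L3") and the function name are assumptions for illustration:

```python
import numpy as np

def fit_page_curve(points, n_bins=40, order=4):
    """Fit x = a4*y^4 + a3*y^3 + a2*y^2 + a1*y + b to a point cloud.

    points: (N, 3) array with x = depth, y = across the fold. Depth
    values are averaged in bins along y, then a polynomial of the given
    order is fitted by least squares, as in equation (1).
    Returns coefficients, highest order first.
    """
    y, x = points[:, 1], points[:, 0]
    edges = np.linspace(y.min(), y.max(), n_bins + 1)
    centers, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (y >= lo) & (y < hi)
        if not sel.any():
            continue
        centers.append(0.5 * (lo + hi))
        means.append(x[sel].mean())   # average value in the depth direction
    return np.polyfit(centers, means, order)
```

A fourth-order polynomial is flexible enough to follow the gentle bow of a page on either side of the crease while remaining smooth, which is presumably why the text requires "fourth order or higher".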
Next, an example of a flow for executing the division process shown in fig. 3 will be described below.
Fig. 11 is a diagram showing a specific example of step S25 for executing the division process shown in fig. 6. Figs. 12A to 12C are diagrams showing an example in which the curve E is divided at points so that the error between the points of the curve E approximated by a polynomial falls within an allowable range; fig. 12B continues from fig. 12A, and fig. 12C continues from fig. 12B.
For example, as shown in fig. 11 and 12A, the image signal processor 30 calculates an error between points in the division process (step S251).
Next, as shown in fig. 11, the image signal processor 30 determines whether the error between the points is within the allowable range (step S252).
Then, as shown in figs. 11, 12B, and 12C, when the error between the points exceeds the allowable range, the image signal processor 30 divides the curve at the points where the error exceeds the range (step S253).
On the other hand, the image signal processor 30 ends the dividing process when the error between the points falls within the allowable range.
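The division loop of steps S251 to S253 can be sketched as a recursive subdivision: each segment is split at its midpoint until the deviation between the fitted curve and the chord falls within the allowable range. The midpoint-error criterion and function name are assumptions, not the patent's exact rule.

```python
import numpy as np

def divide_curve(poly, y0, y1, tol, out=None):
    """Divide [y0, y1] until each chord tracks the curve within tol.

    poly: np.poly1d giving the fitted page curve x(y). Returns the
    sorted list of breakpoints, endpoints included.
    """
    if out is None:
        out = [y0]
    ym = 0.5 * (y0 + y1)
    chord_mid = 0.5 * (poly(y0) + poly(y1))
    if abs(poly(ym) - chord_mid) > tol:    # error exceeds the range: divide
        divide_curve(poly, y0, ym, tol, out)
        divide_curve(poly, ym, y1, tol, out)
    else:
        out.append(y1)
    return out
```

Flatter stretches of the page end up with fewer, longer segments, while the tightly curved region near the crease is divided more finely.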
Next, an example of a flow for executing the image correction processing shown in fig. 3 will be described below.
Fig. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in fig. 3. Fig. 14A is a diagram showing an example of dividing the third point cloud data P3 into a plurality of rectangular areas for each section obtained by dividing the curve E. Fig. 14B is a diagram showing an example of coordinates obtained by inverse-transforming the third point cloud data P3 into the projection space of the reference point cloud data P. Fig. 15A is a diagram showing an example of a plurality of rectangular regions J into which the third point cloud data P3 is divided for each section obtained by dividing the curve E. Fig. 15B is a diagram showing an example of a plurality of rectangular regions G obtained in a two-dimensional space (u, v) from the plurality of rectangular regions J in a three-dimensional space (x, y, z).
Fig. 16 is a diagram showing an example of the relationship between coordinates when the point cloud data is expanded on a plane and coordinates on the captured image.
First, as shown in fig. 13, in the image correction process, the image signal processor 30 sets an area for executing the process (step S31).
For example, as shown in fig. 14A, the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas for each section obtained by dividing the curve E.
For example, the coordinates of the third point cloud data P3 are transformed by the inverse transformation T⁻¹R⁻¹Q⁻¹ into the projection space coordinates of the original reference point cloud data P (fig. 14B).
[Equation (2) is presented as an image in the original publication.]
Then, as shown in step S32 in fig. 13, the image signal processor 30 estimates a transformation matrix for expanding the three-dimensional spatial plane (fig. 15A) to the two-dimensional spatial plane (fig. 15B).
For example, as shown in fig. 15A, the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular regions J for each section obtained by dividing a curve.
Then, as shown in fig. 15B, the image signal processor 30 transforms a plurality of rectangular regions J obtained by dividing the third point cloud data P3 in the three-dimensional space (x, y, z) into the two-dimensional space (u, v) to obtain a plurality of rectangular regions G expanded in the two-dimensional space (u, v).
For example, the width of the rectangular region G shown in fig. 15B is represented by equation (3). In addition, the length of the rectangular region G shown in fig. 15B is represented by equation (4).
[Equation (3) is presented as an image in the original publication.]
|z₁ − z₀| = |v₁ − v₀| (4)
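One reading of this mapping can be sketched as follows. Since the image for equation (3) is not reproduced here, the width formula below, the in-plane Euclidean distance in x-y, is an assumption; the length formula follows equation (4) as given. The function name is illustrative.

```python
import numpy as np

def develop_strip_edge(p0, p1):
    """Map one edge of a rectangular strip from 3D (x, y, z) to 2D (u, v).

    Width |u1 - u0| is taken as the in-plane distance in the x-y plane
    (assumed form of equation (3)); length |v1 - v0| preserves z, per
    equation (4): |z1 - z0| = |v1 - v0|.
    """
    width = float(np.hypot(p1[0] - p0[0], p1[1] - p0[1]))
    length = abs(p1[2] - p0[2])
    return width, length
```

Preserving the in-plane distance as the developed width is what keeps text near the crease from appearing compressed after flattening.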
Then, the image signal processor 30 calculates a transformation matrix for each area from the relationship between the coordinates when developed on the plane and the coordinates on the captured image.
As shown in equation (5), the offset values Ou and Ov and the enlargement/reduction ratio k are obtained by calibration so that the coordinates, after distance-normalized enlargement/reduction and addition of the offset, match the distortion-corrected camera image.
[Equations (5) and (6) are presented as images in the original publication.]
Two equations can be created from one corresponding point, while there are nine unknowns (equation (7)).
[Equation (7) is presented as an image in the original publication.]
Formula (8) is obtained from formula (6) and formula (7).
[Equation (8) is presented as an image in the original publication.]
Then, in equation (8), if a value (for example, "1") is assumed for "c", the number of unknowns becomes eight (equation (9)).
[Equation (9) is presented as an image in the original publication.]
row3 = −u′·row1 − v′·row2 (10)
Then, in equation (9), when the relationship of equation (10) holds, equation (11) is obtained.
[Equation (11) is presented as an image in the original publication.]
Therefore, if the correspondence between four points in the coordinates when developed on the plane and the coordinates on the captured image is known, the unknowns can be obtained.
That is, as shown in equation (12), the transformation matrix is calculated by solving the equation from a combination of known points.
[Equation (12) is presented as an image in the original publication.]
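The derivation in equations (5) to (12) is the standard estimation of a 3×3 projective transformation (homography) from four point correspondences: each correspondence yields two linear equations, and fixing the last matrix entry to 1 leaves eight unknowns. A sketch under that reading (function names assumed):

```python
import numpy as np

def homography_from_4_points(src, dst):
    """3x3 projective transform mapping each src point to its dst point.

    src, dst: sequences of four (u, v) / (x, y) corresponding pairs.
    With the last matrix entry fixed to 1 there are eight unknowns, so
    four non-degenerate correspondences determine the matrix exactly.
    """
    a_rows, b_rows = [], []
    for (u, v), (x, y) in zip(src, dst):
        # x = (h00*u + h01*v + h02) / (h20*u + h21*v + 1), linearized:
        a_rows.append([u, v, 1, 0, 0, 0, -u * x, -v * x])
        b_rows.append(x)
        a_rows.append([0, 0, 0, u, v, 1, -u * y, -v * y])
        b_rows.append(y)
    h = np.linalg.solve(np.array(a_rows, float), np.array(b_rows, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(h, pt):
    """Apply h to a 2D point via homogeneous coordinates."""
    q = h @ np.array([pt[0], pt[1], 1.0])
    return q[:2] / q[2]
```

In the image correction step, one such matrix per rectangular region maps the region's four corners in the developed plane to its four corners in the captured image.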
Next, the image signal processor 30 performs projective transformation so that the curved page in the main camera image becomes a plane, based on the relationship between the coordinates of the plurality of rectangular areas and the coordinates of the projection space of the reference point cloud data P (step S33 of fig. 13).
Fig. 17 is a diagram showing an example of an image obtained by performing projective transformation for each corresponding rectangular region so that the curved page in the main camera image becomes a plane.
For example, as shown in fig. 17, the image signal processor 30 performs projective transformation for each corresponding rectangular region so that the curved page in the main camera image becomes a plane.
Then, the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed into a plane by the projective transformation.
Then, the image signal processor 30 determines whether the entire region of the third point cloud data P3 has been processed (step S34 in fig. 13).
Then, when the image signal processor 30 has not yet processed the entire area of the third point cloud data P3, the process returns to step S31 in fig. 13.
On the other hand, when the image signal processor 30 has processed the entire area of the third point cloud data P3, the image correction process ends.
Fig. 18 is a diagram showing an example of images of the surfaces of two pages 200 and 201 that have been projected and transformed into planes.
As shown in fig. 18, the image signal processor 30 synthesizes the two images of the surfaces of the two pages of the acquired plane, and acquires the images of the surfaces of the two pages of the plane in a state where the book is opened.
Thus, using the camera of the electronic apparatus 100 such as a smartphone, a camera image of the page surfaces spread into a plane can be acquired from a camera image of the curved pages of an opened book.
As described above, according to the present disclosure, a flattened image of the pages can be obtained by simply photographing an open book. The technology can be provided at low cost, for example by being sold as a smartphone application, and does not require large equipment.
In the description of the embodiments of the present disclosure, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise" refer to the orientation or position as shown or described in the drawing under discussion. These relative terms are merely intended to simplify the description of the present disclosure and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and therefore they cannot be construed as limiting the present disclosure.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, features defined as "first" or "second" may include one or more of such features. In the description of the present disclosure, "a plurality" means "two or more" unless otherwise defined.
In the description of the embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as meaning a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium; there may be communication between the interior of the two elements as will be understood by those skilled in the art depending on the particular situation.
In the present disclosure, unless otherwise specified and limited, "above" or "below" a first feature may include the first and second features being in direct contact, and may also include the first and second features not being in direct contact but being in contact with each other through another feature therebetween. Also, a first feature "on," "above," and "over" a second feature may include the first feature being directly above and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature may include the first feature being directly under and obliquely below the second feature, or merely indicating that the first feature is at a lower level than the second feature.
In the above description, numerous different embodiments or examples are provided to implement different structures of the present disclosure. Specific components and arrangements are described above to simplify the present disclosure. Of course, they are merely examples and are not intended to limit the present disclosure. Moreover, the present disclosure may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of brevity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements. Further, the present disclosure provides examples of different processes and materials, but one of ordinary skill in the art will recognize that other processes and/or other materials may be applied.
Reference throughout this specification to "an embodiment," "some embodiments," "example embodiments," "an example," "a specific example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the foregoing terms in this specification are not necessarily referring to the same embodiment or example of the disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps (e.g., a particular sequence of executable instructions for implementing logical functions) represented in the flowcharts or otherwise described herein may be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device executing the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples of the computer readable medium include, but are not limited to: an electronic connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, the steps or methods may be implemented using any one or combination of the following technologies, which are well known in the art: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the above-described exemplary methods of the present disclosure may be implemented by using program commands to associated hardware. The program may be stored in a computer readable storage medium and when run on a computer comprises one or a combination of the steps in the method embodiments of the present disclosure.
Furthermore, the functional units of the embodiments of the present disclosure may be integrated in the processing module, or these units may exist separately physically, or two or more units are integrated in the processing module. The integration module may be implemented in the form of hardware or in the form of a software functional module. When the integrated module is implemented in the form of a software functional module and sold or used as a separate product, the integrated module may be stored in a computer-readable storage medium.
The storage medium may be a read-only memory, a magnetic disk, a CD, etc.
Although embodiments of the present disclosure have been shown and described, it will be understood by those skilled in the art that these embodiments are illustrative and not to be construed as limiting the present disclosure, and that changes, modifications, substitutions and alterations may be made to these embodiments without departing from the scope of the present disclosure.

Claims (17)

1. An electronic device, comprising:
a camera module that takes a picture of an object to obtain a main camera image;
a distance sensor module acquiring distance depth information of the object by using light; and
an image signal processor that controls the camera module and the distance sensor module to acquire a camera image based on the main camera image and the distance depth information, wherein:
the image signal processor controls the camera module and the distance sensor module to acquire the main camera image including a curved page of a page in a state where a book is open and distance depth information that is distance depth information of the curved page,
the image signal processor estimates a curve corresponding to a position of the curved page based on the main camera image and the distance depth information, an
The image signal processor obtains an image of a surface of the page that has been projectively transformed into a plane by projectively transforming a curved page of the page in the main camera image into a plane based on the estimated curve.
2. The electronic device of claim 1, wherein the electronic device,
wherein the distance sensor module emits pulsed light to the object and detects reflected light of the pulsed light reflected from the object, thereby acquiring time-of-flight (ToF) depth information as the distance depth information.
3. The electronic apparatus according to claim 2, wherein the image signal processor obtains a main camera image including the curved page by taking a picture of the curved page that has been opened and curved using the camera module, and
wherein the image signal processor acquires ToF depth information of the curved page through the distance sensor module.
4. The electronic device according to claim 2, wherein the image signal processor estimates the curve corresponding to the position of the curved page in a plane perpendicular to a crease direction of a position of an open crease of the page based on the main camera image and the ToF depth information.
5. The electronic device according to claim 2, wherein the image signal processor obtains a main camera image including a curved page of another page in a state in which the book is open, and ToF depth information of the curved page of the other page,
wherein the image signal processor estimates another curve corresponding to the position of the curved page of the other page based on the main camera image and the ToF depth information,
wherein the image signal processor obtains an image of a surface of the other page, which has been projectively transformed into a plane, by projectively transforming the curved page of the other page in the main camera image into a plane, based on the estimated other curve, and
wherein the image signal processor synthesizes the acquired images of the surfaces of the two pages of the plane, and acquires the images of the surfaces of the two pages of the plane in a state in which the book is opened.
6. The electronic apparatus according to claim 3, wherein the image signal processor sets a fold position specification frame that is specified by a user on a page opened in an image taken by the camera module in a state in which the book is opened, and
wherein the image signal processor acquires the main camera image including the curved page by taking an image of the curved page of the page that has been opened and curved with the camera module in a state in which the book is opened, the fold position specification frame being set in the main camera image.
7. The electronic device of claim 6, further comprising:
a display module that displays predetermined information;
an input module that receives an operation of a user; and
a main processor controlling the display module and the input module,
wherein the image signal processor displays the fold position specification frame on the display module together with the main camera image captured by the camera module, and
wherein the image signal processor sets the fold position specification frame at a position specified by the user on the curved page of the main camera image in response to an operation input to the input module by the user in relation to the fold position specification frame instruction.
8. The electronic device of claim 7, wherein the image signal processor obtains reference point cloud data for the curved page based on the main camera image and the ToF depth information,
wherein the image signal processor calculates a normal vector by applying principal component analysis to the reference point cloud data, and
wherein the image signal processor performs a projective transformation on the basis of the calculated normal vector, projectively transforming the reference point cloud data into first point cloud data of the main camera image taken from a depth direction.
9. The electronic device according to claim 8, wherein the image signal processor scans the first point cloud data along a plurality of lines in a direction perpendicular to a longitudinal direction of the fold position specification frame,
wherein the image signal processor calculates a slope of a valley of the scanned first point cloud data by applying a least square method to the first point cloud data,
wherein the image signal processor estimates a crease position based on the calculated slope of the valley, and
wherein the image signal processor obtains second point cloud data by performing a first rotation on the first point cloud data such that the estimated fold position is parallel to a preset reference direction.
10. The electronic device of claim 9, wherein the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to a reference direction,
wherein the image signal processor calculates a slope of a ridge in a reference direction of the scanned second point cloud data by applying a least square method to the scanned second point cloud data,
wherein the image signal processor acquires third point cloud data by performing a second rotation on the second point cloud data such that the calculated slope of the ridge is parallel to a preset reference plane.
11. The electronic device of claim 10, wherein the image signal processor scans the third point cloud data along a plurality of lines in the reference direction,
wherein the image signal processor calculates an average value of third point cloud data in the depth direction in the vicinity of the plurality of lines, and
wherein the image signal processor approximates the calculated average to a curve in a direction perpendicular to the reference direction and in a depth direction by a polynomial of fourth order or higher.
12. The electronic apparatus according to claim 11, wherein the image signal processor divides the third point cloud data into a plurality of rectangular areas for each section obtained by dividing a curve, and
wherein the image signal processor obtains an image of the surface of the page of the book, which has been projectively transformed into a plane, by projectively transforming the curved page in the main camera image into a plane, based on the relationship between the coordinates of the plurality of rectangular areas and the coordinates of the projection space of the reference point cloud data.
13. The electronic device of claim 1, wherein a detection resolution of the distance sensor module is lower than a detection resolution of the camera module.
14. A method for controlling an electronic device, the electronic device comprising: a camera module that takes a picture of an object to obtain a main camera image; a distance sensor module acquiring distance depth information of the object by using light; and an image signal processor controlling the camera module and the distance sensor module to acquire a camera image based on the main camera image and the distance depth information,
the method comprises the following steps:
controlling, by the image signal processor, the camera module and the distance sensor module to acquire the main camera image including a curved page of a page in a state in which a book is open and distance depth information of the curved page;
estimating, by the image signal processor, a curve corresponding to a position of the curved page based on the main camera image and the distance depth information; and
obtaining, by the image signal processor, an image of a surface of the page that has been projectively transformed into a plane by projectively transforming a curved page of the page in the main camera image into a plane based on the estimated curve.
15. The method as set forth in claim 14, wherein,
wherein the distance sensor module emits pulsed light to the object and detects reflected light of the pulsed light reflected from the object, thereby acquiring time-of-flight (ToF) depth information as the distance depth information.
16. A computer-readable storage medium, on which a computer program is stored, wherein the computer program realizes a method for controlling an electronic device when executed by a processor, and the method comprises:
acquiring, by an image signal processor, a main camera image and distance depth information by controlling a camera module and a distance sensor module, the main camera image including a curved page of a book in an open state, the distance depth information being distance depth information of the curved page;
estimating, by the image signal processor, a curve corresponding to a position of the curved page based on the main camera image and the distance depth information; and
obtaining, by the image signal processor, an image of the surface of the page that has been projectively transformed into a plane, by projectively transforming the curved page in the main camera image into a plane based on the estimated curve.
17. The computer-readable storage medium of claim 16,
wherein the distance depth information is time-of-flight (ToF) depth information.
CN202080093632.0A 2020-02-07 2020-02-07 Electronic device, method of controlling electronic device, and computer-readable storage medium Pending CN114982214A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/074508 WO2021155575A1 (en) 2020-02-07 2020-02-07 Electric device, method of controlling electric device, and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114982214A 2022-08-30

Family

ID=77199693

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080093632.0A Pending CN114982214A (en) 2020-02-07 2020-02-07 Electronic device, method of controlling electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114982214A (en)
WO (1) WO2021155575A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150116542A1 (en) * 2013-10-29 2015-04-30 Samsung Electronics Co., Ltd. Electronic apparatus for making bokeh image and method thereof
CN105872291A (en) * 2016-05-31 2016-08-17 大连成者科技有限公司 Intelligent internet high-definition scanner with laser correcting function
CN109089047A * 2018-09-29 2018-12-25 Oppo广东移动通信有限公司 Focusing control method and apparatus, storage medium, and electronic device
CN109726614A * 2017-10-27 2019-05-07 北京小米移动软件有限公司 3D stereoscopic imaging method and device, readable storage medium, and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5497236A (en) * 1993-06-23 1996-03-05 Ricoh Company Ltd. Method and apparatus for distortion correction of scanned images
CN102833460B (en) * 2011-06-15 2015-03-11 富士通株式会社 Image processing method, image processing device and scanner
US9317893B2 (en) * 2013-03-26 2016-04-19 Sharp Laboratories Of America, Inc. Methods and systems for correcting a document image
CN104835119A (en) * 2015-04-23 2015-08-12 天津大学 Method for positioning base line of bending book cover
CN105979117B (en) * 2016-04-28 2018-11-27 大连成者科技有限公司 Bending page image flattening method based on laser rays
CN110533769B (en) * 2019-08-20 2023-06-02 福建捷宇电脑科技有限公司 Flattening method and terminal for open book image
CN110519480B (en) * 2019-09-21 2021-05-07 深圳市本牛科技有限责任公司 Curved surface flattening method based on laser calibration

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150116542A1 (en) * 2013-10-29 2015-04-30 Samsung Electronics Co., Ltd. Electronic apparatus for making bokeh image and method thereof
CN105872291A (en) * 2016-05-31 2016-08-17 大连成者科技有限公司 Intelligent internet high-definition scanner with laser correcting function
CN109726614A * 2017-10-27 2019-05-07 北京小米移动软件有限公司 3D stereoscopic imaging method and device, readable storage medium, and electronic device
CN109089047A * 2018-09-29 2018-12-25 Oppo广东移动通信有限公司 Focusing control method and apparatus, storage medium, and electronic device

Also Published As

Publication number Publication date
WO2021155575A1 (en) 2021-08-12

Similar Documents

Publication Publication Date Title
JP5670476B2 (en) Image capture device with tilt or perspective correction capability
JP4529837B2 (en) Imaging apparatus, image correction method, and program
JP5109803B2 (en) Image processing apparatus, image processing method, and image processing program
JP4435145B2 (en) Method and apparatus for providing panoramic image by calibrating geometric information
US9516223B2 (en) Motion-based image stitching
JP4010754B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
KR102452564B1 (en) Apparatus and method for estimating optical image stabilization motion
KR102452575B1 (en) Apparatus and method for compensating variation of images caused by optical image stabilization motion
CN109690568A Processing method and mobile device
CN112055869A (en) Perspective distortion correction for face
CN107222737B Depth image data processing method and mobile terminal
CN114286068B (en) Focusing method, focusing device, storage medium and projection equipment
WO2021031709A1 (en) Imaging method, imaging device, and electronic equipment
CN114286066A (en) Projection correction method, projection correction device, storage medium and projection equipment
CN114982214A (en) Electronic device, method of controlling electronic device, and computer-readable storage medium
KR20180065666A (en) Apparatus and method for processing image of vehicle
CN117063198A (en) Electronic device, method of controlling electronic device, and computer-readable storage medium
JP2004165944A (en) Method and device for projection information correction, program, and recording medium
CN112541506A (en) Method, device, equipment and medium for correcting text image
JP2016111521A (en) Information processing device, information processing program and information processing method
KR20140132452A (en) electro device for correcting image and method for controlling thereof
KR101595960B1 (en) 3d image generation method and apparatus performing the same
JP4484037B2 (en) Image processing apparatus, image processing system, imaging apparatus, and image processing method
JP2023546037A (en) image recording device
JP2016224888A (en) Information processing apparatus, coordinate estimation program, and coordinate estimation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination