CN115144870A - Image shooting method, device, terminal and storage medium - Google Patents


Info

Publication number: CN115144870A
Application number: CN202110344438.1A
Authority: CN (China)
Prior art keywords: information, pixel, distance, image, image information
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 陈朝喜, 李志武
Current Assignee: Beijing Xiaomi Mobile Software Co Ltd
Original Assignee: Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110344438.1A
Publication of CN115144870A

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C17/00Compasses; Devices for ascertaining true or magnetic north for navigation or surveying purposes
    • G01C17/02Magnetic compasses
    • G01C17/28Electromagnetic compasses
    • G01C17/32Electron compasses
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to an image shooting method, an image shooting device, a terminal and a storage medium, wherein the method comprises: acquiring image information, azimuth information and distance detection information of a shooting target; and determining a three-dimensional image of the shooting target according to the image information, the azimuth information and the distance detection information. In this method, when an image is shot, the image information, azimuth information and distance detection information of the shooting target are acquired first, and data fusion is then performed on them to obtain a three-dimensional image of the shooting target.

Description

Image shooting method, device, terminal and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an image capturing method and apparatus, a terminal, and a storage medium.
Background
Terminals such as mobile phones have become increasingly widespread, more and more people are used to shooting images with their phones, and users' expectations of phone cameras keep rising.
At present, in addition to storing the image information captured by the camera when a user takes a picture, a mobile phone may record attribute information acquired during shooting and image processing, such as the shooting date and Global Positioning System (GPS) positioning data, in the image information. The user can then view this attribute information when reviewing the image.
However, the content provided by the attribute information is not comprehensive enough, and the user may still be unable to accurately reconstruct the specific details of the shot scene from it, which makes for a poor experience.
Disclosure of Invention
In order to overcome the problems in the related art, the present application provides an image capturing method, an image capturing apparatus, a terminal, and a storage medium.
According to a first aspect of the embodiments of the present application, there is provided an image capturing method applied to a terminal, the method including:
acquiring image information, azimuth information and distance detection information of a shooting target;
and determining a three-dimensional image of the shooting target according to the image information, the azimuth information and the distance detection information.
Optionally, the determining a three-dimensional image of the shooting target according to the image information, the orientation information, and the distance detection information includes:
determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and determining the three-dimensional image according to the pixel distance information, the pixel azimuth information and the image information.
Optionally, the distance detection information includes a distance value matrix denoted as e × f, the pixel resolution of the image information is denoted as a × b, and the determining the pixel distance information of each pixel in the image information according to the distance detection information and the image information includes:
determining a distance value of each pixel in the image information according to the following rules, and using the determined distance value as pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if a is larger than e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b is larger than f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a is less than e, in each row of pixels of the image information, the average distance value of every e/a distance values in the distance value matrix corresponds to one pixel;
if b is less than f, in each column of pixels of the image information, the average distance value of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
Optionally, the orientation information includes an angle range deviating from a set direction, the angle range is denoted as (β, γ), the image information includes j columns of pixel groups, and the determining the pixel orientation information of each pixel in the image information according to the orientation information and the image information includes:
determining pixel orientation information for each pixel in the image information according to the following rules:
in each column of pixel group of the image information, the pixel orientation information of each pixel is the same;
in the image information, the pixel orientation information of the ith column of pixel group is:
β + (γ − β) × i / j
wherein i is an integer greater than or equal to 1 and less than or equal to j.
Optionally, the method further comprises:
and controlling the rate at which the image shooting device acquires the image information, the rate at which the direction detection device detects the direction information, and the rate at which the distance detection device detects the distance detection information to be the same.
Optionally, the method further comprises:
and controlling the start time at which the image shooting device begins acquiring the image information, the start time at which the direction detection device begins detecting the direction information, and the start time at which the distance detection device begins detecting the distance detection information to be the same.
According to a second aspect of the embodiments of the present application, there is provided an image capturing apparatus applied to a terminal, the apparatus including:
the acquisition module is used for acquiring image information, azimuth information and distance detection information of a shooting target;
and the determining module is used for determining the three-dimensional image of the shooting target according to the image information, the direction information and the distance detection information.
Optionally, the determining module is further configured to:
determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and determining the three-dimensional image according to the pixel distance information, the pixel azimuth information and the image information.
Optionally, the distance detection information includes a distance value matrix denoted as e × f, the pixel resolution of the image information is denoted as a × b, and the determining module is further configured to:
determining a distance value of each pixel in the image information according to the following rules, and using the determined distance value as pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if a is larger than e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b is larger than f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a is less than e, in each row of pixels of the image information, the average distance value of every e/a distance values in the distance value matrix corresponds to one pixel;
if b is less than f, in each column of pixels of the image information, the average distance value of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
Optionally, the orientation information includes an angle range deviating from a set direction, the angle range is denoted as (β, γ), the image information includes j columns of pixel groups, and the determining module is further configured to:
determining pixel orientation information for each pixel in the image information according to the following rules:
in each column of pixel group of the image information, the pixel orientation information of each pixel is the same;
in the image information, the pixel orientation information of the ith column of pixel group is:
β + (γ − β) × i / j
wherein i is an integer greater than or equal to 1 and less than or equal to j.
Optionally, the apparatus further comprises:
and the control module is used for controlling the rate at which the image shooting device acquires the image information, the rate at which the direction detection device detects the direction information, and the rate at which the distance detection device detects the distance detection information to be the same.
Optionally, the apparatus further comprises:
and the control module is used for controlling the start time at which the image shooting device begins acquiring the image information, the start time at which the direction detection device begins detecting the direction information, and the start time at which the distance detection device begins detecting the distance detection information to be the same.
According to a third aspect of embodiments of the present application, there is provided a terminal including an image capturing device, an orientation determining device, and a distance detecting device,
wherein the image capturing apparatus is configured to acquire image information of a capturing target; the orientation determining device is configured to detect orientation information of the photographic target; the distance detection device is configured to detect distance detection information of the shooting target relative to the terminal;
the terminal includes:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of the first aspect.
According to a fourth aspect of embodiments herein, there is provided a non-transitory computer readable storage medium having instructions which, when executed by a processor of a terminal, enable the terminal to perform the method of the first aspect.
The technical scheme provided by the embodiments of the application can have the following beneficial effects: when an image is shot, the image information, azimuth information and distance detection information of the shooting target are acquired first, and these data are then fused to obtain a three-dimensional image of the shooting target.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart illustrating an image capturing method according to an exemplary embodiment.
Fig. 2 is a block diagram of an image capturing apparatus according to an exemplary embodiment.
Fig. 3 is a block diagram of a terminal shown in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The disclosure provides an image shooting method applied to a terminal. In this method, when an image is shot, the image information, azimuth information and distance detection information of the shooting target are acquired first, and data fusion is then performed on them to obtain a three-dimensional (3D) image of the shooting target. Through this three-dimensional image, not only can the specific form of the shooting target be better understood, but the geographic azimuth in which the shooting target was located when shot can also be obtained, which enriches the content of the image, improves the user experience, and better meets users' shooting needs.
In one exemplary embodiment, an image photographing method is provided, which is applied to a terminal. Referring to fig. 1, the method includes:
s110, acquiring image information, azimuth information and distance detection information of a shooting target;
S120, determining a three-dimensional image of the shooting target according to the distance detection information, the image information and the azimuth information.
In step S110, the image information may be acquired by an image acquisition device of the terminal, and the image acquisition device may include a camera (camera).
For example, the processor of the terminal may control the camera to capture an image; after the camera captures the image information of the shooting target, it transmits the information to the processor, so the processor obtains the image information of the shooting target.
It should be noted that, because the data volume of the image information acquired by the image acquisition device is large, data is transmitted between the image acquisition device and the processor through the MIPI (Mobile Industry Processor Interface) protocol.
The azimuth information refers to the geographic azimuth of the photographic target, and may be, for example, the offset angle of the photographic target from true north. The azimuth information can be detected by an azimuth detection device of the terminal, which may include an electronic compass.
For example, when an image is shot, the processor of the terminal may control the electronic compass to detect the azimuth information of the shooting direction; it can be understood that the shooting direction of the camera is the direction in which the shooting target is located. For the specific azimuth detection method, reference may be made to the related art, which is not described here. After the electronic compass detects the azimuth information, it transmits the information to the processor, so the processor can obtain the geographic azimuth of the shooting target.
The distance detection information is information obtained when the distance between the shooting target and the terminal is detected. It may be detected by a distance detection device of the terminal, which may include a time-of-flight (ToF) sensor.
For example, when capturing an image, the processor of the terminal may control the time-of-flight sensor to detect the distance of the shooting target from the terminal. After the time-of-flight sensor obtains the distance detection information of the shooting target relative to the terminal, it transmits the information to the processor, so the processor can obtain the distance between the shooting target and the terminal.
The principle by which the time-of-flight sensor detects the distance detection information is as follows: the time-of-flight sensor comprises a signal transmitting end and a plurality of signal receiving ends, and the signal receiving ends form a signal receiving matrix. The signal transmitting end transmits a ranging signal (such as an electromagnetic wave signal), which is reflected back to the time-of-flight sensor after reaching the shooting target. Each signal receiving end can therefore directly receive the ranging signal transmitted by the signal transmitting end, and can also receive the ranging signal reflected by the shooting target. The time interval between the two receptions is recorded as tn, and the distance value measured at that receiving end is then dn = c × tn/2, where c is the propagation speed of the electromagnetic wave signal and n is the number of the signal receiving end; that is, the time interval of the first signal receiving end is t1 and the corresponding distance value is d1, the time interval of the second signal receiving end is t2 and the corresponding distance value is d2, and so on. The plurality of distance values constitute the distance detection information of the photographic subject.
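For illustration, the following minimal Python sketch (not part of the patent) computes dn = c × tn/2 over a matrix of signal receiving ends; the input format and function name are assumptions made for the example.

C = 299_792_458.0  # propagation speed c of the electromagnetic wave signal, in m/s

def distances_from_intervals(intervals_s):
    """Convert the per-receiving-end time intervals tn (in seconds) into
    distance values dn = c * tn / 2 (in meters)."""
    return [[C * t / 2.0 for t in row] for row in intervals_s]

# Example: a 2 x 2 receiving matrix whose round-trip intervals are all about
# 6.67 nanoseconds, i.e. the shooting target is roughly 1 m from the terminal.
intervals = [[6.67e-9, 6.68e-9],
             [6.66e-9, 6.67e-9]]
distance_matrix = distances_from_intervals(intervals)  # values near 1.0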
It should be noted that detecting the distance between the shooting target and the terminal with a matrix time-of-flight sensor requires relatively strong data processing capability. Therefore, in this method, after the time-of-flight sensor detects the distance detection information, the information is transmitted to the processor, and the processor performs calculation and other processing on it to finally obtain the target distance information of the shooting target.
In step S120, after the processor obtains the distance detection information, the image information, and the azimuth information of the shooting target, the processor may draw a three-dimensional image through an image fusion technique, so that the shot three-dimensional image includes both a stereo image of the shooting target and azimuth information (for example, an angle deviating from the north direction) of the shooting target in a geographic space, thereby improving user experience.
When the time-of-flight sensor is used, the distance detection information refers to the raw detection information of the distance detection device, which may include the energy information of the ranging signal received at each signal receiving end. The processor determines, from the energy information, the time at which each signal receiving end received the ranging signal (for example, when the energy characteristic of the received signal matches a preset energy characteristic, that moment is taken as the reception time; reference may be made to the prior art, which is not described here). The processor then determines the time-interval information and calculates the final target distance information, i.e., the information of the distance between the shooting target and the terminal.
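As an illustration only (the patent leaves the energy-characteristic matching to the prior art), the sketch below reduces the matching to a simple threshold test; the sample format, function name and threshold are assumptions.

def reception_time(samples, preset_threshold):
    """samples: (time_s, energy) pairs for one signal receiving end, in
    acquisition order. Returns the first time at which the received energy
    matches the preset characteristic (simplified here to crossing a
    threshold), or None if the ranging signal was not detected."""
    for t, energy in samples:
        if energy >= preset_threshold:
            return t
    return None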
In this method, an image acquisition device, an orientation detection device and a distance detection device are all arranged in the terminal. When an image is shot, the image information, orientation information and distance detection information of the shooting target are obtained at the same time, and data fusion processing is then performed on them to obtain a three-dimensional image of the shooting target. Through this three-dimensional image, the stereoscopic form of the shooting target can be observed from multiple angles and orientations; not only can the specific form of the shooting target be better understood, but the geographic azimuth in which the shooting target was located when shot can also be obtained, which enriches the content of the image and improves the user experience.
In one exemplary embodiment, an image photographing method is provided, which is applied to a terminal. The method for determining the three-dimensional image of the shooting target according to the image information, the azimuth information and the distance detection information comprises the following steps:
s210, determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
s220, determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and S230, determining the three-dimensional image according to the pixel distance information, the pixel direction information and the image information.
In step S210, when the distance detection information is the detection information of the time-of-flight sensor, the processor further needs to determine the target distance information according to the distance detection information, and then determine the pixel distance information of each pixel in the image information, where the pixel distance information represents the distance information between each pixel and the terminal.
When the distance detection information directly indicates information of the distance between the photographing target and the terminal, the distance detection information is directly determined as target distance information.
The pixel resolution of the image information is denoted as a × b, that is, the image information includes a × b pixels; the distance value matrix of the target distance information is denoted as e × f, that is, it includes e × f distance values.
In this method, the distance value of each pixel in the image information may be determined according to the following rule (averaging method), and the determined distance value may be used as the pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if a is larger than e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b is larger than f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a is less than e, in each row of pixels of the image information, the average distance value of every e/a distance values in the distance value matrix corresponds to one pixel;
if b is less than f, in each column of pixels of the image information, the average distance value of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
It should be noted that, when determining the pixel distance information of each pixel, the pixels and the distance values must be matched sequentially, in order.
Example 1:
the terminal is provided with a camera, and the pixel resolution of the camera is 800 × 1600, that is, the resolution of image information obtained by the camera is 800 × 1600. The terminal is also provided with a time-of-flight sensor, and a signal receiving end in the time-of-flight sensor forms an 800 × 1600 matrix, namely, a distance value matrix of the target distance information is marked as 800 × 1600.
In each row of pixels of the image information, each pixel corresponds to a distance value in a matrix of distance values. I.e. a first pixel corresponds to a first distance value, a second pixel corresponds to a second distance value, and so on.
In each column of pixels of the image information, each pixel corresponds to a distance value in the matrix of distance values. I.e. a first pixel corresponds to a first distance value, a second pixel corresponds to a second distance value, and so on.
In this way, a pixel distance value for each pixel may be determined.
Example 2:
the terminal is provided with a camera, and the pixel resolution of the camera is 800 × 1600, that is, the resolution of image information obtained by the camera is 800 × 1600. The terminal is also provided with a time-of-flight sensor, and a signal receiving end in the time-of-flight sensor forms an 8 × 8 matrix, namely, a distance value matrix of the target distance information is marked as 8 × 8.
In each row of pixels of the image information, every 800/8=100 pixels corresponds to one distance value in the matrix of distance values. I.e. the first 100 pixels corresponds to a first distance value, the second 100 pixels corresponds to a second distance value, and so on.
In each column of pixels of the image information, every 1600/8=200 pixels corresponds to one distance value in the distance value matrix. I.e. the first 200 pixels corresponds to the first distance value, the second 200 pixels corresponds to the second distance value, and so on.
In this way, a pixel distance value for each pixel may be determined.
Example 3:
a camera is arranged in the terminal, and the pixel resolution of the camera is denoted as 300 × 600, that is, the resolution of the image information obtained by the camera is 300 × 400. The terminal is also provided with a time-of-flight sensor, and a signal receiving end in the time-of-flight sensor forms a 600 × 800 matrix, namely, a distance value matrix of the target distance information is recorded as 600 × 800.
In each row of pixels of the image information, an average distance value of every 600/300=2 distance values in the distance value matrix corresponds to one pixel. I.e. the average distance value of the first 2 distance values in the distance value matrix is taken as the distance value of the first pixel, the average distance value of the second 2 distance values is taken as the distance value of the second pixel, and so on.
In each column of pixels of the image information, an average distance value of every 800/400=2 distance values in the distance value matrix corresponds to one pixel. I.e. the average distance value of the first 2 distance values in the distance value matrix is taken as the distance value of the first pixel, the average distance value of the second 2 distance values is taken as the distance value of the second pixel, and so on.
In this way, a pixel distance value for each pixel may be determined.
It should be noted that the ranging resolution of current time-of-flight sensors is much lower than the pixel resolution of cameras, so each distance value in the target distance information typically corresponds to multiple pixels in the image information.
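As an illustration, the following Python sketch implements the mapping rules above for the case where one of a and e divides the other (and likewise for b and f), which the rules implicitly assume; the function name and the nested-list layout of the distance value matrix are assumptions made for the example.

def pixel_distance(dist, a, b, e, f, x, y):
    """Distance value for the pixel at column x (0..a-1) and row y (0..b-1)
    of an a x b image, given an e x f distance value matrix `dist` stored
    as f rows of e values."""
    if a >= e:                      # every a/e pixels share one distance value
        cols = [x * e // a]
    else:                           # average e/a distance values per pixel
        k = e // a
        cols = range(x * k, (x + 1) * k)
    if b >= f:
        rows = [y * f // b]
    else:
        k = f // b
        rows = range(y * k, (y + 1) * k)
    vals = [dist[r][c] for r in rows for c in cols]
    return sum(vals) / len(vals)

# Example 2 above: an 800 x 1600 image with an 8 x 8 distance value matrix.
dist = [[float(r * 8 + c) for c in range(8)] for r in range(8)]
d = pixel_distance(dist, a=800, b=1600, e=8, f=8, x=50, y=250)
# x=50 lies in the first band of 100 columns and y=250 in the second band
# of 200 rows, so d equals dist[1][0].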
In step S220, the method of determining the pixel orientation information is similar to the method of determining the pixel distance information described above.
Currently, when the compass application of a terminal is used, generally only the azimuth directly in front of the terminal, i.e., the azimuth corresponding to the middle of the terminal, is displayed. In this method, when an image is shot, the electronic compass detects the azimuth information corresponding to the edges of the imaging area of the terminal, so the azimuth range in which the shooting target lies in the shot image (for example, the range 10°-30° off geographic true north) can be determined, and the pixel azimuth information of each pixel is then determined by the averaging method.
In this step, the orientation information may include an angle range from a set direction (for example, a geographical north direction), the angle range being denoted by (β, γ), the image information includes j columns of pixel groups, and the pixel orientation information of each pixel in the image information is determined based on the orientation information and the image information, including:
the pixel orientation information for each pixel in the image information is determined according to the following rule (averaging):
in each column of pixel group of the image information, the pixel azimuth information of each pixel is the same;
in the image information, the pixel orientation information of the i-th column of pixel group is:
β + (γ − β) × i / j
Wherein i is an integer greater than or equal to 1 and less than or equal to j.
It should be noted that each column of pixel group may include only one column of pixels, and may also include two or more columns of pixels.
Example 1:
The image information comprises 100 columns of pixel groups, and each column of pixel group comprises only 1 column of pixels, i.e., the image information comprises 100 columns of pixels. The azimuth information is the range (10°, 30°) deviating from geographic true north; it should be noted that the azimuth information includes the critical angles 10° and 30°.
In the 1st column of pixels, the pixel azimuth information of each pixel is:
10° + (30° − 10°) × 1/100 = 10.2°
In the 2nd column of pixels, the pixel azimuth information of each pixel is:
10° + (30° − 10°) × 2/100 = 10.4°
By analogy, the pixel orientation information of each pixel in the image information can be determined. In this example, each column of pixels is assigned its own pixel orientation information, so the imaging effect is better.
Example 2:
The image information comprises 50 columns of pixel groups, and each column of pixel group comprises 2 columns of pixels, i.e., the image information comprises 100 columns of pixels. The azimuth information is the range (10°, 30°) deviating from geographic true north; it should be noted that the azimuth information includes the critical angles 10° and 30°.
The 1st and 2nd columns of pixels are used as the 1st pixel group, and the pixel orientation information of each pixel in the group is:
10° + (30° − 10°) × 1/50 = 10.4°
the 3 rd column pixel and the 4 th column pixel are used as a 2 nd pixel group, and the pixel position information of each pixel in the pixel group is as follows:
Figure BDA0002997029260000102
By analogy, the pixel orientation information of each pixel in the image information can be determined. In this example, multiple columns of pixels are treated as one pixel group sharing one piece of orientation information, which reduces the required data processing capability and the energy consumption of the terminal.
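The following is a minimal Python sketch (for illustration only) of the equal-division rule above. Note that the original formula appears in the patent only as an image that did not survive extraction; the linear equal division β + (γ − β) × i / j used here is a reconstruction consistent with the surrounding description, and the function name and data layout are assumptions.

def column_group_orientation(beta, gamma, j, i):
    """Orientation (in degrees off the set direction) shared by every pixel
    in the i-th column group (1-based) out of j groups, reconstructed as a
    linear equal division of the detected angle range (beta, gamma)."""
    return beta + (gamma - beta) * i / j

# Example 1 above: (beta, gamma) = (10, 30) and j = 100 column groups.
# Group 1 -> 10.2 degrees, group 2 -> 10.4 degrees, ..., group 100 -> 30.0.
angles = [column_group_orientation(10.0, 30.0, 100, i) for i in range(1, 101)]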
In step S230, after the pixel distance information, the pixel orientation information, and the image information are determined, the pixels in the image information are processed according to the pixel distance information of each pixel and the pixel orientation information of each pixel, and then a three-dimensional image including the orientation information, the distance information, and the image information is obtained.
In the method, a piece of pixel distance information and a piece of pixel azimuth information are assigned to each pixel in the image information, so the resulting three-dimensional image is more vivid and stereoscopic and contains richer information. When the user later views the three-dimensional image, the form of the shooting target can be better understood and its geographic position can be known, which better meets different user needs and improves the experience.
In one exemplary embodiment, an image photographing method is provided and applied to a terminal. In this method, the image capturing device collects the image information, the orientation detection device detects the orientation information, and the distance detection device detects the distance detection information, and the method further comprises the following step:
controlling the rate at which the image capturing device acquires the image information, the rate at which the orientation detection device detects the orientation information, and the rate at which the distance detection device detects the distance detection information to be the same.
It is understood that when taking pictures, multiple frames are often captured. In this method, the image information comprises multiple frames of sub-picture information, the azimuth information comprises multiple frames of sub-azimuth information, and the distance detection information comprises multiple frames of sub-detection information. Controlling the acquisition rate of the image capturing device, the detection rate of the orientation detection device, and the detection rate of the distance detection device to be the same means that the frames of sub-picture information, sub-azimuth information and sub-detection information correspond one to one.
For example, when the picture information includes 20 frames of sub-picture information, the orientation information includes 20 frames of sub-orientation information, and the distance detection information includes 20 frames of sub-detection information. Because the three rates are the same, the first frame of sub-picture information corresponds to the first frame of sub-orientation information and the first frame of sub-distance information, the second frames correspond likewise, and so on. Each frame of sub-picture information is thus matched with one frame of sub-azimuth information and one frame of sub-distance information, which improves the synthesis of the image, azimuth and distance information, improves the shooting effect, and better ensures that the finally drawn three-dimensional image is vivid.
Wherein, the method also comprises:
and controlling the start time at which the image capturing device begins acquiring the image information, the start time at which the orientation detection device begins detecting the orientation information, and the start time at which the distance detection device begins detecting the distance detection information to be the same.
In this method, when an image is shot, the start times at which the image capturing device begins collecting image information, the azimuth detection device begins detecting azimuth information, and the distance detection device begins detecting distance detection information are controlled to be the same; that is, the three devices are controlled to collect the corresponding information of the shooting target simultaneously, so the three kinds of information match better and the finally obtained three-dimensional image has a better effect.
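For illustration, the sketch below shows the one-to-one frame pairing that equal rates and a shared start time make possible; the Frame structure and function names are assumptions, since the patent does not prescribe a data layout.

from dataclasses import dataclass
from typing import Any, List, Tuple

@dataclass
class Frame:
    index: int        # frame number within the capture
    timestamp: float  # seconds since the shared start time
    payload: Any      # sub-picture / sub-orientation / sub-distance data

def pair_frames(pictures: List[Frame], orientations: List[Frame],
                distances: List[Frame]) -> List[Tuple[Frame, Frame, Frame]]:
    """Match the n-th sub-picture with the n-th sub-orientation and the
    n-th sub-distance frame; with identical rates and start times, the
    indices (and timestamps) line up one to one."""
    assert len(pictures) == len(orientations) == len(distances)
    return list(zip(pictures, orientations, distances))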
In one exemplary embodiment, an image capturing method is provided and applied to a terminal, which may be a mobile phone. In the method, image information is collected through a camera in the mobile phone, azimuth information is detected through an electronic compass in the mobile phone, and distance detection information is detected through a flight time sensor in the mobile phone.
When an image is shot, the camera, the electronic compass and the time-of-flight sensor are controlled to start simultaneously and to collect information at the same rate, so that the finally obtained image information, azimuth information and distance detection information carry the same timestamps, which facilitates image synthesis and improves the final imaging effect.
For example, the three are controlled to start at the same moment and to collect information 1000 times per second. That is, the camera acquires sub-image information of the shooting target every 1/1000 second, the electronic compass detects sub-azimuth information every 1/1000 second, and the time-of-flight sensor acquires a set of sub-detection information for determining distance information every 1/1000 second; the processor then determines a distance matrix from each set of sub-detection information.
It is assumed that 100 pieces of sub-image information, 100 pieces of sub-orientation information, and 100 pieces of sub-distance detection information are finally obtained.
When image synthesis is performed, the processor processes the sub-distance detection information to obtain 100 pieces of sub-target distance information.
The n-th sub-image information, the n-th sub-azimuth information and the n-th sub-distance information correspond one to one (n is an integer greater than 0). In one approach, the sub-image information is first synthesized into overall image information, the sub-azimuth information into overall azimuth information, and the sub-distance information into overall distance information; data fusion is then performed on the overall image information, overall azimuth information and overall distance information to obtain the final overall three-dimensional image.
It should be noted that, alternatively, the corresponding sub-image information, sub-azimuth information and sub-distance information may first be fused to obtain 100 sub-three-dimensional images, which are then synthesized into the final complete three-dimensional image.
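This second synthesis path can be sketched as follows (Python, illustration only); the fuse internals are placeholders, since the patent leaves the actual image-fusion algorithm to existing techniques, and the function names are assumptions.

def fuse_frame(sub_image, sub_orientation, sub_distance):
    # Placeholder fusion: attach the per-frame orientation and distance
    # data to the image data; a real implementation would assign per-pixel
    # distance and orientation values as described in steps S210-S230.
    return {"image": sub_image,
            "orientation": sub_orientation,
            "distance": sub_distance}

def synthesize_3d(sub_images, sub_orientations, sub_distances):
    """Fuse each corresponding frame triple into a sub-3D image; a merge
    step (not specified by the patent) would then combine the sub-3D
    images into the final complete three-dimensional image."""
    assert len(sub_images) == len(sub_orientations) == len(sub_distances)
    return [fuse_frame(img, ori, dst)
            for img, ori, dst in zip(sub_images, sub_orientations, sub_distances)]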
The three-dimensional image obtained by the method contains more abundant information, when a user subsequently observes the three-dimensional image, the form of the shot target can be better known, the geographical position of the shot target can be known, different requirements of the user can be better met, and the use experience is improved.
In one exemplary embodiment, an image photographing apparatus is provided and applied to a terminal. The apparatus is used to implement the image shooting method described above. Referring to fig. 2, the apparatus includes an obtaining module 101 and a determining module 102. In implementing the above method:
an obtaining module 101, configured to obtain image information, orientation information, and distance detection information of a shooting target;
and the determining module 102 is configured to determine a three-dimensional image of the shooting target according to the image information, the azimuth information, and the distance detection information.
In one exemplary embodiment, an image photographing apparatus is provided that is applied to a terminal. In the apparatus, the determining module 102 is further configured to:
determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and determining the three-dimensional image according to the pixel distance information, the pixel direction information and the image information.
In one exemplary embodiment, an image photographing apparatus is provided that is applied to a terminal. The distance detection information includes a distance value matrix denoted as e f, and the pixel resolution of the image information denoted as a b, as shown in fig. 2, in the apparatus, the determining module 102 is further configured to:
determining a distance value of each pixel in the image information according to the following rule, and using the determined distance value as pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to a distance value in the distance value matrix;
if a is larger than e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b is larger than f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a is less than e, in each row of pixels of the image information, the average distance value of every e/a distance values in the distance value matrix corresponds to one pixel;
if b is less than f, in each column of pixels of the image information, the average distance value of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
In one exemplary embodiment, an image photographing apparatus is provided that is applied to a terminal. The orientation information includes an angle range deviating from the set direction, the angle range is denoted as (β, γ), the image information includes j columns of pixel groups, and referring to fig. 2, in the apparatus, the determining module 102 is further configured to:
determining pixel orientation information for each pixel in the image information according to the following rules:
in each column of pixel group of the image information, the pixel azimuth information of each pixel is the same;
in the image information, the pixel orientation information of the pixel group of the ith column is:
β + (γ − β) × i / j
wherein i is an integer greater than or equal to 1 and less than or equal to j.
In one exemplary embodiment, an image photographing apparatus is provided that is applied to a terminal. The device also includes:
and the control module 103 is used for controlling the rate at which the image shooting device acquires the image information, the rate at which the direction detection device detects the direction information, and the rate at which the distance detection device detects the distance detection information to be the same.
In one exemplary embodiment, an image photographing apparatus is provided that is applied to a terminal. The device also includes:
the control module 103 is configured to control the start time at which the image capturing device begins acquiring the image information, the start time at which the direction detection device begins detecting the direction information, and the start time at which the distance detection device begins detecting the distance detection information to be the same.
The application also provides a terminal, which may be a mobile phone, a tablet computer, a notebook computer, a video camera, a camera, or other equipment with an image shooting function. The terminal comprises an image shooting device, an orientation determining device and a distance detection device, wherein the image shooting device is configured to collect image information of a shooting target; the orientation determining device is configured to detect orientation information of the shooting target; and the distance detection device is configured to detect distance detection information of the shooting target relative to the terminal. The terminal also includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to perform the image shooting method described above.
When the terminal shoots an image, it first acquires the image information, azimuth information and distance detection information of the shooting target, and then fuses these data to obtain a three-dimensional image of the shooting target. Through this three-dimensional image, not only can the specific form of the shooting target be better understood, but the geographic azimuth in which the shooting target was located when shot can also be obtained, which enriches the content of the image, improves the user experience, and better meets users' shooting needs.
In an exemplary embodiment, illustrated with reference to fig. 3, the terminal 400 may include one or more of the following components: a processing component 402, a memory 404, a power component 406, a multimedia component 408, an audio component 410, an interface for input/output (I/O) 412, a sensor component 414, and a communication component 416.
The processing component 402 generally controls overall operation of the terminal 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 402 may include one or more processors 420 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 402 can include one or more modules that facilitate interaction between the processing component 402 and other components. For example, the processing component 402 can include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operations at the terminal 400. Examples of such data include instructions for any application or method operating on the terminal 400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 404 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power components 406 provide power to the various components of the terminal 400. The power components 406 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal 400.
The multimedia component 408 includes a screen providing an output interface between the terminal 400 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with it. In some embodiments, the multimedia component 408 includes a front camera module and/or a rear camera module, which can receive external multimedia data when the terminal 400 is in an operating mode, such as a shooting mode or a video mode. Each front or rear camera module may use a fixed optical lens system or have focusing and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, the audio component 410 includes a Microphone (MIC) configured to receive external audio signals when the terminal 400 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 404 or transmitted via the communication component 416. In some embodiments, audio component 410 also includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 414 includes one or more sensors for providing various aspects of status assessment for the terminal 400. For example, the sensor assembly 414 can detect the open/closed state of the terminal 400 and the relative positioning of components, such as the display and keypad of the terminal 400; it can also detect a change in position of the terminal 400 or one of its components, the presence or absence of user contact with the terminal 400, the orientation or acceleration/deceleration of the terminal 400, and a change in the temperature of the terminal 400. The sensor assembly 414 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the terminal 400 and other devices. The terminal 400 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 416 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 404 comprising instructions, executable by the processor 420 of the terminal 400 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. The instructions in the storage medium, when executed by a processor of the terminal, enable the terminal to perform the methods shown in the above-described embodiments.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (14)

1. An image shooting method is applied to a terminal, and is characterized by comprising the following steps:
acquiring image information, azimuth information and distance detection information of a shooting target;
and determining a three-dimensional image of the shooting target according to the image information, the azimuth information and the distance detection information.
2. The method of claim 1, wherein determining the three-dimensional image of the shooting target according to the image information, the orientation information and the distance detection information comprises:
determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and determining the three-dimensional image according to the pixel distance information, the pixel orientation information and the image information.
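(Illustrative note, not part of the claims: the three determining steps can be read as fusing each pixel's distance and orientation into a 3-D point, with the image information supplying the per-pixel appearance. The sketch below is a minimal Python reading under assumed conventions: an a×b per-pixel distance map, one azimuth angle per pixel column as in claim 4, and a spherical-to-Cartesian mapping that the claims do not specify; all names are illustrative.)

    import numpy as np

    def build_point_cloud(pixel_distance, column_azimuth):
        # Minimal sketch, not the claimed construction. pixel_distance is
        # an a x b array of distances (metres); column_azimuth holds b
        # per-column angles (radians) measured from the set direction.
        az = np.broadcast_to(column_azimuth, pixel_distance.shape)
        x = pixel_distance * np.sin(az)      # lateral offset from the set direction
        z = pixel_distance * np.cos(az)      # depth along the set direction
        y = np.zeros_like(pixel_distance)    # elevation: not specified by the claims
        return np.stack([x, y, z], axis=-1)  # (a, b, 3); pair with RGB per pixel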
3. The method of claim 2, wherein the distance detection information comprises a distance value matrix denoted as e×f, the pixel resolution of the image information is denoted as a×b, and determining the pixel distance information of each pixel in the image information according to the distance detection information and the image information comprises:
determining a distance value of each pixel in the image information according to the following rules, and using the determined distance value as pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to one distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to one distance value in the distance value matrix;
if a > e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b > f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a < e, in each row of pixels of the image information, the average of every e/a distance values in the distance value matrix corresponds to one pixel;
if b < f, in each column of pixels of the image information, the average of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
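(Illustrative note, not part of the claims: the six rules above amount to replicating distance values where the pixel grid is denser than the distance value matrix and averaging them where it is coarser; claim 9 repeats the same rules for the apparatus. A minimal Python sketch follows, assuming a/e, b/f, e/a and f/b are exact integers, as the claim's wording implies; the function name is illustrative.)

    import numpy as np

    def pixel_distance_map(dist, a, b):
        # dist: e x f distance value matrix; target pixel resolution a x b.
        e, f = dist.shape
        # Row direction: copy when a == e, replicate when a > e, average when a < e.
        if a > e:
            rows = np.repeat(dist, a // e, axis=0)
        elif a < e:
            rows = dist.reshape(a, e // a, f).mean(axis=1)
        else:
            rows = dist
        # Column direction: the same rule along the other axis.
        if b > f:
            out = np.repeat(rows, b // f, axis=1)
        elif b < f:
            out = rows.reshape(a, b, f // b).mean(axis=2)
        else:
            out = rows
        return out  # a x b array of per-pixel distance values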
4. The method of claim 3, wherein the orientation information comprises an angular range of deviation from a set direction, the angular range being denoted as (β, γ), the image information comprises j columns of pixel groups, and determining the pixel orientation information of each pixel in the image information according to the orientation information and the image information comprises:
determining pixel orientation information for each pixel in the image information according to the following rules:
in each column pixel group of the image information, the pixel orientation information of each pixel is the same;
in the image information, the pixel orientation information of the i-th column pixel group is:
[formula published as image FDA0002997029250000021 in the source; not recoverable as text]
wherein i is an integer greater than or equal to 1 and less than or equal to j.
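(Illustrative note, not part of the claims: the per-column formula of claim 4 is published only as an image and cannot be quoted here; claim 10 mirrors it for the apparatus. A common construction for such claims is a linear division of the angular range (β, γ) across the j column pixel groups, and the sketch below assumes θ_i = β + (γ − β)·i/j; this is an assumption, not the claimed formula.)

    def column_orientations(beta, gamma, j):
        # Assumed linear split of the angular range (beta, gamma) over j
        # column pixel groups; the claimed formula is published only as
        # an image and may differ from this interpolation.
        return [beta + (gamma - beta) * i / j for i in range(1, j + 1)]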
5. The method according to any one of claims 1-4, further comprising:
and controlling a rate at which the image shooting device acquires the image information, a rate at which the orientation detection device detects the orientation information, and a rate at which the distance detection device detects the distance detection information to be the same.
6. The method according to any one of claims 1-4, further comprising:
and controlling a starting time at which the image shooting device acquires the image information, a starting time at which the orientation detection device detects the orientation information, and a starting time at which the distance detection device detects the distance detection information to be the same.
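(Illustrative note, not part of the claims: claims 5 and 6, and the mirrored claims 11 and 12 below, require the camera, the orientation detector and the distance detector to sample at the same rate and to start at the same time, so that each image frame lines up with one orientation reading and one distance value matrix. The sketch below shows one software realisation using a thread barrier; the device objects and their start() method are hypothetical, not a real camera or sensor API.)

    import threading

    def start_synchronized(devices, rate_hz):
        # devices: hypothetical objects exposing start(rate_hz).
        # The barrier gives all detectors a common starting time; passing
        # the same rate_hz gives them a common acquisition rate.
        barrier = threading.Barrier(len(devices))

        def run(dev):
            barrier.wait()
            dev.start(rate_hz)

        threads = [threading.Thread(target=run, args=(d,)) for d in devices]
        for t in threads:
            t.start()
        for t in threads:
            t.join()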
7. An image shooting apparatus applied to a terminal, characterized in that the apparatus comprises:
an acquisition module, configured to acquire image information, orientation information and distance detection information of a shooting target;
and a determining module, configured to determine a three-dimensional image of the shooting target according to the image information, the orientation information and the distance detection information.
8. The apparatus of claim 7, wherein the determining module is further configured to:
determining pixel distance information of each pixel in the image information according to the distance detection information and the image information;
determining pixel orientation information of each pixel in the image information according to the orientation information and the image information;
and determining the three-dimensional image according to the pixel distance information, the pixel orientation information and the image information.
9. The apparatus of claim 8, wherein the distance detection information comprises a distance value matrix denoted as e×f, the pixel resolution of the image information is denoted as a×b, and the determining module is further configured to:
determining a distance value of each pixel in the image information according to the following rules, and using the determined distance value as pixel distance information of the corresponding pixel:
if a = e, in each row of pixels of the image information, each pixel corresponds to one distance value in the distance value matrix;
if b = f, in each column of pixels of the image information, each pixel corresponds to one distance value in the distance value matrix;
if a > e, in each row of pixels of the image information, every a/e pixels correspond to one distance value in the distance value matrix;
if b > f, in each column of pixels of the image information, every b/f pixels correspond to one distance value in the distance value matrix;
if a < e, in each row of pixels of the image information, the average of every e/a distance values in the distance value matrix corresponds to one pixel;
if b < f, in each column of pixels of the image information, the average of every f/b distance values in the distance value matrix corresponds to one pixel;
wherein a, b, e and f are integers greater than or equal to 1.
10. The apparatus of claim 9, wherein the orientation information comprises an angular range of deviation from a set direction, the angular range being denoted as (β, γ), wherein the image information comprises j columns of pixel groups, and wherein the determining module is further configured to:
determining pixel orientation information for each pixel in the image information according to the following rules:
in each column pixel group of the image information, the pixel orientation information of each pixel is the same;
in the image information, the pixel orientation information of the i-th column pixel group is:
[formula published as image FDA0002997029250000031 in the source; not recoverable as text]
wherein i is an integer greater than or equal to 1 and less than or equal to j.
11. The apparatus according to any one of claims 7-10, further comprising:
and a control module, configured to control a rate at which the image shooting device acquires the image information, a rate at which the orientation detection device detects the orientation information, and a rate at which the distance detection device detects the distance detection information to be the same.
12. The apparatus of any one of claims 7-10, further comprising:
and a control module, configured to control a starting time at which the image shooting device acquires the image information, a starting time at which the orientation detection device detects the orientation information, and a starting time at which the distance detection device detects the distance detection information to be the same.
13. A terminal, characterized in that the terminal comprises an image shooting device, an orientation detection device and a distance detection device,
wherein the image shooting device is configured to acquire image information of a shooting target; the orientation detection device is configured to detect orientation information of the shooting target; and the distance detection device is configured to detect distance detection information of the shooting target relative to the terminal;
the terminal further comprises:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 6.
14. A non-transitory computer readable storage medium, wherein instructions in the storage medium, when executed by a processor of a terminal, enable the terminal to perform the method of any of claims 1 to 6.
CN202110344438.1A 2021-03-29 2021-03-29 Image shooting method, device, terminal and storage medium Pending CN115144870A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110344438.1A CN115144870A (en) 2021-03-29 2021-03-29 Image shooting method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110344438.1A CN115144870A (en) 2021-03-29 2021-03-29 Image shooting method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN115144870A true CN115144870A (en) 2022-10-04

Family ID: 83403239

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110344438.1A Pending CN115144870A (en) 2021-03-29 2021-03-29 Image shooting method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115144870A (en)

Similar Documents

Publication Publication Date Title
US9674395B2 (en) Methods and apparatuses for generating photograph
CN110557547B (en) Lens position adjusting method and device
CN108462833B (en) Photographing method, photographing device and computer-readable storage medium
CN110858873B (en) Electronic device and photographing method
CN114009003A (en) Image acquisition method, device, equipment and storage medium
CN106954020B (en) A kind of image processing method and terminal
CN113364965A (en) Shooting method and device based on multiple cameras and electronic equipment
CN110876014B (en) Image processing method and device, electronic device and storage medium
CN115134505B (en) Preview picture generation method and device, electronic equipment and storage medium
EP4161054A1 (en) Anchor point information processing method, apparatus and device and storage medium
CN111614910B (en) File generation method and device, electronic equipment and storage medium
US11265529B2 (en) Method and apparatus for controlling image display
CN112188096A (en) Photographing method and device, terminal and storage medium
CN110769146B (en) Shooting method and electronic equipment
CN114422687B (en) Preview image switching method and device, electronic equipment and storage medium
CN112866555B (en) Shooting method, shooting device, shooting equipment and storage medium
CN115144870A (en) Image shooting method, device, terminal and storage medium
CN108769513B (en) Camera photographing method and device
CN111356001A (en) Video display area acquisition method and video picture display method and device
CN114390189A (en) Image processing method, device, storage medium and mobile terminal
CN109447929B (en) Image synthesis method and device
CN114339017B (en) Distant view focusing method, device and storage medium
CN117522942A (en) Depth distance measuring method, depth distance measuring device, electronic equipment and readable storage medium
CN115731296A (en) Image processing method, device, terminal and storage medium
CN110876013B (en) Method and device for determining image resolution, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination