CN110087057B - Depth image acquisition method and device for projector - Google Patents


Info

Publication number
CN110087057B
CN110087057B (application CN201910181857.0A)
Authority
CN
China
Prior art keywords
depth
pixel unit
depth signal
receiving
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910181857.0A
Other languages
Chinese (zh)
Other versions
CN110087057A (en)
Inventor
宋林东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Optical Technology Co Ltd
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc
Priority to CN201910181857.0A
Publication of CN110087057A
Application granted
Publication of CN110087057B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/22: Measuring arrangements characterised by the use of optical techniques for measuring depth
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/271: Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N 17/004: Diagnosis, testing or measuring for television systems or their details for digital television systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3191: Testing thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An embodiment of the invention provides a depth image acquisition method and device for a projector, the method comprising: receiving an m-th depth signal fed back for an m-th pixel unit; receiving an n-th depth signal fed back for an n-th pixel unit, wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k; and generating a first low-resolution depth image based on the m-th depth signal and the n-th depth signals received within a first frame of the display image, wherein m, n and k are each greater than or equal to 1. With this scheme, the time available for receiving a fed-back depth signal is extended from the display time of a single pixel unit to the display time of k+1 pixel units, so that larger depth values of the display image can be measured and the required depth image obtained.

Description

Depth image acquisition method and device for projector
Technical Field
The invention relates to the technical field of image processing, in particular to a depth image acquisition method and device of a projector.
Background
With the development of projected-touch technology, the resolution and the frame rate of the display images generated by projectors keep increasing.
In the prior art, an infrared laser is added to a Laser Beam Scanner (LBS). The infrared laser records the light travel time of each pixel unit point by point, from which the depth information of each pixel unit is calculated to obtain the corresponding depth image. In practice the propagation speed of light is constant, so the higher the resolution and frame rate of the displayed image, the shorter the display time of each pixel unit. When the depth measurement is confined to the display time of a single pixel unit, the available measurement time is too short to acquire a depth image with a large depth value.
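This time-of-flight limit can be quantified with a short sketch (the resolution and frame-rate figures below are illustrative assumptions, not values from the patent):

```python
# Speed of light is fixed, so the maximum measurable depth is bounded by
# how long we can wait for the echo of one pixel unit.
C = 299_792_458.0  # speed of light, m/s

def max_depth(measure_time_s):
    """Largest depth measurable if the reflected light must return
    within `measure_time_s` (round trip, hence the factor of 2)."""
    return C * measure_time_s / 2.0

# Illustrative numbers: a 1920x1080 image scanned at 60 fps gives each
# pixel unit roughly 8 ns of display time.
t = 1.0 / (60 * 1920 * 1080)
print(f"per-pixel display time: {t * 1e9:.2f} ns")    # ~8.04 ns
print(f"max measurable depth:   {max_depth(t):.2f} m")  # ~1.20 m
```

Extending the wait from one pixel unit's display time to that of several pixel units multiplies this bound by the same factor, which is the motivation for the scheme below.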
A scheme is therefore needed for obtaining a depth image, corresponding to the projector's displayed image, that covers larger depth values.
Disclosure of Invention
In view of this, embodiments of the present invention provide a depth image acquisition method and device for a projector, which can obtain a depth image corresponding to the projector's displayed image that covers larger depth values.
In a first aspect, an embodiment of the present invention provides a depth image acquisition method for a projector, the method comprising:
receiving an x-th depth signal fed back for an m-th pixel unit;
receiving a y-th depth signal fed back for an n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, and the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z;
generating a first low-resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received within a first frame of the display image; wherein m, n, x, y, k and z are each greater than or equal to 1, and k is greater than z.
In a second aspect, an embodiment of the present invention provides a depth image acquisition device for a projector, comprising:
a receiving module, configured to receive an x-th depth signal fed back for an m-th pixel unit and a y-th depth signal fed back for an n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, and the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z;
a depth image generation module, configured to generate a first low-resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received within a first frame of the display image; wherein m, n, x, y, k and z are each greater than or equal to 1, and k is greater than z.
According to the depth image acquisition method and device provided by the embodiments of the invention, depth signals are obtained for a designated subset of the pixel units of one frame of the display image (the m-th and n-th pixel units), and these depth signals together form one low-resolution depth image. Note that the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k (k ≥ 1), which guarantees that the time interval between two adjacent depth signals equals the sum of the display times of k+1 pixel units. In this way, the time available for receiving a fed-back depth signal is extended from the display time of a single pixel unit to the display time of k+1 pixel units, so that larger depth values of the display image can be measured and the required depth image obtained.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a depth image obtaining method for a projector according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a first low-resolution depth image generation process according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a second low-resolution depth image generation process according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a full-depth image generation process according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a depth image obtaining apparatus of a projector according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, embodiments of the present invention. All other embodiments derived by a person skilled in the art from the given embodiments without creative effort fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used in the embodiments of the present invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise; "a plurality of" generally means at least two, but does not exclude the case of at least one.
It should be understood that the term "and/or" as used herein merely describes an association between related objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
The word "if" as used herein may be interpreted as "when", "upon", "in response to determining" or "in response to detecting", depending on the context. Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising" and any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such an article or system. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the article or system that comprises it.
An infrared laser is added to the projector. Its infrared light can be emitted simultaneously with the projector's projection light, and the depth value of each pixel unit can be measured with this infrared light so that a whole depth image is obtained. Specifically, the depth value of each pixel unit is calculated from the flight time of the infrared light. Note that the emitting order and the receiving order of the infrared light across pixel units must be consistent; in other words, infrared light emitted later must not return before infrared light emitted earlier, which would corrupt the order of the detection signals. When the projection distance is relatively large, the round-trip time of the infrared light increases accordingly, and the measurement period of the per-pixel depth measurement must be extended, for example to one measurement every 4 pixel units, i.e. every four pixel units form one group. For ease of understanding, the following examples assume m = 1, n = 5, k = 4, and a display period t for a single pixel unit. The infrared laser referred to here includes an infrared emitting unit and an infrared receiving unit.
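A minimal sketch of the two pieces just described, assuming nothing beyond the text: the depth value from the infrared flight time, and the pulse schedule when one measurement is taken every k pixel units (the function names and simulation values are illustrative, not from the patent):

```python
C = 299_792_458.0  # speed of light, m/s

def depth_from_flight_time(round_trip_s):
    """Depth value of one pixel unit from the infrared round-trip time."""
    return C * round_trip_s / 2.0

def trigger_times(num_pixels, t, k, offset=0):
    """Pulse times when the infrared laser fires once every k pixel units,
    starting at 0-based pixel index `offset`.  With the example values
    k = 4 and offset = 0, pulses coincide with pixel units 1, 5, 9, ...
    (1-based), i.e. m = 1 and n = 5 as in the text."""
    return [i * t for i in range(offset, num_pixels, k)]
```

For example, `trigger_times(12, 8e-9, 4)` fires at the display times of pixel units 1, 5 and 9 of a 12-pixel row, leaving a window of k pixel display periods for each echo.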
In addition, the sequence of steps in each method embodiment described below is only an example and is not strictly limited.
An embodiment of the present invention provides a method for obtaining a depth image of a projector, and fig. 1 is a schematic flow chart of the method for obtaining a depth image of a projector according to the embodiment of the present invention. The method comprises the following specific steps:
101: receiving an x-th depth signal fed back for the m-th pixel unit.
As described above, assuming that m and x are both 1, the infrared emitting unit in the infrared laser emits infrared light for the first pixel unit at the same time when the projector projects the first pixel unit. When the infrared light contacts an object capable of presenting the content of the first pixel unit, the infrared light returns and is received by the infrared receiving unit, and a first depth signal aiming at the first pixel unit is obtained. The first depth signal represents a depth value of the first pixel unit.
102: receiving a y-th depth signal fed back for the n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, and the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z.
As described above, assuming that n is 5, k is 4, and y is 2, the infrared emitting unit in the infrared laser emits infrared light for the second pixel unit while the projector projects the fifth pixel unit. When the infrared light contacts an object capable of presenting the content of the fifth pixel unit, the infrared light returns and is received by the infrared receiving unit, and a second depth signal aiming at the fifth pixel unit is obtained. The second depth signal represents a depth value of the fifth pixel unit.
103: generating a first low resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received in a first frame of display image; wherein m, n, x, y, k and z are all larger than or equal to 1, and k is larger than z.
When generating the depth image, the first low-resolution depth image is assembled from the received depth signals in order, using their corresponding depth values. In the foregoing example a depth value is acquired once every 4 pixel units, giving the low-resolution depth image shown in Fig. 2, in which each pixel corresponds to the first, fifth, ninth, and so on, pixel unit of the first frame of the display image. A low-resolution depth image can be understood as a depth image whose resolution is lower than that of the display image; in this embodiment, the resolution of the first low-resolution depth image is one quarter of the resolution of the first frame of the display image.
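The sampling pattern of Fig. 2 can be sketched as strided indexing into a hypothetical full-resolution depth field (a NumPy illustration; the array contents are made up):

```python
import numpy as np

def sample_depth_signals(full_depth, k, offset=0):
    """Keep one depth signal every k pixel units along each row, starting
    at column `offset`: an H x W field becomes an H x (W/k) low-resolution
    depth image, matching the 'one sample per k pixel units' scheme."""
    return full_depth[:, offset::k]

H, W, k = 4, 8, 4
full_depth = np.arange(H * W, dtype=float).reshape(H, W)  # stand-in depth field
low_res = sample_depth_signals(full_depth, k)
print(low_res.shape)  # (4, 2): width reduced to W/k
```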
The depth signals x and y are received at different time points for different pixel units in the same frame of display image. If x and y are two adjacent depth signals, then z is 1; the number k of spaces between the corresponding pixel units m and n is greater than 1.
In the same frame of display image, the mth pixel unit appears before the nth pixel unit. The x-th depth signal is returned before the y-th depth signal.
In one or more embodiments of the present invention, receiving the x-th depth signal fed back for the m-th pixel unit may specifically include: triggering the x-th depth signal for the m-th pixel unit at a first pulse time; and receiving the fed-back x-th depth signal within a specified time period.
For example, at the first pulse time, the projection of the first pixel unit is triggered and the infrared laser is triggered to emit infrared light, so that a fed-back first depth signal representing the depth value of the first pixel unit can be received. Note that each infrared pulse is synchronized with the projection trigger time of its pixel unit.
The specified time period here can be understood as the trigger period of the infrared light. In this embodiment, the infrared light is triggered once every k pixel units; assuming the display period of each pixel unit is t, the trigger period of the infrared light is T = t × k.
In one or more embodiments of the present invention, receiving the y-th depth signal fed back for the n-th pixel unit may specifically include: triggering the y-th depth signal for the n-th pixel unit at a second pulse time; and receiving the fed-back y-th depth signal within a specified time period; the time interval between the first pulse time and the second pulse time is the display time of k pixel units.
The second pulse timing lags behind the first pulse timing. If the first pulse time and the second pulse time are adjacent time, the corresponding depth signals x and y are also two adjacent infrared light signals. If there are a plurality of pulses between the first pulse time and the second pulse time, there are a corresponding number of depth signals between the depth signal x and the depth signal y.
In one or more embodiments of the invention, the resolution of the first frame display image is k times the resolution of the first low resolution depth image.
As can be seen from fig. 2, assuming that the period T of the depth signal is k times the display period T of the pixel unit, in other words, the depth values of 1/k pixel units in the display image (assuming that the resolution is W × H) are acquired, the resolution of the generated depth image is (W/k) × H.
With this scheme, the acquisition period of each pixel unit's depth value in the depth image is extended to k times the display period of a pixel unit, so depth images at greater distances can be acquired, at the cost of a lower depth-image resolution. If a higher-resolution depth image is desired, several low-resolution depth images can be merged, as follows:
to obtain a higher resolution depth image, after generating the first low resolution depth image, the method may further include: receiving an x depth signal fed back for the m + a pixel unit; receiving a y depth signal fed back for the n + a pixel unit; wherein the number of the pixel units spaced between the m + a pixel unit and the n + a pixel unit is k, and an a +1 th low resolution depth image is generated based on the x-th depth signal and a plurality of the y-th depth signals received in an a +1 th frame display image; wherein k is more than or equal to a and more than or equal to 1.
As before, assume m = 1, a = 1, n = 5, x = 1, y = 2 and k = 4. After the first low-resolution depth image is obtained from the first frame of the display image, a second low-resolution depth image is obtained, and then, by the same steps, a third and a fourth low-resolution depth image are acquired in turn (the number of low-resolution depth images acquired is determined by k). Specifically, a first depth signal fed back for the second pixel unit is received, followed by a second depth signal fed back for the sixth pixel unit. The second low-resolution depth image is generated as shown in Fig. 3, its pixels corresponding to the second, sixth, tenth, and so on, pixel units of the second display frame.
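Continuing the sketch from above (again with an invented stand-in depth field), frame a+1 takes the same stride at column offset a:

```python
import numpy as np

def sample_depth_signals(full_depth, k, offset=0):
    """One depth signal every k pixel units per row, starting at `offset`."""
    return full_depth[:, offset::k]

H, W, k = 4, 8, 4
full_depth = np.arange(H * W, dtype=float).reshape(H, W)

first = sample_depth_signals(full_depth, k, offset=0)   # frame 1: pixel units 1, 5, ...
second = sample_depth_signals(full_depth, k, offset=1)  # frame 2: pixel units 2, 6, ...
print(second[0])  # row 0 holds the depths of pixel units 2 and 6
```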
In one or more embodiments of the present invention, receiving the x-th depth signal fed back for the (m+a)-th pixel unit specifically includes: triggering the x-th depth signal for the (m+a)-th pixel unit at a third pulse time; and receiving the fed-back x-th depth signal within a specified time period.
The third pulse time is not consecutive or adjacent to the second pulse time described above; the two belong to different frames.
In one or more embodiments of the present invention, receiving the y-th depth signal fed back for the (n+a)-th pixel unit specifically includes: triggering the y-th depth signal for the (n+a)-th pixel unit at a fourth pulse time; and receiving the fed-back y-th depth signal within a specified time period; the time interval between the third pulse time and the fourth pulse time is the display time of k pixel units.
The fourth pulse time lags behind the third pulse time. If the third pulse time and the fourth pulse time are adjacent, the corresponding depth signals x and y are also two adjacent infrared light signals. If there are a plurality of pulses between the third pulse time and the fourth pulse time, there are a corresponding number of depth signals between depth signal x and depth signal y.
In one or more embodiments of the present invention, the resolution of the a +1 th frame display image is k times the resolution of the a +1 th low resolution depth image.
As can be seen from fig. 3, assuming that the period T of the depth signal is k times the display period T of the pixel unit, in other words, the depth values of 1/k pixel units in the display image (assuming that the resolution is W × H) are acquired, the resolution of the generated depth image is (W/k) × H.
In one or more embodiments of the invention, a full depth image is generated based on the first low resolution depth image and k-1 of the a +1 th low resolution depth images; wherein the resolution of the first frame display image and/or the a +1 th frame display image is the same as the resolution of the full-depth image.
As described above, assuming k = 4, a full depth image can be generated by combining the first, second, third and fourth low-resolution depth images. As shown in Fig. 4, one full depth image is generated from the four low-resolution depth images. The full depth image has the same resolution as the display image, W × H. Since the full depth image is composed of 4 (determined by k) frames of images, the frame rate of the generated depth images drops to 1/k (here one quarter) of the display frame rate.
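The merge of Fig. 4 amounts to a column interleave of the k low-resolution images; a sketch under the same assumptions as the earlier snippets:

```python
import numpy as np

def merge_low_res(images, k):
    """Interleave k low-resolution depth images (each H x W/k, where image
    a was sampled at column offset a) into one full H x W depth image."""
    H, Wk = images[0].shape
    full = np.empty((H, Wk * k), dtype=images[0].dtype)
    for a, img in enumerate(images):
        full[:, a::k] = img  # image a fills columns a, a+k, a+2k, ...
    return full

# Round trip: split a stand-in depth field into k strided images, merge back.
H, W, k = 4, 8, 4
field = np.arange(H * W, dtype=float).reshape(H, W)
parts = [field[:, a::k] for a in range(k)]
assert np.array_equal(merge_low_res(parts, k), field)
```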
Based on the same idea, an embodiment of the present invention further provides a depth image obtaining apparatus for a projector, as shown in fig. 5, the apparatus includes:
a receiving module 51, configured to receive an x-th depth signal fed back for the m-th pixel unit and a y-th depth signal fed back for the n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, and the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z;
a depth image generating module 52, configured to generate a first low-resolution depth image based on the x-th depth signal and the y-th depth signals received in the first frame of display image; wherein m, n, x, y and k are all larger than or equal to 1, and k is larger than z.
Further, the receiving module 51 triggers the x-th depth signal for the m-th pixel unit at the first pulse time;
and receiving the fed back x-th depth signal within a specified time period.
Further, the receiving module 51 triggers the y-th depth signal for the n-th pixel unit at the second pulse time;
receiving the feedback y depth signal in a specified time period;
the time interval between the first pulse time and the second pulse time is k pixel unit display time.
Further, the resolution of the first frame display image is k times the resolution of the first low resolution depth image.
Further, the receiving module 51 is configured to receive an x-th depth signal fed back for the m + a-th pixel unit;
receiving a y depth signal fed back for the n + a pixel unit; wherein the number of the pixel units spaced between the m + a pixel unit and the n + a pixel unit is k,
generating an a +1 th low resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received in the a +1 th frame display image; wherein k is more than or equal to a and more than or equal to 1.
Further, the receiving module 51 triggers the x-th depth signal for the m + a-th pixel unit at the third pulse time;
and receiving the fed back x-th depth signal within a specified time period.
Further, the receiving module 51 triggers the y-th depth signal for the n + a-th pixel unit at the fourth pulse time;
receiving the feedback y depth signal in a specified time period;
the time interval between the third pulse time and the fourth pulse time is k pixel unit display time.
Further, the resolution of the a +1 th frame display image is k times the resolution of the a +1 th low resolution depth image.
Further, the depth image generating module 52 generates a full depth image based on the first low resolution depth image and k-1 of the a +1 th low resolution depth images; wherein the resolution of the first frame display image and/or the a +1 th frame display image is the same as the resolution of the full-depth image.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, or by a combination of hardware and software. Based on this understanding, the essence of the above technical solutions, or the part of them contributing to the prior art, may be embodied in the form of a computer program product stored on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It should be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing apparatus to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operational steps are performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A depth image acquisition method for a projector, comprising:
receiving an x-th depth signal fed back for an m-th pixel unit;
receiving a y-th depth signal fed back for an n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z, each depth signal is obtained by measuring the light time-of-flight for the corresponding pixel unit with an infrared laser installed on the projector, and each depth signal represents the depth value of the corresponding pixel unit; and
generating a first low-resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received within a first frame of the display image; wherein m, n, x, y, k and z are all greater than or equal to 1, and k is greater than z.
2. The method of claim 1, wherein receiving the x-th depth signal fed back for the m-th pixel unit comprises:
triggering the x-th depth signal for the m-th pixel unit at a first pulse time; and
receiving the fed-back x-th depth signal within a specified time period.
3. The method of claim 2, wherein receiving the y-th depth signal fed back for the n-th pixel unit comprises:
triggering the y-th depth signal for the n-th pixel unit at a second pulse time; and
receiving the fed-back y-th depth signal within a specified time period;
wherein the time interval between the first pulse time and the second pulse time is the display time of k pixel units.
4. The method of claim 1, wherein the resolution of the first frame of the display image is k times the resolution of the first low-resolution depth image.
5. The method of claim 1, further comprising, after generating the first low-resolution depth image:
receiving an x-th depth signal fed back for an (m+a)-th pixel unit;
receiving a y-th depth signal fed back for an (n+a)-th pixel unit; wherein the number of pixel units spaced between the (m+a)-th pixel unit and the (n+a)-th pixel unit is k; and
generating an (a+1)-th low-resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received within an (a+1)-th frame of the display image; wherein k is greater than or equal to a, and a is greater than or equal to 1.
6. The method of claim 5, wherein receiving the x-th depth signal fed back for the (m+a)-th pixel unit comprises:
triggering the x-th depth signal for the (m+a)-th pixel unit at a third pulse time; and
receiving the fed-back x-th depth signal within a specified time period.
7. The method of claim 6, wherein receiving the y-th depth signal fed back for the (n+a)-th pixel unit comprises:
triggering the y-th depth signal for the (n+a)-th pixel unit at a fourth pulse time; and
receiving the fed-back y-th depth signal within a specified time period;
wherein the time interval between the third pulse time and the fourth pulse time is the display time of k pixel units.
8. The method of claim 5, wherein the resolution of the (a+1)-th frame of the display image is k times the resolution of the (a+1)-th low-resolution depth image.
9. The method of claim 6, further comprising: generating a full-depth image based on the first low-resolution depth image and k-1 of the (a+1)-th low-resolution depth images; wherein the resolution of the first frame of the display image and/or the (a+1)-th frame of the display image is the same as the resolution of the full-depth image.
10. A depth image acquisition apparatus for a projector, comprising:
a receiving module, configured to receive an x-th depth signal fed back for an m-th pixel unit and to receive a y-th depth signal fed back for an n-th pixel unit; wherein the number of pixel units spaced between the m-th pixel unit and the n-th pixel unit is k, the number of depth signals spaced between the x-th depth signal and the y-th depth signal is z, each depth signal is obtained by measuring the light time-of-flight for the corresponding pixel unit with an infrared laser installed on the projector, and each depth signal represents the depth value of the corresponding pixel unit; and
a depth image generation module, configured to generate a first low-resolution depth image based on the x-th depth signal and the plurality of y-th depth signals received within a first frame of the display image; wherein m, n, x, y and k are all greater than or equal to 1, and k is greater than z.
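Taken together, claims 1 through 9 describe an interlaced time-of-flight readout: each display frame samples only every k-th pixel unit (shifted by an offset a from frame to frame), yielding one low-resolution depth image per frame, and k such images are interleaved into a full-resolution depth map. A minimal sketch of that interleaving step, assuming NumPy; the helper names `sample_low_res` and `merge_full_depth` are illustrative, not terms from the patent:

```python
import numpy as np

def sample_low_res(depth_frame, k, a):
    """Sample every k-th pixel unit starting at offset a,
    mimicking one frame's spaced depth-signal readout."""
    flat = depth_frame.ravel()
    return flat[a::k].copy()

def merge_full_depth(low_res_images, k, shape):
    """Interleave k low-resolution depth images back into a
    full-resolution depth map (the claim-9 full-depth image)."""
    full = np.empty(shape[0] * shape[1], dtype=low_res_images[0].dtype)
    for a, img in enumerate(low_res_images):
        full[a::k] = img
    return full.reshape(shape)

# Example: a 4x6 "scene" of depth values, sampled over k = 3 frames.
k = 3
scene = np.arange(24, dtype=float).reshape(4, 6)
lows = [sample_low_res(scene, k, a) for a in range(k)]
full = merge_full_depth(lows, k, scene.shape)
assert np.array_equal(full, scene)
```

Each low-resolution image holds 1/k of the pixel units, which matches claim 4's statement that the display frame's resolution is k times that of each low-resolution depth image.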
CN201910181857.0A 2019-03-11 2019-03-11 Depth image acquisition method and device for projector Active CN110087057B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910181857.0A CN110087057B (en) 2019-03-11 2019-03-11 Depth image acquisition method and device for projector


Publications (2)

Publication Number Publication Date
CN110087057A (en) 2019-08-02
CN110087057B (en) 2021-10-12

Family

ID=67412387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910181857.0A Active CN110087057B (en) 2019-03-11 2019-03-11 Depth image acquisition method and device for projector

Country Status (1)

Country Link
CN (1) CN110087057B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1889631A (en) * 2006-06-06 2007-01-03 Wang Cheng Method for rebuilding super resolution image from reduced quality image caused by interlaced sampling
CN104884972A (en) * 2012-11-27 2015-09-02 E2V Semiconductors Method for producing images with depth information and image sensor
CN105120257A (en) * 2015-08-18 2015-12-02 Ningbo Yingxin Information Technology Co., Ltd. Vertical depth sensing device based on structured light coding
CN105143817A (en) * 2013-04-15 2015-12-09 Microsoft Technology Licensing, LLC Super-resolving depth map by moving pattern projector
US9766060B1 (en) * 2016-08-12 2017-09-19 Microvision, Inc. Devices and methods for adjustable resolution depth mapping
CN107636488A (en) * 2015-07-20 2018-01-26 Google LLC Method and apparatus for increasing the resolution of a time-of-flight pixel array
US9906717B1 (en) * 2016-09-01 2018-02-27 Infineon Technologies Ag Method for generating a high-resolution depth image and an apparatus for generating a high-resolution depth image
CN108648222A (en) * 2018-04-27 2018-10-12 Huazhong University of Science and Technology Method and device for improving the spatial resolution of structured-light depth data
CN108983249A (en) * 2017-06-02 2018-12-11 BYD Co., Ltd. Time-of-flight ranging system, method, ranging sensor and camera
CN108989780A (en) * 2018-08-01 2018-12-11 Goertek Inc. High-resolution projection method for a laser scanning projector
CN109425305A (en) * 2017-09-05 2019-03-05 Facebook Technologies, LLC Depth measurement using multiple pulsed structured light projectors

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2369710C (en) * 2002-01-30 2006-09-19 Anup Basu Method and apparatus for high resolution 3d scanning of objects having voids
WO2003105289A2 (en) * 2002-06-07 2003-12-18 University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US9066087B2 (en) * 2010-11-19 2015-06-23 Apple Inc. Depth mapping using time-coded illumination
KR101887099B1 (en) * 2010-12-29 2018-08-09 삼성전자주식회사 image processing system and image processing method


Also Published As

Publication number Publication date
CN110087057A (en) 2019-08-02

Similar Documents

Publication Publication Date Title
JP6698655B2 (en) Ranging device
JP6605580B2 (en) Linear mode calculation sensing laser radar
CN108474843A (en) Distance-measuring device
JP2021507218A5 (en)
US20180268297A1 (en) Network training device, network training system, network training method, and computer program product
JP7164721B2 (en) Sensor data processing method, device, electronic device and system
CN111240614B (en) Screen projection processing method, device and equipment
CN110087057B (en) Depth image acquisition method and device for projector
WO2022110947A1 (en) Control method for electronic device, electronic device, and computer-readable storage medium
KR102643611B1 (en) Pulse signal-based display methods and apparatus, electronic devices, and media
CN114072697B (en) Method for simulating continuous wave lidar sensor
US20130207967A1 (en) Image processing apparatus and method
EP3023987A1 (en) Method and apparatus for visualizing information of a digital video stream
EP3757945A1 (en) Device for generating an augmented reality image
CN109118702B (en) Fire detection method, device and equipment
US20220351329A1 (en) Image Processing Method, Method for Generating Instructions for Image Processing and Apparatuses Therefor
US20150380056A1 (en) Video Channel Display Method and Apparatus
US9906775B2 (en) Device and method for multiview image calibration
CN110020264B (en) Method and device for determining invalid hyperlinks
KR20170093421A (en) Method for determining object of interest, video processing device and computing device
US20220326358A1 (en) Method and device for determining distances to a scene
CN210093321U (en) Camera and camera control device
CN109582295B (en) Data processing method and device, storage medium and processor
CN109698951B (en) Stereoscopic image reproducing method, apparatus, device and storage medium
CN108234196B (en) Fault detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221129

Address after: 261031 workshop 1, phase III, Geer Photoelectric Industrial Park, 3999 Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: 261031 No.268, Dongfang Road, Weifang High tech Industrial Development Zone, Weifang City, Shandong Province

Patentee before: GOERTEK Inc.