WO2021120217A1 - Image acquisition device, image acquisition method, and acquisition chip

Image acquisition device, image acquisition method, and acquisition chip

Info

Publication number: WO2021120217A1
Authority: WO (WIPO, PCT)
Prior art keywords: image, speckle, projector, projection, infrared camera
Application number: PCT/CN2019/127179
Other languages: English (en), French (fr)
Inventor: 陈文斌 (Chen Wenbin)
Original Assignee: 深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Application filed by 深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Priority to PCT/CN2019/127179 priority Critical patent/WO2021120217A1/zh
Priority to CN201980008848.XA priority patent/CN111656778B/zh
Publication of WO2021120217A1 publication Critical patent/WO2021120217A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/207: Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/218: Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/156: Mixing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/257: Colour aspects

Definitions

  • the embodiments of the present application relate to the field of 3D image technology, and in particular to an image acquisition device, an image acquisition method, and an acquisition chip.
  • 3D image technology is applied in many fields. For example, in face recognition scenarios, using 3D image acquisition yields higher recognition accuracy.
  • the speckle projector projects light spots on the object to be inspected, and images are collected; the depth information of each pixel is determined from the distribution of the light spots in the collected images, and a 3D image is constructed from it. However, during image acquisition, if the overall area of the light spots projected by the speckle projector is small, a comprehensive 3D image of the object to be inspected cannot be acquired; that is, the field of view (FOV) of the acquisition is limited.
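The background above determines depth from the distribution of projected spots. As a hedged illustration of the underlying triangulation (the patent gives no formulas; the function name, symbols, and numbers below are assumptions), a spot's pixel shift against a reference pattern maps to depth as z = f * B / d:

```python
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulated depth z = f * B / d for a structured-light spot.

    f_px: focal length in pixels; baseline_m: projector-camera baseline
    in metres; disparity_px: the spot's pixel shift against a reference
    pattern. Illustrative only; not taken from the patent.
    """
    return f_px * baseline_m / disparity_px

# A spot shifted by 40 px with a 600 px focal length and 5 cm baseline:
print(depth_from_disparity(600.0, 0.05, 40.0))  # 0.75 (metres)
```

Spots with larger disparity are closer to the camera, which is how the spot distribution encodes per-pixel depth.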
  • one of the technical problems solved by the embodiments of the present application is to provide an image acquisition device, an image acquisition method, and an acquisition chip that overcome the prior-art disadvantages of low 3D image accuracy and poor completeness.
  • an embodiment of the present application provides an image acquisition device, including: a processor, an infrared camera module, a first speckle projector and a second speckle projector, and the processor is electrically connected to the infrared camera module;
  • the first speckle projector is configured to perform speckle projection on the first projection area according to a preset direction
  • the infrared camera module is used to perform speckle image acquisition to obtain the first image when the first speckle projector projects the speckle on the first projection area, and transmit the first image to the processor;
  • the second speckle projector is used to perform speckle projection on the second projection area according to a preset direction
  • the infrared camera module is also used to perform speckle image acquisition to obtain a second image when the second speckle projector projects speckles on the second projection area, and to transmit the second image to the processor, wherein the first projection area and the second projection area are adjacent or have an overlapping area;
  • the processor is used to combine the first image and the second image to obtain a speckle collection image.
  • the first speckle projector and the second speckle projector are respectively electrically connected to the processor;
  • the first speckle projector and the second speckle projector are respectively electrically connected to the infrared camera module.
  • the infrared camera module includes an infrared camera, a controller, and a control interface.
  • the infrared camera module is connected to the first speckle projector and the second speckle projector through the control interface;
  • the controller is used to control the first speckle projector and the second speckle projector to perform speckle projection through the control interface.
  • the image acquisition device further includes a distance sensor; the distance sensor is electrically connected to the processor;
  • the distance sensor is used to detect whether there is a target object in the first projection area or the second projection area, and transmit the detection result to the processor;
  • the processor is configured to generate a first collection instruction when the detection result indicates that there is a target object in the first projection area or the second projection area, and transmit the first collection instruction to the infrared camera module, so that the infrared camera module triggers the first speckle projector to perform speckle projection.
  • the processor is further configured to generate a second acquisition instruction when receiving the first image transmitted by the infrared camera module, and transmit the second acquisition instruction to the infrared camera Module, so that the infrared camera module triggers the second speckle projector to perform speckle projection.
  • the infrared camera module is also used to trigger the second speckle projector to perform speckle projection when the first image is collected.
  • the image acquisition device further includes a third speckle projector, and the third speckle projector is electrically connected to the processor and the infrared camera module, respectively;
  • the third speckle projector is used to perform speckle projection on the third area according to a preset direction
  • the infrared camera module is also used to collect speckle images to obtain a third image when the third speckle projector projects speckles on the third area, and transmit the third image to the processor;
  • the processor is also used to combine the first image, the second image, and the third image to obtain a speckle collection image.
  • the image acquisition device further includes a fourth speckle projector, and the fourth speckle projector is electrically connected to the processor and the infrared camera module, respectively;
  • the fourth speckle projector is used to perform speckle projection on the fourth area according to a preset direction
  • the infrared camera module is also used to collect speckle images to obtain a fourth image when the fourth speckle projector projects speckles on the fourth area, and transmit the fourth image to the processor;
  • the processor is also used to combine the first image, the second image, the third image, and the fourth image to obtain a speckle collection image.
  • the image acquisition device further includes an RGB camera; the RGB camera is electrically connected to the processor;
  • the RGB camera is used to collect RGB images from the first projection area and the second projection area to obtain RGB images, and transmit the RGB images to the processor;
  • the processor is used to generate a 3D image based on the RGB image and the speckle collected image.
  • the image acquisition device further includes a flood illuminator, and the flood illuminator is electrically connected to the RGB camera and the processor, respectively;
  • the image acquisition device further includes a substrate, the infrared camera module, the first speckle projector, and the second speckle projector are arranged on the substrate, and the infrared camera module is arranged on the substrate. Between the first speckle projector and the second speckle projector.
  • the distances from the first speckle projector and the second speckle projector to the infrared camera module are equal.
  • the centers of the infrared camera module, the first speckle projector, and the second speckle projector are located on the same straight line.
  • the angle of view of the first speckle projector is the same as the angle of view of the second speckle projector.
  • an image acquisition method, including: performing speckle projection on the first projection area according to a preset direction, and performing speckle image acquisition to obtain a first image; performing speckle projection on the second projection area according to the preset direction, and performing speckle image acquisition to obtain a second image, the first projection area and the second projection area being adjacent or having an overlapping area; and
  • the first image and the second image are combined to obtain a speckle acquisition image.
  • the method further includes:
  • a first acquisition instruction is generated, and the first acquisition instruction is used to instruct to perform speckle image acquisition on the first projection area according to a preset direction.
  • the method further includes:
  • a second acquisition instruction is generated, and the second acquisition instruction is used to instruct to perform speckle image acquisition on the second projection area according to a preset direction.
  • the method further includes: performing speckle projection on the third region according to a preset direction, and performing speckle image collection to obtain a third image;
  • the first image, the second image, and the third image are combined to obtain a speckle acquisition image.
  • the method further includes: performing speckle projection on the fourth area according to a preset direction, and performing speckle image collection to obtain a fourth image;
  • Combining the first image and the second image to obtain a speckle acquisition image includes:
  • the first image, the second image, the third image, and the fourth image are combined to obtain a speckle collection image.
  • the method further includes: performing RGB (red, green, blue) image acquisition on the first projection area and the second projection area to obtain an RGB image; and generating a 3D image based on the RGB image and the speckle acquisition image.
  • an embodiment of the present application provides a collection chip, and the collection chip executes a pre-stored computer program to implement the following method:
  • the acquisition chip is configured to control the first speckle projector to perform speckle projection on the first projection area according to a preset direction; and control the infrared camera module to perform speckle image acquisition to obtain the first image;
  • the acquisition chip is further configured to control the second speckle projector to perform speckle projection on the second projection area according to the preset direction, and to control the infrared camera module to perform speckle image acquisition to obtain the second image, wherein the first projection area and the second projection area are adjacent or have an overlapping area;
  • the acquisition chip is further configured to combine the first image and the second image to obtain a speckle acquisition image.
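The chip's project-capture-project-capture-combine sequence can be sketched as follows (a minimal illustration: the class and the callable stand-ins for the projectors, camera, and combining step are invented names, not from the patent):

```python
class AcquisitionChip:
    """Sketch of the control flow: trigger projector 1, capture,
    trigger projector 2, capture, then combine the two images."""

    def __init__(self, projector1, projector2, camera, combine):
        self.projector1, self.projector2 = projector1, projector2
        self.camera, self.combine = camera, combine

    def acquire(self):
        self.projector1()          # speckle projection on area 1
        first = self.camera()      # speckle image acquisition -> first image
        self.projector2()          # speckle projection on area 2
        second = self.camera()     # speckle image acquisition -> second image
        return self.combine(first, second)

frames = iter(["IMG1", "IMG2"])    # stand-ins for real exposures
chip = AcquisitionChip(
    projector1=lambda: None,
    projector2=lambda: None,
    camera=lambda: next(frames),
    combine=lambda a, b: (a, b),
)
print(chip.acquire())  # ('IMG1', 'IMG2')
```

The key property is strict alternation: only one projector is active per exposure, so the two speckle patterns never mix in a single frame.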
  • the image acquisition device, image acquisition method, and acquisition chip of the embodiments of the present application perform speckle projection on the first projection area according to a preset direction and collect a speckle image to obtain the first image; perform speckle projection on the second projection area according to the preset direction and collect a speckle image to obtain the second image, where the first projection area and the second projection area are adjacent or overlap; and combine the first image and the second image to obtain the speckle acquisition image.
  • a single speckle projector with a smaller field of view can maintain a high number of speckles per unit area of the illuminated region, while the speckle acquisition image of the target object is obtained by splicing the images collected from the two regions. This enlarges the detection range of the target object, that is, it improves the field of view of the image acquisition device, balancing the field of view and the image accuracy of the image acquisition device.
  • FIG. 1 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 2 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 3 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 4 is a structural diagram of an infrared camera module provided by an embodiment of the application.
  • FIG. 5 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 6 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 7 is a flowchart of an image acquisition method provided by an embodiment of the application.
  • FIG. 1 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • the image acquisition device 10 includes a processor 101, an infrared camera module 102, a first speckle projector 103, and a second speckle projector 104; the processor 101 is electrically connected to the infrared camera module 102;
  • the first speckle projector 103 is configured to perform speckle projection on the first projection area according to a preset direction
  • the infrared camera module 102 is configured to perform speckle image acquisition to obtain a first image when the first speckle projector 103 projects speckles on the first projection area, and transmit the first image to the processor 101;
  • the second speckle projector 104 is configured to perform speckle projection on the second projection area according to a preset direction
  • the infrared camera module 102 is also used to collect a speckle image when the second speckle projector 104 projects speckles on the second projection area to obtain a second image, and to transmit the second image to the processor 101, wherein the first projection area is adjacent to the second projection area or has an overlapping area with it;
  • the processor 101 is configured to combine the first image and the second image to obtain a speckle collection image.
  • FIG. 2 is a structural diagram of an image capture device provided by an embodiment of the application.
  • the image acquisition device may also include a substrate 105, such as a PCB or an FPC.
  • the infrared camera module 102, the first speckle projector 103, and the second speckle projector 104 may be arranged on the substrate 105, and the processor may also be arranged on the substrate 105 or connected to the substrate 105 through a wire.
  • the substrate 105 can provide electrical connection and support between the functional modules.
  • the infrared camera module 102 is arranged between the first speckle projector 103 and the second speckle projector 104, and the distances from the first speckle projector 103 and the second speckle projector 104 to the infrared camera module 102 are equal; the centers of the three may lie on the same straight line, or the centers of the first speckle projector 103 and the second speckle projector 104 may lie on a straight line parallel to the edge of the substrate 105.
  • the first speckle projector 103 performs speckle projection on the first projection area, and the first image collected by the infrared camera module 102 is an image of the light spots projected onto the first projection area; the second speckle projector 104 performs speckle projection on the second projection area, and the second image collected by the infrared camera module 102 is an image of the light spots projected onto the second projection area.
  • therefore, when the first image and the second image are combined, the speckle acquisition image covers both the first projection area and the second projection area, increasing the field of view of image acquisition. Moreover, the first speckle projector 103 and the second speckle projector 104 are not active at the same time, so the light spots emitted by the two projectors do not overlap; this avoids an increase in the unit light-spot area and ensures the accuracy of the 3D image.
  • because the first projection area and the second projection area are adjacent or have an overlapping area, the first image and the second image are likewise adjacent or overlapping; according to the positional relationship between the two projection areas, the first image and the second image can be spliced to obtain the speckle acquisition image.
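Such splicing can be sketched as follows (a hedged illustration: the patent does not specify a stitching algorithm, so the function name, the max-merge of the seam, and the horizontal layout are assumptions):

```python
import numpy as np

def splice_speckle_images(img_a: np.ndarray, img_b: np.ndarray, overlap: int) -> np.ndarray:
    """Splice two horizontally adjacent speckle images whose projection
    areas overlap by `overlap` pixel columns (illustrative sketch)."""
    if overlap == 0:
        return np.hstack([img_a, img_b])  # areas merely adjacent
    # Merge the shared columns; a max() merge keeps every bright
    # speckle captured by either exposure.
    left, seam_a = img_a[:, :-overlap], img_a[:, -overlap:]
    seam_b, right = img_b[:, :overlap], img_b[:, overlap:]
    seam = np.maximum(seam_a, seam_b)
    return np.hstack([left, seam, right])

a = np.zeros((4, 6), dtype=np.uint8); a[:, 4] = 200   # spot near right edge
b = np.zeros((4, 6), dtype=np.uint8); b[:, 1] = 180   # spot near left edge
merged = splice_speckle_images(a, b, overlap=2)
print(merged.shape)  # (4, 10)
```

With a 2-column overlap, the 6-column images yield a 10-column mosaic, and spots from both exposures survive in the seam.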
  • the speckle acquisition image can be used to generate a 3D image, or directly identify or register based on the depth data obtained from the speckle acquisition image, and the application does not limit the purpose of the speckle acquisition image.
  • the first projection area is the speckle coverage area projected by the first speckle projector 103
  • the second projection area is the speckle coverage area projected by the second speckle projector 104.
  • the projection area may be the entire area covered by the speckle or only a part of it; this application does not restrict this.
  • the first speckle projector 103 and the second speckle projector 104 project speckles in the same direction, namely the preset direction.
  • for example, the preset direction may be directly in front of the first speckle projector 103; when the first speckle projector 103 and the second speckle projector 104 face the same way, both projection directions are the preset direction. As another example, the preset direction may be the orientation of the camera of the infrared camera module 102. These are only examples, and this application does not limit the preset direction.
  • the preset direction only ensures that the projection directions of the first speckle projector 103 and the second speckle projector 104 are the same: when performing speckle projection and shooting on the target object, as long as the orientations of the speckle projectors agree, the speckles are projected onto the target object along the preset direction.
  • the first speckle projector 103 and the second speckle projector 104 are on the same horizontal line, and both project speckles in the preset direction. The light spots projected by the first speckle projector 103 cover the first projection area, and the light spots projected by the second speckle projector 104 cover the second projection area; there is an overlap area between the first projection area and the second projection area. Of course, the two areas may also simply adjoin each other. In this way, the area that can be collected by the image acquisition device 10 is the union of the first projection area and the second projection area, which greatly improves the horizontal field of view. Likewise, if the first speckle projector 103 and the second speckle projector 104 are arranged on the same vertical line, the vertical field of view of the image acquisition device 10 can be increased; this application does not limit the arrangement.
  • the angle of view of the first speckle projector 103 is the same as that of the second speckle projector 104, which facilitates splicing of the collected images. Assuming the angle of view of the first speckle projector 103 is F1, the size of the first projection area is related to F1, and the area where the first projection area overlaps the second projection area is related to F1 and to the distance between the two projectors: the closer two projectors with field of view F1 are to each other, the larger the overlap area.
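The relationship between field of view, projector spacing, and overlap can be sketched with a simple planar model (an assumption for illustration; the patent gives no formula): each projector of field of view F1 illuminates a strip of width 2 * d * tan(F1/2) at working distance d, and the two strips are offset by the projector baseline.

```python
import math

def overlap_width(fov_deg: float, distance: float, baseline: float) -> float:
    """Width of the overlap between the two projection areas under a
    planar model: each projector lights a strip 2*d*tan(FOV/2) wide,
    and the strips are shifted apart by the projector baseline.
    Names and the model itself are illustrative assumptions."""
    strip = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)
    return max(strip - baseline, 0.0)

# Moving the projectors closer together (smaller baseline) enlarges
# the overlap area, as stated above:
print(overlap_width(60.0, distance=0.5, baseline=0.10))
print(overlap_width(60.0, distance=0.5, baseline=0.02))
```

When the baseline exceeds the strip width, the overlap vanishes and the areas are at best adjacent.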
  • FIG. 3 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • the image acquisition device 10 further includes a distance sensor 106; the distance sensor 106 is electrically connected to the processor 101;
  • the distance sensor 106 is used to detect whether there is a target object in the first projection area or the second projection area, and transmit the detection result to the processor 101;
  • the processor 101 is configured to generate a first collection instruction when the detection result indicates that there is a target object in the first projection area or the second projection area, and transmit the first collection instruction to the infrared camera module 102 so that the infrared camera module 102
  • the first speckle projector 103 is triggered to perform speckle projection.
  • the distance sensor 106 can be an ultrasonic sensor. If there is a target object in the first projection area or the second projection area, the ultrasonic wave emitted by the ultrasonic sensor is reflected back by the target object, and the reflected ultrasonic wave is converted into an electrical signal. The electrical signal is transmitted to the processor 101, and the processor 101 determines that the target object exists according to the electrical signal, and generates a first collection instruction.
  • the distance sensor 106 may also be other types of sensors, such as an infrared light proximity sensor or an infrared light distance sensor, which is not limited in this application.
  • the first acquisition instruction may be transmitted to the infrared camera module 102, which then triggers the first speckle projector 103 to perform speckle projection; alternatively, the processor 101 may transmit the first acquisition instruction to both the infrared camera module 102 and the first speckle projector 103. In order to ensure that the infrared camera module 102 collects the first image, the first acquisition instruction can be sent to the infrared camera module 102 first and then to the first speckle projector 103; this application does not limit the order.
  • next, image acquisition is performed on the second projection area.
  • the processor 101 is further configured to generate a second acquisition instruction when receiving the first image transmitted by the infrared camera module 102, and transmit the second acquisition instruction to the infrared camera Module 102, so that the infrared camera module 102 triggers the second speckle projector 104 to perform speckle projection.
  • the processor 101 controls the infrared camera module 102 to perform image collection on the second projection area through the second collection instruction, and the infrared camera module 102 triggers the second speckle projector 104 to perform speckle projection according to the second collection instruction.
  • the processor 101 may also send the second acquisition instruction to the infrared camera module 102 and the second speckle projector 104.
  • the second acquisition instruction may be sent to the infrared camera module 102 first and then to the second speckle projector 104; this application does not limit the order.
  • the infrared camera module 102 is also used to trigger the second speckle projector 104 to perform speckle projection when the first image is collected.
  • after the infrared camera module 102 collects the first image, it does not need to wait for the processor 101 to issue an instruction: it directly triggers the second speckle projector 104 to perform speckle projection, and then collects the second image from the second projection area. Compared with having the processor 101 make a judgment and then control the infrared camera module 102 to collect the second projection area, this reduces the time for image collection and improves the efficiency of image collection.
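The self-triggering capture chain described above can be sketched as follows (class and method names are invented for illustration; the point is that finishing frame N immediately triggers projector N+1 with no processor round-trip):

```python
from dataclasses import dataclass, field

@dataclass
class Projector:
    """Stand-in for a speckle projector with a simple on/off state."""
    name: str
    on: bool = False

    def project(self):
        self.on = True

    def stop(self):
        self.on = False

@dataclass
class InfraredCamera:
    frames: list = field(default_factory=list)

    def capture(self, proj: Projector) -> str:
        proj.project()                    # trigger projection
        frame = f"image_of_{proj.name}"   # stand-in for a real exposure
        proj.stop()
        self.frames.append(frame)
        return frame

def acquire(camera: InfraredCamera, projectors: list) -> list:
    # The camera chains the captures itself: each completed capture
    # is immediately followed by triggering the next projector.
    return [camera.capture(p) for p in projectors]

cam = InfraredCamera()
images = acquire(cam, [Projector("area1"), Projector("area2")])
print(images)  # ['image_of_area1', 'image_of_area2']
```

Because the loop lives in the camera path rather than the processor, the per-frame latency is just projection plus exposure, which is the efficiency gain described above.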
  • the processor 101 may trigger the first speckle projector 103 and the second speckle projector 104 to project speckles, or the infrared camera module 102 may trigger the first speckle projector 103 and the second speckle projector 104 to project speckles. The connection relationships of these different implementations are explained here:
  • the first speckle projector 103 and the second speckle projector 104 are respectively electrically connected to the processor 101;
  • the first speckle projector 103 and the second speckle projector 104 are electrically connected to the infrared camera module 102 respectively.
  • FIG. 4 is a structural diagram of an infrared camera module provided by an embodiment of this application. In the implementation in which the infrared camera module 102 triggers the first speckle projector 103 and the second speckle projector 104 to project speckles, the infrared camera module 102 includes an infrared camera 1021, a controller 1022, and a control interface 1023. The infrared camera module 102 is connected to the first speckle projector 103 and the second speckle projector 104 through the control interface 1023; the controller 1022 is used to control the first speckle projector 103 and the second speckle projector 104 to perform speckle projection through the control interface 1023.
  • the control interface 1023 can be an I/O (input/output) interface; this application does not limit the specific form of the control interface 1023.
  • the infrared camera module 102 may have multiple control interfaces 1023, each connected to one speckle projector; alternatively, the infrared camera module may have a single shared control interface 1023 connected to multiple speckle projectors, with each speckle projector corresponding to one piece of address information. When the infrared camera module 102 transmits an acquisition instruction through the shared control interface 1023, the instruction carries the address information of a speckle projector to instruct the projector corresponding to that address to perform projection.
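The shared-interface addressing scheme can be sketched as a broadcast bus on which every projector sees each instruction but only the addressee reacts (all names and the address values here are assumptions for illustration, not from the patent):

```python
class AddressedProjector:
    """Projector that filters bus instructions by its own address."""

    def __init__(self, address: int):
        self.address = address
        self.projecting = False

    def on_instruction(self, address: int, command: str):
        if address != self.address:
            return                      # instruction is for a peer
        self.projecting = (command == "PROJECT")

class ControlBus:
    """Stand-in for the shared control interface 1023."""

    def __init__(self, projectors):
        self.projectors = projectors

    def send(self, address: int, command: str):
        # every device sees the frame; only the addressee reacts
        for p in self.projectors:
            p.on_instruction(address, command)

p1, p2 = AddressedProjector(0x10), AddressedProjector(0x11)
bus = ControlBus([p1, p2])
bus.send(0x10, "PROJECT")
print(p1.projecting, p2.projecting)  # True False
bus.send(0x10, "STOP"); bus.send(0x11, "PROJECT")
print(p1.projecting, p2.projecting)  # False True
```

This keeps a single control interface while still guaranteeing that only one projector is active at a time.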
  • the controller 1022 may transmit a clock signal to the first speckle projector 103 and the second speckle projector 104 through the control interface 1023: when the clock signal is at a high level, the corresponding speckle projector is instructed to perform speckle projection; when the clock signal is at a low level, the corresponding speckle projector is instructed to stop working.
  • a signal in some other format may also be transmitted to instruct a speckle projector to project or to stop working; this application does not restrict this.
  • FIG. 5 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • the image acquisition device 10 may also include more speckle projectors.
  • the image acquisition device 10 further includes a third speckle projector 107, and the third speckle projector 107 is electrically connected to the processor 101 and the infrared camera module 102, respectively;
  • the third speckle projector 107 is configured to perform speckle projection on the third area according to a preset direction
  • the infrared camera module 102 is also used to perform speckle image collection when the third speckle projector 107 projects speckles on the third area to obtain a third image, and transmit the third image to the processor 101;
  • the processor 101 is further configured to combine the first image, the second image, and the third image to obtain a speckle collection image.
  • the image acquisition device 10 further includes a fourth speckle projector 108, and the fourth speckle projector 108 is electrically connected to the processor 101 and the infrared camera module 102, respectively;
  • the fourth speckle projector 108 is configured to perform speckle projection on the fourth area according to a preset direction
  • the infrared camera module 102 is also used to perform speckle image collection when the fourth speckle projector 108 projects speckles on the fourth area to obtain a fourth image, and transmit the fourth image to the processor 101;
  • the processor 101 is further configured to combine the first image, the second image, the third image, and the fourth image to obtain a speckle collection image.
  • the projection principles of the third speckle projector 107, the fourth speckle projector 108 and the second speckle projector 104 are the same, and will not be repeated here.
  • optionally, the image acquisition device can be integrated on a PCB (printed circuit board). The first speckle projector 103 and the second speckle projector 104 can be arranged along a first direction, on the left and right sides of the infrared camera module 102 respectively; the direction of the line connecting the midpoints of the first speckle projector 103 and the second speckle projector 104 is the first direction, and this line can be parallel to an edge of the PCB;
  • the third speckle projector 107 and the fourth speckle projector 108 can be arranged along a second direction, above and below the infrared camera module 102 respectively; the direction of the line connecting the midpoints of the third speckle projector 107 and the fourth speckle projector 108 is the second direction, and this line can be perpendicular to the upper or lower edge of the PCB. As shown in FIG. 5, the first direction may be perpendicular to the second direction.
  • the projection directions of the first speckle projector 103, the second speckle projector 104, the third speckle projector 107, and the fourth speckle projector 108 are the same: all of them project according to the preset direction.
  • the four speckle projectors can also be arranged and distributed along a straight line, which is not limited in this application.
  • the infrared camera module 102 triggers the third speckle projector 107 to perform speckle projection on the third area; after the infrared camera module 102 collects the third image, it triggers the fourth speckle projector 108 to perform speckle projection on the fourth area.
  • the triggering method is the same as that of the second speckle projector 104 and will not be repeated here. The more speckle projectors there are, the larger the field of view of image acquisition.
  • FIG. 6 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • the image acquisition device 10 further includes an RGB (red, green, blue) camera 109; the RGB camera 109 is electrically connected to the processor 101;
  • the RGB camera 109 is configured to collect RGB images from the first projection area and the second projection area to obtain RGB images, and transmit the RGB images to the processor 101;
  • the processor 101 is configured to generate a 3D image according to the RGB image and the speckle collected image.
  • the RGB camera 109 may also have other functions. For example, when the image acquisition device 10 performs image acquisition, the RGB camera 109 may first capture the area in front of the lens (which may include the first projection area and the second projection area) to generate a preview image; from the preview, the user can judge the shooting effect and adjust the shooting angle accordingly. The RGB camera 109 can also collect 2D images on its own; this application does not limit the functions of the RGB camera 109.
  • the image acquisition device 10 further includes a flood illuminator 110, which is electrically connected to the RGB camera 109 and the processor 101; the flood illuminator 110 is used to supplement light when the RGB camera 109 captures images.
  • it should be noted that the RGB camera 109 generally needs sufficient light to capture images, whereas the infrared camera module 102 can capture images even in a dark environment. Therefore, when the RGB camera 109 captures images, the flood illuminator 110 can be used to supplement light. Of course, if the ambient light is bright enough, the flood illuminator 110 is not required.
  • the image acquisition device 10 may further include a memory 111. The memory 111 may store a computer program that the processor 101 calls to implement its functions; the memory 111 may also store the driver program of the speckle projectors.
  • the driver program of a speckle projector can instead be stored on the speckle projector itself, which is not limited in this application.
  • as a non-volatile computer-readable storage medium, the memory 111 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the (method for determining the touch position of a capacitive screen) in the embodiments of this application.
  • by running the non-volatile software programs, instructions, and modules stored in the memory 111, the processor 101 executes the various functional applications and data processing of the server, that is, implements the (method for determining the touch position of a capacitive screen) of the foregoing method embodiments.
  • the memory 111 may include a program storage area and a data storage area, where the program storage area can store the operating system and the application program required by at least one function, and the data storage area can store data created according to the use of the (device for determining the touch position of a capacitive screen), and so on.
  • the memory 111 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the memory 111 may optionally include memory arranged remotely from the processor 101, and such remote memory may be connected to the (device for determining the touch position of a capacitive screen) through a network.
  • examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the image acquisition device of the embodiments of the present application performs speckle projection on the first projection area in a preset direction and collects a speckle image to obtain a first image; performs speckle projection on the second projection area in the preset direction and collects a speckle image to obtain a second image, where the first projection area and the second projection area adjoin each other or have an overlapping area; and combines the first image and the second image to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single light spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
  • an embodiment of the application provides an image acquisition method, which is applied to the image acquisition device described in the first embodiment.
  • FIG. 7 is a flowchart of an image acquisition method provided by an embodiment of the present application. As shown in FIG. 7, the image acquisition method provided by the embodiment of the present application includes the following steps:
  • Step S701 Perform speckle projection on the first projection area according to a preset direction, and perform speckle image collection to obtain a first image.
  • optionally, the method further includes:
  • when a target object is detected in the first projection area or the second projection area, a first acquisition instruction is generated, and the first acquisition instruction is used to instruct speckle image collection on the first projection area in the preset direction.
  • Step S702 Perform speckle projection on the second projection area according to the preset direction, and perform speckle image collection to obtain a second image.
  • the first projection area and the second projection area adjoin each other or have an overlapping area.
  • optionally, the method further includes: after the first image is collected, generating a second acquisition instruction, where the second acquisition instruction is used to instruct speckle image collection on the second projection area in the preset direction.
  • Step S703 Combine the first image and the second image to obtain a speckle collection image.
  • the method further includes: performing speckle projection on the third region according to a preset direction, and performing speckle image collection to obtain a third image;
  • the first image, the second image, and the third image are combined to obtain a speckle acquisition image.
  • the method further includes: performing speckle projection on the fourth area according to a preset direction, and performing speckle image collection to obtain a fourth image;
  • Combining the first image and the second image to obtain a speckle acquisition image includes:
  • the first image, the second image, the third image, and the fourth image are combined to obtain a speckle collection image.
  • it should be noted that because the first projection area and the second projection area adjoin or overlap, the first image and the second image also adjoin or overlap; the speckle collection image can therefore be obtained by stitching the first image and the second image according to the positional relationship between the two projection areas.
  • of course, this is only an exemplary description and does not mean that the application is limited thereto.
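The splicing step described above can be illustrated with a minimal NumPy sketch. It assumes, purely for illustration, that the two images are horizontally adjacent and that the calibrated positional relationship tells us how many pixel columns they share; the shared columns are kept from the first image. A real device would use its calibration data instead of a fixed offset.

```python
import numpy as np

def stitch_horizontal(first: np.ndarray, second: np.ndarray, overlap: int) -> np.ndarray:
    """Splice two equally tall images whose fields of view adjoin (overlap == 0)
    or overlap by a known number of pixel columns (overlap > 0)."""
    if overlap < 0 or overlap > min(first.shape[1], second.shape[1]):
        raise ValueError("overlap outside valid range")
    # keep the overlapping strip from the first image, append the rest of the second
    return np.hstack([first, second[:, overlap:]])

a = np.zeros((4, 6), dtype=np.uint8)   # stands in for the first speckle image
b = np.ones((4, 6), dtype=np.uint8)    # stands in for the second speckle image
merged = stitch_horizontal(a, b, overlap=2)
print(merged.shape)  # (4, 10): 6 + 6 - 2 columns
```

With `overlap=0` the two fields of view merely adjoin and the images are simply concatenated, which matches the "adjoin or overlap" wording of the text.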
  • optionally, the method further includes: performing RGB (red-green-blue) image collection on the first projection area and the second projection area to obtain an RGB image, and generating a 3D image based on the RGB image and the speckle collection image.
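How an RGB image and the per-pixel depth recovered from the speckle collection image might be combined into a 3D result can be sketched as a colored point cloud. This is only one plausible reading of "generate a 3D image"; the intrinsic parameters `fx`, `fy`, `cx`, `cy` are illustrative placeholders, and deriving depth from the speckle pattern itself is outside the scope of this sketch.

```python
import numpy as np

def rgbd_to_point_cloud(rgb, depth, fx, fy, cx, cy):
    """Back-project each pixel with valid depth into a 3D point (x, y, z)
    paired with its RGB color, using the pinhole camera model."""
    h, w = depth.shape
    v, u = np.mgrid[0:h, 0:w]          # pixel row (v) and column (u) grids
    valid = depth > 0                  # skip pixels with no depth estimate
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    points = np.stack([x, y, z], axis=1)
    colors = rgb[valid]                # pair each point with its RGB value
    return points, colors

depth = np.array([[0.0, 1.0], [2.0, 0.0]])       # toy depth map from the speckle image
rgb = np.zeros((2, 2, 3), dtype=np.uint8)        # toy RGB image of the same scene
pts, cols = rgbd_to_point_cloud(rgb, depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(pts.shape)  # (2, 3): two pixels have valid depth
```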
  • in the image acquisition method of the embodiments of the present application, speckle projection is performed on the first projection area in a preset direction and a speckle image is collected to obtain a first image; speckle projection is performed on the second projection area in the preset direction and a speckle image is collected to obtain a second image, where the first projection area and the second projection area adjoin each other or have an overlapping area; the first image and the second image are combined to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single light spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
  • an embodiment of the present application provides an acquisition chip, and the acquisition chip executes a pre-stored computer program to implement the following method:
  • the acquisition chip is configured to control the first speckle projector to perform speckle projection on the first projection area in a preset direction, and to control the infrared camera module to collect a speckle image to obtain a first image;
  • the acquisition chip is further configured to control the second speckle projector to perform speckle projection on the second projection area in the preset direction, and to control the infrared camera module to collect a speckle image to obtain a second image, where the first projection area and the second projection area adjoin each other or have an overlapping area;
  • the acquisition chip is further configured to combine the first image and the second image to obtain a speckle acquisition image.
  • the acquisition chip is further configured to generate a first acquisition instruction when a target object is detected in the first projection area or the second projection area, where the first acquisition instruction is used to instruct speckle image collection on the first projection area in the preset direction.
  • the acquisition chip is further configured to generate a second acquisition instruction after the first image is collected, where the second acquisition instruction is used to instruct speckle image collection on the second projection area in the preset direction.
  • the acquisition chip is further configured to control the third speckle projector to perform speckle projection on the third area in the preset direction, and to control the infrared camera module to collect a speckle image to obtain a third image; the acquisition chip is further configured to combine the first image, the second image, and the third image to obtain the speckle collection image.
  • the acquisition chip is further configured to control the fourth speckle projector to perform speckle projection on the fourth area in the preset direction, and to collect a speckle image to obtain a fourth image; the acquisition chip is further configured to combine the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
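Combining three or four sub-images on the chip side generalizes the two-image merge. A minimal sketch, assuming (as the text allows) that the projectors are arranged along one straight line so the sub-images adjoin or overlap in sequence; a real chip would blend the calibrated overlap regions rather than simply dropping columns.

```python
import numpy as np
from functools import reduce

def merge_in_line(images, overlap=0):
    """Combine any number of equally tall sub-images whose projection areas
    lie along one line, adjoining (overlap == 0) or overlapping by a known
    number of pixel columns (overlap > 0)."""
    return reduce(lambda acc, img: np.hstack([acc, img[:, overlap:]]), images)

tiles = [np.full((3, 4), i, dtype=np.uint8) for i in range(4)]
print(merge_in_line(tiles).shape)             # (3, 16) when the areas adjoin
print(merge_in_line(tiles, overlap=1).shape)  # (3, 13) with 1-column overlaps
```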
  • the acquisition chip is further configured to control the RGB camera to perform RGB image collection on the first projection area and the second projection area to obtain an RGB image, and to generate a 3D image based on the RGB image and the speckle collection image.
  • the acquisition chip of the embodiments of the present application controls the first speckle projector to perform speckle projection on the first projection area in a preset direction, and controls the infrared camera module to detect the reflected light produced by the target object in the first projection area, that is, to collect a speckle image of the target to obtain a first image; it controls the second speckle projector to perform speckle projection on the second projection area in the preset direction, and controls the infrared camera module to collect a speckle image to obtain a second image, where the first projection area and the second projection area adjoin each other or have an overlapping area; and it combines the first image and the second image to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single light spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
  • the image acquisition device of the embodiment of the present application exists in various forms, including but not limited to:
  • Mobile communication devices: characterized by mobile communication functions, with voice and data communication as the main goal. Such terminals include smart phones (such as the iPhone), multimedia phones, feature phones, and low-end phones.
  • Ultra-mobile personal computer devices: these belong to the category of personal computers, have computing and processing functions, and generally also have mobile Internet access. Such terminals include PDA, MID, and UMPC devices, such as the iPad.
  • Portable entertainment devices: these can display and play multimedia content. Such devices include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable in-car navigation devices.
  • Server: a device that provides computing services. A server includes a processor, hard disk, memory, system bus, and so on. Servers are similar in architecture to general-purpose computers, but because they must provide highly reliable services, they have higher requirements in terms of processing power, stability, reliability, security, scalability, and manageability.
  • Other devices with an image acquisition function.
  • the processor or controller in this application can be implemented in any suitable manner.
  • for example, the controller can take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller.
  • examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the control logic of a memory.
  • in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the means included in it for realizing various functions can also be regarded as structures within the hardware component; or the means for realizing various functions can even be regarded both as software modules implementing the method and as structures within the hardware component.
  • this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • this application may also take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and so on) containing computer-usable program code.
  • This application may be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • generally, program modules include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types.
  • this application can also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network.
  • in a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An image acquisition device, an image acquisition method, and an acquisition chip. The image acquisition device (10) includes: a processor (101), an infrared camera module (102), a first speckle projector (103), and a second speckle projector (104); the processor (101) is electrically connected to the infrared camera module (102). The first speckle projector (103) is configured to perform speckle projection on a first projection area in a preset direction, and the second speckle projector (104) is configured to perform speckle projection on a second projection area in the preset direction. Because images of the two areas are collected separately, the area of a single light spot does not increase, which guarantees the accuracy of the 3D image; moreover, the finally generated speckle collection image covers both areas, which enlarges the field of view.

Description

Image acquisition device, image acquisition method, and acquisition chip — Technical Field
The embodiments of this application relate to the field of 3D imaging technology, and in particular to an image acquisition device, an image acquisition method, and an acquisition chip.
Background
With the development of technology, 3D imaging is applied in many fields; for example, in face recognition scenarios, performing face recognition based on 3D image acquisition is more accurate. In the process of 3D image acquisition, a speckle projector projects light spots onto the object to be detected and an image is collected; from the distribution of the spots in the collected image, the depth information of each pixel is determined and a 3D image is constructed. However, during image acquisition, if the overall area of the spots projected by the speckle projector is small, a complete 3D image of the object to be detected cannot be acquired; whereas if the field of view (FOV) of the speckle projector is enlarged so that the illuminated area becomes larger, the area of each single spot projected by that projector increases, the number of spots per unit area decreases, and the accuracy of the 3D image is reduced.
Summary
In view of this, one of the technical problems solved by the embodiments of this application is to provide an image acquisition device, an image acquisition method, and an acquisition chip, so as to overcome the defects of low accuracy and poor completeness of 3D images in the prior art.
In a first aspect, an embodiment of this application provides an image acquisition device, including: a processor, an infrared camera module, a first speckle projector, and a second speckle projector, the processor being electrically connected to the infrared camera module;
the first speckle projector is configured to perform speckle projection on a first projection area in a preset direction;
the infrared camera module is configured to collect a speckle image while the first speckle projector projects speckle onto the first projection area to obtain a first image, and to transmit the first image to the processor;
the second speckle projector is configured to perform speckle projection on a second projection area in the preset direction;
the infrared camera module is further configured to collect a speckle image while the second speckle projector projects speckle onto the second projection area to obtain a second image, and to transmit the second image to the processor, the first projection area and the second projection area adjoining each other or having an overlapping area;
the processor is configured to combine the first image and the second image to obtain a speckle collection image.
Optionally, in any embodiment of this application, the first speckle projector and the second speckle projector are each electrically connected to the processor;
and/or, the first speckle projector and the second speckle projector are each electrically connected to the infrared camera module.
Optionally, in any embodiment of this application, the infrared camera module includes an infrared camera, a controller, and a control interface, and the infrared camera module is connected to the first speckle projector and the second speckle projector through the control interface;
the controller is configured to control, through the control interface, the first speckle projector and the second speckle projector to perform speckle projection.
Optionally, in any embodiment of this application, the image acquisition device further includes a distance sensor; the distance sensor is electrically connected to the processor;
the distance sensor is configured to detect whether a target object exists in the first projection area or the second projection area, and to transmit the detection result to the processor;
the processor is configured to generate a first acquisition instruction when the detection result indicates that a target object exists in the first projection area or the second projection area, and to transmit the first acquisition instruction to the infrared camera module, so that the infrared camera module triggers the first speckle projector to perform speckle projection.
Optionally, in any embodiment of this application, the processor is further configured to generate a second acquisition instruction upon receiving the first image transmitted by the infrared camera module, and to transmit the second acquisition instruction to the infrared camera module, so that the infrared camera module triggers the second speckle projector to perform speckle projection.
Optionally, in any embodiment of this application, the infrared camera module is further configured to trigger the second speckle projector to perform speckle projection when the first image has been collected.
Optionally, in any embodiment of this application, the image acquisition device further includes a third speckle projector electrically connected to the processor and to the infrared camera module;
the third speckle projector is configured to perform speckle projection on a third area in the preset direction;
the infrared camera module is further configured to collect a speckle image while the third speckle projector projects speckle onto the third area to obtain a third image, and to transmit the third image to the processor;
the processor is further configured to combine the first image, the second image, and the third image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the image acquisition device further includes a fourth speckle projector electrically connected to the processor and to the infrared camera module;
the fourth speckle projector is configured to perform speckle projection on a fourth area in the preset direction;
the infrared camera module is further configured to collect a speckle image while the fourth speckle projector projects speckle onto the fourth area to obtain a fourth image, and to transmit the fourth image to the processor;
the processor is further configured to combine the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the image acquisition device further includes an RGB camera; the RGB camera is electrically connected to the processor;
the RGB camera is configured to perform RGB image collection on the first projection area and the second projection area to obtain an RGB image, and to transmit the RGB image to the processor;
the processor is configured to generate a 3D image from the RGB image and the speckle collection image.
Optionally, in any embodiment of this application, the image acquisition device further includes a flood illuminator electrically connected to the RGB camera and to the processor;
the flood illuminator is configured to supplement light when the RGB camera collects images.
Optionally, in any embodiment of this application, the image acquisition device further includes a substrate; the infrared camera module, the first speckle projector, and the second speckle projector are arranged on the substrate, and the infrared camera module is arranged between the first speckle projector and the second speckle projector.
Optionally, in any embodiment of this application, the first speckle projector and the second speckle projector are at equal distances from the infrared camera module.
Optionally, in any embodiment of this application, the centers of the infrared camera module, the first speckle projector, and the second speckle projector lie on one straight line.
Optionally, in any embodiment of this application, the field of view of the first speckle projector is the same as the field of view of the second speckle projector.
In a second aspect, an embodiment of this application provides an image acquisition method, including:
performing speckle projection on a first projection area in a preset direction, and collecting a speckle image to obtain a first image;
performing speckle projection on a second projection area in the preset direction, and collecting a speckle image to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area;
combining the first image and the second image to obtain a speckle collection image.
Optionally, in any embodiment of this application, the method further includes:
when a target object is detected in the first projection area or the second projection area, generating a first acquisition instruction, the first acquisition instruction being used to instruct speckle image collection on the first projection area in the preset direction.
Optionally, in any embodiment of this application, the method further includes:
after the first image is collected, generating a second acquisition instruction, the second acquisition instruction being used to instruct speckle image collection on the second projection area in the preset direction.
Optionally, in any embodiment of this application, the method further includes: performing speckle projection on a third area in the preset direction, and collecting a speckle image to obtain a third image;
combining the first image, the second image, and the third image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the method further includes: performing speckle projection on a fourth area in the preset direction, and collecting a speckle image to obtain a fourth image;
combining the first image and the second image to obtain a speckle collection image includes:
combining the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the method further includes: performing RGB (red-green-blue) image collection on the first projection area and the second projection area to obtain an RGB image; and generating a 3D image from the RGB image and the speckle collection image.
In a third aspect, an embodiment of this application provides an acquisition chip; the acquisition chip executes a pre-stored computer program to implement the following method:
the acquisition chip is configured to control a first speckle projector to perform speckle projection on a first projection area in a preset direction, and to control an infrared camera module to collect a speckle image to obtain a first image;
the acquisition chip is further configured to control a second speckle projector to perform speckle projection on a second projection area in the preset direction, and to control the infrared camera module to collect a speckle image to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area;
the acquisition chip is further configured to combine the first image and the second image to obtain a speckle collection image.
In the image acquisition device, image acquisition method, and acquisition chip of the embodiments of this application, speckle projection is performed on the first projection area in a preset direction and a speckle image is collected to obtain a first image; speckle projection is performed on the second projection area in the preset direction and a speckle image is collected to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area; the first image and the second image are combined to obtain a speckle collection image. Because speckle is projected onto the two areas separately and images are collected separately, a single speckle projector can keep a relatively small field of view and thus maintain a high number of spots per unit of illuminated area, while the speckle collection image of the target object is obtained by stitching the images collected from the two areas, which enlarges the detection range for the target object, that is, enlarges the field of view of the image acquisition device, balancing the field of view and the image accuracy of the device.
Brief Description of the Drawings
Some specific embodiments of this application will be described below in detail, by way of example and not limitation, with reference to the accompanying drawings. Identical reference numerals in the drawings denote identical or similar components or parts. Those skilled in the art should understand that these drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 is a structural diagram of an image acquisition device provided by an embodiment of this application;
FIG. 2 is a structural diagram of an image acquisition device provided by an embodiment of this application;
FIG. 3 is a structural diagram of an image acquisition device provided by an embodiment of this application;
FIG. 4 is a structural diagram of an infrared camera module provided by an embodiment of this application;
FIG. 5 is a structural diagram of an image acquisition device provided by an embodiment of this application;
FIG. 6 is a structural diagram of an image acquisition device provided by an embodiment of this application;
FIG. 7 is a flowchart of an image acquisition method provided by an embodiment of this application.
Detailed Description
Implementing any technical solution of the embodiments of this application does not necessarily require achieving all of the above advantages at the same time.
To help those skilled in the art better understand the technical solutions in the embodiments of this application, the technical solutions in the embodiments of this application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application shall fall within the protection scope of the embodiments of this application.
Specific implementations of the embodiments of this application are further described below with reference to the accompanying drawings.
Embodiment 1
An embodiment of this application provides an image acquisition device. As shown in FIG. 1, which is a structural diagram of an image acquisition device provided by an embodiment of this application, the image acquisition device 10 includes: a processor 101, an infrared camera module 102, a first speckle projector 103, and a second speckle projector 104; the processor 101 is electrically connected to the infrared camera module 102;
the first speckle projector 103 is configured to perform speckle projection on a first projection area in a preset direction;
the infrared camera module 102 is configured to collect a speckle image while the first speckle projector 103 projects speckle onto the first projection area to obtain a first image, and to transmit the first image to the processor 101;
the second speckle projector 104 is configured to perform speckle projection on a second projection area in the preset direction;
the infrared camera module 102 is further configured to collect a speckle image while the second speckle projector 104 projects speckle onto the second projection area to obtain a second image, and to transmit the second image to the processor 101, the first projection area and the second projection area adjoining each other or having an overlapping area;
the processor 101 is configured to combine the first image and the second image to obtain a speckle collection image.
Optionally, in one embodiment, as shown in FIG. 2, which is a structural diagram of an image acquisition device provided by an embodiment of this application, the image acquisition device may further include a substrate 105, for example a PCB or FPC substrate 105. The infrared camera module 102, the first speckle projector 103, and the second speckle projector 104 may be arranged on the substrate 105, and the processor may also be arranged on the substrate 105 or connected to it through wiring. The substrate 105 provides electrical connections between, and mechanical support for, the functional modules. In one specific embodiment, the infrared camera module 102 is arranged between the first speckle projector 103 and the second speckle projector 104, the two projectors are at equal distances from the infrared camera module 102, and the centers of the three may lie on one straight line; alternatively, the centers of the first speckle projector 103 and the second speckle projector 104 may lie on one straight line parallel to an edge of the substrate 105.
The first speckle projector 103 performs speckle projection on the first projection area, and the first image collected by the infrared camera module 102 is an image of the spots projected onto the first projection area; the second speckle projector 104 performs speckle projection on the second projection area, and the second image collected by the infrared camera module 102 is an image of the spots projected onto the second projection area. Therefore, the speckle collection image obtained by combining the first image and the second image covers both the first projection area and the second projection area, which enlarges the field of view of image acquisition; moreover, the first speckle projector 103 and the second speckle projector 104 do not project at the same time, so the spots emitted by the two projectors do not overlap, the area of a single spot does not increase, and the accuracy of the 3D image is guaranteed. It should be noted that because the first projection area and the second projection area adjoin or overlap, the first image and the second image also adjoin or overlap, and the speckle collection image can be obtained by stitching the first image and the second image according to the positional relationship of the two projection areas; of course, this is only an exemplary description and does not mean that this application is limited thereto. The speckle collection image may be used to generate a 3D image, or recognition or registration may be performed directly on depth data obtained from the speckle collection image; this application does not limit the use to which the speckle collection image is put.
The first projection area is the area covered by the speckle projected by the first speckle projector 103, and the second projection area is the area covered by the speckle projected by the second speckle projector 104; each may of course be the whole covered area or only a part of it, which this application does not limit.
The first speckle projector 103 and the second speckle projector 104 project speckle in the same, preset, direction. For example, taking the first speckle projector 103 as an example, the preset direction may be straight ahead of the first speckle projector 103; if the first speckle projector 103 and the second speckle projector 104 face the same way, both projection directions are the preset direction. As another example, the preset direction may be the direction the camera of the infrared camera module 102 faces. Of course, these are only exemplary descriptions; this application does not limit the preset direction, whose only purpose is to ensure that the first speckle projector 103 and the second speckle projector 104 project in the same direction. To project speckle onto the target object and photograph it, it suffices that the projectors share the same direction and that projecting in the preset direction casts the speckle onto the target object.
As shown in FIG. 2, the first speckle projector 103 and the second speckle projector 104 are on the same horizontal line, and both project speckle in the preset direction; the spots projected by the first speckle projector 103 cover the first projection area, the spots projected by the second cover the second projection area, and the two areas overlap. Of course, the first projection area and the second projection area may instead merely adjoin. In this way, the area that the image acquisition device 10 can capture is the union of the first and second projection areas, which greatly enlarges the horizontal field of view. Of course, if the first speckle projector 103 and the second speckle projector 104 are arranged on the same vertical line, the vertical field of view of the image acquisition device 10 can be enlarged instead; this application does not limit this.
In one embodiment, the field of view of the first speckle projector 103 is the same as that of the second speckle projector 104, which on the one hand simplifies parts procurement when producing the image acquisition device and on the other hand facilitates stitching the two images. Suppose the field of view of the first speckle projector 103 is F1; the size of the first projection area depends on F1, while the region where the first and second projection areas overlap depends on F1 and on the spacing between the two projectors: the closer two projectors with field of view F1 are to each other, the larger their overlapping region.
FIG. 3 is a structural diagram of an image acquisition device provided by an embodiment of this application. Optionally, in one embodiment of this application, to further explain how the first speckle projector 103 and the second speckle projector 104 perform speckle projection, as shown in FIG. 3, the image acquisition device 10 further includes a distance sensor 106 electrically connected to the processor 101;
the distance sensor 106 is configured to detect whether a target object exists in the first projection area or the second projection area, and to transmit the detection result to the processor 101;
the processor 101 is configured to generate a first acquisition instruction when the detection result indicates that a target object exists in the first projection area or the second projection area, and to transmit the first acquisition instruction to the infrared camera module 102, so that the infrared camera module 102 triggers the first speckle projector 103 to perform speckle projection.
The distance sensor 106 may be an ultrasonic sensor: if a target object exists in the first projection area or the second projection area, the ultrasonic wave emitted by the sensor is reflected back by the target object and converted into an electrical signal; the distance sensor 106 transmits the electrical signal to the processor 101, which determines from it that a target object exists and generates the first acquisition instruction. Of course, the distance sensor 106 may also be another kind of sensor, such as an infrared proximity sensor or an infrared distance sensor; this application does not limit this.
It should be noted that the first acquisition instruction may be transmitted to the infrared camera module 102, which then triggers the first speckle projector 103 to perform speckle projection; alternatively, the processor 101 may transmit the first acquisition instruction to both the infrared camera module 102 and the first speckle projector 103. To ensure that the infrared camera module 102 captures the first image, the first acquisition instruction may be sent to the infrared camera module 102 first and to the first speckle projector 103 afterwards; this application does not limit this.
After the first image is collected, image collection is performed on the second projection area. Two specific implementations are listed here:
Optionally, in the first implementation, the processor 101 is further configured to generate a second acquisition instruction upon receiving the first image transmitted by the infrared camera module 102, and to transmit the second acquisition instruction to the infrared camera module 102, so that the infrared camera module 102 triggers the second speckle projector 104 to perform speckle projection.
The processor 101 uses the second acquisition instruction to control the infrared camera module 102 to perform image collection on the second projection area, and the infrared camera module 102 triggers the second speckle projector 104 according to the second acquisition instruction. Of course, the processor 101 may also send the second acquisition instruction to both the infrared camera module 102 and the second speckle projector 104, sending it to the infrared camera module 102 first; this application does not limit this.
Optionally, in the second implementation, the infrared camera module 102 is further configured to trigger the second speckle projector 104 to perform speckle projection when the first image has been collected.
After collecting the first image, the infrared camera module 102 does not need to wait for an instruction from the processor 101; it directly triggers the second speckle projector 104 to perform speckle projection and then performs image collection on the second projection area to obtain the second image. The processor 101 is not needed to make the judgment and control collection of the second projection area, which reduces the time and improves the efficiency of image acquisition.
Either the processor 101 or the infrared camera module 102 may trigger the first speckle projector 103 and the second speckle projector 104 to project speckle. The connection relationships of the different implementations are described here:
Optionally, in any embodiment of this application, the first speckle projector 103 and the second speckle projector 104 are each electrically connected to the processor 101;
and/or, the first speckle projector 103 and the second speckle projector 104 are each electrically connected to the infrared camera module 102.
Further optionally, in any embodiment of this application, as shown in FIG. 4, which is a structural diagram of an infrared camera module provided by an embodiment of this application, when the infrared camera module 102 triggers the first speckle projector 103 and the second speckle projector 104 to project speckle, the infrared camera module 102 includes an infrared camera 1021, a controller 1022, and a control interface 1023, and is connected to the first speckle projector 103 and the second speckle projector 104 through the control interface 1023; the controller 1022 is configured to control, through the control interface 1023, the first speckle projector 103 and the second speckle projector 104 to perform speckle projection.
It should be noted that the control interface 1023 may be an I/O (input/output) interface; this application does not limit its specific form. The infrared camera module 102 may have several control interfaces 1023, one per speckle projector; alternatively, it may have one shared control interface 1023 connected to several speckle projectors, each projector corresponding to one piece of address information, and when the infrared camera module 102 transmits an acquisition instruction through the control interface 1023 it carries the address information of a speckle projector, instructing the projector corresponding to that address to project. Of course, the above is only an exemplary description and does not mean that this application is limited thereto. In one optional implementation, the controller 1022 may transmit a clock signal to the first speckle projector 103 and the second speckle projector 104 through the control interface 1023: when the clock signal is high, the corresponding speckle projector is instructed to project speckle, and when it is low, the corresponding projector is instructed to stop working. Of course, this too is only an exemplary description; a signal in some agreed format could also be transmitted to instruct a speckle projector to project or to stop, and this application does not limit this.
FIG. 5 is a structural diagram of an image acquisition device provided by an embodiment of this application. As shown in FIG. 5, the image acquisition device 10 may include more speckle projectors. Two examples are given here. Optionally, in the first example, the image acquisition device 10 further includes a third speckle projector 107 electrically connected to the processor 101 and to the infrared camera module 102;
the third speckle projector 107 is configured to perform speckle projection on a third area in the preset direction;
the infrared camera module 102 is further configured to collect a speckle image while the third speckle projector 107 projects speckle onto the third area to obtain a third image, and to transmit the third image to the processor 101;
the processor 101 is further configured to combine the first image, the second image, and the third image to obtain the speckle collection image.
Optionally, in the second example, the image acquisition device 10 further includes a fourth speckle projector 108 electrically connected to the processor 101 and to the infrared camera module 102;
the fourth speckle projector 108 is configured to perform speckle projection on a fourth area in the preset direction;
the infrared camera module 102 is further configured to collect a speckle image while the fourth speckle projector 108 projects speckle onto the fourth area to obtain a fourth image, and to transmit the fourth image to the processor 101;
the processor 101 is further configured to combine the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
The projection principles of the third speckle projector 107 and the fourth speckle projector 108 are the same as that of the second speckle projector 104 and are not repeated here.
Optionally, the image acquisition device may be integrated on a PCB (printed circuit board). The first speckle projector 103 and the second speckle projector 104 may be arranged along a first direction, to the left and right of the infrared camera module 102 respectively; the direction of the straight line through the midpoints of the first speckle projector 103 and the second speckle projector 104 is the first direction, and the line connecting their midpoints may be parallel to an edge of the PCB.
The third speckle projector 107 and the fourth speckle projector 108 may be arranged along a second direction, above and below the infrared camera module 102 respectively; the direction of the straight line through the midpoints of the third speckle projector 107 and the fourth speckle projector 108 is the second direction, and the line connecting their midpoints is perpendicular to the upper or lower surface of the PCB. As shown in FIG. 5, the first direction may be perpendicular to the second direction; the projection directions of the first speckle projector 103, the second speckle projector 104, the third speckle projector 107, and the fourth speckle projector 108 are the same, all projecting in the preset direction. Of course, the four speckle projectors may also be arranged along one straight line; this application does not limit this.
It should be noted that after the second image is collected, the infrared camera module 102 triggers the third speckle projector 107 to perform speckle projection on the third area, and after collecting the third image it triggers the fourth speckle projector 108 to perform speckle projection on the fourth area. The triggering method is the same as for the second speckle projector 104 and is not repeated here. The more speckle projectors there are, the larger the field of view of image acquisition.
FIG. 6 is a structural diagram of an image acquisition device provided by an embodiment of this application. Optionally, in any embodiment of this application, as shown in FIG. 6, the image acquisition device 10 further includes an RGB (red-green-blue) camera 109 electrically connected to the processor 101;
the RGB camera 109 is configured to perform RGB image collection on the first projection area and the second projection area to obtain an RGB image, and to transmit the RGB image to the processor 101;
the processor 101 is configured to generate a 3D image from the RGB image and the speckle collection image.
The RGB camera 109 may also have other functions. For example, when the image acquisition device 10 performs image acquisition, the RGB camera 109 may first capture the area in front of the lens (which may include the first projection area and the second projection area) to generate a preview image, from which the user can judge the shooting effect and adjust the shooting angle and so on. The RGB camera 109 may also collect 2D images on its own; this application does not limit the functions of the RGB camera 109.
Optionally, in any embodiment of this application, the image acquisition device 10 further includes a flood illuminator 110 electrically connected to the RGB camera 109 and to the processor 101; the flood illuminator 110 is configured to supplement light when the RGB camera 109 collects images.
It should be noted that the RGB camera 109 generally needs sufficient light to collect images, whereas the infrared camera module 102 can collect images even in a dark environment; therefore, when the RGB camera 109 collects images, the flood illuminator 110 can be used to supplement light. Of course, if the ambient light is bright enough, the flood illuminator 110 is not needed.
Optionally, in one embodiment of this application, as shown in FIG. 5, the image acquisition device 10 may further include a memory 111. The memory 111 may store a computer program that the processor 101 calls to implement its functions; the memory 111 may also store the driver program of the speckle projectors, though the driver program may instead be stored on the speckle projectors themselves, which this application does not limit. As a non-volatile computer-readable storage medium, the memory 111 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the (method for determining the touch position of a capacitive screen) in the embodiments of this application. By running the non-volatile software programs, instructions, and modules stored in the memory 111, the processor 101 executes the various functional applications and data processing of the server, that is, implements the (method for determining the touch position of a capacitive screen) of the above method embodiments.
The memory 111 may include a program storage area and a data storage area, where the program storage area can store the operating system and the application program required by at least one function, and the data storage area can store data created according to the use of the (device for determining the touch position of a capacitive screen), and so on. In addition, the memory 111 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. In some embodiments, the memory 111 may optionally include memory arranged remotely from the processor 101, and such remote memory may be connected to the (device for determining the touch position of a capacitive screen) through a network. Examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
The image acquisition device of the embodiments of this application performs speckle projection on the first projection area in a preset direction and collects a speckle image to obtain a first image; performs speckle projection on the second projection area in the preset direction and collects a speckle image to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area; and combines the first image and the second image to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
Embodiment 2
Based on the image acquisition device described in Embodiment 1, an embodiment of this application provides an image acquisition method applied to that device. FIG. 7 is a flowchart of an image acquisition method provided by an embodiment of this application; as shown in FIG. 7, the image acquisition method provided by the embodiment of this application includes the following steps:
Step S701: perform speckle projection on a first projection area in a preset direction, and collect a speckle image to obtain a first image.
Optionally, in any embodiment of this application, the method further includes:
when a target object is detected in the first projection area or the second projection area, generating a first acquisition instruction, the first acquisition instruction being used to instruct speckle image collection on the first projection area in the preset direction.
Step S702: perform speckle projection on a second projection area in the preset direction, and collect a speckle image to obtain a second image.
The first projection area and the second projection area adjoin each other or have an overlapping area.
Optionally, in any embodiment of this application, the method further includes: after the first image is collected, generating a second acquisition instruction, the second acquisition instruction being used to instruct speckle image collection on the second projection area in the preset direction.
Step S703: combine the first image and the second image to obtain a speckle collection image.
It should be noted that speckle may also be projected onto, and images collected from, more areas, for example:
Optionally, in any embodiment of this application, the method further includes: performing speckle projection on a third area in the preset direction, and collecting a speckle image to obtain a third image;
combining the first image, the second image, and the third image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the method further includes: performing speckle projection on a fourth area in the preset direction, and collecting a speckle image to obtain a fourth image;
combining the first image and the second image to obtain a speckle collection image includes:
combining the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
It should be noted that because the first projection area and the second projection area adjoin or overlap, the first image and the second image also adjoin or overlap, and the speckle collection image can be obtained by stitching the first image and the second image according to the positional relationship of the two projection areas; of course, this is only an exemplary description and does not mean that this application is limited thereto.
Optionally, in any embodiment of this application, the method further includes: performing RGB (red-green-blue) image collection on the first projection area and the second projection area to obtain an RGB image; and generating a 3D image from the RGB image and the speckle collection image.
In the image acquisition method of the embodiments of this application, speckle projection is performed on the first projection area in a preset direction and a speckle image is collected to obtain a first image; speckle projection is performed on the second projection area in the preset direction and a speckle image is collected to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area; the first image and the second image are combined to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
Embodiment 3
Based on the image acquisition device described in Embodiment 1, an embodiment of this application provides an acquisition chip; the acquisition chip executes a pre-stored computer program to implement the following method:
the acquisition chip is configured to control a first speckle projector to perform speckle projection on a first projection area in a preset direction, and to control an infrared camera module to collect a speckle image to obtain a first image;
the acquisition chip is further configured to control a second speckle projector to perform speckle projection on a second projection area in the preset direction, and to control the infrared camera module to collect a speckle image to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area;
the acquisition chip is further configured to combine the first image and the second image to obtain a speckle collection image.
Optionally, in any embodiment of this application, the acquisition chip is further configured to generate a first acquisition instruction when a target object is detected in the first projection area or the second projection area, the first acquisition instruction being used to instruct speckle image collection on the first projection area in the preset direction.
Optionally, in any embodiment of this application, the acquisition chip is further configured to generate a second acquisition instruction after the first image is collected, the second acquisition instruction being used to instruct speckle image collection on the second projection area in the preset direction.
Optionally, in any embodiment of this application, the acquisition chip is further configured to control a third speckle projector to perform speckle projection on a third area in the preset direction and to control the infrared camera module to collect a speckle image to obtain a third image; the acquisition chip is further configured to combine the first image, the second image, and the third image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the acquisition chip is further configured to control a fourth speckle projector to perform speckle projection on a fourth area in the preset direction and to collect a speckle image to obtain a fourth image; the acquisition chip is further configured to combine the first image, the second image, the third image, and the fourth image to obtain the speckle collection image.
Optionally, in any embodiment of this application, the acquisition chip is further configured to control an RGB camera to perform RGB image collection on the first projection area and the second projection area to obtain an RGB image, and to generate a 3D image from the RGB image and the speckle collection image.
The acquisition chip of the embodiments of this application controls the first speckle projector to perform speckle projection on the first projection area in a preset direction and controls the infrared camera module to detect the reflected light produced by the target object in the first projection area, that is, to collect a speckle image of the target to obtain a first image; it controls the second speckle projector to perform speckle projection on the second projection area in the preset direction and controls the infrared camera module to collect a speckle image to obtain a second image, the first projection area and the second projection area adjoining each other or having an overlapping area; and it combines the first image and the second image to obtain a speckle collection image. Because images of the two areas are collected separately, the area of a single spot does not increase, which guarantees the accuracy of the 3D image, and the finally generated speckle collection image covers both areas, which enlarges the field of view.
The image acquisition device of the embodiments of this application exists in many forms, including but not limited to:
(1) Mobile communication devices: characterized by mobile communication functions, with voice and data communication as the main goal. Such terminals include smart phones (such as the iPhone), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices: these belong to the category of personal computers, have computing and processing functions, and generally also have mobile Internet access. Such terminals include PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices: these can display and play multimedia content. Such devices include audio and video players (such as the iPod), handheld game consoles, e-book readers, smart toys, and portable in-car navigation devices.
(4) Servers: devices that provide computing services. A server includes a processor, hard disk, memory, system bus, and so on; servers are similar in architecture to general-purpose computers, but because they must provide highly reliable services, they have higher requirements for processing power, stability, reliability, security, scalability, manageability, and so on.
(5) Other devices with an image acquisition function.
Specific embodiments of the present subject matter have been described above; other embodiments fall within the scope of the appended claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve the desired results. In addition, the processes depicted in the accompanying drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results; in certain implementations, multitasking and parallel processing may be advantageous.
The processor or controller in this application can be implemented in any suitable manner. For example, the controller can take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the means included in it for realizing various functions can also be regarded as structures within the hardware component; or the means for realizing various functions can even be regarded both as software modules implementing the method and as structures within the hardware component.
It should also be noted that the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes that element.
Those skilled in the art should understand that the embodiments of this application may be provided as a method or a computer program product. Therefore, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and so on) containing computer-usable program code.
This application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types. This application can also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.
The embodiments in this specification are described progressively; for identical or similar parts of the embodiments, reference may be made between them, and each embodiment focuses on what differs from the others. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for the relevant parts, refer to the description of the method embodiments.
The above are only embodiments of this application and are not intended to limit it. Various modifications and variations of this application will occur to those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall be included within the scope of the claims of this application.

Claims (21)

  1. 一种图像采集装置,其特征在于,包括:处理器、红外摄像模组、第一散斑投射器及第二散斑投射器;所述处理器与所述红外摄像模组电连接;
    所述第一散斑投射器,用于按照预设方向对第一投射区域进行散斑投射;
    所述红外摄像模组,用于在所述第一散斑投射器对所述第一投射区域投射散斑时进行散斑图像采集得到第一图像,并将所述第一图像传输至所述处理器;
    所述第二散斑投射器,用于按照所述预设方向对第二投射区域进行散斑投射;
    所述红外摄像模组,还用于在所述第二散斑投射器对所述第二投射区域投射散斑时进行散斑图像采集得到第二图像,并将所述第二图像传输至所述处理器,所述第一投射区域和所述第二投射区域相接或存在重叠区域;
    所述处理器,用于将所述第一图像和所述第二图像进行合并得到散斑采集图像。
  2. 根据权利要求1所述的装置,其特征在于,所述第一散斑投射器及所述第二散斑投射器分别与所述处理器电连接;
    和/或,所述第一散斑投射器及所述第二散斑投射器分别与所述红外摄像模组电连接。
  3. 根据权利要求2所述的装置,其特征在于,所述红外摄像模组包括红外摄像头、控制器和控制接口,所述红外摄像模组通过所述控制接口分别与所述第一散斑投射器及所述第二散斑投射器连接;
    所述控制器,用于通过所述控制接口控制所述第一散斑投射器和所述第二散斑投射器进行散斑投射。
  4. 根据权利要求1所述的装置,其特征在于,所述图像采集装置还包括距离传感器;所述距离传感器与所述处理器电连接;
    所述距离传感器用于检测所述第一投射区域或所述第二投射区域是否存在目标对象,并将检测结果传输至所述处理器;
    所述处理器,用于在所述检测结果指示所述第一投射区域或所述第二投射区域存在所述目标对象时,生成第一采集指令,并将所述第一采集指令传输至所述红外摄像模组,以便所述红外摄像模组触发所述第一散斑投射器进行散斑投射。
  5. 根据权利要求1所述的装置,其特征在于,
    所述处理器,还用于在接收到所述红外摄像模组传输的所述第一图像时, 生成第二采集指令,并将所述第二采集指令传输至所述红外摄像模组,以便所述红外摄像模组触发所述第二散斑投射器进行散斑投射。
  6. 根据权利要求1所述的装置,其特征在于,
    所述红外摄像模组,还用于在采集到所述第一图像时,触发所述第二散斑投射器进行散斑投射。
  7. 根据权利要求1所述的装置,其特征在于,所述图像采集装置还包括第三散斑投射器,所述第三散斑投射器分别与所述处理器和所述红外摄像模组电连接;
    所述第三散斑投射器,用于按照所述预设方向对第三区域进行散斑投射;
    所述红外摄像模组,还用于在所述第三散斑投射器对所述第三区域投射散斑时进行散斑图像采集得到第三图像,并将所述第三图像传输至所述处理器;
    所述处理器,还用于将所述第一图像、所述第二图像和所述第三图像进行合并得到所述散斑采集图像。
  8. 根据权利要求7所述的装置,其特征在于,所述图像采集装置还包括第四散斑投射器,所述第四散斑投射器分别与所述处理器和所述红外摄像模组电连接;
    所述第四散斑投射器,用于按照所述预设方向对第四区域进行散斑投射;
    所述红外摄像模组,还用于在所述第四散斑投射器对所述第四区域投射散斑时进行散斑图像采集得到第四图像,并将所述第四图像传输至所述处理器;
    所述处理器,还用于将所述第一图像、所述第二图像、所述第三图像和所述第四图像进行合并得到所述散斑采集图像。
  9. 根据权利要求1所述的装置,其特征在于,所述图像采集装置还包括RGB红绿蓝摄像头;所述RGB摄像头与所述处理器电连接;
    所述RGB摄像头,用于对所述第一投射区域和所述第二投射区域进行RGB图像采集得到RGB图像,并将所述RGB图像传输至所述处理器;
    所述处理器,用于根据所述RGB图像与所述散斑采集图像生成3D图像。
  10. 根据权利要求9所述的装置,其特征在于,所述图像采集装置还包括泛光照明器,所述泛光照明器分别与所述RGB摄像头和所述处理器电连接;
    所述泛光照明器,用于在所述RGB摄像头采集图像时进行补光。
  11. 根据权利要求1-10任意一项所述的装置,其中,所述装置还包括基板,所述红外摄像模组、第一散斑投射器及第二散斑投射器设置在所述基板,且所 述红外摄像模组设置在所述第一散斑投射器与所述第二散斑投射器之间。
  12. 根据权利要求11所述的装置,其中,所述第一散斑投射器及所述第二散斑投射器到所述红外摄像模组的距离相等。
  13. 根据权利要求11所述的装置,其中,所述红外摄像模组、所述第一散斑投射器及所述第二散斑投射器的中心位于同一直线。
  14. 根据权利要求1-13任意一项所述的装置,其中,所述第一散斑投射器的视场角与所述第二散斑投射器的视场角相同。
  15. An image acquisition method, comprising:
    projecting speckle onto a first projection region in a preset direction, and capturing a speckle image to obtain a first image;
    projecting speckle onto a second projection region in the preset direction, and capturing a speckle image to obtain a second image, the first projection region and the second projection region abutting or overlapping each other; and
    merging the first image and the second image into a combined speckle image.
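The third step of claim 15 is the merge of two frames whose projection regions abut or overlap. A minimal sketch of one way that merge could work, assuming the two frames are row-aligned and the overlap width in pixels is known; the function name `merge_speckle_images` and the pixel-wise-maximum blending rule are illustrative assumptions, not the method the patent specifies:

```python
import numpy as np

def merge_speckle_images(img1, img2, overlap):
    """Stitch two speckle frames covering horizontally adjacent
    projection regions. `overlap` is the number of shared columns
    (0 when the regions merely abut). In the shared band the
    pixel-wise maximum is kept, so a speckle dot bright in either
    exposure survives into the combined image."""
    if overlap == 0:
        # Regions abut: simple side-by-side concatenation.
        return np.hstack([img1, img2])
    blended = np.maximum(img1[:, -overlap:], img2[:, :overlap])
    return np.hstack([img1[:, :-overlap], blended, img2[:, overlap:]])
```

With `overlap=0` the result is a plain concatenation; any nonzero overlap shrinks the combined width accordingly, matching the claim's "abut or overlap" condition.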
  16. The method according to claim 15, further comprising:
    upon detecting that a target object is present in the first projection region or the second projection region, generating a first acquisition instruction, the first acquisition instruction instructing speckle image acquisition of the first projection region in the preset direction.
  17. The method according to claim 15, further comprising:
    after the first image has been captured, generating a second acquisition instruction, the second acquisition instruction instructing speckle image acquisition of the second projection region in the preset direction.
  18. The method according to claim 15, further comprising:
    projecting speckle onto a third region in the preset direction, and capturing a speckle image to obtain a third image; and
    merging the first image, the second image, and the third image into the combined speckle image.
  19. The method according to claim 18, further comprising:
    projecting speckle onto a fourth region in the preset direction, and capturing a speckle image to obtain a fourth image;
    wherein merging the first image and the second image into a combined speckle image comprises:
    merging the first image, the second image, the third image, and the fourth image into the combined speckle image.
  20. The method according to any one of claims 15-19, further comprising:
    capturing an RGB (red-green-blue) image of the first projection region and the second projection region to obtain an RGB image; and
    generating a 3D image from the RGB image and the combined speckle image.
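Claim 20 pairs the RGB image with the combined speckle image to produce a 3D output. One common way to realize such a pairing, sketched here under the assumption that a depth map has already been recovered from the speckle image and that pinhole intrinsics for an aligned RGB camera are known; the function name and all parameters (`fx`, `fy`, `cx`, `cy`) are hypothetical, not values from the patent:

```python
import numpy as np

def rgbd_to_points(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map through a pinhole camera model and
    attach per-pixel RGB color, yielding an N x 6 array of
    (x, y, z, r, g, b) — a colored point cloud, one form a
    '3D image' built from RGB plus depth can take."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid
    z = depth
    x = (u - cx) * z / fx   # pinhole back-projection, x axis
    y = (v - cy) * z / fy   # pinhole back-projection, y axis
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    cols = rgb.reshape(-1, 3)
    return np.hstack([pts, cols])
```

The sketch leaves out the depth-from-speckle step itself (block matching against a reference speckle pattern) and any extrinsic alignment between the infrared and RGB cameras, both of which a real device would need.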
  21. An acquisition chip, wherein the acquisition chip executes a pre-stored computer program to implement the following method:
    the acquisition chip is configured to control a first speckle projector to project speckle onto a first projection region in a preset direction, and to control an infrared camera module to capture a speckle image to obtain a first image;
    the acquisition chip is further configured to control a second speckle projector to project speckle onto a second projection region in the preset direction, and to control the infrared camera module to capture a speckle image to obtain a second image, the first projection region and the second projection region abutting or overlapping each other; and
    the acquisition chip is further configured to merge the first image and the second image into a combined speckle image.
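The control flow claim 21 assigns to the acquisition chip — drive the first projector, expose, drive the second projector, expose, merge — can be sketched with stand-in driver objects. `Projector`, `Camera`, and `acquire` below are hypothetical stubs for illustration, not the chip's actual firmware interface:

```python
class Projector:
    """Stub for a speckle projector driver; real hardware would be
    toggled over the control interface of claim 3."""
    def __init__(self, name):
        self.name, self.on = name, False
    def start(self):
        self.on = True
    def stop(self):
        self.on = False

class Camera:
    """Stub for the infrared camera module: returns a token frame."""
    def capture(self, scene):
        return f"frame:{scene}"

def acquire(cam, proj1, proj2, merge):
    """Sequence the two exposures so each frame is captured while
    exactly one projector is lit, then merge the pair."""
    proj1.start()                       # speckle onto region 1
    img1 = cam.capture(proj1.name)
    proj1.stop()
    proj2.start()                       # speckle onto region 2
    img2 = cam.capture(proj2.name)
    proj2.stop()
    return merge(img1, img2)            # combined speckle image
```

The one-at-a-time sequencing matters: lighting both projectors during a single exposure would superimpose the two speckle patterns instead of producing two separable frames to merge.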
PCT/CN2019/127179 2019-12-20 2019-12-20 Image acquisition device, image acquisition method, and acquisition chip WO2021120217A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/127179 WO2021120217A1 (zh) 2019-12-20 2019-12-20 Image acquisition device, image acquisition method, and acquisition chip
CN201980008848.XA CN111656778B (zh) 2019-12-20 2019-12-20 Image acquisition device, image acquisition method, and acquisition chip

Publications (1)

Publication Number Publication Date
WO2021120217A1 2021-06-24

Family ID: 72348931

Country Status (2)

Country Link
CN (1) CN111656778B (zh)
WO (1) WO2021120217A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114995028A (zh) * 2022-05-09 2022-09-02 中国科学院半导体研究所 Speckle projection device and projection imaging method for enlarging the field of view

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103796004A (zh) * 2014-02-13 2014-05-14 西安交通大学 Binocular depth perception method using active structured light
CN106091985A (zh) * 2016-06-07 2016-11-09 西安交通大学 Three-dimensional acquisition device and three-dimensional scanning system
CN205942802U (zh) * 2016-06-07 2017-02-08 西安新拓三维光测科技有限公司 Human-body digital speckle projection and acquisition device
US9754376B1 (en) * 2016-03-18 2017-09-05 Chenyang Ge Method and apparatus for generating a structured light speckle encoded pattern
CN209673042U (zh) * 2019-06-03 2019-11-22 易思维(杭州)科技有限公司 Photogrammetry system based on speckle projection

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10001656B2 (en) * 2016-04-12 2018-06-19 Microvision, Inc. Devices and methods for speckle reduction in scanning projectors
CN108269238B (zh) * 2017-01-04 2021-07-13 浙江舜宇智能光学技术有限公司 Depth image acquisition device, depth image acquisition system, and image processing method therefor
CN108711186B (zh) * 2018-06-19 2023-09-12 深圳阜时科技有限公司 Method and device for mapping a target object, identity recognition device, and electronic apparatus


Also Published As

Publication number Publication date
CN111656778A (zh) 2020-09-11
CN111656778B (zh) 2022-07-12

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 19956515; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 19956515; Country of ref document: EP; Kind code of ref document: A1)