WO2021046793A1 - Image acquisition method and apparatus, and storage medium - Google Patents

Image acquisition method and apparatus, and storage medium

Info

Publication number
WO2021046793A1
WO2021046793A1 (application PCT/CN2019/105582)
Authority
WO
WIPO (PCT)
Prior art keywords
image acquisition
image
exposure time
distance
target object
Prior art date
Application number
PCT/CN2019/105582
Other languages
English (en)
Chinese (zh)
Inventor
李明采 (Li Mingcai)
王波 (Wang Bo)
Original Assignee
深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 深圳市汇顶科技股份有限公司 (Shenzhen Goodix Technology Co., Ltd.)
Priority to PCT/CN2019/105582 priority Critical patent/WO2021046793A1/fr
Priority to CN201980001904.7A priority patent/CN113228622A/zh
Publication of WO2021046793A1 publication Critical patent/WO2021046793A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the embodiments of the present application relate to the field of image processing technology, and in particular to an image acquisition method, device, and storage medium.
  • because images convey information intuitively, image acquisition is becoming increasingly important, especially the acquisition of images of specific objects such as human faces and license plates.
  • the quality of the image is affected by the camera's photosensitive device (sensor), differences between lenses, and the exposure time.
  • face recognition builds a face model by acquiring face images.
  • because the distance at which images are collected, the ambient light, and other influencing factors are uncertain, a fixed exposure time often cannot guarantee image quality: the image may be over-exposed or under-exposed, and its details become indistinct.
  • one of the technical problems solved by the embodiments of the present application is to provide an image acquisition method, apparatus, and storage medium that overcome the defect in the prior art whereby an exposure time that is too long or too short for the image acquisition distance degrades the quality of the captured image.
  • an image acquisition method which includes:
  • the target exposure time used by the image acquisition device for image acquisition of the target object is determined according to the target image acquisition distance and an exposure time determination model, where the exposure time determination model indicates the correspondence between at least one image acquisition distance and at least one exposure time;
  • when an image of the target object is captured at an image acquisition distance using the exposure time that corresponds to it in the exposure time determination model, the brightness of the target object in the acquired image is within the preset range;
  • the method further includes:
  • the exposure time corresponding to each image collection distance in the at least one image collection distance is determined, and an exposure time determination model is established according to the correspondence between the at least one image collection distance and the at least one exposure time.
  • determining the exposure time corresponding to each image collection distance in the at least one image collection distance includes:
  • a sample image in at least one sample image whose brightness of the target object is within a preset range is determined as the target sample image, and the exposure time of the target sample image is determined as the exposure time corresponding to the preset distance.
  • using a preset distance as the image collection distance to perform image collection on the target object according to at least one exposure time to obtain at least one sample image includes:
  • the preset range includes a range greater than or equal to the first threshold and less than or equal to the second threshold.
  • the method further includes:
  • the target object is a human face
  • the preset range is [900, 1000]
  • the value range of at least one image collection distance is greater than or equal to 300 mm and less than or equal to 1200 mm .
  • determining the image capture distance between the target object and the image capture device includes:
  • the target image acquisition distance is determined by calculating the time difference or phase difference between the light signal emission and reflection between the target object and the image acquisition device.
  • an embodiment of the present application provides an image acquisition device, including: a processor, a distance measurement component, and an image acquisition component; both the distance measurement component and the image acquisition component are electrically connected to the processor;
  • the distance measurement component is used to determine the target image acquisition distance between the target object and the image acquisition device;
  • the processor is configured to determine, according to the target image acquisition distance and an exposure time determination model, the target exposure time used by the image acquisition device for image acquisition of the target object; the exposure time determination model indicates the correspondence between at least one image acquisition distance and at least one exposure time, where capturing an image of the target object at an image acquisition distance using the exposure time that corresponds to it in the model yields a brightness of the target object within the preset range;
  • the image acquisition component is used for image acquisition of the target object according to the target image acquisition distance and target exposure time.
  • the processor is further configured to determine the exposure time corresponding to each image acquisition distance in the at least one image acquisition distance, and according to the correspondence between the at least one image acquisition distance and the at least one exposure time The relationship establishes the exposure time to determine the model.
  • the processor is further configured to use a preset distance as the image acquisition distance to perform image acquisition of the target object according to at least one exposure time to obtain at least one sample image;
  • a sample image whose brightness of the target object is within a preset range is determined as the target sample image, and the exposure time of the target sample image is determined as the exposure time corresponding to the preset distance.
  • the processor is further configured to use the preset distance as the image collection distance and perform image collection of the target object according to the preset exposure time to obtain a sample image; when the brightness of the target object in the sample image is less than the first threshold, the preset exposure time is increased and the target object is imaged again; when the brightness of the target object in the sample image is greater than the second threshold, the preset exposure time is reduced and the target object is imaged again.
  • the preset range includes the range greater than or equal to the first threshold and less than or equal to the second threshold.
  • the processor is further configured to calculate the average value of the pixel brightness of the area where the target object is located in each sample image as the brightness of the target object in each sample image.
  • the distance measurement component is also used to determine the target image collection distance by calculating the time difference or phase difference between the emission and reflection of the optical signal between the target object and the image collection device.
  • the distance measurement component includes a time-of-flight ranging module
  • the image acquisition component includes a structured light image acquisition component and/or an RGB image acquisition component.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the methods described in the first aspect or any one of its embodiments are implemented.
  • the target exposure time corresponding to the target image acquisition distance between the target object and the image acquisition device is determined according to the exposure time determination model, and the target object is imaged according to the target image acquisition distance and the corresponding target exposure time.
  • the brightness of the target object in the acquired image is neither too bright nor too dark, avoiding over-exposure or under-exposure, so the target object is displayed more clearly and the quality of the acquired image improves; moreover, because the exposure time determination model and the target image collection distance directly determine the corresponding target exposure time, a suitable exposure time can be found quickly, improving efficiency.
  • FIG. 1 is a flowchart of an image acquisition method provided by an embodiment of the application
  • FIG. 2 is a schematic diagram of the relationship between exposure time and brightness provided by an embodiment of the application.
  • FIG. 3 is a structural diagram of a face recognition door lock provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of brightness distribution of target objects at different distances according to an embodiment of the application.
  • FIG. 5 is a flowchart of a mapping establishment method provided by an embodiment of the application.
  • FIG. 6 is a logical block diagram of a method for collecting sample images according to an embodiment of the application.
  • FIG. 7 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 8 is a structural diagram of an image acquisition device provided by an embodiment of the application.
  • FIG. 1 is a flowchart of an image acquisition method provided by an embodiment of the application.
  • the image acquisition method can be applied to an image acquisition device.
  • the image acquisition device can be an infrared camera, a digital camera, a smart phone, a tablet computer, or another device with an image acquisition function.
  • the image acquisition method includes the following steps:
  • Step 101 Determine the target image capture distance between the target object and the image capture device.
  • the target image acquisition distance is the distance between the target object and the image acquisition device.
  • the target object can be a human face, a license plate, etc., which is not limited in this application.
  • the term "target" merely identifies a single instance and does not impose any limitation.
  • the target object refers to a collection object. This application uses this collection object as an example to describe the process of implementation of the solution, and does not have any limiting effect.
  • the target image collection distance also indicates any collection distance.
  • determining the image capture distance between the target object and the image capture device includes: determining the target image collection distance by calculating the time difference or phase difference between the emission and the reflection of a light signal between the target object and the image capture device.
  • the image acquisition device sends a light signal (such as infrared light) toward the target object through a sensor; the signal is reflected by the target object and received by the sensor, and the distance between the target object and the image acquisition device is determined by calculating the time difference or phase difference between the signal's emission and its reception.
  • when the image acquisition device is an infrared camera, the camera itself already has a sensor that receives infrared light, so few changes to the device are needed, which is convenient.
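The time-difference variant of Step 101 can be sketched as follows; this is an illustrative calculation only, and the function name is an assumption, not code from the application:

```python
# Sketch of Step 101's time-difference variant (assumed names).
SPEED_OF_LIGHT_MM_PER_S = 299_792_458_000.0  # speed of light, in mm/s

def tof_distance_mm(round_trip_time_s: float) -> float:
    """Target image acquisition distance from the time difference between
    the light signal's emission and its reception after reflection.

    The signal travels to the target object and back, so the one-way
    distance is half the round-trip path.
    """
    return SPEED_OF_LIGHT_MM_PER_S * round_trip_time_s / 2.0
```

A round trip of 4 ns, for example, corresponds to a distance of roughly 600 mm, which sits inside the 300 mm to 1200 mm face-capture range discussed below.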
  • Step 102 Determine the target exposure time used by the image acquisition device for image acquisition of the target object according to the target image acquisition distance and exposure time determination model.
  • the exposure time determination model is used to indicate the correspondence between at least one image acquisition distance and at least one exposure time.
  • the image collection distance refers to the distance between the image collection device and the collection object.
  • the collection object is the target object.
  • the target exposure time is the exposure time corresponding to the target image collection distance in the exposure time determination model.
  • the target image collection distance belongs to at least one image collection distance.
  • when an image of the target object is captured at an image collection distance using the exposure time that corresponds to it in the exposure time determination model, the brightness of the target object in the acquired image is within a preset range.
  • the exposure time determination model may be a preset mapping, and the preset mapping may be expressed in the form of a list, a function, or an image, which is not limited in this application.
  • for example, if an image acquisition distance of 500 mm corresponds to an exposure time of 10 ms in the model, this means that when the distance between the image acquisition device and the target object is 500 mm and the target object is imaged with a 10 ms exposure time, the brightness of the target object in the acquired image falls within the preset range.
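The list form of the exposure time determination model can be sketched as a simple lookup table; the distance/exposure pairs below are illustrative assumptions, not values from the specification:

```python
# Illustrative list form of the exposure time determination model.
# The distance (mm) -> exposure (ms) pairs are assumed values.
EXPOSURE_MODEL_MS = {300: 4, 500: 10, 800: 18, 1200: 30}

def target_exposure_ms(distance_mm: float) -> int:
    """Return the exposure time of the model entry whose image
    acquisition distance is nearest the measured target distance."""
    nearest = min(EXPOSURE_MODEL_MS, key=lambda d: abs(d - distance_mm))
    return EXPOSURE_MODEL_MS[nearest]
```

Because the model is consulted directly, no trial exposures are needed at capture time, which is the efficiency gain the embodiment claims.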
  • the target object may be a human face
  • the preset range is [900, 1000]
  • the value range of at least one image collection distance is greater than or equal to 300 mm and less than or equal to 1200 mm.
  • the effective distance for image collection of a human face is between 300 mm and 1200 mm.
  • for the preset range, refer to FIG. 2.
  • setting the preset range to [900, 1000] ensures that the area where the target object is located in the image is fully exposed and displayed clearly while avoiding over-exposure; of course, the preset range can also be adjusted flexibly, for example to [800, 900], [850, 900], or [900, 950], and this application imposes no restriction.
  • the method further includes: determining the exposure time corresponding to each image acquisition distance in the at least one image acquisition distance, and establishing the preset mapping according to the correspondence between the at least one image acquisition distance and the at least one exposure time.
  • Step 103 Perform image collection on the target object according to the target image collection distance and the target exposure time.
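Steps 101 to 103 can be sketched as one pipeline; the three callbacks are assumed placeholders standing in for the distance measurement component, the model lookup, and the image acquisition component:

```python
def capture_target_image(measure_distance, lookup_exposure, capture):
    """Steps 101-103 chained: measure the target image acquisition
    distance, determine the target exposure time from the model, then
    perform image acquisition. All three callbacks are placeholders
    standing in for the device's components."""
    distance_mm = measure_distance()            # Step 101
    exposure_ms = lookup_exposure(distance_mm)  # Step 102
    return capture(exposure_ms)                 # Step 103
```

In the door-lock example below, `measure_distance` would be the TOF distance sensor and `capture` the infrared camera plus fill light.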
  • FIG. 3 is a structural diagram of a face recognition door lock provided by an embodiment of the application.
  • the face recognition door lock includes a processor, an infrared camera, a TOF distance sensor, and an infrared fill light.
  • the processor is connected to the infrared camera, the TOF (Time of Flight) distance sensor, and the infrared fill light, respectively, to control these components.
  • during use, the user first needs to register on the face recognition door lock, that is, enter his or her face image; after successful registration, the user can open the door lock through face recognition. In both face registration and face recognition, the door lock needs to use the infrared camera to capture images of the user's face (that is, the target object).
  • the processor controls the TOF distance sensor to measure the distance between the infrared camera and the user's face as the target image acquisition distance.
  • the distance between the infrared camera and the user's face can represent the distance from the face recognition door lock to the user's face.
  • the processor determines the target exposure time corresponding to the target image collection distance according to the preset mapping.
  • the processor controls the infrared fill light to illuminate according to the target exposure time, and controls the infrared camera to collect images of the user's face.
  • the exposure time determined according to the preset mapping can bring the brightness of the user's face area in the collected image to 900, making the user's face clearer; refer to FIG. 4.
  • FIG. 4 is a schematic diagram of the brightness distribution of a target object at different distances according to an embodiment of the application.
  • the abscissa represents the distance and the ordinate represents the brightness.
  • for example, when the preset range is (830, 920), the brightness of the target object lies between 830 and 920 after automatic exposure according to the preset mapping.
  • the target exposure time corresponding to the target image acquisition distance between the target object and the image acquisition device is determined according to the preset mapping, and the target object is imaged according to the target image acquisition distance and the corresponding target exposure time.
  • the brightness of the target object in the captured image is neither too bright nor too dark, avoiding over-exposure or under-exposure, so the target object is displayed more clearly and the quality of the captured image improves.
  • because the preset mapping and the target image capture distance directly determine the corresponding target exposure time, a suitable exposure time can be found quickly, improving efficiency.
  • FIG. 5 is a flowchart of a mapping establishment method provided by an embodiment of the application, and the method includes the following steps:
  • Step 501 Use the preset distance as the image collection distance to perform image collection on the target object according to at least one exposure time to obtain at least one sample image.
  • the preset distance may be any length of at least one image collection distance, and the value range of the at least one image collection distance may be any distance between 300 mm and 1200 mm.
  • using a preset distance as the image collection distance to perform image collection on the target object according to at least one exposure time to obtain at least one sample image includes:
  • the preset range includes the range greater than or equal to the first threshold and less than or equal to the second threshold. It should be noted that when increasing or decreasing the exposure time, the adjustment can follow a preset step length, for example 10 ms.
  • for example, with a preset exposure time of 20 ms, the target object is first imaged with a 20 ms exposure; if the brightness of the target object is less than the first threshold, the exposure time is increased by 10 ms and the image is re-acquired with a 30 ms exposure; if the brightness is greater than the second threshold, the exposure time is reduced by 10 ms and the image is re-acquired with a 10 ms exposure.
  • the preset step length can also be 1ms, and the preset exposure time can be 8ms.
  • the first threshold may be 800 or 900
  • the second threshold may be 950 or 1000, which is not limited in this application.
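The adjust-and-retry procedure above can be sketched as a loop; `capture_brightness` is an assumed callback standing in for collecting a sample image and measuring the target's brightness, and the defaults (20 ms start, 10 ms step, [900, 1000] range) merely mirror the examples:

```python
def calibrate_exposure_ms(capture_brightness, start_ms=20, step_ms=10,
                          low=900, high=1000, max_iters=50):
    """Adjust-and-retry loop for one preset distance: capture at the
    current exposure, then lengthen the exposure if the target is too
    dark (brightness < low) or shorten it if too bright (> high), until
    the brightness lies inside [low, high].

    `capture_brightness(exposure_ms)` is an assumed callback that
    collects a sample image and returns the target's brightness.
    """
    exposure = start_ms
    for _ in range(max_iters):
        brightness = capture_brightness(exposure)
        if brightness < low:
            exposure += step_ms   # under-exposed: increase exposure time
        elif brightness > high:
            exposure -= step_ms   # over-exposed: decrease exposure time
        else:
            return exposure       # brightness within the preset range
    raise RuntimeError("brightness did not settle in the preset range")
```

A smaller step (e.g. 1 ms, as in the later example) trades more iterations for a finer-grained exposure time.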
  • Step 502 Calculate the average value of the pixel brightness of the area where the target object is located in each sample image as the brightness of the target object in each sample image.
  • the brightness can be represented by a DN value or a gray value, which is not limited in this application.
  • the maximum or minimum value of the pixel brightness of the area where the target object is located can also be used as the brightness of the target object in each sample image, which is not limited in this application.
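Step 502 can be sketched as follows, assuming the image is a NumPy array and the target's area is given as a bounding box (an assumed representation):

```python
import numpy as np

def target_brightness(image: np.ndarray, bbox: tuple) -> float:
    """Step 502: mean pixel brightness of the area where the target
    object is located. `bbox` is (top, bottom, left, right) in pixels,
    an assumed encoding of that area; per the embodiment, the max or
    min of the region may be used instead of the mean."""
    top, bottom, left, right = bbox
    return float(image[top:bottom, left:right].mean())
```

The returned value is then compared against the preset range, whether expressed as DN values or gray values.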
  • Step 503 Determine a sample image in at least one sample image whose brightness of the target object is within a preset range as the target sample image.
  • Step 504 Determine the exposure time of the target sample image as the exposure time corresponding to the preset distance.
  • steps 501 to 504 determine the exposure time corresponding to one preset distance from at least one sample image collected at that distance.
  • to establish the preset mapping (i.e., the exposure time determination model), the exposure time corresponding to each image collection distance can be determined according to the method of steps 501 to 504.
  • at least one image collection distance may include (300mm, 310mm, 320mm...1190mm, 1200mm), that is, increase from 300mm to 1200mm in steps of 10mm.
  • Step 505 Establish a preset mapping according to the correspondence between at least one image collection distance and at least one exposure time.
  • the target object is a human face as an example
  • the preset exposure time is 8ms
  • the step length of the exposure time is 1ms
  • the preset range is 900
  • at least one value of the image acquisition distance The range is (300mm, 310mm, 320mm...1190mm, 1200mm).
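Building the mapping over the whole distance range (Step 505, repeating steps 501 to 504 at each distance) can be sketched as:

```python
def build_exposure_model(calibrate_at, distances_mm=range(300, 1201, 10)):
    """Step 505: establish the preset mapping by determining the
    exposure time at every image collection distance, here 300 mm to
    1200 mm in 10 mm steps. `calibrate_at(distance_mm)` is an assumed
    callback performing steps 501-504 at one distance and returning
    the exposure time in ms."""
    return {d: calibrate_at(d) for d in distances_mm}
```

The resulting dictionary is exactly the list form of the exposure time determination model consulted at capture time.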
  • the sample image acquisition method of this example includes the steps shown in FIG. 6: an image of the target object is collected at the current distance with the current exposure time (the initial preset exposure time is 8 ms); if the brightness of the face area is less than 900, the exposure time is increased by 1 ms and the image is re-acquired; if it is greater than 900, the exposure time is reduced by 1 ms and the image is re-acquired; when the brightness equals 900 (in this example the preset range only includes the value 900), the current exposure time is recorded as the exposure time corresponding to the current distance, and the process moves to the next distance until all distances have been processed, after which the method ends.
  • an embodiment of the present application provides an image acquisition device for executing the methods described in the first to third embodiments.
  • the image acquisition device 70 includes: a processor 701, a distance measurement component 702, and an image acquisition component 703, and both the distance measurement component 702 and the image acquisition component 703 are electrically connected to the processor 701;
  • the distance measurement component 702 is used to determine the target image acquisition distance between the target object and the image acquisition device;
  • the processor 701 is configured to determine, according to the target image capture distance and a preset mapping, the target exposure time used by the image capture device for image capture of the target object; the preset mapping indicates the correspondence between at least one image capture distance and at least one exposure time, where capturing an image of the target object at an image capture distance using the exposure time that corresponds to it in the mapping yields a brightness of the target object within the preset range;
  • the image acquisition component 703 is used for image acquisition of the target object according to the target image acquisition distance and the target exposure time.
  • the image acquisition device 70 may further include a memory 704, which is electrically connected to the processor 701, the memory 704 stores a computer program, and the processor 701 Executing the computer program implements the methods described in the first to third embodiments.
  • the computer program may also be stored on the processor 701, which is not limited in this application.
  • the processor 701 is further configured to determine the exposure time corresponding to each image acquisition distance in the at least one image acquisition distance, and to establish the preset mapping according to the correspondence between the at least one image acquisition distance and the at least one exposure time; in images collected of the target object at mutually corresponding image collection distances and exposure times, the brightness of the target object is within the preset range.
  • the processor 701 is further configured to use a preset distance as the image collection distance to perform image collection of the target object according to at least one exposure time to obtain at least one sample image;
  • the sample image in which the brightness of the target object is within the preset range is determined as the target sample image, and the exposure time of the target sample image is determined as the exposure time corresponding to the preset distance.
  • the processor 701 is further configured to use the preset distance as the image collection distance to perform image collection of the target object according to the preset exposure time to obtain a sample image;
  • the preset range includes the range greater than or equal to the first threshold and less than or equal to the second threshold.
  • the processor 701 is further configured to calculate the average value of the pixel brightness of the area where the target object is located in each sample image as the brightness of the target object in each sample image.
  • the distance measurement component 702 is also used to determine the target image collection distance by calculating the time difference or phase difference between the emission and reflection of the optical signal between the target object and the image collection device.
  • the distance measurement component 702 includes a time-of-flight ranging module
  • the image acquisition component 703 includes a structured light image acquisition component and/or an RGB image acquisition component.
  • the time-of-flight ranging module may include a TOF distance sensor
  • the structured light image acquisition component may include a speckle projector for obtaining depth information of the target object, and the RGB (Red Green Blue) image acquisition component may include a camera for acquiring a two-dimensional image of the target object; complete three-dimensional image information of the target object can be acquired through the structured light image acquisition component and the RGB image acquisition component together.
  • the image acquisition component 703 may also only include a camera, for example, an infrared camera, a common camera, etc., which is not limited in this application.
  • the image acquisition device 70 can be a smart phone, an infrared camera, a face recognition door lock and other equipment.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored.
  • when the program is executed by a processor, the methods described in the first to third embodiments are implemented.
  • the image acquisition device of the embodiment of the present application may exist in various forms, including but not limited to:
  • Mobile communication equipment: this type of equipment is characterized by mobile communication functions, and its main goal is to provide voice and data communications.
  • Such terminals include: smart phones (such as iPhone), multimedia phones, functional phones, and low-end phones.
  • Ultra-mobile personal computer equipment: this type of equipment belongs to the category of personal computers; it has computing and processing functions and generally also has mobile Internet access.
  • Such terminals include: PDA, MID and UMPC devices, such as iPad.
  • Portable entertainment equipment: this type of equipment can display and play multimedia content.
  • Such devices include: audio, video players (such as iPod), handheld game consoles, e-books, as well as smart toys and portable car navigation devices.
  • Server: a device that provides computing services. A server's composition includes a processor, hard disk, memory, system bus, etc.; its architecture is similar to that of a general-purpose computer, but because it must provide highly reliable services, it has higher requirements in terms of processing capability, stability, reliability, security, scalability, and manageability.
  • an improvement of a technology can be clearly distinguished as a hardware improvement (for example, an improvement of a circuit structure such as a diode, transistor, or switch) or a software improvement (an improvement of a method flow).
  • however, with the development of technology, the improvement of many method flows today can be regarded as a direct improvement of the hardware circuit structure; designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into the hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module.
  • for example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device in a Hardware Description Language (HDL); there is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), HDCal, JHDL, Lava, Lola, MyHDL, PALASM, and RHDL, with VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog currently the most commonly used.
  • the controller can be implemented in any suitable manner.
  • the controller can take the form of, for example, a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller can also be implemented as part of the memory's control logic.
  • those skilled in the art also know that, besides implementing the controller purely as computer-readable program code, it is entirely possible to program the method steps so that the controller realizes the same function in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the means included in it for realizing various functions can also be regarded as structures within the hardware component; or even, the means for realizing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop computer, a cell phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
  • this application can be provided as methods, systems, or computer program products. Therefore, this application may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, this application may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
  • these computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are performed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
  • the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data.
  • examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • the present application can also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules can be located in both local and remote computer storage media, including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are an image acquisition method and apparatus, and a storage medium. The image acquisition method comprises: determining a target image acquisition distance between a target object and the image acquisition apparatus (S101); determining, according to the target image acquisition distance and an exposure time determination model, a target exposure time used when the image acquisition apparatus performs image acquisition on the target object, wherein the brightness of the target object in an acquired image, obtained when image acquisition is performed on the target object according to a mutually corresponding image acquisition distance and exposure time in the exposure time determination model, falls within a preset range (S102); and performing image acquisition on the target object according to the target image acquisition distance and the target exposure time (S103). The brightness of a target object in an acquired image obtained by performing image acquisition on the target object according to a target image acquisition distance and its corresponding target exposure time is neither too high nor too low, thereby avoiding overexposure or underexposure.
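The three steps summarized in the abstract (S101–S103) can be sketched as follows. This is a minimal illustration only: it assumes a hypothetical calibration table mapping acquisition distance to exposure time, with linear interpolation between calibration points. The table values and function names are not taken from the patent itself.

```python
# Hedged sketch of steps S101-S103: given a measured target distance,
# look up the exposure time that keeps the target's brightness in range.
# The calibration table below is hypothetical, not from the patent.
from bisect import bisect_left

# Hypothetical exposure time determination model: exposure grows with
# distance so the target's brightness stays within a preset range.
CALIBRATION = [  # (distance_cm, exposure_ms)
    (10, 2.0),
    (20, 5.0),
    (40, 12.0),
    (80, 30.0),
]

def target_exposure_ms(distance_cm: float) -> float:
    """Step S102: interpolate the target exposure time for a distance."""
    distances = [d for d, _ in CALIBRATION]
    # Clamp outside the calibrated range.
    if distance_cm <= distances[0]:
        return CALIBRATION[0][1]
    if distance_cm >= distances[-1]:
        return CALIBRATION[-1][1]
    # Linear interpolation between the two surrounding calibration points.
    i = bisect_left(distances, distance_cm)
    (d0, e0), (d1, e1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (distance_cm - d0) / (d1 - d0)
    return e0 + t * (e1 - e0)
```

In this sketch, step S101 (measuring the distance) and step S103 (capturing with the returned exposure) are left to the surrounding device code; for example, `target_exposure_ms(30)` interpolates between the 20 cm and 40 cm calibration points.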
PCT/CN2019/105582 2019-09-12 2019-09-12 Image acquisition method and apparatus, and storage medium WO2021046793A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/105582 WO2021046793A1 (fr) 2019-09-12 2019-09-12 Image acquisition method and apparatus, and storage medium
CN201980001904.7A CN113228622A (zh) 2019-09-12 2019-09-12 Image acquisition method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/105582 WO2021046793A1 (fr) 2019-09-12 2019-09-12 Image acquisition method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2021046793A1 true WO2021046793A1 (fr) 2021-03-18

Family

ID=74866887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/105582 WO2021046793A1 (fr) 2019-09-12 2019-09-12 Image acquisition method and apparatus, and storage medium

Country Status (2)

Country Link
CN (1) CN113228622A (fr)
WO (1) WO2021046793A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113240428A (zh) * 2021-05-27 2021-08-10 Alipay (Hangzhou) Information Technology Co., Ltd. Payment processing method and apparatus
CN113627923A (zh) * 2021-07-02 2021-11-09 Alipay (Hangzhou) Information Technology Co., Ltd. Offline payment method, apparatus and device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114374801A (zh) * 2021-12-28 2022-04-19 Suzhou Lingyun Shijie Intelligent Equipment Co., Ltd. Exposure time determination method, apparatus, device and storage medium
CN114885104B (zh) * 2022-05-06 2024-04-26 Beijing Yinhe Fangyuan Technology Co., Ltd. Camera adaptive adjustment method, readable storage medium and navigation system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6426775B1 (en) * 1995-09-20 2002-07-30 Canon Kabushiki Kaisha Image pickup apparatus with distance measurement dependent on object lighting condition
CN101038624A (zh) * 2003-03-28 2007-09-19 Fujitsu Ltd. Photographing apparatus
CN101631201A (zh) * 2003-03-28 2010-01-20 Fujitsu Ltd. Photographing apparatus
CN103905739A (zh) * 2012-12-28 2014-07-02 Lenovo (Beijing) Co., Ltd. Electronic device control method and electronic device
CN104580929A (zh) * 2015-02-02 2015-04-29 Nantong Laiao Electronic Technology Co., Ltd. Adaptive-exposure digital image processing system
CN104573603A (zh) * 2013-10-09 2015-04-29 Opto Electronics Co., Ltd. Optical information reading device and illumination control method
CN105635565A (zh) * 2015-12-21 2016-06-01 Huawei Technologies Co., Ltd. Photographing method and device
CN107181918A (zh) * 2016-08-09 2017-09-19 Shenzhen Realis Multimedia Technology Co., Ltd. Shooting control method and system for an optical motion-capture camera

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140037135A1 (en) * 2012-07-31 2014-02-06 Omek Interactive, Ltd. Context-driven adjustment of camera parameters
CN109413326A (zh) * 2018-09-18 2019-03-01 OPPO (Chongqing) Intelligent Technology Co., Ltd. Photographing control method and related product
CN109903324B (zh) * 2019-04-08 2022-04-15 BOE Technology Group Co., Ltd. Depth image acquisition method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6426775B1 (en) * 1995-09-20 2002-07-30 Canon Kabushiki Kaisha Image pickup apparatus with distance measurement dependent on object lighting condition
CN101038624A (zh) * 2003-03-28 2007-09-19 Fujitsu Ltd. Photographing apparatus
CN101631201A (zh) * 2003-03-28 2010-01-20 Fujitsu Ltd. Photographing apparatus
CN103905739A (zh) * 2012-12-28 2014-07-02 Lenovo (Beijing) Co., Ltd. Electronic device control method and electronic device
CN104573603A (zh) * 2013-10-09 2015-04-29 Opto Electronics Co., Ltd. Optical information reading device and illumination control method
CN104580929A (zh) * 2015-02-02 2015-04-29 Nantong Laiao Electronic Technology Co., Ltd. Adaptive-exposure digital image processing system
CN105635565A (zh) * 2015-12-21 2016-06-01 Huawei Technologies Co., Ltd. Photographing method and device
CN107181918A (zh) * 2016-08-09 2017-09-19 Shenzhen Realis Multimedia Technology Co., Ltd. Shooting control method and system for an optical motion-capture camera

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113240428A (zh) * 2021-05-27 2021-08-10 Alipay (Hangzhou) Information Technology Co., Ltd. Payment processing method and apparatus
CN113240428B (zh) * 2021-05-27 2023-09-08 Alipay (Hangzhou) Information Technology Co., Ltd. Payment processing method and apparatus
CN113627923A (zh) * 2021-07-02 2021-11-09 Alipay (Hangzhou) Information Technology Co., Ltd. Offline payment method, apparatus and device

Also Published As

Publication number Publication date
CN113228622A (zh) 2021-08-06

Similar Documents

Publication Publication Date Title
WO2021046793A1 (fr) Image acquisition method and apparatus, and storage medium
US20200160040A1 (en) Three-dimensional living-body face detection method, face authentication recognition method, and apparatuses
WO2021046715A1 (fr) Procédé de calcul de temps d'exposition, dispositif et support de stockage
WO2019148978A1 (fr) Procédé et appareil de traitement d'images, support de stockage et dispositif électronique
US11048913B2 (en) Focusing method, device and computer apparatus for realizing clear human face
US20200167582A1 (en) Liveness detection method, apparatus and computer-readable storage medium
AU2014374638B2 (en) Image processing apparatus and method
CN105227838B (zh) 一种图像处理方法及移动终端
US20170026565A1 (en) Image capturing apparatus and method of operating the same
WO2019071613A1 (fr) Procédé et dispositif de traitement d'image
US20170213105A1 (en) Method and apparatus for event sampling of dynamic vision sensor on image formation
RU2628494C1 (ru) Способ и устройство для генерирования фильтра изображения
KR102263537B1 (ko) 전자 장치와, 그의 제어 방법
RU2612892C2 (ru) Способ автоматической фокусировки и устройство автоматической фокусировки
US11289078B2 (en) Voice controlled camera with AI scene detection for precise focusing
CN109903324B (zh) Depth image acquisition method and device
CN106249508B (zh) 自动对焦方法和系统、拍摄装置
US9471979B2 (en) Image recognizing apparatus and method
CN108200335A (zh) 基于双摄像头的拍照方法、终端及计算机可读存储介质
CN114267041B (zh) 场景中对象的识别方法及装置
CN109714539B (zh) 基于姿态识别的图像采集方法、装置及电子设备
US20230336878A1 (en) Photographing mode determination method and apparatus, and electronic device and storage medium
TWI676113B (zh) 虹膜識別過程中的預覽方法及裝置
WO2023273498A1 (fr) Procédé et appareil de détection de profondeur, dispositif électronique et support de stockage
US8804029B2 (en) Variable flash control for improved image detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19944815

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19944815

Country of ref document: EP

Kind code of ref document: A1