WO2024032379A1 - Optical information collector and method thereof - Google Patents

Optical information collector and method thereof

Info

Publication number
WO2024032379A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
optical information
decoding
frames
Prior art date
Application number
PCT/CN2023/109637
Other languages
English (en)
French (fr)
Inventor
王冬生
魏江涛
张颂来
周小芹
Original Assignee
无锡盈达聚力科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 无锡盈达聚力科技有限公司 filed Critical 无锡盈达聚力科技有限公司
Priority to AU2023305023A priority Critical patent/AU2023305023A1/en
Priority to EP23844164.6A priority patent/EP4365772A1/en
Priority to US18/413,108 priority patent/US20240152716A1/en
Publication of WO2024032379A1 publication Critical patent/WO2024032379A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • G06K7/146 Methods for optical code recognition the method including quality enhancement steps
    • G06K7/1473 Methods for optical code recognition the method including quality enhancement steps error correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10544 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 Methods for optical code recognition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present application relates to the field of optical information collection, and in particular, to an optical information collection device and a method thereof.
  • Optical information collection based on image scanning includes the collection of various machine-readable optical information.
  • Common machine-readable optical information includes one-dimensional codes, two-dimensional codes, OCR graphics and texts, ultraviolet anti-counterfeiting codes, infrared anti-counterfeiting codes, etc.
  • the main inventor of this application developed a low-power fill light method disclosed in Chinese patent CN201810098421.0, the entire content of which is incorporated herein by reference, in which the optical imaging unit turns on the fill light during the exposure time of the frame period and turns it off during the sampling time of the frame period, so that the fill light unit is periodically turned on and off to reduce power consumption.
  • the frame period of the optical imaging unit is usually within 50ms.
  • the frame period can be within 20ms, which means that the optical imaging unit can collect dozens of frames of images in 1 second.
  • the optical imaging unit usually collects images continuously as a digital stream, and successful recognition of the optical information usually comes from a single frame; once the optical information is recognized successfully, the optical imaging unit no longer recognizes subsequently collected images and stops collecting optical information.
  • when the optical imaging unit uses a digital stream to continuously collect images and the optical information is successfully recognized from one of them, multiple further frames have already been collected; those subsequent frames are not needed for optical information recognition, so the power spent collecting them is wasted.
  • the main inventor of this application has further developed another low-power barcode scanning system and method disclosed in Chinese patent CN201811485945.1, the entire content of which is incorporated herein by reference.
  • the image sensor is in a standby state before being triggered, and after decoding succeeds or times out, the image sensor re-enters the standby state.
  • this approach differs from the mobile phone industry: after the camera application is opened on a phone, the camera keeps collecting and previewing images and power consumption is very high, but because a phone's camera is not used for long periods, the power consumption problem can be ignored.
  • the barcode scanning industry is different. The image sensor for barcode scanning needs to be used for a long time. If it is always working, the battery life will be significantly reduced.
  • the inventor of the present application found that when the image sensor continuously collects images in digital stream mode and is controlled to abruptly interrupt image collection after decoding succeeds or times out (the device remains powered on; the image sensor stays powered but no longer outputs or previews images, thereby saving power), the image sensor may, as mentioned above, already have collected further images; although this is a probabilistic event, it can also cause certain adverse effects.
  • images may remain not only in the image sensor, but also in other memories, such as the buffer of the image signal processor.
  • the purpose of this application is to provide an optical information collector and method that can avoid decoding errors and reduce power consumption.
  • This application provides an optical information collector, characterized in that it includes: an image sensor for collecting image data of optical information; a decoding unit for decoding the image data according to a preset decoding algorithm; and a central processor that, via a trigger, controls the image sensor to collect image data and controls the decoding unit to decode the image data, wherein, once triggered, the central processor issues an instruction to discard N frames of image data of a specific frame count, the N frames being the image data that was collected upon the previous trigger and remains in the optical information collector.
  • the N frames of image data of a specific number of frames include image data remaining in the storage area of the image sensor.
  • an image signal processor is included to receive the image collected by the image sensor and transmit the image data to the decoding unit.
  • the N frames of image data of a specific frame count include image data remaining in the image signal processor.
  • discarding the N frames of image data of a specific frame count includes: the decoding unit does not receive the N frames of image data, or the decoding unit does not decode the N frames of image data, or the decoding unit does not output or display the decoded information of the N frames of image data.
  • the decoding unit starts decoding from the N+1th frame of image data.
  • the present application provides an optical information collection method, characterized in that it includes: a central processor, via a trigger, controls the image sensor to collect and output image data; the central processor receives the image data and issues an instruction to discard N frames of image data of a specific frame count, the N frames being the image data collected and retained from the previous trigger; and the decoding unit decodes the image data.
  • the N frames of image data of a specific number of frames include image data remaining in the storage area of the image sensor.
  • the image data collected by the image sensor is received through an image signal processor, and the image signal processor further transmits the image data to the decoding unit; the N frames of image data of a specific frame count include image data remaining in the image signal processor.
  • discarding the N frames of image data of a specific frame count includes: the decoding unit does not receive the N frames of image data, or the decoding unit does not decode the N frames of image data, or the decoding unit does not output or display the decoded information of the N frames of image data.
  • the decoding unit starts decoding from the N+1th frame of image data.
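As a rough illustration of the triggered flow summarized above, the following sketch shows the discard-N-frames behaviour; the `sensor` and `decoder` objects and their methods are hypothetical placeholders, not interfaces defined in this application.

```python
# Minimal sketch (Python) of triggered acquisition that discards the first
# N residual frames and decodes from the (N+1)-th frame onward.
# `sensor` and `decoder` are hypothetical placeholder objects.

def acquire_on_trigger(sensor, decoder, n_residual=2, max_frames=50):
    """Collect frames after a trigger, skipping n_residual leftover frames."""
    sensor.start()                                # central processor triggers collection
    try:
        for index in range(1, max_frames + 1):
            frame = sensor.read_frame()           # residual frame while index <= n_residual
            if index <= n_residual:
                continue                          # discard: not received/decoded/output
            result = decoder.decode(frame)        # decoding starts at frame N+1
            if result is not None:
                return result                     # decode success: stop collecting
        return None                               # decode timeout
    finally:
        sensor.stop()                             # stop collection (and fill light)
```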
  • This application provides an optical information collector, characterized in that it includes: an image sensor for collecting image data of optical information; a decoding unit for receiving and decoding the image data; and a central processor for controlling the image sensor to collect image data and controlling the decoding unit to decode the image data, wherein, via a trigger, the central processor controls the image sensor to collect and output a fixed number of frames of image data in a fixed frame mode and controls the decoding unit to decode the image data, and when one frame of image data is decoded successfully, decoding of the remaining frames in the fixed number of frames of image data is stopped.
  • the fixed frame mode includes: when the decoding unit decodes successfully but the image sensor has not yet collected the fixed number of frames of image data, the image sensor continues to collect the fixed number of frames and outputs all of the fixed number of frames of image data.
  • the fixed frame mode includes: the central processor controls the decoding unit to sequentially receive and decode the fixed number of frames of image data, and when (or before) the last frame of the fixed number of frames fails to decode, controls the image sensor to collect a fixed number of frames of image data again.
  • an image signal processor is not included or the image data collected by the image sensor is not optimized through the image signal processor.
  • the image sensor is configured to collect image data in the following order: first collecting image data in the fixed frame mode a preset number of times, and then continuously collecting image data in the digital stream mode.
  • This application provides an optical information collection method, characterized by including: a central processor, via a trigger, controls the image sensor to collect and output a fixed number of frames of image data in a fixed frame mode; a decoding unit receives and decodes the image data, and when one frame of image data is decoded successfully, decoding of the remaining frames in the fixed number of frames of image data is stopped.
  • when the decoding unit decodes successfully but the image sensor has not yet collected the fixed number of frames, the image sensor continues to collect the fixed number of frames of image data and outputs all of the fixed number of frames of image data.
  • the central processor controls the decoding unit to sequentially receive and decode the fixed number of frames of image data, and when (or before) the last frame of the fixed number of frames fails to decode, controls the image sensor to collect a fixed number of frames of image data again.
  • an image signal processor is not included or the image data collected by the image sensor is not optimized through the image signal processor.
  • the image sensor is configured to collect image data in the following order: first collecting image data in the fixed frame mode a preset number of times, and then continuously collecting image data in the digital stream mode.
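A sketch of the fixed frame mode described by this method, under the assumption of the same hypothetical `sensor`/`decoder` placeholders: the full fixed-frame batch is always output so the sensor retains no frames, and decoding of the remaining frames stops once one frame succeeds.

```python
# Sketch of the fixed frame mode: the sensor outputs a complete batch of
# `fixed_frames` frames per request; decoding of the remaining frames in a
# batch stops once one frame decodes successfully.
# `sensor` and `decoder` are hypothetical placeholders.

def fixed_frame_scan(sensor, decoder, fixed_frames=3, max_batches=5):
    for _ in range(max_batches):
        batch = sensor.capture_batch(fixed_frames)   # whole batch is collected and output
        result = None
        for frame in batch:
            if result is None:                       # stop decoding after the first success
                result = decoder.decode(frame)
        if result is not None:
            return result                            # success: no frames remain in the sensor
        # decoding of this batch failed; the central processor decides to collect again
    return None                                      # overall decode timeout
```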
  • This application provides an optical information collector, characterized in that it includes: an image sensor for collecting image data of optical information; a memory preset with one or more decoding algorithms; a decoding unit for receiving and decoding the image data; and a central processor for controlling the image sensor to continuously collect image data in a digital stream mode and controlling the decoding unit to decode the image data in sequence, wherein, once the decoding unit decodes successfully or decoding times out, the image sensor is controlled to stop continuously collecting image data in the digital stream mode and is controlled to continue collecting and outputting a fixed number of frames of image data.
  • the optical information collector does not have an image signal processor or does not optimize the image data through the image signal processor.
  • the image sensor outputs RAW format image data
  • the decoding unit acquires grayscale image data based on the RAW format image data, and performs decoding based on the grayscale image data.
  • the fixed number of frames of image data is one frame or two frames.
  • the image data collected by the image sensor is directly transmitted to the decoding unit for decoding.
  • This application provides an optical information collection method, characterized by including: a central processor controls an image sensor to continuously collect and output image data in a digital stream mode; a decoding unit receives and decodes the image data; once the decoding unit decodes successfully, the image sensor is controlled to stop collecting image data in the digital stream mode and is controlled to continue collecting and outputting a fixed number of frames of image data.
  • the optical information collector does not have an image signal processor or does not optimize the image data through the image signal processor.
  • the image sensor outputs RAW format image data
  • the decoding unit acquires grayscale image data based on the RAW format image data, and performs decoding based on the grayscale image data.
  • the fixed number of frames of image data is one frame or two frames.
  • the image data collected by the image sensor is directly transmitted to the decoding unit for decoding.
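A sketch of this third method, again with hypothetical `sensor`/`decoder` placeholders: image data is collected continuously in the digital stream mode, and after decode success (or timeout) a short fixed-frame capture flushes whatever frame is still held in the sensor.

```python
import time

# Sketch: digital-stream collection and decoding, followed by a fixed-frame
# capture (one or two frames) whose only purpose is to flush the frame still
# held in the image sensor's storage area.
# `sensor` and `decoder` are hypothetical placeholders.

def stream_scan_with_flush(sensor, decoder, timeout_s=5.0, flush_frames=1):
    result = None
    deadline = time.monotonic() + timeout_s
    sensor.start()
    try:
        while time.monotonic() < deadline:
            frame = sensor.read_frame()        # continuous digital-stream collection
            result = decoder.decode(frame)
            if result is not None:
                break                          # decode success ends the stream phase
    finally:
        sensor.stop()                          # stop the digital stream
    for _ in sensor.capture_batch(flush_frames):
        pass                                   # flushed frames are output but not decoded
    return result
```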
  • Figure 1 is a simplified block diagram of an optical information collector according to an embodiment of the present application.
  • Figure 2 is a schematic diagram of an optical information collector according to an embodiment of the present application.
  • Figure 3 is a perspective view of the optical information collector in Figure 2;
  • Figure 4 is a high-level block diagram of an optical information collector according to an embodiment of the present application.
  • Figure 5 is a timing diagram of an optical information collector collecting optical information through digital stream mode according to an embodiment of the present application
  • Figure 6 is a timing diagram for collecting optical information by an optical information collector according to an embodiment of the present application.
  • Figure 7 is another timing diagram of the optical information collector collecting optical information according to an embodiment of the present application.
  • Figure 8 is a high-level block diagram of an optical information collector according to another embodiment of the present application.
  • Figure 9 is a timing diagram for collecting optical information by an optical information collector according to another embodiment of the present application.
  • Figure 10 is a timing diagram of an optical information collector collecting optical information through a fixed frame mode according to another embodiment of the present application.
  • Figure 11 is a timing diagram of an optical information collector collecting optical information through a mixed mode according to another embodiment of the present application.
  • Figure 12 is a timing diagram of an optical information collector collecting optical information through another hybrid mode according to another embodiment of the present application.
  • Optical information collector 100; camera 1; optical system 2; image sensor 3; fill light 4; central processing unit 5; memory 6; image signal processor 7; decoding unit 8; housing 9; scanning window 10; display screen 11; button 12; storage area 13; register 14; buffer 15.
  • the optical information collector 100 can be used to collect one or more optical information, such as one-dimensional code, two-dimensional code, OCR image and text, ultraviolet anti-counterfeiting code, infrared anti-counterfeiting code, etc.
  • the optical information collector 100 may include at least one camera 1, and the camera 1 may include a combination of an optical system 2 (lens) that captures light and an image sensor 3 (sensor) that photoelectrically converts the light captured by the optical system 2.
  • the optical system 2 may include one or more mirrors, prisms, lenses or a combination thereof.
  • the image sensor 3 may also be one or more.
  • one image sensor 3 may correspond to one optical system 2 (or one set of optical systems 2), or multiple image sensors 3 may share the same optical system 2 (or set of optical systems 2), or multiple optical systems 2 (or sets of optical systems 2) may share the same image sensor 3.
  • the image sensor 3 may be a CCD, CMOS or other type of image sensor 3.
  • the image sensor 3 is used to convert light signals into electrical signals, and then output digital signals of image data.
  • the optical information collector 100 may include one or more fill lights 4 , which are used to illuminate optical information when the camera 1 collects image data.
  • the fill light 4 may not be used for fill light, or the optical information collector 100 may not have the fill light 4 .
  • the fill light 4 can be provided in various forms: for example, the fill light 4 can fill light continuously while the camera 1 collects optical information, or the fill light 4 can fill light in synchronization with the exposure time of the image sensor 3 of the camera 1.
  • Chinese patent CN201810098421.0 discloses a technical solution in which the fill light 4 fills light in synchronization with the exposure time of the image sensor 3, the entire content of which is incorporated herein by reference; the fill light 4 can also be a pulsed fill light whose pulse time overlaps with part of the exposure time of the image sensor 3.
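The exposure-synchronized behaviour can be pictured with the following sketch; the millisecond values and the `sensor`/`fill_light` helpers are illustrative assumptions rather than parameters taken from the cited patent.

```python
# Sketch of exposure-synchronized fill light within one frame period: the
# lamp is on only during exposure and off during the sampling/readout time.
# `sensor`, `fill_light` and the durations are illustrative assumptions.

def capture_frame_with_synced_light(sensor, fill_light, exposure_ms=4, readout_ms=16):
    fill_light.on()                       # fill light only while the sensor exposes
    sensor.expose(exposure_ms)
    fill_light.off()                      # lamp off during the sampling time
    return sensor.read_out(readout_ms)    # frame data is output with the lamp off
```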
  • the optical information collector 100 may also include a central processing unit 5 for executing various instructions.
  • the optical information collector 100 may also include a separate or integrated memory 6.
  • One or more decoding algorithms are preset in the memory 6 as needed, and the memory 6 may also store other programs or instructions.
  • the memory 6 may comprise one or more non-transitory storage media, such as volatile and/or non-volatile memory which may be fixed or removable, for example.
  • the memory 6 may be configured to store information, data, applications, instructions, etc. for enabling the processing module to perform various functions according to example embodiments of the present invention.
  • the memory 6 may be configured to buffer input data for processing by the central processor 5 .
  • memory 6 may be configured to store instructions for execution by central processor 5 .
  • Memory 6 may be considered as main memory and be included in a volatile storage device, for example in the form of RAM or another form, which retains its contents only during operation, and/or the memory 6 may be included in a non-volatile memory device such as ROM, EPROM, EEPROM, FLASH, or another type of memory device; non-volatile memory devices retain their contents independent of the power state of the processing module.
  • Memory 6 may also be included in a secondary storage device that stores large amounts of data, such as external disk storage.
  • the memory may communicate with the central processor 5 via a data bus or other routing components using input/output components.
  • Secondary storage may include a hard drive, compact disk, DVD, memory card, or any other type of mass storage type known to those skilled in the art.
  • the memory 6 may store one or more of the various processes or methods of optical information collection, transmission, processing and decoding to be described next.
  • the optical information collector 100 may also include an image signal processor 7 (Image Signal Processor, referred to as ISP).
  • the image signal processor 7 is used to optimize the image data collected by the camera 1.
  • the optimization processing includes one or more of linear correction, noise removal, dead pixel repair, color interpolation, white balance correction, exposure correction, etc., to optimize the quality of the image data.
  • part or all of the aforementioned optimization processing, such as color interpolation, is not necessary.
  • the image signal processor 7 can process one frame of image data at a time using a single core and a single thread, or the image signal processor 7 can also process multiple frames of image data simultaneously using multiple cores and multiple threads.
  • the optical information collector 100 may not have an image signal processor 7 , or may not use the image signal processor 7 to optimize the image data.
  • the optical information collector 100 may also include a decoding unit 8, which decodes the image data collected by the camera 1 according to a preset decoding algorithm and thereby identifies the optical information, such as the coded information of a one-dimensional code or QR code, OCR graphics and text, or the coding information of various ultraviolet/infrared anti-counterfeiting codes, etc.
  • the decoding unit 8 may be single-core and single-threaded, decoding one frame of image data at a time, or the decoding unit 8 may be multi-core and multi-threaded, decoding multiple frames of image data simultaneously.
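For the multi-core, multi-threaded option, a thread-pool sketch such as the following could decode several frames concurrently and take the first successful result; `decode_frame` is a hypothetical per-frame decoding callable, not an API defined in this application.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Sketch of multi-threaded decoding: several frames are decoded in parallel
# and the first successful result is returned; remaining results are ignored.
# `decode_frame` is a hypothetical callable returning None on failure.

def decode_frames_concurrently(decode_frame, frames, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(decode_frame, f) for f in frames]
        for future in as_completed(futures):
            result = future.result()
            if result is not None:
                return result       # one frame decoded successfully
    return None                     # no frame in this group decoded
```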
  • some or all functional modules of the image signal processor 7 can be integrated into the central processor 5.
  • Chinese patent CN201811115589.4 discloses a central processor 5 integrated with the image signal processor 7, the entire content of which is incorporated herein by reference; optionally, some or all functional modules of the image signal processor 7 can be integrated into the image sensor 3; optionally, the decoding unit 8 can also be integrated into the central processing unit 5; optionally, the memory 6 can also be integrated into the central processing unit 5.
  • the image signal processor 7 and the decoding unit 8 are preferably integrated into the central processor 5, thereby saving costs; of course, the image signal processor 7 and the decoding unit 8 may also not be integrated into the central processing unit 5.
  • the handheld terminal includes a housing 9, a display screen 11 and a button 12.
  • a scanning window 10 is provided at the front end of the housing 9 .
  • the camera 1 is housed inside the housing 9 and can collect optical information through the scanning window 10 .
  • the optical information collector 100 may not have a display screen 11, but may output information to a separate display screen 11 for display.
  • the optical information collector 100 can be a fixed, desktop or other form of terminal, and the optical information collector 100 can also be integrated into other devices as a part of other devices.
  • the central processor 5 issues a trigger instruction via an external trigger.
  • the external trigger may be a trigger generated by the user pressing a specific button 12 or touching a specific area of the display screen 11 or the user operating the optical information collector 100 through a specific gesture.
  • the central processor 5 issues a trigger instruction according to a preset algorithm, thereby triggering the image sensor 3 to collect image data.
  • the image data collected by the image sensor 3 can also be optimized through the image signal processor 7 and then output to the decoding unit 8 for decoding.
  • a schematic diagram shows the optical information collector 100 collecting a barcode.
  • the image signal processor 7 can sequentially receive, through the MIPI interface (Mobile Industry Processor Interface, MIPI for short), and optimize the image data collected by the image sensor 3, and the decoding unit 8 decodes the image data optimized and transmitted by the image signal processor 7.
  • once decoding succeeds, the decoding unit 8 will stop decoding and notify the central processor 5 that the decoding is successful, and the central processor 5 will issue an instruction to control the image sensor 3 to stop collecting image data.
  • the image sensor 3 can continuously collect image data through a digital stream mode.
  • the so-called digital stream mode means that according to a preset algorithm, the image sensor 3 will continuously collect image data within a preset time.
  • the decoding unit 8 decodes the continuously collected image data sequentially with a single thread, or decodes the continuously collected image data simultaneously with multiple threads.
  • if no frame of image data is decoded successfully within the preset time, decoding times out, the image sensor 3 is controlled to stop collecting image data, and the decoding unit 8 is controlled to stop decoding.
  • for example, the preset time is five seconds, which means the image sensor 3 will continuously collect image data for up to five seconds, after which decoding times out; if one frame of image data is decoded successfully, even before the five seconds have elapsed, the central processor 5 will control the image sensor 3 to stop collecting image data and control the decoding unit 8 to stop decoding.
  • FIG. 5 shows a timing diagram 200 of the optical information collector 100 collecting optical information in digital stream mode according to an embodiment.
  • the timing diagram 200 shows the trigger signal 201 of the external trigger, the fill light timing 202 of the fill light 4, and the image data acquisition timing 203 of the image sensor 3, in which: the high level of the trigger signal 201 triggers the image sensor 3 to collect image data and the fill light 4 to fill light, and the low level triggers the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light.
  • the fill light 4 fills light at the high level of the fill light timing 202 and turns off at the low level; the image data acquisition timing 203 of the image sensor 3 is synchronized with the fill light timing 202.
  • the image sensor 3 exposes at the high level of the image data acquisition timing 203 and outputs image data at the low level; the dotted arrow in Figure 5 represents the first frame of image data being output to the decoding unit 8 for decoding.
  • the decoding unit 8 receives the first frame of image data at time point a, successfully decodes the first frame of image data at time point b, and feeds back the successfully decoded information to the central processor 5.
  • the central processor 5 controls the image sensor 3 to stop collecting image data and controls the fill light 4 to stop filling light at time point c. Due to signal delay, the rising edge of the high level of the trigger signal 201 is slightly earlier than the rising edge of the high level of the image data acquisition timing 203, and the falling edge of the high level of the trigger signal is slightly earlier than time point c at which the image sensor 3 ends collecting image data. It should be noted that when the ambient light is sufficient, fill light is not necessary.
  • while the decoding unit 8 is decoding, the image sensor 3 is also collecting new image data at the same time.
  • the image sensor 3 has collected seven frames of image data, of which the second to seventh frames are not transmitted to the decoding unit 8 but are stored (remain) in the storage area 13 (cache or PN junction) of the image sensor 3 or the corresponding register 14 of the image signal processor 7; according to the first-in-first-out principle, image data collected later overwrites image data collected earlier, so the seventh frame of image data is stored in the image signal processor 7, the sixth frame of image data is stored in the storage area 13 of the image sensor 3, and the second to fifth frames of image data are overwritten and cleared.
  • when the optical information collector 100 is triggered again to collect new optical information, the decoding unit 8 will first receive and decode the image data that remained from the previous trigger in the storage area 13 of the image sensor 3 or in the register 14 of the image signal processor 7, which inevitably leads to decoding errors, because the remaining image data is not image data of the new optical information, as modelled in the sketch below.
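The pile-up of residual frames can be modelled with a tiny two-slot FIFO sketch: of the frames captured after the successfully decoded one, only the last two survive (one in the image signal processor's register 14, one in the image sensor's storage area 13) and the rest are overwritten. The slot names and the function are illustrative only.

```python
from collections import deque

# Tiny model of where frames end up when the stream is interrupted after a
# successful decode: later frames overwrite earlier ones (first in, first
# out), and only the last two remain - one attributed to the image signal
# processor's register 14 and one to the image sensor's storage area 13,
# matching the description above. Illustrative only.

def residual_frames(total_frames, decoded_frame=1, slots=2):
    pipeline = deque(maxlen=slots)         # sensor storage area + ISP register
    for frame in range(decoded_frame + 1, total_frames + 1):
        pipeline.append(frame)             # newer frames overwrite older ones
    return {"isp_register_14": pipeline[-1] if pipeline else None,
            "sensor_storage_13": pipeline[0] if len(pipeline) == slots else None}

# Example from the timing diagram 200: seven frames collected, frame 1 decoded.
print(residual_frames(7))   # {'isp_register_14': 7, 'sensor_storage_13': 6}
```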
  • An optional method avoids decoding errors by discarding N frames of image data of a specific frame count and starting decoding from the (N+1)-th frame of image data, where N ≥ 1.
  • the timing diagram 300 shows the trigger signal 301 of the external trigger, the fill light timing 302 of the fill light 4, the image data collection timing 303 of the image sensor 3 continuously collecting image data, and the decoding timing 304 of the decoding unit 8, in which:
  • the trigger signal 301 triggers the image sensor 3 to collect image data and the fill light 4 to fill light at the high level, and triggers the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light at the low level.
  • the fill light 4 fills light at the high level of the fill light timing 302 and is turned off at the low level; the image data acquisition timing 303 of the image sensor 3 is synchronized with the fill light timing 302.
  • the image sensor 3 exposes at the high level of the image data acquisition timing 303 and outputs image data at the low level; since one frame of image data remains in each of the image sensor 3 and the image signal processor 7, the specific number of discarded frames N is two, and the first two frames of image data in the image data acquisition timing 303 are not transmitted to the decoding unit 8.
  • the dotted arrow in Figure 6 represents that the third frame of image data is output to the decoding unit 8 for decoding.
  • the decoding unit 8 receives and decodes the third frame of image data at time point d, successfully decodes the third frame of image data at time point e, and feeds back the successfully decoded information to the central processor 5; due to the signal delay, the image sensor 3 is controlled to stop collecting image data and the fill light 4 is controlled to stop filling light at time point f. It should be noted that when the ambient light is sufficient, fill light is not necessary. According to the image data acquisition timing 303, the image sensor 3 has collected eight frames of image data at this time; the eighth frame of image data will remain in the register 14 of the image signal processor 7, and the seventh frame of image data will remain in the storage area 13 of the image sensor 3.
  • when the optical information collector 100 is triggered again to collect new optical information, the optical information collector 100 will again discard the two frames of image data remaining in the image sensor 3 and the image signal processor 7, and decoding and output start from the third frame of image data, thereby avoiding decoding errors.
  • the specific number of discarded frames N is equal to or greater than the number of frames of image data remaining when image data was collected last time, and is not limited to two frames.
  • when image data remains in both the image sensor 3 and the image signal processor 7, the specific number of discarded frames N is greater than or equal to two.
  • Discarding the remaining image data may include: the decoding unit 8 does not receive the remaining image data, or the decoding unit 8 receives the remaining image data but does not decode it, or the decoding unit 8 decodes the remaining image data but the decoded information is not output or displayed on the display screen 11, so that the information output and displayed on the display screen 11 is decoded information of the new optical information.
  • if the storage area 13 of the image sensor 3 and the register 14 of the image signal processor 7 each store image data from the previous triggered acquisition, then when new optical information is acquired on the next trigger, the first two frames of image data will be discarded, the third frame and subsequent image data are regarded as image data of the new optical information, and the third frame and subsequent image data are decoded until decoding succeeds or times out.
  • the optical information collector 100 may not optimize the image data through the image signal processor 7.
  • for example, the image signal processor 7 only receives the RAW format image data transmitted by the image sensor 3 and then transmits the image data, without optimization, to the decoding unit 8 for decoding.
  • the decoding unit 8 directly receives the grayscale image data (only the brightness signal of the RAW image data is taken), which facilitates binarization and decoding of the image data; if the image signal processor 7 is only used as a simple data transmission channel, then no image data remains in the register 14 of the image signal processor 7; alternatively, the optical information collector 100 does not include the image signal processor 7 at all, and the RAW image data collected by the image sensor 3 is directly transmitted to the decoding unit 8 through an interface such as a DVP interface (Digital Video Port) or an LVDS (Low Voltage Differential Signaling) interface. Since in these cases only one frame of image data remains in the storage area 13 of the image sensor 3, only one frame of image data needs to be discarded when new optical information is collected.
  • the second frame and subsequent image data are image data of new optical information.
  • the second frame and subsequent image data will be decoded until decoding succeeds or times out; since decoding of the optical information starts from the second frame rather than from the third frame, the processing time and fill light time of one frame of image data are saved, which can increase the decoding speed and reduce power consumption.
  • since the image data is not optimized through the image signal processor 7, a certain amount of image data processing time can also be saved in theory.
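When the image signal processor is bypassed, the brightness can be taken directly from the RAW data before binarization. The sketch below assumes an 8-bit RAW Bayer frame delivered as a NumPy array; the 2×2 averaging and the fixed threshold are just one simple illustrative way to obtain a grayscale image and are not prescribed by this application.

```python
import numpy as np

# Sketch: obtain a grayscale (brightness-only) image from 8-bit RAW Bayer
# data by averaging each 2x2 Bayer cell, then binarize it for decoding.
# The averaging scheme and the threshold are illustrative assumptions.

def raw_bayer_to_gray(raw: np.ndarray) -> np.ndarray:
    h, w = raw.shape
    cells = raw[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return cells.mean(axis=(1, 3)).astype(np.uint8)    # one gray value per 2x2 cell

def binarize(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    return (gray > threshold).astype(np.uint8)          # 0/1 image fed to the decoder

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in RAW frame
bits = binarize(raw_bayer_to_gray(frame))
```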
  • the timing diagram 400 in FIG. 7 shows the trigger signal 401 of the external trigger, the fill light timing 402 of the fill light 4, and the image data collection timing 403 of the image sensor 3 continuously collecting image data.
  • the trigger signal 401 triggers the image sensor 3 to collect image data and the fill light 4 to fill light at the high level, and triggers the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light at the low level.
  • the fill light 4 fills light at the high level of the fill light timing 402 and turns off at the low level; since the optical information collector 100 does not optimize the image data through the image signal processor 7, only one frame of previously collected image data remains in the storage area 13 of the image sensor 3, and the optical information collector 100 discards the first frame of image data.
  • the dotted arrow in Figure 7 represents the second frame of image data being output to the decoding unit 8 for decoding.
  • the decoding unit 8 receives and decodes the second frame of image data at time point g, successfully decodes the second frame of image data at time point h, and feeds back the successfully decoded information to the central processor 5.
  • the central processor 5 then controls the image sensor 3 to stop collecting image data and controls the fill light 4 to stop filling light. It should be noted that when the ambient light is sufficient, fill light is not necessary.
  • according to the image data collection timing 403, the image sensor 3 has collected six frames of image data at this time, and the sixth frame of image data will remain in the storage area 13 of the image sensor 3.
  • when the optical information collector 100 is triggered again to collect new optical information, the optical information collector 100 will again discard the remaining frame of image data in the image sensor 3 and decode and output starting from the second frame of image data, thereby avoiding decoding errors.
  • each time new optical information is collected, one or two frames of residual image data are discarded as needed, which solves the problem of residual image data in the image sensor 3 or the image signal processor 7; optionally, more than two frames of residual data can also be discarded according to actual needs.
  • the above method has a certain drawback: every time new optical information is collected, the image sensor 3 needs to output and discard one or more remaining frames of image data, and the decoding unit 8 does not start decoding until at least the second frame of image data, which wastes time; it is conceivable that, if each time new optical information is collected the first frame of image data output by the image sensor 3 were already the effective image data (image data of the new optical information), efficiency could be improved.
  • if the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 is cleared after each successful decoding, there will be no remaining image data when new optical information is collected the next time.
  • the first frame of image data is then the image data of new optical information, so decoding can start directly from the first frame of image data, improving the decoding speed; this can be achieved through a preset algorithm, that is, through algorithm control, the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 is cleared after each successful decoding.
  • This preset algorithm for clearing the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 usually needs to be preset by the manufacturer of the image sensor 3 or the manufacturer of the image signal processor 7 (or the manufacturer of the central processor 5 integrated with the image signal processor 7, the same below).
  • the purchased image sensor 3 or image signal processor 7 has usually been predefined by the image sensor 3 or image signal processor 7 manufacturer, and is not easy to change.
  • residual image data is eliminated by bypassing the image data processing flow predefined by the manufacturer of the image signal processor 7 .
  • the optical information collector 100 does not optimize the image data through the image signal processor 7.
  • the image data collected by the image sensor 3 is output to the image signal processor 7 through the existing MIPI interface and stored in a buffer 15 that the manufacturer of the optical information collector 100 can configure separately; the buffer 15 may be integrated into the image signal processor 7 or set independently from the image signal processor 7, and the decoding unit 8 takes the image data out of the buffer 15 and decodes it.
  • the image data collected by the image sensor 3 is still transmitted to the image signal processor 7 and then to the decoding unit 8 through the existing MIPI interface, because it is relatively simple to transmit the image data through the existing MIPI interface.
  • the image signal processor 7 can be completely bypassed, that is, the image data collected by the image sensor 3 is directly transmitted to the decoding unit 8 for decoding. Since the image data is not optimized by the image signal processor 7 , only one frame of image data remains in the storage area 13 of the image sensor 3 . A specific process can be set to eliminate the remaining frame of image data in the image sensor 3 .
  • after decoding succeeds, the central processor 5 sends an end instruction to control the image sensor 3 to end collecting image data; the central processor 5 then sends instructions to the image sensor 3 again to control the image sensor 3 to continue to collect one or more frames of image data, preferably one frame of image data, and to continue to output this frame of image data.
  • in this way, the image data in the storage area 13 of the image sensor 3 is cleared, so that the next time new optical information is collected, the first frame of image data output by the image sensor 3 is the image data of the new optical information.
  • the last frame of image data output by the image sensor 3 can be input into the buffer 15 that can be configured by the manufacturer of the optical information collector 100, and further cleared, ultimately eliminating the remaining image data in the image sensor 3.
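The end-of-scan flush described here can be sketched as follows; `sensor.capture_one()` and `scrap_buffer` stand in for the extra single-frame capture and the manufacturer-configurable buffer 15, and are assumptions rather than interfaces defined in this application.

```python
# Sketch of the end-of-scan flush: after decoding succeeds and streaming is
# stopped, one extra frame is requested so the frame held in the sensor's
# storage area is pushed out, parked in a scrap buffer, and cleared.
# `sensor` and `scrap_buffer` are hypothetical placeholders.

def finish_scan(sensor, scrap_buffer):
    sensor.stop()                          # end instruction: stop digital-stream collection
    leftover = sensor.capture_one()        # extra capture pushes the residual frame out
    scrap_buffer.append(leftover)          # parked in the configurable buffer 15
    scrap_buffer.clear()                   # ...and cleared, so nothing survives the scan

# Usage idea: a plain list can play the role of the scrap buffer in this sketch,
# e.g. finish_scan(sensor, scrap_buffer=[]).
```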
  • the timing diagram 500 of an embodiment in FIG. 9 shows the trigger signal 501 of the central processor 5, the fill light timing 502 of the fill light 4, and the image data collection timing 503 of the image sensor 3 continuously collecting image data, in which: the trigger signal 501 at the high level triggers the image sensor 3 to collect image data and the fill light 4 to fill light, and at the low level triggers the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light.
  • the fill light 4 fills light at the high level of the fill light timing 502 and turns off at the low level; the image data collection timing 503 of the image sensor 3 is synchronized with the fill light timing 502, and the image sensor 3 exposes at the high level of the image data collection timing 503 and outputs image data at the low level; the dotted arrow in Figure 9 represents the first frame of image data being output to the decoding unit 8 for decoding.
  • the decoding unit 8 receives the first frame of image data at time point j, successfully decodes it at time point k, and feeds back the successfully decoded information to the central processor 5, and the central processor 5 sends out a signal at time point i to control the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light.
  • subsequently, the central processor 5 will send the control signal 510 again to separately control the image sensor 3 to continue to collect one frame of image data at the high level 530 and output this frame of image data; then there is no remaining image data in the image sensor 3, and once it is triggered to collect new optical information the next time, the first frame of image data collected and output by the image sensor 3 is the image data of the new optical information.
  • the decoding unit 8 can directly receive and decode that first frame of image data. At this time, the fill light 4 is at the low level 520 and no fill light is performed, thereby saving power consumption. It should be noted that when the ambient light is sufficient, fill light is not necessary during the entire process.
  • image data is continuously collected and decoded through the digital stream mode.
  • by then the image sensor 3 has collected multiple frames of image data; for example, in the timing diagram 200 the image sensor 3 collects a total of seven frames of image data.
  • optical information collectors 100 produced by iData, Honeywell, Zebra and other companies can basically decode successfully within the first three frames of image data; that is, at least one of the first three frames of image data collected by the image sensor 3 can be successfully decoded by the decoding unit 8.
  • however, the image sensor 3 has collected more than three frames of image data, even six or seven frames, and collecting the fourth to seventh frames of image data also requires the image sensor 3 to work and the fill light 4 to fill light; since the fourth to seventh frames of image data are not used for decoding, collecting them results in a waste of power consumption. It should be noted that in some embodiments, when the ambient light is sufficient, fill light is not necessary; for example, in daily life, fill light is usually not required when scanning codes with a mobile phone.
  • the optical information collector 100 can collect image data in a fixed frame mode, which is different from the continuous collection of image data in the digital stream mode.
  • in the fixed frame mode, the central processor 5 controls the image sensor 3 to collect a fixed number of frames of image data at a time, and the decoding unit 8 decodes the fixed number of frames of image data.
  • when decoding of the currently collected fixed number of frames of image data is completed (either one frame of image data is decoded successfully or all of the fixed number of frames fail to decode), or when decoding is about to be completed, the central processor 5 then determines whether it is necessary to collect a fixed number of frames of image data again, and so on, until decoding succeeds or times out; there is a time interval between two successive acquisitions of the fixed number of frames, rather than continuous collection, leaving time for the central processor 5 to make this judgment.
  • the timing diagram 600 of an embodiment in FIG. 10 shows the trigger signal 601 of the central processor 5, the fill light timing 602 of the fill light 4, the image data collection timing 603 of the image sensor 3, and the decoding timing 604 of the decoding unit 8, in which: the trigger signal 601 at the high level triggers the image sensor 3 to collect image data and the fill light 4 to fill light, and at the low level triggers the image sensor 3 to stop collecting image data and the fill light 4 to stop filling light.
  • the fill light 4 fills light at the high level of the fill light timing 602 and turns off at the low level; the image data collection timing 603 of the image sensor 3 is synchronized with the fill light timing 602, and the image sensor 3 exposes at the high level of the image data collection timing 603 and outputs image data at the low level; the four dotted arrows from left to right in Figure 10 represent the output of the first to fourth frames of image data respectively; after decoding by the decoding unit 8, the first to third frames of image data are not decoded successfully and the fourth frame of image data is decoded successfully.
  • the central processor 5 determines whether decoding of the first fixed batch of three frames of image data is completed, and thereby determines whether it is necessary to control the image sensor 3 to continue to collect the next fixed batch of three frames of image data.
  • the image data collection sequence 603 shows that the image sensor 3 collects image data in a fixed frame mode with a fixed frame number of three frames.
  • the central processor 5 controls the image sensor 3 to first collect three frames of image data (the fixed frame number) and transmit the three frames of image data to the decoding unit 8; when decoding does not succeed, the image sensor 3 is controlled to collect three frames of image data again, which are transmitted to the decoding unit 8 for decoding again, and so on, until decoding succeeds (or decoding times out).
  • when the fourth frame of image data collected by the image sensor 3 is decoded successfully, if the image sensor 3 has not yet collected the full fixed number of three frames at that moment, the image sensor 3 will continue to execute the fixed frame mode until the fixed number of frames has been collected, that is, it continues to collect the fifth and sixth frames of image data, outputs all of the fifth and sixth frames, and then stops image data collection; no image data remains in the image sensor 3.
  • the image sensor 3 can also be controlled to stop collecting image data as soon as decoding succeeds, even if the image sensor 3 has not collected the fixed number of frames of image data, which can save power consumption to a certain extent, but this will result in residual image data in the image sensor 3; the image data remaining in the image sensor 3 can be discarded when new optical information is collected the next time.
  • since the image sensor 3 is controlled to collect the next three frames of image data only after the first three frames of image data are decoded (successfully or not), there is a time interval between the two acquisitions of three frames of image data; if the first fixed batch of three frames of image data is not decoded successfully, there will be an obvious delay before the next fixed batch of three frames of image data is acquired.
  • when decoding of the first three frames is about to be completed, the image sensor 3 can be controlled to collect three more frames of image data to achieve a balance between decoding speed and power consumption; the time to start collecting the next three frames of image data can be determined according to actual needs, so that there is no obvious delay before the next three frames of image data are collected.
  • in this embodiment, the fixed frame number of the fixed frame mode is three frames, that is, the image sensor 3 collects three frames of image data at a time; in some embodiments, the fixed frame number can be determined according to the performance of the specific optical information collector 100. For example, if the optical information collector 100 can usually decode the first two frames or the first frame of image data successfully, the fixed frame number of the fixed frame mode can first be set to two frames or one frame, to avoid the waste of power consumption caused by subsequently collected image data; with a fixed frame number of one, every time decoding of the current frame of image data is completed without success, the image sensor 3 is controlled to collect the next frame of image data. Of course, the fixed frame number can also be set to two frames, four frames, five frames or more.
  • the per-frame timeout is usually set to 100 ms, that is, when the time for the decoding unit 8 to decode one frame of image data reaches 100 ms without success, it stops decoding this frame of image data and switches to decoding the next frame of image data, as sketched below.
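A per-frame deadline of roughly 100 ms could be enforced as in the sketch below; the `attempts` list of decoding strategies and the deadline check are assumptions for illustration, not the mechanism mandated by this application.

```python
import time

# Sketch of a per-frame decoding deadline (about 100 ms): decoding attempts
# for the current frame are abandoned once the deadline passes, and the
# caller moves on to the next frame. `attempts` is a list of hypothetical
# decoding strategies (e.g. different algorithms or parameter sets).

def decode_with_deadline(frame, attempts, deadline_s=0.1):
    start = time.monotonic()
    for attempt in attempts:
        if time.monotonic() - start >= deadline_s:
            return None                 # 100 ms spent on this frame: give up, next frame
        result = attempt(frame)
        if result is not None:
            return result
    return None

def scan(frames, attempts):
    for frame in frames:
        result = decode_with_deadline(frame, attempts)
        if result is not None:
            return result               # stop at the first successfully decoded frame
    return None
```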
  • a hybrid mode that combines the advantages of fixed frame mode and digital stream mode can be used to adapt to complex application scenarios and achieve a balance between power consumption and decoding speed.
  • for some optical information that is difficult to identify, such as high-density QR codes, DPM (Direct Part Mark) codes or complex text symbols, the fixed frame mode can be used first, and if decoding is not successful, the digital stream mode is then used to continuously collect image data for decoding; it is conceivable that this hybrid mode can also be used to read simple optical information.
  • the camera 1 can be configured to first collect image data in the fixed frame mode a preset number of times, and then collect image data in the digital stream mode; for example, the fixed frame mode is first used to collect a fixed number of frames of image data, and then the digital stream mode is used to continuously collect image data.
  • refer to the timing diagram 700 in Figure 11, which shows the trigger signal 701 of the central processor 5, the fill light timing 702 of the fill light 4, and the image data collection timing of the image sensor 3 continuously collecting image data.
  • the optical information collector 100 first collects three frames of image data using a fixed frame mode with a fixed number of frames of three.
  • the decoding unit 8 successfully decodes the first frame of image data collected in the digital stream mode.
  • the fixed frame mode may be used multiple times first, and then the digital stream mode may be used when decoding is not successful.
  • for example, the fixed frame mode may be used twice first and then the digital stream mode: a fixed batch of three frames of image data is first collected and decoded, then another fixed batch of three frames is collected and decoded, and when decoding is still not successful the digital stream mode is used for decoding; as can be imagined, the fixed frame mode can also be used three or more times first, with the digital stream mode used for decoding when decoding is not successful.
  • the hybrid mode can also first adopt the fixed frame mode, then adopt the digital stream mode, and finally end with the fixed frame mode.
  • the timing diagram 800 of an embodiment in FIG. 12 shows the trigger signal 801 of the central processor 5, the fill light timing 802 of the fill light 4, the image data collection timing 803 of the image sensor 3 continuously collecting image data, and the decoding timing 804 of the decoding unit 8.
  • the optical information collector 100 first collects three frames of image data using the fixed frame mode with a fixed frame number of three; when decoding is not successful, the digital stream mode is used to continuously collect and decode image data.
  • the decoding unit 8 successfully decodes the first frame of image data collected in the digital stream mode, i.e. the fourth frame of image data, and when the fourth frame of image data is decoded successfully, the image sensor 3 is controlled to stop image data collection.
  • then the central processor 5 will send the control signal 810 again to separately control the image sensor 3 to continue to collect one frame of image data at the high level 830 and output this frame of image data; after that, no image data remains in the image sensor 3, and since the image signal processor 7 is bypassed, no image data remains in the image signal processor 7 either.
  • the next time acquisition is triggered, the first frame of image data collected and output by the image sensor 3 is the image data of the new optical information, and the decoding unit 8 can directly receive and decode that first frame of image data.
  • the fill light 4 is at a low level 820 and no fill light is performed, thereby saving power consumption. It should be noted that when the ambient light is sufficient, fill light is not necessary during the entire process.
  • It is also conceivable that the hybrid mode can first use the digital stream mode to collect and decode image data and, after decoding succeeds, use the fixed frame mode to control the image sensor 3 to continue collecting a fixed number of frames of image data and to output all of them, so that no image data remains in the image sensor 3; the previous embodiment describes a special case of this, in which the image sensor 3 continues to collect one frame of image data after the digital stream mode decodes successfully.
  • When the hybrid mode is used, the optical information collector 100 can still optimize the image data through the image signal processor 7; in order to eliminate the residual image data, the aforementioned method of discarding a specific number N of frames of residual image data may additionally be adopted (see the frame-discard sketch after this list). The specific number N of frames to be discarded can be determined according to where image data remains: for example, when residual image data is stored in both the image sensor 3 and the image signal processor 7, two frames of image data need to be discarded each time image data is collected anew.
  • Once triggered, the central processor 5 issues an instruction to discard a specific number N of frames of image data, where these N frames are the image data that was collected upon the previous trigger and has remained in the collector. This prevents the residual image data from being decoded and output, which would cause decoding errors, and at the same time avoids decoding the residual image data at all, saving power.
  • In the fixed frame mode, the image sensor 3 collects and outputs a fixed number of frames of image data at a time (see the fixed frame mode sketch after this list). Compared with the existing approach of continuously collecting and outputting image data in the digital stream mode, this saves power: in the digital stream mode, when one frame is decoded successfully, the subsequent frames that have already been collected are not used for decoding, so the power spent collecting them is wasted.
  • When the image sensor 3 collects image data in the digital stream mode, the image data is not optimized through the image signal processor 7, so no residual image data is left in the image signal processor 7; and when decoding succeeds or times out, the image sensor 3 is controlled to stop continuous collection in the digital stream mode and to continue collecting and outputting a fixed number of frames of image data, so that no residual image data is left in the image sensor 3 either. This avoids decoding errors the next time optical information is collected and improves efficiency.
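
The fixed frame mode described in the points above can be illustrated with a minimal control-flow sketch. The sketch below is purely illustrative and is not the patent's implementation: the helper names capture_batch and try_decode, the batch size of three frames, and the round limit are all assumptions introduced for this example.

    # Minimal sketch of a fixed frame mode acquisition loop (hypothetical; the
    # patent does not publish code). capture_batch(n) is assumed to make the
    # sensor expose, read out and return exactly n frames; try_decode(frame)
    # is assumed to return the decoded data or None.

    FIXED_FRAMES_PER_ROUND = 3  # assumed value; the description suggests roughly 3 to 5 frames

    def fixed_frame_scan(capture_batch, try_decode, max_rounds=10):
        for _ in range(max_rounds):
            frames = capture_batch(FIXED_FRAMES_PER_ROUND)  # sensor outputs the whole batch
            for frame in frames:
                result = try_decode(frame)
                if result is not None:
                    # The remaining frames of the batch were already output by the
                    # sensor, so nothing stays buffered; they are simply not decoded.
                    return result
        return None  # decode timeout after max_rounds batches

Because each round outputs every frame it captured, nothing remains buffered in the sensor between triggers, which is the property the fixed frame mode relies on.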
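
The "end with one extra frame" behaviour (control signal 810 and high level 830 in FIG. 12) can likewise be sketched. Again this is only an illustration under stated assumptions: start_stream, stop_stream, read_frame, set_fill_light and try_decode are hypothetical callables, and the single-frame flush count follows the embodiment described above.

    import time

    # Hypothetical sketch of digital stream capture followed by a flush frame.
    # After a successful decode (or a timeout) the sensor is asked for one more
    # frame, with the fill light off, purely so that its internal buffer is
    # emptied before the next trigger.

    def stream_scan_with_flush(start_stream, stop_stream, read_frame,
                               try_decode, set_fill_light, timeout_s=5.0):
        set_fill_light(True)
        start_stream()                        # continuous (digital stream) capture
        result = None
        deadline = time.monotonic() + timeout_s
        while result is None and time.monotonic() < deadline:
            result = try_decode(read_frame())
        stop_stream()
        set_fill_light(False)                 # low level 820: no fill light for the flush
        read_frame()                          # capture and discard one frame (high level 830)
        return result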
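
Combining the two gives the hybrid mode: a preset number of fixed frame rounds for easy symbols, then a digital stream fallback for difficult ones, ending in a state with no buffered frames. The sketch below is a schematic composition of the two previous sketches, with assumed callables rather than the patent's actual control code.

    # Hypothetical hybrid mode: low-power fixed frame rounds first, continuous
    # digital stream capture as a fallback for hard-to-read symbols such as
    # high-density codes or DPM, finishing with the sensor buffer empty.

    def hybrid_scan(run_fixed_frame_round, run_digital_stream, fixed_rounds=1):
        # Stage 1: a preset number of fixed frame rounds (three frames each in FIG. 11).
        for _ in range(fixed_rounds):
            result = run_fixed_frame_round()
            if result is not None:
                return result
        # Stage 2: digital stream mode; run_digital_stream() is expected to flush
        # the sensor before returning, as in the FIG. 12 embodiment, so that no
        # residual frame is left for the next trigger.
        return run_digital_stream()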
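
Finally, when residual frames cannot be avoided (for example when the image signal processor is used for optimization), the discard step can be sketched as follows. This, too, is a hypothetical illustration: read_frame, try_decode and the residual_frames count are assumptions, and in practice the count would match the number of frames known to remain in the sensor and/or the image signal processor (one or two in the embodiments above).

    import time

    # Hypothetical sketch of the "discard N residual frames" step. The first
    # residual_frames frames read after a trigger are the stale frames left over
    # from the previous trigger; they are read out and thrown away, never decoded.

    def scan_after_trigger(read_frame, try_decode, residual_frames=2, timeout_s=5.0):
        for _ in range(residual_frames):
            read_frame()                       # stale frame: discard without decoding
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            result = try_decode(read_frame())  # frame N+1 onward is the new optical information
            if result is not None:
                return result
        return None                            # decode timeout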

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Toxicology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Input (AREA)

Abstract

The present application provides an optical information collector and a method therefor. The optical information collector includes an image sensor for collecting image data of optical information; a decoding unit for decoding the images according to a preset decoding algorithm; and a central processor which, upon a trigger, controls the image sensor to collect images and controls the decoding unit to decode them, wherein, once triggered, the central processor issues an instruction to discard a specific number N of frames of image data, the N frames being the image data collected upon the previous trigger and remaining in the optical information collector. This prevents the residual image data from being decoded and output, which would cause decoding errors, and improves decoding efficiency.

Description

[根据细则26改正 16.08.2023]光学信息采集器及方法 技术领域
本申请涉及光学信息采集领域,尤指一种光学信息采集器及其方法。
背景技术
基于图像扫描的光学信息采集,包括对机器可读的各种光学信息的采集,常见的机器可读光学信息包括一维码、二维码、OCR图文、紫外防伪码、红外防伪码等。
通常,在采集光学信息时需要进行补光,以获取清晰的图像。传统的补光方式,在采集光学信息时持续补光,设备发热严重,功耗高。
本申请主要发明人开发了中国专利CN201810098421.0中公开的一种低功耗的补光方法,其全部内容作为参考引用于此,其中,光学成像单元在帧周期的曝光时间打开补光,而在帧周期的采样时间关闭补光,通过周期性开启和关闭补光单元,达到降低功耗的目的。
光学成像单元的帧周期通常在50ms以内,优秀的光学成像单元,帧周期可以达到20ms以内,这意味着光学成像单元在1s时间内可以采集数十帧图像。光学成像单元通常采用数字流的方式连续采集图像,而成功识别光学信息通常来自其中的一帧图像,当光学信息识别成功时,光学成像单元将不再继续识别后续采集的图像,并且停止继续采集光学信息。
然而,当光学信息识别成功时,由于光学成像单元采用数字流的方式连续采集图像,当通过其中一帧图像成功识别光学信息,后续已经连续采集了多帧图像,而后续采集的多帧图像不用于光学信息识别,造成了功耗的浪费。
本申请主要发明人进一步开发了中国专利CN201811485945.1中公开的另一种低功耗的条码扫描系统及方法,其全部内容作为参考引用于此,其中,图像传感器在受触发前处于待机状态,并且在解码成功或解码超时后,图像传感器重新进入待机状态。这种方法与手机行业不同,手机在打开摄像头应用之后,摄像头一直采集和预览图像,功耗很高,然而手机并不是长时间使用摄像头,功耗问题可以忽略。条码扫描行业则不同,条码扫描的图像传感器需要长时间使用,如果一直处于工作状态,续航将明显下降。
然而,这带来了另一个问题,本申请发明人发现,图像传感器采用数字流方式连续采集图像时,如果在解码成功或解码超时之后,控制图像传感器突然中断图像采集(此时设备仍然开机,图像传感器仍然上电,但是不会输出和预览图像,从而节省功耗),如前所述,图像传感器中可能已经采集了图像,尽管这是概率性出现的事件,却也会造成一定的不良影响,而这已经采集的图像将不会继续输出,而是存储(残存)于图像传感器的存储区(一些图像传感器配备了单独的缓存器,则图像会残存于缓存器;而对于没有配备单独缓存器的图像传感器,则图像的电信号残存于图像传感器的像素单元的PN结中),当图像传感器下一次采集图像时,会首先输出残存的图像。
发明人经过研究还发现,采用高通或联发科的某些平台,不仅在图像传感器中可能会残存图像,而且在其它存储器,如图像信号处理器的缓存器中,也可能会残存图像。
需要开发一些新的技术来解决上述问题。
发明内容
本申请创作的目的在于提供一种避免解码错误,进而可以降低功耗的光学信息采集器及其方法。
为实现上述目的,本申请采用以下技术手段:
本申请提供一种光学信息采集器,其特征在于,包括:图像传感器,用以采集光学信息的图像数据;解码单元,用以根据预置的解码算法对图像数据进行解码;中央处理器,所述中央处理器经触发控制所述图像传感器采集图像数据,并控制所述解码单元解码图像数据,其中,一旦经由触发,所述中央处理器发出指令丢弃特定帧数的N帧图像数据,其中,特定帧数的N帧图像数据为上一次经触发采集并残存于所述光学信息采集器中的图像数据。
可选地,特定帧数的N帧图像数据包括所述图像传感器的存储区中残存的图像数据。
可选地,包括图像信号处理器,用以接收所述图像传感器采集的图像,并将图像数据传输至所述解码单元,特定帧数的N帧图像数据包括所述图像信号处理器中残存的图像数据。
可选地,丢弃特定帧数的N帧图像数据包括:所述解码单元不接收特定帧数的N帧图像数据,或所述解码单元不解码特定帧数的N帧图像数据,或所述解码单元将特定帧数的N帧图像数据的解码信息不输出或不显示。
可选地,所述解码单元从第N+1帧图像数据开始解码。
本申请提供一种光学信息采集方法,其特征在于,包括:中央处理器经由触发控制图像传感器采集和输出图像数据;中央处理器接收图像数据并发出指令丢弃特定帧数的N帧图像数据,其中,特定帧数的N帧图像数据为上一次经由触发采集并残存的图像数据;解码单元解码图像数据。
可选地,特定帧数的N帧图像数据包括所述图像传感器的存储区中残存的图像数据。
可选地,通过图像信号处理器接收所述图像传感器采集的图像数据,所述图像信号处理器进一步将图像数据传输至所述解码单元;特定帧数的N帧图像数据包括所述图像信号处理器中残存的图像数据。
可选地,丢弃特定帧数的N帧图像数据包括:所述解码单元不接收特定帧数的N帧图像数据,或所述解码单元不解码特定帧数的N帧图像数据,或所述解码单元将特定帧数的N帧图像数据的解码信息不输出或不显示。
可选地,所述解码单元从第N+1帧图像数据开始解码。
本申请提供一种光学信息采集器,其特征在于,包括:图像传感器,用以采集光学信息的图像数据;解码单元,用以接收和解码图像数据;中央处理器,用以控制所述图像传感器采集图像数据,并控制所述解码单元解码图像数据,其中,经由触发,所述中央处理器控制所述图像传感器以固定帧模式采集和输出固定帧数的图像数据,并控制所述解码单元解码图像数据,且当其中一帧图像数据解码成功时,停止解码固定帧数的图像数据中剩余的图像数据。
可选地,所述固定帧模式包括:当所述解码单元解码成功而所述图像传感器未采集完固定帧数的图像数据,所述图像传感器将继续采集完固定帧数的图像数据并输出固定帧数的图像数据。
可选地,所述固定帧模式包括:所述中央处理器控制所述解码单元依次接收并解码固定帧数的图像数据,且当固定帧数的图像数据中最后一帧图像数据未成功解码之时或之前,控制所述图像传感器又一次采集固定帧数的图像数据。
可选地,不包括图像信号处理器或不通过图像信号处理器对所述图像传感器采集的图像数据进行优化处理。
可选地,所述图像传感器配置为以如下次序采集图像数据:采用预设次数的固定帧模式采集图像数据,以及采用数字流模式连续采集图像数据。
本申请提供一种光学信息采集方法,其特征在于,包括:中央处理器经由触发控制图像传感器以固定帧模式采集和输出固定帧数的图像数据;解码单元接收和解码图像数据,且当其中一帧图像数据解码成功时,停止解码固定帧数的图像数据中剩余的图像数据。
可选地,当所述解码单元解码成功而所述图像传感器未采集完固定帧数的图像数据,所述图像传感器将继续采集完固定帧数的图像数据并输出固定帧数的图像数据。
可选地,所述中央处理器控制所述解码单元依次接收并解码固定帧数的图像数据,且当固定帧数的图像数据中最后一帧图像数据未成功解码之时或之前,控制所述图像传感器又一次采集固定帧数的图像数据。
可选地,不包括图像信号处理器或不通过图像信号处理器对所述图像传感器采集的图像数据进行优化处理。
可选地,所述图像传感器配置为以如下次序采集图像数据:采用预设次数的固定帧模式采集图像数据,以及采用数字流模式连续采集图像数据。
本申请提供一种光学信息采集器,其特征在于,包括:图像传感器,用以采集光学信息的图像数据;存储器,预置了一种或多种解码算法;解码单元,用以接收和解码图像数据;中央处理器,用以控制所述图像传感器以数字流模式连续采集图像数据,并控制所述解码单元依次解码图像数据,一旦所述解码单元解码成功或解码超时,则控制所述图像传感器停止以数字流模式连续采集图像数据,并控制所述图像传感器继续采集和输出固定帧数的图像数据。
可选地,所述光学信息采集器不具有图像信号处理器或不通过图像信号处理器对图像数据进行优化处理。
可选地,所述图像传感器输出RAW格式图像数据,所述解码单元基于RAW格式图像数据获取灰度图像数据,并且基于灰度图像数据进行解码。
可选地,所述固定帧数的图像数据为一帧或二帧。
可选地,所述图像传感器采集的图像数据直接传输至所述解码单元进行解码。
本申请提供一种光学信息采集方法,其特征在于,包括:中央处理器控制图像传感器以数字流模式连续采集和输出图像数据;解码单元接收和解码图像数据,一旦所述解码单元解码成功,则控制所述图像传感器停止以数字流模式采集图像数据;控制所述图像传感器继续采集和输出固定帧数的图像数据。
可选地,所述光学信息采集器不具有图像信号处理器或不通过图像信号处理器对图像数据进行优化处理。
可选地,所述图像传感器输出RAW格式图像数据,所述解码单元基于RAW格式图像数据获取灰度图像数据,并且基于灰度图像数据进行解码。
可选地,所述固定帧数的图像数据为一帧或二帧。
可选地,所述图像传感器采集的图像数据直接传输至所述解码单元进行解码。
附图说明
图1为本申请一种实施例光学信息采集器的简化框图;
图2为本申请一种实施例光学信息采集器的示意图;
图3为图2中的光学信息采集器的立体图;
图4为本申请一种实施例光学信息采集器的高级框图;
图5为本申请一种实施例光学信息采集器通过数字流模式采集光学信息的时序图;
图6为本申请一种实施例光学信息采集器采集光学信息的时序图;
图7为本申请一种实施例光学信息采集器采集光学信息的又一时序图;
图8为本申请另一种实施例光学信息采集器的高级框图;
图9为本申请另一种实施例光学信息采集器采集光学信息的时序图;
图10为本申请另一种实施例光学信息采集器通过固定帧模式采集光学信息的时序图;
图11为本申请另一种实施例光学信息采集器通过混合模式采集光学信息的时序图;
图12为本申请另一种实施例光学信息采集器通过另一种混合模式采集光学信息的时序图。
具体实施方式的附图标号说明:
光学信息采集器100;摄像头1;光学系统2;图像传感器3;补光灯4;中央处理器5;存储器6;图像信号处理器7;解码单元8;壳体9;扫描窗10;显示屏11;按钮12;存储区13;寄存器14;缓冲器15。
具体实施方式
为便于更好的理解本申请的目的、结构、特征以及功效等,现结合附图和具体实施方式对本申请作进一步说明。
参照图1所示,为一种实施例的光学信息采集器100的实现简化框图。如下面进一步详细说明的,所述光学信息采集器100可用于采集一种或多种光学信息,如一维码、二维码、OCR图文、紫外防伪码、红外防伪码等。
所述光学信息采集器100可包括至少一个摄像头1,所述摄像头1可以包括捕获光线的光学系统2(lens)与将光学系统2捕获的光线进行光电转化的图像传感器3(sensor)的组合,所述光学系统2可以包括一个或多个反射镜、棱镜、透镜或其组合,所述图像传感器3也可以为一个或多个,可以是一个所述图像传感器3对应一个/一套所述光学系统2,或者多个所述图像传感器3可以共用同一个/一套光学系统2,或者多个/多套光学系统2可以共用同一个图像传感器3。所述图像传感器3可以是CCD或CMOS或其它类型的图像传感器3,所述图像传感器3用以将光信号转化为电信号,进而输出图像数据的数字信号。
所述光学信息采集器100可包括一个或多个补光灯4,所述补光灯4用以在所述摄像头1采集图像数据时,照亮光学信息。当然,在合适的环境光照条件下,可以不使用补光灯4进行补光,或者,所述光学信息采集器100可以不具有补光灯4。所述补光灯4的补光方式可以有多种形式:比如所述补光灯4在所述摄像头1采集光学信息时持续补光;或者所述补光灯4可以是与摄像头1的图像传感器3的曝光时间同步补光,其中,中国专利CN201810098421.0公开了一种补光灯4与图像传感器3的曝光时间同步补光的技术方案,其全部内容作为参考引用于此;所述补光灯4还可以是脉冲式补光,其脉冲时间与图像传感器3的曝光时间的一部分重叠。
所述光学信息采集器100还可包括中央处理器5,用以执行各种指令。
所述光学信息采集器100还可包括单独的或集成的存储器6,根据需要,所述存储器6中预置了一种或多种解码算法,所述存储器6还可以存储其它的程序或指令。存储器6可以包括一个或多个非暂态存储介质,诸如例如可以是固定的或可移动的易失性和/或非易失性存储器。具体来说,存储器6可以被配置成存储信息、数据、应用、指令等用于使处理模块能够执行根据本发明的示例实施例的各种功能。例如,存储器6可以被配置成缓冲输入数据用于由中央处理器5处理。另外地或者替代地,存储器6可以被配置成存储由中央处理器5执行的指令。存储器6可以被认为是主存储器并且被包括在例如以RAM或其他形式的易失性存储装置中,易失性存储装置仅在操作期间保持其内容,和/或存储器6可以被包括在非易失性存储装置中,诸如ROM、EPROM、EEPROM、FLASH或其他类型的存储装置,非易失性存储装置独立于处理模块的电源状态保持存储器内容。存储器6还可以被包括在存储大量数据的辅助存储设备、诸如外部磁盘存储器中。在一些实施例中,存储器可以使用输入/输出部件经由数据总线或其它路由部件与中央处理器5通信。辅助存储器可以包括硬盘、紧致盘、DVD、存储卡或本领域技术人员已知的任何其它类型的大容量存储类型。所述存储器6可以存储了接下来将要描述的多种光学信息采集、传输、处理和解码的流程或方法中的一种或多种。
所述光学信息采集器100还可包括图像信号处理器7(Image Signal Processor,简称ISP),图像信号处理器7用以对所述摄像头1采集的图像数据进行优化处理,优化处理包括线性纠正、噪点去除、坏点修补、颜色插值、白平衡矫正、曝光校正等中的一种或多种,以优化图像数据的质量。对于不需要进行颜色识别的光学信息,前述优化处理中的部分或全部,比如颜色插值等,不是必须的。图像信号处理器7可以是通过单核单线程地一次处理一帧图像数据,图像信号处理器7也可以是多核多线程同时处理多帧图像数据。可选地,所述光学信息采集器100可以不具有图像信号处理器7,或不通过图像信号处理器7对图像数据进行优化处理。
所述光学信息采集器100还可包括解码单元8,所述解码单元8用以根据预置的解码算法对所述摄像头1采集的图像数据进行解码,进而识别出光学信息,如识别出一维码或二维码的编码信息,或识别OCR图文,或识别各种紫外/红外防伪码的编码信息等。所述解码单元8可以是单核单线程一次解码一帧图像数据,或者所述解码单元8也可以是多核多线程同时解码多帧图像数据。
可选地,所述图像信号处理器7的部分或全部功能模块可集成于所述中央处理器5,如中国专利CN201811115589.4公开了一种集成了图像信号处理器7的中央处理器5,其全部内容作为参考引用于此;可选地,所述图像信号处理器7的部分或全部功能模块可集成于所述图像传感器3;可选地,所述解码单元8也可以集成于所述中央处理器5;可选地,所述存储器6也可以集成于所述中央处理器5。下述实施例中,当通过图像信号处理器7对图像数据进行优化处理时,图像信号处理器7和解码单元8优选地集成于所述中央处理器5,从而可以节省成本;当然,图像信号处理器7和解码单元8也可以不集成于中央处理器5。
如图2和图3示出了作为所述光学信息采集器100的一种具体实施例的手持终端的示意图,所述手持终端包括壳体9、显示屏11和按钮12。所述壳体9的前端设有扫描窗10,所述摄像头1收容于壳体9内部,且可以透过所述扫描窗10采集光学信息。可选地,所述光学信息采集器100可以不具有显示屏11,而是将信息输出至单独的显示屏11进行显示。可选地,所述光学信息采集器100可以为固定式、桌面式或其它形式的终端,所述光学信息采集器100也可以作为其它设备的一部分而集成于其它设备。
所述中央处理器5经由外部触发发出触发指令,外部触发可以是用户按压特定按钮12或触摸显示屏11的特定区域或用户通过特定的手势操作所述光学信息采集器100而产生的触发。一旦所述中央处理器5经由外部触发,则根据预设算法发出触发指令,进而触发所述图像传感器3采集图像数据。
其中,还可以通过图像信号处理器7对图像传感器3采集的图像数据进行优化处理,之后再输出至解码单元8进行解码。参考图4的框图示出的一种具体实施例中,光学信息采集器100采集条码的示意图,用户按压按钮12触发所述补光灯4补光和所述图像传感器3采集图像数据,所述图像信号处理器7可通过MIPI接口(移动产业处理器接口Mobile Industry Processor Interface,简称MIPI)依序接收并优化处理所述图像传感器3采集的图像数据,所述解码单元8解码所述图像信号处理器7优化处理之后传送过来的图像数据,当其中一帧图像数据成功解码,所述解码单元8将停止解码,并告知所述中央处理器5解码成功,所述中央处理器5发出指令控制所述图像传感器3停止采集图像数据。
所述图像传感器3可以通过数字流模式连续采集图像数据,所谓数字流模式,即根据预设算法,所述图像传感器3将在预设时间内连续采集图像数据,所述解码单元8单线程依次解码连续采集的图像数据或多线程同时解码连续采集的图像数据,当解码成功或解码超时,则控制所述图像传感器3停止采集图像数据,和控制所述解码单元8停止解码。比如预设时间为五秒,代表所述图像传感器3将在五秒内连续采集图像数据,若图像传感器3在五秒内采集的图像数据都没有成功解码,则解码超时;若其中一帧图像数据成功解码,即使时间还未达到五秒,中央处理器5也将控制所述图像传感器3停止采集图像数据,并控制所述解码单元8停止解码。
如图5示出了一种实施例所述光学信息采集器100通过数字流模式采集光学信息的时序图200,时序图200示出了外部触发的触发信号201、所述补光灯4的补光时序202、所述图像传感器3连续采集图像数据的图像数据采集时序203和解码单元8的解码时序204,其中:触发信号201在高电平触发图像传感器3采集图像数据和补光灯4补光,在低电平触发图像传感器3停止采集图像数据和补光灯4停止补光,所述补光灯4在补光时序202的高电平补光、在低电平关闭补光;所述图像传感器3的图像数据采集时序203与补光时序202同步,所述图像传感器3在图像数据采集时序203的高电平曝光、在低电平将图像数据输出;图5中虚线箭头代表第一帧图像数据输出至解码单元8解码,所述解码单元8在时间点a接收第一帧图像数据,在时间点b成功解码第一帧图像数据,并将成功解码的信息反馈至所述中央处理器5,所述中央处理器5在时间点c控制所述图像传感器3停止采集图像数据并控制所述补光灯4停止补光。由于信号延迟,触发信号201高电平的上升沿略早于图像数据采集时序203高电平的上升沿,而触发信号高电平的下降沿略早于图像传感器3结束采集图像数据的时间点c。需要说明的是,当环境光充足时,补光并不是必须的。
根据时序图200可知,所述解码单元8解码图像数据时,所述图像传感器3同时也在采集新的图像数据,当第一帧图像数据被所述解码单元8解码成功时,所述图像传感器3已经采集了七帧图像数据,其中第二至第七帧图像数据没有传输至所述解码单元8的图像数据,而是存储(残存)于所述图像传感器3的存储区13(缓存或PN结)或所述图像信号处理器7对应的寄存器14中,根据先进先出原则,后采集的图像数据将覆盖之前采集的图像数据,则第七帧图像数据将存储于所述图像信号处理器7的寄存器14中,而第六帧图像数据将存储于所述图像传感器3的存储区13,而第二至五帧图像数据被覆盖而清除。
当所述光学信息采集器100再一次经由触发,采集新的光学信息时,所述解码单元8将最先接收并解码上一次残存于所述图像传感器3的存储区13或图像信号处理器7的寄存器14中的图像数据,这势必会导致解码错误,因为上一次残存的图像数据,并不是新的光学信息的图像数据。
可以通过如下方法解决上述问题,避免解码错误。
一种可选的方法如图6中的时序图300所示,通过丢弃特定帧数的N帧图像数据,而从第N+1帧图像数据开始解码,达到避免解码错误的目的,其中,N≥1。时序图300示出了外部触发的触发信号301、所述补光灯4的补光时序302、所述图像传感器3连续采集图像数据的图像数据采集时序303和解码单元8的解码时序304,其中:触发信号301在高电平触发图像传感器3采集图像数据和补光灯4补光,在低电平触发图像传感器3停止采集图像数据和补光灯4停止补光,所述补光灯4在补光时序302的高电平补光、在低电平关闭补光;所述图像传感器3的图像数据采集时序303与补光时序302同步,所述图像传感器3在图像数据采集时序303的高电平曝光、在低电平将图像数据输出;由于图像传感器3和图像信号处理器7中各残存一帧图像数据,则丢弃特定帧数N为两帧,图像数据采集时序303中的前两帧图像数据未传输至所述解码单元8,图6中虚线箭头代表第三帧图像数据输出至解码单元8解码,解码时序304中,解码单元8在时间点d接收并解码第三帧图像数据,在时间点e成功解码第三帧图像数据,并将成功解码的信息反馈至所述中央处理器5,由于信号的延迟,在时间点f控制所述图像传感器3停止采集图像数据并控制所述补光灯4停止补光。需要说明的是,当环境光充足时,补光并不是必须的。根据图像数据采集时序302可知,此时图像传感器3已经采集了八帧图像数据,而第八帧图像数据将残存于所述图像信号处理器7的寄存器14中,第七帧图像数据将残存于所述图像传感器3的存储区13。当所述光学信息采集器100再一次经由触发采集新的光学信息,所述光学信息采集器100将会再次丢弃所述图像传感器3和所述图像信号处理器7中残存的两帧图像数据,而从第三帧图像数据开始解码输出,避免解码错误。
容易理解的是,丢弃的特定帧数N大于等于上一次采集图像数据时残存的图像数据的帧数即可,而不限定为两帧,比如丢弃所述图像传感器3和所述图像信号处理器7中残存的图像数据,则特定帧数N大于等于两帧。丢弃残存的图像数据可以包括,解码单元8不接收残存的图像数据,或解码单元8虽然接收残存的图像数据,但是不解码残存的图像数据,或虽然解码单元8解码残存的图像数据,但是解码的信息不输出或不显示于显示屏11,输出和显示于显示屏11的信息是新的光学信息的解码信息。比如,已经知道所述图像传感器3的存储区13和所述图像信号处理器7的寄存器14中分别存储了上一次触发采集的图像数据,则在下一次触发采集新的光学信息时,丢弃前两帧图像数据,而将第三帧及以后的图像数据当做新的光学信息的图像数据,进而解码第三帧及以后的图像数据,直至解码成功或解码超时。
一种实施例中,所述光学信息采集器100可以不通过图像信号处理器7对图像数据优化处理,图像信号处理器7仅仅接收所述图像传感器3传输的RAW格式图像数据,紧接着将不经过优化处理的图像数据传输给解码单元8进行解码,解码单元8直接接收到的是灰度图像数据(只取RAW图像数据的亮度信号),方便对图像数据进行二值化解码,而图像信号处理器7仅仅作为单纯的数据传输通道,则图像信号处理器7的寄存器14中没有残存的图像数据;或者所述光学信息采集器100不包括图像信号处理器7,图像传感器3采集的RAW图像数据通过DVP接口(Digital Video Port)或LVDS(Low Voltage Differential Signaling)接口等接口直接传输给所述解码单元8,则只有所述图像传感器3的存储区13中残存有一帧图像数据,则采集新的光学信息时,只需丢弃一帧图像数据即可,第二帧及以后的图像数据为新的光学信息的图像数据,解码第二帧及以后的图像数据,直至解码成功或解码超时;由于从第二帧开始解码光学信息,相对于前一种方式从第三帧开始解码光学信息,节省了一帧图像数据的处理时间和补光时间,可以提高解码速度和降低功耗。这些具体的实施例中,由于不通过图像信号处理器7对图像数据进行优化处理,理论上可以节省一定的图像数据处理时间。
具体来说,参考如图7中的时序图400,示出了外部触发的触发信号401、所述补光灯4的补光时序402、所述图像传感器3连续采集图像数据的图像数据采集时序403和解码单元8的解码时序404,其中:触发信号401在高电平触发图像传感器3采集图像数据和补光灯4补光,在低电平触发图像传感器3停止采集图像数据和补光灯4停止补光,所述补光灯4在补光时序402的高电平补光、在低电平关闭补光;所述光学信息采集器100不通过图像信号处理器7对图像数据进行优化,则前一次采集的图像数据,仅有一帧残存于所述图像传感器3的存储区13,光学信息采集器100丢弃第一帧图像数据,图7中虚线箭头代表第二帧图像数据输出至解码单元8解码,解码单元8在时间点g接收并解码第二帧图像数据,在时间点h成功解码第二帧图像数据,并将成功解码的信息反馈至所述中央处理器5,由于信号的延迟,在时间点i所述图像传感器3停止采集图像数据并控制所述补光灯4停止补光。需要说明的是,当环境光充足时,补光并不是必须的。根据图像数据采集时序403可知,此时图像传感器3已经采集了六帧图像数据,而第六帧图像数据将残存于所述图像传感器3的存储区13。当所述光学信息采集器100再一次经由触发采集新的光学信息,所述光学信息采集器100将会再次丢弃所述图像传感器3中残存的一帧图像数据,而从第二帧图像数据开始解码输出,避免解码错误。
上述方法中,每次采集新的光学信息时,根据需要,丢弃了一帧或两帧残存的图像数据,可以解决图像传感器3或图像信号处理器7中残存图像数据的问题;可选地,也可根据实际需要,丢弃两帧以上的残存数据。
上述方法存在一定的缺陷,即每次采集新的光学信息时,图像传感器3需要输出和丢弃一帧或多种残存的图像数据,解码单元8至少第二帧图像数据才开始解码,浪费时间;可想而知,如果每次采集新的光学信息,图像传感器3输出的第一帧图像数据即为有效图像数据(新的光学信息的图像数据),可以提高效率。
可想而知,如果在每次解码成功之后,清空所述图像传感器3的存储区13或图像信号处理器7的寄存器14,使得下一次采集新的光学信息时,不存在残存的图像数据信息,第一帧图像数据即为新的光学信息的图像数据,从而可以直接从第一帧图像数据开始解码,提升解码速度。这可以通过预置的算法来实现,即通过算法控制,在每次解码成功之后,继续清空图像传感器3的存储区13或图像信号处理器7的寄存器14。这种清空图像传感器3的存储区13或图像信号处理器7的寄存器14的预置的算法,通常需要由图像传感器3厂商或图像信号处理器7厂商(或集成了图像信号处理器7的中央处理器5厂商,下同)对算法进行预置。对于光学信息采集器100生产商来说,采购来的图像传感器3或图像信号处理器7,处理图像数据的运算逻辑通常已经被图像传感器3或图像信号处理器7厂商预定义,不容易进行更改,也即当厂商预定义,当图像传感器3或图像信号处理器7中存储的图像数据未解码时,最后一帧图像数据仍然存储在图像传感器3或图像信号处理器7中,那么,光学信息采集器100生产商是难以更改或直接消除图像信号处理器7中残存的图像数据。而且,由于不同的图像传感器3厂商生产的图像传感器3具有不同运算逻辑,不同的图像信号处理器7厂商生产的图像信号处理器7也有不同的运算逻辑,即使光学信息采集器100生产商可以经过调试,直接消除图像传感器3或图像信号处理器7中残存的图像数据,那么在更换图像传感器3或图像信号处理器7之后,又需要重新调试,工作量巨大,如果有一种可移植的方法,清空不同型号的图像传感器3或图像信号处理器7中的残存图像数据,将会节省工作量。
如图8所示的一种可选的实施例的框图中,通过绕过图像信号处理器7厂商预定义的图像数据处理流程,来消除残存图像数据。光学信息采集器100不通过图像信号处理器7对图像数据进行优化处理,图像传感器3采集的图像数据通过现有的MIPI接口输出至所述图像信号处理器7,并存储于光学信息采集器100生产商可以另行配置的缓冲器15中,所述缓冲器15集成于图像信号处理器7,当然,所述缓冲器15也可以独立于所述图像信号处理器7设置,解码单元8从所述缓冲器15中取出图像数据并解码。本实施例中,仍然通过现有的MIPI接口,将图像传感器3采集的图像数据传输至图像信号处理器7,再传输至解码单元8,因为通过现有的MIPI接口传输图像数据比较简单。一些实施例中,可以完全绕过图像信号处理器7,即图像传感器3采集的图像数据直接传输至解码单元8进行解码。由于不通过图像信号处理器7对图像数据进行优化处理,则仅仅图像传感器3的存储区13中残存一帧图像数据,可以设置特定的流程,消除图像传感器3中残存的一帧图像数据。
一种可选的实施例中,可以在原有的解码流程结束之后,比如在解码成功,所述中央处理器5发送结束指令控制所述图像传感器3结束采集图像数据之后,所述中央处理器5再次发送指令至所述图像传感器3,控制所述图像传感器3继续采集一帧或多帧图像数据,优选的是采集一帧图像数据,并控制所述图像传感器3继续将这一帧图像数据输出,则所述图像传感器3的存储区13中的图像数据被清空,使得下一次采集新的光学信息,图像传感器3输出的第一帧图像数据即为新的光学信息的图像数据。而图像传感器3最后输出的一帧图像数据,可以输入光学信息采集器100生产商可以配置的缓冲器15中,并且进一步被清除,最终消除图像传感器3中的残存图像数据。
具体来说,参考图9中的一种实施例的时序图500,示出了中央处理器5的触发信号501、所述补光灯4的补光时序502、所述图像传感器3连续采集图像数据的图像数据采集时序503和解码单元8的解码时序504,其中:触发信号501在高电平触发图像传感器3采集图像数据和补光灯4补光,在低电平触发图像传感器3停止采集图像数据和补光灯4停止补光,所述补光灯4在补光时序502的高电平补光、在低电平关闭补光;所述图像传感器3的图像数据采集时序503与补光时序502同步,所述图像传感器3在图像数据采集时序503的高电平曝光、在低电平将图像数据输出;图9中虚线箭头代表第一帧图像数据输出至解码单元8解码,所述解码单元8在时间点j接收第一帧图像数据,在时间点k成功解码第一帧图像数据,并将成功解码的信息反馈至所述中央处理器5,所述中央处理器5在时间点i发出触发信号控制所述图像传感器3停止采集图像数据并控制所述补光灯4停止补光。与前述实施例不同的是,中央处理器5将再次发出控制信号510,单独控制所述图像传感器3在高电平530继续采集一帧图像数据,并输出这一帧图像数据,则图像传感器3中没有残存的图像数据,下次一经触发采集新的光学信息,图像传感器3采集和输出的第一帧图像数据即为新的光学信息的图像数据,解码单元8可直接接受和解码第一帧图像数据。此时,补光灯4处于低电平520,没有进行补光,从而节省功耗。需要说明的是,当环境光充足时,整个过程中补光都不是必须的。
前述多个实施例中,通过数字流模式连续采集并解码图像数据,当解码单元8接受到的第一帧图像数据被解码成功,图像传感器3已经采集了多帧图像数据,比如时序图200中,图像传感器3总共采集了七帧图像数据,显然,采集第二至七帧图像数据造成了功耗的浪费。目前iData、Honeywell和Zebra等公司生产的光学信息采集器100,基本在前三帧图像数据之内即可成功解码,也即图像传感器3采集的前三帧图像数据中,至少有一帧可以被解码单元8就成功解码。由前述可知,当光学信息采集器100在第三帧图像数据成功解码时,图像传感器3已经采集了超过三帧图像数据,甚至已经采集了六、七帧图像数据,而采集第四至七帧图像数据同样需要图像传感器3工作,或者还需要补光灯4补光,由于第四至七帧图像数据不用于解码,那么采集第四至七帧图像数据造成了功耗的浪费。需要说明的是,一些实施例中,当环境光充足时,补光并不是必须的,比如日常生活中通过手机扫码,通常并不需要补光。
在一种优选的实施例中,所述光学信息采集器100可采用固定帧模式采集图像数据,不同于数字流模式的连续采集图像数据,在固定帧模式下,中央处理器5控制图像传感器3每次采集固定帧数的图像数据,所述解码单元8解码固定帧数的图像数据,当前一次采集的固定帧数的图像数据解码完成(有一帧图像数据解码成功或固定帧数的图像数据全部解码失败)或即将解码完成时,中央处理器5再判断是否需要又一次采集固定帧数的图像数据,依此类推,直至解码成功或解码超时。固定帧模式前后两次采集固定帧数的图像数据之间具有时间间隔,而不是连续的,留给中央处理器5做出判断的时间。
参见图10的一种实施例的时序图600,示出了中央处理器5的触发信号601、所述补光灯4的补光时序602、所述图像传感器3连续采集图像数据的图像数据采集时序603和解码单元8的解码时序604,其中:触发信号601在高电平触发图像传感器3采集图像数据和补光灯4补光,在低电平触发图像传感器3停止采集图像数据和补光灯4停止补光,所述补光灯4在补光时序602的高电平补光、在低电平关闭补光;所述图像传感器3的图像数据采集时序603与补光时序602同步,所述图像传感器3在图像数据采集时序603的高电平曝光、在低电平将图像数据输出;图10中从左至右的四个虚线箭头分别代表第一至第四帧图像数据分别输出至解码单元8解码,其中第一至第三帧图像数据均没有解码成功,第四帧图像数据成功解码。由图像数据采集时序603可以看到,前三帧固定帧数的图像数据采集时间与后三帧固定帧数的图像数据采集时间具有明显的时间间隔,供中央处理器5判断前三帧固定帧数的图像数据是否解码完成,从而判断是否需要控制图像传感器3继续采集后三帧固定帧数的图像数据。
其中,图像数据采集时序603示出了图像传感器3以固定帧数为三帧的固定帧模式采集图像数据,中央处理器5控制图像传感器3先采集固定帧数的三帧图像数据,并将这三帧图像数据传输至解码单元8,当这三帧图像数据没有解码成功时,控制图像传感器3又一次采集三帧图像数据,并再次传输至解码单元8进行解码,依此类推,直至解码成功(或解码超时)。从时序图600可以看出,当图像传感器3采集的第四帧图像数据成功解码时,如果此时图像传感器3并没有采集完固定帧数的三帧图像数据,图像传感器3将继续执行固定帧模式,采集完固定帧数的图像数据,也即继续采集第五和第六帧图像数据,并将固定帧数的第五和第六帧图像数据全部输出,然后停止图像数据采集,图像传感器3中将没有残存的图像数据。容易理解的是,与之相反,也可以在解码成功之后,控制图像传感器3停止图像数据采集,即使图像传感器3还没有采集完固定帧数的图像数据,这可以一定程度上节省功耗,但是会造成图像传感器3中残存图像数据;可以在下一次采集新的光学信息时,丢弃图像传感器3中残存的图像数据。
前述实施例中,由于对前三帧图像数据完成解码(解码成功或没有解码成功)之后,再控制图像传感器3采集三帧图像数据,前后三帧图像数据采集之间存在时间间隔,如果前三帧固定帧数的图像数据没有解码成功,则采集后三帧固定帧数的图像数据会存在明显的延迟。作为改进,可选地,可以在前三帧图像数据的第二帧图像数据没有成功解码或第三帧图像数据输入解码单元8进行解码时,即控制图像传感器3再采集三帧图像数据,达到解码速度与功耗的平衡;开始采集后三帧图像数据的时间,可以根据实际需求来确定,以使得前后三帧图像数据的采集之间没有明显延迟。
前述实施例中,固定帧模式的固定帧数为三帧,即图像传感器3每次采集三帧图像数据;一些实施例中,可以根据具体的光学信息采集器100的性能来确定固定帧数,比如若光学信息采集器100可以做到在前二帧图像数据之内或第一帧图像数据解码成功,则固定帧模式的固定帧数可以优先设为二帧或一帧,避免后续多采集的图像数据造成功耗浪费,每次在这一帧图像数据解码完成之后,且没有解码成功时,控制图像传感器3采集下一帧图像数据;当然固定帧数也可设置为二帧或四帧或五帧或更多帧数。总之,在结合前面的实施例可知,当前的光学信息采集器100,大多在前三帧图像数据即可解码成功,且固定帧数需要小于等于解码单元8解码一帧图像数据的超时时间,在现有的技术条件下,超时时间通常设置为100ms,也即解码单元8解码一帧图像数据的时间达到100ms且没有解码成功,则停止解码这一帧图像数据,转而解码下一帧图像数据,因此,固定帧模式下的固定帧数优选为不超过五帧(20ms*5=100ms),且进一步优选为三至五帧,使得第一次固定帧模式采集的固定帧数的图像数据即可解码成功,且不会采集过多的图像数据,相对于现有的数字流模式具有功耗优势。可想而知,当具体的光学信息采集器100在数字流模式下,需要五帧以上的图像数据才可以解码成功,则其固定帧数可以设置为五帧或多于五帧。
可以采用结合固定帧模式与数字流模式的优点的混合模式,适应复杂的应用场景,达到功耗与解码速度的平衡。对于一些难以识别的光学信息,如高密度二维码、DPM(Direct Part Mark)或复杂的文字符合等,可以先采用固定帧模式采集和解码图像数据,当没有成功解码,接着采用数字流模式连续采集图像数据进行解码;可想而知,这种混合模式同样也可以用于简单光学信息的识读。
容易想到的是,混合模式可以有多种排列组合方式。
比如所述摄像头1可以配置为先采用预设次数的固定帧模式采集图像数据,接着采用数字流模式采集图像数据;比如先采用一次固定帧模式采集固定帧数的图像数据,再采用数字流模式连续采集图像数据,参考图11中的时序图700所示,示出了中央处理器5的触发信号701、所述补光灯4的补光时序702、所述图像传感器3连续采集图像数据的图像数据采集时序703和解码单元8的解码时序704,光学信息采集器100先采用固定帧数为三帧的固定帧模式采集三帧图像数据,当没有解码成功,则采用数字流模式连续采集和解码图像数据,解码单元8在数字流模式下采集的第一帧图像数据成功解码图像数据。
其它实施例中,可以先采用多次固定帧模式,当没有解码成功,接着采用数字流模式,比如可以先采用两次固定帧模式,接着采用数字流模式,即先采集固定帧数的三帧图像数据进行解码,当没有解码成功,继续采集固定帧数的三帧图像数据进行解码,且当还没有解码成功,再采用数字流模式进行解码;可想而知,可以先采用三次或三次以上的固定帧模式进行解码,当没有解码成功,接着采用数字流模式进行解码。
由于前面已经描述了,当采用数字流模式进行解码,解码成功时图像传感器3中会残存图像数据,为解决这个问题,混合模式可以是先采用固定帧模式,接着采用数字流模式,最后以固定帧模式结尾。
具体来说,参考图12中一种实施例的时序图800,示出了中央处理器5的触发信号801、所述补光灯4的补光时序802、所述图像传感器3连续采集图像数据的图像数据采集时序803和解码单元8的解码时序804,光学信息采集器100先采用固定帧数为三帧的固定帧模式采集三帧图像数据,当没有解码成功,则采用数字流模式连续采集和解码图像数据,解码单元8在数字流模式下采集的第一帧图像数据成功解码图像数据,且当解码单元8在第四帧图像数据解码成功时,控制所述图像传感器3停止图像数据采集和补光灯4停止补光。与前述实施例不同的是,中央处理器5将再次发出控制信号810,单独控制所述图像传感器3在高电平830继续采集一帧图像数据,并输出这一帧图像数据,则图像传感器3中没有残存的图像数据,而且,由于绕过图像信号处理器7,图像信号处理器7中也没有残存的图像数据,下次一经触发采集新的光学信息,图像传感器3采集和输出的第一帧图像数据即为新的光学信息的图像数据,解码单元8可直接接受和解码第一帧图像数据。此时,补光灯4处于低电平820,没有进行补光,从而节省功耗。需要说明的是,当环境光充足时,整个过程中补光都不是必须的。
可想而知,混合模式也可以先采用数字流模式进行图像数据采集和解码,当解码成功之后,接着采用固定帧模式,控制图像传感器3继续采集固定帧数的图像数据,并控制图像传感器3将固定帧数的图像数据全部输出,则图像传感器3中没有残存的图像数据;前述实施例中已经描述了一种特殊情形,即在采用数字流模式解码成功之后,图像传感器3继续采集一帧图像数据。
可想而知,当所述光学信息采集器100采用混合模式采集图像数据,所述光学信息采集器100可以通过图像信号处理器7对图像数据进行优化处理,而为了消除图像信号处理器7中的残存图像数据,则可以进一步采用前述的丢弃特定帧数N的残存图像数据的方法。具体丢弃的特定帧数N的数量,可以根据残存的图像数据来定,比如当图像传感器3和图像信号处理器7中均存储了残存的图像数据,则每次重新采集图像数据时,需要丢弃两帧图像数据;而当图像传感器3中没有残存的图像数据,而图像信号处理器7中残存一帧图像数据,则每次重新采集图像数据时,只需丢弃图像信号处理器7中残存的一帧图像数据。或者,所述光学信息采集器100可以不通过图像信号处理器7对图像数据进行处理,则当图像传感器3中存在一帧残存的图像数据,每次采集新的图像数据时,只需丢弃这一帧残存的图像数据;而当混合模式下,采用固定帧模式结尾时,图像传感器3中也没有残存图像数据,则每次重新采集图像数据时,无需丢弃残存的图像数据。
本申请的光学信息采集器及其方法具有以下有益效果:
1.当图像传感器3经由触发采集图像数据,中央处理器5发出指令丢弃特定帧数的N帧图像数据,其中特定帧数的N帧图像数据为上一次经由触发采集并残存的图像数据,避免残存的图像数据被解码输出,造成解码错误,同时无需解码残存的图像数据,节省功耗。
2.图像传感器3以固定帧模式每次采集和输出固定帧数的图像数据,相对于现有的通过数字流模式连续采集和输出图像数据而言,可以节省功耗,避免通过数字流模式连续采集图像数据,在解码成功时,后续已经连续采集的多帧图像数据不用于解码,造成功耗的浪费。
3.图像传感器3通过数字流模式采集图像数据,并且不通过图像信号处理器7对图像数据优化处理,避免图像信号处理器7中残存图像数据;且在解码成功或解码超时,则控制所述图像传感器3停止以数字流模式连续采集图像数据,并控制图像传感器3继续采集和输出固定帧数的图像数据,从而也避免了图像传感器3中残存图像数据,避免下一次采集光学信息时出现解码错误,提高效率。
以上详细说明仅为本申请之较佳实施例的说明,非因此局限本申请之专利范围,所以,凡运用本创作说明书及图示内容所为之等效技术变化,均包含于本创作之专利范围内。

Claims (15)

  1. An optical information collector, characterized by comprising:
    an image sensor for collecting image data of optical information;
    a memory preset with one or more decoding algorithms;
    a decoding unit for decoding the image data according to a preset decoding algorithm;
    a central processor which, upon a trigger, controls the image sensor to continuously collect image data in a digital stream mode and controls the decoding unit to decode the image data, wherein, once triggered, the central processor issues an instruction to discard a specific number of frames of image data, the specific number of frames of image data being the image data collected upon the previous trigger and remaining in the optical information collector.
  2. The optical information collector according to claim 1, wherein the specific number of frames of image data includes image data remaining in a storage area of the image sensor.
  3. The optical information collector according to claim 1, comprising an image signal processor for receiving the image data collected by the image sensor and transmitting the image data to the decoding unit, wherein the specific number of frames of image data includes image data remaining in the image signal processor.
  4. The optical information collector according to claim 1, wherein discarding the specific number of frames of image data includes: the decoding unit not receiving the specific number of frames of image data, or the decoding unit not decoding the specific number of frames of image data, or the decoding unit not outputting or not displaying the decoded information of the specific number of frames of image data.
  5. The optical information collector according to claim 1, wherein the decoding unit starts decoding from the (N+1)-th frame of image data.
  6. An optical information collector, characterized by comprising:
    an image sensor for collecting image data of optical information;
    a memory preset with one or more decoding algorithms;
    a decoding unit for receiving and decoding image data;
    a central processor for controlling the image sensor to continuously collect image data in a digital stream mode and controlling the decoding unit to decode the image data in sequence, wherein once the decoding unit decodes successfully or decoding times out, the central processor controls the image sensor to stop continuously collecting image data in the digital stream mode and controls the image sensor to continue to collect and output a fixed number of frames of image data in a fixed frame mode.
  7. The optical information collector according to claim 6, wherein the fixed frame mode includes: when the decoding unit decodes successfully or decoding times out, the image sensor continues until the fixed number of frames of image data has been collected and outputs the fixed number of frames of image data.
  8. The optical information collector according to claim 6, wherein the optical information collector has no image signal processor or does not optimize the image data through an image signal processor.
  9. The optical information collector according to claim 6, wherein the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW format image data and performs decoding based on the grayscale image data.
  10. The optical information collector according to claim 6, wherein, before continuously collecting image data in the digital stream mode, the image sensor collects and outputs a fixed number of frames of image data in a fixed frame mode.
  11. An optical information collection method, characterized by comprising:
    a central processor controlling an image sensor to continuously collect and output image data in a digital stream mode;
    a decoding unit receiving and decoding the image data, wherein once the decoding unit decodes successfully, the image sensor is controlled to stop collecting image data in the digital stream mode;
    controlling the image sensor to continue to collect and output a fixed number of frames of image data.
  12. The optical information collection method according to claim 11, wherein the fixed frame mode includes: when the decoding unit decodes successfully or decoding times out, the image sensor continues until the fixed number of frames of image data has been collected and outputs the fixed number of frames of image data.
  13. The optical information collection method according to claim 11, wherein the optical information collector has no image signal processor or does not optimize the image data through an image signal processor.
  14. The optical information collection method according to claim 11, wherein the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW format image data and performs decoding based on the grayscale image data.
  15. The optical information collection method according to claim 11, wherein, before continuously collecting image data in the digital stream mode, the image sensor collects and outputs a fixed number of frames of image data in a fixed frame mode.
PCT/CN2023/109637 2022-08-11 2023-07-27 光学信息采集器及其方法 WO2024032379A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2023305023A AU2023305023A1 (en) 2022-08-11 2023-07-27 Optical information collector and method therefor
EP23844164.6A EP4365772A1 (en) 2022-08-11 2023-07-27 Optical information collector and method therefor
US18/413,108 US20240152716A1 (en) 2022-08-11 2024-01-16 Optical information collector and method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210959551.5A CN115034247B (zh) 2022-08-11 2022-08-11 光学信息采集器及方法
CN202210959551.5 2022-08-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/413,108 Continuation US20240152716A1 (en) 2022-08-11 2024-01-16 Optical information collector and method thereof

Publications (1)

Publication Number Publication Date
WO2024032379A1 true WO2024032379A1 (zh) 2024-02-15

Family

ID=83129928

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/109637 WO2024032379A1 (zh) 2022-08-11 2023-07-27 光学信息采集器及其方法

Country Status (5)

Country Link
US (1) US20240152716A1 (zh)
EP (1) EP4365772A1 (zh)
CN (1) CN115034247B (zh)
AU (1) AU2023305023A1 (zh)
WO (1) WO2024032379A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115034247B (zh) * 2022-08-11 2022-11-08 无锡盈达聚力科技有限公司 光学信息采集器及方法

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4949391A (en) * 1986-09-26 1990-08-14 Everex Ti Corporation Adaptive image acquisition system
JPH11203392A (ja) * 1998-01-14 1999-07-30 Denso Corp 光学情報読取装置
CN104992207A (zh) * 2015-06-16 2015-10-21 无锡久源软件科技有限公司 一种手机二维条码编解码方法
CN106470321A (zh) * 2015-08-21 2017-03-01 比亚迪股份有限公司 图像传感器及图像传感器的读取方法
CN106934318A (zh) * 2017-03-13 2017-07-07 东软集团股份有限公司 扫码处理方法、装置及系统
CN109740393A (zh) * 2018-12-06 2019-05-10 无锡盈达聚力科技有限公司 条码扫描系统及方法
CN113037989A (zh) * 2019-12-09 2021-06-25 华为技术有限公司 一种图像传感器、相机模组及控制方法
WO2022141333A1 (zh) * 2020-12-31 2022-07-07 华为技术有限公司 一种图像处理方法以及装置
CN115034247A (zh) * 2022-08-11 2022-09-09 无锡盈达聚力科技有限公司 光学信息采集器及方法
CN115396572A (zh) * 2022-08-11 2022-11-25 无锡盈达聚力科技有限公司 光学信息采集器及方法
CN115426442A (zh) * 2022-08-11 2022-12-02 无锡盈达聚力科技有限公司 光学信息采集器及方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9367725B2 (en) * 2011-11-03 2016-06-14 Cognex Corporation Method and apparatus for performing different decoding algorithms in different locations
CN103745186B (zh) * 2013-12-30 2017-11-17 宇龙计算机通信科技(深圳)有限公司 二维码信息的处理方法及通信终端

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4949391A (en) * 1986-09-26 1990-08-14 Everex Ti Corporation Adaptive image acquisition system
JPH11203392A (ja) * 1998-01-14 1999-07-30 Denso Corp 光学情報読取装置
CN104992207A (zh) * 2015-06-16 2015-10-21 无锡久源软件科技有限公司 一种手机二维条码编解码方法
CN106470321A (zh) * 2015-08-21 2017-03-01 比亚迪股份有限公司 图像传感器及图像传感器的读取方法
CN106934318A (zh) * 2017-03-13 2017-07-07 东软集团股份有限公司 扫码处理方法、装置及系统
CN109740393A (zh) * 2018-12-06 2019-05-10 无锡盈达聚力科技有限公司 条码扫描系统及方法
CN113037989A (zh) * 2019-12-09 2021-06-25 华为技术有限公司 一种图像传感器、相机模组及控制方法
WO2022141333A1 (zh) * 2020-12-31 2022-07-07 华为技术有限公司 一种图像处理方法以及装置
CN115034247A (zh) * 2022-08-11 2022-09-09 无锡盈达聚力科技有限公司 光学信息采集器及方法
CN115396572A (zh) * 2022-08-11 2022-11-25 无锡盈达聚力科技有限公司 光学信息采集器及方法
CN115426442A (zh) * 2022-08-11 2022-12-02 无锡盈达聚力科技有限公司 光学信息采集器及方法

Also Published As

Publication number Publication date
CN115034247A (zh) 2022-09-09
AU2023305023A1 (en) 2024-02-29
CN115034247B (zh) 2022-11-08
EP4365772A1 (en) 2024-05-08
US20240152716A1 (en) 2024-05-09

Similar Documents

Publication Publication Date Title
WO2024032379A1 (zh) 光学信息采集器及其方法
CN109740393A (zh) 条码扫描系统及方法
US7268924B2 (en) Optical reader having reduced parameter determination delay
CN113034341B (zh) 一种用于Cameralink高速工业相机的数据采集处理电路
CN115396572B (zh) 光学信息采集器及方法
JP2008219696A (ja) カメラ装置、カメラ装置制御プログラム及びカメラ装置制御方法
CN115426442B (zh) 光学信息采集器及方法
CN113542592B (zh) 控制瞄准光源的扫描系统及方法
JP2005142772A (ja) フレームグラバ
CN110620885B (zh) 一种红外微光图像融合系统、方法及电子设备
CN109460686B (zh) 一种用于瞄准装置瞄准光消隐的方法及系统
CN100440941C (zh) 一种图像处理方法和摄像设备
CN111988541A (zh) 控制瞄准光源的扫描系统及方法
CN111988542A (zh) 控制瞄准光源的扫描系统及方法
US8218023B2 (en) Method and apparatus for processing continuous image data captured by digital image processor
CN101393604B (zh) 一种cmos光学传感器图像采样控制系统及方法
US20060274152A1 (en) Method and apparatus for determining the status of frame data transmission from an imaging device
CN115146664B (zh) 图像采集方法及装置
KR20090090851A (ko) 비접촉식 촬영이 가능한 촬상장치 및 비접촉식 촬영방법
KR100791488B1 (ko) 멀티미디어 프로세서를 이용한 광원 제어 방법 및 장치
CN206442421U (zh) 一种用于光标签采集的手机结构
CN203399224U (zh) 视频监控装置
CN114422671A (zh) 一种基于fpga的双光谱相机
CN202748532U (zh) 具有夜视增强功能的激光照明一体化夜视仪
CN109639979B (zh) 一种基于usb的可穿戴设备传输方法

Legal Events

Date Code Title Description
WWE  Wipo information: entry into national phase
     Ref document number: 2023844164; Country of ref document: EP
     Ref document number: 23844164.6; Country of ref document: EP
ENP  Entry into the national phase
     Ref document number: 2023844164; Country of ref document: EP; Effective date: 20240131
ENP  Entry into the national phase
     Ref document number: 2023305023; Country of ref document: AU; Date of ref document: 20230727; Kind code of ref document: A