CN115426442A - Optical information collector and method - Google Patents

Optical information collector and method

Info

Publication number
CN115426442A
Authority
CN
China
Prior art keywords
image data
image
fixed frame
optical information
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210963590.2A
Other languages
Chinese (zh)
Other versions
CN115426442B
Inventor
王冬生
魏江涛
张颂来
陈辰
周小芹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Idata Technology Co ltd
Original Assignee
Wuxi Idata Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Idata Technology Co ltd
Priority to CN202210963590.2A
Publication of CN115426442A
Application granted
Publication of CN115426442B
Legal status: Active (Current)
Anticipated expiration

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/42 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N 19/44 - Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides an optical information collector and method. The optical information collector comprises an image sensor for collecting image data of optical information; a decoding unit for receiving and decoding the image data; and a central processing unit for controlling the image sensor to collect image data and controlling the decoding unit to decode it. Upon triggering, the central processing unit controls the image sensor to collect and output a fixed number of frames of image data in a fixed frame mode, controls the decoding unit to decode the image data, and stops decoding the remaining frames of the fixed-frame batch as soon as one frame is decoded successfully. Compared with the prior art, in which image data is continuously collected and output in a digital stream mode, this saves power: it avoids the situation where, once one frame has been decoded successfully, the multiple frames already collected after it are never used for decoding and the power spent collecting them is wasted.

Description

Optical information collector and method
Technical Field
The present application relates to the field of optical information acquisition, and more particularly, to an optical information acquisition device and method.
Background
Optical information collection based on image scanning covers the collection of various kinds of machine-readable optical information. Common machine-readable optical information includes one-dimensional codes, two-dimensional codes, OCR (optical character recognition) text and graphics, ultraviolet anti-counterfeiting codes, infrared anti-counterfeiting codes, and the like.
In general, supplementary lighting (fill light) is required when collecting optical information in order to acquire clear image data. In the traditional approach, the fill light stays on continuously while optical information is collected, so the device heats up severely and power consumption is high.
The principal inventor of the present application previously developed a low-power light supplement method, disclosed in Chinese patent CN201810098421.0, the entire contents of which are incorporated herein by reference. In that method the light supplement unit is turned on during the exposure time of a frame period and turned off during the sampling time of the frame period; power consumption is reduced by periodically switching the light supplement unit on and off.
The frame period of the optical imaging unit is usually within 50 ms, and an excellent optical imaging unit can reach a frame period within 20 ms, which means the optical imaging unit can acquire tens of frames of image data per second. The optical imaging unit continuously acquires image data as a digital stream; the optical information is generally recognized successfully from one of those frames, and once recognition succeeds, the unit no longer tries to recognize the subsequently acquired frames and stops acquiring further image data.
However, because the optical imaging unit acquires image data continuously as a digital stream, by the time one frame has been recognized successfully, several further frames have already been acquired. Those subsequently acquired frames are never used for optical information recognition, so the power spent acquiring them is wasted.
The principal inventor of the present application also developed a low-power barcode scanning system and method, disclosed in Chinese patent CN201811485945.1, the entire contents of which are incorporated herein by reference. There, the image sensor is in a standby state before being triggered, and returns to standby after decoding succeeds or times out. This differs from the mobile phone industry: after a phone opens its camera application, the camera collects and previews image data at high power, but since the phone does not use the camera for long stretches, the power cost can be ignored. The barcode scanning industry is different: the image sensor used for barcode scanning must be used for long periods, and if it stayed in the working state continuously, battery life would drop noticeably.
However, this brings another problem. The inventors of the present application found that when the image sensor continuously acquires image data as a digital stream and is then controlled to interrupt acquisition abruptly after decoding succeeds or times out (the device is still on and the image sensor is still powered, but no longer outputs or previews image data, which saves power), image data may already have been acquired inside the image sensor, as mentioned above. Although this is a probabilistic event, it can have adverse effects: the acquired image data is not output, but is stored (remains) in a storage area of the image sensor. Some image sensors have a separate buffer, in which case the image data remains in that buffer; for image sensors without a separate buffer, the electrical signals of the image data remain in the PN junctions of the sensor's pixel units. The next time the image sensor acquires image data, it outputs this residual image data first.
The inventors have also found that on certain platforms, for example those based on Qualcomm chipsets, image data may remain not only in the image sensor but also in other memory, such as the buffers of the image signal processor.
New technologies need to be developed to solve the above problems.
Disclosure of Invention
The purpose of the present application is to provide an optical information collector and method that reduce power consumption.
To achieve this purpose, the present application adopts the following technical solutions:
the application provides an optical information collector, its characterized in that includes: an image sensor to collect image data of the optical information; a decoding unit for decoding the image data according to a preset decoding algorithm; and the central processing unit controls the image sensor to acquire image data through triggering and controls the decoding unit to decode the image data, wherein once triggered, the central processing unit sends an instruction to discard N frames of image data with a specific frame number, wherein the N frames of image data with the specific frame number are image data which are acquired through triggering last time and remain in the optical information acquisition unit.
Optionally, the N frames of image data of the specific frame number include image data remaining in a storage area of the image sensor.
Optionally, the collector includes an image signal processor configured to receive the image data collected by the image sensor and transmit it to the decoding unit, and the N frames of image data of the specific frame number include image data remaining in the image signal processor.
Optionally, discarding the N frames of image data of the specific frame number includes: the decoding unit does not receive the N frames; or the decoding unit does not decode the N frames; or the decoding unit does not output or display the decoded information of the N frames.
Optionally, the decoding unit starts decoding from the (N+1)-th frame of image data.
The present application provides an optical information collection method, characterized by comprising: upon triggering, a central processing unit controls an image sensor to collect and output image data; the central processing unit receives the image data and sends an instruction to discard N frames of image data of a specific frame number, the N frames being image data that was acquired in response to the previous trigger and remains; and a decoding unit decodes the image data.
Optionally, the N frames of image data of the specific frame number include image data remaining in a storage area of the image sensor.
Optionally, the image data collected by the image sensor is received by an image signal processor, which further transmits it to the decoding unit; the N frames of image data of the specific frame number include image data remaining in the image signal processor.
Optionally, discarding the N frames of image data of the specific frame number includes: the decoding unit does not receive the N frames; or the decoding unit does not decode the N frames; or the decoding unit does not output or display the decoded information of the N frames.
Optionally, the decoding unit starts decoding from the (N+1)-th frame of image data.
The present application provides an optical information collector, characterized by comprising: an image sensor for collecting image data of optical information; a decoding unit for receiving and decoding the image data; and a central processing unit for controlling the image sensor to collect image data and controlling the decoding unit to decode it, wherein, upon triggering, the central processing unit controls the image sensor to collect and output a fixed number of frames of image data in a fixed frame mode, controls the decoding unit to decode the image data, and stops decoding the remaining frames of the fixed-frame batch once one frame has been decoded successfully.
Optionally, the fixed frame mode includes: when the decoding unit has decoded one frame successfully but the image sensor has not yet finished acquiring the fixed number of frames, the image sensor continues acquiring until the fixed number of frames is complete and outputs all of them.
Optionally, the fixed frame mode includes: the central processing unit controls the decoding unit to receive and decode the fixed number of frames in sequence, and, when or before the last frame of the batch fails to decode, controls the image sensor to acquire another batch of the fixed number of frames.
Optionally, the image data collected by the image sensor is not optimized by an image signal processor, or the collector does not include an image signal processor.
Optionally, the image sensor is configured to acquire image data in the following order: first acquiring image data in the fixed frame mode a preset number of times, and then continuously acquiring image data in a digital stream mode.
The present application provides an optical information collection method, characterized by comprising: upon triggering, a central processing unit controls an image sensor to collect and output a fixed number of frames of image data in a fixed frame mode; a decoding unit receives and decodes the image data, and stops decoding the remaining frames of the fixed-frame batch once one frame has been decoded successfully.
Optionally, when the decoding unit has decoded one frame successfully but the image sensor has not yet finished acquiring the fixed number of frames, the image sensor continues acquiring until the fixed number of frames is complete and outputs all of them.
Optionally, the central processing unit controls the decoding unit to receive and decode the fixed number of frames in sequence, and, when or before the last frame of the batch fails to decode, controls the image sensor to acquire another batch of the fixed number of frames.
Optionally, the image data collected by the image sensor is not optimized by an image signal processor, or no image signal processor is used.
Optionally, the image sensor is configured to acquire image data in the following order: first acquiring image data in the fixed frame mode a preset number of times, and then continuously acquiring image data in a digital stream mode.
The present application provides an optical information collector, characterized by comprising: an image sensor for collecting image data of optical information; a memory in which one or more decoding algorithms are preset; a decoding unit for receiving and decoding the image data; and a central processing unit for controlling the image sensor to continuously acquire image data in a digital stream mode and controlling the decoding unit to decode the image data in sequence, wherein once the decoding unit succeeds in decoding or decoding times out, the central processing unit controls the image sensor to stop continuously acquiring image data in the digital stream mode and controls it to continue to acquire and output a fixed number of frames of image data.
Optionally, the optical information collector has no image signal processor, or does not optimize the image data with an image signal processor.
Optionally, the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data from the RAW image data and decodes on the basis of the grayscale image data.
Optionally, the fixed number of frames is one frame or two frames.
Optionally, the image data collected by the image sensor is transmitted directly to the decoding unit for decoding.
The present application provides an optical information collection method, characterized by comprising: upon triggering, a central processing unit controls an image sensor to continuously collect and output image data in a digital stream mode; a decoding unit receives and decodes the image data, and once the decoding unit decodes successfully, the image sensor is controlled to stop collecting image data in the digital stream mode and is controlled to continue to acquire and output a fixed number of frames of image data.
Optionally, the optical information collector has no image signal processor, or does not optimize the image data with an image signal processor.
Optionally, the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data from the RAW image data and decodes on the basis of the grayscale image data.
Optionally, the fixed number of frames is one frame or two frames.
Optionally, the image data collected by the image sensor is transmitted directly to the decoding unit for decoding.
Drawings
Fig. 1 is a simplified block diagram of an optical information collector according to an embodiment of the present application;
fig. 2 is a schematic diagram of an optical information collector according to an embodiment of the present application;
FIG. 3 is a perspective view of the optical information collector in FIG. 2;
FIG. 4 is a high-level block diagram of an optical information collector according to one embodiment of the present application;
FIG. 5 is a timing diagram illustrating an optical information collector collecting optical information in a digital stream mode according to an embodiment of the present application;
fig. 6 is a timing diagram illustrating an optical information collector collecting optical information according to an embodiment of the present application;
fig. 7 is another timing diagram illustrating an optical information collector collecting optical information according to an embodiment of the present application;
FIG. 8 is a high level block diagram of an optical information collector according to another embodiment of the present application;
fig. 9 is a timing diagram illustrating an optical information collector collecting optical information according to another embodiment of the present application;
FIG. 10 is a timing diagram illustrating an optical information collector collecting optical information in a fixed frame mode according to another embodiment of the present application;
fig. 11 is a timing diagram illustrating an optical information collector collecting optical information in a hybrid mode according to another embodiment of the present application;
fig. 12 is a timing diagram illustrating an optical information collector collecting optical information in another hybrid mode according to another embodiment of the present application.
In the detailed description of the embodiments, reference is made to the accompanying drawings, in which:
an optical information collector 100; a camera 1; an optical system 2; an image sensor 3; a light supplement lamp 4; a central processing unit 5; a memory 6; an image signal processor 7; a decoding unit 8; a housing 9; a scanning window 10; a display screen 11; a button 12; a storage area 13; a register 14; a buffer 15.
Detailed Description
For a better understanding of the objects, structure, features, and functions of the present application, reference should be made to the drawings and detailed description that follow.
Referring to fig. 1, a simplified block diagram of an implementation of an optical information collector 100 according to an embodiment is shown. As described in further detail below, the optical information collector 100 may be used to collect one or more optical information, such as a one-dimensional code, a two-dimensional code, an OCR image and text, an ultraviolet anti-counterfeit code, an infrared anti-counterfeit code, and the like.
The optical information collector 100 may include at least one camera 1. The camera 1 may include an optical system 2 (lens) for capturing light combined with an image sensor 3 (sensor) for photoelectrically converting the light captured by the optical system 2. The optical system 2 may include one or more mirrors, prisms, lenses, or a combination thereof. The camera 1 may also include more than one image sensor 3: one image sensor 3 may correspond to one optical system 2 (or one set of optical systems), several image sensors 3 may share one optical system 2, or several optical systems 2 may share one image sensor 3. The image sensor 3 may be a CCD, CMOS, or other type of image sensor; it converts an optical signal into an electrical signal and outputs a digital signal of image data.
The optical information collector 100 may include one or more light supplement lamps 4, which illuminate the optical information while the camera 1 collects image data. Of course, under suitable ambient light the light supplement lamp 4 need not be used, or the optical information collector 100 may have no light supplement lamp 4 at all. The light supplement lamp 4 can supplement light in various ways: for example, it may stay on continuously while the camera 1 collects optical information; or it may supplement light in synchronization with the exposure time of the image sensor 3 of the camera 1 (Chinese patent CN201810098421.0 discloses such a scheme, the entire contents of which are incorporated herein by reference); or it may be a pulsed fill light whose pulse time overlaps part of the exposure time of the image sensor 3.
The optical information collector 100 may further include a central processing unit 5 for executing various instructions.
The optical information collector 100 may further include a separate or integrated memory 6, in which one or more decoding algorithms are preset as needed; the memory 6 may also store other programs or instructions. The memory 6 may include one or more non-transitory storage media, such as volatile and/or non-volatile memory, which may be fixed or removable. In particular, the memory 6 may be configured to store information, data, applications, instructions, or the like that enable the processing module to perform various functions in accordance with example embodiments of the present invention. For example, the memory 6 may be configured to buffer input data for processing by the central processing unit 5. Additionally or alternatively, the memory 6 may be configured to store instructions for execution by the central processing unit 5. The memory 6 may serve as main memory and be included in, for example, a volatile storage device such as RAM, which retains its contents only while powered, and/or in a non-volatile storage device such as ROM, EPROM, EEPROM, or FLASH, which retains its contents independently of the power state of the processing module. The memory 6 may also be included in a secondary storage device that stores large amounts of data, such as external disk storage. In some embodiments, the disk storage may communicate with the central processing unit 5 via a data bus or other routing component using input/output components. The secondary storage may comprise a hard disk, a compact disc, a DVD, a memory card, or any other type of mass storage known to those skilled in the art. The memory 6 may store one or more of the optical information acquisition, transmission, processing, and decoding processes or methods described below.
The optical information collector 100 may further include an image signal processor 7 (ISP), which performs optimization processing on the image data collected by the camera 1; the optimization may include one or more of linear correction, noise removal, dead pixel repair, color interpolation, white balance correction, exposure correction, and the like, to improve the quality of the image data. For optical information that does not require color recognition, some or all of these optimizations, such as color interpolation, are unnecessary. The image signal processor 7 may process one frame of image data at a time with a single core and a single thread, or multiple frames simultaneously with multiple cores and multiple threads. Alternatively, the optical information collector 100 may have no image signal processor 7 at all, or may simply not optimize the image data with it.
The optical information collector 100 may further include a decoding unit 8, where the decoding unit 8 is configured to decode the image data collected by the camera 1 according to a preset decoding algorithm, and further identify optical information, such as identifying encoded information of a one-dimensional code or a two-dimensional code, or identifying an OCR image and text, or identifying encoded information of various ultraviolet/infrared anti-counterfeiting codes. The decoding unit 8 may decode one frame of image data at a time by a single core and a single thread, or the decoding unit 8 may decode multiple frames of image data simultaneously by multiple cores and multiple threads.
Alternatively, part or all of the functional modules of the image signal processor 7 may be integrated into the central processing unit 5; for example, Chinese patent CN201811115589.4 discloses a central processing unit 5 with an integrated image signal processor 7, the entire contents of which are incorporated herein by reference. Part or all of the functional modules of the image signal processor 7 may also be integrated into the image sensor 3; optionally, the decoding unit 8 may be integrated into the central processing unit 5; optionally, the memory 6 may be integrated into the central processing unit 5. In the following embodiments, when the image data is optimized by the image signal processor 7, the image signal processor 7 and the decoding unit 8 are preferably integrated into the central processing unit 5 to save cost; of course, they need not be integrated into the central processing unit 5.
Fig. 2 and 3 are schematic diagrams illustrating a handheld terminal as a specific embodiment of the optical information collector 100. The handheld terminal includes a housing 9, a display screen 11, and a button 12. The front end of the housing 9 is provided with a scanning window 10, and the camera 1 is housed in the housing 9 and can collect optical information through the scanning window 10. Alternatively, the optical information collector 100 may have no display screen 11 of its own and instead output information to a separate display screen 11. Optionally, the optical information collector 100 may be a fixed terminal, a desktop terminal, or another type of terminal, and it may also be integrated into other devices as a part of them.
The central processing unit 5 issues a trigger instruction in response to an external trigger, which may be generated by a user pressing a specific button 12, touching a specific area of the display screen 11, or operating the optical information collector 100 with a specific gesture. Once externally triggered, the central processing unit 5 sends a trigger instruction according to a preset algorithm and triggers the image sensor 3 to acquire image data.
The image signal processor 7 can also optimize the image data collected by the image sensor 3 before it is output to the decoding unit 8 for decoding. Referring to the block diagram of fig. 4, in a specific embodiment the optical information collector 100 collects a barcode: the user presses the button 12, which triggers the light supplement lamp 4 to supplement light and the image sensor 3 to collect image data; the image signal processor 7 sequentially receives the image data collected by the image sensor 3 through a MIPI (Mobile Industry Processor Interface) interface and optimizes it; the decoding unit 8 decodes the optimized image data sent by the image signal processor 7; when one frame of image data is decoded successfully, the decoding unit 8 stops decoding and informs the central processing unit 5, and the central processing unit 5 sends an instruction to control the image sensor 3 to stop collecting image data.
The image sensor 3 may continuously acquire image data in a digital stream mode, i.e., according to a preset algorithm it acquires image data continuously for a preset time; the decoding unit 8 may decode the continuously acquired frames sequentially in a single thread or simultaneously in multiple threads, and when decoding succeeds or times out, the image sensor 3 is controlled to stop acquiring image data and the decoding unit 8 is controlled to stop decoding. For example, with a preset time of five seconds, the image sensor 3 acquires image data continuously for up to five seconds; if none of the frames acquired within those five seconds decodes successfully, decoding times out, whereas if one frame decodes successfully before the five seconds elapse, the central processing unit 5 controls the image sensor 3 to stop acquiring image data and controls the decoding unit 8 to stop decoding.
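As a rough illustration of this control flow only, the following is a minimal C sketch of the digital stream mode with a per-scan timeout. The frame type and every hardware-facing call (sensor_start_stream, sensor_stop_stream, sensor_read_frame, decode_frame) are hypothetical placeholders invented for the sketch, not APIs from the patent or from any real sensor SDK.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <time.h>

/* Hypothetical frame container and hardware/decoder stubs, named only for
 * this sketch; they are not APIs from the patent or from any real SDK. */
typedef struct { uint8_t *pixels; size_t len; } frame_t;

static void sensor_start_stream(void) { /* start continuous acquisition, lamp on */ }
static void sensor_stop_stream(void)  { /* stop acquisition, lamp off            */ }
static bool sensor_read_frame(frame_t *f)  { (void)f; return true;  }
static bool decode_frame(const frame_t *f) { (void)f; return false; }

/* Digital stream mode: acquire and decode continuously until one frame
 * decodes successfully or the preset time elapses (decode timeout). */
bool scan_digital_stream(double timeout_s)
{
    bool ok = false;
    time_t start = time(NULL);

    sensor_start_stream();
    while (!ok && difftime(time(NULL), start) < timeout_s) {
        frame_t f;
        if (!sensor_read_frame(&f))
            continue;              /* no frame ready yet */
        ok = decode_frame(&f);     /* stop as soon as one frame decodes */
    }
    sensor_stop_stream();          /* success or timeout: stop the sensor */
    return ok;
}
```

A caller would invoke, for example, scan_digital_stream(5.0) to match the five-second preset time described above.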
Fig. 5 shows a timing diagram 200 of the optical information collector 100 collecting optical information in the digital stream mode according to one embodiment. The diagram shows an externally generated trigger signal 201, a light supplement timing 202 of the light supplement lamp 4, an image data acquisition timing 203 of the image sensor 3 continuously acquiring image data, and a decoding timing 204 of the decoding unit 8. The trigger signal 201 at high level triggers the image sensor 3 to acquire image data and the light supplement lamp 4 to supplement light, and at low level triggers them to stop; the light supplement lamp 4 is on at the high level of the light supplement timing 202 and off at its low level. The image data acquisition timing 203 of the image sensor 3 is synchronized with the light supplement timing 202: the image sensor 3 exposes at the high level of the acquisition timing 203 and outputs image data at the low level. The dotted arrow in fig. 5 indicates that the first frame of image data is output to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame at time point a, decodes it successfully at time point b, and feeds the success back to the central processing unit 5, which at time point c controls the image sensor 3 to stop acquiring image data and the light supplement lamp 4 to stop supplementing light. Because of signal delay, the rising edge of the trigger signal 201 is slightly earlier than the rising edge of the image data acquisition timing 203, and its falling edge is slightly earlier than time point c, when the image sensor 3 finishes acquiring image data. Note that the supplementary lighting is unnecessary when the ambient light is sufficient.
As the timing diagram 200 shows, while the decoding unit 8 decodes image data, the image sensor 3 keeps acquiring new frames. By the time the first frame has been decoded successfully by the decoding unit 8, the image sensor 3 has already acquired seven frames. The second to seventh frames are never transmitted to the decoding unit 8; instead they are stored (remain) in the storage area 13 of the image sensor 3 (its buffer or PN junctions) or in the register 14 of the image signal processor 7. Following the first-in, first-out principle, later frames overwrite earlier ones, so the seventh frame remains in the register 14 of the image signal processor 7, the sixth frame remains in the storage area 13 of the image sensor 3, and the second to fifth frames are overwritten and cleared.
When the optical information collector 100 is triggered again to collect new optical information, the decoding unit 8 first receives and decodes the image data left over in the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7, which inevitably produces a decoding error, because the residual image data is not image data of the new optical information.
The above problem can be solved, and decoding errors avoided, by the methods described below.
One method is shown in the timing diagram 300 of FIG. 6: decoding errors are avoided by discarding N frames of image data of a specific frame number and starting decoding from the (N+1)-th frame, where N ≥ 1. The timing diagram 300 shows an externally generated trigger signal 301, a light supplement timing 302 of the light supplement lamp 4, an image data acquisition timing 303 of the image sensor 3 continuously acquiring image data, and a decoding timing 304 of the decoding unit 8. The trigger signal 301 at high level triggers the image sensor 3 to acquire image data and the light supplement lamp 4 to supplement light, and at low level triggers them to stop; the light supplement lamp 4 is on at the high level of the light supplement timing 302 and off at its low level. The image data acquisition timing 303 of the image sensor 3 is synchronized with the light supplement timing 302: the image sensor 3 exposes at the high level of the acquisition timing 303 and outputs image data at the low level. Since one frame of image data remains in the image sensor 3 and one in the image signal processor 7, the specific frame number N is two and two frames are discarded: the first two frames in the acquisition timing 303 are not transmitted to the decoding unit 8. The dotted arrow in fig. 6 indicates that the third frame is output to the decoding unit 8 for decoding; in the decoding timing 304, the decoding unit 8 receives and decodes the third frame at time point d, decodes it successfully at time point e, and feeds the success back to the central processing unit 5; because of signal delay, the image sensor 3 is controlled at time point f to stop acquiring image data and the light supplement lamp 4 to stop supplementing light. Note that the supplementary lighting is unnecessary when the ambient light is sufficient. According to the image data acquisition timing 303, the image sensor 3 has by then acquired eight frames; the eighth frame remains in the register 14 of the image signal processor 7 and the seventh frame remains in the storage area 13 of the image sensor 3. When the optical information collector 100 is triggered again to collect new optical information, it again discards the two frames remaining in the image sensor 3 and the image signal processor 7 and starts decoding and outputting from the third frame, thereby avoiding decoding errors.
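The frame-discarding logic described above can be summarized in the following C sketch, again using hypothetical stub names (frame_t, sensor_read_frame, decode_frame) rather than any real API. For the embodiment of FIG. 6, residual_frames would be 2 (one frame in the image sensor 3 plus one in the image signal processor 7); for the single-residual-frame case described later, it would be 1.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical frame container and stubs; names are illustrative only. */
typedef struct { uint8_t *pixels; size_t len; } frame_t;

static bool sensor_read_frame(frame_t *f)  { (void)f; return true;  }
static bool decode_frame(const frame_t *f) { (void)f; return false; }

/* On each trigger, discard the first N frames (left over from the previous
 * acquisition) and start decoding from frame N+1. */
bool scan_discarding_residual(int residual_frames, int max_frames)
{
    frame_t f;

    /* Frames 1..N: read them out only so the pipeline advances; never decode. */
    for (int i = 0; i < residual_frames; ++i)
        (void)sensor_read_frame(&f);

    /* Frames N+1 onward belong to the new optical information. */
    for (int i = residual_frames; i < max_frames; ++i) {
        if (sensor_read_frame(&f) && decode_frame(&f))
            return true;           /* decoded successfully */
    }
    return false;                  /* decode timed out */
}
```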
It should be understood that the number N of discarded frames is not limited to two; it may be equal to or greater than the number of frames left over from the previous acquisition. For example, if image data remains in both the image sensor 3 and the image signal processor 7, the number of discarded frames N is two or more. Discarding the residual image data may mean that the decoding unit 8 does not receive it; or that the decoding unit 8 receives it but does not decode it; or that the decoding unit 8 decodes it but does not output the result or display it on the display screen 11, so that only the decoded information of the new optical information is output and displayed. For example, if it is known that the storage area 13 of the image sensor 3 and the register 14 of the image signal processor 7 each hold one frame acquired during the previous trigger, then when new optical information is collected on the next trigger, the first two frames are discarded, the third and subsequent frames are treated as image data of the new optical information, and those frames are decoded until decoding succeeds or times out.
In one embodiment, the optical information collector 100 does not optimize the image data with the image signal processor 7: the image signal processor 7 only receives the RAW-format image data transmitted by the image sensor 3 and passes it, unoptimized, to the decoding unit 8 for decoding. The decoding unit 8 thus directly receives grayscale image data (only the brightness signal of the RAW image data), which is convenient for binarized decoding, and the image signal processor 7 acts merely as a simple data transmission channel, so no image data remains in its register 14. Alternatively, the optical information collector 100 may have no image signal processor 7 at all, and the RAW image data collected by the image sensor 3 is transmitted directly to the decoding unit 8 through an interface such as DVP (Digital Video Port) or LVDS (Low Voltage Differential Signaling). In either case only one frame of image data remains, in the storage area 13 of the image sensor 3; when new optical information is collected, only one frame needs to be discarded, the second and subsequent frames are image data of the new optical information, and they are decoded until decoding succeeds or times out. Since decoding starts from the second frame rather than the third frame as in the previous method, the processing time and light supplement time of one frame are saved, the decoding speed increases, and power consumption decreases. In these embodiments, because the image data is not optimized by the image signal processor 7, a certain amount of image data processing time can in theory also be saved.
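As an illustration of how a decoder might obtain grayscale data from RAW output, the following C sketch averages each 2x2 Bayer quad of an 8-bit RAW frame into one gray pixel. This is only one plausible conversion, assuming an 8-bit RAW format and even image dimensions; the description above only states that the decoding unit uses the brightness signal of the RAW data and does not prescribe a particular conversion. 10- or 12-bit RAW data would first need to be scaled down to 8 bits.

```c
#include <stddef.h>
#include <stdint.h>

/* Average each 2x2 Bayer quad of an 8-bit RAW frame into one gray pixel.
 * Assumes 8-bit RAW samples and even width/height; the output image is
 * width/2 by height/2. */
void raw8_to_gray(const uint8_t *raw, uint8_t *gray, size_t width, size_t height)
{
    for (size_t y = 0; y + 1 < height; y += 2) {
        for (size_t x = 0; x + 1 < width; x += 2) {
            unsigned sum = raw[y * width + x]
                         + raw[y * width + x + 1]
                         + raw[(y + 1) * width + x]
                         + raw[(y + 1) * width + x + 1];
            gray[(y / 2) * (width / 2) + (x / 2)] = (uint8_t)(sum / 4);
        }
    }
}
```

The decoding unit 8 can then binarize such a grayscale image before decoding the symbol.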
Specifically, the timing diagram 400 shown in fig. 7 shows an externally generated trigger signal 401, a light supplement timing 402 of the light supplement lamp 4, an image data acquisition timing 403 of the image sensor 3 continuously acquiring image data, and a decoding timing 404 of the decoding unit 8. The trigger signal 401 at high level triggers the image sensor 3 to acquire image data and the light supplement lamp 4 to supplement light, and at low level triggers them to stop; the light supplement lamp 4 is on at the high level of the light supplement timing 402 and off at its low level. The optical information collector 100 does not optimize the image data with the image signal processor 7, so only the one frame acquired the previous time remains, in the storage area 13 of the image sensor 3, and the collector discards the first frame. The dotted arrow in fig. 7 indicates that the second frame is output to the decoding unit 8 for decoding: the decoding unit 8 receives and decodes the second frame at time point g, decodes it successfully at time point h, and feeds the success back to the central processing unit 5; because of signal delay, the image sensor 3 stops acquiring image data at time point i and the light supplement lamp 4 is controlled to stop supplementing light. Note that the supplementary lighting is unnecessary when the ambient light is sufficient. According to the image data acquisition timing 403, the image sensor 3 has by then acquired six frames, and the sixth frame remains in the storage area 13 of the image sensor 3. When the optical information collector 100 is triggered again to collect new optical information, it again discards the one frame remaining in the image sensor 3 and starts decoding and outputting from the second frame, thereby avoiding decoding errors.
In the methods above, one or two frames of residual image data are discarded, as required, each time new optical information is collected, which solves the problem of image data remaining in the image sensor 3 or the image signal processor 7; optionally, more than two frames of residual data may be discarded according to actual needs.
These methods have a drawback: each time new optical information is collected, the image sensor 3 must output one or more frames of residual image data just so they can be discarded, and the decoding unit 8 starts decoding from at least the second frame, which wastes time. It is conceivable that efficiency would improve if, every time new optical information is acquired, the first frame output by the image sensor 3 were already valid image data, i.e. image data of the new optical information.
It is conceivable that if the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 were emptied after each successful decode, no residual image data would exist when new optical information is collected next time; the first frame would then be image data of the new optical information, decoding could start directly from the first frame, and decoding speed would increase. This could be achieved with a preset algorithm, i.e. algorithmic control that empties the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 after every successful decode. Such an algorithm, however, normally has to be preset by the manufacturer of the image sensor 3 or of the image signal processor 7 (or of the central processing unit 5 that integrates the image signal processor 7; the same applies hereinafter). For the manufacturer of the optical information collector 100, the operating logic with which a purchased image sensor 3 or image signal processor 7 handles image data is usually predefined by its manufacturer and is not easy to change; that is, when the manufacturer has predefined that undecoded image data stays stored in the image sensor 3 or image signal processor 7, it is difficult for the manufacturer of the optical information collector 100 to change this or to eliminate the residual image data directly. Moreover, image sensors 3 from different manufacturers have different operating logic, as do image signal processors 7 from different manufacturers, so even if the manufacturer of the optical information collector 100 managed, through debugging, to eliminate the image data remaining in a particular image sensor 3 or image signal processor 7, the debugging would have to be repeated whenever the image sensor 3 or image signal processor 7 is replaced, which is an enormous workload. A portable method that clears image sensors 3 or image signal processors 7 of different models would save that work.
In an alternative embodiment, shown in fig. 8, the residual image data is eliminated by bypassing the image data processing flow predefined by the manufacturer of the image signal processor 7. The optical information collector 100 does not optimize the image data with the image signal processor 7: the image data acquired by the image sensor 3 is output to the image signal processor 7 through the existing MIPI interface and stored in a buffer 15 that the manufacturer of the optical information collector 100 can configure itself. The buffer 15 is integrated with the image signal processor 7, although it could also be provided independently of it, and the decoding unit 8 takes the image data out of the buffer 15 and decodes it. In this embodiment the image data collected by the image sensor 3 is still transmitted to the image signal processor 7 and then to the decoding unit 8 through the existing MIPI interface, because transmitting image data through the existing MIPI interface is simpler. In some embodiments the image signal processor 7 can be bypassed completely, i.e. the image data acquired by the image sensor 3 is transmitted directly to the decoding unit 8 for decoding. Since the image data is not optimized by the image signal processor 7, only one frame remains, in the storage area 13 of the image sensor 3, and a specific procedure can be set up to eliminate that one residual frame.
In an alternative embodiment, after the original decoding process ends, for example after decoding succeeds and the central processing unit 5 has sent an end instruction to make the image sensor 3 stop acquiring image data, the central processing unit 5 sends another instruction to the image sensor 3 to make it acquire one or more further frames (preferably one frame) and output them. This clears the image data out of the storage area 13 of the image sensor 3, so that the next time new optical information is collected, the first frame output by the image sensor 3 is image data of the new optical information. The last frame output by the image sensor 3 can be written into a buffer 15 that the manufacturer of the optical information collector 100 can configure itself and then cleared, finally eliminating the residual image data in the image sensor 3.
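A minimal C sketch of this flush step is given below. All names (frame_t, sensor_stop, fill_light_off, sensor_capture_one) are hypothetical placeholders; the point is only that one extra frame is read out after a successful decode, with the fill light off, and then dropped.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical stubs; names are illustrative only. */
typedef struct { uint8_t *pixels; size_t len; } frame_t;

static void sensor_stop(void)              { /* end the normal acquisition      */ }
static void fill_light_off(void)           { /* keep the lamp off for the flush */ }
static bool sensor_capture_one(frame_t *f) { (void)f; return true; }

/* After decoding succeeds (or times out), pull one more frame out of the
 * image sensor and drop it, so no residual frame is left in the sensor's
 * storage area and the next trigger starts with fresh image data. */
void flush_residual_frame(void)
{
    frame_t scratch;

    sensor_stop();                 /* end the current acquisition */
    fill_light_off();              /* no fill light needed for the flush frame */
    if (sensor_capture_one(&scratch)) {
        /* The frame lands in a host-side buffer and is simply discarded. */
    }
}
```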
Specifically, the timing diagram 500 of one embodiment in fig. 9 shows a trigger signal 501 of the central processing unit 5, a light supplement timing 502 of the light supplement lamp 4, an image data acquisition timing 503 of the image sensor 3 continuously acquiring image data, and a decoding timing 504 of the decoding unit 8. The trigger signal 501 at high level triggers the image sensor 3 to acquire image data and the light supplement lamp 4 to supplement light, and at low level triggers them to stop; the light supplement lamp 4 is on at the high level of the light supplement timing 502 and off at its low level. The image data acquisition timing 503 of the image sensor 3 is synchronized with the light supplement timing 502: the image sensor 3 exposes at the high level of the acquisition timing 503 and outputs image data at the low level. The dotted arrow in fig. 9 indicates that the first frame is output to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame at time point j, decodes it successfully at time point k, and feeds the success back to the central processing unit 5, which at time point i sends a signal to control the image sensor 3 to stop acquiring image data and the light supplement lamp 4 to stop supplementing light. Unlike the previous embodiment, the central processing unit 5 then issues the control signal 510 again and, at the high level 530, controls the image sensor 3 to acquire one more frame and output it, so that no image data remains in the image sensor 3. The next time a trigger starts the collection of new optical information, the first frame acquired and output by the image sensor 3 is image data of the new optical information, and the decoding unit 8 can receive and decode it directly. During this extra frame the light supplement lamp 4 is at the low level 520 and does not supplement light, which saves power. Note that when the ambient light is sufficient, supplementary lighting is unnecessary throughout the whole process.
In the foregoing embodiments, image data is continuously acquired and decoded in the digital stream mode, so by the time the first frame received by the decoding unit 8 has been decoded successfully, the image sensor 3 has already acquired several frames; in the timing diagram 200, for example, the image sensor 3 acquires seven frames in total, and acquiring the second to seventh frames is obviously a waste of power. At present, optical information collectors 100 produced by companies such as iData, Honeywell, and Zebra can decode successfully within the first three frames, that is, at least one of the first three frames acquired by the image sensor 3 can be decoded successfully by the decoding unit 8. As noted above, by the time the optical information collector 100 has decoded the third frame successfully, the image sensor 3 has already acquired more than three frames, often six or seven. Acquiring the fourth to seventh frames still requires the image sensor 3 to operate and may require the light supplement lamp 4 to supplement light, and since those frames are never used for decoding, acquiring them wastes power. Note that in some embodiments supplementary lighting is unnecessary when the ambient light is sufficient; for example, scanning a code with a mobile phone in daily life usually needs no fill light.
In a preferred embodiment, the optical information collector 100 can collect image data in a fixed frame mode, which differs from continuous collection in a digital stream mode. In the fixed frame mode, the central processing unit 5 controls the image sensor 3 to collect a fixed number of frames each time, and the decoding unit 8 decodes that fixed number of frames. When decoding of the current batch of the fixed number of frames is finished (either one frame has been decoded successfully, or all frames of the batch have failed to decode), the central processing unit 5 decides whether another batch of the fixed number of frames needs to be collected, and so on until decoding succeeds or times out. In the fixed frame mode, the interval between two successive batches of the fixed number of frames is not continuous; it is left for the central processing unit 5 to make its decision.
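The fixed frame mode can be sketched as follows in C. The stub names are hypothetical, and for simplicity the sketch decodes a batch only after the whole batch has been acquired, whereas in the embodiment described below decoding overlaps with acquisition; the control structure being illustrated is simply: acquire a fixed number of frames, decode them, and only then decide whether to request another batch.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical stubs; names are illustrative only. */
typedef struct { uint8_t *pixels; size_t len; } frame_t;

static void sensor_capture_burst(frame_t *batch, int n) { (void)batch; (void)n; }
static bool decode_frame(const frame_t *f)              { (void)f; return false; }

/* Fixed frame mode: acquire a fixed number of frames per round and decode
 * them; only if none decodes does the CPU request another round, until
 * success or until the round limit (decode timeout) is reached. */
bool scan_fixed_frame(int frames_per_round, int max_rounds)
{
    frame_t batch[8];              /* assumes frames_per_round <= 8 */

    for (int round = 0; round < max_rounds; ++round) {
        sensor_capture_burst(batch, frames_per_round);
        for (int i = 0; i < frames_per_round; ++i) {
            if (decode_frame(&batch[i]))
                return true;       /* stop decoding the remaining frames */
        }
        /* Every frame of this round failed: the CPU decides to start another. */
    }
    return false;                  /* decode timeout */
}
```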
The timing diagram 600 of the embodiment in fig. 10 shows a trigger signal 601 of the central processing unit 5, a light supplement timing 602 of the light supplement lamp 4, an image data acquisition timing 603 of the image sensor 3, and a decoding timing 604 of the decoding unit 8. The trigger signal 601 at high level triggers the image sensor 3 to acquire image data and the light supplement lamp 4 to supplement light, and at low level triggers them to stop; the light supplement lamp 4 is on at the high level of the light supplement timing 602 and off at its low level. The image data acquisition timing 603 of the image sensor 3 is synchronized with the light supplement timing 602: the image sensor 3 exposes at the high level of the acquisition timing 603 and outputs image data at the low level. The four dotted arrows from left to right in fig. 10 indicate that the first to fourth frames are each output to the decoding unit 8 for decoding; none of the first three frames decodes successfully, and the fourth frame decodes successfully. As the acquisition timing 603 shows, there is a clear time interval between the acquisition of the first batch of the fixed number of frames and the acquisition of the second batch, during which the central processing unit 5 can determine whether decoding of the first batch has finished and thus whether to control the image sensor 3 to acquire the second batch.
The image data acquisition timing 603 shows the image sensor 3 acquiring image data in a fixed frame mode with a fixed frame number of three: the central processing unit 5 first controls the image sensor 3 to acquire three frames and transmit them to the decoding unit 8; when none of the three frames decodes successfully, it controls the image sensor 3 to acquire another three frames and transmit them to the decoding unit 8 again, and so on until decoding succeeds (or times out). As the timing diagram 600 shows, when the fourth frame acquired by the image sensor 3 decodes successfully, the image sensor 3 has not yet finished acquiring that batch of three frames, so it continues to execute the fixed frame mode, acquires the fifth and sixth frames, outputs the whole batch of the fixed number of frames, and then stops acquisition; no image data remains in the image sensor 3. It is easy to see that, conversely, the image sensor 3 could be controlled to stop acquisition as soon as decoding succeeds, even before the fixed number of frames has been acquired; this saves some power but leaves image data remaining in the image sensor 3, which can be discarded the next time new optical information is collected.
In the foregoing embodiment the image sensor 3 is only controlled to acquire the second batch of three frames after decoding of the first three frames has finished (whether successfully or not), which is why there is a time interval between the two batches; if decoding fails, the second fixed-frame batch is therefore acquired with a noticeable delay. As an improvement, the image sensor 3 may optionally be controlled to acquire the next three frames as soon as the second frame of the first batch fails to decode, or as soon as the third frame is fed into the decoding unit 8, which balances decoding speed against power consumption; the moment at which re-acquisition starts can be chosen according to actual requirements so that there is no obvious delay between consecutive batches, as sketched below.
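A minimal sketch of that overlap, assuming a roughly 20 ms frame period and a roughly 30 ms per-frame decode attempt (both illustrative figures) and stub acquire/decode helpers:

```python
import threading
import time

FIXED_FRAME_COUNT = 3
FRAME_TIME_S = 0.02        # ~20 ms acquisition per frame (assumed)
DECODE_TIME_S = 0.03       # per-frame decode attempt (assumed)

def acquire_batch(out):
    """Stub: expose and read out one fixed-frame batch into 'out'."""
    time.sleep(FIXED_FRAME_COUNT * FRAME_TIME_S)
    out.extend(f"frame-{i}" for i in range(FIXED_FRAME_COUNT))

def try_decode(frame):
    """Stub decoder that always fails, so several batches are exercised."""
    time.sleep(DECODE_TIME_S)
    return False

def scan_with_overlap(max_batches=3):
    batch = []
    acquire_batch(batch)                      # first fixed-frame batch
    for _ in range(max_batches):
        next_batch = []
        prefetch = None
        for idx, frame in enumerate(batch):
            if try_decode(frame):
                return frame
            # Once the second-to-last frame has failed, start acquiring the
            # next batch so it is ready when the last decode attempt ends.
            if idx == FIXED_FRAME_COUNT - 2 and prefetch is None:
                prefetch = threading.Thread(target=acquire_batch,
                                            args=(next_batch,))
                prefetch.start()
        if prefetch is not None:
            prefetch.join()
        batch = next_batch
    return None

print(scan_with_overlap())
```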
In the foregoing embodiment the fixed frame number of the fixed frame mode is three, i.e. the image sensor 3 acquires three frames of image data at a time. In some embodiments the fixed frame number may be chosen according to the performance of the specific optical information collector 100: if the collector 100 usually decodes successfully within the first frame or within the first two frames, the fixed frame number may preferably be set to one or two frames so that no superfluous image data is acquired, the image sensor 3 being controlled to acquire the next frame only after the previous frame has been decoded without success; of course, the fixed frame number may also be set to two, four, five or more frames. In summary, and in combination with the foregoing embodiments, most current optical information collectors 100 decode successfully within the first three frames, and the time needed to acquire one fixed-frame batch should not exceed the timeout for decoding a single frame. Under existing technical conditions this timeout is usually set to 100 ms, i.e. the decoding unit 8 abandons a frame once 100 ms of decoding have elapsed without success and moves on to the next frame; with a frame period of about 20 ms the fixed frame number therefore preferably does not exceed five frames (20 ms × 5 = 100 ms), and a range of three to five frames is further preferred, so that the first fixed-frame batch is normally sufficient for a successful decode without acquiring too much image data, giving a power-consumption advantage over the existing digital stream mode. It is conceivable that, when a specific optical information collector 100 needs more than five frames to decode successfully in the digital stream mode, the fixed frame number may be set to five frames or more.
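The frame-count constraint can be checked with the figures quoted above (20 ms frame period, 100 ms per-frame decode timeout); the numbers come from the text, the calculation itself is merely illustrative:

```python
FRAME_PERIOD_MS = 20      # approximate acquisition time of one frame
DECODE_TIMEOUT_MS = 100   # per-frame decode timeout of the decoding unit

# The time to acquire one fixed-frame batch should not exceed the per-frame
# decode timeout, which bounds the fixed frame number.
max_fixed_frames = DECODE_TIMEOUT_MS // FRAME_PERIOD_MS
print(max_fixed_frames)   # -> 5, hence the preferred range of three to five frames
```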
A hybrid mode combining the advantages of the fixed frame mode and the digital stream mode may also be adopted; it suits complex application scenarios and strikes a balance between power consumption and decoding speed. For optical information that is difficult to recognize, such as high-density two-dimensional codes, DPM (Direct Part Mark) codes or complex characters, image data may first be acquired and decoded in the fixed frame mode and, when decoding does not succeed, acquisition and decoding may continue in the digital stream mode; it is conceivable that such a hybrid mode can also be used for recognizing simple optical information.
It is easy to see that the hybrid mode admits various permutations and combinations.
For example, the camera 1 may be configured to acquire image data in the fixed frame mode a preset number of times and then acquire image data in the digital stream mode. In the simplest case one fixed-frame batch is acquired first and the digital stream mode is used afterwards, as shown in the timing diagram 700 of fig. 11, which shows a trigger signal 701 of the central processing unit 5, a fill-light timing sequence 702 of the fill light 4, an image data acquisition timing sequence 703 of the image sensor 3, and a decoding timing sequence 704 of the decoding unit 8: the optical information collector 100 first acquires three frames of image data in a fixed frame mode with a fixed frame number of three; when none of them decodes successfully, it continues to acquire and decode image data in the digital stream mode, and the first frame of image data acquired in the digital stream mode is decoded successfully by the decoding unit 8.
In other embodiments the fixed frame mode may be executed several times before falling back to the digital stream mode. For example, two fixed-frame batches may be used first: three frames of the fixed frame number are acquired and decoded; if this does not succeed, another three frames are acquired and decoded; and only if this also fails does acquisition and decoding continue in the digital stream mode. It is likewise conceivable to run three or more fixed-frame batches before switching to the digital stream mode; a sketch covering both the single-batch and multi-batch variants follows.
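The following sketch uses assumed stub helpers (acquire_fixed_frames(), stream_next_frame(), try_decode() and the stream_limit safeguard are illustrative, not the patent's firmware); fixed_frame_rounds=1 corresponds to the single-batch arrangement of fig. 11, fixed_frame_rounds>=2 to the multi-batch variants.

```python
import random

FIXED_FRAME_COUNT = 3

def acquire_fixed_frames(n):
    """Stub: one fixed-frame batch of n frames."""
    return [object() for _ in range(n)]

def stream_next_frame():
    """Stub: one more frame from continuous (digital stream) acquisition."""
    return object()

def try_decode(frame):
    """Stub decoder that succeeds about one time in four."""
    return random.random() < 0.25

def hybrid_scan(fixed_frame_rounds=1, stream_limit=50):
    # Phase 1: one or more fixed-frame batches.
    for _ in range(fixed_frame_rounds):
        for frame in acquire_fixed_frames(FIXED_FRAME_COUNT):
            if try_decode(frame):
                return "decoded in fixed frame mode"
    # Phase 2: fall back to continuous acquisition until success
    # (stream_limit is only a safeguard for the sketch).
    for _ in range(stream_limit):
        if try_decode(stream_next_frame()):
            return "decoded in digital stream mode"
    return "decode timed out"

print(hybrid_scan(fixed_frame_rounds=2))
```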
As described above, when decoding is performed in the digital stream mode, image data may remain in the image sensor 3 at the moment decoding succeeds. To solve this problem, the hybrid mode may be arranged as fixed frame mode, then digital stream mode, then fixed frame mode.
Specifically, referring to the timing diagram 800 of the embodiment in fig. 12, which shows a trigger signal 801 of the central processing unit 5, a fill-light timing sequence 802 of the fill light 4, an image data acquisition timing sequence 803 of the image sensor 3, and a decoding timing sequence 804 of the decoding unit 8: the optical information collector 100 first acquires three frames of image data in a fixed frame mode with a fixed frame number of three; when decoding does not succeed, it continues to acquire and decode image data in the digital stream mode; the first frame acquired in the digital stream mode, i.e. the fourth frame overall, is decoded successfully, whereupon the image sensor 3 is controlled to stop acquiring image data and the fill light 4 to stop illuminating. Unlike the previous embodiment, the central processing unit 5 then issues a further control signal 810 whose high level 830 drives the image sensor 3 to acquire and output one more frame of image data, so that no image data remains in the image sensor 3; and since the image signal processor 7 is bypassed, no image data remains in the image signal processor 7 either. The next time new optical information is acquired, the first frame acquired and output by the image sensor 3 is therefore already image data of the new optical information, and the decoding unit 8 can receive and decode it directly. During this extra frame the fill light 4 is at the low level 820 and does not illuminate, which saves power. It should be noted that, when ambient light is sufficient, fill light is not needed at any point in the process.
Conceivably, the hybrid mode may also start by acquiring and decoding image data in the digital stream mode and, after decoding succeeds, switch to the fixed frame mode, controlling the image sensor 3 to go on acquiring a fixed frame number of image data and to output all of it, so that no image data remains in the image sensor 3; the foregoing embodiment, in which the image sensor 3 acquires one more frame after a successful decode in the digital stream mode, is a special case of this behaviour.
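The pipeline behaviour can be illustrated with a toy model in which the sensor is always one frame ahead of readout while streaming; the model and helper names are assumptions made for the sketch only.

```python
from collections import deque
import random

def try_decode(frame):
    """Stub decoder that succeeds about one time in four."""
    return random.random() < 0.25

def stream_scan_with_flush(max_frames=50):
    pipeline = deque()                 # frames exposed but not yet read out
    pipeline.append("frame-0")         # stream mode starts one frame ahead
    for i in range(1, max_frames):
        pipeline.append(f"frame-{i}")  # the sensor keeps exposing ahead
        frame = pipeline.popleft()     # read out the oldest exposed frame
        if try_decode(frame):
            # Success: read out (and discard) whatever is still in the sensor
            # before stopping, so nothing is left for the next trigger.
            while pipeline:
                pipeline.popleft()
            return frame, len(pipeline)   # second value is 0
    return None, len(pipeline)

print(stream_scan_with_flush())
```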
It is conceivable that, when the optical information collector 100 acquires image data in a hybrid mode, it may also optimize the image data through the image signal processor 7; in order to eliminate image data remaining in the image signal processor 7, the aforementioned method of discarding a specific number N of residual frames can be used. The value of N is determined by where image data remains: when both the image sensor 3 and the image signal processor 7 hold residual image data, two frames must be discarded each time image data is re-acquired; when no image data remains in the image sensor 3 and one frame remains in the image signal processor 7, only that one frame needs to be discarded each time image data is re-acquired. Alternatively, the optical information collector 100 may not process the image data through the image signal processor 7 at all, in which case only a frame remaining in the image sensor 3, if any, needs to be discarded on re-acquisition; and when the hybrid mode ends with a fixed frame mode, no image data remains in the image sensor 3 and nothing needs to be discarded when image data is acquired again. A sketch of this discard rule follows.
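The discard rule can be summarised in a few lines; the helper names and the particular residual counts are illustrative assumptions.

```python
def frames_to_discard(residual_in_sensor, residual_in_isp, isp_bypassed):
    """How many stale frames must be dropped before decoding new optical info."""
    return residual_in_sensor + (0 if isp_bypassed else residual_in_isp)

def start_new_acquisition(frame_source, residual_in_sensor, residual_in_isp,
                          isp_bypassed):
    n = frames_to_discard(residual_in_sensor, residual_in_isp, isp_bypassed)
    for _ in range(n):
        next(frame_source)             # stale frames are read out and dropped
    return next(frame_source)          # first frame of the new optical information

frames = iter(f"frame-{i}" for i in range(10))
# e.g. one stale frame in the sensor and one in the ISP, ISP in the path -> N = 2
print(start_new_acquisition(frames, 1, 1, isp_bypassed=False))   # -> frame-2
```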
The optical information collector and method of the present application provide the following beneficial effects:
1. When the image sensor 3 is triggered to acquire image data, the central processing unit 5 issues an instruction to discard a specific number N of frames of image data, these N frames being the image data remaining from the previous triggered acquisition; this prevents residual image data from being output and decoded, which would cause decoding errors.
2. The image sensor 3 acquires and outputs a fixed frame number of image data in the fixed frame mode each time. Compared with the conventional approach of continuously acquiring and outputting image data in a digital stream mode, this saves power: the digital stream mode keeps acquiring image data even after a successful decode, so the subsequently acquired frames are never used for decoding and the power spent on them is wasted, a situation the fixed frame mode avoids.
3. When the image sensor 3 acquires image data in the digital stream mode, the image data is not optimized by the image signal processor 7, so no image data remains in the image signal processor 7; and when decoding succeeds or times out, the image sensor 3 is controlled to stop continuous acquisition in the digital stream mode and instead to go on acquiring and outputting a fixed frame number of image data, so that no image data remains in the image sensor 3, decoding errors on the next acquisition of optical information are avoided, and efficiency is improved.
The above detailed description sets out only preferred embodiments of the present application and is not intended to limit its scope of protection; all technical changes that can be made on the basis of this specification and the accompanying drawings fall within the scope of protection of the present application.

Claims (10)

1. An optical information collector, comprising:
an image sensor configured to acquire image data of optical information;
a decoding unit configured to receive and decode the image data; and
a central processing unit configured to control the image sensor to acquire image data and to control the decoding unit to decode the image data, wherein, upon triggering, the central processing unit controls the image sensor to acquire and output image data of a fixed frame number in a fixed frame mode, controls the decoding unit to decode the image data, and stops decoding of the image data remaining in the image data of the fixed frame number when one frame of image data is decoded successfully.
2. The optical information collector of claim 1, wherein the fixed frame mode comprises: when the decoding unit decodes one frame of the image data of the fixed frame number successfully and the image sensor has not yet finished acquiring the image data of the fixed frame number, the image sensor continues to acquire the image data of the fixed frame number and outputs all of the image data of the fixed frame number.
3. The optical information collector of claim 1, wherein the fixed frame mode comprises: the central processing unit controls the decoding unit to receive and decode the image data of the fixed frame number in sequence, and controls the image sensor to acquire image data of the fixed frame number again when, or before, the last frame of the image data of the fixed frame number is not decoded successfully.
4. The optical information collector of claim 1, wherein the image data acquired by the image sensor is not processed or optimized by an image signal processor.
5. The optical information collector of claim 1, wherein the image sensor is configured to acquire image data in the following order: first acquiring image data in the fixed frame mode a preset number of times, and then acquiring image data continuously in a digital stream mode.
6. An optical information acquisition method, comprising:
through triggering, the central processing unit controls the image sensor to acquire and output image data of a fixed frame number in a fixed frame mode;
the decoding unit receives and decodes the image data, and stops decoding the image data remaining in the image data of the fixed number of frames when one of the frames of image data is successfully decoded.
7. The optical information acquisition method according to claim 6, wherein, when the decoding unit decodes one frame of the image data of the fixed frame number successfully and the image sensor has not yet finished acquiring the image data of the fixed frame number, the image sensor continues to acquire the image data of the fixed frame number and outputs all of the image data of the fixed frame number.
8. The optical information acquisition method according to claim 6, wherein the central processing unit controls the decoding unit to receive and decode the image data of the fixed frame number in sequence, and controls the image sensor to acquire image data of the fixed frame number again when, or before, the last frame of the image data of the fixed frame number is not decoded successfully.
9. The optical information acquisition method according to claim 6, wherein the image data acquired by the image sensor is not processed or optimized by an image signal processor.
10. The optical information acquisition method according to claim 6, wherein the image sensor is configured to acquire image data in the following order: first acquiring image data in the fixed frame mode a preset number of times, and then acquiring image data continuously in a digital stream mode.
CN202210963590.2A 2022-08-11 2022-08-11 Optical information collector and method Active CN115426442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210963590.2A CN115426442B (en) 2022-08-11 2022-08-11 Optical information collector and method

Publications (2)

Publication Number Publication Date
CN115426442A true CN115426442A (en) 2022-12-02
CN115426442B (en) 2023-06-27

Family

ID=84198175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210963590.2A Active CN115426442B (en) 2022-08-11 2022-08-11 Optical information collector and method

Country Status (1)

Country Link
CN (1) CN115426442B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024032379A1 (en) * 2022-08-11 2024-02-15 无锡盈达聚力科技有限公司 Optical information collector and method therefor

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012086654A1 (en) * 2010-12-21 2012-06-28 株式会社日立国際電気 Transmitter, receiver, communication system, and communication method
US20130129203A1 (en) * 2011-11-17 2013-05-23 Honeywell International Inc. doing business as (d. b.a) Honewell Scanning and Mobility Imaging terminal operative for decoding
CN106778409A (en) * 2016-11-28 2017-05-31 北京慧眼智行科技有限公司 A kind of code figure detection method and detecting system
CN106934318A (en) * 2017-03-13 2017-07-07 东软集团股份有限公司 Barcode scanning processing method, apparatus and system
CN109784113A (en) * 2018-12-17 2019-05-21 深圳盈达信息科技有限公司 Scanning means and its barcode scanning method
CN111713107A (en) * 2019-06-28 2020-09-25 深圳市大疆创新科技有限公司 Image processing method and device, unmanned aerial vehicle and receiving end
CN112560536A (en) * 2020-12-26 2021-03-26 苏州斯普锐智能系统股份有限公司 Method for reading and decoding image bar code by applying enhanced code reading mode

Also Published As

Publication number Publication date
CN115426442B (en) 2023-06-27

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Optical information collector and method
Granted publication date: 20230627
Pledgee: Agricultural Bank of China Limited by Share Ltd. Wuxi science and Technology Branch
Pledgor: WUXI IDATA TECHNOLOGY Co.,Ltd.
Registration number: Y2024980003496