CN117544849A - Optical information collector and method - Google Patents


Info

Publication number: CN117544849A
Authority: CN (China)
Prior art keywords: image data, decoding, image sensor, image, optical information
Legal status: Pending
Application number: CN202311503897.5A
Other languages: Chinese (zh)
Inventors: 王冬生, 魏江涛, 张颂来, 周小芹
Current Assignee: Wuxi Idata Technology Co ltd
Original Assignee: Wuxi Idata Technology Co ltd
Application filed by Wuxi Idata Technology Co ltd
Priority to CN202311503897.5A
Publication of CN117544849A

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 - Cameras or camera modules provided with illuminating means
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/64 - Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N23/65 - Control of camera operation in relation to power supply
    • H04N23/651 - Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • H04N23/80 - Camera processing pipelines; Components thereof

Abstract

The present application provides an optical information collector and method. The optical information collector comprises: an image sensor for acquiring image data of optical information; a memory preset with one or more decoding algorithms; a decoding unit for receiving and decoding image data; and a central processing unit for controlling the image sensor to acquire image data of a first fixed frame number and controlling the decoding unit to decode the image data in sequence. If decoding does not succeed, the central processing unit controls the image sensor to acquire image data continuously in a digital stream mode and controls the decoding unit to decode the image data in sequence; once the decoding unit decodes successfully or decoding times out, it controls the image sensor to stop continuously acquiring image data in the digital stream mode. This avoids residual image data in the image sensor, so that decoding errors are avoided the next time optical information is acquired, and efficiency is improved.

Description

Optical information collector and method
The present application is a divisional application of the Chinese patent application filed on August 11, 2022, with application number 202210963616.3 and entitled "Optical information collector and method".
Technical Field
The present disclosure relates to the field of optical information acquisition, and more particularly, to an optical information acquisition device and method.
Background
Optical information acquisition based on image scanning covers various kinds of machine-readable optical information. Common machine-readable optical information includes one-dimensional codes, two-dimensional codes, OCR graphics and text, ultraviolet anti-counterfeiting codes, infrared anti-counterfeiting codes, and the like.
In general, supplementary light is required to capture clear images when acquiring optical information. In the traditional approach, the supplementary light stays on continuously during acquisition, which causes serious device heating and high power consumption.
The present inventors previously developed a low-power light supplementing method, disclosed in Chinese patent CN201810098421.0, the entire contents of which are incorporated herein by reference, in which the optical imaging unit turns the supplementary light on during the exposure time of a frame period and off during the sampling time of the frame period, so that power consumption is reduced by periodically turning the light supplementing unit on and off.
The frame period of an optical imaging unit is usually within 50 ms, and a good optical imaging unit can reach a frame period within 20 ms, which means the optical imaging unit can acquire tens of frames of images in one second. The optical imaging unit typically captures images continuously as a digital stream while trying to identify the optical information from the captured images; once the optical information is successfully identified, the optical imaging unit does not continue to identify subsequently captured images and stops capturing further images.
However, because the optical imaging unit captures images continuously as a digital stream, when the optical information is successfully identified from one of the images, several more frames have usually already been captured after it. Those subsequently captured frames are never used for optical information identification, which wastes power.
The present inventors further developed another low-power barcode scanning system and method, disclosed in Chinese patent CN201811485945.1, the entire contents of which are incorporated herein by reference, in which the image sensor is in a standby state before being triggered and re-enters the standby state after decoding succeeds or decoding times out. This differs from the mobile phone industry: once a phone's camera is opened it continuously captures and previews images, which consumes considerable power, but because a phone's camera is used only for short periods the power consumption is negligible. In the barcode scanning industry, by contrast, the image sensor must be used for long periods, and if it were kept in the working state at all times the battery life would be noticeably reduced.
However, this creates another problem. The inventors of the present application found that, when the image sensor continuously acquires image data as a digital stream, if the image sensor is controlled to abruptly stop image acquisition after decoding succeeds or times out (rather than leaving the sensor outputting and previewing image data while the device remains powered, which is how power is saved), a frame may already have been captured inside the image sensor. Although this is a probabilistic event, it has an adverse effect: the frame that has already been captured is not output, but remains in the storage area of the image sensor (for image sensors equipped with a separate buffer the image data remains in the buffer; for image sensors without a separate buffer the electrical signal of the image data remains in the PN junctions of the sensor's pixel cells), and the next time the image sensor acquires images, this residual image data is output first.
The inventors also found that, on some platform families such as Qualcomm-series platforms, image data may remain not only in the image sensor but also in other memories, such as a buffer of the image signal processor.
There is a need to develop new techniques to solve the above problems.
Disclosure of Invention
The present invention aims to provide an optical information collector and method that reduce power consumption.
In order to achieve the above purpose, the present application adopts the following technical solutions:
the application provides an optical information collector, its characterized in that includes: the image sensor is used for collecting image data of optical information; the decoding unit is used for decoding the image according to a preset decoding algorithm; and the central processing unit is triggered to control the image sensor to acquire images and control the decoding unit to decode the images, wherein once the central processing unit is triggered, the central processing unit sends out an instruction to discard N frames of image data with a specific frame number, wherein the N frames of image data with the specific frame number are the image data acquired by the last trigger and remained in the optical information acquisition unit.
Alternatively, the N frames of image data of a certain frame number include image data remaining in a storage area of the image sensor.
Optionally, the image processing device comprises an image signal processor, and is used for receiving the image acquired by the image sensor and transmitting the image data to the decoding unit, wherein the N frames of image data with a specific frame number comprise the image data remained in the image signal processor.
Optionally, discarding the N frames of image data of the particular frame number includes: the decoding unit does not receive the N-frame image data of the specific frame number, or the decoding unit does not decode the N-frame image data of the specific frame number, or the decoding unit does not output or display the decoding information of the N-frame image data of the specific frame number.
Alternatively, the decoding unit starts decoding from the n+1st frame image data.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit, upon being triggered, controls an image sensor to acquire and output image data; the central processing unit receives the image data and issues an instruction to discard N frames of image data of a specific frame number, the N frames of image data of the specific frame number being image data acquired at the previous trigger and remaining; and a decoding unit decodes the image data.
Optionally, the N frames of image data of the specific frame number include image data remaining in a storage area of the image sensor.
Optionally, the image data acquired by the image sensor is received by an image signal processor, and the image signal processor further transmits the image data to the decoding unit; the N frames of image data of the specific frame number include image data remaining in the image signal processor.
Optionally, discarding the N frames of image data of the specific frame number includes: the decoding unit does not receive the N frames of image data of the specific frame number, or the decoding unit does not decode the N frames of image data of the specific frame number, or the decoding unit does not output or display the decoding information of the N frames of image data of the specific frame number.
Optionally, the decoding unit starts decoding from the (N+1)-th frame of image data.
The present application provides an optical information collector, characterized in that it comprises: an image sensor for acquiring image data of optical information; a decoding unit for receiving and decoding image data; and a central processing unit for controlling the image sensor to acquire image data and controlling the decoding unit to decode the image data, wherein, upon being triggered, the central processing unit controls the image sensor to acquire and output a fixed number of frames of image data in a fixed frame mode and controls the decoding unit to decode the image data, and when one frame of the image data is successfully decoded, decoding of the remaining frames among the fixed number of frames is stopped.
Optionally, the fixed frame mode includes: when the decoding unit decodes successfully but the image sensor has not yet acquired the fixed number of frames of image data, the image sensor continues to acquire the fixed number of frames of image data and outputs all of the fixed number of frames.
Optionally, the fixed frame mode includes: the central processing unit controls the decoding unit to sequentially receive and decode the fixed number of frames of image data, and controls the image sensor to acquire the fixed number of frames of image data again when, or before, the last frame of the fixed number of frames has not been successfully decoded.
Optionally, the image data acquired by the image sensor does not pass through an image signal processor, or is not optimized by the image signal processor.
Optionally, the image sensor is configured to acquire image data in the following order: image data is first acquired in the fixed frame mode a preset number of times, and image data is then acquired continuously in a digital stream mode.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit, upon being triggered, controls an image sensor to acquire and output a fixed number of frames of image data in a fixed frame mode; and a decoding unit receives and decodes the image data, and stops decoding the remaining frames among the fixed number of frames when one frame of the image data is successfully decoded.
Optionally, when the decoding unit decodes successfully but the image sensor has not yet acquired the fixed number of frames of image data, the image sensor continues to acquire the fixed number of frames of image data and outputs all of the fixed number of frames.
Optionally, the central processing unit controls the decoding unit to sequentially receive and decode the fixed number of frames of image data, and controls the image sensor to acquire the fixed number of frames of image data again when, or before, the last frame of the fixed number of frames has not been successfully decoded.
Optionally, the image data acquired by the image sensor does not pass through an image signal processor, or is not optimized by the image signal processor.
Optionally, the image sensor is configured to acquire image data in the following order: image data is first acquired in the fixed frame mode a preset number of times, and image data is then acquired continuously in a digital stream mode.
The present application provides an optical information collector, characterized in that it comprises: an image sensor for acquiring image data of optical information; a memory preset with one or more decoding algorithms; a decoding unit for receiving and decoding image data; and a central processing unit for controlling the image sensor to continuously acquire image data in a digital stream mode and controlling the decoding unit to decode the image data in sequence, wherein once the decoding unit decodes successfully or decoding times out, the central processing unit controls the image sensor to stop continuously acquiring image data in the digital stream mode and then controls the image sensor to further acquire and output a fixed number of frames of image data.
Optionally, the optical information collector does not have an image signal processor, or does not perform optimization processing on the image data through the image signal processor.
Optionally, the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW-format image data and decodes based on the grayscale image data.
Optionally, the fixed number of frames of image data is one frame or two frames.
Optionally, the image data acquired by the image sensor is transmitted directly to the decoding unit for decoding.
The present application provides an optical information acquisition method, characterized by comprising: a central processing unit controls an image sensor to continuously acquire and output image data in a digital stream mode; a decoding unit receives and decodes the image data, and once the decoding unit decodes successfully, the image sensor is controlled to stop acquiring image data in the digital stream mode; and the image sensor is then controlled to further acquire and output a fixed number of frames of image data.
Optionally, the optical information collector does not have an image signal processor, or does not perform optimization processing on the image data through the image signal processor.
Optionally, the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW-format image data and decodes based on the grayscale image data.
Optionally, the fixed number of frames of image data is one frame or two frames.
Optionally, the image data acquired by the image sensor is transmitted directly to the decoding unit for decoding.
Drawings
FIG. 1 is a simplified block diagram of an optical information collector according to one embodiment of the present application;
FIG. 2 is a schematic diagram of an optical information collector according to an embodiment of the present application;
FIG. 3 is a perspective view of the optical information collector of FIG. 2;
FIG. 4 is a high-level block diagram of an optical information collector according to one embodiment of the present application;
FIG. 5 is a timing diagram of an optical information collector collecting optical information in a digital stream mode according to one embodiment of the present application;
FIG. 6 is a timing diagram of an optical information collector collecting optical information according to one embodiment of the present application;
FIG. 7 is a timing diagram of an optical information collector collecting optical information according to an embodiment of the present application;
FIG. 8 is a high-level block diagram of an optical information collector according to another embodiment of the present application;
FIG. 9 is a timing diagram of an optical information collector collecting optical information according to another embodiment of the present application;
FIG. 10 is a timing diagram of an optical information collector collecting optical information in a fixed frame mode according to another embodiment of the present application;
FIG. 11 is a timing diagram of an optical information collector collecting optical information in a hybrid mode according to another embodiment of the present application;
FIG. 12 is a timing diagram of an optical information collector collecting optical information in another hybrid mode according to another embodiment of the present application.
Description of reference numerals in the specific embodiments:
an optical information collector 100; a camera 1; an optical system 2; an image sensor 3; a light supplementing lamp 4; a central processing unit 5; a memory 6; an image signal processor 7; a decoding unit 8; a housing 9; a scanning window 10; a display screen 11; a button 12; a storage area 13; a register 14; a buffer 15.
Detailed Description
For a better understanding of the objects, structures, features, and effects of the present application, reference should be made to the drawings and to the detailed description.
Referring to fig. 1, a simplified block diagram of an implementation of an optical information collector 100 of one embodiment is shown. As described in further detail below, the optical information collector 100 may be configured to collect one or more types of optical information, such as one-dimensional codes, two-dimensional codes, OCR graphics and text, ultraviolet anti-counterfeiting codes, infrared anti-counterfeiting codes, and the like.
The optical information collector 100 may include at least one camera 1. The camera 1 may include a combination of an optical system 2 (lens) for capturing light and an image sensor 3 (sensor) for photoelectrically converting the light captured by the optical system 2. The optical system 2 may include one or more mirrors, prisms, lenses, or a combination thereof. There may also be one or more image sensors 3: each image sensor 3 may correspond to one optical system 2, several image sensors 3 may share the same optical system 2, or several optical systems 2 may share the same image sensor 3. The image sensor 3 may be a CCD, CMOS, or other type of image sensor; it is configured to convert an optical signal into an electrical signal and thereby output a digital signal of image data.
The optical information collector 100 may include one or more light supplementing lamps 4, which illuminate the optical information when the camera 1 acquires image data. Of course, under adequate ambient lighting the light supplementing lamp 4 need not supplement light, or the optical information collector 100 may have no light supplementing lamp 4 at all. The light supplementing lamp 4 may supplement light in various ways: for example, it may supplement light continuously while the camera 1 acquires optical information; or it may supplement light in synchronization with the exposure time of the image sensor 3 of the camera 1 (Chinese patent CN201810098421.0, the entire contents of which are incorporated herein by reference, discloses a technical solution in which the light supplementing lamp 4 supplements light in synchronization with the exposure time of the image sensor 3); or the light supplementing lamp 4 may supplement light in pulses whose pulse time overlaps part of the exposure time of the image sensor 3.
The optical information collector 100 may also include a central processor 5 for executing various instructions.
The optical information collector 100 may further comprise a separate or integrated memory 6. One or more decoding algorithms are preset in the memory 6 as needed, and the memory 6 may also store other programs or instructions. The memory 6 may include one or more non-transitory storage media, such as volatile and/or non-volatile memory, which may be fixed or removable. In particular, the memory 6 may be configured to store information, data, applications, or instructions that enable the processing module to carry out various functions in accordance with example embodiments of the present invention. For example, the memory 6 may be configured to buffer input data for processing by the central processor 5. Additionally or alternatively, the memory 6 may be configured to store instructions executed by the central processor 5. The memory 6 may serve as main memory and be included in a volatile storage device, e.g. RAM or another form that retains its contents only during operation, and/or the memory 6 may be included in a non-volatile storage device, such as ROM, EPROM, EEPROM, FLASH, or another type of storage device that retains its contents independently of the power state of the processing module. The memory 6 may also be included in a secondary storage device that stores large amounts of data, such as external disk storage. In some embodiments, the disk storage may communicate with the central processor 5 via a data bus or other routing means using input/output means. The secondary storage may comprise a hard disk, a compact disc, a DVD, a memory card, or any other type of mass storage known to those skilled in the art. The memory 6 may store one or more of the various optical information acquisition, transmission, processing, and decoding processes or methods described below.
The optical information collector 100 may further include an image signal processor 7 (Image Signal Processor, abbreviated ISP). The image signal processor 7 is configured to perform optimization processing on the image data acquired by the camera 1, where the optimization processing includes one or more of linear correction, noise removal, dead-pixel repair, color interpolation, white balance correction, exposure correction, and the like, so as to improve the quality of the image data. For optical information that does not require color recognition, some or all of the foregoing optimization processes, such as color interpolation, are unnecessary. The image signal processor 7 may process one frame of image data at a time with a single core and a single thread, or may process multiple frames of image data simultaneously with multiple cores and multiple threads. Alternatively, the optical information collector 100 may have no image signal processor 7, or the image data may not be optimized by the image signal processor 7.
The optical information collector 100 may further include a decoding unit 8. The decoding unit 8 is configured to decode the image data acquired by the camera 1 according to a preset decoding algorithm, so as to identify the optical information, for example the encoded information of a one-dimensional or two-dimensional code, OCR graphics and text, or the encoded information of various ultraviolet/infrared anti-counterfeiting codes. The decoding unit 8 may decode one frame of image data at a time with a single core and a single thread, or may decode multiple frames of image data simultaneously with multiple cores and multiple threads.
Alternatively, some or all of the functional modules of the image signal processor 7 may be integrated with the central processor 5; for example, Chinese patent CN201811115589.4, the entire contents of which are incorporated herein by reference, discloses a central processor 5 with an integrated image signal processor 7. Alternatively, some or all of the functional blocks of the image signal processor 7 may be integrated with the image sensor 3; the decoding unit 8 may be integrated with the central processor 5; or the memory 6 may be integrated with the central processing unit 5. In the following embodiments, when the image data is optimized by the image signal processor 7, the image signal processor 7 and the decoding unit 8 are preferably integrated with the central processor 5, which saves cost; of course, the image signal processor 7 and the decoding unit 8 may also not be integrated with the central processor 5.
Fig. 2 and Fig. 3 show schematic views of a handheld terminal as a specific embodiment of the optical information collector 100. The handheld terminal comprises a housing 9, a display screen 11, and buttons 12. The front end of the housing 9 is provided with a scanning window 10; the camera 1 is accommodated in the housing 9 and can acquire optical information through the scanning window 10. Alternatively, the optical information collector 100 may have no display screen 11 and instead output information to a separate display screen 11 for display. Alternatively, the optical information collector 100 may be a stationary, desktop, or other form of terminal, and it may be integrated with other devices as part of those devices.
The central processing unit 5 issues a trigger instruction in response to an external trigger, which may be generated by the user pressing a specific button 12, touching a specific area of the display screen 11, or operating the optical information collector 100 with a specific gesture. Once the central processing unit 5 is triggered externally, it issues a trigger instruction according to a preset algorithm, thereby triggering the image sensor 3 to acquire image data.
The image data acquired by the image sensor 3 may be optimized by the image signal processor 7 and then output to the decoding unit 8 for decoding. Referring to the block diagram of Fig. 4, which schematically shows the optical information collector 100 reading a barcode, when the user presses the button 12 to trigger the light supplementing lamp 4 to supplement light and the image sensor 3 to acquire image data, the image signal processor 7 may sequentially receive and optimize the image data acquired by the image sensor 3 through a MIPI interface (Mobile Industry Processor Interface, abbreviated MIPI), and the decoding unit 8 decodes the image data transmitted after optimization by the image signal processor 7. When one frame of image data is successfully decoded, the decoding unit 8 stops decoding and notifies the central processing unit 5 that decoding has succeeded, and the central processing unit 5 issues an instruction to control the image sensor 3 to stop acquiring image data.
The image sensor 3 may continuously acquire image data in a digital stream mode, i.e. a mode in which the image sensor 3 continuously acquires image data for a preset time according to a preset algorithm. The decoding unit 8 decodes the continuously acquired image data sequentially in a single thread, or decodes multiple frames simultaneously in multiple threads; when decoding succeeds or times out, the image sensor 3 is controlled to stop acquiring image data and the decoding unit 8 is controlled to stop decoding. For example, if the preset time is five seconds, the image sensor 3 continuously acquires image data for five seconds; if none of the image data acquired within those five seconds is decoded successfully, decoding times out. If one of the frames is decoded successfully, the central processor 5 controls the image sensor 3 to stop acquiring image data and controls the decoding unit 8 to stop decoding even if the five seconds have not elapsed.
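Purely as an illustration of the digital stream control flow just described, and not the literal firmware of the optical information collector 100, the following C sketch streams frames, hands each one to the decoding unit, and stops on the first successful decode or when the preset five-second timeout expires; all helper names (sensor_start_stream, sensor_read_frame, decoder_try_decode, millis, and so on) are hypothetical placeholders.

    #include <stdbool.h>
    #include <stdint.h>

    #define DECODE_TIMEOUT_MS  5000U   /* preset time: five seconds, as in the example above */

    /* Hypothetical platform hooks; names and signatures are illustrative only. */
    extern void     sensor_start_stream(void);        /* begin continuous (digital stream) capture */
    extern void     sensor_stop_stream(void);         /* stop continuous capture                   */
    extern bool     sensor_read_frame(uint8_t *buf);  /* blocks until the next frame is output     */
    extern bool     decoder_try_decode(const uint8_t *buf, char *result, int result_len);
    extern uint32_t millis(void);                     /* monotonic millisecond tick                */

    /* Digital stream mode: decode frames as they arrive until success or timeout. */
    bool scan_digital_stream(char *result, int result_len, uint8_t *frame_buf)
    {
        uint32_t start = millis();
        bool ok = false;

        sensor_start_stream();
        while (!ok && (millis() - start) < DECODE_TIMEOUT_MS) {
            if (!sensor_read_frame(frame_buf))
                break;                                /* sensor error: give up                 */
            ok = decoder_try_decode(frame_buf, result, result_len);
        }
        sensor_stop_stream();                         /* stop the stream on success OR timeout */
        return ok;
    }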
Fig. 5 shows a timing diagram 200 of the optical information collector 100 acquiring optical information in the digital stream mode. The timing diagram 200 shows an externally triggered trigger signal 201, a light supplementing timing 202 of the light supplementing lamp 4, an image data acquisition timing 203 of the image sensor 3 continuously acquiring image data, and a decoding timing 204 of the decoding unit 8. The trigger signal 201 at a high level triggers the image sensor 3 to acquire image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at the high level of the light supplementing timing 202 and turns off the supplementary light at the low level. The image data acquisition timing 203 of the image sensor 3 is synchronized with the light supplementing timing 202; the image sensor 3 exposes at the high level of the image data acquisition timing 203 and outputs image data at the low level. The dashed arrow in Fig. 5 represents the first frame of image data being output to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame at time point a, decodes it successfully at time point b, and feeds the successful-decoding information back to the central processing unit 5, which controls the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light at time point c. Due to signal delay, the rising edge of the high level of the trigger signal 201 is slightly earlier than the rising edge of the high level of the image data acquisition timing 203, and its falling edge is slightly earlier than the time point c at which the image sensor 3 stops acquiring image data. It should be noted that, when the ambient light is sufficient, light supplementing is not necessary.
As can be seen from the timing diagram 200, while the decoding unit 8 is decoding image data, the image sensor 3 is acquiring new image data. By the time the decoding unit 8 decodes the first frame successfully, the image sensor 3 has already acquired seven frames of image data, of which the second to seventh frames are not transmitted to the decoding unit 8 but remain in the storage area 13 (buffer or PN junctions) of the image sensor 3 or in the register 14 of the image signal processor 7. Following the first-in-first-out principle, later-acquired image data overwrites earlier-acquired image data, so the seventh frame ends up stored in the register 14 of the image signal processor 7, the sixth frame in the storage area 13 of the image sensor 3, and the second to fifth frames are overwritten and cleared.
When the optical information collector 100 is triggered again to acquire new optical information, the decoding unit 8 first receives and decodes the image data left over in the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7. This tends to cause decoding errors, because the residual image data is not image data of the new optical information.
The above problem can be solved by the following method, avoiding decoding errors.
One optional method is shown in the timing diagram 300 of Fig. 6: decoding errors are avoided by discarding N frames of image data of a specific frame number, where N ≥ 1, and starting decoding from the (N+1)-th frame of image data. The timing diagram 300 shows an externally triggered trigger signal 301, a light supplementing timing 302 of the light supplementing lamp 4, an image data acquisition timing 303 of the image sensor 3 continuously acquiring image data, and a decoding timing 304 of the decoding unit 8. The trigger signal 301 at a high level triggers the image sensor 3 to acquire image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at the high level of the light supplementing timing 302 and turns off the supplementary light at the low level. The image data acquisition timing 303 of the image sensor 3 is synchronized with the light supplementing timing 302; the image sensor 3 exposes at the high level of the image data acquisition timing 303 and outputs image data at the low level. Since one frame of image data remains in the image sensor 3 and one in the image signal processor 7, the specific frame number N to be discarded is two frames, and the first two frames in the image data acquisition timing 303 are not transmitted to the decoding unit 8. The dashed arrow in Fig. 6 represents the third frame of image data being output to the decoding unit 8 for decoding: the decoding unit 8 receives and decodes the third frame at time point d, decodes it successfully at time point e, and feeds the successful-decoding information back to the central processor 5, which, allowing for signal delay, controls the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light at time point f. It should be noted that, when the ambient light is sufficient, light supplementing is not necessary. As can be seen from the image data acquisition timing 303, the image sensor 3 has by then acquired eight frames of image data; the eighth frame remains in the register 14 of the image signal processor 7 and the seventh frame remains in the storage area 13 of the image sensor 3. When the optical information collector 100 is triggered again to acquire new optical information, it again discards the two frames of image data remaining in the image sensor 3 and the image signal processor 7 and starts decoding from the third frame, thereby avoiding decoding errors.
It is to be understood that the specific frame number N to be discarded may be equal to or greater than the number of frames of image data remaining from the previous acquisition and is not limited to two, and the number of frames of image data remaining in the image sensor 3 and the image signal processor 7 may also be greater than two. Discarding the residual image data may include the decoding unit 8 not receiving the residual image data; or the decoding unit 8 receiving but not decoding the residual image data; or the decoding unit 8 decoding the residual image data but not outputting the decoded information or displaying it on the display screen 11, so that the information output and displayed on the display screen 11 is the decoded information of the new optical information. For example, if it is known that the storage area 13 of the image sensor 3 and the register 14 of the image signal processor 7 hold image data acquired at the previous trigger, then when the next trigger acquires new optical information the first two frames of image data are discarded, and the third and subsequent frames are used as the image data of the new optical information and decoded until decoding succeeds or times out.
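As an illustrative sketch only, the discard-N-frames approach can be layered on the same hypothetical helpers used in the earlier digital stream sketch: the first N frames output after a trigger are read out but never handed to the decoding unit, and decoding starts from the (N+1)-th frame. With the image signal processor bypassed, DISCARD_FRAMES would be set to one instead of two.

    #define DISCARD_FRAMES  2   /* N: one residual frame in the sensor, one in the image signal processor */

    /* Triggered acquisition that skips the N residual frames and decodes from frame N+1 onward. */
    bool scan_with_discard(char *result, int result_len, uint8_t *frame_buf)
    {
        uint32_t start = millis();
        int frames_seen = 0;
        bool ok = false;

        sensor_start_stream();
        while (!ok && (millis() - start) < DECODE_TIMEOUT_MS) {
            if (!sensor_read_frame(frame_buf))
                break;
            frames_seen++;
            if (frames_seen <= DISCARD_FRAMES)
                continue;               /* residual frames from the previous trigger: never decoded */
            ok = decoder_try_decode(frame_buf, result, result_len);
        }
        sensor_stop_stream();
        return ok;
    }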
In one embodiment, the optical information collector 100 may refrain from optimizing the image data with the image signal processor 7: the image signal processor 7 only receives the RAW-format image data transmitted by the image sensor 3 and passes the unoptimized image data on to the decoding unit 8 for decoding. The decoding unit 8 directly receives grayscale image data (taking only the brightness signal of the RAW image data), which facilitates binarization and decoding of the image data; the image signal processor 7 then serves merely as a simple data transmission channel, so no image data remains in the register 14 of the image signal processor 7. Alternatively, the optical information collector 100 may not include the image signal processor 7 at all, and the RAW image data acquired by the image sensor 3 is transmitted directly to the decoding unit 8 through an interface such as a DVP interface (Digital Video Port) or an LVDS (Low Voltage Differential Signaling) interface. If only one frame of image data remains in the storage area 13 of the image sensor 3, only one frame needs to be discarded when new optical information is acquired; the second and subsequent frames are the image data of the new optical information and are decoded until decoding succeeds or times out. Since decoding starts from the second frame rather than the third as in the previous method, the processing time and light supplementing time of one frame of image data are saved, which improves decoding speed and reduces power consumption. In these specific embodiments, because the image data is not optimized by the image signal processor 7, a certain amount of image data processing time can in theory also be saved.
Specifically, referring to the timing diagram 400 in Fig. 7, there are shown an externally triggered trigger signal 401, a light supplementing timing 402 of the light supplementing lamp 4, an image data acquisition timing 403 of the image sensor 3 continuously acquiring image data, and a decoding timing 404 of the decoding unit 8. The trigger signal 401 at a high level triggers the image sensor 3 to acquire image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at the high level of the light supplementing timing 402 and turns off the supplementary light at the low level. The optical information collector 100 does not optimize the image data with the image signal processor 7, and only one frame of previously acquired image data remains in the storage area 13 of the image sensor 3, so the optical information collector 100 discards the first frame of image data. The dashed arrow in Fig. 7 represents the second frame of image data being output to the decoding unit 8 for decoding: the decoding unit 8 receives and decodes the second frame at time point g, decodes it successfully at time point h, and feeds the successful-decoding information back to the central processor 5; allowing for signal delay, the image sensor 3 stops acquiring image data and the light supplementing lamp 4 is controlled to stop supplementing light at time point i. It should be noted that, when the ambient light is sufficient, light supplementing is not necessary. As can be seen from the image data acquisition timing 403, the image sensor 3 has by then acquired six frames of image data, and the sixth frame remains in the storage area 13 of the image sensor 3. When the optical information collector 100 is triggered again to acquire new optical information, it again discards the one frame of image data remaining in the image sensor 3 and starts decoding from the second frame, thereby avoiding decoding errors.
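The grayscale-from-RAW step mentioned in the preceding embodiment can be as simple as keeping the brightness of each RAW sample; the following sketch assumes a hypothetical 10-bit RAW layout and is only one possible treatment, sufficient for barcode binarization but not a general demosaicing routine.

    #include <stdint.h>

    /* Illustrative only: derive 8-bit grayscale from a hypothetical 10-bit RAW buffer by
     * keeping the brightness information and dropping the two low bits.  For a monochrome
     * sensor the RAW value already is the brightness; for a Bayer sensor this crude
     * treatment is usually good enough for barcode binarization (no color interpolation). */
    void raw10_to_gray8(const uint16_t *raw, uint8_t *gray, int width, int height)
    {
        for (int i = 0; i < width * height; i++)
            gray[i] = (uint8_t)(raw[i] >> 2);   /* 10-bit [0,1023] -> 8-bit [0,255] */
    }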
In the above methods, one or two frames of residual image data are discarded as required each time new optical information is acquired, which solves the problem of image data remaining in the image sensor 3 or the image signal processor 7; alternatively, more than two frames of residual image data may be discarded as required.
The above methods have a certain drawback: each time new optical information is acquired, the image sensor 3 must output one or more frames of residual image data that are then discarded, and the decoding unit 8 starts decoding no earlier than the second frame, which wastes time. It is conceivable that efficiency could be improved if, every time new optical information is acquired, the first frame of image data output by the image sensor 3 were already valid image data (image data of the new optical information).
It is conceivable that if the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 were emptied after every successful decode, no residual image data would exist the next time new optical information is acquired; the first frame of image data would then be image data of the new optical information, decoding could start directly from the first frame, and the decoding speed would improve. This could be achieved by a preset algorithm, i.e. by algorithmic control that continues to empty the storage area 13 of the image sensor 3 or the register 14 of the image signal processor 7 after each successful decode. However, such a preset clearing algorithm generally has to be built in by the manufacturer of the image sensor 3 or of the image signal processor 7 (or of the central processor 5 into which the image signal processor 7 is integrated; the same applies hereinafter). For the manufacturer of the optical information collector 100, the arithmetic logic with which a purchased image sensor 3 or image signal processor 7 handles image data is usually predefined by its manufacturer and not easy to change; that is, if the manufacturer has predefined that the last frame of image data remains stored in the image sensor 3 or the image signal processor 7 when it has not been decoded, the manufacturer of the optical information collector 100 can hardly change this or directly eliminate the image data remaining in the image signal processor 7. Moreover, image sensors 3 from different manufacturers have different operating logic, and so do image signal processors 7 from different manufacturers; even if the manufacturer of the optical information collector 100 could debug one image sensor 3 or image signal processor 7 to eliminate the residual image data directly, it would have to repeat the debugging whenever the image sensor 3 or image signal processor 7 is replaced, which involves a huge workload. A portable method that can empty the residual image data in image sensors 3 or image signal processors 7 of different models would therefore save a great deal of work.
In an optional embodiment shown in the block diagram of Fig. 8, the residual image data is eliminated by bypassing the image data processing flow predefined by the manufacturer of the image signal processor 7. The optical information collector 100 does not optimize the image data with the image signal processor 7; the image data acquired by the image sensor 3 is output to the image signal processor 7 over the existing MIPI interface and stored in a buffer 15 that can be configured separately by the manufacturer of the optical information collector 100. The buffer 15 is integrated in the image signal processor 7, although it may of course also be set up independently of the image signal processor 7, and the decoding unit 8 takes the image data out of the buffer 15 and decodes it. In this embodiment the image data acquired by the image sensor 3 is still transmitted to the image signal processor 7 and then to the decoding unit 8 through the existing MIPI interface, because transmission over the existing MIPI interface is relatively simple. In some embodiments the image signal processor 7 may be bypassed completely, i.e. the image data acquired by the image sensor 3 is transmitted directly to the decoding unit 8 for decoding. Since the image data is not optimized by the image signal processor 7, only one frame of image data remains in the storage area 13 of the image sensor 3, and a specific flow can be set up to eliminate that one remaining frame.
In an optional embodiment, after the original decoding flow ends, for example after decoding succeeds and the central processing unit 5 sends an end instruction to control the image sensor 3 to finish acquiring image data, the central processing unit 5 sends a further instruction to the image sensor 3 to control it to additionally acquire one or more frames of image data, preferably one frame, and to output that frame. The image data in the storage area 13 of the image sensor 3 is thereby cleared, so that the next time new optical information is acquired, the first frame of image data output by the image sensor 3 is image data of the new optical information. The last frame of image data output by the image sensor 3 may be written into a buffer 15 configurable by the manufacturer of the optical information collector 100 and then cleared, ultimately eliminating the image data remaining in the image sensor 3.
Specifically, referring to the timing diagram 500 of one embodiment in Fig. 9, there are shown a trigger signal 501 of the central processor 5, a light supplementing timing 502 of the light supplementing lamp 4, an image data acquisition timing 503 of the image sensor 3 continuously acquiring image data, and a decoding timing 504 of the decoding unit 8. The trigger signal 501 at a high level triggers the image sensor 3 to acquire image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at the high level of the light supplementing timing 502 and turns off the supplementary light at the low level. The image data acquisition timing 503 of the image sensor 3 is synchronized with the light supplementing timing 502; the image sensor 3 exposes at the high level of the image data acquisition timing 503 and outputs image data at the low level. The dashed arrow in Fig. 9 represents the first frame of image data being output to the decoding unit 8 for decoding: the decoding unit 8 receives the first frame at time point j, decodes it successfully at time point k, and feeds the successful-decoding information back to the central processing unit 5, which then sends the trigger signal to control the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light. Unlike the foregoing embodiments, the central processor 5 then sends a further control signal 510 that separately controls the image sensor 3 to acquire one more frame of image data at the high level 530 and to output that frame, so that no image data remains in the image sensor 3; on the next trigger the first frame acquired and output by the image sensor 3 is image data of the new optical information, and the decoding unit 8 can receive and decode it directly. During this extra frame the light supplementing lamp 4 is at the low level 520 and does not supplement light, which saves power. It should be noted that when the ambient light is sufficient, no light supplementing is necessary throughout the whole process.
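A minimal sketch of this flush step, assuming the same hypothetical sensor helpers as before plus two additional placeholders (sensor_capture_frames and fill_light_off): after a successful decode the firmware captures and reads out one extra frame with the supplementary light off, so nothing is left in the storage area 13.

    /* Additional hypothetical helpers for the flush step. */
    extern void sensor_capture_frames(int n);   /* capture exactly n frames, then stop       */
    extern void fill_light_off(void);           /* force the light supplementing lamp 4 off  */

    /* After a successful decode, capture and read out one extra frame with the supplementary
     * light off (low level 520), so nothing remains in the storage area 13 and the next
     * trigger's first frame is already image data of the new optical information. */
    void flush_residual_frame(uint8_t *scratch_buf)
    {
        fill_light_off();                       /* the extra frame is thrown away, so no light is needed */
        sensor_capture_frames(1);               /* the extra exposure driven by control signal 510       */
        (void)sensor_read_frame(scratch_buf);   /* read it out so the sensor holds no residual frame     */
    }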
In the foregoing embodiments, image data is acquired continuously and decoded in the digital stream mode; by the time the decoding unit 8 receives the first frame of image data and decodes it successfully, the image sensor 3 has already acquired several frames. In the timing diagram 200, for example, the image sensor 3 acquired seven frames in total, so acquiring the second to seventh frames clearly wastes power. Optical information collectors 100 currently produced by companies such as iData (the present applicant), Honeywell, and Zebra can generally decode successfully within the first three frames of image data, that is, at least one of the first three frames acquired by the image sensor 3 can be decoded successfully by the decoding unit 8. As noted above, by the time the optical information collector 100 decodes the third frame successfully, the image sensor 3 has acquired more than three frames, perhaps even six or seven; acquiring the fourth to seventh frames still requires the image sensor 3 to operate and the light supplementing lamp 4 to supplement light, and since those frames are not used for decoding, acquiring them wastes power. It should be noted that in some embodiments, when the ambient light is sufficient, light supplementing is not necessary; for example, scanning a code with a mobile phone in daily life generally requires no light supplementing.
In a preferred embodiment, the optical information collector 100 may acquire image data in a fixed frame mode. Unlike the continuous acquisition of the digital stream mode, in the fixed frame mode the central processor 5 controls the image sensor 3 to acquire a fixed number of frames of image data each time, and the decoding unit 8 decodes that fixed number of frames. When decoding of the fixed number of frames acquired the previous time is finished (one frame has been decoded successfully, or all frames of the fixed number have failed to decode) or is about to finish, the central processor 5 judges whether another fixed number of frames needs to be acquired, and so on until decoding succeeds or times out. In the fixed frame mode there is a time interval between two successive acquisitions of the fixed number of frames rather than continuous acquisition, which leaves time for the central processing unit 5 to make this judgment.
Referring to the timing diagram 600 of one embodiment in Fig. 10, there are shown a trigger signal 601 of the central processing unit 5, a light supplementing timing 602 of the light supplementing lamp 4, an image data acquisition timing 603 of the image sensor 3, and a decoding timing 604 of the decoding unit 8. The trigger signal 601 at a high level triggers the image sensor 3 to acquire image data and the light supplementing lamp 4 to supplement light, and at a low level triggers the image sensor 3 to stop acquiring image data and the light supplementing lamp 4 to stop supplementing light; the light supplementing lamp 4 supplements light at the high level of the light supplementing timing 602 and turns off the supplementary light at the low level. The image data acquisition timing 603 of the image sensor 3 is synchronized with the light supplementing timing 602; the image sensor 3 exposes at the high level of the image data acquisition timing 603 and outputs image data at the low level. The four dashed arrows from left to right in Fig. 10 represent the first to fourth frames of image data, respectively, being output to the decoding unit 8 for decoding; none of the first to third frames is decoded successfully, while the fourth frame is. As can be seen from the image data acquisition timing 603, there is a clear time interval between the acquisition of the first fixed-number batch of three frames and the acquisition of the second batch of three frames, which allows the central processing unit 5 to judge whether the first batch of three frames has been decoded and hence whether the image sensor 3 needs to be controlled to acquire the next batch of three frames.
The image data acquisition timing 603 shows the image sensor 3 acquiring image data in the fixed frame mode with a fixed frame number of three. The central processor 5 first controls the image sensor 3 to acquire three frames of image data and transmits them to the decoding unit 8; when none of the three frames is decoded successfully, it controls the image sensor 3 to acquire another three frames and transmits them to the decoding unit 8 for decoding, and so on until decoding succeeds (or times out). As can be seen from the timing diagram 600, when the fourth frame acquired by the image sensor 3 is decoded successfully and the image sensor 3 has not yet acquired the full three frames of the current batch, the image sensor 3 continues in the fixed frame mode, acquiring the remaining fifth and sixth frames and outputting all of them, and only then stops image data acquisition, so that no image data remains in the image sensor 3. It is easy to understand that, by contrast, the image sensor 3 could be controlled to stop image data acquisition immediately after decoding succeeds, even though it has not yet acquired the fixed number of frames; this saves some power but leaves residual image data in the image sensor 3, which can then be discarded the next time new optical information is acquired.
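The fixed frame mode described above might be sketched as follows, again using the hypothetical helpers introduced in the earlier sketches; the batch size of three frames and the batch count are illustrative parameters, and on success the current batch is still read out in full so that no frame remains in the sensor.

    #define FIXED_FRAMES  3   /* fixed frame number used in the example of Fig. 10 */

    /* Fixed frame mode: acquire FIXED_FRAMES frames per batch and decode them; start another
     * batch only if none decoded, up to `batches` batches.  On success the current batch is
     * still read out in full, so no image data remains in the image sensor. */
    bool scan_fixed_frame(int batches, char *result, int result_len, uint8_t *frame_buf)
    {
        bool ok = false;

        for (int b = 0; b < batches && !ok; b++) {
            sensor_capture_frames(FIXED_FRAMES);
            for (int i = 0; i < FIXED_FRAMES; i++) {
                if (!sensor_read_frame(frame_buf))
                    return false;                /* sensor error                              */
                if (!ok)                         /* stop decoding once one frame succeeds ... */
                    ok = decoder_try_decode(frame_buf, result, result_len);
            }                                    /* ... but keep reading out the whole batch  */
        }
        return ok;
    }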
In the foregoing embodiment, since the image sensor 3 is only controlled to acquire the next three frames of image data after decoding of the first three frames has completed (successfully or not), there is a time interval between the acquisition of the first three frames and the acquisition of the next three frames; if the first three frames are not successfully decoded, there is therefore a noticeable delay before the next three frames of the fixed frame number are acquired. As an improvement, the image sensor 3 may alternatively be controlled to acquire the next three frames as soon as the second frame of the first three frames fails to decode, or as soon as the third frame is input to the decoding unit 8 for decoding, so as to balance decoding speed against power consumption; the moment at which acquisition of the next three frames starts can be determined according to actual requirements, so that there is no noticeable delay between successive acquisitions of three frames of image data.
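As an illustration of this improvement, the sketch below (reusing the hypothetical constants from the previous example) requests the next round as soon as a configurable number of frames in the current round have failed to decode; the `acquire_round`, `request_round` and `wait_round` calls and the threshold value are assumptions, not the collector's actual interface.

```python
def fixed_frame_mode_early_trigger(sensor, decoder, retrigger_after=2):
    """Variant that requests the next round early to reduce latency.

    retrigger_after: number of failed decodes in a round after which the next
    round of acquisition is requested (assumed value: 2). If decoding then
    succeeds on a later frame, the early-requested round should still be
    drained, as in the previous sketch, to avoid residual image data.
    """
    result = None
    frames = sensor.acquire_round(FIXED_FRAME_NUMBER)       # blocking: first round
    while result is None:
        for i, frame in enumerate(frames, start=1):
            result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
            if result is not None:
                break
            if i == retrigger_after:
                sensor.request_round(FIXED_FRAME_NUMBER)     # overlap acquisition with decoding
        if result is None:
            frames = sensor.wait_round()                     # frames of the early-requested round
    sensor.stop()
    return result
```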
In the foregoing embodiment, the fixed frame number of the fixed frame mode is three frames, that is, the image sensor 3 acquires three frames of image data at a time. In some embodiments, the fixed frame number may be determined according to the performance of the specific optical information collector 100: for example, if the optical information collector 100 can usually decode successfully within the first two frames or even the first frame of image data, the fixed frame number of the fixed frame mode may preferably be set to two frames or one frame, so as to avoid the power consumption wasted on image data acquired in subsequent frames, and the image sensor 3 may be controlled to acquire the next frame only after decoding of the current frame has completed without success; of course, the fixed frame number may also be set to two frames, four frames, five frames or more. In summary, in combination with the foregoing embodiments, for the present optical information collector 100 most optical information can be successfully decoded within the first three frames of image data, and the time taken to acquire the fixed frame number of frames should be less than or equal to the timeout for decoding one frame of image data by the decoding unit 8. Under existing technical conditions the timeout is generally set to 100 ms, that is, if the decoding unit 8 has spent 100 ms on one frame of image data without success, decoding of that frame is stopped and the next frame is decoded; accordingly, with a frame period of about 20 ms, the fixed frame number of the fixed frame mode is preferably no more than five frames (20 ms × 5 = 100 ms), and further preferably three to five frames, so that the fixed frame number of image data acquired in the first round of the fixed frame mode can generally be decoded successfully without acquiring too much image data, giving a power consumption advantage over the existing digital stream mode. It is conceivable that, when a particular optical information collector 100 requires more than five frames of image data in the digital stream mode before decoding succeeds, the fixed frame number may be set to five frames or more.
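The frame-budget arithmetic above can be written out as a one-line calculation; the 20 ms frame period and 100 ms timeout are the figures given in the text, and the variable names are purely illustrative.

```python
# Upper bound on the fixed frame number so that acquiring one round of
# frames takes no longer than the per-frame decode timeout.
DECODE_TIMEOUT_MS = 100   # per-frame decode timeout from the text
FRAME_PERIOD_MS = 20      # approximate acquisition time per frame from the text

max_fixed_frames = DECODE_TIMEOUT_MS // FRAME_PERIOD_MS
print(max_fixed_frames)   # 5, hence a preferred fixed frame number of three to five
```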
The method can adopt a hybrid mode combining the advantages of the fixed frame mode and the digital stream mode, which is suitable for complex application scenarios and achieves a balance between power consumption and decoding speed. For optical information that is difficult to recognize, such as high-density two-dimensional codes, DPM (Direct Part Mark) codes or complex overlapping text, image data can first be acquired and decoded in the fixed frame mode, and when decoding is not successful, image data is then acquired continuously in the digital stream mode for decoding; it is conceivable that this hybrid mode can also be used for reading simple optical information.
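A minimal sketch of this basic hybrid flow, assuming the same hypothetical sensor and decoder interfaces and constants as above, with the digital stream modelled by a `next_stream_frame` call and an assumed overall deadline:

```python
import time


def hybrid_mode(sensor, decoder, overall_timeout_s=3.0):
    """One round of the fixed frame mode, then fall back to the digital stream mode.

    overall_timeout_s is an assumed overall decoding deadline. For brevity this
    sketch returns as soon as decoding succeeds; draining the rest of the round
    (fixed frame mode) or flushing an extra frame (digital stream mode), as
    described elsewhere in the text, avoids residual image data.
    """
    deadline = time.monotonic() + overall_timeout_s

    # Stage 1: fixed frame mode.
    for _ in range(FIXED_FRAME_NUMBER):
        frame = sensor.acquire_frame()
        result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
        if result is not None:
            sensor.stop()
            return result

    # Stage 2: digital stream mode, decoding frames as they arrive.
    sensor.start_stream()
    result = None
    while result is None and time.monotonic() < deadline:
        frame = sensor.next_stream_frame()
        result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
    sensor.stop_stream()
    return result
```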
It is readily appreciated that the hybrid mode may be arranged in a variety of combinations.
For example, the camera 1 may be configured to acquire image data in the fixed frame mode a preset number of times and then acquire image data in the digital stream mode. For instance, the optical information collector 100 first acquires a fixed frame number of image data in the fixed frame mode and, when decoding is not successful, then acquires and decodes image data continuously in the digital stream mode; referring to the timing diagram 700 in fig. 11, which shows a trigger signal 701 of the central processing unit 5, a light supplementing timing 702 of the light supplementing lamp 4, an image data acquisition timing 703 of the image sensor 3, and a decoding timing 704 of the decoding unit 8, the decoding unit 8 successfully decodes the first frame of image data acquired in the digital stream mode.
In other embodiments, the fixed frame mode may be executed multiple times before the digital stream mode is adopted when decoding is still not successful. For example, the fixed frame mode may be executed twice before the digital stream mode: three frames of image data of the fixed frame number are acquired first for decoding; when decoding is not successful, another three frames of the fixed frame number are acquired for decoding; and when decoding is still not successful, the digital stream mode is adopted for decoding. It is conceivable that the fixed frame mode may also be executed three or more times before the digital stream mode is adopted when decoding is not successful.
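The number of fixed-frame rounds before falling back to the digital stream can be made a parameter. The sketch below generalises the earlier hybrid example, again with hypothetical interfaces; `fixed_rounds=2` corresponds to the two-round example just described.

```python
def hybrid_mode_n_rounds(sensor, decoder, fixed_rounds=2, overall_timeout_s=3.0):
    """Run the fixed frame mode for `fixed_rounds` rounds, then the digital stream mode."""
    deadline = time.monotonic() + overall_timeout_s

    for _ in range(fixed_rounds):                        # stage 1: repeated fixed-frame rounds
        for _ in range(FIXED_FRAME_NUMBER):
            frame = sensor.acquire_frame()
            result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
            if result is not None:
                sensor.stop()
                return result

    sensor.start_stream()                                # stage 2: digital stream fallback
    result = None
    while result is None and time.monotonic() < deadline:
        frame = sensor.next_stream_frame()
        result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
    sensor.stop_stream()
    return result
```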
Since it has been described above that image data remains in the image sensor 3 when decoding succeeds in the digital stream mode, the hybrid mode may, in order to solve this problem, adopt the fixed frame mode first, then the digital stream mode, and finally end with the fixed frame mode.
Specifically, referring to the timing diagram 800 of one embodiment in fig. 12, which shows a trigger signal 801 of the central processing unit 5, a light supplementing timing 802 of the light supplementing lamp 4, an image data acquisition timing 803 of the image sensor 3, and a decoding timing 804 of the decoding unit 8: the optical information collector 100 acquires three frames of image data in the fixed frame mode with a fixed frame number of three frames and, when decoding is not successful, acquires and decodes image data continuously in the digital stream mode; the decoding unit 8 successfully decodes the first frame of image data acquired in the digital stream mode, that is, the fourth frame overall, whereupon the central processing unit 5 controls the image sensor 3 to stop image data acquisition and the light supplementing lamp 4 to stop supplementing light. Unlike the previous embodiment, the central processing unit 5 then sends out a control signal 810 again and independently controls the image sensor 3, at the high level 830, to acquire one more frame of image data and output it, so that no image data remains in the image sensor 3; furthermore, since the image signal processor 7 is bypassed, no image data remains in the image signal processor 7 either. The next time new optical information is acquired after triggering, the first frame of image data acquired and output by the image sensor 3 is therefore image data of the new optical information, and the decoding unit 8 can directly receive and decode this first frame. During this extra frame the light supplementing lamp 4 is at the low level 820 and does not supplement light, thereby saving power consumption. It should be noted that when the ambient light is sufficient, no light supplement is necessary in the whole process.
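The final flush step can be sketched as follows, with the earlier fixed-frame stage omitted for brevity; `flush_frames=1` matches the one extra frame acquired in this embodiment, and the `fill_light` handle is a hypothetical stand-in for controlling the light supplementing lamp 4.

```python
def decode_then_flush(sensor, decoder, fill_light, flush_frames=1):
    """Digital stream decoding followed by a fixed-frame flush of the sensor.

    After decoding succeeds in the digital stream mode, the sensor is asked
    for `flush_frames` more frames (with the fill light off), which are read
    out and discarded so that no image data remains buffered in the sensor.
    """
    fill_light.on()
    sensor.start_stream()
    result = None
    while result is None:
        frame = sensor.next_stream_frame()
        result = decoder.decode(frame, timeout=DECODE_TIMEOUT_S)
    sensor.stop_stream()
    fill_light.off()                      # low level 820: no light supplement during the flush

    for _ in range(flush_frames):         # control signal 810 / high level 830
        sensor.acquire_frame()            # read out and discard, leaving the sensor empty
    sensor.stop()
    return result
```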
It is conceivable that the hybrid mode may also acquire and decode image data in the digital stream mode first and, after decoding succeeds, adopt the fixed frame mode to control the image sensor 3 to continue acquiring a fixed frame number of image data and to output all of that fixed frame number of image data, so that no image data remains in the image sensor 3; the foregoing embodiment has already described the special case in which the image sensor 3 continues to acquire one frame of image data after decoding succeeds in the digital stream mode.
It is conceivable that, when the optical information collector 100 acquires image data in the hybrid mode, the optical information collector 100 may use the image signal processor 7 to optimize the image data, and in order to eliminate residual image data in the image signal processor 7, the foregoing method of discarding a specific number N of residual frames may additionally be adopted. The specific number N of discarded frames may be determined according to where image data remains: for example, when residual image data is stored in both the image sensor 3 and the image signal processor 7, two frames of image data need to be discarded each time image data is re-acquired; when no image data remains in the image sensor 3 and one frame remains in the image signal processor 7, only the one frame remaining in the image signal processor 7 needs to be discarded each time image data is re-acquired. Alternatively, the optical information collector 100 may not process the image data through the image signal processor 7, in which case, when one frame of residual image data remains in the image sensor 3, only that frame needs to be discarded each time new image data is acquired; and in the hybrid mode, when the fixed frame mode is adopted and no image data remains in the image sensor 3, no residual image data needs to be discarded when image data is re-acquired.
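The rule for choosing the number of discarded frames N can be written as a small helper; the boolean flags and function names are illustrative, not part of the collector's actual interface.

```python
def residual_frames_to_discard(sensor_has_residue, isp_used, isp_has_residue):
    """Return N, the number of stale frames to drop before decoding new
    optical information, following the cases described above (assumed encoding).
    """
    n = 0
    if sensor_has_residue:
        n += 1                       # one stale frame buffered in the image sensor
    if isp_used and isp_has_residue:
        n += 1                       # one stale frame buffered in the image signal processor
    return n


def discard_residual_frames(sensor, n):
    """Read out and drop N frames at the start of a new acquisition."""
    for _ in range(n):
        sensor.acquire_frame()       # discard without decoding
```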
The optical information collector and the method have the following beneficial effects:
1. When the image sensor 3 is triggered to acquire image data, the central processing unit 5 sends an instruction to discard a specific number N of frames of image data, these N frames being the residual image data left over from the previous triggered acquisition, so that the residual image data is prevented from being decoded and output and decoding errors are avoided.
2. In the fixed frame mode the image sensor 3 acquires and outputs a fixed frame number of image data each time, which, compared with the existing approach of continuously acquiring and outputting image data in the digital stream mode, saves power consumption: continuous acquisition in the digital stream mode is avoided, and so is the power wasted on multi-frame image data that continues to be acquired after decoding has already succeeded and is never used for decoding.
3. When the image sensor 3 acquires image data in the digital stream mode, the image data is not optimized by the image signal processor 7, so that no image data remains in the image signal processor 7; and when decoding succeeds or decoding times out, the image sensor 3 is controlled to stop the continuous acquisition of image data in the digital stream mode and to continue acquiring and outputting a fixed frame number of image data, so that no image data remains in the image sensor 3, decoding errors are avoided the next time optical information is acquired, and efficiency is improved.
The above detailed description is illustrative of preferred embodiments of the present application and is not intended to limit its scope; all equivalent variations made using the contents of the present specification fall within the scope of protection of the present application.

Claims (10)

1. An optical information collector, comprising:
the image sensor is used for collecting image data of optical information;
a memory, which is preset with one or more decoding algorithms;
a decoding unit to receive and decode image data;
the central processing unit is used for controlling the image sensor to collect image data according to a first fixed frame number and controlling the decoding unit to sequentially decode the image data; and if the decoding is not successful, controlling the image sensor to continuously collect image data in a digital stream mode and controlling the decoding unit to sequentially decode the image data, and once the decoding unit decodes successfully or decoding times out, controlling the image sensor to stop the continuous collection of image data in the digital stream mode and controlling the image sensor to continue to collect and output image data of a second fixed frame number.
2. The optical information collector as claimed in claim 1, wherein: the image sensor is used for acquiring image data, and the decoding unit is used for decoding the image data acquired by the image sensor.
3. The optical information collector as claimed in claim 1, wherein: the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW format image data and decodes based on the grayscale image data.
4. The optical information collector as claimed in claim 1, wherein: the first fixed frame number of image data is three frames to five frames; the second fixed frame number of image data is one frame or two frames.
5. The optical information collector as claimed in claim 1, wherein: the image data acquired by the image sensor is transmitted directly to the decoding unit for decoding.
6. An optical information acquisition method, comprising:
the central processing unit controls the image sensor to collect and output image data according to a first fixed frame number;
the decoding unit receives and decodes the image data; if decoding is not successful, the image sensor is controlled to continuously acquire and output image data in a digital stream mode and the decoding unit is controlled to sequentially decode the image data, and once the decoding unit decodes successfully, the image sensor is controlled to stop acquiring image data in the digital stream mode;
and the image sensor is controlled to continue to acquire and output image data of a second fixed frame number.
7. The optical information acquisition method as claimed in claim 6, wherein: the image data acquired by the image sensor is stored in a buffer, and the decoding unit takes out the image data from the buffer and decodes the image data.
8. The optical information acquisition method as claimed in claim 6, wherein: the image sensor outputs image data in RAW format, and the decoding unit obtains grayscale image data based on the RAW format image data and decodes based on the grayscale image data.
9. The optical information acquisition method as claimed in claim 6, wherein: the first fixed frame number of image data is three frames to five frames; the second fixed frame number of image data is one frame or two frames.
10. The optical information acquisition method as claimed in claim 6, wherein: the image data acquired by the image sensor is transmitted directly to the decoding unit for decoding.
CN202311503897.5A 2022-08-11 2022-08-11 Optical information collector and method Pending CN117544849A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311503897.5A CN117544849A (en) 2022-08-11 2022-08-11 Optical information collector and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202311503897.5A CN117544849A (en) 2022-08-11 2022-08-11 Optical information collector and method
CN202210963616.3A CN115396572B (en) 2022-08-11 2022-08-11 Optical information collector and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210963616.3A Division CN115396572B (en) 2022-08-11 2022-08-11 Optical information collector and method

Publications (1)

Publication Number Publication Date
CN117544849A true CN117544849A (en) 2024-02-09

Family

ID=84118324

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311503897.5A Pending CN117544849A (en) 2022-08-11 2022-08-11 Optical information collector and method
CN202210963616.3A Active CN115396572B (en) 2022-08-11 2022-08-11 Optical information collector and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210963616.3A Active CN115396572B (en) 2022-08-11 2022-08-11 Optical information collector and method

Country Status (1)

Country Link
CN (2) CN117544849A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115034247B (en) * 2022-08-11 2022-11-08 无锡盈达聚力科技有限公司 Optical information collector and method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7637430B2 (en) * 2003-05-12 2009-12-29 Hand Held Products, Inc. Picture taking optical reader
CN201117008Y (en) * 2005-06-27 2008-09-17 手持产品公司 Bar code reading device, marking reading device and data acquisition device
CN101931756B (en) * 2009-06-19 2012-03-21 比亚迪股份有限公司 Device and method for improving dynamic range of CMOS image sensor
CN106470321B (en) * 2015-08-21 2020-03-31 比亚迪股份有限公司 Image sensor and reading method of image sensor
CN108416239B (en) * 2018-01-31 2019-10-11 深圳盈达信息科技有限公司 A kind of bar code identifies the control method of engine and its reduction power consumption
CN109740393A (en) * 2018-12-06 2019-05-10 无锡盈达聚力科技有限公司 Bar code scanning system and method
CN113542545B (en) * 2021-05-28 2024-03-26 青岛海信移动通信技术有限公司 Electronic device and video recording method

Also Published As

Publication number Publication date
CN115396572B (en) 2023-12-15
CN115396572A (en) 2022-11-25

Similar Documents

Publication Publication Date Title
US8083146B2 (en) Optical code reading system and method for processing multiple resolution representations of an image
US20050263678A1 (en) Image processing apparatus
CN109640007B (en) Artificial intelligence image sensing equipment
CN115426442B (en) Optical information collector and method
CN115396572B (en) Optical information collector and method
US20020125317A1 (en) Optical reader having reduced parameter determination delay
US20090020611A1 (en) Bi-optic imaging scanner with preprocessor for processing image data from multiple sources
CN114036967A (en) Device and method for automatic exposure by adopting double targets
CN109740393A (en) Bar code scanning system and method
CN115034247B (en) Optical information collector and method
JPH0823417A (en) Information input device and method therefor
JP4220883B2 (en) Frame grabber
US20070230941A1 (en) Device and method for detecting ambient light
CN115146664B (en) Image acquisition method and device
US8218023B2 (en) Method and apparatus for processing continuous image data captured by digital image processor
CN110636219A (en) Video data stream transmission method and device
US20100328514A1 (en) Image processing device, imaging apparatus, and thumbnail image displaying method
CN113287295A (en) Image pickup element, image pickup device, method for operating image pickup element, and program
US20050173534A1 (en) Systems and methods for concurrent image capture and decoding of graphical codes
US8319794B2 (en) Image display control apparatus, method for controlling the same, and program
JP4531294B2 (en) Symbol information reader
CN109688314B (en) Camera system and method with low delay, less cache and controllable data output mode
JP5181935B2 (en) Image processing apparatus, program, and subject detection method
JP3996014B2 (en) Image recognition processor
CN206442421U (en) A kind of handset structure gathered for optical label

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination