CN110191261B - Image sensor and module structure notification method for image sensor


Info

Publication number
CN110191261B
Authority
CN
China
Prior art keywords
image sensor
unit
constituent elements
combination
notification
Prior art date
Legal status
Active
Application number
CN201811357897.8A
Other languages
Chinese (zh)
Other versions
CN110191261A (en)
Inventor
伊奈裕史
Current Assignee
Omron Corp
Original Assignee
Omron Corp
Priority date
Filing date
Publication date
Application filed by Omron Corp filed Critical Omron Corp
Publication of CN110191261A publication Critical patent/CN110191261A/en
Application granted granted Critical
Publication of CN110191261B publication Critical patent/CN110191261B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N 1/00538: Modular devices, i.e. allowing combinations of separate components, removal or replacement of components (constructional details of apparatus for scanning, transmission or reproduction of documents or the like)
    • H04N 23/661: Transmitting camera control signals through networks, e.g. control via the Internet
    • G03B 17/14: Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets interchangeably
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N 23/56: Cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • H04N 23/62: Control of parameters via user interfaces
    • H04N 23/663: Remote control of cameras or camera parts for controlling interchangeable camera parts based on electronic image sensor signals
    • H04N 25/70: SSIS (solid-state image sensor) architectures; circuits associated therewith
    • H04N 7/181: Closed-circuit television (CCTV) systems for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Stroboscope Apparatuses (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Structure And Mechanism Of Cameras (AREA)

Abstract

The invention prevents inappropriate data from being applied to a copy-target modular image sensor when a new image sensor is introduced by copying the module configuration of an existing modular image sensor. The image sensor includes: an imaging system configured by combining modularized constituent elements; and a processing section that executes processing using an image acquired by the imaging system. Each constituent element has a memory that stores information for specifying that constituent element, and the image sensor further has: a collecting unit that collects the information for specifying the constituent elements from the memories; a storage unit that stores information indicating a combination of constituent elements of an image sensor and data used when an image sensor having the constituent elements indicated by that combination executes processing; a comparison unit that compares the combination of the constituent elements of the image sensor itself with the combination of constituent elements indicated by the information stored in the storage unit; and a notification unit configured to perform notification based on the comparison result.

Description

Image sensor and module structure notification method for image sensor
Technical Field
The present invention relates to an image sensor used on a manufacturing line of a factory, and more particularly to a modular image sensor configured by combining a plurality of modules and to a method of notifying the module configuration of such an image sensor.
Background
On a factory manufacturing line, a system called an image sensor is often used to automate or reduce the labor required for inspection or management of products. Conventionally, a configuration in which a camera and an image processing apparatus are connected by a cable has generally been adopted (see patent document 1), but recently a processing-integrated image sensor, in which the camera and the image processing apparatus are integrated so that everything from imaging to image processing is performed by a single apparatus, has also been proposed. Such a processing-integrated image sensor is also called a "smart camera", and may further integrate the illumination and the lens.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. 2007-214682
Disclosure of Invention
Problems to be solved by the invention
In order to perform stable inspection using an image sensor, it is desirable to optimize the model, specification, and performance of the illumination, lens, and imaging element according to the imaging environment, the object to be inspected, the purpose, and the like. Therefore, manufacturers of smart cameras have conventionally lined up many products whose illumination, lens, imaging element, and the like differ slightly in model, specification, or performance, so that users can select an optimum specification.
In addition, as the Internet of Things (IoT) spreads through factories, the range of applications of smart cameras is expanding, and it is becoming difficult to provide product variations that cover such diversified user needs. Furthermore, inspection targets now change in short cycles: to differentiate themselves from competitors, manufacturers increasingly offer mass customization tailored to individual customer preferences and season-limited products, while the life cycles of digital products represented by smartphones keep shortening. As a result, there is an increasing demand to partially change and optimize the illumination, lens, and the like in accordance with each inspection. Therefore, in recent years, smart cameras with a so-called modular structure have become known, in which the illumination, lens, and imaging element are individually modularized and can be freely combined on the user side. For example, if five types each of illumination modules, lens modules, and imaging element modules are available, 125 combinations can be realized, and the user can select the combination that matches the required specification.
Such a modular image sensor has the advantage of reducing the number of product variations for manufacturers and of expanding the options and degrees of freedom for users. On the other hand, it may have the following disadvantage.
For example, during maintenance of a modular image sensor, the modules incorporated in it may be exchanged. Also, when a new image sensor is provided by copying the module configuration of an existing modular image sensor, the inspection program used by the copy-source image sensor and data containing setting values such as various parameters (hereinafter referred to as scene data) are applied to the newly provided copy-target image sensor, which is then operated.
In the case of a conventional image sensor, the device configuration is fixed, so no problem arises even if the same scene data is applied before and after maintenance; likewise, if the copy-source and copy-target image sensors are of the same model, no problem arises even if the scene data applied to the copy source is applied to the copy target. In the case of a modular image sensor, however, even if the scene data used before maintenance is applied, it is not guaranteed that an exchanged module will operate normally, nor is it guaranteed that the modules of the copy-source and copy-target image sensors are always identical. Moreover, whether the module configurations match before and after maintenance, or between the copy source and the copy target, has had to be determined by the user visually checking the image sensors. As a result, there is a possibility that inappropriate scene data is applied and causes a problem in the image sensor, or that the image sensor is operated with inappropriate scene data applied, so that it runs without performing the inspection intended by the user.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique for supporting application of appropriate data corresponding to a module configuration in a modular type image sensor.
Means for solving the problems
A first aspect of the present invention provides an image sensor, comprising: an imaging system configured by combining a plurality of modularized components; and a processing section that performs processing using an image acquired by the imaging system, the plurality of constituent elements of the image sensor each having a nonvolatile memory that stores information for specifying the constituent element, the image sensor further including: a collection unit that collects information for specifying the constituent elements from the memories included in the plurality of constituent elements; a storage unit that stores information indicating a combination of constituent elements of an image sensor and data used when processing is executed by the image sensor having the constituent elements indicated by the combination; a comparison unit that compares a combination of its own constituent elements identified by the information collected by the collection unit with a combination of constituent elements of the image sensor indicated by the information stored in the storage unit; and a notification unit configured to perform notification based on a comparison result of the comparison unit.
Accordingly, in an image sensor having modularized constituent elements, the result of comparing the combination of constituent elements associated with the data stored in the image sensor (that is, the combination for which the data is intended) with the current combination of constituent elements can be notified, so the possibility of a problem occurring during processing because of a difference in the constituent elements of the image sensor can be expected to decrease. Further, for example, when a copy line of a manufacturing line in which a modular image sensor is used is newly set up, the configuration of the copy-target image sensor can be made to match that of the copy-source image sensor, based on a notification of the result of comparing their configurations, before the copy line is operated.
The notification unit may notify that the combinations of constituent elements do not match when the comparison result indicates that the combination of the constituent elements of the image sensor identified by the information collected by the collection unit does not match the combination of the constituent elements of the image sensor indicated by the information stored in the storage unit. The image sensor may further include an operation setting unit that sets an operation related to the notification by the notification unit when these combinations do not match. Here, the operation setting unit may set the operation for each timing of the processing performed by the image sensor. Further, the operation setting unit may set, as the operation, the output method of the notification by the notification unit. The operation setting unit may also set, as the operation, a method of using the data by the processing unit. The image sensor may further include a data setting unit that sets the data used when the image sensor executes processing, based on the comparison result of the comparison unit. The data setting unit may set the data as the data to be used by the processing unit for processing when the comparison result of the comparison unit indicates that the combination of the constituent elements of the image sensor identified by the information collected by the collection unit matches the combination of the constituent elements of the image sensor indicated by the information stored in the storage unit.
For example, when adjustment work is being performed to match each constituent element of the copy-target image sensor with each constituent element of the copy-source image sensor, if the notification unit issued a notification every time a constituent element is changed, unnecessary work such as confirming each notification could arise. With the above configuration, however, the notification by the notification unit can be controlled, for example so that such notifications are ignored, and the operation of the notification unit can be customized so that notification is performed only when the user considers it necessary.
The plurality of constituent elements may include: an illumination unit configured to illuminate a subject; a lens unit that images an optical image of the subject; and an imaging unit that generates an image based on the optical image. This is because various specifications of imaging systems can be configured by changing the combination of the illumination unit, the lens unit, and the imaging unit.
The information stored in the storage unit may be information indicating a combination of a plurality of constituent elements included in another image sensor, and the data stored in the storage unit may be data used when the another image sensor executes processing.
Further, according to another aspect of the present application, there is provided a module configuration notification method for an image sensor, the image sensor including: an imaging system configured by combining a plurality of modularized constituent elements; and a processing section that performs processing using an image acquired by the imaging system, the plurality of constituent elements each having a nonvolatile memory storing information for specifying the constituent element, the image sensor further including a storage unit that stores information indicating a combination of constituent elements of an image sensor and data used when the image sensor having the constituent elements indicated by that combination executes processing. In the method, a collection unit of a computer collects, from the image sensor, the information for specifying the constituent elements stored in the respective memories of the constituent elements; a comparison unit of the computer compares, based on the information collected by the collection unit, the combination of the constituent elements of the image sensor with the combination of the constituent elements indicated by the information stored in the storage unit; and a notification unit of the computer performs notification based on the comparison result of the comparison unit.
Advantageous Effects of Invention
According to the present invention, it is possible to provide a technique for preventing inappropriate data from being applied to a copy-target modular image sensor when a new image sensor is introduced by copying the module configuration of an existing modular image sensor.
Drawings
Fig. 1 is a diagram schematically showing a transfer example of a manufacturing line using an image sensor.
Fig. 2A is a perspective view schematically showing an external appearance of the image sensor, and fig. 2B is a perspective view schematically showing a state in which the image sensor is disassembled.
Fig. 3 is a block diagram schematically showing the structure of the image sensor.
Fig. 4 is a diagram showing an example of use of the image sensor.
Fig. 5 is a diagram showing an example of a storage area of a Read Only Memory (ROM) of the image sensor.
Fig. 6 is a flowchart showing an example of processing performed by the processing section of the image sensor.
Fig. 7 is a diagram showing an example of a screen for setting the operation of an error notified by the processing unit of the image sensor.
Description of the symbols
100. 200: image sensor with a plurality of pixels
300: managing computer
101. 201: illumination unit
102. 202: lens part
103. 203: image pickup unit
104. 204: treatment section
105. 205: input/output I/F
106. 206: sensor body
107: lighting module memory
108: lens module memory
109: shooting module memory
Detailed Description
Hereinafter, an image sensor according to an embodiment of the present invention will be described with reference to the drawings. However, the embodiments described below are examples of the image sensor, and are not limited to the configurations described below.
< application example >
First, an example of a scenario to which the present invention is applied will be described. The embodiments described below assume a case in which a new copy line is set up as a copy of a manufacturing line that uses a modular image sensor. As shown in fig. 1, in the present embodiment, the modular image sensor 100 used in the manufacturing line 10 is copied to the image sensor 200 of the manufacturing line 20, which is a copy of the manufacturing line 10. The number of copied image sensors is not limited to one. Here, "copying an image sensor" means, for example, matching the modules incorporated in the copy-target image sensor and the data it uses for processing with the modules incorporated in the copy-source image sensor and the data it uses for processing, so that the copy-target image sensor performs the same inspection as the copy-source image sensor. The items matched during copying are not limited to the modules and the data used for processing; other elements of the image sensor may also be matched.
In the present embodiment, the image sensors 100 and 200 are connected to the management computer 300 via a Factory Automation (FA) network such as EtherCAT, for example. The user operates the management computer 300 to transmit and receive various data including the module configuration data and scene data described below to and from the image sensors 100 and 200.
The image sensor 100 of the present embodiment is a so-called processing-integrated image sensor having a modular structure. As illustrated in fig. 2, the illumination unit 101, the lens unit 102, and the imaging unit 103, which are the components of the imaging system, are modularized, and the user can combine these modules arbitrarily according to the intended use of the image sensor 100. Each module (the illumination unit 101, the lens unit 102, and the imaging unit 103) is provided with a nonvolatile memory 107, 108, or 109. The memories 107, 108, and 109 store, for example, model information and individual information written at the time of factory shipment. Further, the user can write arbitrary information (user data) into the memories 107, 108, and 109. The processing unit 104 (see fig. 3) of the sensor body 106 can read and write information from and to the memories 107, 108, and 109 of the respective modules.
By providing each module with a nonvolatile memory and storing module-specific information in it so that it can be referred to, the combination of modules constituting the image sensor 100 can easily be verified by, for example, the image sensor 100 (the processing unit 104) itself or by an external computer. This facilitates management of an image sensor having a modular structure.
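As a concrete illustration of how such per-module identification data might be gathered, the following minimal Python sketch models a collection unit reading each module memory and assembling module configuration data; the `ModuleInfo` class, the `read_factory_area()` accessor, and the field names are illustrative assumptions, since the patent does not define any programming interface.

```python
from dataclasses import dataclass

@dataclass
class ModuleInfo:
    """Identifying data assumed to be held in a module's factory area."""
    slot: str           # e.g. "illumination", "lens", "imaging"
    model_number: str
    serial_number: str
    lot_number: str
    hw_version: str

def collect_module_configuration(module_memories: dict) -> list[ModuleInfo]:
    """Read identifying information from every attached module memory.

    `module_memories` maps a slot name to an object exposing a hypothetical
    `read_factory_area()` method; on the real sensor this would correspond to
    reading the EEPROMs 107, 108, and 109.
    """
    configuration = []
    for slot, memory in sorted(module_memories.items()):
        factory = memory.read_factory_area()
        configuration.append(ModuleInfo(
            slot=slot,
            model_number=factory["model_number"],
            serial_number=factory["serial_number"],
            lot_number=factory["lot_number"],
            hw_version=factory["hw_version"],
        ))
    return configuration
```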
In the present embodiment, when a copy line of a manufacturing line on which inspection is performed by an image sensor is newly set up, an error can be notified if the modules incorporated in the copy-source image sensor do not match the modules incorporated in the copy-target image sensor.
< Structure of image sensor >
An image sensor according to an embodiment of the present invention will be described with reference to fig. 2A, 2B, 3, and 4. Fig. 2A is a perspective view schematically showing the appearance of the image sensors 100 and 200, and fig. 2B is a perspective view schematically showing a state in which the image sensors 100 and 200 are disassembled. Fig. 3 is a block diagram schematically showing the configuration of the image sensors 100 and 200. Fig. 4 is a diagram showing an example of use of the image sensors 100 and 200.
The image sensors 100 and 200 are devices that are installed, for example, in a manufacturing line of a factory and execute various kinds of processing using images. The image sensors 100 and 200 are also called vision sensors, vision systems, and the like. The image sensors 100 and 200 of the present embodiment are processing-integrated image sensors (so-called smart cameras) in which an imaging system and a processing system are integrated.
The image sensors 100 and 200 each include an illumination unit 101 or 201, a lens unit 102 or 202, and an imaging unit 103 or 203 as the imaging system. The illumination units 101 and 201 are devices for illuminating the subject (the inspection object or the like) within the field of view of the image sensors 100 and 200, and include, for example, a plurality of light emitting elements (Light Emitting Diodes (LEDs) or the like) arranged around the lens units 102 and 202. The lens units 102 and 202 are optical systems for forming an optical image of the subject on the imaging units 103 and 203; for example, optical systems having functions such as focusing, aperture adjustment, and zooming are used. The imaging units 103 and 203 are devices that generate and output image data by photoelectric conversion, and include an imaging element such as a Charge Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) sensor.
Further, the image sensors 100 and 200 include processing units 104 and 204 and input/output I/Fs 105 and 205 as the processing system. The processing units 104 and 204 are devices that perform processes such as the following: image processing (for example, preprocessing and feature amount extraction) of image data captured from the imaging system; various processes based on the image processing result (for example, inspection, character recognition, and individual recognition); transmission and reception of data to and from external devices via the input/output I/Fs 105 and 205; generation of data to be output to external devices; processing of data received from external devices; and control of the imaging system and the input/output I/Fs 105 and 205.
The processing units 104 and 204 include, for example, processors 104a and 204a, Read Only Memories (ROMs) 104b and 204b, and Random Access Memories (RAMs) 104c and 204c, respectively. The processors 104a and 204a load programs stored in the ROMs 104b and 204b into the RAMs 104c and 204c and execute them to realize the various processes described above and the processes described below. Some or all of the functions of the processing units 104 and 204 may be realized by an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or the like, or may be provided from an external device. The input/output I/Fs 105 and 205 are communication interfaces for transmitting and receiving data to and from external devices. For example, the input/output I/Fs 105 and 205 may include a network interface for connecting to a Programmable Logic Controller (PLC) or a management terminal (computer), a parallel interface for connecting to other sensors or controllers, and the like.
The image sensors 100 and 200 of the present embodiment have a modular structure: as shown in fig. 2B, three modules, namely the illumination units 101 and 201, the lens units 102 and 202, and the imaging units 103 and 203, are selected and assembled to the sensor bodies 106 and 206. The illumination module may also be omitted. Each module is fixed to the sensor body 106 or 206 by, for example, screw fastening, and the user can freely attach and detach the modules.
As the illumination units (illumination modules) 101 and 201, various modules are prepared, for example modules that differ in the wavelength of the illumination light, the arrangement of the light emitting elements, the amount of light, or the light emission pattern. An illumination module in which a plurality of light sources (LEDs or the like) of red, blue, green, infrared, and so on are provided in one module may also be used; by controlling the light emission of each light source, light of wavelengths other than red, blue, green, and infrared (for example, white, violet, or pink) can be emitted. Such illumination is called multicolor illumination or the like. As the lens units (lens modules) 102 and 202, a plurality of modules are prepared, for example modules that focus manually or automatically using an actuator, modules with different fields of view such as a narrow field and a wide field, and modules with a zoom function. As the imaging units 103 and 203, various modules are prepared that differ, for example, in the number of pixels, the frame rate, the shutter system (rolling shutter/global shutter), or the use of color or monochrome elements. The user can combine appropriate modules according to the intended use or required specifications of the image sensors 100 and 200.
Each module incorporates a nonvolatile memory. Specifically, as shown in fig. 3, the illumination units 101 and 201 have illumination module memories 107 and 207, the lens units 102 and 202 have lens module memories 108 and 208, and the imaging units 103 and 203 have imaging module memories 109 and 209. Hereinafter, these are collectively referred to as "module memories". As the module memory, for example, an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Ferroelectric Random Access Memory (FeRAM), a Magnetoresistive Random Access Memory (MRAM), or the like can be used, and the data capacity is arbitrary. In this embodiment, an EEPROM with a capacity of about several kilobytes to several tens of megabytes is used.
Two write areas, a "factory area" and a "user area", may be provided in the module memory. The factory area is an area into which the manufacturer writes data at the time of shipment of the module. The user can read the data in the factory area but cannot rewrite or delete it. The factory area holds, for example, model information of the module (model name, model number, etc.) and individual information (serial number, lot number, hardware version, etc.). Set values and correction parameters used when driving the module, or individual-difference information of the module (for example, data measured by inspection at the time of factory shipment), may also be stored in the module memory. For example, in the case of an illumination unit, illumination control set values (control method, voltage, duty, delay, block-lighting method, and the like), luminance/color differences of each light source, optical axis information, and the like may be stored. In the case of a lens unit, lens/focus set values (a focus initial reference value, etc.), the presence or absence of an autofocus (AF) function, the focal length, the angle of view, the F-number, the amount of distortion, the optical axis, and the like may be stored. In the case of an imaging unit, camera set values (initial values of the imaging element, etc.), pixel defect correction data, vertical line correction data, white balance initial values, and the like may be stored. The user area, on the other hand, is an area the user can rewrite and use freely. For example, arbitrary information such as information specifying the installation location of the image sensor (factory, manufacturing line), the purchase or maintenance date of the module, and the usage status of the module may be stored there. These are only examples; any data may be stored in the module memory as long as it is useful for the management or operation of the image sensors 100 and 200.
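The division into a write-protected factory area and a freely rewritable user area can be pictured with a small sketch along the following lines (Python); the field names and the write-guard behaviour are illustrative assumptions chosen to mirror the description above, not a definition of the actual memory layout.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FactoryArea:
    """Written at shipment; readable but not rewritable by the user."""
    model_name: str
    model_number: str
    serial_number: str
    lot_number: str
    hw_version: str
    calibration: dict = field(default_factory=dict)  # e.g. focus reference, white balance

@dataclass
class UserArea:
    """Freely rewritable by the user."""
    installation_site: str = ""                 # factory / manufacturing line
    purchase_date: str = ""
    maintenance_notes: list = field(default_factory=list)

@dataclass
class ModuleMemory:
    factory: FactoryArea   # frozen dataclass: writes raise an error
    user: UserArea

    def write_user(self, **updates):
        # Only the user area accepts writes; the factory area stays immutable.
        for key, value in updates.items():
            setattr(self.user, key, value)
```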
The image sensors 100 and 200 can be used for various purposes, for example image recording of an inspection object, shape recognition, edge detection, width/count measurement, area measurement, acquisition of color features, labeling, segmentation, object recognition, bar code or two-dimensional code reading, optical character recognition (OCR), and individual recognition. Fig. 4 shows an example in which the image sensor 100 captures an image of a product 501 flowing on a conveyor 500 of the manufacturing line 10 and the appearance of the product 501 is inspected.
< Example of error notification of the module structure of the image sensor >
First, the storage areas of the ROMs 104b and 204b of the image sensors 100 and 200 in the present embodiment will be described with reference to fig. 5. The ROMs 104b and 204b store communication setting data related to various communication settings, such as the Internet Protocol (IP) address used when the image sensors 100 and 200 communicate with other devices in the manufacturing lines 10 and 20. The ROMs 104b and 204b also store signal setting data related to the information transmitted through the various signal lines of the image sensors 100 and 200. In addition, the ROMs 104b and 204b store module configuration data, that is, data relating to model information such as the model numbers of the respective modules incorporated in the image sensors 100 and 200. Finally, the ROMs 104b and 204b store the data (scene data) used for execution of an inspection, such as the inspection program executed by the image sensors 100 and 200 and the various parameters used in that program. The ROM 204b is an example of a storage unit that stores information indicating a combination of the constituent elements of an image sensor and data used when that image sensor executes processing.
In the present embodiment, since the image sensor 100 is already in operation in the manufacturing line 10, the ROM 104b stores module configuration data and scene data. On the other hand, in the manufacturing line 20, which is the copy of the manufacturing line 10, the image sensor 200 has not yet started operation, so the ROM 204b of the image sensor 200 does not yet store module configuration data and scene data. Accordingly, the image sensor 200 acquires the module configuration data and scene data stored in the image sensor 100 and stores them in the ROM 204b. The module configuration data stored in the image sensor 100 is an example of information indicating a combination of a plurality of constituent elements included in another image sensor, and the scene data stored in the image sensor 100 is an example of data used when that other image sensor executes processing.
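The relationship between the two ROMs and the transfer of the reference data can be sketched as follows (Python); `SensorRom`, its fields, and the plain in-memory copy are illustrative stand-ins for the FA-network transfer, possibly relayed by the management computer 300, and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SensorRom:
    """Simplified view of the storage areas described for ROM 104b / 204b."""
    communication_settings: dict = field(default_factory=dict)  # e.g. IP address
    signal_settings: dict = field(default_factory=dict)         # signal line assignments
    module_configuration: Optional[list] = None                 # model numbers of the attached modules
    scene_data: Optional[dict] = None                           # inspection program + parameters

def copy_reference_data(source: SensorRom, target: SensorRom) -> None:
    """Store the copy source's module configuration and scene data in the target ROM.

    In the embodiment this transfer goes over the FA network (possibly relayed
    by the management computer 300); here it is reduced to an in-memory copy.
    """
    target.module_configuration = list(source.module_configuration or [])
    target.scene_data = dict(source.scene_data or {})
```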
Next, a process executed by the processing unit 204 of the image sensor 200 will be described with reference to a flowchart illustrated in fig. 6. For example, when the power of the image sensor 200 is turned on, the processing unit 204 of the image sensor 200 starts the processing of the flowchart shown in fig. 6.
In OP101, the processing unit 204 functions as a collection unit that collects data relating to the modules from the module memories of the respective modules of the image sensor 200. The data relating to each module is an example of information for specifying a constituent element. For example, the processing unit 204 collects, as the data relating to a module, the model information of the module and its serial number, lot number, and hardware version stored in the factory area of the module memory. The processing unit 204 temporarily stores the data collected from the modules in the RAM 204c as module configuration data. The processing unit 204 then advances the process to OP102.
In OP102, the processing unit 204 acquires the module configuration data and scene data stored in the ROM 104b of the image sensor 100. The processing unit 204 may communicate with the image sensor 100 to acquire these data, or the management computer 300 may acquire them from the image sensor 100 and the processing unit 204 may then acquire them from the management computer 300. When the module configuration data and scene data of the image sensor 100 have been acquired, the processing unit 204 stores the acquired data in the ROM 204b. The processing unit 204 then advances the process to OP103.
In OP103, the processing unit 204 performs a consistency check to determine whether the data acquired in OP102 can be applied to the image sensor 200, for example whether the acquired module configuration data is valid for the image sensor 200 and whether the acquired scene data is executable by the image sensor 200. The processing unit 204 then advances the process to OP104.
In OP104, the processing unit 204 determines, based on the check in OP103, whether the data can be applied to the image sensor 200. If the data can be applied, that is, if the consistency of the data is normal (OP104: Yes), the processing unit 204 advances the process to OP105. If the data cannot be applied, that is, if there is a consistency abnormality (OP104: No), the processing unit 204 advances the process to OP108.
In OP105, the processing unit 204 functions as a comparison unit that compares the information on the module configuration indicated by the module configuration data of the image sensor 200 collected in OP101 (for example, the model information of each module) with the information on the module configuration indicated by the module configuration data of the image sensor 100 acquired in OP102. The processing unit 204 then advances the process to OP106.
In OP106, the processing unit 204 determines, based on the comparison in OP105, whether the module configurations of the image sensor 100 and the image sensor 200 match. If the module configurations match (OP106: Yes), the processing unit 204 advances the process to OP107. If they do not match (OP106: No), the processing unit 204 advances the process to OP109.
In OP107, since the determination in OP106 shows that the modules mounted on the image sensor 100 match the modules mounted on the image sensor 200, the processing unit 204 applies the scene data acquired in OP102 to the image sensor 200. Applying the scene data means, for example, that the processing unit 204, functioning as a data setting unit, loads the scene data acquired in OP102 into the RAM 204c and sets it as the data to be used for processing. For example, the processing unit 204 makes the settings needed so that the inspection executed in the image sensor 100 using the scene data is also executed in the image sensor 200. These settings may include settings for each module incorporated in the image sensor 200. When the processing of OP107 is completed, the processing unit 204 ends the processing of this flowchart.
In OP108, the processing unit 204 notifies the management computer 300 of an error indicating that the data acquired in OP102 cannot be applied to the image sensor 200. When it receives this notification from the image sensor 200, the management computer 300 displays a message or the like notifying the user of the error on its monitor. After the notification, the processing unit 204 ends the processing of this flowchart.
In OP109, the processing unit 204 functions as a notification unit that notifies the management computer 300 of an error indicating that the module configuration of the image sensor 100 differs from the module configuration of the image sensor 200. The notification may include information indicating which module differs or information indicating the configuration of each module. When it receives this notification from the image sensor 200, the management computer 300 displays a message or the like notifying the user of the error on its monitor. After the notification, the processing unit 204 ends the processing of this flowchart.
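Taken together, OP101 through OP109 amount to the decision logic sketched below (Python); the helper names, the `ModuleInfo`-style objects with `slot` and `model_number` attributes, and the reduction of the notification to a callback are assumptions made for illustration, not part of the patent.

```python
def startup_check(own_modules, source_config, source_scene, apply_scene, notify):
    """Sketch of the flow in fig. 6.

    own_modules   -- module configuration collected from the local module memories (OP101)
    source_config -- module configuration data acquired from the copy source (OP102)
    source_scene  -- scene data acquired from the copy source (OP102)
    apply_scene   -- callback that loads the scene data into RAM and applies it (OP107)
    notify        -- callback that reports an error to the management computer (OP108/OP109)
    """
    # OP103/OP104: consistency check - can the acquired data be applied at all?
    if not is_applicable(source_config, source_scene):
        notify("error: acquired data cannot be applied to this image sensor")  # OP108
        return False

    # OP105/OP106: compare module configurations, e.g. by model number per slot.
    own = {m.slot: m.model_number for m in own_modules}
    src = {m.slot: m.model_number for m in source_config}
    if own != src:
        mismatched = [slot for slot in sorted(own.keys() | src.keys())
                      if own.get(slot) != src.get(slot)]
        notify(f"error: module configuration differs from copy source: {mismatched}")  # OP109
        return False

    apply_scene(source_scene)  # OP107: configurations match, so apply the scene data
    return True

def is_applicable(source_config, source_scene) -> bool:
    # Placeholder for the OP103 consistency check; the real criteria are sensor-specific.
    return bool(source_config) and source_scene is not None
```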
As described above, according to the present embodiment, when the modules of a modular image sensor are copied to configure a new image sensor, the copy-target image sensor can be put into use after its modules have been matched with the modules of the copy-source image sensor. In the example above, when a copy line of a manufacturing line that uses a modular image sensor is newly set up, the user can know, before operating the copy line, whether the configurations of the assembled modules differ between the copy-source and copy-target image sensors. Since the copy line can thus be operated only after the modules of its image sensor have been matched with those of the copy-source image sensor, the previously feared problem, in which the image sensor runs without performing the inspection intended by the user until the user notices it, is unlikely to occur in the copy line.
In the present embodiment, an error is displayed by the processing of OP106 and OP109 whenever the module configuration differs between the copy-source and copy-target image sensors. However, if the error were displayed every time a module of the copy-target image sensor is adjusted, the efficiency of the adjustment work could suffer. Likewise, if the processing performed when an error is displayed were fixed (for example, always initializing the settings contained in the scene data or always displaying a setting screen on the monitor of the management computer 300), unnecessary initialization or setting work could be generated. Therefore, in the present embodiment, as an example, the user can operate the management computer 300 to configure the notification operation for each error that can occur in the above-described processing.
Fig. 7 shows an example of a setting screen 400 for configuring the error notification processing. The screen is displayed on the monitor of the management computer 300. The setting screen 400 includes: a setting list display area 401 that displays a list of the various settings of the image sensors 100 and 200; an error list display area 402 that displays, as a list, the errors notified under the settings in the setting list; and an error operation setting display area 403 that displays the contents of the error operation settings, such as the display and processing performed when each error in the error list occurs.
In the present embodiment, the user selects "error operation setting" from the setting list. When "error operation setting" is selected, the errors for which the user can set an error operation are displayed in the error list display area 402. In the example of fig. 7, the user has selected the "module assembly error (photographing element)" error with error number "100". As a result, the setting contents of the error operation are displayed in the error operation setting display area 403.
In the example of fig. 7, the name of the error is displayed in the "error type" column 404 of the error operation setting display area 403. The "error output" column 405 displays settings such as whether, when the error shown in the "error type" column 404 occurs during the error notification processing, a notification is sent from the image sensor 200 to an external device such as the management computer 300, and whether notification is given by lighting an error notification LED (not shown) of the image sensor 200. In the example of fig. 7, the user selects the "present" radio button 406 to enable notification to the management computer 300 when the error occurs, and selects the "absent" radio button 407 to disable it.
As shown in fig. 7, when the user selects the "present" radio button 406 in the "error output" column 405, the setting contents of the error operation are displayed in the "detailed setting" column 408. In the example of fig. 7, the items "timing", "UI", "parallel IO", "display lamp", and "communication module" are displayed in the "detailed setting" column 408.
In the item "timing", timing for notifying an error displayed in the "error type" column 404 is displayed. As an example, the timing of the processing performed by the image sensor 200 may be set when the image sensor 200 is powered ON ("power ON (ON) in the figure"), when scene data is read into the RAM 204c in the image sensor 200 ("data loading" in the figure), or when an inspection program is executed in the image sensor 200 and an inspection is performed ("measurement" in the figure). In addition, in the item "timing", the timing of acquiring the scene data from the ROM204b, the timing of acquiring the scene data from an external device of the image sensor 200, or the like may be displayed as the respective timings. Thus, the presence or absence of the error notification can be flexibly set according to the acquisition path of the scene data in the image sensor 200.
In the item "UI", a display form of an error is displayed, the display form of the error representing: how to display an error at each timing of the error notification displayed in the item "timing". For example, as setting examples selectable in the item "UI", there are "dialog (dialog)", "color display", "no notification".
The setting of the "dialog" is a setting in which, when an error occurs, a dialog box (dialog box) for notifying that an error has occurred is displayed on the monitor of the management computer 300, and is a notification method in which, when an error occurs, the user is required to perform some operation, for example, close the dialog box, in order to continue the processing of the image sensor 200. The "color display" is a setting in which, when an error occurs, a message or the like notifying that an error has occurred is displayed in color on the monitor of the management computer 300, and is a notification method in which, when an error occurs, the error is notified, for example, a target error is highlighted in a log (log) or the like storing an error occurrence history, or a text (text) message or the like is displayed on the monitor of the management computer 300, but the processing of the image sensor 200 is continued. The setting of "not notifying" is a notification method in which when an error occurs, the error is not notified and the processing of the image sensor 200 is continued.
In the item "parallel IO", a setting is displayed as to whether or not an error signal (ON or OFF in the drawing) of an error displayed in the "error type" column 404 is output from the signal line of the image sensor 200. In the item "display lamp", a setting is displayed as to whether or not the error notification LED (not shown) of the image sensor 200 is turned ON (ON or OFF in the drawing) when the error displayed in the "error type" field 404 occurs. In the item "communication module", a setting is displayed as to whether or not an error bit (ON or OFF in the drawing) is output from the image sensor 200 to the FA network when an error displayed in the "error type" field 404 occurs.
For example, the settings selectable for the items "UI", "parallel IO", "display lamp", and "communication module" are displayed in drop-down lists. The user sets the various output methods for the error shown in the "error type" column 404 by selecting the desired setting for each item from the drop-down list for each timing displayed in the item "timing".
The error operation setting display area 403 also provides a "clear process" field 409, which is used to set the processing to be executed by the image sensor 200 and/or the management computer 300 when the error shown in the "error type" column 404 occurs. In the example of fig. 7, the user can select one of four ways of using the scene data from the drop-down list in the "clear process" field 409: "ignore", "initialize", "convert", and "go to setting screen".
Here, "ignore" is a usage method that keeps using all of the current settings (parameters and the like) that caused the error shown in the "error type" column 404. "Initialize" is a usage method that initializes the current settings that caused the error. "Convert" is a usage method that keeps the usable part of the current settings that caused the error while initializing the part that cannot be used; which part of the settings is kept and which part is initialized can be set separately for each error by the user operating the management computer 300 or the like. "Go to setting screen" is a usage method that, when the error shown in the "error type" column 404 occurs, displays on the monitor of the management computer 300 a setting screen for specifying the processing to be executed; in this case, each time the error occurs, the user can specify and execute an appropriate process for it on the displayed setting screen.
In this way, the user can operate the management computer 300 to set the error operations for the errors notified in the processing of OP101 to OP109. Thus, when the module configuration of the image sensor 200 is being adjusted in the manufacturing line 20, which is the copy of the manufacturing line 10, the processing unit 204 of the image sensor 200 functions as an operation setting unit and applies the error operation settings made on the setting screen 400, so that customized error notification can be performed in OP101 to OP109, for example notifying the errors the user considers necessary while not notifying the errors the user considers unnecessary. As a result, a reduction in the work caused by the unnecessary error notifications that have conventionally occurred can be expected.
< Others >
The above embodiments are merely configuration examples for illustrating the present invention. The present invention is not limited to the specific embodiments described above, and various modifications can be made within the scope of its technical idea. For example, although the processing system and the imaging system are configured integrally in the image sensors 100 and 200 described above, the processing system and the imaging system may instead be configured as separate bodies connected by a wired cable or the like. In the above embodiment, three modules, namely the illumination unit, the lens unit, and the imaging unit, are given as examples, but the constituent elements incorporated in the image sensor are not limited to these. For example, an optical filter, the input/output I/F, the processing unit (processor or memory), a display, and the like may also be modularized. As supply forms of the smart camera, there are a form in which the modules are supplied separately and assembled on the user side, and a form in which the illumination module or the lens module is supplied already incorporated in the sensor body. The latter form has the advantage that the image sensor can be introduced more easily because no adjustment of optical conditions or the like is required on the user side.
The management computer 300 may function as a collection unit, a comparison unit, and a notification unit, and execute the processing of the flowchart. For example, the management computer 300 collects the module configuration data or scene data stored in the respective ROMs 104b, 204b, and information on each assembled module from the image sensors 100, 200 connected to the management computer 300, and compares the module configurations based on the collected information. Further, the management computer 300 may notify the user of an error relating to an image sensor whose comparison result indicates that the module configuration is inconsistent via the monitor of the management computer 300, or instruct the application of scene data to an image sensor whose module configuration is consistent. Accordingly, the management computer 300 can verify the module configurations of the plurality of image sensors in a unified manner, and thus management of the module configurations of the image sensors becomes easier.
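If the management computer 300 takes over the collection, comparison, and notification roles in this way, a unified verification across several image sensors might look like the following sketch (Python); `fetch_module_configuration` and `report` are hypothetical stand-ins for reading each sensor's data over the FA network and for displaying messages on the management computer's monitor.

```python
def verify_copies(source_sensor, target_sensors, fetch_module_configuration, report):
    """Compare every copy-target sensor's module configuration against the copy source.

    fetch_module_configuration(sensor) is assumed to return a dict mapping
    slot name (illumination / lens / imaging) to model number; report(msg)
    stands in for displaying a message on the management computer's monitor.
    """
    reference = fetch_module_configuration(source_sensor)
    consistent = []
    for sensor in target_sensors:
        config = fetch_module_configuration(sensor)
        if config == reference:
            consistent.append(sensor)
        else:
            diffs = sorted(slot for slot in reference.keys() | config.keys()
                           if reference.get(slot) != config.get(slot))
            report(f"{sensor}: module configuration differs from copy source in {diffs}")
    return consistent  # sensors that are safe to receive the copy source's scene data
```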
< computer-readable recording Medium >
A program that causes a computer, another machine, or a device (hereinafter referred to as a computer or the like) to implement a tool, an operating system (OS), or the like for setting the computer, the image sensor, or the like can be recorded on a recording medium readable by the computer or the like. The functions described above can be provided by causing the computer or the like to read and execute the program from the recording medium.
Here, a computer-readable recording medium is a recording medium that can store information such as data and programs by electrical, magnetic, optical, mechanical, or chemical action and can be read by a computer or the like. Examples of such recording media that can be removed from the computer include a flexible disk, a magneto-optical disk, a Compact Disc Read Only Memory (CD-ROM), a Compact Disc Rewritable (CD-R/W), a Digital Versatile Disc (DVD), a Blu-ray Disc, a Digital Audio Tape (DAT), an 8 mm magnetic tape, and a memory card such as a flash memory. A hard disk, a ROM, and the like are recording media fixed to the computer or the like.
< Supplementary note >
An image sensor 200, comprising:
an imaging system 201, 202, 203 configured by combining a plurality of modularized constituent elements; and
a processing unit 204 that executes processing using an image acquired by the imaging system 201, 202, 203,
the image sensor 200 is characterized in that,
the plurality of constituent elements have nonvolatile memories 207, 208, and 209, respectively, and the nonvolatile memories 207, 208, and 209 store information for specifying the constituent elements,
the image sensor 200 further includes:
a collecting unit 204 that collects information for specifying the constituent elements from the memories 207, 208, and 209 included in the plurality of constituent elements;
a storage unit 204b that stores information indicating a combination of constituent elements of the image sensor and data used when the image sensor executes processing;
a comparison unit 204 that compares the combination of the constituent elements of the image sensor identified by the information collected by the collection unit 204 with the combination of the constituent elements of the image sensor indicated by the information stored in the storage unit 204 b; and
and a notification unit 204 for performing notification based on the comparison result of the comparison unit 204.
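The supplementary note above summarizes a collect/compare/notify flow inside a single sensor. The minimal sketch below traces that flow under simplifying assumptions: reading a module's nonvolatile memory is mocked with a dictionary lookup, and the function names and module IDs are illustrative only, not the firmware interfaces of the image sensor 200.

def collect_module_ids(modules):
    """Collection unit: read the identifying information from each module's memory (mocked)."""
    return {slot: memory["model_id"] for slot, memory in modules.items()}

def combination_matches(collected, stored):
    """Comparison unit: check the collected combination against the stored one."""
    return collected == stored

def notify(matches):
    """Notification unit: report the comparison result to the user."""
    if matches:
        print("module configuration matches the stored combination")
    else:
        print("module configuration mismatch: check the assembled modules")

modules = {
    "illumination": {"model_id": "ILL-A"},
    "lens": {"model_id": "LENS-B"},
    "imaging": {"model_id": "IMG-C"},
}
stored_combination = {"illumination": "ILL-A", "lens": "LENS-B", "imaging": "IMG-C"}

notify(combination_matches(collect_module_ids(modules), stored_combination))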

Claims (8)

1. An image sensor, comprising:
an imaging system configured by combining a plurality of modularized constituent elements; and
a processing unit that executes processing using an image acquired by the imaging system,
the image sensor is characterized in that,
the plurality of constituent elements each have a nonvolatile memory that stores information for specifying the constituent element,
the image sensor further includes:
a collection unit that collects information for specifying the constituent elements from the memories included in the plurality of constituent elements;
a storage unit that stores information indicating a combination of constituent elements of an image sensor and data used when processing is executed by the image sensor having the constituent elements indicated by the combination;
a comparison unit that compares a combination of its own constituent elements identified by the information collected by the collection unit with a combination of constituent elements of the image sensor indicated by the information stored in the storage unit; and
a notification unit configured to perform notification based on a comparison result of the comparison unit,
the notification unit notifies that the combination of the constituent elements does not match when the comparison result indicates that the combination of the constituent elements of the image sensor identified by the information collected by the collection unit does not match the combination of the constituent elements of the image sensor indicated by the information stored in the storage unit,
the image sensor further includes an operation setting unit that is operated by a user to set an operation related to the notification by the notification unit when a combination of constituent elements determined from the information collected by the collection unit does not match a combination of constituent elements of the image sensor indicated by the information stored in the storage unit,
the notification unit operates in accordance with the operation set by the operation setting unit,
the user operates the operation setting unit to set the operation for each timing of the process performed by the image sensor.
2. The image sensor of claim 1,
the operation setting unit sets an output method of the notification to the operation.
3. The image sensor of claim 1,
the operation setting unit sets a method of using the data by the processing unit as the operation.
4. The image sensor of claim 1,
the image sensor further includes a data setting unit that sets data used when the image sensor executes processing based on a comparison result of the comparison unit.
5. The image sensor of claim 4,
the data setting unit sets the data as data to be used by the processing unit for processing when a comparison result of the comparison unit indicates that a combination of the constituent elements of the image sensor identified by the information collected by the collection unit matches a combination of the constituent elements of the image sensor indicated by the information stored in the storage unit.
6. The image sensor of claim 1,
the plurality of constituent elements include: an illumination unit configured to illuminate a subject; a lens unit that forms an optical image of the subject; and an imaging unit that generates an image based on the optical image.
7. The image sensor of claim 1,
the information stored in the storage unit is information indicating a combination of a plurality of constituent elements included in another image sensor, and the data stored in the storage unit is data used when the another image sensor executes processing.
8. A method for notifying a module configuration of an image sensor,
collecting, by a collection unit of a computer, information for specifying a plurality of constituent elements stored in respective memories included in the constituent elements and information indicating a combination of the constituent elements of the image sensor from the image sensor, the image sensor including: an imaging system configured by combining the plurality of modularized constituent elements; and a processing unit that executes processing using the image acquired by the imaging system, the image sensor further including a storage unit that stores information indicating a combination of constituent elements of the image sensor and data used when processing is executed by the image sensor having the constituent elements indicated by the combination, each of the plurality of constituent elements having a nonvolatile memory that stores information for specifying the constituent element,
comparing, by a comparison unit of the computer, based on the information collected by the collection unit, a combination of the constituent elements stored in the memories of the plurality of constituent elements with a combination of the constituent elements of the image sensor indicated by the information stored in the storage unit,
performing notification, by a notification unit of the computer, based on a comparison result of the comparison unit,
the notification unit notifies that the combination of the constituent elements of the image sensor does not match when the comparison result of the comparison unit indicates that the combination of the constituent elements of the image sensor specified by the information collected by the collection unit does not match the combination of the constituent elements of the image sensor indicated by the information stored in the storage unit,
a user operating an operation setting unit of the image sensor to set an operation related to the notification by the notification unit when a combination of the constituent elements of the image sensor specified by the information collected by the collection unit does not match a combination of the constituent elements of the image sensor indicated by the information stored in the storage unit,
the notification unit operates in accordance with the operation set by the operation setting unit,
the user operates the operation setting unit to set the operation for each timing of the process performed by the image sensor.
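Purely as an illustrative aside, and not as claim language, the following sketch shows one way the per-timing operation setting recited above could behave: a user-configured mapping decides, separately for each processing timing, whether a mismatch notification merely warns or also suspends processing. The timing names ("startup", "measurement") and the available actions are assumptions for this example.

OPERATION_SETTING = {            # set by the user through the operation setting unit
    "startup": "warn_and_continue",
    "measurement": "stop_processing",
}

def on_mismatch(timing):
    """Notification behavior for a configuration mismatch, chosen per processing timing."""
    action = OPERATION_SETTING.get(timing, "warn_and_continue")
    print(f"[{timing}] module configuration mismatch ({action})")
    return action != "stop_processing"   # False means processing is suspended

if on_mismatch("startup"):
    print("startup continues despite the mismatch")
if not on_mismatch("measurement"):
    print("measurement is suspended until the configuration is corrected")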
CN201811357897.8A 2018-02-23 2018-11-15 Image sensor and module structure notification method for image sensor Active CN110191261B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018031004A JP6962237B2 (en) 2018-02-23 2018-02-23 Image sensor and module configuration notification method for image sensor
JP2018-031004 2018-02-23

Publications (2)

Publication Number Publication Date
CN110191261A (en) 2019-08-30
CN110191261B (en) 2020-12-11

Family

ID=64308573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811357897.8A Active CN110191261B (en) 2018-02-23 2018-11-15 Image sensor and module structure notification method for image sensor

Country Status (4)

Country Link
US (1) US10686978B2 (en)
EP (1) EP3531678A1 (en)
JP (1) JP6962237B2 (en)
CN (1) CN110191261B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7246869B2 (en) * 2018-06-28 2023-03-28 キヤノン株式会社 IMAGE FORMING APPARATUS, IMAGE FORMING APPARATUS CONTROL METHOD AND PROGRAM
CN113973163A 2020-07-23 2022-01-25 Beijing Xiaomi Mobile Software Co Ltd Image acquisition module, electronic device, image acquisition method and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104168409A (en) * 2008-12-29 2014-11-26 Red.Com Inc Modular digital camera
WO2016134318A1 (en) * 2015-02-19 2016-08-25 Makoto Odamaki Systems, methods, and media for modular cameras

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001160911A (en) * 1999-12-02 2001-06-12 Canon Inc Imaging device and imaging system
JP2005345802A (en) * 2004-06-03 2005-12-15 Casio Comput Co Ltd Imaging device, replacement unit used for the imaging device, and replacement unit use control method and program
JP4775013B2 (en) 2006-02-07 2011-09-21 オムロン株式会社 Imaging device
JP4822544B2 (en) * 2006-04-26 2011-11-24 株式会社リコー Image forming apparatus capable of managing a plurality of module configuration information
JP2011253058A (en) * 2010-06-02 2011-12-15 Olympus Corp Information processor and camera
JP5822630B2 (en) * 2011-10-04 2015-11-24 富士機械製造株式会社 Camera device

Also Published As

Publication number Publication date
CN110191261A (en) 2019-08-30
EP3531678A1 (en) 2019-08-28
JP2019146112A (en) 2019-08-29
US20190268528A1 (en) 2019-08-29
US10686978B2 (en) 2020-06-16
JP6962237B2 (en) 2021-11-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant