CN105092051B - Information acquisition apparatus and information acquisition method - Google Patents


Info

Publication number
CN105092051B
CN105092051B (application CN201510202320.XA)
Authority
CN
China
Prior art keywords
information
subject
image
acquisition
detection data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510202320.XA
Other languages
Chinese (zh)
Other versions
CN105092051A (en)
Inventor
王浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Mission Infrared Electro Optics Technology Co Ltd
Original Assignee
Hangzhou Mission Infrared Electro Optics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Mission Infrared Electro Optics Technology Co Ltd filed Critical Hangzhou Mission Infrared Electro Optics Technology Co Ltd
Priority to CN201510202320.XA
Publication of CN105092051A
Application granted
Publication of CN105092051B
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an information acquisition apparatus and an information acquisition method, and relates to the field of detection applications. Prior-art detection devices cannot conveniently associate detection data with information about the detected subject. The information acquisition apparatus of the present invention includes a first acquisition section for acquiring a first image; a second acquisition section for acquiring detection data and/or entry data; and a control section for performing processing control based on first subject information acquired from the first image and/or second subject information selected based on the first subject information, the second subject information being information related to the subject selected based on the first subject information. The existing problems are thereby solved.

Description

Information acquisition apparatus and information acquisition method
Technical Field
The invention discloses an information acquisition apparatus and an information acquisition method, and relates to the field of detection applications.
Background
At present, detection devices of many kinds are widely used, including imaging instruments (visible light, infrared, ultraviolet, laser, etc.), partial discharge testers, gas leakage testers, and the like. When such a device detects different subjects, it is important to perform processing control in a targeted manner, for example by displaying targeted prompt messages, by changing the acquisition control mode of the detection data, or by applying specific processing to the detection data.
Taking the thermal image data of a thermal imaging device as an example, the analysis and diagnosis configurations may differ for thermal image data captured from different subjects. In the prior art, handling different subjects often requires tedious manual setting operations by the user, such as setting the analysis areas and analysis modes used for analysis, setting the diagnostic rules used for thermal image diagnosis, and recording additional information for the record. Reducing these tedious manual operations is a difficult problem.
There is therefore a need for an information acquisition apparatus that can reduce or omit the user's setting operations, so as to solve the existing problems.
Disclosure of Invention
To this end, the invention adopts the following technical solution: an information acquisition apparatus is provided, comprising
a first acquisition section for acquiring a first image;
a second acquisition section for acquiring detection data;
a control section for performing processing control based on first subject information acquired by recognizing the first image and/or second subject information selected based on the first subject information; the second subject information is information related to the subject selected based on the first subject information.
The information acquisition method comprises the following steps:
a first acquisition step of acquiring a first image;
a second acquisition step of acquiring detection data;
a control step of performing processing control in accordance with first subject information acquired by recognizing the first image and/or second subject information selected based on the first subject information; the second subject information is information related to the subject selected based on the first subject information.
Other aspects and advantages of the invention will become apparent from the following description.
Description of the drawings:
fig. 1 is a block diagram of an electrical configuration of an information acquisition apparatus 100 of embodiment 1.
Fig. 2 is an outline schematic diagram of the information acquisition apparatus 100 according to embodiment 1.
Fig. 3 is a schematic diagram showing an implementation of the subject information and the configuration information stored in the storage medium of the information acquisition apparatus 100.
Fig. 4 is a control flowchart of the information acquisition apparatus 100 of embodiment 1.
Fig. 5 is a schematic diagram of the information acquisition apparatus 100 of embodiment 1 capturing and acquiring a first image.
Fig. 6 is a display example of a display interface in which the information acquisition apparatus 100 according to embodiment 1 performs corresponding processing based on the configuration information associated with the identified first subject information;
fig. 7 is a block diagram of the electrical configuration of the information acquisition apparatus 101 of embodiment 2.
Fig. 8 is a control flowchart of the information acquisition apparatus 101 of embodiment 2.
Fig. 9 is a block diagram of the electrical configuration of the information acquisition apparatus 102 of embodiment 3.
Fig. 10 is a control flowchart of the information acquisition apparatus 102 of embodiment 3.
Detailed Description
The following embodiments are set forth to provide a better understanding of the present invention without limiting its scope, and they may be modified in various forms within that scope. Furthermore, although in embodiment 1 the information acquisition apparatus 100 is illustrated as a portable thermographic imaging device, the invention is not limited thereto; the idea of the embodiments is applicable to detection devices in general, and the information acquisition apparatus 100 may be any detection device that obtains detection data from detectors (including various detectors and sensors), such as imaging devices (visible light, infrared, ultraviolet, laser, etc.), partial discharge testers, gas leakage testers, vibration testers, and the like.
Embodiment 1, the structure of an information acquisition apparatus 100 of embodiment 1 is explained with reference to fig. 1. Fig. 1 is a block diagram of an electrical configuration of an information acquisition apparatus 100 of embodiment 1.
Specifically, the information acquisition device 100 includes a first acquisition unit 1, a second acquisition unit 2, a temporary storage unit 3, a communication unit 4, a memory card unit 5, an image processing unit 6, a flash memory 7, a display control unit 8, a display unit 9, a control unit 10, and an operation unit 11, and the control unit 10 is connected to the respective corresponding units via a control and data bus and takes charge of overall control of the information acquisition device 100.
The first acquisition unit 1 acquires a first image including information to be recognized. In embodiment 1, the first acquisition unit 1 is a visible light imaging unit, and includes an optical component, a lens driving component, a visible light detector, a signal preprocessing circuit, and the like, which are not shown; a first image containing information to be identified is obtained by photographing a label related to a subject.
The first acquiring unit 1 may be implemented in various ways depending on the material, color, and the like of the label information and the background of the target label, and depending on the application environment conditions.
In another example, the first acquiring unit 1 may employ a near-infrared photographing device, and may photograph a first image of a sign of a corresponding retroreflective material at night based on an auxiliary lighting device, for example, an infrared lighting device of 950 nm; preferably, the information acquisition apparatus 100 includes an auxiliary lighting device.
In still another example, the first acquiring unit 1 may employ a thermal image capturing unit for far infrared, and obtain the first image based on far infrared radiation of the sign; wherein the first image may be acquired based on a difference in emissivity of the signage information and the background surface material.
In embodiment 1, the first image is image data obtained from the output signal of an image detector. Depending on the implementation of the first acquisition unit, the image data may be raw image data obtained by AD-converting the output signal of the image sensor, or image data obtained by applying predetermined processing to the raw data, for example white balance compensation, gamma compensation, and YC conversion, to generate image data composed of digitized luminance and color-difference signals. The invention is not limited thereto: in other embodiments the first image may be obtained by receiving external image data, for example the information acquisition apparatus 100 may obtain, through the communication interface, a first image provided by an external apparatus connected by wire or wirelessly, such as a visible light image of a sign output by a visible light photographing apparatus connected to the information acquisition apparatus 100; or the first image may be obtained by reading a first image file from a storage medium.
A second acquisition unit 2 for acquiring detection data; in embodiment 1, the detection data may be obtained based on signals of detection probes, including corresponding probes and/or sensors, etc. But not limited thereto, in other embodiments, the detection data may be obtained by receiving external detection data, such as detection data output by a detection device connected to the information acquisition device 100; or may be obtained by reading the test data file from the storage medium.
In embodiment 1, the second acquisition unit 2 is a thermal image capturing unit (for example an 8-14 μm thermal imaging unit) composed of an optical component, a lens driving component, an infrared detector, a signal preprocessing circuit, and the like, which are not shown in the figure. The optical component consists of an infrared optical lens that focuses the received infrared radiation onto the infrared detector. The lens driving component drives the lens to focus or zoom according to a control signal from the control unit 10; alternatively, the optical component may be adjusted manually. The infrared detector, such as a cooled or uncooled infrared focal plane detector, converts the infrared radiation passing through the optical component into an electrical signal. The signal preprocessing circuit comprises a sampling circuit, an AD conversion circuit, a timing trigger circuit, and the like; it samples the electrical signal output by the infrared detector at a specified period and converts it through the AD conversion circuit into a digital thermal image signal, which is, for example, 14-bit or 16-bit binary data (also called thermal image AD value data, or AD value data for short). A pseudo-color palette range can be determined from the range of the AD values of the thermal image signal or from a set AD value range; the color value corresponding to each AD value within the palette range is then used as the image data of the corresponding pixel position, yielding the image data of the infrared thermal image. In embodiment 1, the second acquisition unit 2 is used as an example of a second acquisition section that captures and acquires thermal image data (an example of detection data).
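To make the AD-value-to-pseudo-color mapping concrete, the following is a minimal sketch in Python/NumPy. The 16-bit depth, the palette array, and the auto-scaling rule are illustrative assumptions, not the actual circuit or firmware of the apparatus; a fixed ad_min/ad_max corresponds to the "set range of the AD value" mentioned above.

```python
import numpy as np

def pseudo_color(ad_values, palette, ad_min=None, ad_max=None):
    """Map thermal AD value data (e.g. 14/16-bit counts) to a pseudo-color image.

    ad_values : 2-D array of AD counts from the infrared detector.
    palette   : (N, 3) uint8 array of RGB colors (the pseudo-color palette).
    ad_min/ad_max : AD range mapped onto the palette; when omitted, the
                    range of the current frame is used (auto-scaling).
    """
    ad = ad_values.astype(np.int64)
    lo = int(ad.min()) if ad_min is None else int(ad_min)
    hi = int(ad.max()) if ad_max is None else int(ad_max)
    span = max(hi - lo, 1)
    # Normalize each AD count to a palette index, clipping out-of-range values.
    idx = np.clip((ad - lo) * (len(palette) - 1) // span, 0, len(palette) - 1)
    return palette[idx]  # H x W x 3 image data of the infrared thermal image
```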
In embodiment 1, the thermal image data is obtained by shooting, but the invention is not limited thereto; in other embodiments it may be obtained by receiving external thermal image data or by reading a thermal image file from a storage medium. The thermal image data is obtained from the output signal of the infrared detector and, depending on the embodiment, may be the thermal image signal itself (the thermal image AD value data obtained by AD-converting the output signal of the infrared detector), or thermal image data obtained by applying predetermined processing to the thermal image signal, for example image data of an infrared thermal image obtained by pseudo-color processing, or an array of temperature values obtained by temperature conversion.
The first and second acquisition units are not limited to separate imaging units; in some examples they may be the same imaging unit, for example a thermal imaging unit used both to photograph the first image of the sign and to photograph the subject to obtain the thermal image data.
The temporary storage section 3, a volatile memory such as a RAM or DRAM, serves as a buffer memory for temporarily storing data output from the first acquisition section 1 and the second acquisition section 2; it also serves as a working memory for the image processing unit 6 and the control unit 10, temporarily storing the data they process.
A communication unit 4 that connects the information acquisition apparatus 100 to an external apparatus and exchanges data with the external apparatus according to a communication specification such as USB, 1394, network, GPRS, 3G, 4G, or 5G; examples of the external device include a personal computer, a server, a PDA (personal digital assistant), another information acquisition device, and another storage device.
The memory card unit 5 serves as an interface to a memory card, a rewritable nonvolatile memory that is detachably mounted in a card slot of the main body of the information acquisition apparatus 100, and records detection data under the control of the control unit 10.
The image processing unit 6 performs predetermined processing on the data acquired by the first acquisition unit 1 and the second acquisition unit 2; for example, when the display timing comes, it selects and reads one frame at each predetermined time interval from the thermal image data of a predetermined duration temporarily stored in the temporary storage unit 3. The image processing unit 6 performs processing such as correction, interpolation, pseudo color, synthesis, compression, and decompression, converting the data into a form suitable for display, recording, and so on. The image processing unit 6 may be implemented by, for example, a DSP, another microprocessor, or a programmable FPGA, or may be a processor integrated with the control unit 10.
Preferably, the image processing unit 6 includes a first recognition unit configured to recognize the first image acquired by the first acquisition unit 1 and acquire first subject information.
Taking as an example a first image obtained by photographing a sign whose information consists of characters, in one embodiment the first recognition unit includes a positioning unit, a segmentation unit, a recognition unit, and a determination unit.
The positioning unit locates the sign area in the first image; for example, it searches a specified range of the first image, finds several areas that match sign characteristics (such as a color difference between the sign area and the surrounding background) as candidate areas, analyzes the candidates further, and finally selects the best area as the sign area and extracts it from the first image. Predetermined processing such as tilt correction may also be applied to the sign image at this point.
The segmentation unit segments the characters in the sign: the extracted sign area is divided into individual characters, for example by using a vertical projection method to split single-character images at the blank spaces between characters, taking into account the sign's writing format, character set, size constraints, and similar conditions.
The recognition unit recognizes the segmented sign characters. The character recognition may use, for example, a template matching algorithm: each segmented character is binarized and scaled to the size of the character templates in a character database, then matched against all templates, and the best match is selected as the result. How to extract a character image from an image and perform character recognition is well known in the art, so a detailed description is omitted.
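As an illustration of the template-matching recognition just described, here is a minimal sketch; the 16x16 normalized size and the dictionary-based character database are assumptions made for the example, not the apparatus's actual character database format.

```python
import numpy as np

def recognize_character(char_img, char_templates):
    """Recognize one segmented sign character by template matching.

    char_img       : binarized character image scaled to the template size
                     (e.g. a 16x16 array of 0/1 values).
    char_templates : dict mapping a character to its binary template array.
    Returns the best-matching character and its matching score.
    """
    best_char, best_score = None, -1.0
    for ch, tmpl in char_templates.items():
        # Score: fraction of pixels where the image and template agree.
        score = float(np.mean(char_img == tmpl))
        if score > best_score:
            best_char, best_score = ch, score
    return best_char, best_score
```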
The determination unit determines whether the character information, i.e. the information formed from the recognized character or characters, is first subject information. Specifically, in one example, the information acquisition apparatus 100 matches the recognized character information against predetermined subject information (e.g. subject information stored in the flash memory 7); if the keywords of the two match, the first subject information is regarded as recognized. If not, it is determined that no first subject information has been recognized.
Taking the sign content "subject 1" in fig. 5 as an example, the characters forming "subject 1" are recognized from the first image of the sign and combined into the character information "subject 1"; if this character information matches the keyword of the subject information "subject 1" stored in the flash memory 7, the first subject information "subject 1" is regarded as recognized. The configuration information stored in the flash memory 7 in association with the subject information "subject 1" can then be used as the configuration information associated with the recognized first subject information "subject 1".
When the sign content consists of characters (Chinese characters, numerals, letters, etc.), whether first subject information is obtained can be determined by comparing the recognized character information against the stored subject information, as above. When the sign content is, for example, a barcode or a graphic code, the code may instead be matched against barcode templates in a template library; when a template matches, the subject information associated with that barcode template is taken as the recognized first subject information, and the configuration information associated with that subject information is used as the configuration information associated with the recognized first subject information. In that case it is preferable that a library of barcode templates be stored in the flash memory 7 in advance.
Various embodiments of recognizing the first image to obtain the character information and/or the first subject information may be employed.
Alternatively, in another example, the determination unit may be omitted, and only the character information is recognized and obtained from the first image.
In another embodiment, the information acquisition apparatus 100 may have a configuration that does not include the first recognition unit or includes only a partial function of the first recognition unit; in one example, the first image may be transmitted to a destination through the communication unit 4, for example, to a designated server through wireless 3G, the server may perform recognition of the first image, and the first subject information obtained through the recognition of the first image and/or the configuration information associated with the first subject information may be received through the communication unit 4. In another example, the divided character image extracted from the first image may be transmitted to a server of a destination through the communication unit 4, the server may perform recognition of the character and a process of obtaining the first object information, and the first object information obtained by the recognition process and/or the configuration information associated with the first object information may be received through the communication unit 4. Obviously, when the configuration information can be received through the communication section 4, the content shown in fig. 3 may not need to be stored in the storage medium of the information acquisition apparatus 100.
The flash memory 7 stores the control program and the various data used to control each part. Data relating to recognition of the first image, such as the character database used for character template matching, is also stored in a storage medium such as the flash memory 7. In embodiment 1, the flash memory 7 is used as an example of a storage medium for storing the subject information and its associated configuration information, as shown in the table in fig. 3.
The information acquisition apparatus 100 can acquire data necessary for processing from a storage medium, which may be a storage medium in the information acquisition apparatus 100, such as a nonvolatile storage medium like the flash memory 7, the memory card, or the like, a volatile storage medium like the temporary storage section 3, or the like; it may be another storage medium connected to the information acquisition apparatus 100 by wire or wirelessly, such as another storage apparatus or detection apparatus, a storage medium in a computer or the like, or a storage medium of a network destination, which performs communication by wire or wirelessly connecting to the communication interface 4.
The subject information is information related to the subject, for example, information representing a specific attribute of the subject such as a location, a type, a number, and the like of the subject;
the information of the subject stored in the flash memory 7 should include the keyword in the label information corresponding to the subject for the subsequent matching. Taking the label of the power equipment as an example, the labels of the tested objects installed in the actual substation field can be listed as the following common types: in one example, a subject label installed in correspondence with an entity of the subject includes information representing a location, a number, a type, a phase, etc. of the subject; correspondingly, the information of the tested object comprises information representing the location (such as a transformer substation and an equipment area), the number, the type (such as the type of a transformer, a switch and the like), the phase (such as A, B, C phase) and the like of the tested object; in another example, a placard associated with the device region of the subject; correspondingly, the information of the detected body is the information of the equipment area; in yet another example, the label represents the type of subject; accordingly, the subject information is information of the type of the subject. The subject information stored in advance in the storage medium may further include information such as an attribution unit, a voltage class, a model number, an importance level, a manufacturer, performance, and characteristics of the subject, past imaging or inspection history, a manufacturing date, a lifetime, an ID number, and a detection notice. The information of the measured body can be formed in various ways according to different applications.
When several kinds of subject information are prepared in advance as described above, for example with actual sign content "equipment area 1-1200-switch-A phase" and stored subject information including "equipment area 1", "switch", and "equipment area 1-1200-switch-A phase", then when the character information "equipment area 1-1200-switch-A phase" is recognized, the subject information with the highest degree of match, "equipment area 1-1200-switch-A phase", is selected during matching comparison as the recognized first subject information.
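A minimal sketch of this "highest matching degree" selection follows; the rule used here, preferring the longest stored keyword that appears in the recognized text, is only one plausible assumption for how the matching degree could be scored.

```python
def select_subject_info(character_info, stored_subjects):
    """Select the stored subject information that best matches the recognized text.

    character_info  : string recognized from the sign,
                      e.g. "equipment area 1-1200-switch-A phase".
    stored_subjects : list of stored subject-information keywords,
                      e.g. ["equipment area 1", "switch",
                            "equipment area 1-1200-switch-A phase"].
    Returns the best-matching keyword, or None if nothing matches.
    """
    candidates = [s for s in stored_subjects if s in character_info]
    if not candidates:
        return None                     # no first subject information recognized
    return max(candidates, key=len)     # longest keyword = highest matching degree
```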
For convenience of explanation, the embodiments are described below using the abbreviated sign content "subject 1" shown in fig. 5.
The display control unit 8 displays the image data for display stored in the temporary storage unit 3 on the display unit 9. Specifically, the display control unit 8 includes a VRAM, a VRAM control means, a signal generation means (not shown), and the like; under the control of the control unit 10, the signal generation means periodically reads image data from the VRAM (image data read from the temporary storage unit 3 and stored in the VRAM), generates a video signal, and outputs it to the display unit 9.
The display unit 9 is, for example, a liquid crystal display device. Without being limited thereto, the display portion 9 may be another display device connected to the information acquisition device 100, and the information acquisition device 100 itself may not have a display portion in its electrical configuration.
The control unit 10 controls the overall operation of the information acquisition apparatus 100, and a program for control and various data used for controlling each part are stored in a storage medium such as the flash memory 7. The control unit 10 is realized by, for example, a CPU, an MPU, an SOC, a programmable FPGA, or the like.
The control unit 10 performs the corresponding processing control based on the first subject information acquired from the first image and/or the second subject information; this includes executing the corresponding processing on the acquired detection data according to the configuration information associated with the first subject information and/or the configuration information associated with the second subject information. The processing control is, for example, one or more of notification, display of prompt information, processing of the detection data, acquisition of the detection data, and the like; the configuration information is one or more of the parameters, information, programs, and so on needed to carry out the processing control. The information acquisition apparatus 100 can perform the processing control according to the configuration information.
The second measured object information is information related to the measured object selected based on the first measured object information, and preferably, the storage medium such as the flash memory 7 stores the measured object information containing or associated with the second measured object information. As shown in the table of fig. 3, the stored subject information "subject 1" and "subject 2" are associated with the corresponding second subject information.
For example, when the first subject information "subject 1" is obtained by recognizing the first image, suppose that "subject 1" corresponds to a certain piece of power equipment and that some component of that equipment, such as its body or a joint, is to be detected; when the "joint" needs to be photographed, the user can select the associated second subject information "joint" based on the first subject information "subject 1".
As another example, when the first subject information "subject 2" is obtained by recognizing the first image, suppose that "subject 2" corresponds to a certain equipment area and that some device in that area, such as a switch or a blade, is to be detected; when the "switch" needs to be photographed, the user can select the associated second subject information "switch" based on the first subject information "subject 2".
In a preferred example, the detection data is matched against the identification configuration information associated with "subject 1", such as a joint template and a body template of subject identification information corresponding to the second subject information "joint" and "body" respectively; when the detection data correlates with the body template, the second subject information "body" is selected automatically.
As another example, the first subject information may contain only part of the information related to the subject, while the complete subject information also includes other attributes such as an ID number; in that case, other attribute information of the same subject is selected based on the first subject information, and the second subject information is, for example, the ID number.
The notification processing, for example, executes control that issues a notification when the first subject information is recognized. The recognized first subject information may be displayed under this control; in other examples, the notification may be, or be accompanied by, one or more of a vibration of a vibration component in the information acquisition apparatus 100, a change of an indicator light, a sound from a sound component, and so on; any notification means that the user can perceive may be used. The notification control may be performed based on the recognized first subject information and/or the second subject information.
The display control of the presentation information may be, for example, one or more of display control of presentation information related to the subject, such as display control of a reference image, past imaging or inspection history, a detection notice, second subject information, and the like. The display control of the prompt message can be carried out based on the configuration information related to the first measured object information and/or the second measured object information. The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform display control of the guidance information.
The processing control of the detection data includes one or more of various processing controls such as recording, labeling, communication, recognition, analysis region setting, analysis, diagnosis, classification, and the like with the detection data. The processing control of the detection data may be performed based on the configuration information associated with the first subject information and/or the second subject information. The configuration information may be one or more items of parameters, information, programs, and the like necessary to perform process control of the detected data.
The acquisition control of the detection data includes, for example, lens switching, other control of the information acquisition apparatus 100, the acquisition frequency of the detector output signal used for the detection data, image processing, and the like. The acquisition control of the detection data may be performed based on the configuration information associated with the first subject information and/or the second subject information; the configuration information is one or more of the parameters, information, programs, and so on needed to carry out the acquisition control of the detection data.
Various kinds of configuration information may exist depending on the application. The configuration information related to the processing may be stored, in association with the subject information and the second subject information, in the table shown in fig. 3. When the information obtained by recognizing the first image matches stored subject information, the configuration information associated with that subject information is used as the configuration information associated with the recognized first subject information; when second subject information is selected, the configuration information associated with the second subject information is used in the processing.
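One way to picture the association of fig. 3 is a simple lookup table keyed by subject information; the field names and values below are hypothetical and serve only to show how configuration information could be retrieved for the recognized first subject information or a selected second subject information.

```python
# Hypothetical in-memory counterpart of the table in fig. 3.
CONFIG_TABLE = {
    "subject 1": {
        "second_subjects": ["body", "joint"],
        "analysis_regions": {"S01": (10, 10, 40, 40), "S02": (60, 10, 90, 40)},
        "analysis_mode": "S01MAX-S02MAX",
        "diagnosis": {"threshold_C": 2.0, "conclusion": "serious defect!"},
    },
    "subject 2": {
        "second_subjects": ["switch", "blade"],
    },
}

def config_for(first_subject, second_subject=None):
    """Return the configuration associated with the recognized subject information.

    If second subject information is selected and has its own entry, it takes
    precedence; otherwise the entry of the first subject information is used.
    """
    if second_subject and second_subject in CONFIG_TABLE:
        return CONFIG_TABLE[second_subject]
    return CONFIG_TABLE.get(first_subject, {})
```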
Part of the processing is described below with reference to the display examples of the display interface of the information acquisition apparatus 100 shown in fig. 6.
As for the display control of the prompt information, for example, when the first subject information is recognized, the first subject information is displayed on the display unit 9; preferably, if the recognized first subject information is associated with second subject information, the display unit 9 displays both the first subject information and the associated second subject information, so that the user can conveniently select the second subject information. As shown in fig. 6(a), when the first subject information "subject 1" is recognized, the first subject information and the associated second subject information "body", "joint", and so on may be displayed.
Display processing of a reference image is performed, for example, according to reference image configuration information associated with the first subject information and/or the second subject information. The reference image configuration information includes one or more configuration parameters related to the reference image display processing; when it contains only some of these parameters, the remaining parameters may use the default configuration of the information acquisition apparatus 100 or values set by the user. The reference image configuration parameters include, for example, the composition data of the reference image, the position parameters of the reference image display or a rule for obtaining them, and other display parameters such as transparency, line type, and color. For example, the acquired thermal image data is displayed with a reference image according to the reference image composition data associated with the first subject information. The reference image may be any figure or image data related to detection of the subject, such as a previously captured historical image; for example, an image representing the morphological features of the subject can assist shooting: when superimposed on the image obtained from the detection data according to predetermined position parameters (position, size, or rotation angle), it helps the user aim at the subject for detection. Alternatively, a standard figure or image of the detection data, for the user's reference while detecting the subject, may be displayed in a region of the display unit 9 other than the figure or image obtained from the detection data. As shown in fig. 6(c), a reference image T1 obtained from the reference image composition data associated with the first subject information is superimposed, according to the predetermined position parameters, on the infrared thermal image obtained from the captured thermal image data as a reference for capturing the subject thermal image IR1, so that the shooting quality of the subject thermal image IR1 can be standardized.
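The superposition of the reference image T1 onto the thermal image can be sketched as a simple alpha blend at the configured position parameters; the parameter names and the blending rule below are assumptions for illustration, and the sketch assumes the reference image fits inside the thermal image at the given position.

```python
import numpy as np

def overlay_reference(thermal_rgb, ref_rgb, x, y, alpha=0.4):
    """Blend a reference image onto the pseudo-color thermal image.

    thermal_rgb : H x W x 3 image obtained from the thermal image data.
    ref_rgb     : h x w x 3 reference image (already scaled/rotated according
                  to the configured position parameters).
    x, y        : top-left position of the reference image in the thermal image.
    alpha       : transparency of the reference image (a display parameter).
    """
    out = thermal_rgb.astype(np.float32).copy()
    h, w = ref_rgb.shape[:2]
    region = out[y:y + h, x:x + w]
    region[:] = (1.0 - alpha) * region + alpha * ref_rgb.astype(np.float32)
    return out.astype(np.uint8)
```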
The recording process is, for example, to record the first image acquired by the first acquisition unit 1 and/or the detection data acquired by the second acquisition unit 2, and to record the first subject information and/or the second subject information selected based on the first subject information in association with the first image acquired by the first acquisition unit 1 and/or the detection data acquired by the second acquisition unit 2.
The marking processing, for example, records the first subject information and/or the second subject information selected based on it in association with a predetermined frame of the first image continuously acquired by the first acquisition unit 1 and/or the detection data continuously acquired by the second acquisition unit 2, in response to a marking instruction issued during dynamic recording. For example, while the detection data continuously acquired by the second acquisition unit is being recorded dynamically, the first subject information and/or the selected second subject information are recorded in association with the detection data acquired at the marking moment, in response to a marking instruction issued by the user via the operation unit.
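A minimal sketch of this marking behavior: during dynamic recording, a mark associates the current frame index with the first/second subject information. The recorder class and its method names are hypothetical, chosen only to make the association explicit.

```python
class MarkingRecorder:
    """Associates subject information with frames marked during dynamic recording."""

    def __init__(self):
        self.frames = []      # continuously acquired detection data frames
        self.marks = []       # (frame_index, subject_info) pairs

    def add_frame(self, frame):
        self.frames.append(frame)

    def mark(self, first_subject, second_subject=None):
        # Record the subject information against the frame acquired at the
        # marking moment (the most recently added frame).
        info = {"first": first_subject, "second": second_subject}
        self.marks.append((len(self.frames) - 1, info))
```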
The communication processing is, for example, to associate the first subject information and/or the second subject information with the first image acquired by the first acquisition unit 1 and/or the detection data acquired by the second acquisition unit 2, and to transmit the information to a storage medium of a destination via the communication unit 4.
Identification processing, for example, performs the corresponding identification control on the acquired detection data according to the identification configuration information associated with the first subject information and/or the second subject information, to identify whether a specific subject has been detected. The identification configuration information may include one or more configuration parameters related to the identification processing; when only some of them are included, the remaining parameters may use the default configuration of the information acquisition apparatus 100 or values set by the user. The identification configuration information includes, for example, subject identification information, a judgment value for the degree of correlation, and an identification search strategy. The subject identification information is, for example, an image template or feature quantity used for matching; the judgment value of the correlation is compared with the correlation obtained by matching, to decide whether detection data of the specific subject has been identified; and the identification search strategy covers, for example, the choice of the detection area within the detection data and the processing order when several templates, detection areas, and judgment values are combined. For example, according to the subject identification information (such as a feature template of the subject) associated with the first subject information and the corresponding correlation judgment value, thermal image data is extracted from a specified detection area of the acquired thermal image data and matched against the subject identification information to obtain a correlation; this correlation is compared with the judgment value to identify whether a thermal image of the specific subject has been captured. As shown in fig. 6(b), when a subject thermal image IR1 matching the template T1 associated with the first subject information is detected, a blinking icon SS1 is displayed.
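The identification step, correlating a detection area of the thermal data with a subject template and comparing the result against the judgment value, can be sketched as below; the use of a Pearson correlation coefficient is an assumption, since the patent does not fix a particular correlation measure.

```python
import numpy as np

def subject_detected(thermal_data, template, region, judgment_value):
    """Decide whether the subject template is present in the detection area.

    thermal_data   : 2-D array of thermal image data (e.g. AD values).
    template       : 2-D subject template (same size as the detection area).
    region         : (y0, y1, x0, x1) detection area from the search strategy.
    judgment_value : correlation threshold from the identification configuration.
    """
    y0, y1, x0, x1 = region
    patch = thermal_data[y0:y1, x0:x1].astype(np.float64).ravel()
    tmpl = template.astype(np.float64).ravel()
    # Correlation between the extracted detection area and the subject template.
    corr = np.corrcoef(patch, tmpl)[0, 1]
    return corr >= judgment_value, corr
```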
Analysis region setting processing, for example, sets the corresponding analysis regions on the acquired detection data according to the analysis region configuration information associated with the first subject information and/or the second subject information. The analysis region configuration information includes one or more configuration parameters related to analysis region setting; when only some of them are included, the remaining parameters may use the default configuration of the information acquisition apparatus 100 or values set by the user. The analysis region configuration parameters include, for example, analysis region composition data and the position parameters of the analysis region or a rule for obtaining them. For example, the corresponding analysis regions are set on the acquired thermal image data according to the analysis region composition data and position parameters associated with the first subject information; as shown in fig. 6(d), the infrared thermal image has analysis regions S01 and S02 set according to the analysis region composition data and position parameters associated with the first subject information.
Analysis processing, for example, performs the corresponding analysis on the acquired detection data according to the analysis configuration information associated with the first subject information and/or the second subject information. The analysis configuration information includes one or more configuration parameters related to the analysis processing; when only some of them are included, the remaining parameters may use the default configuration of the information acquisition apparatus 100 or values set by the user. The analysis parameters include, for example, the analysis region composition data used for analysis, the position parameters of the analysis regions or a rule for obtaining them, and the analysis mode. For example, the acquired thermal image data is analyzed according to the analysis regions and/or analysis mode associated with the first subject information. As shown in fig. 6(e), the analysis region composition data and the analysis mode are associated with the first subject information, the analysis regions S01 and S02 are set, and the analysis data is obtained according to the analysis mode (S01MAX-S02MAX). The analysis mode is the analysis calculation rule; taking thermal image data as an example, it is the rule used to derive an analysis result from the temperature analysis of the thermal image data within the analysis regions, such as the maximum temperature, average temperature, minimum temperature, or a percentage content, and it may also include calculation relationships between analysis regions, such as a temperature difference.
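The analysis mode "S01MAX-S02MAX" can be read as: take the maximum of region S01 and subtract the maximum of region S02. A sketch under that reading, operating on an array of temperature values, follows; the region format and the restriction to this single mode are assumptions for illustration.

```python
import numpy as np

def analyze(temperatures, regions, mode="S01MAX-S02MAX"):
    """Apply an analysis mode to temperature data within configured regions.

    temperatures : 2-D array of temperature values converted from thermal data.
    regions      : dict of region name -> (y0, y1, x0, x1) position parameters.
    mode         : analysis calculation rule; only the temperature-difference
                   mode of the example in fig. 6(e) is sketched here.
    """
    stats = {}
    for name, (y0, y1, x0, x1) in regions.items():
        area = temperatures[y0:y1, x0:x1]
        stats[name] = {"MAX": float(area.max()),
                       "MIN": float(area.min()),
                       "AVG": float(area.mean())}
    if mode == "S01MAX-S02MAX":
        return stats["S01"]["MAX"] - stats["S02"]["MAX"]
    raise ValueError("unsupported analysis mode: " + mode)
```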
Diagnosis processing, for example, performs the corresponding diagnosis on the acquired detection data according to the diagnosis configuration information associated with the first subject information and/or the second subject information. The diagnosis configuration information may include one or more configuration parameters related to the diagnosis processing; when only some of them are included, the remaining parameters may use the default configuration of the information acquisition apparatus 100 or values set by the user. Taking the diagnosis of thermal image data as an example, the configuration parameters contained in the diagnosis configuration information include the analysis region composition data, the position parameters of the analysis regions in the thermal image, the analysis mode, the diagnosis threshold, and the corresponding diagnosis conclusion. As shown in fig. 6(f), these are associated with the first subject information: the analysis regions S01 and S02 are set, the analysis data is obtained according to the analysis mode (S01MAX-S02MAX), and the diagnosis result "serious defect!" is obtained according to the diagnosis threshold and the corresponding diagnosis conclusion (S01MAX-S02MAX ≥ 2 °C: serious defect).
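Applying the diagnosis threshold and conclusion of fig. 6(f) then reduces to a simple rule; the 2 °C threshold and the "serious defect!" conclusion come from the example above, while the function shape itself is an assumption.

```python
def diagnose(analysis_value, threshold_c=2.0, conclusion="serious defect!"):
    """Apply a diagnosis rule of the form S01MAX-S02MAX >= threshold -> conclusion.

    analysis_value : result of the analysis mode, e.g. S01MAX - S02MAX in deg C.
    threshold_c    : diagnosis threshold from the diagnosis configuration.
    conclusion     : diagnosis conclusion associated with the threshold.
    Returns the conclusion text, or None if the rule is not triggered.
    """
    return conclusion if analysis_value >= threshold_c else None
```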
Classification processing, for example, classifies the detection data according to the first subject information and/or the second subject information associated with it, for instance by storing the detection data in a specific folder, which facilitates subsequent batch processing.
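Storing detection data into a folder derived from the associated subject information might look like the sketch below; the directory layout is purely illustrative.

```python
import os

def classify_detection_file(file_path, first_subject, second_subject=None):
    """Move a recorded detection data file into a folder named after the
    associated subject information, for later batch processing."""
    folder = os.path.join("detections", first_subject, second_subject or "")
    os.makedirs(folder, exist_ok=True)
    target = os.path.join(folder, os.path.basename(file_path))
    os.replace(file_path, target)   # move the file into the per-subject folder
    return target
```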
Lens switching processing switches lens parameters, such as the lens aperture, according to the lens configuration information associated with the first subject information and/or the second subject information.
Processing control of the detection data, for example, executes the corresponding processing on the acquired detection data according to the configuration information associated with the first subject information and/or the second subject information; for instance, image display processing such as pseudo-color rendering is applied to the acquired thermal image data based on processing parameters such as interpolation and pseudo-color settings associated with the first subject information.
The processing is not limited to a single one of the above; a combination of several processes may also be performed.
For example, the corresponding identification processing is performed on the acquired detection data according to the identification configuration information associated with the first subject information and/or the second subject information, and when the subject identification information matches the detection data, the detection data is processed according to one or more other items of configuration information associated with the first subject information, the second subject information, or the identification configuration information.
In one example, the corresponding identification control is performed on the acquired detection data according to the identification configuration information associated with the first subject information and/or the second subject information, and when a thermal image of a specific subject is identified in the captured thermal image data, an analysis region is set based on the analysis region configuration information associated with the first subject information. As shown in fig. 6(f), when a subject thermal image IR1 matching the template T1 associated with the first subject information is detected, the analysis regions S01 and S02 may be set according to the associated analysis region composition data and position parameters; preferably, the analysis regions may be set from analysis region composition data and position parameters defined in a specific positional relationship to the template T1.
In another example, the user photographs subject 1 to obtain detection data, which is matched against the body template and joint template of the subject identification information associated with the first subject information. When the detection data matches the body template, it is processed according to the configuration information associated with that body template: for example, if the body template is associated with the second subject information "body", the second subject information can be recorded in association with the detection data, which facilitates subsequent classification; if the body template is associated with diagnosis configuration information, the detection data can be diagnosed according to that diagnosis configuration information.
The operation unit 11 allows the user to perform various instruction operations and to input setting information; the control unit 10 executes programs in response to operation signals from the operation unit 11. Fig. 2(a) shows the appearance of the information acquisition apparatus 100, which includes a plurality of keys for user operation; the related operations may also be realized by a touch panel, a voice recognition unit (not shown), or the like. The information acquisition apparatus 100 can be held and used by a user.
As shown in fig. 2(b), the information acquisition apparatus 100 includes the lens 201 of the first acquisition unit 1 (visible light imaging unit), the lens 202 of the second acquisition unit 2 (thermal imaging unit), and an auxiliary lighting device 203. The auxiliary lighting device 203 is, for example, a high-intensity lamp, used so that the first acquisition unit 1 can photograph the subject's sign at night under its illumination.
The control procedure of embodiment 1 is explained with reference to the flowchart of fig. 4.
An example of the specific operation and control flow of embodiment 1 will be described in detail below. Before the main shooting, the contents of the table as in fig. 3 are stored in advance in the flash memory 7; the control unit 10 controls the overall operation of the information acquisition apparatus 100 based on the control program stored in the flash memory 7 and various data used for controlling each unit, and the control procedure is as follows:
step a01, the control unit 10 determines whether or not there is an instruction to acquire the first image; when the user determines to acquire the first image through the operation part 11, the next step is performed;
In step A02, the first acquisition unit 1 captures a first image, in this example a visible light image; the user adjusts the shooting angle so that the first image shown in fig. 5(a) is displayed on the display unit 9, in which the subject 1 (501) and the sign 502 attached to the subject's mounting can be seen. Preferably, as shown in fig. 5(b), a positioning frame 503 is displayed in the image on the display unit 9, so that the user can frame the image of the sign 502 within the positioning frame 503; the first recognition unit can then locate the sign region quickly within the range of the positioning frame 503, which speeds up the recognition processing.
A specific example of the first image may also be the sign region image extracted from the captured image, or the segmented character images further extracted from it.
Step A03 determines whether there is a recognition instruction. If not, the process goes to step A06; if step A06 does not exit, the process returns to step A02, where the user can adjust the angle and distance at which the information acquisition apparatus 100 photographs the sign, or adjust the position parameters (position, size, or rotation angle) of the positioning frame 503. When the user confirms via the operation unit 11 (step A03: yes), the process proceeds to the next step.
a step a04 of performing processing relating to recognition based on the first image;
Specifically, the recognition section locates the sign area based on the first image or on the range of the positioning frame 503 within it; it then segments the characters, performs template matching to identify the characters forming "subject 1", and obtains the character information "subject 1" formed from the recognized characters, which is stored in a predetermined area of the temporary storage unit 3.
Step A05 determines whether first subject information has been recognized.
Specifically, the information acquisition apparatus 100 matches the recognized character information "subject 1" against the subject information "subject 1" stored in the flash memory 7; if the keywords match, the first subject information is regarded as recognized, and the process proceeds to step A07.
If they do not match, it is determined that no first subject information was recognized, and the process goes to step A06; if step A06 does not exit, the process returns to step A02, where the user can adjust the shooting angle, distance, and so on of the information acquisition apparatus 100 to acquire the first image again for subsequent processing.
in step a07, the control unit 10 controls the second acquisition unit to acquire the detection data;
In step A08, processing is performed according to the acquired first subject information and/or its associated configuration information.
The processing control is, for example, one or more of notification processing, display control of prompt information, processing control of the acquired detection data, acquisition control of the detection data, and so on; various processing configurations may be associated with the recognized first subject information and/or second subject information, depending on the application of the detection data.
For example, the acquired detection data may be subjected to analysis processing based on analysis configuration information associated with the identified first subject information "subject 1".
Because the subject information "subject 1" is also associated with the second subject information "body" and "joint", the control unit 10 causes the display unit to show the second subject information associated with the recognized first subject information "subject 1": "body" and "joint". When the user intends to photograph the joint, the user can select "joint", and the acquired detection data is then analyzed according to the analysis configuration information associated with the second subject information "joint".
In this case, it is preferable that the first subject information obtained by the recognition be displayed on the display unit 9 to notify the user.
Step A09 determines whether to end; if so, the process ends. If not, the process returns to step A07, and the detection data subsequently acquired by the second acquisition unit can be processed.
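The control flow of steps A01-A09 can be summarized as the sketch below. It reuses select_subject_info and config_for from the earlier sketches; the device object and the helpers recognize_sign and process are hypothetical stand-ins for the units and operation signals described above, not the actual firmware interface.

```python
def main_loop(device, stored_subjects):
    """Sketch of the control flow A01-A09 of fig. 4 (hypothetical helpers)."""
    while True:
        if not device.acquire_first_image_requested():           # A01
            continue
        first_image = device.capture_first_image()               # A02
        if not device.recognition_requested():                   # A03
            if device.exit_requested():                          # A06
                break
            continue
        character_info = recognize_sign(first_image)             # A04
        first_subject = select_subject_info(character_info,
                                            stored_subjects)     # A05
        if first_subject is None:
            if device.exit_requested():                          # A06
                break
            continue
        while not device.end_requested():                        # A09
            detection_data = device.capture_detection_data()     # A07
            process(detection_data, config_for(first_subject))   # A08
        break
```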
Clearly, when a subject is detected, the corresponding processing can be carried out conveniently according to the recognized first subject information, sparing the user tedious manual operations.
Further, when the user selects the second subject information "joint" based on the identified first subject information "subject 1", the processing may be performed based on the configuration information associated with the second subject information "joint".
In another embodiment, the processing may be performed based on the configuration information associated with the first subject information "subject 1" and the configuration information associated with the second subject information "joint".
As described above, the first image is acquired and then recognized to obtain the first subject information, and processing control is performed according to the first subject information and/or second subject information selected based on it; this solves the problems of the prior art, in which the user had to read the label of the subject and record the subject information manually, which was inconvenient. A detection device with a first-image capture function conveniently acquires the first subject information of a labeled subject, can be used both in daytime and at night, and does not require the user to approach the part of the subject where the label is mounted; the operation is therefore simple and effective, the user's labor is greatly reduced, the first subject information is acquired accurately, and processing control and powerful processing effects are easy to realize.
Further, the acquisition of the first image and the recognition are not limited to being triggered by user instructions through the operation unit 11; in one example, the first image may be acquired continuously while the recognition processing is performed according to a recognition instruction of the user; in another example, first images may be acquired continuously and the continuously acquired first images may also be recognized continuously, and when the first subject information is recognized, subsequent processing is carried out according to the acquired first subject information.
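The continuous-acquisition variant is essentially a capture-and-recognize loop; the sketch below keeps the device-specific routines as injected callables (for example, the helpers sketched earlier), since the patent does not prescribe them.

```python
# Sketch of the continuous-acquisition variant: keep capturing and recognizing first
# images until first subject information is found, then acquire and process the
# detection data. The callables are placeholders for device-specific routines.
import time

def run_continuous(capture, recognize, match, acquire, process, poll_interval_s=0.5):
    while True:
        first_image = capture()                  # first acquisition unit keeps shooting
        subject = match(recognize(first_image))  # placard OCR + lookup (steps A04/A05)
        if subject is not None:
            process(acquire(), subject)          # steps A07/A08 for the identified subject
            return subject
        time.sleep(poll_interval_s)              # otherwise try again with the next frame
```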
Example 2
The shooting and collecting functions of the first and second acquisition units are not indispensable; the first image and/or the detection data may instead be acquired by reading them from a storage medium and/or receiving them through a communication interface. The present invention can therefore also be widely applied to information acquisition apparatuses that read detection data, or that receive and process detection data from the outside, such as personal computers, personal digital assistants, and the like.
The information acquisition apparatus 101 according to embodiment 2 reads the first image and the detection data stored in the storage medium to realize the functions of the first acquisition unit and the second acquisition unit.
The structure of the information acquisition apparatus 101 of embodiment 2 is explained with reference to fig. 7; the information acquisition apparatus 101 includes a communication interface 1, an auxiliary storage unit 2, a display unit 3, a RAM4, a hard disk 5, an operation unit 6, and a CPU7 connected to the above components via a bus and performing overall control. The information acquisition apparatus 101 may be, for example, a computer or a personal digital assistant.
The communication interface 1 may include various wired or wireless communication interfaces, such as a network interface, a USB interface, a 1394 interface, a video interface, GPRS, 3G, 4G, 5G, and the like.
The auxiliary storage unit 2 is a storage medium such as a memory card, together with its related interface.
The display unit 3 is, for example, a liquid crystal display; the display unit 3 may also be an external display connected to the information acquisition apparatus 101, in which case the information acquisition apparatus 101 itself need not include a display in its electrical configuration.
The RAM4 functions as a work memory for the CPU7, and temporarily stores data processed by the CPU 7.
The hard disk 5 stores a program for control and various data used in the control, and also stores a table such as that shown in fig. 3.
The operation unit 6 allows the user to perform various instruction operations or operations such as inputting setting information; the CPU7 executes a program in response to an operation signal from the operation unit 6. A touch screen or keys (not shown) may be employed to implement the relevant operations.
In embodiment 2, the CPU7 serves as an example of the first acquisition unit and the second acquisition unit, for example by reading the first image and the detection data from a storage medium such as the hard disk 5.
Further, the CPU7 may also receive, through the communication interface 1, the first image and/or the detection data provided by an external device, such as a device that captures the first image or a device that obtains detection data through a probe.
An example of the control flow of the information acquisition apparatus 101 is explained with reference to fig. 8; here, as an example, a first image file and a detection data file associated with each other are stored in the hard disk 5; the first image and the detection data were acquired, for example, by a detection device capable of both capturing the first image and obtaining the detection data through a probe, and were recorded in association with each other.
Step B01, when there is an instruction to acquire the first image, for example when the user selects a first image file through the operation unit 6, the CPU7, acting as the first acquisition unit, reads the corresponding file from the hard disk 5 to acquire the first image, and the process proceeds to the next step;
Step B02, recognizing the first image to obtain the first subject information; specifically, the CPU7 locates the placard region in the first image, performs character segmentation, then performs template matching to determine the characters and obtain the recognized character information; whether the first subject information has been obtained is then determined by matching the recognized character information with the subject information stored in the hard disk 5; if not, the process jumps to step B05, optionally with a further prompt such as "subject information not recognized"; if so, the process proceeds to the next step;
Step B03, the CPU7 may function as the second acquisition unit and acquire the detection data; specifically, the detection data file associated with the first image file is selected and read;
the selection is not limited to a pre-associated pair of first image file and detection data file; for example, the user may select from among the detection data files stored on the hard disk 5, or the selection may be performed according to a predetermined condition, for example selecting the detection data whose capture time is closest to the time at which the first image was captured.
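Selection by the closest-capture-time condition could, for instance, rely on file timestamps; the sketch below assumes the capture time is available as the file modification time and that detection data files carry a .dat extension, both of which are assumptions made for the sketch.

```python
# Sketch of time-based selection: pick the detection data file whose timestamp is
# closest to the capture time of the first image. Using the modification time and
# the .dat extension are illustrative assumptions.
import os
from pathlib import Path

def select_by_time(first_image_path, detection_dir):
    target = os.path.getmtime(first_image_path)
    candidates = list(Path(detection_dir).glob("*.dat"))
    if not candidates:
        return None
    return min(candidates, key=lambda p: abs(os.path.getmtime(p) - target))
```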
Step B04, performing processing according to the configuration information associated with the first subject information;
the processing control is, for example, one or more of processing control related to notification, display control of prompt information, processing control of the acquired detection data, acquisition control of the detection data, and the like; depending on the application of the detection data, various processing configurations may be associated with the identified first subject information and/or second subject information.
Step B05, judging whether to end; if so, the process ends; if not, the process returns to step B01 and the user can select the next first image file for subsequent processing; if a plurality of first image files to be processed were selected previously, the process can also return to step B01 to read the next first image file automatically;
preferably, when a plurality of first image files are selected for processing, the first image files for which no first subject information was recognized, and/or the detection data files associated with them, can be automatically classified, for example stored in a specific folder, to facilitate subsequent manual processing;
further, it is preferable that, when the first subject information is recognized, the detection data associated with the first image is subjected to recognition processing based on the subject recognition information associated with the first subject information, to determine whether the first subject, or a part of it, was photographed, and the classification is then performed automatically.
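The automatic classification of unrecognized files reduces to moving them into a dedicated folder; a minimal sketch (the folder name and the pairing of image and data files are assumptions) could be:

```python
# Sketch of automatic classification in the batch case: first image files whose placard
# was not recognized, together with their associated detection data files, are moved to
# a dedicated folder for later manual handling. Folder name and pairing are assumptions.
import shutil
from pathlib import Path

def classify_unrecognized(pairs, unsorted_dir="unrecognized"):
    """pairs: iterable of (first_image_path, detection_data_path, subject_or_None)."""
    out = Path(unsorted_dir)
    out.mkdir(exist_ok=True)
    for image_path, data_path, subject in pairs:
        if subject is None:  # no first subject information was recognized
            shutil.move(str(image_path), str(out / Path(image_path).name))
            if data_path is not None:
                shutil.move(str(data_path), str(out / Path(data_path).name))
```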
As described above, since the first subject information is obtained by identifying the first image and the detection data related to the first image is processed according to the configuration information related to the first subject information, the processing quality of the detection data can be greatly improved and the manual operation of the user can be reduced.
Example 3
One of the first acquisition section and the second acquisition section may have a function of acquiring the first image or the detection data, and the other may be realized by reading the first image or the detection data stored in the storage medium or received through the communication interface 1.
In the information acquisition apparatus 102 according to embodiment 3, the first acquisition unit has a function of capturing the first image, and the second acquisition unit receives the detection data via the communication interface 1. The information acquisition apparatus 102 may be, for example, a personal digital assistant or a portable computer equipped with a visible light imaging device.
The structure of the information acquisition apparatus 102 of embodiment 3 is explained with reference to fig. 9, and fig. 9 is a block diagram of the electrical structure of the information acquisition apparatus 102 of embodiment 3.
The information acquisition apparatus 102 includes a communication interface 1, an auxiliary storage unit 2, a display unit 3, a RAM4, a hard disk 5, an operation unit 6, a CPU7 connected to the above components via a bus and performing overall control, and a first acquisition unit 8. Components having the same reference numbers as in embodiment 2 are similar to those in embodiment 2, and their description is omitted.
In this example, the first acquisition unit 8 is a visible light imaging unit that images the placard of the subject to acquire the first image; the CPU7, as an example of the second acquisition unit, may receive through the communication interface 1 detection data supplied from an external device, such as a detection device with a detection data acquisition function. A typical usage scenario is a user carrying both the portable information acquisition apparatus 102 and a portable detection device: the user first captures the first image with the information acquisition apparatus 102 and then processes the detection data. The information acquisition apparatus 102 and the portable detection device may also be mounted on a platform such as a vehicle.
An example of the control flow of the information acquisition apparatus 102 is explained with reference to fig. 10; the information acquiring device 102 is connected to a detecting device such as a thermal image capturing device through the communication interface 1.
Step C01, when an instruction to acquire the first image is given, the CPU7 controls the first acquisition unit 8 to capture the first image, and the process proceeds to the next step; for example, the user holds the information acquisition apparatus 102 and first captures the first image;
Step C02, performing recognition processing on the first image to obtain the first subject information; specifically, the CPU7 locates the placard region, performs character segmentation, then performs template matching to determine the characters and obtain the recognized character information; whether the first subject information has been obtained is then determined by matching the recognized character information with the subject information stored in the hard disk 5; if not, the process jumps to step C05, and if so, it proceeds to the next step; the recognized first subject information may be displayed as a notification;
Step C03, the CPU7 may function as the second acquisition unit and acquire the detection data; specifically, detection data collected by the connected detection device is received through the communication interface 1, for example while the user holds the detection device and collects the detection data.
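Receiving detection data from the connected detection device over the communication interface could look like the TCP sketch below; the length-prefixed framing, address, and port are assumptions made for the sketch, since the patent does not fix a transfer protocol.

```python
# Sketch of step C03: receive one frame of detection data from the connected detection
# device over a socket. The 4-byte big-endian length prefix, address, and port are
# illustrative assumptions, not a protocol defined by the patent.
import socket
import struct

def _recv_exact(conn, n):
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("detection device closed the connection early")
        buf += chunk
    return buf

def receive_detection_data(host="192.168.1.10", port=5000):
    with socket.create_connection((host, port), timeout=10) as conn:
        (length,) = struct.unpack(">I", _recv_exact(conn, 4))
        return _recv_exact(conn, length)  # raw detection data, e.g. one thermal frame
```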
Step C04, performing processing according to the first subject information or the configuration information associated with the first subject information;
the processing control is, for example, one or more of processing control related to notification, display control of prompt information, processing control of the acquired detection data, acquisition control of the detection data, and the like; depending on the application of the detection data, various processing configurations may be associated with the identified first subject information and/or second subject information.
Step C05, judging whether to end; if so, the process ends; if not, the process returns to step C01 to acquire the next first image for subsequent processing.
Other embodiments
The present invention is not limited to acquiring detection data based on an output signal of a probe or from the outside; the information acquisition apparatus may also be configured, for example, as a component or functional block of a detection device or of a processing device.
In the above examples a certain step order is described, but various orders are possible in different embodiments, and the processing order is not limited to that described above. When the control unit 10, the image processing unit, and the like include a plurality of processors, some steps may be processed in parallel.
In the above examples, the first image is acquired first and the detection data is then acquired and processed; however, the detection data may also be acquired first and the first image acquired afterwards, with the detection data processed according to the first subject information obtained by recognizing the first image.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a storage device to perform the functions of the above-described embodiments, and by a method whose steps are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a storage device to perform the functions of the above-described embodiments. For this purpose, the program is supplied to the computer, for example, via a network or from a recording medium of various types serving as the storage device (e.g., a computer-readable medium).
The present invention also provides a computer program, or a digital signal formed from the computer program, recorded on a computer-readable recording medium such as a hard disk or a memory; when the program is run, the following steps are executed:
a first acquisition step of acquiring a first image;
a second acquisition step of acquiring detection data;
a control step of performing processing control in accordance with first subject information acquired by the first image recognition and/or second subject information selected based on the first subject information; the second subject information is information related to the subject selected based on the first subject information.
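Taken together, the three steps of the recorded program correspond to a short routine such as the one below; the acquisition, recognition, configuration, and processing functions are passed in as placeholders (for example, the helpers sketched in the embodiments), since the patent does not define a concrete API.

```python
# Sketch of the recorded program's three steps; the injected callables stand for the
# device- or file-specific routines and are not an API defined by the patent.
def run_program(acquire_first_image, acquire_detection_data,
                recognize_subject, select_config, process_detection_data):
    first_image = acquire_first_image()           # first acquisition step
    detection_data = acquire_detection_data()     # second acquisition step
    first_info = recognize_subject(first_image)   # control step: identify the subject
    if first_info is None:
        print("first subject information not recognized")
        return None
    config = select_config(first_info)            # configuration associated with it
    return process_detection_data(detection_data, config)
```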
Embodiments of the present invention also provide a readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer in a thermographic imaging apparatus to perform the above steps.
The functional blocks in the drawings may be implemented by hardware, software, or a combination thereof, and there is generally no need for structures that implement the functional blocks in a one-to-one correspondence; one or more of the functional blocks may be implemented by one or more hardware and/or software modules. In addition, some or all of the processing and control functions of the components in the embodiments of the present invention may be implemented by dedicated circuits, general-purpose processors, or programmable FPGAs.
In addition, although the examples are illustrated with subjects in the electric power industry, the invention is equally applicable to a wide range of other inspection industries.
The above description presents only specific examples (embodiments) of the invention, and the various illustrations do not limit the substance of the invention; further embodiments can be configured by corresponding substitution and combination of the embodiments. Those skilled in the art, upon reading the present specification, can make other modifications and variations to the specific embodiments without departing from the spirit and scope of the invention.

Claims (9)

1. An information acquisition apparatus, comprising a data acquisition unit,
a first acquisition section for acquiring a first image; the first acquisition unit acquires a first image by imaging a label related to a subject;
a first recognition unit configured to acquire first subject information based on recognition of the first image;
a second acquisition section for acquiring detection data;
a control unit configured to perform corresponding processing on the acquired detection data according to first subject information acquired by the first image recognition and/or second subject information selected based on the first subject information, and configuration information associated with the first subject information and/or configuration information associated with the second subject information;
the processing control includes one or more of notification control, display control of prompt information, acquisition control of detection data, and processing control of the acquired detection data;
the second subject information is information related to the subject selected based on the first subject information.
2. The information acquisition apparatus according to claim 1, comprising,
and a notification unit configured to perform notification based on first subject information acquired by the first image recognition and/or second subject information selected based on the first subject information.
3. The information acquisition apparatus according to claim 1, wherein there is a first recognition section for acquiring character information based on recognition of the first image and comparing the character information with prescribed subject information; when the two match, this represents that the first subject information has been acquired by the recognition.
4. The information acquisition apparatus according to claim 1, wherein the detection data is image data acquired by shooting.
5. The information acquisition apparatus according to claim 1, having auxiliary light source means for illuminating a label related to the subject; the first acquisition unit acquires a first image by imaging the sign illuminated by the auxiliary light source device.
6. The information acquisition apparatus according to claim 1, wherein the first acquisition section is a visible light camera or a near infrared camera, and the first image is acquired by photographing a sign related to the subject.
7. The information acquisition apparatus according to claim 1, wherein the first acquisition section acquires the first image by receiving visible light and/or infrared light.
8. The information acquisition apparatus according to claim 1, wherein the first acquisition section and the second acquisition section may be the same acquisition section, or different acquisition sections.
9. An information acquisition method, comprising the steps of:
s91: a first acquisition step of acquiring a first image; the first acquiring step acquires a first image by photographing a label related to a subject;
s92: a first recognition step for acquiring character information and/or first subject information based on the recognition of the first image;
s93: a second acquisition step of acquiring detection data;
s94: a control step, configured to perform corresponding processing on the acquired detection data according to first object information acquired by the first image recognition and/or second object information selected based on the first object information, configuration information associated with the first object information, and/or configuration information associated with the second object information;
the processing control includes one or more of notification control, display control of prompt information, acquisition control of detection data, and processing control of the acquired detection data;
the second subject information is information related to the subject selected based on the first subject information.
CN201510202320.XA 2014-04-29 2015-04-26 Information acquisition apparatus and information acquisition method Active CN105092051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510202320.XA CN105092051B (en) 2014-04-29 2015-04-26 Information acquisition apparatus and information acquisition method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN2014101779501 2014-04-29
CN201410177950 2014-04-29
CN201510202320.XA CN105092051B (en) 2014-04-29 2015-04-26 Information acquisition apparatus and information acquisition method

Publications (2)

Publication Number Publication Date
CN105092051A CN105092051A (en) 2015-11-25
CN105092051B true CN105092051B (en) 2021-04-06

Family

ID=54572989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510202320.XA Active CN105092051B (en) 2014-04-29 2015-04-26 Information acquisition apparatus and information acquisition method

Country Status (1)

Country Link
CN (1) CN105092051B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101791981A (en) * 2010-03-09 2010-08-04 成都市猎户座科技有限责任公司 All-weather automobile radar anti-collision driving system
CN201788026U (en) * 2010-07-27 2011-04-06 凯迈广微(洛阳)光电设备有限公司 Front-end dynamic forest fire recognizing and alarming system and thermal infrared imager thereof
CN102156862A (en) * 2011-05-06 2011-08-17 杨星 License plate recognition system and license plate recognition method preventing blocking and altering
CN102842034A (en) * 2012-07-10 2012-12-26 重庆大学 Device for laser scanning and automatically identifying carved character and identification method
CN103105234A (en) * 2012-01-12 2013-05-15 杭州美盛红外光电技术有限公司 Thermal image device and thermal image standardized shooting method
CN103428429A (en) * 2012-05-23 2013-12-04 杭州美盛红外光电技术有限公司 Image shooting device and image shooting method
CN103674270A (en) * 2012-09-21 2014-03-26 杭州美盛红外光电技术有限公司 Thermal image information recording device and thermal image information recording method

Also Published As

Publication number Publication date
CN105092051A (en) 2015-11-25

Similar Documents

Publication Publication Date Title
US20160005156A1 (en) Infrared selecting device and method
KR20130006878A (en) Method for restoring an image of object using base marker in smart device, and smart device for restoring an image of object using the base marker
CN114923583A (en) Thermal image selection device and thermal image selection method
CN105224896B (en) Recording apparatus, processing apparatus, recording method, and processing method
WO2015096824A1 (en) Analysis device and analysis method
CN105262943A (en) Thermal image recording device, thermal image processing device, thermal image recording method and thermal image processing method
CN105224897B (en) Information providing apparatus, detecting system, and information providing method
CN105092051B (en) Information acquisition apparatus and information acquisition method
CN104655636B (en) Thermal image analysis device, thermal image configuration device, thermal image analysis method and thermal image configuration method
CN105157742B (en) Identification device and identification method
CN105208299A (en) Thermal image shooting device, thermal image processing device, thermal image shooting method and thermal image processing method
CN114923581A (en) Infrared selecting device and infrared selecting method
US20150358559A1 (en) Device and method for matching thermal images
WO2015074628A1 (en) Analysis comparison apparatus and analysis comparison method
CN104219425A (en) Thermal-image dynamic recording device, thermal-image dynamic playback device, thermal-image dynamic recording method and thermal-image dynamic playback method
CN104655284B (en) Analysis device, processing device, analysis method, and processing method
US20150334314A1 (en) Device and method for detecting thermal images
CN116358711A (en) Infrared matching updating device and infrared matching updating method
CN105021290B (en) Shooting device, pseudo color setting device, shooting method and pseudo color setting method
CN104748863A (en) Infrared analysis area setting device and infrared analysis area setting method
CN104751445A (en) Thermal image analysis configuration device and thermal image analysis configuration method
CN104655637B (en) Selection device and selection method
CN114838829A (en) Thermal image selection notification device and thermal image selection notification method
CN115993191A (en) Thermal image matching updating device and thermal image matching updating method
CN105306841A (en) Thermal image recording device, thermal image playback device, thermal image recording method and thermal image playback method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 310030 Building 1, qixianqiao village, Liangzhu street, Yuhang District, Hangzhou City, Zhejiang Province

Applicant after: Mission Infrared Electro-optics Technology Co., Ltd.

Address before: 310030 Zhejiang city of Hangzhou province Xihu District city Hongkong No. 386 thick Renlu 14 Building 3 floor

Applicant before: Mission Infrared Electro-optics Technology Co., Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant