CN111665201A - Spectrum detection method, mobile terminal and chip system


Info

Publication number
CN111665201A
Authority
CN
China
Prior art keywords
target object
detection
spectrum
target
light wave
Prior art date
Legal status
Pending
Application number
CN202010364316.4A
Other languages
Chinese (zh)
Inventor
石巍巍
刘默翰
周力
赵磊
俞清华
白立勋
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010364316.4A
Publication of CN111665201A
Priority to PCT/CN2021/082076 (published as WO2021218489A1)

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 - Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/255 - Details, e.g. use of specially adapted sources, lighting or optical systems

Abstract

The application relates to the technical field of detection and provides a spectrum detection method, a mobile terminal and a chip system. The spectrum detection method comprises the following steps: acquiring a first attribute of a target object, and determining a target light wave band according to the first attribute; sending an emission instruction to a light wave emitting device on the mobile terminal, wherein the emission instruction instructs the light wave emitting device to emit a detection light wave whose wave band lies within the target light wave band; acquiring a target object scattering spectrum received by a light wave receiving device on the mobile terminal, wherein the target object scattering spectrum is the scattering spectrum obtained after the detection light wave is reflected by the target object; and performing spectrum detection on a second attribute of the target object according to the target object scattering spectrum to obtain a first detection result. The method effectively improves both the efficiency of spectrum detection and the accuracy of the detection result.

Description

Spectrum detection method, mobile terminal and chip system
Technical Field
The present application relates to the field of detection, and in particular, to a spectrum detection method, a mobile terminal, and a chip system.
Background
Spectral detection refers to determining the chemical composition of a substance from its spectral information. Spectrum detection technology is applied in fields such as food safety, medical treatment and biology. For example, it can be used to detect pesticide residue on the surface of fruit, or to detect the nutritional components of vegetables, such as fat content and protein content.
The process of spectral detection is typically as follows: a light wave emitting device emits full-waveband light waves towards a target object; the light waves are reflected by the target object to produce scattered light waves; a light wave receiving device receives the scattered light waves and generates a full-waveband scattering spectrum; and a processor performs spectrum detection of the chemical components of the target object according to the full-waveband scattering spectrum.
Because the existing spectrum detection method detects according to a full-waveband scattering spectrum, and full-waveband detection takes a long time, its detection efficiency is low; in addition, the spectral energy in a full-waveband scattering spectrum is mixed, so the detection precision of the existing method is also low.
Disclosure of Invention
The application provides a spectrum detection method, a mobile terminal and a chip system, which can improve the efficiency and accuracy of spectrum detection.
To achieve the above purpose, the following technical solutions are adopted:
in a first aspect, a method for spectrum detection is provided, which includes:
acquiring a first attribute of a target object, and determining a target light wave band according to the first attribute;
sending an emission instruction to a light wave emission device on the mobile terminal, wherein the emission instruction is used for instructing the light wave emission device to emit detection light waves, and the wave bands of the detection light waves are within the target light wave band;
acquiring a target object scattering spectrum received by a light wave receiving device on a mobile terminal, wherein the target object scattering spectrum is a scattering spectrum obtained by reflecting the detection light wave by the target object;
and performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result.
In a possible implementation manner of the first aspect, the first attribute of the target object includes the type of the target object;
the acquiring a first attribute of a target object and determining a target light wave band according to the first attribute includes:
acquiring a first image of the target object, and identifying the type of the target object according to the first image of the target object;
and determining the target light wave band according to the type of the target object.
In a possible implementation manner of the first aspect, the performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result includes:
acquiring an environment scattering spectrum in the target light wave band, wherein the environment scattering spectrum is a scattering spectrum corresponding to the situation that no target object exists in the emission range of the light wave emission device;
carrying out light wave denoising processing on the target object scattering spectrum according to the environment scattering spectrum to obtain a processed target object scattering spectrum;
and performing spectrum detection on the second attribute of the target object according to the processed scattering spectrum of the target object to obtain a first detection result.
In a possible implementation manner of the first aspect, the performing spectrum detection on the second attribute of the target object according to the processed target object scattering spectrum to obtain a first detection result includes:
acquiring the light wave wavelength corresponding to the second attribute, and determining a spectrum intensity value corresponding to the light wave wavelength in the processed target object scattering spectrum, wherein the light wave wavelength is within the target light wave band;
and carrying out spectrum detection on the second attribute of the target object according to the spectrum intensity value to obtain a first detection result.
In a possible implementation manner of the first aspect, the performing, according to the spectral intensity value, spectral detection on the second attribute of the target object to obtain a first detection result includes:
acquiring volume information of the target object;
inputting the volume information and the spectral intensity value of the target object into a preset detection model, and performing spectral detection on the second attribute of the target object to obtain a first detection result.
In a possible implementation manner of the first aspect, the acquiring the volume information of the target object includes:
acquiring a second image of the target object;
and determining the volume information of the target object according to the size relationship between a marker in the second image of the target object and the second image.
In a possible implementation manner of the first aspect, after the performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result, the method further includes:
sending the emission instruction to the light wave emitting device another N times, and after each sending, obtaining the target object scattering spectrum received by the light wave receiving device, so as to obtain N spectra to be detected, wherein N is a positive integer;
performing spectrum detection on the second attribute of the target object according to each spectrum to be detected to obtain N second detection results;
and determining a second attribute of the target object according to the first detection result and the N second detection results.
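For illustration only, the repeated detection and aggregation in this implementation could be sketched as follows in Python; the callable names and the use of a simple average of numeric results are assumptions of the sketch, not part of the claimed method.

```python
def repeat_and_aggregate(first_result, n, emit_and_acquire_spectrum, detect_second_attribute):
    """Re-send the emission instruction N times and combine the N + 1 detection results.

    The combination rule is not fixed by the text; a plain average of numeric
    results is used here as a placeholder.
    """
    results = [first_result]
    for _ in range(n):
        spectrum = emit_and_acquire_spectrum()              # one spectrum to be detected
        results.append(detect_second_attribute(spectrum))   # one second detection result
    return sum(results) / len(results)                      # determined second attribute
```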
In a possible implementation manner of the first aspect, after determining the second attribute of the target object according to the first detection result and the N second detection results, the method further includes:
and updating the detection model according to the volume information of the target object, the spectral intensity value and the second attribute of the target object.
In a second aspect, there is provided a spectrum detecting apparatus, comprising:
the acquisition unit is used for acquiring a first attribute of a target object and determining a target light wave band according to the first attribute;
the transmitting unit is used for transmitting a transmitting instruction to a light wave transmitting device on the mobile terminal, wherein the transmitting instruction is used for instructing the light wave transmitting device to transmit a detection light wave, and the wave band of the detection light wave is within the target light wave band;
the receiving unit is used for acquiring a target object scattering spectrum received by a light wave receiving device on the mobile terminal, wherein the target object scattering spectrum is a scattering spectrum obtained by reflecting the detection light wave by the target object;
and the detection unit is used for performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result.
In a third aspect, an embodiment of the present application provides a mobile terminal, including: one or more processors, memories, display screens, lightwave emitting devices, and lightwave receiving devices;
the memory, the display screen, the lightwave transmitting means, and the lightwave receiving means are respectively coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions;
the computer instructions, when executed by the one or more processors, cause the mobile terminal to perform the spectral detection method as described in any of the first aspects above.
In a fourth aspect, the present application provides a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the spectrum detection method according to any one of the first aspect.
In a fifth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the spectrum detection method according to any one of the above first aspects.
In a sixth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to perform the spectral detection method according to any one of the above first aspects.
According to the spectrum detection method provided by the application, a target light wave band is first determined according to a first attribute of a target object, and a detection light wave within the target light wave band is then emitted through a light wave emitting device on a mobile terminal; correspondingly, the band of the target object scattering spectrum received through a light wave receiving device on the mobile terminal also lies within the target light wave band. Because the wave band of the detection light wave is limited at emission time, when the second attribute of the target object is subsequently detected according to the target object scattering spectrum, only the target object scattering spectrum within the target light wave band needs to be examined rather than the scattering spectrum over the full wave band; the detection range is thus narrowed, the detection time is shortened, and the detection efficiency is improved. In addition, because the target light wave band is determined according to the first attribute of the target object, the target object scattering spectrum within that band accurately and intensively reflects the component characteristics of the target object; spectrum detection based on it therefore avoids the influence of scattering spectra in bands other than the target light wave band, which effectively improves the detection precision. The method has strong usability and practicability.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic diagram of a spectral detection system provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a detection device on a mobile terminal according to an embodiment of the present disclosure;
fig. 3 is a schematic view of an application scenario of a spectrum detection method provided in an embodiment of the present application;
fig. 4 is a schematic diagram of an application scenario of a spectrum detection method according to another embodiment of the present application;
FIG. 5 is a schematic flow chart of a spectral detection method according to an embodiment of the present application;
fig. 6 is a schematic view of an application scenario for acquiring a first image according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a first filter provided in an embodiment of the present application;
FIG. 8 is a flowchart illustrating a method for spectral detection of a second property of a target according to an embodiment of the present disclosure;
FIG. 9 is a schematic flow chart of a spectral detection method according to another embodiment of the present application;
FIG. 10 is a block diagram of a spectrum detection apparatus according to an embodiment of the present disclosure;
fig. 11 is a block diagram of a partial structure of a mobile phone according to an embodiment of the present disclosure;
fig. 12 is a schematic diagram of a software structure of a mobile phone according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, a schematic diagram of a spectrum detection system according to an embodiment of the present application is provided. As shown in fig. 1, the spectral detection system may include an application server 11 and at least one mobile terminal 12. The application server 11 is a server corresponding to the spectrum detection application program. The mobile terminal 12 is a mobile terminal device, such as a mobile phone, a notebook computer, a tablet computer, a camera, etc., installed with a spectrum detection application program. The application server 11 is communicatively coupled to the mobile terminal 12.
The mobile terminal 12 is provided with a detection device 121 for spectrum detection, a display 122 for interaction with a user, and a processor 123, wherein the detection device 121 and the display 122 are respectively in communication connection with the processor 123, and the processor 123 is further adapted to be in communication connection with the application server 11.
Referring to fig. 2, a schematic diagram of a detection device on a mobile terminal according to an embodiment of the present application is provided. As shown in fig. 2, the detection device 121 on the mobile terminal 12 includes a light wave emitting device 21 and a light wave receiving device 22. The light wave emitting device 21 may include a light source emitter 211 (e.g., a flash lamp, an LED lamp, etc. on the mobile terminal) and a first filter 212 (e.g., a filter wheel, a filter, etc.) disposed on the light source emitter. Optionally, the light wave emitting device 21 may further include an actuator 213 (a transmission device) for rotating or moving the first filter; when the actuator rotates or moves, it drives the first filter to rotate or move to a position where light waves of a specific wave band can be filtered. The light wave receiving device 22 may include a light source receiver 221 (e.g., a camera on the mobile terminal) and a second filter 222 (e.g., a grating) disposed on the light source receiver. In fig. 2, reference numeral 23 denotes the target object.
In one application scenario, the spectrum detection method provided by the embodiment of the present application may be executed by an application server. Referring to fig. 3, a schematic view of an application scenario of the spectrum detection method provided in the embodiment of the present application is shown. As shown in fig. 3, the user opens the spectrum detection application on the mobile terminal and inputs the first attribute of the target object in the application; the processor of the mobile terminal sends the first attribute of the target object input by the user to the application server; the application server determines a target light wave band according to the received first attribute of the target object and sends the target light wave band to the processor of the mobile terminal. Through the processor of the mobile terminal, the application server then sends an emission instruction to the light wave emitting device on the mobile terminal to control the light wave emitting device to emit a detection light wave within the target light wave band, obtains, through the processor of the mobile terminal, the target object scattering spectrum received by the light wave receiving device, and finally performs spectrum detection on the second attribute of the target object according to the target object scattering spectrum. The application server may then send the second attribute of the target object to the processor of the mobile terminal, which displays it to the user via the display of the mobile terminal.
In the application scenario shown in fig. 3, as long as the spectrum detection application is installed on the mobile terminal and the mobile terminal can normally communicate with the application server, the user can obtain the target object scattering spectrum of any target object at any time and any place through the mobile terminal, and communicate with the application server through the mobile terminal to obtain the spectrum detection result of the second attribute of the target object. In practical application, the spectrum detection result can be bound with an account number of a user in the spectrum detection application program, so that the spectrum detection result can be obtained when the user logs in the spectrum detection application program on different mobile terminals by using the same account number. However, in this application scenario, the mobile terminal is mainly used for data transmission (e.g. command interaction between the application server and the lightwave transmitting device/lightwave receiving device), and acquisition of the scattering spectrum of the target object, and the determination of the target lightwave band and the spectrum detection task of the second attribute of the target object are performed by the application server. In this case, the data processing amount of the application server is large, and particularly when the application server is connected to a plurality of mobile terminals and simultaneously performs a spectrum detection task on each mobile terminal, the load of the application server is heavy, and the operation speed of the application server is easily reduced, thereby reducing the detection efficiency.
In another application scenario, the spectrum detection method provided by the embodiment of the present application may also be executed by the processor of the mobile terminal. Referring to fig. 4, a schematic diagram of an application scenario of a spectrum detection method according to another embodiment of the present application is provided. As shown in fig. 4, the user opens the spectrum detection application on the mobile terminal and inputs the first attribute of the target object in the application; the processor of the mobile terminal determines a target light wave band according to the received first attribute, sends an emission instruction to the light wave emitting device on the mobile terminal to control the light wave emitting device to emit a detection light wave within the target light wave band, then obtains the target object scattering spectrum received by the light wave receiving device, and finally performs spectrum detection on the second attribute of the target object according to the target object scattering spectrum. After the spectrum detection result is obtained, the processor of the mobile terminal can display the detection result (the second attribute of the target object) to the user through the display of the mobile terminal, and can also store the detection result in the mobile terminal, so that the user can obtain the detection result at any time without detecting again; the detection result (the second attribute of the target object) may also be sent to the application server, as shown in fig. 4, so that the application server stores the spectrum detection result.
In the application scenario shown in fig. 4, the processor of the mobile terminal is responsible for acquiring the scattering spectrum of the target object, determining the wavelength band of the target light wave, and detecting the spectrum of the second attribute of the target object, and the application server is only used for data transmission and data storage. In this case, since the data processing amount of the application server is small, the application server may be connected to a plurality of mobile terminals, and each mobile terminal may simultaneously perform its own spectrum detection task, so as to implement a distributed spectrum detection function, thereby improving the detection efficiency.
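A minimal sketch of the on-device flow of fig. 4, with the helper callables injected as parameters (their names and signatures are illustrative assumptions, not an actual implementation of the method):

```python
def detect_on_device(first_attribute, determine_target_band, emit, read_spectrum,
                     run_detection, store_result=None):
    """End-to-end flow executed by the processor of the mobile terminal (fig. 4)."""
    band = determine_target_band(first_attribute)   # S501: first attribute -> target light wave band
    emit(band)                                      # S502: emission instruction within that band
    spectrum = read_spectrum()                      # S503: target object scattering spectrum
    result = run_detection(spectrum, band)          # S504: spectrum detection of the second attribute
    if store_result is not None:
        store_result(result)                        # optionally archived by the application server
    return result
```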
Referring to fig. 5, a schematic flow chart of a spectrum detection method provided in the embodiment of the present application is shown. By way of example and not limitation, the spectral detection method may include the steps of:
s501, acquiring a first attribute of a target object, and determining a target light wave band according to the first attribute.
The target object is the object to be detected. The first attribute of the target object includes the type of the target object.
Types may be classified according to the levels of biological classification, i.e., kingdom, phylum, class, order, family, genus and species, and the specific classification level can be set according to actual needs. Illustratively, objects may be classified by "kingdom", e.g., as animals, plants, microorganisms, etc.; or, on the basis of "kingdom", they may be classified again according to "species", such as dividing animals into chicken, duck, fish, etc., and dividing plants into apple, cabbage, etc. Of course, in addition to the levels of biological classification, classification may also be made according to varieties in economics; for example, apples may be divided into Red Fuji apples, Yellow Marshal apples and the like, and corn may be divided into sweet corn, sticky corn and the like. The above is merely an example and is not intended to limit the principle or the result of the classification, as long as the classification result (i.e., the type of the target object) reflects the difference in components between different objects.
Generally, the higher the classification level, the fewer biological commonalities its members share; the lower the classification level, the more biological commonalities they share. Therefore, in practical applications, the classification method may be selected according to the required degree of detail of component discrimination (i.e., the degree of biological commonality). For example, animal bodies contain animal fat while plant bodies contain plant fat; if it is only necessary to distinguish whether a body contains animal fat or plant fat, classification by "kingdom", i.e., into animals, plants, microorganisms and the like, is sufficient. Further, the fat content differs among different animals, so if the fat content of different animals is to be distinguished, classification by species, such as chicken, pig or fish, is needed.
In the application scenario as described in the embodiment of fig. 3 or fig. 4, one way to obtain the first attribute of the target object in S501 may be by user input. For example: the user directly inputs 'apple', 'fish', etc. However, this method has a disadvantage that it can only be applied to the case where the user knows the first attribute of the target object, and when the user does not know the first attribute of the target object, the first attribute of the target object cannot be obtained.
In order to solve the above problem, in one embodiment, another method may be adopted for obtaining the first attribute of the target object in S501:
S5011, acquiring a first image of the target object, and identifying the type of the target object according to the first image of the target object.
Alternatively, the first image of the target object may be a still image (e.g., a photograph) or a moving image (e.g., a video) of the target object. In this case, the type of the target object may be identified from the first image in either of two ways: the type of the target object may be obtained directly by performing type identification processing on the first image using a trained first identification model (such as a neural network model); or object recognition processing may first be performed on the first image using a trained second recognition model to obtain the name of the target object, after which the type corresponding to that name is looked up in a preset first database. The first database stores the correspondence between different objects and their types.
Alternatively, the first image of the target object may also be an image of a label representing the target object, such as an image of a two-dimensional code containing the name/type information of the target object, or an image of a commodity barcode. In this case, identifying the type of the target object according to the first image may mean directly obtaining the name of the target object contained in the two-dimensional code image or the commodity barcode image and then looking up the corresponding type in the preset first database; or the type information of the target object contained in the two-dimensional code image or the commodity barcode image may be obtained directly.
In an application scenario, referring to fig. 6, a schematic view of an application scenario for acquiring a first image according to an embodiment of the present application is provided. As shown in fig. 6(a), after the user opens the spectrum detection application on the mobile terminal 61, a detection frame 62 is displayed on the application interface (the detection range of the detection frame may be the shooting range of the camera device on the mobile terminal), and the user can select to scan a code or identify an object; when the first image of the target object represents the image of the label of the target object, the code scanning can be selected; when the first image of the target object represents a static image or a dynamic image of the target object, the object can be selected; the user moves the target object or the mobile terminal, so that a first image of the target object can be displayed in the detection frame (as shown in fig. 6 (b)); the processor of the mobile terminal acquires a first image of a target object in the detection frame, and then identifies the type of the target object according to the first image. In the process of acquiring the first image of the target object, when the user aligns the detection frame with the target object, the size of the detection frame can be automatically adjusted, so that the first image of the target object is completely displayed in the detection frame.
Because different types of objects have different compositions, they have different reflection capabilities for light waves of different wave bands. For example, in the scattering spectrum obtained after light waves are reflected by object A, the portion with higher spectral intensity values is concentrated in the 400-500 nm light wave band, whereas in the scattering spectrum obtained after light waves are reflected by object B, the portion with higher spectral intensity values is concentrated in the 500-600 nm light wave band; that is, for objects A and B, which belong to different types, the high-intensity portions of the corresponding scattering spectra are concentrated in different light wave bands. Therefore, after the type of the target object is identified according to its first image in S5011, determining the target light wave band according to the first attribute in S501 may include:
and S5012, determining the target light wave band according to the type of the target object.
Optionally, the light wave band corresponding to the type of the target object may be looked up in a preset second database; that light wave band is the target light wave band. The second database stores different types and their respective corresponding light wave bands.
By the methods in S5011 and S5012, even in the case where the user does not know the first attribute of the object, the first attribute of the object can be automatically acquired. In addition, the type information corresponding to the name of the target object can be directly searched according to the preset first database, and the target light wave band corresponding to the type of the target object can be directly searched according to the preset second database, so that the efficiency of acquiring the target light wave band is effectively improved.
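A minimal sketch of S5011-S5012 under the assumption of dictionary-backed databases; the table contents and the recognize_name callable (standing in for a trained recognition model or a two-dimensional-code/barcode decoder) are illustrative only.

```python
# Hypothetical contents of the first database (object name -> type) and the
# second database (type -> target light wave band, in nm).
FIRST_DATABASE = {"red fuji": "apple", "sweet corn": "corn"}
SECOND_DATABASE = {"apple": (400, 500), "corn": (500, 600)}

def determine_target_band(first_image, recognize_name):
    """S5011 + S5012: identify the type from the first image, then look up its band."""
    name = recognize_name(first_image)        # recognition model or code/barcode decoding
    object_type = FIRST_DATABASE[name]        # first database: name -> type
    return SECOND_DATABASE[object_type]       # second database: type -> target light wave band
```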
S502, sending an emission instruction to a light wave emission device on the mobile terminal, wherein the emission instruction is used for instructing the light wave emission device to emit detection light waves, and the wave bands of the detection light waves are within the target light wave band.
In an application scenario, before transmitting an instruction to a light wave transmitting device on the mobile terminal, a processor of the mobile terminal controls a display to display a detection instruction to a user; after checking the detection instruction on the display, the user aims the light wave emitting device on the mobile terminal at the target object and inputs a confirmation instruction through the display; and after receiving the confirmation instruction input by the user, the processor of the mobile terminal sends a transmitting instruction to the light wave transmitting device on the mobile terminal.
Optionally, the emission instruction may include a target lightwave band. Thus, the processor of the mobile terminal can control the light wave emitting device to emit the detection light wave in the target light wave band by using the emission instruction. In one application scenario, in a detection device on a mobile terminal as described in the embodiment of fig. 2, the lightwave emitting device includes a light source emitter, a first filter, and an actuator for rotating or moving the first filter. In the application scene, a processor of the mobile terminal firstly sends a transmitting instruction to the transmission equipment so as to control the transmission equipment to drive the first filter to rotate or move to a target position, and then the filtering wave band of the first filter is selected; and then the processor of the mobile terminal sends a transmitting instruction to the light source transmitter to control the light source transmitter to transmit the light source. By the mode, the light source emitted by the light source emitter is emitted to the target object after passing through the first filter, and the first filter can filter out light waves outside the target light wave band, so that the directional selection of the light wave band of the detection light wave is realized.
Exemplarily, referring to fig. 7, a schematic diagram of a first filter provided in an embodiment of the present application is shown.
As shown in fig. 7(a), the first filter may be a filter wheel (e.g., component 71 in fig. 7(a)) comprising a plurality of optical filters, each of which filters light waves in a different light wave band. The filter wheel may be mounted on a transmission device (e.g., component 72 in fig. 7(a)), which is provided with a rotating shaft (e.g., component 73 in fig. 7(a)). In application, the processor on the mobile terminal controls the rotating shaft to rotate, which drives the transmission device and in turn the filter wheel to rotate, so that different optical filters are selected and directional selection of the light wave band of the detection light wave is realized.
As shown in fig. 7(b), the first filter may also be a filter frame (e.g., component 75 in fig. 7(b)) consisting of a plurality of optical filters (e.g., component 74 in fig. 7(b)). The transmission device may be a slide rail (e.g., component 76 in fig. 7(b)) on which the filter frame is mounted. In application, the processor on the mobile terminal controls the slide rail to move, which drives the filter frame to move left and right along the moving direction of the slide rail, so that different optical filters are selected and directional selection of the light wave band of the detection light wave is realized.
Of course, the first filter may have other structures in practical applications. In addition, the first filter can also be separately and fixedly installed, and the transmission device and the light source emitter are installed together to drive the light source emitter to rotate or move, so that the selection of different optical filters is realized through the transmission of the light source emitter. It should be noted that the above is only an example of emitting the detection light waves of different light wave bands, and the structure of the first filter, the structure of the transmission device, and the manner of selecting the light wave band by combining the two are not limited.
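A sketch of the directional band selection of fig. 7, assuming a filter wheel whose positions are indexed by pass band; the FilterWheel interface and the light_source object below are hypothetical illustrations, not the patent's actual hardware interface.

```python
class FilterWheel:
    """Hypothetical first filter: each position passes a different light wave band (nm)."""
    def __init__(self, bands):
        self.bands = bands        # e.g. [(400, 500), (500, 600), (600, 700)]
        self.position = 0

    def rotate_to_band(self, target_band):
        # The transmission device rotates the wheel until the filter covering the
        # target light wave band sits in front of the light source emitter.
        for i, band in enumerate(self.bands):
            if band[0] <= target_band[0] and target_band[1] <= band[1]:
                self.position = i
                return
        raise ValueError("no filter covers the requested band")

def emit_detection_wave(wheel, light_source, target_band):
    wheel.rotate_to_band(target_band)   # first part of the emission instruction: select the filter
    light_source.turn_on()              # second part: emit through the selected filter
```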
And S503, acquiring the scattering spectrum of the target object received by the light wave receiving device on the mobile terminal.
And the target object scattering spectrum is a scattering spectrum obtained by reflecting the detection light wave by the target object. The scattering spectrum typically includes different wavelengths of light and their respective corresponding spectral intensity values.
In the detecting device on the mobile terminal as described in the embodiment of fig. 2, the light wave receiving means includes a light source receiver and a second filter. The second filter is used for separating light waves with different wavelengths to obtain a scattering spectrum. For example, the second filter may be a grating.
In the application scenario described in S502, the target light wave band is selected during light wave emission. In another application scenario, the target light wave band may also be selected during light wave reception. Specifically, the light wave receiving device may include a light source receiver, a second filter, a third filter, and a transmission device. In application, the processor of the mobile terminal controls the transmission device to rotate or move so as to drive the third filter to rotate or move to a target position, thereby selecting the light wave band. In this way, after the detection light wave is reflected by the target object, it is filtered by the third filter to obtain a received light wave within the target light wave band; the received light wave is then separated into different wavelengths by the second filter, and the light source receiver receives the separated light waves to obtain the target object scattering spectrum.
In practical applications, steps S502-S503 may be repeated multiple times in order to improve the accuracy of the target object scattering spectrum. Each execution of S502-S503 yields one target object scattering spectrum, and the multiple acquired spectra are then considered together to calculate the final target object scattering spectrum used for spectrum detection. For example, for each wavelength in the target object scattering spectrum, the spectral intensity values corresponding to that wavelength in the multiple acquired spectra may be averaged; after the average value corresponding to each wavelength has been calculated, the wavelengths and their corresponding average values are combined into the final target object scattering spectrum. In this way, an abnormal value in a single acquisition does not dominate the result; comprehensively considering the spectra acquired multiple times effectively improves the accuracy of the final target object scattering spectrum and provides a reliable data basis for subsequent spectrum detection.
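The per-wavelength averaging of repeated acquisitions, written out; representing spectra as dictionaries mapping wavelength to spectral intensity value is an assumption of this sketch.

```python
def average_spectra(spectra):
    """Combine several target object scattering spectra into the final spectrum.

    For every wavelength the spectral intensity values of all acquisitions are
    averaged, so an abnormal value in a single acquisition is diluted.
    """
    wavelengths = spectra[0].keys()
    return {w: sum(s[w] for s in spectra) / len(spectra) for w in wavelengths}
```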
S504, performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result.
The second attribute of the target object may be used to characterize whether the target object contains one or more substances, such as whether an apple contains sugar or vitamins; it may also be used to characterize the content of one or more substances (i.e., components) in the target object, such as the apple containing x mg of sugar, y mg of vitamins, or z mg of pesticide residue; it may likewise be used to characterize the proportion of one or more substances in the target object, such as the proportion of sugar or vitamins in the apple.
In the embodiment shown in fig. 5, because the light wave band of the detection light wave is limited according to the first attribute of the target object, the light wave band of the target object scattering spectrum is likewise limited. When the second attribute of the target object is subjected to spectrum detection according to the target object scattering spectrum, only the target object scattering spectrum within the target light wave band needs to be detected, rather than the scattering spectrum over the full wave band; the detection range is thus narrowed, the detection time is shortened, and the detection efficiency is improved. In addition, since the target light wave band is determined according to the first attribute of the target object, the target object scattering spectrum within the target light wave band accurately and intensively reflects the component characteristics of the target object; spectrum detection based on it therefore avoids the influence of scattering spectra in bands other than the target light wave band, which effectively improves the detection precision.
In one embodiment, the first attribute of the target object may also include the name of the second attribute, such as the name of a sugar, vitamin or pesticide. In this case, the target light wave band is determined according to the name of the second attribute, and the corresponding target object scattering spectrum is then obtained according to the target light wave band; this spectrum directly reflects the light wave scattering characteristics of the particular component (i.e., the component corresponding to the name of the second attribute) in the target object, which makes the spectrum detection more targeted, further narrows the detection range, and improves the detection efficiency.
In one application scenario, the user may enter a name of the second attribute that is desired to be detected, such as fat; and then the processor of the mobile terminal determines a target light wave band according to the name of the second attribute, sends a transmitting instruction to a light wave transmitting device on the mobile terminal, acquires a target object scattering spectrum received by a light wave receiving device on the mobile terminal, and finally performs spectrum detection on the fat content of the target object according to the target object scattering spectrum.
In one embodiment, referring to fig. 8, a flowchart of a method for performing spectrum detection on the second attribute of the target object according to an embodiment of the present application is shown. By way of example and not limitation, as shown in fig. 8, performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum in S504 to obtain a first detection result may include the following steps:
and S801, acquiring an environment scattering spectrum in the target light wave band.
The environment scattering spectrum is a corresponding scattering spectrum when no target object exists in the emission range of the light wave emission device.
In application, the user is first instructed, through the display on the mobile terminal, to aim at the target object (for example, a detection frame is displayed whose display range corresponds to the emission range of the light wave emitting device); the detection light wave is then emitted towards the target object and the reflected target object scattering spectrum is acquired. The user is next instructed to move the target object out of the emission range of the light wave emitting device, or to move the mobile terminal to a position where no target object lies within the emission range (for example, moving until the image of the target object can no longer be displayed in the detection frame); the detection light wave is emitted again and the reflected environment scattering spectrum is acquired. Since no target object exists in the emission range of the light wave emitting device, the environment scattering spectrum reflects the reflection characteristics, with respect to the detection light wave, of the various particles in the environment where the target object is located.
When the detection light wave is reflected by the target object, it is also reflected by the various particles in the environment where the target object is located, so the target object scattering spectrum includes both the scattering spectrum reflected by the target object and the scattering spectrum reflected by the environmental particles. In order to obtain the scattering spectrum reflected by the target object alone, the following process may be performed.
S802, carrying out light wave denoising processing on the target object scattering spectrum according to the environment scattering spectrum to obtain the processed target object scattering spectrum.
Optionally, the process of denoising the light wave may be: and subtracting the environmental scattering spectrum from the target scattering spectrum to obtain a processed target scattering spectrum. Specifically, for each wavelength in the target scattering spectrum, the spectral intensity value in the target scattering spectrum corresponding to the wavelength is subtracted from the spectral intensity value in the environmental scattering spectrum corresponding to the wavelength to obtain the processed spectral intensity value corresponding to the wavelength. And after the processed spectral intensity value corresponding to each wavelength is obtained, generating a processed target scattering spectrum by using each wavelength and the processed spectral intensity value corresponding to each wavelength.
The target object scattering spectrum obtained after the light wave denoising processing contains only the scattering spectrum reflected by the target object itself; the interference of environmental particles is thus removed, providing a reliable data basis for the subsequent spectrum detection.
When the method is applied, the step S801 may be repeated for multiple times, that is, the environmental scattering spectrum is obtained multiple times, then the average environmental scattering spectrum is calculated, and then the average environmental scattering spectrum is used to perform the light wave denoising processing on the target object scattering spectrum. By calculating the average value of the environment scattering spectra acquired for many times, the light wave reflection characteristics of various particles in the environment where the target object is located can be reflected more accurately.
The steps S801-S802 can also be repeated for a plurality of times, namely, after the environmental scattering spectrum is obtained each time, the scattering spectrum of the target object is subjected to primary light wave denoising treatment to obtain a treated scattering spectrum of the target object; and then calculating the average value of the obtained multiple processed target object scattering spectra to obtain the final processed target object scattering spectrum. By carrying out multiple denoising processing on the target object scattering spectrum, the interference of environmental particles in the target object scattering spectrum can be effectively filtered, the purity of the target object scattering spectrum is improved, and a reliable data basis is provided for subsequent spectrum detection.
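A sketch of S801-S802 under the same dictionary representation used above: one or more environment scattering spectra are averaged and then subtracted from the target object scattering spectrum wavelength by wavelength. The data layout is an assumption of the sketch.

```python
def denoise(target_spectrum, environment_spectra):
    """Light wave denoising: remove the contribution of environmental particles."""
    average_environment = {
        w: sum(e[w] for e in environment_spectra) / len(environment_spectra)
        for w in environment_spectra[0]
    }
    return {w: target_spectrum[w] - average_environment[w] for w in target_spectrum}
```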
And S803, performing spectrum detection on the second attribute of the target object according to the processed target object scattering spectrum to obtain a first detection result.
In an implementation manner of step S803, the target scattering spectrum may be directly input into the trained detection model for spectrum detection, and the output result is the first detection result. This implementation requires training the spectrum detection model in advance. During training, target object scattering spectra of various target objects can be acquired to train the detection model, and when the detection precision of the detection model reaches preset precision, the trained detection model is obtained.
Optionally, in order to further narrow the detection range and improve the detection efficiency, another implementation manner of step S803 may further include the following steps:
obtaining the light wave wavelength corresponding to the second attribute, and determining the spectral intensity value corresponding to the light wave wavelength in the processed scattering spectrum of the target object; and then carrying out spectrum detection on the second attribute of the target object according to the spectrum intensity value to obtain a first detection result.
Wherein the wavelength of the light wave is within the target light wave band.
Generally, different components have different light wave scattering characteristics, and a component may have a strong reflection capability only for light waves of one or several particular wavelengths; in other words, in the scattering spectrum corresponding to a component, the spectral intensity values corresponding to one or several particular wavelengths may be high. Therefore, when the second attribute is subjected to spectrum detection, the one or more light wave wavelengths for which the second attribute has a strong reflection capability can be obtained, and spectrum detection can then be performed using the spectral intensity values corresponding to these particular wavelengths in the target object scattering spectrum, which greatly improves the detection efficiency.
In application, the light wave wavelength corresponding to the second attribute may be looked up in a preset third database. The light wave scattering characteristics of each component (i.e., for which wavelength or wavelengths of light each component has a strong reflection capability) need to be obtained in advance, and the correspondence between each component and its light wave wavelength(s) is established and stored in the third database.
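Continuing the dictionary-based sketch, the third-database lookup and intensity extraction might look like this; the component-to-wavelength table below is invented purely for illustration.

```python
# Hypothetical third database: component name -> characteristic wavelengths (nm).
THIRD_DATABASE = {"fat": [930, 1210], "sugar": [840, 910]}

def intensities_for_attribute(attribute_name, processed_spectrum):
    """Pick the spectral intensity values at the wavelengths characterizing the
    second attribute (all of them lie within the target light wave band)."""
    return [processed_spectrum[w] for w in THIRD_DATABASE[attribute_name]]
```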
It should be noted that the first database, the second database, and the third database may be independent databases, or may be different storage portions in the same database.
When the second attribute of the target object is detected spectrally from the spectral intensity value, there are the following two cases.
In the first case, if it is only necessary to detect whether the target object contains a certain substance, the spectral intensity value may be input directly into the detection model when the second attribute of the target object is spectrally detected according to the spectral intensity value. The output of the detection model is then whether the target object contains or does not contain that substance.
In the second case, if the content of a certain substance in the target object needs to be detected, the volume information of the target object also needs to be acquired. Specifically, the method can comprise the following steps:
acquiring volume information of the target object; inputting the volume information and the spectral intensity value of the target object into a preset detection model, and performing spectral detection on the second attribute of the target object to obtain a first detection result.
The volume information may include volume, area, or data that can calculate length, width, height, etc. of the volume.
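For the second case, assuming a pre-trained regression-style detection model with a scikit-learn-like predict method (an assumption of this sketch, not a requirement of the method), the content detection could be written as:

```python
import numpy as np

def detect_content(model, volume_cm3, intensities):
    """Estimate the content of a substance from volume information and intensities.

    The feature layout (volume followed by the spectral intensity values) is an
    illustrative choice; the patent does not fix the model's input format.
    """
    features = np.array([[volume_cm3, *intensities]])
    return float(model.predict(features)[0])   # first detection result, e.g. mg of sugar
```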
Optionally, the manner of obtaining the volume information of the target object may include:
acquiring a second image of the target object; and determining the volume information of the target object according to the size relationship between a marker in the second image of the target object and the second image.
The second image of the target object may be the first image of the target object or the newly acquired image of the target object. The step of re-acquiring is the same as the step of acquiring the first image of the target object, and reference may be specifically made to the description in S5011, which is not repeated herein.
The marker may be the detection frame used when the second image is acquired. The size of the detection frame has a proportional relationship with the shooting range of the camera device on the mobile terminal that acquires the second image of the target object, and this proportional relationship can be preset or acquired in advance; therefore, as long as the size relationship between the second image and the marker is obtained, the actual volume information of the target object can be calculated.
For example, assume that the size of the detection frame is 5 cm × 5 cm and the corresponding shooting range is 5 m × 5 m, i.e., the ratio of the image area occupied by the detection frame (25 cm²) to the actual shooting range (250 000 cm²) is 1:10000; and assume that the ratio of the image area occupied by the second image to the image area occupied by the detection frame is 1:2, i.e., the second image occupies 12.5 cm². The actual area of the target object can then be calculated as 12.5 cm² × 10000 = 125 000 cm², i.e., 12.5 m².
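The same area calculation in code, using the example's numbers (frame image area 25 cm², shooting range 250 000 cm², second image half the frame); the function name and units are illustrative assumptions.

```python
def actual_area_cm2(frame_image_area, frame_real_area, image_to_frame_ratio):
    """Scale the image area of the second image up to the real area of the target object."""
    scale = frame_real_area / frame_image_area               # e.g. 250000 / 25 = 10000
    second_image_area = frame_image_area * image_to_frame_ratio
    return second_image_area * scale

print(actual_area_cm2(25.0, 250000.0, 0.5))   # 125000.0 cm^2, i.e. 12.5 m^2
```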
The marker can also be a third image contained in the second image, where the volume information of the actual object corresponding to the third image is known. A first proportional relation can then be calculated from the size of the third image and the volume information of its corresponding actual object, and a second proportional relation between the size of the second image and the actual volume information of the target object can be derived from the first proportional relation. However, the actual object corresponding to the third image and the target object may not lie in the same shooting plane, i.e. their distances from the camera device may differ, which may cause the first proportional relation to differ from the second proportional relation; the size relation between the two proportional relations therefore also needs to be determined.
For example, a distance sensor may be disposed on the mobile terminal to measure a first distance from the actual object corresponding to the third image to the mobile terminal and a second distance from the target object to the mobile terminal. The size relation between the first proportional relation and the second proportional relation is then determined according to the first distance and the second distance, the second proportional relation is calculated from the known first proportional relation and the determined size relation, and finally the actual area of the target object is calculated from the second proportional relation and the image area occupied by the second image.
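One way this distance correction could be realised, sketched under a simple pinhole-camera assumption of my own (the actual-area-to-image-area scale grows with the square of the object's distance); the distances and scale values below are invented examples:

def second_scale_from_first(first_scale, d_reference, d_target):
    """Derive the second proportional relation from the first one, assuming a
    pinhole camera where the actual-area / image-area scale is proportional to
    the square of the distance between the object and the camera device."""
    return first_scale * (d_target / d_reference) ** 2

def target_actual_area(second_image_area_cm2, first_scale, d_reference, d_target):
    return second_image_area_cm2 * second_scale_from_first(first_scale, d_reference, d_target)

# Reference object (third image) at 0.5 m with a known scale of 10000,
# target object at 1.0 m occupying 12.5 cm^2 of image area; numbers are invented.
print(target_actual_area(12.5, 10000.0, 0.5, 1.0))  # 500000.0 cm^2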
As another example, the camera device may include two cameras, so that the captured image carries depth information for both the third image and the second image; the size relation between the first proportional relation and the second proportional relation is then determined from the two pieces of depth information. The second proportional relation is calculated from the known first proportional relation and the determined size relation, and finally the actual area of the target object is calculated from the second proportional relation and the image area occupied by the second image.
Furthermore, on the basis of the actual area of the target object, the depth-of-field information corresponding to the second image can be obtained from the two cameras of the camera device, and the actual volume of the target object can then be calculated from the actual area of the target object and the depth-of-field information corresponding to the second image.
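As one possible (assumed) realisation, the thickness of the target object could be taken from the dual-camera depth map and multiplied by the actual area; the depth values and object mask below are invented:

import numpy as np

def estimate_volume_cm3(actual_area_cm2, depth_map_cm, object_mask):
    """Rough volume estimate: actual area multiplied by a thickness taken as the
    spread of depth values inside the target object region of the depth map."""
    depths = depth_map_cm[object_mask]
    thickness_cm = float(depths.max() - depths.min())
    return actual_area_cm2 * thickness_cm

depth_map = np.array([[50.0, 49.0, 50.0],
                      [49.0, 47.0, 49.0],
                      [50.0, 49.0, 50.0]])
mask = np.ones_like(depth_map, dtype=bool)
print(estimate_volume_cm3(100.0, depth_map, mask))  # 300.0 cm^3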
By the method described in the embodiment of fig. 8, the environment scattering spectrum can be filtered out of the target object scattering spectrum, i.e. the influence of the scattering spectra of the various particles in the environment where the target object is located is eliminated, so that the processed target object scattering spectrum only contains the scattering spectrum reflected by the target object itself. Performing spectrum detection on the second attribute of the target object with the processed target object scattering spectrum can therefore effectively improve the accuracy of the spectrum detection result.
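The application does not fix the exact denoising operation; one plausible reading is a per-wavelength subtraction of the environment scattering spectrum, sketched below with invented intensity values:

import numpy as np

def denoise_spectrum(target_spectrum, environment_spectrum):
    """Subtract the environment scattering spectrum from the target object
    scattering spectrum, clipping at zero so intensities stay non-negative."""
    return np.clip(np.asarray(target_spectrum) - np.asarray(environment_spectrum), 0.0, None)

target = [0.80, 0.62, 0.91, 0.40]       # intensities at the sampled wavelengths
environment = [0.10, 0.05, 0.12, 0.08]  # same wavelengths, no target object present
print(denoise_spectrum(target, environment))  # [0.7  0.57 0.79 0.32]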
In one embodiment, the detection model in step S803 may be updated periodically, and the update period may be set as required, for example updating once after each spectrum detection, once after every M spectrum detections, or once after a fixed time interval. Illustratively, after each spectrum detection, the detection model is updated with the volume information of the target object, the spectral intensity value, and the second attribute of the target object obtained in the current spectrum detection. Updating the detection model regularly helps maintain its detection accuracy.
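A minimal sketch of such a periodic update, assuming a model object exposing a fit() method and an update period of M detections; the buffer layout is an illustrative choice, not something specified by this application:

class DetectionModelUpdater:
    """Accumulate (volume, spectral intensity values, second attribute) samples
    and retrain the detection model every `update_every` spectrum detections."""

    def __init__(self, model, update_every=5):
        self.model = model
        self.update_every = update_every
        self.features = []  # each entry: [volume, *spectral_intensity_values]
        self.labels = []    # each entry: the second attribute of that detection

    def record(self, volume, intensities, second_attribute):
        self.features.append([volume, *intensities])
        self.labels.append(second_attribute)
        if len(self.features) % self.update_every == 0:
            self.model.fit(self.features, self.labels)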
In the embodiment shown in fig. 5, only one spectrum detection is performed, and since data abnormality or operation abnormality may occur in the single spectrum detection process, the accuracy of the single spectrum detection result cannot be ensured. To address the above issues, in one embodiment, the accuracy of the final detection result may be improved by multiple spectral detections. The specific method comprises the following steps:
step one, sending the emission instruction to the light wave emission device for N times again, and obtaining the target object scattering spectrum received by the light wave receiving device after sending the emission instruction each time to obtain N to-be-detected spectrums, wherein N is a positive integer.
The process of sending the emission instruction to the light wave emitting device and the process of acquiring the target object scattering spectrum received by the light wave receiving device may refer to the descriptions in steps S502 and S503, and are not described herein again.
And step two, performing spectrum detection on the second attribute of the target object according to each spectrum to be detected to obtain N second detection results.
The process of performing spectrum detection on the second attribute of the target according to each spectrum to be detected may refer to the description in step S504, and is not described herein again.
Exemplarily, referring to fig. 9, a schematic flow chart of a spectrum detection method according to another embodiment of the present application is provided. As shown in fig. 9, after step S504, a counting step may be added, such as adding 1 to the counting value, and a judging step is added, i.e. judging whether the current counting value is equal to N. If the count value is less than N, re-executing S502-S504; if the current count value is equal to N, S901 continues, that is, the following step three is performed.
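The counting and judging steps of fig. 9 amount to a simple loop; a sketch follows, where run_single_detection is assumed to perform steps S502-S504 and return one detection result:

def repeated_detection(run_single_detection, n):
    """Repeat steps S502-S504 N times and collect the N second detection results."""
    second_results = []
    count = 0
    while count < n:                       # judge whether the count value equals N
        second_results.append(run_single_detection())
        count += 1                         # add 1 to the count value
    return second_results

# Invented stand-in for a single spectrum detection that always reports sugar.
print(repeated_detection(lambda: "contains sugar", 3))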
And step three, determining a second attribute of the target object according to the first detection result and the N second detection results.
Optionally, one implementation of step three is as follows: count the number of identical results among the first detection result and the N second detection results, and determine the result with the largest count as the second attribute of the target object. For example, assume that N is 10, the first detection result is that the target object contains sugar, 6 second detection results are that the target object contains sugar, and 4 second detection results are that the target object does not contain sugar; the total number of results indicating that the target object contains sugar is then 7 and the number of results indicating that it does not contain sugar is 4, so the second attribute of the target object is determined to be that the target object contains sugar.
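This majority count can be sketched as follows, reusing the numbers of the example above:

from collections import Counter

def decide_by_majority(first_result, second_results):
    """Keep the result that occurs most often among the first detection result
    and the N second detection results."""
    return Counter([first_result, *second_results]).most_common(1)[0][0]

second = ["contains sugar"] * 6 + ["does not contain sugar"] * 4
print(decide_by_majority("contains sugar", second))  # contains sugar (7 votes vs 4)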
When it is desired to detect whether or not a certain substance is contained in a target, the above embodiment may be used. However, if the content of a certain substance in the target object needs to be detected, the second attribute of the target object may not be determined by the above implementation manner since the content values may be different in each detection result.
Optionally, another implementation of step three is as follows: calculate the average of the first detection result and the N second detection results, and determine the average as the second attribute of the target object. For example, assume that N is 3, the first detection result is that the target object contains 0.3 mg of fat, and the 3 second detection results are 0.5 mg, 0.2 mg and 0.3 mg of fat respectively. The average of the 4 detection results is (0.3 + 0.5 + 0.2 + 0.3)/4 = 0.325, so the second attribute of the target object is determined to be that the fat content of the target object is 0.325 mg.
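The averaging can be sketched in the same way, using the numbers of the example above:

def decide_by_average(first_result, second_results):
    """Average the first detection result and the N second detection results
    when the second attribute is a content value rather than a yes/no answer."""
    results = [first_result, *second_results]
    return sum(results) / len(results)

print(round(decide_by_average(0.3, [0.5, 0.2, 0.3]), 3))  # 0.325 (mg of fat)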
Of course, there may be other implementation manners of step three, and the above is only an example and is not limited specifically.
Through the steps one to three, the target object scattering spectrum is obtained for multiple times, the second attribute of the target object is subjected to spectrum detection by using the target object scattering spectrum obtained each time, and finally the second attribute of the target object is determined according to the obtained multiple detection results, so that the abnormal condition in single spectrum detection can be effectively avoided, and the accuracy of the final detection result is effectively improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 10 shows a block diagram of a spectrum detection device provided in an embodiment of the present application, corresponding to the spectrum detection method described in the above embodiment, and only the part related to the embodiment of the present application is shown for convenience of description.
Referring to fig. 10, the apparatus includes:
the acquiring unit 101 is configured to acquire a first attribute of a target object, and determine a target optical wave band according to the first attribute.
A sending unit 102, configured to send an emission instruction to a light wave emitting device on the mobile terminal, where the emission instruction is used to instruct the light wave emitting device to emit a detection light wave, and the waveband of the detection light wave is within the target light wave band.
A receiving unit 103, configured to obtain a target scattering spectrum received by an optical wave receiving device on the mobile terminal, where the target scattering spectrum is a scattering spectrum obtained by reflecting the detection optical wave by the target.
And the detection unit 104 is configured to perform spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result.
Optionally, the first attribute of the object includes a type of the object.
Optionally, the obtaining unit is further configured to:
acquiring a first image of the target object, and identifying the type of the target object according to the first image of the target object;
and determining the target light wave band according to the type of the target object.
Optionally, the detecting unit 104 includes:
and the acquisition subunit is configured to acquire an environment scattering spectrum within the target light wave band, where the environment scattering spectrum is a scattering spectrum corresponding to the absence of the target object within the emission range of the light wave emission device.
And the denoising subunit is used for carrying out light wave denoising processing on the target object scattering spectrum according to the environment scattering spectrum to obtain the processed target object scattering spectrum.
And the detection subunit is used for performing spectrum detection on the second attribute of the target object according to the processed target object scattering spectrum to obtain a first detection result.
Optionally, the detecting subunit includes:
and the determining module is used for acquiring the light wave wavelength corresponding to the second attribute and determining the spectral intensity value corresponding to the light wave wavelength in the processed target object scattering spectrum, wherein the light wave wavelength is within the target light wave band.
And the detection module is used for performing spectrum detection on the second attribute of the target object according to the spectrum intensity value to obtain a first detection result.
Optionally, the detection module includes:
and the acquisition submodule is used for acquiring the volume information of the target object.
And the detection submodule is used for inputting the volume information and the spectral intensity value of the target object into a preset detection model and carrying out spectral detection on the second attribute of the target object to obtain a first detection result.
Optionally, the obtaining sub-module is further configured to:
acquiring a second image of the target object;
and determining the volume information of the target object according to the size relation between the marker in the image of the second image of the target object and the second image.
In an embodiment, the sending unit 102 is further configured to send the emission instruction to the light wave emitting device N times again after the second attribute of the target is subjected to spectrum detection according to the target scattering spectrum to obtain a first detection result;
the receiving unit 103 is further configured to obtain the target scattering spectrum received by the optical wave receiving device after sending the emission instruction each time, so as to obtain N to-be-detected spectra, where N is a positive integer.
The detecting unit 104 is further configured to perform spectrum detection on the second attribute of the target object according to each to-be-detected spectrum, so as to obtain N second detection results.
A determining unit 105, configured to determine a second attribute of the target object according to the first detection result and the N second detection results.
Optionally, the apparatus 10 further comprises:
and the updating unit is used for updating the detection model according to the volume information of the target object, the spectral intensity value and the second attribute of the target object after determining the second attribute of the target object according to the first detection result and the N second detection results.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
The spectrum detection device shown in fig. 10 may be a software unit, a hardware unit, or a unit combining software and hardware that is built into an existing terminal device; it may also be integrated into the terminal device as an independent add-on, or exist as an independent terminal device.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The spectrum detection method provided by the embodiment of the application can be applied to mobile terminals such as mobile phones, mobile computers or cameras, and the specific types of the mobile terminals are not limited at all.
The mobile terminal provided by the embodiment of the application comprises: one or more processors, memories, display screens, lightwave emitting devices, and lightwave receiving devices; the memory, the display screen, the lightwave transmitting means, and the lightwave receiving means are respectively coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the one or more processors, cause the mobile terminal to perform the steps in any of the various spectral detection method embodiments described above.
Taking a mobile terminal as a mobile phone as an example, fig. 11 is a block diagram of part of the structure of a mobile phone provided in an embodiment of the present application. As shown in fig. 11, the mobile phone includes: a camera device 1101, a radio frequency (RF) circuit 1102, a memory 1103, an input unit 1104, a display screen 1105, a sensor 1106, an audio circuit 1107, a wireless fidelity (WiFi) module 1108, a processor 1109, a power supply 1110, and the like. Those skilled in the art will appreciate that the structure shown in fig. 11 does not limit the mobile phone or mobile terminal, which may include more or fewer components than those shown, combine some components, or arrange the components differently.
The following describes each component of the mobile phone in detail with reference to fig. 11:
the camera 1101 may be used to acquire the first image and the second image of the target object in the embodiment of the present application, and may be used to acquire other images. The camera 1101 may include a series of components for shooting, for example, a flash, a filter, a transmission, at least one camera, and the like.
The RF circuit 1102 may be configured to receive and transmit signals during information transmission and reception or during a call; in particular, it receives downlink information from a base station and forwards it to the processor 1109 for processing, and transmits uplink data to the base station. It may also exchange data with an application server. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1102 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 1103 may be used to store software programs and modules, and the processor 1109 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1103. The memory 1103 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a spectrum detection application), and the like, and the data storage area may store data created according to the use of the mobile phone (such as the first image and second image of the target object, the spectrum detection result, and the like). Further, the memory 1103 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1104 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 1104 may include a touch panel 1141, which can be used to receive the first attribute of the target object entered by the user, and other input devices 1142. The touch panel 1141, also referred to as a touch screen, can collect touch operations performed by the user on or near it (for example, operations performed with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 1141 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends them to the processor 1109, and can receive and execute commands sent by the processor 1109. The touch panel 1141 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1141, the input unit 1104 may include other input devices 1142, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and an on/off key), and a microphone for collecting the user's voice information.
The display screen 1105 may be used to display information entered by or provided to the user as well as various menus of the handset. The display screen 1105 may include a display panel 1151, and optionally, the display panel 1151 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1141 may cover the display panel 1151, and when the touch panel 1141 detects a touch operation thereon or nearby (e.g., a user inputs a first attribute of an object on the touch screen), the touch operation is transmitted to the processor 1109 to determine the type of the touch event, and then the processor 1109 provides a corresponding visual output on the display panel 1151 according to the type of the touch event. Although in fig. 11, the touch panel 1141 and the display panel 1151 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1141 and the display panel 1151 may be integrated to implement the input and output functions of the mobile phone.
The handset may also include at least one sensor 1106, such as a light sensor, a distance sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 1151 according to the brightness of ambient light, and may also adjust the brightness of a flash in the photographing device 1101 according to the brightness of ambient light. The distance sensor can be used for measuring the distance between the mobile phone and the target object. As for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuitry 1107, speaker 1171, and microphone 1172 may provide an audio interface between the user and the handset. The audio circuit 1107 can transmit the electrical signal converted from the received audio data to the speaker 1171, and the electrical signal is converted into a sound signal by the speaker 1171 and output; on the other hand, the microphone 1172 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 1107, and outputs the audio data to the processor 1109 for processing, or outputs the audio data to the memory 1103 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send emails, browse webpages, access streaming media and the like through the WiFi module 1108, and provides wireless broadband Internet access for the user. Although fig. 11 shows the WiFi module 1108, it is understood that it does not belong to the essential constitution of the handset, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 1109 is the control center of the mobile phone; it connects the various parts of the entire mobile phone through various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 1103 and calling the data stored in the memory 1103, thereby monitoring the mobile phone as a whole. Optionally, the processor 1109 may include one or more processing units; preferably, the processor 1109 may integrate an application processor, which mainly handles the operating system, user interface, application programs and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 1109.
The handset also includes a power supply 1110 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 1109 via a power management system, so that the power management system may perform functions such as managing charging, discharging, and power consumption.
Alternatively, the mobile phone may include a Global Positioning System (GPS).
In addition, although not shown, the mobile phone may further include a bluetooth module, etc., which will not be described herein.
Fig. 12 is a schematic diagram of a software structure of a mobile phone according to an embodiment of the present application. Taking a mobile phone operating system as an Android system as an example, in some embodiments, the Android system is divided into four layers, which are an application layer, an application Framework (FWK) layer, a system layer and a hardware abstraction layer, and the layers communicate with each other through a software interface.
As shown in fig. 12, the application layer may be a series of application packages, which may include applications such as spectrum detection applications, short messages, calendars, cameras, navigation, gallery, call, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer may include some predefined functions, such as functions for receiving events sent by the application framework layer.
As shown in fig. 12, the application framework layer may include a window manager, a resource manager, and a notification manager, among others.
The window manager is used for managing window programs and can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like. The content provider is used to store and retrieve data and make it accessible to applications; the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar and can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example to notify that a download is complete or to give a message alert. The notification manager may also display notifications in the form of a chart or scroll-bar text in the top status bar of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window, for example prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
The application framework layer may further include:
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and can be used to build applications. A display interface may be composed of one or more views; for example, a display interface including a short message notification icon may include a view for displaying text and a view for displaying a picture.
The telephone manager is used to provide the communication functions of the mobile phone, such as management of the call state (including connected, hung up, and the like).
The system layer may include a plurality of functional modules. For example: a sensor service module, a physical state identification module, a three-dimensional graphics processing library (such as OpenGL ES), and the like.
The sensor service module is used for monitoring sensor data uploaded by various sensors in a hardware layer and determining the physical state of the mobile phone;
the physical state recognition module is used for analyzing and recognizing user gestures, human faces and the like;
the three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The system layer may further include:
the surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The hardware abstraction layer is a layer between hardware and software. The hardware abstraction layer may include a display driver, a camera driver, a sensor driver, etc. for driving the relevant hardware of the hardware layer, such as a display screen, a camera, a sensor, etc.
The above embodiments of the spectrum detection method can be implemented on a mobile phone with the above hardware structure/software structure.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program may implement the steps in the foregoing various embodiments of the spectrum detection method.
The embodiment of the present application provides a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the foregoing spectral detection method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), random-access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
The embodiment of the present application further provides a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the steps in the foregoing embodiments of the spectrum detection method. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method of spectral detection, comprising:
acquiring a first attribute of a target object, and determining a target light wave band according to the first attribute;
sending an emission instruction to a light wave emission device on the mobile terminal, wherein the emission instruction is used for instructing the light wave emission device to emit detection light waves, and the wave bands of the detection light waves are within the target light wave band;
acquiring a target object scattering spectrum received by a light wave receiving device on a mobile terminal, wherein the target object scattering spectrum is a scattering spectrum obtained by reflecting the detection light wave by the target object;
and performing spectrum detection on the second attribute of the target object according to the target object scattering spectrum to obtain a first detection result.
2. The method for spectral detection of claim 1, wherein the first property of the target comprises a species of the target;
the acquiring a first attribute of a target object and determining a target optical wave band according to the first attribute comprises the following steps:
acquiring a first image of the target object, and identifying the type of the target object according to the first image of the target object;
and determining the target light wave band according to the type of the target object.
3. The method according to claim 1 or 2, wherein the spectrally detecting the second property of the target according to the target scattering spectrum to obtain the first detection result comprises:
acquiring an environment scattering spectrum in the target light wave band, wherein the environment scattering spectrum is a scattering spectrum corresponding to the situation that no target object exists in the emission range of the light wave emission device;
carrying out light wave denoising processing on the target object scattering spectrum according to the environment scattering spectrum to obtain a processed target object scattering spectrum;
and performing spectrum detection on the second attribute of the target object according to the processed scattering spectrum of the target object to obtain a first detection result.
4. The method according to claim 3, wherein the performing spectral detection on the second property of the target object according to the processed scattering spectrum of the target object to obtain a first detection result comprises:
acquiring the light wave wavelength corresponding to the second attribute, and determining a spectrum intensity value corresponding to the light wave wavelength in the processed target object scattering spectrum, wherein the light wave wavelength is within the target light wave band;
and carrying out spectrum detection on the second attribute of the target object according to the spectrum intensity value to obtain a first detection result.
5. The method according to claim 4, wherein the performing spectral detection on the second attribute of the target object according to the spectral intensity value to obtain a first detection result comprises:
acquiring volume information of the target object;
inputting the volume information and the spectral intensity value of the target object into a preset detection model, and performing spectral detection on the second attribute of the target object to obtain a first detection result.
6. The method for spectrum detection according to claim 5, wherein said obtaining volume information of the target comprises:
acquiring a second image of the target object;
and determining the volume information of the target object according to the size relation between the marker in the image of the second image of the target object and the second image.
7. The method according to claim 5 or 6, wherein after said spectrally detecting the second property of the target according to the target scattering spectrum to obtain a first detection result, the method further comprises:
sending the emission instruction to the light wave emission device for N times again, and obtaining the scattering spectrum of the target object received by the light wave receiving device after sending the emission instruction each time to obtain N spectrums to be detected, wherein N is a positive integer;
performing spectrum detection on the second attribute of the target object according to each spectrum to be detected to obtain N second detection results;
and determining a second attribute of the target object according to the first detection result and the N second detection results.
8. The method for spectral detection according to claim 7, wherein after determining a second property of said target from said first detection result and N of said second detection results, said method further comprises:
and updating the detection model according to the volume information of the target object, the spectral intensity value and the second attribute of the target object.
9. A mobile terminal, comprising: one or more processors, memories, display screens, lightwave emitting devices, and lightwave receiving devices;
the memory, the display screen, the lightwave transmitting means, and the lightwave receiving means are respectively coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions;
the computer instructions, when executed by the one or more processors, cause the mobile terminal to perform the spectral detection method of any of claims 1-8.
10. A chip system, comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the spectral detection method of any of claims 1 to 8.
CN202010364316.4A 2020-04-30 2020-04-30 Spectrum detection method, mobile terminal and chip system Pending CN111665201A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010364316.4A CN111665201A (en) 2020-04-30 2020-04-30 Spectrum detection method, mobile terminal and chip system
PCT/CN2021/082076 WO2021218489A1 (en) 2020-04-30 2021-03-22 Spectrum detection method, mobile terminal and chip system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010364316.4A CN111665201A (en) 2020-04-30 2020-04-30 Spectrum detection method, mobile terminal and chip system

Publications (1)

Publication Number Publication Date
CN111665201A true CN111665201A (en) 2020-09-15

Family

ID=72383083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010364316.4A Pending CN111665201A (en) 2020-04-30 2020-04-30 Spectrum detection method, mobile terminal and chip system

Country Status (2)

Country Link
CN (1) CN111665201A (en)
WO (1) WO2021218489A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9788730B2 (en) * 2012-03-08 2017-10-17 Dermasensor, Inc. Optical process and apparatus for non-invasive detection of melanoma
CN103281459A (en) * 2013-06-06 2013-09-04 仝晓萌 Mobile phone capable of measuring sweetness and PH value of fruit
CN106501190A (en) * 2016-12-23 2017-03-15 成都光数云联科技有限公司 A kind of mobile terminal and the analysis system based on mobile terminal
CN109655414B (en) * 2018-11-27 2021-11-02 Oppo广东移动通信有限公司 Electronic equipment, information pushing method and related product
CN111077088B (en) * 2019-12-16 2022-07-26 王娟 Smart phone imaging spectrometer and spectrum identification method thereof
CN111665201A (en) * 2020-04-30 2020-09-15 华为技术有限公司 Spectrum detection method, mobile terminal and chip system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106596412A (en) * 2016-12-31 2017-04-26 上海复展智能科技股份有限公司 Method for monitoring plant growth by using unmanned aerial vehicle with multispectral light source
CN107576396A (en) * 2017-08-30 2018-01-12 中国科学院上海技术物理研究所 The program control multispectral imaging method and device of mobile phone

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
FU JUNDONG ET AL.: "Real-Scene Image Stitching and Roaming Control Technology", 30 June 2011, Southwest Jiaotong University Press *
LIU YU ET AL.: "Array Camera Imaging Technology and Applications", 30 April 2018, National University of Defense Technology Press *
LI ZHIGANG: "Spectral Data Processing and Quantitative Analysis Technology", 31 July 2017, Beijing University of Posts and Telecommunications Press *
WANG CHENG: "Biomedical Optics", 28 February 2017, Southeast University Press *
SHAO JUN: "Compilation of Basic Knowledge for Environmental and Chemical Experiments", 31 October 2016, Jilin Education Press *
GU LIQIN: "Food Category Detection Method Based on Image Recognition Technology", Food Research and Development *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021218489A1 (en) * 2020-04-30 2021-11-04 华为技术有限公司 Spectrum detection method, mobile terminal and chip system
WO2022080284A1 (en) * 2020-10-13 2022-04-21 国立研究開発法人理化学研究所 Mobile apparatus, food characteristics determination device, food characteristics determination method, and program
CN113447450A (en) * 2021-05-11 2021-09-28 中国农业科学院农产品加工研究所 Optical nondestructive testing device for determining nutrient content of dishes based on spectrum

Also Published As

Publication number Publication date
WO2021218489A1 (en) 2021-11-04

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20200915