CN118285113A - Image processing device, terminal and monitoring method


Info

Publication number
CN118285113A
Authority
CN
China
Prior art keywords
information
color
region
terminal
image
Prior art date
Legal status
Pending
Application number
CN202280077159.6A
Other languages
Chinese (zh)
Inventor
柏木美悠贵
平井骏
Current Assignee
Corporate Club
Original Assignee
Corporate Club
Priority date
Filing date
Publication date
Application filed by Corporate Club
Publication of CN118285113A


Abstract

The invention aims to output information based on color information. An image processing device of the present invention includes: a calculating unit that calculates relative information relating to a difference between first color information in a first region included in a captured image and second color information in a second region included in the captured image; and an output unit that outputs the relative information in time series for a plurality of captured images captured at different timings.

Description

Image processing device, terminal and monitoring method
Technical Field
The present disclosure relates to an image processing apparatus, a terminal, and a monitoring method.
Background
For example, image processing apparatuses have been proposed that analyze a person's skin state, skin age, and the like based on color information of an image in which the subject is a person. For example, patent document 1 discloses an image processing apparatus that detects a face region and, depending on whether or not the face region falls within a target region of the imaging region, instructs the photographer to correct the position or orientation of the imaging unit.
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 2008-118276
Disclosure of Invention
According to an embodiment of the present disclosure, there is provided an image processing apparatus including: a calculating unit that calculates relative information relating to a difference between first color information in a first region included in a captured image and second color information in a second region included in the captured image; and an output unit that outputs the relative information in time series for a plurality of captured images captured at different timings.
Drawings
Fig. 1 is a diagram showing an example of a functional configuration of an image processing apparatus.
Fig. 2 is a diagram showing an example of a time-series change in color difference.
Fig. 3 is a flowchart showing an example of the flow of image processing (color information processing).
Fig. 4 is a diagram showing an example of the functional configuration of the terminal according to the embodiment.
Fig. 5 is a flowchart showing an example of the flow of processing performed by the terminal according to the embodiment.
Fig. 6 is a diagram showing another example of the functional configuration of the terminal according to the embodiment.
Fig. 7 is a diagram showing an example of the functional configuration of the server according to the embodiment.
Fig. 8 is a flowchart showing an example of the flow of processing performed by the terminal and the server according to the embodiment.
Fig. 9 is a diagram showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 10 is a diagram showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 11 is a view showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 12 is a view showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 13 is a view showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 14 is a view showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 15 is a view showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 16 is a diagram showing an example of a display screen displayed on the display unit of the terminal according to the embodiment.
Fig. 17 is a diagram showing an example of a recording medium.
Fig. 18 is a diagram showing time-series data representing fatigue level displayed on the terminal.
Fig. 19 is a diagram showing a configuration example of a terminal.
Fig. 20 is a diagram showing a configuration example of another terminal.
Detailed Description
An example of an embodiment for implementing the present disclosure will be described below with reference to the drawings. In the description of the drawings, the same members may be denoted by the same reference numerals, and overlapping description thereof may be omitted. However, the structural members described in the embodiments below are merely examples, and the scope of the present disclosure is not limited to them.
Embodiment
An example of an embodiment for realizing the image processing technology of the present disclosure will be described below.
Fig. 1 is a block diagram showing an example of a functional configuration of an image processing apparatus 1 according to one embodiment of the present invention.
The image processing apparatus 1 may include, for example, a first region detection unit 11, a first color information calculation unit 12, a second region detection unit 13, a second color information calculation unit 14, a color relative information calculation unit 15, and an output unit 16. These functional units (functional blocks) are included in a processing unit or a control unit (not shown) of the image processing apparatus 1, and may be configured by a processor such as a central processing unit (CPU) or a digital signal processor (DSP), or by an integrated circuit such as an application specific integrated circuit (ASIC).
The first region detection unit 11 has a function of detecting a first region in an imaging image based on the imaging image obtained by imaging by an imaging unit not shown, for example.
The captured image may be a captured image obtained by capturing an image with an arbitrary camera.
As will be described in detail below by way of examples, the captured image may be an image obtained by capturing an image with a front camera, for example.
The front camera may be a camera that allows the photographer to check himself or herself on a display unit (not shown) while photographing; specifically, it may be, for example, a camera (built-in camera) provided on the front side of the housing of the apparatus (the side on which the display unit is located).
The captured image may be obtained by capturing an image with a rear camera.
The rear camera may be a camera that does not allow the photographer to check himself or herself on the display unit (not shown) while photographing; specifically, it may be, for example, a camera (rear camera, external camera) provided on the back side of the housing of the apparatus (the side opposite to the side on which the display unit is formed).
The first color information calculating unit 12 has, for example, the following function: based on the image information in the first region detected by the first region detecting section 11, it calculates color information expressed in hue (Hue), saturation (Saturation), and lightness (Lightness) as defined by the HSL color space.
The second region detection unit 13 has a function of detecting a second region in the captured image based on the captured image, for example.
The second color information calculating unit 14 has, for example, the following functions: the color information defined by the HSL color space is calculated, for example, based on the image information in the second region detected by the second region detecting section 13.
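As a rough sketch of how such color information might be obtained from a detected region, the following example averages the RGB values of the pixels in the region and converts the mean to the HSL components described above. The function name, the use of a boolean pixel mask, and the value ranges are assumptions made for illustration, not part of the disclosed apparatus.

```python
import colorsys

import numpy as np


def region_color_info(image_rgb: np.ndarray, region_mask: np.ndarray) -> np.ndarray:
    """Return (hue, saturation, lightness) of the mean color of the masked region.

    image_rgb:   H x W x 3 array with values in [0, 255]
    region_mask: H x W boolean array marking the detected region (e.g. cheek)
    """
    pixels = image_rgb[region_mask].astype(np.float64) / 255.0  # N x 3 values in [0, 1]
    r, g, b = pixels.mean(axis=0)                               # mean RGB of the region
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # colorsys returns hue, lightness, saturation
    return np.array([h, s, l])              # reordered to hue, saturation, lightness
```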
The first region detection unit 11 may detect the first region from the captured image by, for example, setting in advance a region defined by a user or a program as the first region. The same applies to the second region detection unit 13.
Specific examples of the first region or the second region are described below.
The color relative information calculating unit 15 has, for example, the following functions: color relative information, which is information relating to the relative relationship, is calculated based on the first color information calculated by the first color information calculating unit 12 and the second color information calculated by the second color information calculating unit 14.
In the color relative information, a distance of a color based on the first color information and the second color information, that is, a color difference may be included as an example. The color difference is a scalar value, and can be calculated as the euclidean distance based on the first color information and the second color information, for example.
Further, the color relative information may include a color relative vector expressed in terms of a vector, in addition to or instead of the color difference. The color relative vector may be calculated, for example, as a difference vector between the first color information in three dimensions (e.g., components of the HSL color space) and the second color information in three dimensions.
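The following is a minimal sketch of both quantities, assuming the first and second color information are each given as three-component vectors in the same color space (e.g. HSL); the function name and example values are illustrative assumptions.

```python
import numpy as np


def color_relative_info(first_color: np.ndarray, second_color: np.ndarray):
    """Return (color difference, color relative vector) for two 3-component colors."""
    relative_vector = first_color - second_color                # difference vector
    color_difference = float(np.linalg.norm(relative_vector))   # Euclidean distance
    return color_difference, relative_vector


# Illustrative values only: e.g. cheek color vs. inner-upper-arm color in HSL components.
diff, vec = color_relative_info(np.array([0.05, 0.45, 0.62]),
                                np.array([0.06, 0.40, 0.68]))
```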
The output unit 16 has a function of outputting the color relative information calculated by the color relative information calculating unit 15.
Although not shown, a memory that stores the color relative information calculated by the color relative information calculating unit 15 may be provided inside or outside the image processing apparatus, and may store the color relative information in time series, for example. The output unit 16 may output the relative information stored in the memory in time series for a plurality of captured images obtained by capturing images at different timings, for example.
The first region detecting unit 11 and the second region detecting unit 13 may be configured as a single region detecting unit having the same function (a function of outputting an image obtained by detecting (extracting) a specific region in an image when the image is input).
The first color information calculating unit 12 and the second color information calculating unit 14 may be configured as a single color information calculating unit having the same function (a function of outputting color information when an image is input).
< Principle >
The first region and the second region may be, for example, two different regions included in the captured image.
Although this is merely an example, the first region and the second region may be, for example, different regions of the same subject included in the captured image (different regions included in the same subject).
The situation or environment in which imaging is performed may differ each time, and therefore, even when the same subject is imaged, it is difficult to determine the color of the subject from the captured image due to the influence of light and the like. That is, unless an object whose color is known on the image processing apparatus side exists in the captured image, there is no color that can serve as a comparison object (reference), and so it is difficult to determine the color of the subject from the captured image. Including a color chart or the like in the image would suffice, but that is troublesome for the user.
Therefore, in the present embodiment, the color of the subject is not specified, but relative information (color relative information) relating to the relative relationship between the first color information in the first region and the second color information in the second region is calculated for each captured image for the same subject included in the captured image. Then, according to the calculated relative information, the degree to which the color of the first region deviates from the color of the second region can be analyzed.
In the same subject, it can be assumed that the influence of light caused by the imaging environment or the like is also the same or similar. Therefore, the color relative information is considered to be information that is not affected by the imaging environment or the like, or is not easily affected.
Fig. 2 is a graph showing time-series changes in color difference in the case of using color difference as the color relative information outputted from the output unit 16, and illustrates the color difference with the horizontal axis as the time axis (t) and the vertical axis as the color difference. Further, the color difference is illustrated here as a graph of a continuous curve.
The closer the color difference is to zero means that the color of the first region is closer to the color of the second region. Conversely, the farther the color difference is from zero, the more the color of the first region deviates from the color of the second region. The use is briefly described below and is described in detail in the examples below.
The subject may be, for example, an animal including a human (here, humans are included among animals).
Here, a case in which a human is a subject is exemplified. In this case, the first region and the second region may be set to different regions in the same human being included in the captured image, for example.
In that case, the first region may be set to, for example, either or both of a face region and a neck region. The second region may be set to, for example, a region on the inner side of the arm. Here, the arm refers to the portion from the shoulder to the wrist.
The face region may be, for example, a cheek region. The region on the inner side of the arm may be, for example, either or both of a region on the inner side of the upper arm and a region on the inner side of the wrist.
Further, any combination of these first regions and second regions may be used.
In the present specification, following the medical definition, the part of the arm closer to the shoulder is referred to as the "upper arm", and the part closer to the hand is referred to as the "forearm".
The reason why the region on the inner side of the arm is exemplified is that the inner side of the arm is considered less prone to tanning (its skin tone is less prone to change). However, even on the inner side of the arm, the inner side of the forearm may become tanned. Therefore, the inner side of the upper arm can be used as the inner side of the arm. Further, the area around the wrist is also considered less susceptible to tanning, so the inner side of the wrist may be used even though it is part of the forearm.
The inside of the upper arm or the inside of the wrist is considered particularly less prone to tanning and is prone to maintaining skin tone. Therefore, in the following description, the color of the inner side of the upper arm or the inner side of the wrist is set to be an ideal skin color (skin color before being affected by disturbance such as sunlight), and how much the color of the face (cheek) or the neck, which is easily affected by sunlight or the like, deviates from the color of the inner side of the upper arm or the inner side of the wrist can be analyzed by the color relative information.
In recent trends in cosmetic products, for example, not only the face (cheek) region but also the neck region of a human may become a target region for skin care such as anti-aging.
In addition, the face region may be set to a region other than the cheek.
The subject is not limited to humans, and may be an animal other than humans.
The subject is not limited to an animal, and may be an object.
Further, the first region and the second region are not limited to different regions of the same subject included in the captured image; any two different regions included in the captured image may be used, the color relative information may be calculated for them, and how far their colors deviate may be analyzed. For example, a partial region of one subject may be set as the first region, a partial region of another subject may be set as the second region, and how far their colors deviate may be analyzed.
Sequence of image processing
Fig. 3 is a flowchart showing an example of the sequence of image processing in the present embodiment.
The processing shown in the flowchart of fig. 3 can be realized, for example, by a processing unit (not shown) of the image processing apparatus 1 reading the program code of an image processing program stored in a storage unit (not shown) into a random access memory (RAM) or the like (not shown) and executing it.
Although the storage unit is shown in fig. 1 as being excluded from the structural members of the image processing apparatus 1, the storage unit may be included in the structural members of the image processing apparatus 1.
The flowcharts described below merely show an example of the procedure of image processing in the present embodiment, and other steps may be added or some steps may be deleted.
First, the image processing apparatus 1 determines whether or not a captured image is input (A1). When it is determined that the input is made (A1: yes), the first region detection unit 11 performs a process of detecting the first region from the input captured image (A3). In this process, the first region detecting unit 11 performs, for example, a region extraction process, and detects the first region.
In the region extraction process, the first region may be detected, for example, using a method of keypoint detection (Key Point Detection). Alternatively, semantic segmentation may be performed on the captured image using a deep learning method such as a fully convolutional network (FCN), SegNet, or U-Net. Then, the positions of the pixels classified as the first region and the color information of those pixels are taken as the extraction result.
For example, if the first region is a cheek region, the position of the pixel classified as the cheek region and the color information of the pixel are used as the extraction result.
Alternatively, for example, a histogram of oriented gradients (HOG) feature may be extracted from the captured image, and the first region may be extracted using a classifier such as a support vector machine (SVM).
In the case where the first region is skin, the certainty of the skin tone of the pixel may be evaluated using the color histogram, and the first region may be extracted from the captured image. Skin tone certainty can also be evaluated for the extraction results obtained by the SVM.
Region extraction may also be performed by combining these results with the results of deep learning.
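As an illustration only, the sketch below assumes that some segmentation model (e.g. an FCN or U-Net as mentioned above) has already produced a per-pixel label map; the label values and function names are hypothetical. The extraction result is then the positions and colors of the pixels classified into the target region.

```python
import numpy as np

# Hypothetical label values produced by a segmentation model (illustrative only).
CHEEK_LABEL = 1
UPPER_ARM_INNER_LABEL = 2


def extract_region(image_rgb: np.ndarray, label_map: np.ndarray, target_label: int):
    """Return the mask, pixel positions, and pixel colors classified as target_label.

    image_rgb: H x W x 3 captured image
    label_map: H x W per-pixel class labels from a segmentation model (e.g. FCN / U-Net)
    """
    mask = label_map == target_label
    coords = np.argwhere(mask)   # (row, col) positions of the region's pixels
    colors = image_rgb[mask]     # N x 3 color values of those pixels
    return mask, coords, colors
```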
Similarly, the second region detection unit 13 detects a second region from the captured image (A5). This can also be achieved by, for example, the same region extraction process as described above. For example, if the second region is set as the region on the inner side of the upper arm or the region on the inner side of the wrist, the position of the pixel and the color information of the pixel classified into these regions are used as the extraction result by the key point detection (Key Point Detection) or the semantic segmentation method.
Further, the image processing apparatus 1 may perform the extraction of the first region and the extraction of the second region at one time based on, for example, the result of the semantic segmentation.
Then, the image processing apparatus 1 determines whether or not the detection of the first region and the second region is successful (A7). The determination may be achieved, for example, by: it is determined whether the first region occupies a set proportion of the captured image based on the detection result (extraction result) of A3, and it is determined whether the second region occupies a set proportion of the captured image based on the detection result (extraction result) of A5.
Further, the determination of A7 may be performed by determining whether or not the first region and the second region occupy a set proportion of the captured image.
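One possible form of the A7 check, assuming the detected regions are available as boolean masks and that the required proportions are configurable thresholds (both assumptions for illustration):

```python
import numpy as np


def detection_succeeded(first_mask: np.ndarray, second_mask: np.ndarray,
                        min_first_ratio: float = 0.01,
                        min_second_ratio: float = 0.01) -> bool:
    """A7: each detected region must occupy at least a set proportion of the image."""
    total = first_mask.size  # number of pixels in the captured image
    return bool(first_mask.sum() / total >= min_first_ratio
                and second_mask.sum() / total >= min_second_ratio)
```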
If it is determined that at least one of the detections has failed (no in A7), the image processing apparatus 1 returns the process to A1.
In this case, the image processing apparatus 1 may return the process to A1 after performing some kind of error processing. Since there is also a possibility that there is a problem in imaging, information or the like that reminds the user of the situation may be output.
When the detection of the first region and the second region is successful (yes in A7), the first color information calculating unit 12 calculates first color information based on the detection result of A3 (A9). For example, an average value of color information of each pixel is calculated for each image block (small region of image) of a set size. Then, based on the average value, color information (first color information) of each image block in a predetermined color space is calculated, for example.
Similarly, the second color information calculating unit 14 calculates second color information based on the detection result of A5 (A11). The method may be, for example, the same as in A9.
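A sketch of the per-block averaging described for A9 and A11, under the assumption that the region is given as a boolean mask and that the block size is a configurable parameter; converting each block mean to the predetermined color space (e.g. HSL) would then follow as in the earlier sketch.

```python
import numpy as np


def block_mean_colors(image_rgb: np.ndarray, region_mask: np.ndarray,
                      block: int = 16) -> np.ndarray:
    """A9/A11: average the color of the region's pixels within each block of a set size."""
    height, width, _ = image_rgb.shape
    means = []
    for y in range(0, height, block):
        for x in range(0, width, block):
            patch_mask = region_mask[y:y + block, x:x + block]
            if patch_mask.any():
                patch = image_rgb[y:y + block, x:x + block][patch_mask]
                means.append(patch.mean(axis=0))  # mean color of region pixels in block
    return np.array(means)  # one mean color per block, converted to the color space afterwards
```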
Then, the color relative information calculating unit 15 calculates color relative information based on the first color information calculated in A9 and the second color information calculated in A11 (A13).
For example, when the color difference is calculated as the color relative information, the color relative information calculating unit 15 calculates the Euclidean distance between the first color information calculated in A9 and the second color information calculated in A11 as the color difference.
For example, when the color relative vector is calculated as the color relative information, the color relative information calculating unit 15 calculates the difference vector (three-dimensional) between the three-dimensional color information calculated in A9 and the three-dimensional color information calculated in A11.
Then, the image processing apparatus 1 determines whether or not the output condition of the color relative information is satisfied (A15). If it is determined that the output condition is not satisfied (A15: no), the image processing apparatus 1 returns the process to A1. On the other hand, if it is determined that the output condition is satisfied (A15: yes), the image processing apparatus 1 ends the process.
The output conditions may be predetermined, and may be various conditions such as, for example, outputting for each captured image, calculating the color relative information for a set number of captured images, or calculating the color relative information for captured images of a set period.
In addition, when calculating the color relative information for a set number of captured images, the image processing apparatus 1 may output, for example, the average value or the median value of the color relative information calculated for each captured image. The same applies to the case of calculating the color relative information for captured images of a set period. That is, the image processing apparatus 1 can output one piece of color relative information from a plurality of captured images, and by doing so, it is expected that the influence of the imaging environment can be further reduced.
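A minimal sketch of combining the color relative information calculated for several captured images into a single output value; whether the average or the median is used is treated here as a configurable assumption.

```python
import numpy as np


def aggregate_color_relative_info(per_image_values, use_median: bool = False) -> float:
    """Combine per-image color differences into one value to dampen environmental effects."""
    values = np.asarray(per_image_values, dtype=float)
    return float(np.median(values) if use_median else np.mean(values))
```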
In this process, step A5 is performed after step A3, but the order may be reversed. The same applies to steps A9 and A11.
Examples
(1) Embodiments of the terminal
An embodiment of a terminal to which the image processing apparatus 1 is applied, or which includes the image processing apparatus 1, will be described. The terminal may be a terminal device owned by a user, such as a mobile phone including a smartphone, a camera, a personal digital assistant (PDA), a personal computer, a navigation device, a wristwatch, or any of various tablet terminals.
Here, as an example, an embodiment of a smartphone, which is one type of mobile phone having a camera function (an imaging function), will be described. For purposes of illustration, it is referred to as terminal 100.
The embodiments to which the present disclosure can be applied are not limited to the embodiments described below.
Functional structure
Fig. 4 is a diagram showing an example of a functional configuration of a terminal 100A as an example of the terminal 100 of the smart phone in the present embodiment.
The terminal 100 includes, for example, a processing unit 110, an operation unit 120, a touch panel 125, a display unit 130, a sound output unit 140, an imaging unit 150, an environmental information detection unit 160, a timepiece unit 170, a communication unit 180, and a storage unit 190.
The processing unit 110 is a processing device that controls the respective units of the terminal 100 or performs various processes related to image processing in accordance with various programs such as a system program stored in the storage unit 190, and is configured to include a processor such as a CPU or DSP, or an integrated circuit such as an ASIC.
The processing unit 110 includes, for example, a first region detection unit 111, a second region detection unit 112, a first color information calculation unit 113, a second color information calculation unit 114, a color difference calculation unit 115, and a display control unit 116. The first region detection unit 111 to the second color information calculation unit 114 correspond to the first region detection unit 11 to the second color information calculation unit 14 described above.
The color difference calculating unit 115 is one of the above-described color relative information calculating units 15, and calculates the color difference between the first color information and the second color information.
The display control unit 116 controls the display unit 130 to display the information of the color difference calculated by the color difference calculation unit 115 and output in time series.
For example, the processing unit 110 may transmit color relative information (color difference in this example) to each functional unit as an output for performing various controls (display control, audio output control, communication control, etc.), and the output unit 16 of the image processing apparatus 1 of fig. 1 may be the processing unit 110 including the display control unit 116.
For example, the output unit 16 of the image processing apparatus 1 of fig. 1 may be a functional unit such as the display unit 130, the audio output unit 140, or the communication unit 180.
The operation unit 120 includes an input device such as an operation button or an operation switch for a user to input various operations to the terminal 100. The operation unit 120 includes a touch panel 125 integrally formed with the display unit 130, and the touch panel 125 functions as an input interface between the user and the terminal 100. An operation signal according to the operation of the user is output from the operation unit 120 to the processing unit.
The display unit 130 is a display device including a liquid crystal display (LCD) or the like, and performs various displays based on a display signal output from the display control unit 116. In the present embodiment, the display unit 130 is formed integrally with the touch panel 125 to form a touch screen.
The audio output unit 140 is an audio output device including a speaker or the like, and performs various audio outputs based on the audio output signal output from the processing unit 110.
The image pickup unit 150 is an image pickup apparatus configured to be capable of capturing an image of an arbitrary scene, and includes an image pickup device (semiconductor device) such as a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor. The image pickup unit 150 forms an image of light from the imaging object on the light receiving plane of the image pickup device through a lens (not shown), and converts the brightness of the image light into an electric signal through photoelectric conversion. The converted electric signal is converted into a digital signal by an analog-to-digital (A/D) converter (not shown) and output to the processing unit 110. The image pickup unit 150 is disposed, for example, on the side of the terminal 100 on which the touch panel 125 is present, and may also be referred to as a front camera.
An image pickup unit (rear camera) may also be disposed on the back surface of the terminal 100, on which the touch panel 125 is not present, and may include a flash that can be used as a light source during image pickup. The flash may be one whose color temperature during light emission is known, or one whose color temperature can be adjusted by the processing unit 110.
Further, two imaging units of the front camera and the rear camera may be included.
Further, when these image capturing sections capture images, the display control section 116 may cause the display section 130 to display live view images.
The environment information detecting section 160 detects information (hereinafter referred to as "environment information") related to the environment of the own device. The environmental information may include, for example, at least any one of temperature and humidity.
The timepiece unit 170 is a built-in timepiece of the terminal 100, and outputs time information. The timepiece unit 170 includes, for example, a clock using a crystal oscillator.
The timepiece unit 170 may include a clock to which the Network Identity and Time Zone (NITZ) standard or the like is applied.
The communication unit 180 is a communication device for transmitting and receiving information used in the own device to and from an external information processing device. As the communication method of the communication unit 180, various methods can be applied, such as wired connection via a cable conforming to a predetermined communication standard, connection via an intermediate device (called a cradle) that also serves as a charger, and wireless connection by wireless communication.
The storage unit 190 is a storage device including a volatile or nonvolatile memory such as a read only memory (ROM), an electrically erasable programmable ROM (EEPROM), a flash memory, or a RAM, or a hard disk device.
In the present embodiment, for example, a color information processing program 191, a color difference calculation processing program 193, an image buffer 195, and color difference history data 197 are stored in the storage unit 190.
The color information processing program 191 is a program that is read by the processing unit 110 and executed as color information processing.
The color difference calculation processing program 193 is a program that is read by the processing unit 110 and executed as a color difference calculation process.
The image buffer 195 is, for example, a buffer storing an image captured by the imaging unit 150.
The color difference history data 197 is, for example, data stored by associating the color difference calculated by the color difference calculating unit 115 with the date and time (or time) counted by the timepiece unit 170.
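One possible shape for the color difference history data 197, pairing each calculated color difference with the date and time obtained from the timepiece unit 170; the record and field names are assumptions for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class ColorDifferenceRecord:
    captured_at: datetime     # date and time counted by the timepiece unit 170
    color_difference: float   # color difference calculated for that captured image


# Color difference history data: records appended in time series (B5).
color_difference_history: List[ColorDifferenceRecord] = []
```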
In addition to these functional units, for example, a gyro sensor or the like that detects the axial angular velocity of three axes may be provided.
< Processing >
Fig. 5 is a flowchart showing an example of the flow of the color information processing performed by the processing unit 110 of the terminal 100A in the present embodiment.
First, the processing unit 110 determines whether or not imaging is performed by the imaging unit 150 (B1), and if it is determined that imaging is performed (B1: yes), the data of the imaged image is stored in the image buffer 195. Then, the color difference calculating unit 115 performs a color difference calculating process (B3). Specifically, based on the processing illustrated in fig. 3 or the like, a color difference is calculated as color relative information for the captured image. Then, the processing unit 110 associates the calculated color difference with time information (date and time, etc.) of the timepiece unit 170 and stores the association in the color difference history data 197 (B5).
The processing unit 110 may obtain data of a plurality of captured images by the image capturing unit 150, and calculate color relative information (color difference) based on the data of the plurality of captured images stored in the image buffer 195.
Thereafter, the processing unit 110 determines whether or not to display the color difference history information (B7). Specifically, for example, it determines whether or not the user has input an instruction to display the color difference history information via the operation unit 120, the touch panel 125, or the like.
When it is determined that the color difference history information is displayed (B7: yes), the processing unit causes the display unit 130 to display the color difference history information based on the history of the color difference stored in the color difference history data 197 (B9).
Then, the processing unit 110 determines whether to end the processing, and if it is determined to continue the processing (no in B11), returns the processing to B1.
On the other hand, if it is determined to end the processing (yes in B11), the processing unit 110 ends the processing.
If it is determined that the image capturing unit 150 does not capture an image (no in B1), the processing unit 110 advances the process to B7.
If it is determined that the color difference history information is not displayed (no in B7), the processing unit 110 advances the process to B11.
In addition, when the terminal 100A displays the color difference history information, the terminal may display the color difference in association with the imaging date and time or the imaging time of the imaging unit 150 based on the time information of the clock unit 170.
Further, when the terminal 100A displays the color difference history information, the color difference may be displayed in association with the environmental information detected by the environmental information detection unit 160. More specifically, the detection result of the environmental information detection unit 160 at the date and time or the time corresponding to the imaging date and time or the imaging time of the imaging unit 150 is used to cause the color difference and the environmental information to be displayed in correspondence with the date and time or the time.
Further, as the environmental information, any information of temperature and humidity may be displayed. The environment information can be obtained from an environment information providing server, not shown, via the communication unit 180.
In the above processing, an example in which a color difference (scalar value) is calculated and displayed as color relative information is shown, but the present invention is not limited to this. As described above, the color relative vector may be calculated and displayed as color relative information. For example, in a color space, color relative vectors may be represented by arrows.
Examples of these display screens will be described in the following embodiment as display screens of the skin analysis application.
(2) Embodiments of terminal and Server
Next, an embodiment of a system including a terminal and a server to which the image processing apparatus 1 is applied, or a server including the image processing apparatus 1, will be described. The server may be provided, for example, as a server that communicates with the user's terminal described above (e.g., a server in a client-server system).
Here, an example of a client server system will be described in which a user terminal communicates with a server and color relative information is displayed by an application program for analyzing human skin (hereinafter referred to as a "skin analysis application program").
The skin analysis application (application program) may be an application program that is downloaded from a server and stored in a storage unit of the terminal and executed, or an application program (e.g., web application program) that is executed without being downloaded.
Functional structure
Fig. 6 is a diagram showing an example of a functional configuration of a terminal 100B as an example of the terminal 100 in the present embodiment.
The same reference numerals are given to the same constituent elements as those of the terminal 100A shown in fig. 4, and a description thereof will be omitted.
In the present embodiment, the processing section 110 of the terminal 100B includes, for example, the display control section 116 described above as a functional section.
In the present embodiment, the communication unit 180 of the terminal 100B communicates with the server 200 that manages various information related to the skin analysis application via the network 300.
In the present embodiment, for example, the storage unit 190 of the terminal 100B stores a skin analysis application processing program 192 that is read by the processing unit 110 and executed as skin analysis application processing, an application ID194 that is information on the terminal 100B using the skin analysis application or an account of a user of the terminal 100B, and the above-described image buffer 195.
Further, in the present embodiment, the terminal 100B may not include the environment information detecting section 160, but obtain the environment information from the server 200, for example.
Fig. 7 is a diagram showing an example of the functional configuration of the server 200 in the present embodiment.
The server 200 includes, for example, a processing unit 210, a display unit 230, an environmental information obtaining unit 260, a timepiece unit 270, a communication unit 280, and a storage unit 290, which are connected via a bus B.
The HW configuration of the processing unit 210, the display unit 230, the timepiece unit 270, the communication unit 280, and the storage unit 290 may be the same as that of the terminal 100, and therefore, the description thereof will be omitted.
The environmental information obtaining unit 260 obtains, for example, environmental information detected by an environmental information detecting unit (a temperature sensor, a humidity sensor, or the like) included in the own device, or environmental information from an environmental information providing server, not shown, that provides environmental information. In the case of obtaining from the environment information providing server, the communication section may be set as the environment information obtaining section.
The communication unit 280 transmits and receives information (data) to and from other devices including the terminal 100B via the network 300 under the control of the processing unit 210.
The storage unit 290 stores, for example, management data related to the terminal 100B using the skin analysis application and to the user of the terminal 100B.
For example, color difference history data, in which color differences are stored in association with the time information of the timepiece unit 270, is stored in the storage unit 290 as a database keyed by application ID.
Further, in the present embodiment, the server 200 may not include the environment information obtaining section 260, but obtain the environment information from the terminal 100B.
< Processing >
Fig. 8 is a flowchart showing an example of the flow of processing performed by each device in the present embodiment. In the figure, the left side of the drawing shows an example of skin analysis application processing performed by the processing unit 110 of the terminal 100B, and the right side of the drawing shows an example of management processing related to skin analysis application performed by the processing unit 210 of the server 200.
First, the processing unit 110 of the terminal 100B determines whether or not imaging has been performed by the imaging unit 150 by the skin analysis application (C1). When it is determined that image capturing has been performed (C1: yes), the processing unit 110 of the terminal 100B transmits image capturing data including the application ID194 stored in the storage unit 190 and the data of the captured image to the server 200 via the communication unit 180, for example (C3). The processing unit 110 of the terminal 100B may repeat imaging by the imaging unit 150 and transmit imaging data including data of a plurality of imaging images.
The processing unit 210 of the server 200 determines whether or not image capturing data has been received from the terminal 100B via the communication unit 280 (S1). If it is determined that the image capturing data has been received (S1: yes), the processing unit 210 performs, for example, a process of calculating the color difference between the first color information and the second color information from the captured image included in the image capturing data received from the terminal 100B, in accordance with a color difference calculation processing program (not shown) stored in the storage unit 290 (S3). Then, the processing unit 210 of the server 200 associates the calculated color difference with the time information (date and time, etc.) of the timepiece unit 270, and stores it as color difference history data corresponding to the application ID included in the received image capturing data (S5).
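A rough sketch of the server-side bookkeeping in S3 to S5, assuming the image capturing data arrives as an application ID plus a captured image and that a color difference calculation like the earlier sketches is available as a callable; all names are illustrative assumptions.

```python
from collections import defaultdict
from datetime import datetime

# Color difference history database keyed by application ID (S5).
color_difference_db = defaultdict(list)


def handle_imaging_data(app_id: str, captured_image, calc_color_difference) -> None:
    """S3/S5: calculate the color difference for a received image and store it per app ID."""
    color_difference = calc_color_difference(captured_image)   # first vs. second region (S3)
    color_difference_db[app_id].append((datetime.now(), color_difference))
```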
After C3, the processing unit 110 of the terminal 100B determines whether or not to display the color difference history information by the skin analysis application (C5). Specifically, for example, it determines whether or not the user has input an instruction, via the operation unit 120, the touch panel 125, or the like, to display the color difference history information in the skin analysis application.
If it is determined that the color difference history information is to be displayed (C5: yes), the processing unit 110 of the terminal 100B transmits, to the server 200 via the communication unit 180, a color difference history information request including the application ID194 stored in the storage unit 190 and information requesting the color difference history information, for example (C7).
After S5, the processing unit 210 of the server 200 determines whether or not there is a request for displaying color difference history information from the terminal 100B (S7), and if it is determined that there is a request (S7: yes), the processing unit transmits color difference history information including the history of color differences for the set amount of time to the terminal 100B, for example, through the communication unit 280, based on the color difference history data stored in the storage unit 290 and corresponding to the received application ID194 (S9).
After C7, when the communication unit 180 receives the color difference history information from the server 200, the processing unit 110 of the terminal 100B displays the received color difference history information on the display unit 130 through the skin analysis application (C9).
Thereafter, the processing unit 110 of the terminal 100B determines whether to end the processing (C11), and if it is determined to continue the processing (C11: no), returns the processing to C1.
On the other hand, if it is determined to end the processing (yes at C11), the processing unit 110 of the terminal 100B ends the processing of the skin analysis application.
After S9, the processing unit 210 of the server 200 determines whether to end the processing (S11), and if it is determined to continue the processing (S11: no), returns the processing to S1.
On the other hand, if it is determined to end the processing (yes in S11), the processing unit 210 of the server 200 ends the processing.
If it is determined that image capturing by the image capturing unit 150 has not been performed (C1: no), the processing unit 110 of the terminal 100B advances the process to C5.
If it is determined that the color difference history information is not displayed (C5: no), the processing unit 110 of the terminal 100B advances the process to C11.
If it is determined that the image capturing data has not been received (S1: no), the processing unit 210 of the server 200 advances the process to S7.
If it is determined that no color difference history information is required for display (S7: no), the processing unit 210 of the server 200 advances the process to S11.
In this process, only one terminal 100B is illustrated, but each terminal 100 using the skin analysis application program can perform the same process. The server 200 may perform the same process for each terminal 100.
In the above processing, an example in which a color difference (scalar value) is calculated and displayed as color relative information is shown, but the present invention is not limited to this. As described above, the color relative vector may be calculated and displayed as color relative information. For example, in a color space, color relative vectors may be represented by arrows.
< Display Screen >
Fig. 9 is a diagram showing an example of a screen of the skin analysis application displayed on the display unit 130 of the terminal 100 in the present embodiment.
The screen is an example of a navigation screen (B1 in fig. 5 and C1 in fig. 8) displayed when the user performs an input to start imaging with the image pickup unit 150 using the skin analysis application. At the upper part of the screen, the text "STEP 1" and a navigation area R1 that conveys, with an illustration, how to take the image are displayed. The illustration shows a woman holding the terminal 100 in her right hand and raising the opposite arm to take a photograph. Below the illustration, the text "Please shoot so that the cheek and the inner side of the upper arm fit in the frame" is displayed as more detailed navigation, and an "OK" button BT1 for proceeding to the next screen is displayed below that.
When the "OK" button BT1 is clicked, for example, the display is switched to the screen shown in fig. 10 (B1 in fig. 5 and C1 in fig. 8).
The screen is an example of an image capturing screen on which the user captures an image with the skin analysis application, and includes, for example, a live view image from the front camera and an image capturing button BT3 shown as concentric circles for capturing an image. In addition, as in fig. 9, the text "Please shoot so that the cheek and the inner side of the upper arm fit in the frame" is displayed at the bottom. On this screen, the user can capture an image by pressing the image capturing button BT3 after taking the posture shown in fig. 9.
Fig. 11 is a view showing an example of a color difference history information display screen (B9 in fig. 5 and C5: yes to C9 in fig. 8) displayed when a user inputs color difference history information to be displayed by the skin analysis application.
The screen shows a state in which the "skin chart" tab has been tapped among the tabs, provided at the upper part of the screen, for the plurality of functions available to the user in the skin analysis application.
At the center of the screen, the history of the color difference over a past predetermined period (in this example, the period from November 1, 2021 to November 7, 2021) is represented graphically. In this example, the user took an image every day during those 7 days, and the change in the color difference per day is represented by a graph (a line graph in this example) in which the horizontal axis is the date and the vertical axis is the color difference.
The display is not limited to a line graph, and may be a bar graph or the like. The numerical values themselves may also be shown instead of a chart.
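For illustration, a history screen like the one described could be rendered with a plotting library such as matplotlib; the sketch below is an assumption about presentation only, not the terminal's actual UI code.

```python
import matplotlib.pyplot as plt


def plot_color_difference_history(dates, color_differences):
    """Draw the color difference history as a line graph (date on x, color difference on y)."""
    plt.plot(dates, color_differences, marker="o")
    plt.xlabel("Date")
    plt.ylabel("Color difference")
    plt.title("Skin chart")
    plt.show()
```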
In this example, the words "skin condition" are displayed in association with the vertical axis. A poor icon IC1 (in this example, a non-smiling face icon) indicating a poor skin condition is displayed on the upper side of the vertical axis, and a good icon IC2 (in this example, a smiling face icon) indicating a good skin condition is displayed on the lower side of the vertical axis. Further, a plurality of downward triangles indicating, in stages, that the skin condition improves from the poor icon IC1 toward the good icon IC2 are displayed.
In this example, the cheek region of the user is set as the determination target region (measurement target region), and the region on the inner side of the user's upper arm is set as the comparison target region. The above-described process is performed with the cheek region of the user as the first region and the region on the inner side of the upper arm as the second region.
As described above, it is considered that the inside of the upper arm is not easily affected by disturbance such as sunlight, and skin color is easily maintained. Therefore, in this example, the color of the inner side of the upper arm is set to the skin color (ideal value) of the user as the target. That is, the color of the region inside the upper arm is set as a reference for whitening. Then, based on an image captured by the user, the degree to which the color of the cheek of the user deviates from the color of the inner side of the upper arm is reported to the user using the color difference as an index value.
According to the above, it is considered that the smaller the color difference is, the closer the color of the cheek of the user is to the ideal value. Conversely, the greater the color difference, the more the color of the user's cheek deviates from the ideal value.
Fig. 12 is a view showing another example of the color difference history information display screen of fig. 11.
In this screen, graphs of the time series of temperature and humidity as environmental information are displayed in association with the time-series graph of the color difference shown in fig. 11. That is, the color difference information and the environmental information are displayed in correspondence with their timings.
By performing such display, the user can compare the time-series color difference with the time-series temperature or humidity, and can analyze under what environment the skin condition tends to be good, and conversely, under what environment the skin condition tends to be poor.
Further, as the environmental information, any information of temperature and humidity may be displayed.
In this case, for example, in step S9 of fig. 8, the processing section 210 of the server 200 may transmit the information of the time-series color difference and the environment information obtained by the environment information obtaining section 260 to the terminal 100. In this case, the server 200 may obtain environmental information in the region or the position of the terminal 100 based on information of the region in which the user registered in advance resides or information of the position of the terminal 100 transmitted from the terminal 100, and transmit the environmental information to the terminal 100. That is, in the example, the terminal 100 may obtain the environment information from the server 200.
In addition, instead of the above, the server 200 may receive image capturing data including the environmental information from the terminal 100 (C3, S1: yes in fig. 8), and store the environmental information received from the terminal 100 in the storage unit 290. Then, information obtained by associating the color difference history information with the environment information may be transmitted to the terminal 100 (S9 of fig. 8).
< Action, effect of the embodiment >
In the above embodiment, the terminal (an example of the image processing apparatus) calculates the first color information in the first area included in the captured image and the second color information in the second area included in the captured image. Then, the terminal calculates color relative information (an example of relative information relating to the relative relationship) of the calculated first color information and second color information. Then, the terminal outputs a time-series color difference with respect to a plurality of captured images obtained by capturing images at different timings.
Thus, the terminal can calculate the relative information relating to the relative relationship between the first color information in the first region included in the captured image and the second color information in the second region included in the captured image. Further, with respect to a plurality of captured images obtained by capturing images at different timings, the time-series relative information can be output.
In this case, the color relative information may be, for example, information of color difference.
Thus, the terminal can output time-series color difference information on a plurality of captured images obtained by capturing images at different timings.
In this case, the first region and the second region may be different regions of the same subject.
Thus, color relative information based on color information of different areas of the same subject can be calculated. For example, the second region may be a comparison target region, and the criterion may be to determine how far the color of the first region of the same subject deviates from the color of the second region of the same subject.
Also, in this case, the first region may be set as a face region.
Thus, the criterion may be set to determine how far the color of the face region of the same subject deviates from the color of a region different from the face region of the same subject.
Also, in this case, the first region may be set as a cheek region.
Thus, the criterion can be set to determine how far the color of the cheek region of the same subject deviates from the color of a region different from the face region of the same subject.
Also, in this case, the first region may be set as a neck region.
Thus, the criterion can be set to determine how far the color of the neck region of the same subject deviates from the color of a region different from the neck region of the same subject.
Also, in this case, the second region may be set as a region inside the arm portion.
Thus, the criterion may be to determine how far the color of the region different from the inside of the arm of the same subject deviates from the color of the region inside the arm of the same subject.
In this case, the second region may be an area on the inner side of the upper arm or an area on the inner side of the wrist.
Thus, the criterion may be to determine how far the color of the region different from the region inside the upper arm of the same subject deviates from the color of the region inside the arm of the same subject. The criterion may be to determine how far the color of the region different from the region inside the wrist of the same subject deviates from the color of the region inside the wrist of the same subject.
Further, in the embodiment, the user's terminal may include any one of the image processing apparatuses described above, an image pickup section 150 that picks up the captured image, and a display section 130 that displays the color relative information.
In this way, the relative information calculated based on the image captured by the imaging unit can be displayed on the display unit, and the user can be identified.
In this case, the display unit 130 may display the color-related information in association with the information of the imaging date and time or the imaging time (an example of the information capable of specifying the timing).
This enables the user to recognize the timing of image capturing and the relative information. The timing of image capturing can be referred to, so that convenience for the user can be improved.
In this case, the user terminal may further include a communication unit 180 (an example of an environment information obtaining unit that obtains environment information) that receives the environment information from the environment information detecting unit 160 or the server, and the display unit 130 may display the color relative information in association with the environment information.
This enables the user to recognize the acquired environmental information and relative information. Since the environment information can be referred to, the user's convenience can be improved.
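For illustration only, the following is a minimal sketch of associating each item of relative information with its capture timing and environment information for display; the field names and the environment keys are assumptions.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ColorDiffRecord:
        captured_at: datetime    # imaging date and time
        color_difference: float  # the relative information
        environment: dict        # e.g. {"temperature": 8.5, "humidity": 35}

    def history_for_display(records):
        # Return the records in time-series order for the display unit.
        return sorted(records, key=lambda r: r.captured_at)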
In the above embodiment, the server (an example of the image processing apparatus) calculates first color information in a first region included in a captured image and second color information in a second region included in the captured image. The server then calculates color relative information (an example of relative information relating to the relative relationship) between the calculated first color information and second color information, and outputs the color difference in time series for a plurality of captured images captured at different timings.
Thus, the server can calculate relative information relating to the relative relationship between the first color information in the first region and the second color information in the second region of a captured image, and can output that relative information in time series for a plurality of captured images captured at different timings.
In the above embodiment, the server includes any one of the image processing apparatuses and a communication unit that receives the captured image from the terminal and transmits the calculated relative information to the terminal.
In this way, the server can acquire the captured image from the terminal, calculate the relative information based on it, and transmit the calculated relative information to the terminal. From the terminal's point of view, it only needs to transmit the captured image to the server without performing the calculation itself, so its processing load can be reduced.
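For illustration only, the following is a minimal sketch of the terminal-side exchange in such a client-server configuration; the server URL, endpoint, and response format are assumptions and not part of the embodiment.

    import requests

    def request_relative_info(image_path, server_url="https://example.com/analyze"):
        # Send the captured image to the server and receive the calculated
        # relative information (here, a color difference) in the response.
        with open(image_path, "rb") as f:
            response = requests.post(server_url, files={"image": f}, timeout=30)
        response.raise_for_status()
        return response.json()["color_difference"]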
Other embodiments
Hereinafter, other embodiments (modifications) will be described.
(1) Mode
The above embodiment has been described, as an example, for the case where the user captures images with a bare face (no makeup).
However, the present invention is not limited to this, and may be applied to a case where a user performs image capturing in a state of applying foundation, for example.
For example, the mode for capturing images with a bare face is referred to as the "plain mode", and the mode for capturing images with foundation applied is referred to as the "foundation mode". The user can select either the "plain mode" or the "foundation mode" to take an image.
In this case, although not shown, a user interface (UI) is provided that allows the user to select either the "plain mode" or the "foundation mode", either before the screen shown in fig. 9 is displayed or within the screen shown in fig. 9. When the "plain mode" is selected, the user captures an image on the capture screen of fig. 10 with a bare face; when the "foundation mode" is selected, the user captures an image on the capture screen of fig. 10 with foundation applied.
In each mode, the image captured in the mode may be subjected to the same processing as described above, and the result may be displayed on the terminal 100.
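For illustration only, the following is a minimal sketch of keeping the results per mode so that each mode's history can be displayed separately; the mode names and the history structure are assumptions.

    PLAIN_MODE = "plain"            # captured with a bare face
    FOUNDATION_MODE = "foundation"  # captured with foundation applied

    def process_capture(image, mode, compute_diff, history):
        # compute_diff is the color-difference function (for example the
        # earlier sketch); results are kept per mode for the tab display.
        history.setdefault(mode, []).append(compute_diff(image))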
Specific examples are described in the following sections.
(2) Informing the user
The terminal 100 may notify the user based on the color relative information.
Fig. 13 is a diagram showing an example of a color difference history information display screen displayed on the display unit 130 of the terminal 100 in the present embodiment. The method of observing the screen is the same as that of fig. 11.
Here, as an example, a case is illustrated in which different notification content is provided in each of the two modes, the "plain mode" and the "foundation mode", described in "(1) Mode".
In the screen of fig. 13, a mode switching tab MT1 is provided below the region, at the top of the screen, in which the tabs for the functions available to the user in the skin analysis application are displayed. The mode switching tab MT1 includes a tab for displaying color difference history information based on images captured in the "plain mode" and a tab for displaying color difference history information based on images captured in the "foundation mode". In this example, because the user has clicked the "plain mode" tab of the mode switching tab MT1, that tab is displayed in reverse video, and the history of the color differences calculated in the "plain mode" is displayed below it.
In this example, the time-series color difference graph shows a case in which the color difference is equal to or greater than a set value (threshold), or exceeds it. Specifically, the color difference calculated from the image captured on November 3, 2021 is equal to or greater than the set value, and an attention mark MK1 is displayed next to that color difference value. When the user clicks the attention mark MK1, a screen such as that shown in fig. 14 is displayed, for example.
In this screen, in response to the click on the attention mark MK1, a region for displaying notification information transmitted from the server is formed below the region in which the graph is displayed. In this example, the region R3, shown as a speech bubble from the angel illustration at the bottom of the screen, displays text such as "Your skin tends to become dry and rough", text conveying points of attention to the user, and an "OK" button for dismissing these displays.
That is, in the above example, notification information for the user is displayed on the display unit 130 based on the color difference satisfying a setting condition of being equal to or greater than a first set value, or exceeding the first set value.
Specifically, as the notification information, information is displayed that treats the second color information as the ideal and reminds the user that the first color information deviates from that ideal; more specifically, information calling the user's attention to the skin condition (informing the user that the skin condition is poor) is displayed.
Conversely to the above example, although not shown, notification information for the user may be displayed on the display unit 130 based on the color difference satisfying a setting condition of being less than a second set value, or equal to or less than the second set value. The second set value may be set to a value smaller than the first set value.
Specifically, as the notification information, information notifying the user that the first color information is close to the ideal, with the second color information as the ideal, may be displayed; more specifically, information notifying the user that the skin condition is good may be displayed.
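For illustration only, the following is a minimal sketch of this notification decision; the numerical set values and the message texts are assumptions, as the actual conditions and wording are design choices of the application.

    FIRST_SET_VALUE = 12.0   # large color difference: call the user's attention
    SECOND_SET_VALUE = 4.0   # small color difference: report a good condition

    def notification_for(color_difference):
        if color_difference >= FIRST_SET_VALUE:
            return "Your skin tends to become dry and rough. Please take care of it."
        if color_difference < SECOND_SET_VALUE:
            return "Your skin is in good condition."
        return None  # no notification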
The display form of the notification is merely an example, and is not limited thereto.
For example, when the attention mark MK1 is clicked, the notification information may be displayed as a dialog box, a speech bubble, or the like, positioned at the attention mark MK1 or at the color difference value marked with it.
Alternatively, the notification information may be displayed without displaying the attention mark MK1.
The notification is not limited to display, and may be performed by outputting sound (including voice) from the sound output unit 140. That is, the notification information may be sound information (including voice information).
For example, a warning sound or a voice announcement calling attention may be output from the sound output unit 140 based on the color difference satisfying the setting condition of being equal to or greater than the first set value, or exceeding it. The announcement may have the same content as the text shown in fig. 14, or different content.
Likewise, a celebratory sound or a congratulatory voice announcement may be output from the sound output unit 140 based on the color difference satisfying the setting condition of being less than the second set value, or equal to or less than it.
As described above, the "plain mode" and the "foundation mode" need not necessarily be provided; if the design basically assumes that images are captured with a bare face, the mode switching tab MT1 need not be displayed in the screens of fig. 13 and fig. 14.
Fig. 15 is a diagram showing an example of a color difference history information display screen displayed on the display unit 130 of the terminal 100 in the present embodiment.
In this screen, because the user has clicked the "foundation mode" tab of the mode switching tab MT1, that tab is displayed in reverse video, and the color difference history information for the "foundation mode" is displayed below it.
Here, in the same manner as fig. 13, the graph of the time-series color difference in the "foundation mode" shows a case in which the color difference is equal to or greater than a set value, or exceeds it. In this example, the color difference calculated from the image captured in the "foundation mode" on December 4, 2021 is equal to or greater than the set value, and the attention mark MK1 described above is displayed next to that color difference value. When the user clicks the attention mark MK1, a screen such as that shown in fig. 16 is displayed, for example.
In this screen, a region for displaying notification information transmitted from the server is formed below the region in which the graph is displayed. In this example, the region R5, shown as a speech bubble from the angel illustration at the bottom of the screen as in fig. 14, displays text such as "Has your foundation changed? The difference from your skin tone has become large.", text conveying points of attention to the user, and an "OK" button for dismissing these displays.
Since the "foundation mode" is used, the notification content differs from that of the "plain mode" in fig. 14, and notification information corresponding to the "foundation mode" is displayed.
That is, in the above example, notification information for the user is displayed on the display unit 130 based on the color difference calculated in the "foundation mode" satisfying a setting condition of being equal to or greater than a third set value, or exceeding the third set value.
Specifically, as the notification information, information is displayed that treats the second color information as the ideal and reminds the user that the first color information deviates from that ideal; more specifically, for example, information calling attention to the fact that the foundation (the foundation color) does not match is displayed.
Conversely to the above example, although not shown, notification information for the user may be displayed on the display unit 130 based on the color difference calculated in the "foundation mode" satisfying a setting condition of being less than a fourth set value, or equal to or less than the fourth set value.
The fourth set point may be set to a value smaller than the third set point.
Specifically, as the notification information, information notifying the user that the first color information is close to the ideal, with the second color information as the ideal, may be displayed; more specifically, for example, information notifying the user that the foundation (the foundation color) matches may be displayed.
The display form of the notification is merely an example, and is not limited thereto.
For example, when the attention mark MK1 is clicked, the notification information may be displayed as a dialog box, a speech bubble, or the like, positioned at the attention mark MK1 or at the color difference value marked with it.
Alternatively, the notification information may be displayed without displaying the attention mark MK1.
The notification is not limited to display, and may be performed by outputting sound (including voice) from the sound output unit 140. That is, the notification information may be sound information (including voice information).
For example, a warning sound or a voice announcement calling attention may be output from the sound output unit 140 based on the color difference satisfying the setting condition of being equal to or greater than the third set value, or exceeding it. The announcement may have the same content as the text shown in fig. 16, or different content.
Likewise, a celebratory sound or a congratulatory voice announcement may be output from the sound output unit 140 based on the color difference satisfying the setting condition of being less than the fourth set value, or equal to or less than it.
In addition, when the user clicks the "plain mode" tab of the mode switching tab MT1 shown in this display screen, the "plain mode" tab may be displayed in reverse video and the color difference history information for the "plain mode" may be displayed. In this case, for example, a screen such as fig. 13 described above is displayed, and if that screen includes the attention mark MK1, a screen such as fig. 14 described above may be displayed when the attention mark MK1 is clicked.
The content of the notification (notification information) may, for example, be stored as a database in the storage unit of the terminal 100 or the server 200, with each item of content associated with a set value. Then, depending on which set value's threshold condition the color difference satisfies, the corresponding notification information is read from the database and displayed.
Further, the color difference history information of the "plain mode" and that of the "foundation mode" may be displayed on a single graph (superimposed), aligned by capture timing. The environment information described above may likewise be displayed in correspondence with the timing (superimposed).
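For illustration only, the following is a minimal sketch of such a notification database keyed by mode and threshold condition; the table layout, condition labels, and message texts are assumptions.

    NOTIFICATION_DB = {
        ("plain", "at_or_above_first"):      "Your skin tends to become dry and rough.",
        ("plain", "below_second"):           "Your skin is in good condition.",
        ("foundation", "at_or_above_third"): "Has your foundation changed? It no longer matches your skin tone.",
        ("foundation", "below_fourth"):      "Your foundation matches your skin tone well.",
    }

    def lookup_notification(mode, condition):
        # Read the notification information corresponding to the satisfied
        # threshold condition; returns None when no condition is satisfied.
        return NOTIFICATION_DB.get((mode, condition))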
In the present embodiment, when the color difference (an example of the relative information) satisfies the setting condition, the display unit 130 of the user terminal displays notification information to the user.
Thus, when the relative information satisfies the setting condition, the user can be notified by the display of the notification information.
In this case, the setting condition may include the color difference being equal to or greater than the first set value, or exceeding it, and the notification information may include information that treats the second color information as the ideal and reminds the user that the first color information deviates from that ideal.
Therefore, when the color difference is larger than a certain value, the user can be reminded that the first color information deviates from the ideal.
Also, in this case, the notification information may include information reminding the user of the skin state of the user.
Thus, when the color difference is a value greater than a certain level, the user can be reminded of the skin condition of the user.
Further, the setting condition may include the color difference being less than a second set value, which is smaller than the first set value, or equal to or less than that second set value, and the notification information may include information notifying the user that the first color information is close to the ideal, with the second color information as the ideal.
Thus, when the color difference is small to a certain extent, the user can be notified that the first color information is near ideal.
In this case, the notification information may include information notifying the user that the skin condition of the user is good.
Thus, when the color difference is small to a certain extent, the user can be notified that the skin condition of the user is good.
(3) Staged notification
For example, stepwise set values may be used, and different notification information may be provided depending on which set value the color difference is equal to or greater than (or exceeds). As described above, the notification may be given by display or by sound output.
For example, in the "plain mode", when the color difference becomes equal to or greater than a set value A, defined as the lowest set value (or exceeds the set value A), the terminal displays notification information calling attention to the fact that "the skin condition has deteriorated slightly". When the color difference becomes equal to or greater than a set value B larger than the set value A (or exceeds the set value B), the terminal may display notification information calling attention to the fact that the skin condition has deteriorated further.
The same applies to the case where the color difference is less than a set value (or equal to or less than it). In this case, the terminal can notify, according to the stepwise set values, that the lower the set value below which the color difference falls, the better the skin condition.
Moreover, these matters can be similarly applied to the "foundation mode".
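For illustration only, the following is a minimal sketch of the staged notification; the set values A and B and the message texts are assumptions.

    STAGED_SET_VALUES = [
        (15.0, "Your skin condition has deteriorated further."),   # set value B
        (10.0, "Your skin condition has deteriorated slightly."),  # set value A
    ]

    def staged_notification(color_difference):
        # Evaluate from the largest set value down and return the first
        # matching message, or None when no set value is reached.
        for set_value, message in STAGED_SET_VALUES:
            if color_difference >= set_value:
                return message
        return None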
(4) Zone setting
For example, the user of the terminal 100 may input a setting to the terminal 100 specifying which regions are to be processed.
For example, if the user wants to see the result with the first region set to the cheek and the second region set to the inner side of the wrist, the user can select the cheek as the first region and the inner side of the wrist as the second region, and apply this setting to the terminal 100.
Similarly, if the user wants to see the result with the first region set to the neck region and the second region set to the inner side of the upper arm, the user can select the neck region as the first region and the inner side of the upper arm as the second region, and apply this setting to the terminal 100.
When the terminal 100 calculates the color difference, it may detect, from the captured image, the first region and the second region set by the user input as described above, and then perform the same processing as described above.
When the server 200 performs the processing, the terminal 100 may transmit the setting information of the first and second regions set by the user input as described above to the server 200 together with the captured image data, and the server 200 may detect, from the captured image, the first and second regions indicated by the received setting information and then perform the same processing as described above.
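For illustration only, the following is a minimal sketch of resolving the user's region settings to detectors before the color-difference calculation; the detector functions are placeholders standing in for whatever face or body-part detection the terminal or server actually uses, and the region names are assumptions.

    def detect_cheek(image): ...
    def detect_neck(image): ...
    def detect_inner_wrist(image): ...
    def detect_inner_upper_arm(image): ...

    REGION_DETECTORS = {
        "cheek": detect_cheek,
        "neck": detect_neck,
        "inner_wrist": detect_inner_wrist,
        "inner_upper_arm": detect_inner_upper_arm,
    }

    def detect_selected_regions(image, first_setting, second_setting):
        # Resolve the user's region settings to detectors, then return the
        # detected first and second regions for the color-difference step.
        return (REGION_DETECTORS[first_setting](image),
                REGION_DETECTORS[second_setting](image))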
(5) Processing in a terminal
As described above, the processing is not limited to processing by a client-server application system; all the processing may be performed by the terminal 100. In this case, for example, the configuration example of the terminal 100A shown in fig. 4 may be applied, the processing of the server 200 shown in fig. 8 may likewise be incorporated into the color information processing shown in fig. 5, and the terminal 100A may perform the same processing.
(6) Color system
In the above embodiment, color information expressed by hue, saturation, and lightness as defined in the HSL color space is used, but the present invention is not limited thereto.
Specifically, color information expressed in YCbCr, for example, can also be used, as can color information expressed in RGB. The color systems map onto one another; for example, YCbCr and RGB can be converted into each other by a linear transformation. Whichever color system is used, the same method as in the above embodiment can be applied.
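For illustration only, the following is a minimal sketch of such a linear conversion between RGB and YCbCr, using the common BT.601-derived coefficients; the embodiment does not fix a particular coefficient set, so these values are an assumption.

    import numpy as np

    RGB_TO_YCBCR = np.array([
        [ 0.299,     0.587,     0.114   ],
        [-0.168736, -0.331264,  0.5     ],
        [ 0.5,      -0.418688, -0.081312],
    ])

    def rgb_to_ycbcr(rgb):
        # rgb: array of shape (..., 3), values in [0, 1]; no offset applied.
        return rgb @ RGB_TO_YCBCR.T

    def ycbcr_to_rgb(ycbcr):
        # Inverse of the linear transformation above.
        return ycbcr @ np.linalg.inv(RGB_TO_YCBCR).T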
(7) Terminal
As described above, the user's terminal 100 may be any of various devices such as a camera, a PDA, a personal computer, a navigation device, a wristwatch, or a tablet terminal, in addition to a mobile phone such as a smartphone.
Further, the user's terminal 100 need not necessarily include the imaging unit 150. In this case, for example, the terminal 100 may obtain captured-image data from an external device that includes the imaging unit 150, and perform the image processing based on the obtained data.
(8) Recording medium
In the above embodiments, various programs and data relating to the image processing are stored in the storage unit, and the processing unit reads and executes these programs, thereby realizing the image processing of the above embodiments. In this case, the storage unit of each device may include, in addition to an internal storage device such as a ROM, an EEPROM, a flash memory, a hard disk, or a RAM, a recording medium (external storage device, storage medium) such as a memory card (secure digital (SD) card), a CompactFlash (registered trademark) card, a memory stick, a universal serial bus (USB) memory, a rewritable optical disc (CD-RW), or a magneto-optical disk (MO), and the various programs and data may be stored in these recording media.
Fig. 17 is a diagram showing an example of a recording medium in this case.
In this example, the image processing apparatus 1 is provided with a card slot 410 into which a memory card 430 is inserted, and with a card reader/writer (R/W) 420 that reads information stored in the memory card 430 inserted into the card slot 410 and writes information to the memory card 430.
The card reader/writer 420 writes the programs and data recorded in the storage unit to the memory card 430 under the control of the processing unit. By configuring an external device other than the image processing apparatus 1 to read the programs and data recorded in the memory card 430, the image processing of the embodiment can be realized in that external device.
The recording medium can be applied to various devices such as a terminal, a server, an electronic device (electronic apparatus), a color information analysis device, and an information processing device described in the above embodiments.
[ Others ]
In the embodiment, the image processing apparatus 1 may be configured as an apparatus such as a skin analysis apparatus. Further, a device including the image processing apparatus 1 and the notification unit described above may be configured as, for example, a rough-skin notification device.
The technique of calculating color relative information from the first color information and the second color information of an image whose subject is a person and displaying the color relative information in time series is applicable to various monitoring methods. In one example, the technique may be used as a non-contact vital-sensing technique to measure the fatigue of a worker. The worker is not particularly limited, and may be any worker whose fatigue could cause an accident during work, for example a driver of a large vehicle.
Fig. 18 is a diagram showing an example of a screen of the fatigue measurement application. In one example, the terminal 100 is a smartphone including a built-in camera 101 and a display unit 130. The worker photographs himself or herself with the built-in camera 101, for example daily. The color relative information is calculated by the method described above for the images taken on the different days, and the calculated values are displayed in time series in response to an operation by the terminal operator. Fig. 18 illustrates the change in the color relative information from December 5, 2022 to December 11, 2022. The color relative information is a color difference or a color relative vector; the higher the value, the higher the fatigue reported to the terminal operator, and the lower the value, the lower the fatigue. In this example, a first threshold TH1 is set at the level at which some fatigue is observed, and a second threshold TH2 is set at the level of deep fatigue. A single threshold may be set instead, or a plurality of thresholds may be set stepwise. In the example of fig. 18, fatigue exceeding the first threshold TH1 is detected on December 8, fatigue exceeding the second threshold TH2 on December 9, and fatigue exceeding the first threshold TH1 again on December 10. In one example, when fatigue exceeding a threshold is detected, an external manager or the like may be notified via the communication function of the terminal 100, a message urging rest because of the increased fatigue may be displayed on the display unit 130, or the message may be read aloud by voice.
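For illustration only, the following is a minimal sketch of checking daily fatigue values against the two thresholds TH1 and TH2 described above; the threshold values and the notification hook are assumptions.

    TH1 = 10.0   # some fatigue observed
    TH2 = 15.0   # deep fatigue

    def check_fatigue(history, notify):
        # history maps a date string to the fatigue value (the color relative
        # information); notify(date, message) is the notification hook, e.g.
        # a display message, a voice read-out, or a report to a manager.
        for day, value in sorted(history.items()):
            if value >= TH2:
                notify(day, "Deep fatigue detected. Please take a rest.")
            elif value >= TH1:
                notify(day, "Fatigue is increasing.")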
In other examples, the above technique may be used to confirm the effects of beauty salons, gyms, health foods, and the like. In still another example, the technique can be used as one of the detection conditions for confirming the intention of a dementia patient or the like.
The function of generating time-series data of the color relative information may be implemented by a processing circuit. That is, the processing circuit calculates the color information and the color relative information, and generates time-series data of the color relative information. The processing circuit may be dedicated hardware or may be a CPU (Central Processing Unit, also referred to as a central processing circuit, a processing device, an arithmetic circuit, a microprocessor, a microcomputer, a DSP) that executes a program stored in a memory.
In the case where the processing circuit is dedicated hardware, the processing circuit is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC, a field programmable gate array (FPGA), or a processing circuit combining these.
Fig. 19 is a diagram showing a configuration example of a terminal. In the image processing circuit 10B, data such as JPEG is generated from the data captured by the camera 10A. The image data generated in the image processing circuit 10B is supplied to the processing circuit 10C, which generates the time-series data of the color relative information. The time-series data is then displayed on the display 10D in response to an operation by the terminal user.
Fig. 20 shows a configuration example in which the processing circuit is a CPU. In this case, the functions of the processing circuit are implemented by software, or by a combination of software and firmware. The software or firmware is written as a program and stored in the memory 10c, and the processor 10b realizes the functions by reading and executing the programs stored in the memory 10c. That is, the processing circuit of fig. 20 includes the memory 10c storing a program that, when executed, generates the time-series data of the color relative information. These programs may also cause a computer to execute the procedures and methods of fig. 3, fig. 5, fig. 8, and the like. Here, the memory corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an erasable programmable read-only memory (EPROM), or an EEPROM, or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a digital versatile disc (DVD), or the like. Of course, some of the functions described may be implemented in hardware and others in software or firmware.
Description of symbols
1: Image processing apparatus
100: Terminal
200: Server device
300: Network system

Claims (11)

1. An image processing apparatus comprising: a calculating unit that calculates relative information relating to a difference between first color information in a first region included in a captured image and second color information in a second region included in the captured image; and
an output unit that outputs the relative information in time series for a plurality of captured images captured at different timings.
2. The image processing apparatus according to claim 1, wherein the relative information includes information of a color difference of the first color information and the second color information.
3. The image processing apparatus according to claim 1 or 2, wherein the first region and the second region are different regions of the same subject.
4. The image processing apparatus according to claim 3, wherein the first region is a face region, a cheek region, or a neck region, and the second region is a region on an inner side of an upper arm or a region on an inner side of a wrist.
5. The image processing apparatus according to claim 1, wherein the calculating section calculates the relative information from a plurality of captured images.
6. A terminal, comprising: a camera;
a processing circuit that generates first time-series data relating to the beauty of the skin of the person or second time-series data relating to the fatigue of the person from the image captured by the camera; and
and a display for displaying the first time-series data or the second time-series data.
7. The terminal of claim 6, wherein the processing circuit comprises a processor and a memory.
8. The terminal of claim 6, wherein the display is a touch screen, and the first time-series data or the second time-series data is displayed on the display according to an operation of a terminal user.
9. A monitoring method, comprising: capturing, for the same person, a plurality of images at different capture times; and
generating, from the plurality of images, time-series data relating to the beauty of the person's skin or time-series data relating to the person's fatigue.
10. The monitoring method according to claim 9, wherein the environment information at the time of shooting is reported to a user in addition to the time-series data.
11. The monitoring method of claim 9, comprising: in addition to outputting the time-series data, notifying the user with a message, an illustration, or a sound when the time-series data contains data exceeding a predetermined threshold value.
CN202280077159.6A 2021-12-17 2022-12-13 Image processing device, terminal and monitoring method Pending CN118285113A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-205527 2021-12-17
JP2022-191870 2022-11-30

Publications (1)

Publication Number Publication Date
CN118285113A true CN118285113A (en) 2024-07-02


Similar Documents

Publication Publication Date Title
KR101168546B1 (en) Face collation device, electronic appliance, method of controlling face collation device, and control program of face collation device
US20190081945A1 (en) Portable Device with Thermal Sensor
CN102147856A (en) Image recognition apparatus and its control method
WO2020194497A1 (en) Information processing device, personal identification device, information processing method, and storage medium
JPWO2007142227A1 (en) Image direction determination apparatus, image direction determination method, and image direction determination program
US20170047096A1 (en) Video generating system and method thereof
EP3430897A1 (en) Monitoring device, monitoring method, and monitoring program
US11523062B2 (en) Image capture apparatus and method for controlling the same
CN110099606B (en) Information processing system
CN111144169A (en) Face recognition method and device and electronic equipment
JP2019192082A (en) Server for learning, image collection assisting system for insufficient learning, and image estimation program for insufficient learning
JP2005149370A (en) Imaging device, personal authentication device and imaging method
JP2008209306A (en) Camera
JP2022003526A (en) Information processor, detection system, method for processing information, and program
CN115516502A (en) Organism determination device and organism determination method
CN118285113A (en) Image processing device, terminal and monitoring method
US9088699B2 (en) Image communication method and apparatus which controls the output of a captured image
JP2018200597A (en) Image estimation device
JP7401866B2 (en) Image processing device, terminal, monitoring method
JP2013218393A (en) Imaging device
TW202326608A (en) Image processing device, terminal, and monitoring method
JP2016114480A (en) Management method and management system for blood glucose level data
US11012671B2 (en) Information processing apparatus, image capturing apparatus, and information processing method
CN113454978A (en) Information processing apparatus, information processing method, and recording medium
US11822974B2 (en) Information processing apparatus, control method of information processing apparatus, and recording medium

Legal Events

Date Code Title Description
PB01 Publication