WO2013132636A1 - Indicator and close-up lens unit - Google Patents

Indicator and close-up lens unit

Info

Publication number
WO2013132636A1
Authority
WO
WIPO (PCT)
Prior art keywords
close
lens
subject
image
panel
Application number
PCT/JP2012/056025
Other languages
English (en)
Japanese (ja)
Inventor
貞雄 安達
隆司 松山
Original Assignee
株式会社洛洛.Com
Application filed by 株式会社洛洛.Com
Priority to PCT/JP2012/056025
Publication of WO2013132636A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033: Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B17/56: Accessories
    • G03B17/565: Optical accessories, e.g. converters for close-up photography, tele-convertors, wide-angle convertors
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B43/00: Testing correct operation of photographic apparatus or parts thereof
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/44: Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441: Skin evaluation, e.g. for skin disorder diagnosis

Definitions

  • The present invention relates to an index body mainly used with a close-up lens, and to the technology of a close-up lens unit.
  • Patent Documents 1 and 2 disclose a close-up lens that is attached to a camera module mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like, and that photographs human skin or the like while in contact with or close to the skin.
  • The present invention has been made in view of such problems, and its purpose is to make it possible to reduce the effects of the shooting environment, such as camera function and performance, illumination light, and light source, on an image acquired by a user using a close-up lens.
  • The index body according to one aspect of the present invention is present at a position in contact with or close to the subject when the close-up lens attached to the camera is brought into contact with or close to the subject to photograph it, and at least a part of the index body is included in the shooting range of the camera.
  • With this index body, when a user acquires an image using a close-up lens, an indicator portion bearing a predetermined color, which serves as an index for analysis of an image photographed by the camera, is photographed together with the subject.
  • With this index body, it is possible to analyze the acquired image based on the indicator portion photographed together with the subject. As a result, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.
  • The index body may further include a mark indicating predetermined information, which is photographed together with the subject and the indicator portion when the close-up lens mounted on the camera is brought into contact with or close to the subject to photograph it.
  • The index body may further include an opening in the portion included in the shooting range of the camera when the close-up lens mounted on the camera is brought into contact with or close to the subject to photograph it.
  • The opening may be provided so that the center of the opening is located in the vicinity of the optical axis of the camera.
  • The indicator portion may be provided so as to surround the outer edge of the opening.
  • The opening may have a rotationally symmetric shape.
  • The indicator portion may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  • A plurality of the marks may be arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening.
  • The indicator portion may have a step in the optical axis direction.
  • A close-up lens unit that realizes each of the above configurations may also be provided.
  • Such a close-up lens unit is used by being attached to a camera, and when attached to the camera, at least a part of the close-up lens unit is in contact with or close to the subject at the time of shooting.
  • According to the present invention, it is possible to reduce the influence of the shooting environment, such as the function and performance of the camera, the illumination light, and the light source, on the image acquired by the user using the close-up lens.
  • FIG. 1 is a perspective view showing an example of a close-up lens to which a panel according to an embodiment is attached.
  • FIG. 2 is a bottom perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 3 is a plan view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 4 is a bottom view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 5 is a front view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 6 is a cross-sectional view taken along line A-A of FIG. 3, showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 7 is an exploded perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 8 is a bottom exploded perspective view showing an example of a close-up lens to which the panel according to the embodiment is attached.
  • FIG. 9 shows an example of using a close-up lens equipped with the panel according to the embodiment.
  • FIG. 10 shows an example of a panel according to the embodiment.
  • FIG. 11 is a perspective view showing another mounting example of the panel according to the embodiment.
  • FIG. 12 is an exploded perspective view illustrating another mounting example of the panel according to the embodiment.
  • FIG. 13 is a cross-sectional view showing another mounting example of the panel according to the embodiment.
  • FIG. 14 shows an example of a panel according to another embodiment.
  • FIG. 15A shows an example of a panel according to another embodiment.
  • FIG. 15B shows an example of a panel according to another embodiment.
  • FIG. 15C shows an example of a panel according to another embodiment.
  • FIG. 15D shows an example of a panel according to another embodiment.
  • FIG. 15E shows an example of a panel according to another embodiment.
  • FIG. 15F shows an example of a panel according to another embodiment.
  • FIG. 16 shows an example of a panel according to another embodiment.
  • FIG. 17A shows an example of a panel according to another embodiment.
  • FIG. 17B shows an example of a panel according to another embodiment.
  • FIG. 17C shows an example of a panel according to another embodiment.
  • FIG. 18 is a perspective view showing an example of a panel according to another embodiment.
  • FIG. 19 is a cross-sectional view showing an example of a close-up lens to which a panel according to another embodiment is attached.
  • FIG. 20A shows an example of a panel according to another embodiment.
  • FIG. 20B shows an example of a panel according to another embodiment.
  • FIG. 21 illustrates an outline of a hardware configuration of the information processing system according to the embodiment.
  • FIG. 22 illustrates a schematic hardware configuration of an information processing system according to another embodiment.
  • FIG. 23 illustrates an outline of a functional configuration of the information processing system.
  • FIG. 24 is a flowchart illustrating an example of an information processing procedure performed by the information processing system according to the embodiment.
  • FIG. 25 shows an example of an image acquired in the information processing system according to the embodiment.
  • FIG. 26 shows an example of a screen displayed during shooting in the information processing system according to the embodiment.
  • FIG. 27 is a flowchart illustrating an example of a processing procedure of analysis processing and correction processing by the information processing system according to the embodiment.
  • FIG. 28 illustrates data stored in the information processing system according to the embodiment.
  • Hereinafter, an embodiment according to one aspect of the present invention (hereinafter also referred to as "this embodiment") will be described with reference to the drawings.
  • However, the present embodiment described below is merely an example of the present invention in all respects and is not intended to limit its scope. It goes without saying that various improvements and modifications can be made without departing from the scope of the present invention. That is, in carrying out the present invention, a specific configuration according to each embodiment may be adopted as appropriate.
  • Although data appearing in this embodiment is described in natural language, more specifically it is specified by a pseudo-language, commands, parameters, machine language, or the like that can be recognized by a computer.
  • §1 Close-up lens: In the following, a scene is illustrated in which a user photographs his or her skin with a camera module connected to or mounted on a mobile phone terminal, a portable video game machine, a PC (Personal Computer), or the like.
  • However, the applicable scene of the present invention is not limited to such an example.
  • The present invention is widely applicable to scenes where close-ups of subjects are taken.
  • In this specification, even a device having another name is referred to as a "camera" if the device has or can have a camera function.
  • FIG. 1 is a perspective view showing an example of a close-up lens 50 to which a panel 40 according to this embodiment is attached.
  • FIG. 2 is a bottom perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 3 is a plan view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 4 is a bottom view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 5 is a front view showing an example of the close-up lens 50 to which the panel 40 according to this embodiment is attached.
  • FIG. 6 is a cross-sectional view taken along line A-A in FIG. 3, showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 7 is an exploded perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 8 is a bottom exploded perspective view showing an example of the close-up lens 50 to which the panel 40 according to the present embodiment is attached.
  • FIG. 9 shows an example of using the close-up lens 50 to which the panel 40 according to this embodiment is attached. Note that a rear view, a left side view, and a right side view showing an example of a close-up lens to which the panel 40 according to the present embodiment is attached are each expressed in the same manner as the front view shown in FIG.
  • Hereinafter, the close-up lens 50 according to the present embodiment will be described mainly with reference to these drawings.
  • The close-up lens 50 according to the present embodiment is used to photograph a magnified image of the subject (skin).
  • The close-up lens 50 according to the present embodiment includes a lens portion 54 that realizes this magnified photographing, and an outer portion that surrounds the lens portion 54.
  • The outer portion includes a bottom surface portion 51 that faces the camera when the close-up lens 50 is attached to the camera, a groove portion 52 to which a lens fixing clip used for attaching the close-up lens 50 to the camera is attached, and an upper end portion 53 that contacts or is close to the subject when photographing.
  • The bottom surface portion 51 is the portion that comes into contact with the camera when the close-up lens 50 is attached to the camera. Therefore, a removable or re-attachable adhesive for attaching the close-up lens 50 to the camera may be applied or affixed to the bottom surface portion 51.
  • the groove portion 52 is provided above the bottom surface portion 51 and around the lens portion 54 as shown in FIGS. 5 and 6.
  • the groove portion 52 is a portion to which a lens fixing clip for attaching the close-up lens 50 to the camera is attached.
  • The lens fixing clip is attached to the close-up lens 50 by sandwiching, with the clip, the groove formed by the groove portion 52.
  • The groove portion 52 extends around the entire circumference of the close-up lens 50.
  • the upper end portion 53 is a portion that comes into contact with or is close to the subject when the subject is photographed with the close-up lens 50 attached to the camera. As shown in FIGS. 6 and 7, a recess having substantially the same shape and size as the outer periphery of the panel 40 is formed inside the upper end portion 53 in order to fit a circular panel 40 described later.
  • The lens portion 54 has a convex lens shape in order to magnify the subject.
  • The lens portion 54 is designed to suit the subject to be photographed. For example, when the skin of the hand or face is mainly to be photographed, the lens portion 54 is designed so that the magnification of the lens is, as an example, about 30 times. When the scalp is mainly to be photographed, the lens portion 54 is designed so that the magnification of the lens is, as an example, about 50 to 80 times.
  • In the present embodiment, the lens portion 54 is formed integrally with the outer portion of the close-up lens 50.
  • the user attaches such a close-up lens 50 to the camera and shoots his / her skin as shown in FIG.
  • The subject illustrated in FIG. 9 is the user's own skin (the skin on the back of the hand).
  • However, the subject is not particularly limited, and may be other than the user's own skin.
  • the close-up lens 50 is made of a translucent member, for example.
  • the close-up lens 50 is made of a transparent plastic.
  • the close-up lens 50 is made of a translucent member, so that ambient light can be taken in as illumination light even when photographing with the upper end 53 in contact with the subject.
  • a panel 40 is attached to the close-up lens 50 of the present embodiment.
  • the panel 40 corresponds to an embodiment of the index body of the present invention.
  • the index body of the present invention is not limited to the close-up lens panel exemplified by the panel 40.
  • FIG. 10 shows an example of the panel 40 according to the present embodiment.
  • The panel 40 is present at a position in contact with or close to the subject when the close-up lens 50 attached to the camera is brought into contact with or close to the subject to photograph it, and at least a part of the panel 40 is included in the shooting range 70 of the camera.
  • Thereby, an indicator portion 41 bearing a predetermined color, which serves as an index for analysis of the image photographed by the camera, is photographed together with the subject.
  • As a result, the acquired image can be analyzed based on the indicator portion 41.
  • The shooting range 70 shown in FIG. 10 and elsewhere indicates the range of the panel photographed by the camera when shooting is performed with the panel mounted on the close-up lens 50 and the close-up lens 50 mounted on a camera (a user terminal 2 described later).
  • the panel 40 is attached to the close-up lens 50 by being fitted at the upper end portion 53.
  • the upper end portion 53 is a portion that is in contact with or close to the subject at the time of shooting. Therefore, the panel 40 is present at a position in contact with or close to the subject at the time of shooting.
  • an index portion 41 described later is provided on the surface of the panel 40 on the camera side. Therefore, an index unit 41 to be described later is also present at a position in contact with or close to the subject at the time of shooting.
  • However, the method of mounting the panel 40 according to this embodiment is not limited to this.
  • the panel 40 may be bonded to the upper end portion 53 of the close-up lens 50.
  • the panel 40 may be attached to the close-up lens 50 by fitting the upper end portion 53 of the close-up lens 50 to the panel 40 as a cap of the close-up lens 50.
  • FIG. 11 is a perspective view showing another mounting example of the panel 40 according to the present embodiment.
  • FIG. 12 is an exploded perspective view showing another mounting example of the panel 40 according to the present embodiment.
  • FIG. 13 is a cross-sectional view showing another mounting example of the panel 40 according to this embodiment. As shown in FIGS. 11 to 13, for example, the panel 40 may be attached by being sandwiched between the close-up lens 50 and the cap 60 when the cap 60 is attached to the close-up lens 50.
  • the shape of the panel 40 is a circle.
  • the shape of the panel 40 may be other than circular.
  • the shape of the panel 40 may be a polygon.
  • The panel 40 is fitted into a recess provided in the upper end portion 53 of the close-up lens 50. The recess is therefore formed in substantially the same shape and size as the outer periphery of the panel 40.
  • FIG. 14 shows an example of variations in the shape of the panel 40.
  • the panel 40A shown in FIG. 14 has a semicircular shape.
  • the panel 40A includes an indicator portion 41A that is a variation of the indicator portion 41 of the panel 40 according to the present embodiment.
  • In an image photographed with the panel 40A, the indicator portion 41A appears in the right half of the shooting range 70 and the subject appears in the left half.
  • the shape of the indicator portion 41 according to the present embodiment is a donut shape in which the outer periphery and the inner periphery of the indicator portion 41 are circular. As shown in FIG. 10, the indicator portion 41 is formed on the entire surface of the panel 40 except for the region of the opening 43.
  • the index unit 41 only needs to be photographed together with the subject, and the shape of the index unit 41 is not limited to the shape shown in FIG. The shape of the index part 41 is appropriately selected.
  • FIGS. 15A to 15F illustrate variations in the shape of the indicator portion 41, respectively.
  • In FIGS. 15A to 15F, the same type of mark is given the same reference symbol, and when a mark is referred to regardless of its type, it is simply referred to as a mark 44.
  • The indicator portion 41B of the panel 40B shown in FIG. 15A has a donut shape like the indicator portion 41 but, unlike the indicator portion 41, is not formed over the entire surface of the panel 40B. Further, unlike the indicator portion 41, the indicator portion 41B is given a single color.
  • the shape of the outer periphery of the indicator portion 41C of the panel 40C shown in FIG. 15B is a regular polygon (regular dodecagon in the figure) unlike the indicator portion 41. Further, unlike the indicator unit 41, the indicator unit 41C has a color pattern in which four colors are arranged.
  • the panel 40C has four types of marks (44a to 44d).
  • the shape of the outer periphery of the indicator portion 41D of the panel 40D shown in FIG. 15C is the same as the shape of the outer periphery of the indicator portion 41C.
  • the index part 41D has a larger area than the index part 41C.
  • the index part 41D has a color pattern in which two colors are arranged, like the index part 41.
  • the shape of the outer periphery of the indicator portion 41E of the panel 40E shown in FIG. 15D is the same as the shape of the indicator portion 41.
  • the indicator portion 41E does not cover the entire surface of the panel 40E.
  • the index part 41E has a color pattern in which four colors are arranged, like the index part 41C. Note that a mark 44 described later may not be present in all color pattern regions, as shown on the panel 40E.
  • the shape of the outer periphery of the indicator portion 41F of the panel 40F shown in FIG. 15E is the same as the shape of the outer periphery of the indicator portion 41D.
  • the index part 41F has the same area as the index part 41D.
  • the indicator portion 41F has a color pattern in which four colors are arranged.
  • Note that the indicator portion 41 is not required in order to realize information processing using the mark 44 described later. Therefore, as in the panel 40G shown in FIG. 15F, the indicator portion may be omitted from the viewpoint of realizing information processing using the mark 44.
  • the indicator unit 41 according to the present embodiment has a color pattern in which white and black are alternately arranged.
  • the predetermined color or colors assigned to the index unit 41 may be any color as long as it is an index for analysis of an image captured by the camera, and may be selected as appropriate.
  • the indicator portion 41 may be formed in a single color. Further, for example, the indicator unit 41 may be colored in cyan, magenta, yellow, black, and white. For example, the indicator unit 41 may be colored in red, green, blue, black, and white. As described above, the predetermined one or a plurality of colors attached to the indicator unit 41 may be appropriately selected.
  • FIG. 16 illustrates a variation of the color scheme of the indicator unit 41.
  • In FIG. 16 as well, the region of the opening 43 is the region where the subject appears.
  • On the panel 40H shown in FIG. 16, the region adjacent to the region where the subject is photographed (indicator portion 41G) is colored so that the color changes gradually, clockwise from position B, within the range of colors that the subject can take.
  • The indicator portion 41 may be colored in this way. The effect obtained by such a color arrangement can be described as follows.
  • On the indicator portion 41G, the range of colors that the subject can take is arranged in steps. It is therefore possible to identify, on the acquired image, the position (angle) of the indicator portion 41G that is given the same color as the color of the subject.
  • As an effect of a color arrangement like that of the indicator portion 41G, the position identified in this way can be used as a reference for the color of the subject.
  • On the photographed image, the color of the subject may differ from the actual color of the subject due to the influence of the photographing environment, such as the function and performance of the camera, the illumination light, and the light source.
  • However, since the indicator portion 41G is disposed at a position adjacent to the subject, it is affected by the shooting environment in substantially the same manner as the subject. Therefore, it can be estimated that the actual color of the indicator portion 41G at the position (angle) that appears in the same color as the subject on the captured image is almost the same as the actual color of the subject. The actual color of the subject can thus be estimated by analyzing the photographed image and identifying the position in the indicator portion 41G that has the same color as the subject, as sketched below. In order to obtain such an effect, the indicator portion 41 may be colored like the indicator portion 41G shown in FIG. 16.
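  • As a minimal sketch of this estimation (the helper name and data layout below are assumptions; the patent does not specify a concrete algorithm), one could sample the gradient ring at a set of angles, find the sample whose observed color is closest to the observed subject color, and look up the known printed color at that position:

```python
import numpy as np

def estimate_subject_color(observed_subject_rgb, ring_samples):
    """Estimate the subject's actual color from a gradient ring such as
    indicator portion 41G.

    ring_samples: list of (observed_rgb, printed_rgb) pairs sampled around
    the ring. The printed colors are known in advance from the panel design.
    """
    target = np.asarray(observed_subject_rgb, dtype=float)
    # Find the ring position whose *observed* color best matches the subject.
    best = min(ring_samples,
               key=lambda s: np.linalg.norm(np.asarray(s[0], float) - target))
    # That position is affected by the shooting environment in roughly the
    # same way as the subject, so its *printed* color approximates the
    # subject's true color.
    return best[1]
```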
  • In this embodiment, the indicator portion 41 is provided not on the subject-side surface of the panel 40 but on the camera-side surface.
  • Specifically, the indicator portion 41 is printed on the camera-side surface of the panel 40 by oil-based offset printing, silk printing, or the like.
  • However, the indicator portion 41 may instead be printed on the subject side of the panel 40.
  • Since the panel 40 is made of a translucent member such as transparent plastic, the subject can be photographed even if the indicator portion 41 is printed on the subject side of the panel 40.
  • the index portion 41 may be provided by sticking a sticker on which a picture of the index portion 41 is printed on the surface of the panel 40 on the camera side.
  • the sticker may be attached to the subject side of the panel 40 as in the case of the printing. This also applies to the mark 44 described later.
  • the subject side surface of the panel 40 that cannot be photographed from the camera may be used arbitrarily.
  • The difference between the color of the indicator portion 41 on the acquired image and the actual color of the indicator portion 41 is used in the color correction and illumination unevenness correction of the image, which will be described later.
  • In these corrections, a two-dimensional distribution of this difference, obtained at the positions where the indicator portion 41 appears on the image, is used, as sketched below. Therefore, in order to increase the accuracy of the color correction and illumination unevenness correction described later, it is preferable that the indicator portion 41 be present in all directions around the region where the subject appears in the shooting range 70 of the camera.
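  • As a sketch of how such a two-dimensional difference distribution could be used (assuming the true printed color is known at every indicator pixel; the function and variable names are not from the patent, and linear interpolation is one choice among several), the observed-minus-true difference can be interpolated across the subject region and subtracted. This also illustrates why indicator pixels on all sides of the subject help: the interpolation is then supported in every direction.

```python
import numpy as np
from scipy.interpolate import griddata

def correct_with_indicator(image, indicator_mask, true_colors):
    """image: HxWx3 float array (observed); indicator_mask: HxW bool array
    marking pixels showing the indicator portion 41; true_colors: HxWx3
    array giving the known printed color at indicator pixels."""
    ys, xs = np.nonzero(indicator_mask)
    diff = image[ys, xs] - true_colors[ys, xs]          # observed - actual
    grid_y, grid_x = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    corrected = image.astype(float).copy()
    for c in range(3):   # interpolate the difference field per channel
        field = griddata((ys, xs), diff[:, c], (grid_y, grid_x),
                         method="linear", fill_value=float(diff[:, c].mean()))
        corrected[..., c] -= field
    return np.clip(corrected, 0, 255)
```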
  • the panel 40 includes a mark 44 indicating predetermined information.
  • the mark 44 is photographed together with the subject and the index unit 41 by being included in the photographing range 70 of the camera when photographing the subject with the close-up lens 50 attached to the camera being in contact with or close to the subject.
  • Like the indicator portion 41, the mark 44 may be attached to the camera-side surface of the panel 40.
  • For example, the predetermined information indicated by the mark 44 is information for identifying that the close-up lens 50 is a genuine lens.
  • As another example, the predetermined information indicated by the mark 44 is information for identifying the distributor that distributed the close-up lens 50 together with the panel 40.
  • As yet another example, the predetermined information indicated by the mark 44 is information on a product sold together with the panel 40 and the close-up lens 50.
  • When the mark 44 exists in the acquired image, information processing such as identifying the acquired image based on the predetermined information indicated by the mark 44 becomes possible. That is, the mark 44 may be handled as an identifier for performing predetermined identification.
  • the mark 44 may be, for example, a barcode that represents information in accordance with a predetermined standard.
  • the mark 44 may be a trademark of a distributor, a product manufacturer, a target product, etc., for example.
  • the mark 44 may be set as appropriate.
  • the panel 40 may have a plurality of types of information by including a plurality of types of marks 44.
  • the panel 40 according to the present embodiment includes two types of marks, a mark 44a and a mark 44b.
  • the panel 40C shown in FIG. 15B includes four types of marks 44a to 44d.
  • Note that the mark 44 is not required in order to realize the image analysis using the indicator portion 41. Accordingly, as in some of the panels illustrated above, the mark 44 may be omitted from the viewpoint of realizing that image analysis.
  • the panel 40 according to the present embodiment shown in FIG. 10 has an opening 43 in a portion included in the photographing range 70 of the camera when the close-up lens 50 attached to the camera is brought into contact with or close to the subject and the subject is photographed.
  • the opening 43 is created, for example, by making a hole in the panel 40 by a processing method such as Thomson processing.
  • the region of the opening 43 in the shooting range 70 of the camera is a region where the subject can be seen from the camera side, in other words, a region where the subject is captured. That is, the opening 43 may not be provided in the panel 40 as long as the subject can be photographed.
  • the panel 40 according to the present embodiment is made of a translucent member such as a transparent plastic, it is possible to capture the subject, and thus the opening 43 may not be provided.
  • Even a panel that is not provided with the opening 43 and is not made of a translucent member is acceptable as long as the subject can be photographed, as in the panel 40A shown in FIG. 14.
  • the opening 43 should be provided so as to include the optical axis.
  • the center of the opening 43 is provided in the vicinity of the optical axis of the camera (close-up lens 50).
  • the center of the opening 43 is provided so as to be located on the optical axis.
  • the indicator portion 41 may be provided so as to surround the outer edge of the opening 43 provided so that the center is located near or on the optical axis. As a result, even if the panel 40 is mounted so as to be shifted vertically and horizontally, the index portion 41 exists around the opening 43, so that the index portion 41 can be included in the shooting range 70 of the camera.
  • The opening 43 may have a rotationally symmetric shape. Thereby, even if the panel 40 rotates around the optical axis, the influence of the rotation can be reduced or ignored.
  • the indicator section 41 may have a color pattern arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening 43.
  • a plurality of marks 44 may be arranged according to a predetermined regularity corresponding to the rotational symmetry of the opening 43.
  • the predetermined regularity is a regularity corresponding to the rotational symmetry of the opening 43.
  • The predetermined regularity corresponds, for example, to the rotational symmetry of the opening 43: there exists an angle p (0 < p < 360), other than a multiple of 360 degrees, such that rotation by p does not change the appearance of the indicator portion 41 or the mark 44 included in the shooting range 70, as illustrated below.
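  • As a toy illustration of this regularity (assuming the color pattern is divided into equal angular sectors, which the patent does not require), the angles p by which such a pattern maps onto itself can be enumerated:

```python
def symmetry_angles(sectors):
    """Return the angles p (degrees, 0 < p < 360) by which a circular
    pattern of equal angular sectors maps onto itself."""
    n = len(sectors)
    return [360.0 * k / n for k in range(1, n)
            if all(sectors[i] == sectors[(i + k) % n] for i in range(n))]

# A white/black alternating pattern of 12 sectors, as in indicator portion 41:
print(symmetry_angles(["white", "black"] * 6))
# -> [60.0, 120.0, 180.0, 240.0, 300.0]
```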
  • the shape of the opening 43 according to the present embodiment is a circle.
  • the shape of the opening 43 is not limited to a circle.
  • the shape of the opening 43 is appropriately selected.
  • the shape of the opening 43 includes a polygon, an ellipse, a star, and the like.
  • FIGS. 17A to 17C illustrate variations of the opening 43.
  • the shape of the opening 43A of the panel 40I shown in FIG. 17A is a regular polygon as a shape having rotational symmetry.
  • the indicator portion 41H of the panel 40I is provided on the entire surface of the panel 40I.
  • the indicator portion 41I of the panel 40J shown in FIG. 17B has a shape that protrudes toward the center of the opening 43B.
  • The indicator portion 41J of the panel 40K shown in FIG. 17C has a shape protruding toward the center of the opening 43B, and is given a two-color scheme of white and black.
  • the indicator portion 41J is provided with marks 44a and 44b.
  • the user brings the panel 40 (close-up lens 50) into contact with the subject, for example.
  • the subject may partially enter the close-up lens 50 through the opening 43.
  • the index part 41 and the mark 44 are attached to the surface of the panel 40 on the camera side. Therefore, when the deformation of the subject is not taken into consideration, the index portion 41 and the mark 44 are positioned closer to the lens than the subject by the thickness of the panel 40 in the optical axis direction.
  • the index part 41 and the mark 44 exist at positions different from the subject in the optical axis direction. Therefore, when focusing on the subject, there is a possibility that the index unit 41 and the mark 44 may not be focused due to a difference in position in the optical axis direction.
  • the indicator portion 41 may have one or a plurality of steps in the optical axis direction.
  • the mark 44 may be attached to a surface perpendicular to the optical axis direction of each step formed in the panel 40.
  • FIGS. 18 and 19 show variations of the panel 40.
  • FIG. 18 is a perspective view showing an example of a panel 40L in which one step is provided in the indicator portion 41K.
  • FIG. 19 is a cross-sectional view showing an example of a panel 40L in which one step is provided in the indicator portion 41K.
  • a panel 40M shown in FIG. 20A is a modification of the panel 40B shown in FIG. 15A.
  • the panel 40M has one step at the indicator portion 41L.
  • The panel 40N shown in FIG. 20B is a modification of the panel 40C shown in FIG. 15B.
  • the panel 40N has one step at the indicator portion 41M.
  • the panel 40 may be formed integrally with the close-up lens 50.
  • a unit in which the panel 40 and the close-up lens 50 are integrated corresponds to an embodiment of the close-up lens unit of the present invention.
  • the index body of the present invention may be a sticker that is attached to a subject and that exhibits the same function as the panel 40.
  • Such a sticker can be exemplified, for instance, as a seal on which the same pattern as the surface of the panel 40 is printed.
  • For example, the user affixes to the subject a sticker that reproduces the pattern of the surface of the panel 40, and brings the close-up lens 50 into contact with the sticker to photograph the subject. Thereby, the photographed image obtains the same effect as an image photographed using the panel 40.
  • Such a sticker is mainly used by being stuck to the subject. Therefore, instead of the opening 43 in the center of the sticker, for example, a transparent sheet that reacts with skin oil may be provided. In this way, the user can check the oil content of the skin when photographing the skin using the close-up lens.
  • When a close-up lens mounted on a camera is brought into contact with or close to a subject and the subject is photographed, the information processing system according to this embodiment acquires an image in which a mark indicating predetermined information, existing at a position in contact with or close to the subject, is included in the imaging range of the camera and is photographed together with the subject. The information processing system then extracts the mark from the acquired image, and stores the image in association with the information indicated by the extracted mark.
  • the information processing system operates as described above, and stores an image photographed using the close-up lens in association with information indicated by a mark. Therefore, the stored image can be identified based on information indicated by the mark. As a result, according to the present embodiment, it is possible to efficiently use the images captured and collected using the close-up lens.
  • FIG. 21 illustrates a hardware configuration of the information processing system 1 according to the present embodiment.
  • An information processing system 1 according to the present embodiment includes a user terminal 2 and a server 3 connected via a network 5.
  • Information transmission between the user terminal 2 and the server 3 connected to the network 5 is realized, for example, by data communication via the network 5, such as a 3G (3rd Generation) network, the Internet, a telephone network, or a dedicated network.
  • the type of the network 5 is appropriately selected according to each data communication.
  • The user terminal 2 is an information processing apparatus that includes a control unit 21 including a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, an auxiliary storage device 22 that stores programs executed by the control unit 21, a network interface 23 for performing communication via the network 5, an input device 24 including a camera module 26, and an output device 25 including a display 27, an LED, a speaker, and the like. In FIG. 21 and FIG. 22 described later, the network interface is denoted as "NW I/F".
  • Regarding the specific hardware configuration of the user terminal 2, components may be appropriately omitted, replaced, and added according to the embodiment.
  • the user terminal 2 may be further connected with a mouse, a keyboard, or the like as an input device.
  • the control unit 21 may include a plurality of processors.
  • As the user terminal 2, for example, a PC, a mobile phone, a smartphone, a tablet terminal, a portable game machine, or the like may be used, in addition to a terminal designed exclusively for the provided service.
  • the user photographs the subject using the camera module 26 provided in the user terminal 2.
  • the user attaches the close-up lens 50 attached with the panel 40 to the camera module 26 of the user terminal 2.
  • the user performs photographing in a state where the close-up lens 50 is in contact with or close to his / her skin.
  • By shooting in this way, an image in which the magnified skin, the indicator portion 41, and the mark 44 appear is acquired.
  • The server 3 is an information processing apparatus that includes a control unit 31 including a CPU, a RAM, a ROM, and the like, an auxiliary storage device 32 that stores programs executed by the control unit 31, and a network interface 33 for performing communication via the network 5. In the present embodiment, images taken by the user are stored in the server 3.
  • Note that the server 3 may be implemented by one or a plurality of information processing apparatuses.
  • However, the information processing system 1 is not limited to the example realized by the two information processing apparatuses illustrated in FIG. 21.
  • the information processing system 1 may be realized by one information processing apparatus or may be realized by three or more information processing apparatuses.
  • Regarding the hardware configuration of the information processing system 1, it is possible to omit, replace, and add components according to the embodiment.
  • FIG. 22 illustrates a variation of the hardware configuration of the information processing system 1 according to the present embodiment.
  • An information processing system 1A shown in FIG. 22 is realized by a single information processing apparatus.
  • The information processing system 1A is an information processing apparatus that includes a control unit 10 including a CPU, RAM, ROM, and the like, an auxiliary storage device 11 that stores a program executed by the control unit 10, a network interface 12 for performing communication via a network, an input device 13 including a camera module 26, and an output device 14 including a display, an LED, a speaker, and the like.
  • Regarding the specific hardware configuration of this information processing apparatus, as in the case of the user terminal 2 and the server 3, components may be appropriately omitted, replaced, and added according to the embodiment.
  • FIG. 23 illustrates an outline of a functional configuration of the information processing system 1 according to the present embodiment.
  • the information processing system 1 according to the present embodiment includes an image acquisition unit 15, an image processing unit 16, a storage unit 17, and a display control unit 18.
  • each function may be realized in either the user terminal 2 or the server 3.
  • the display control unit 18 may be realized in the server 3.
  • the image processing unit 16 may be realized in the user terminal 2.
  • one function may be realized across a plurality of information processing apparatuses.
  • the image processing unit 16 may be realized across the user terminal 2 and the server 3.
  • In this case, the user terminal 2 and the server 3 realize the image processing unit 16 by sharing the plurality of processes executed by the image processing unit 16 described later. That is, a plurality of information processing apparatuses may realize one function by sharing the plurality of processes within that function.
  • the plurality of information processing apparatuses cooperate with each other to realize the function of the information processing system 1.
  • data communication for information exchange related to the cooperation is performed between the plurality of information processing apparatuses.
  • the user terminal 2 and the server 3 perform data communication via the network 5 to cooperate with each other to realize the function of the information processing system 1 of the present embodiment.
  • the functional configuration shown in FIG. 23 is only an example of the information processing system 1 according to the present embodiment. Therefore, the functions of the information processing system 1 may be appropriately omitted, replaced, and added according to the embodiment.
  • the display control unit 18 may be omitted when image display or the like at the time of shooting, which will be described later, is omitted.
  • When the close-up lens 50 mounted on the camera module 26 of the user terminal 2 is brought into contact with or close to the subject and the subject is photographed, the image acquisition unit 15 acquires an image in which the mark 44, which indicates predetermined information and exists at a position in contact with or close to the subject, is included in the shooting range of the camera module 26 and is photographed together with the subject.
  • the image processing unit 16 extracts the mark 44 from the image acquired by the image acquisition unit 15. Then, the storage unit 17 stores the image in association with the information indicated by the mark 44 extracted from the image.
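  • A minimal sketch of this association (a hypothetical storage layout; the patent leaves the storage format open, and the decoded mark information is assumed here to be a string identifier such as a distributor ID):

```python
import hashlib
import json
import pathlib

def store_with_mark(image_bytes, mark_info, root="captured"):
    """Store an image so it can later be retrieved by the information the
    extracted mark 44 indicated (mark_info, e.g. a distributor ID)."""
    root_dir = pathlib.Path(root)
    root_dir.mkdir(exist_ok=True)
    name = hashlib.sha1(image_bytes).hexdigest() + ".jpg"
    (root_dir / name).write_bytes(image_bytes)
    index_path = root_dir / "index.json"
    index = json.loads(index_path.read_text()) if index_path.exists() else {}
    index.setdefault(mark_info, []).append(name)   # mark info -> image files
    index_path.write_text(json.dumps(index, indent=2))
```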
  • The image acquisition unit 15 may also acquire an image in which the indicator portion 41, which exists at a position in contact with or close to the subject and bears a predetermined color serving as an index for analyzing the image captured by the camera module 26, is included in the imaging range of the camera module 26 and is photographed together with the subject and the mark 44.
  • the image processing unit 16 may extract the index unit 41 from the image acquired by the image acquisition unit 15. Further, the image processing unit 16 may execute an analysis process on the acquired image based on the predetermined color of the extracted index unit 41. Further, the image processing unit 16 may perform a correction process on the acquired image based on the result of the analysis process of the image. Then, the storage unit 17 may store the image that has been subjected to the correction processing by the image processing unit 16.
  • the image processing unit 16 may perform the subject analysis processing on the portion related to the subject of the image acquired by the image acquisition unit 15. Then, the storage unit 17 may store the result of the subject analysis process executed by the image processing unit 16 in association with the image acquired by the image acquisition unit 15.
  • the image processing unit 16 may instruct the image acquisition unit 15 to acquire an image again when the subject analysis processing fails.
  • the image acquisition unit 15 may acquire an image including a mark indicating a position where the close-up lens 50 is attached in the camera module 26 of the user terminal 2.
  • the image processing unit 16 may determine whether or not the mark is included in a predetermined area on the image acquired by the image acquisition unit 15. If the image processing unit 16 determines that the mark is not included in the predetermined area on the acquired image, the image processing unit 16 may notify that the mounting position of the close-up lens 50 is incorrect.
  • While the camera module 26 of the user terminal 2 captures the subject, the display control unit 18 may output the image acquired by the camera module 26, including the mark indicating the position where the close-up lens 50 is mounted on the camera module 26, and may output, on the output image, guidance information indicating a reference for the mounting position of the close-up lens 50 relative to the mark.
  • FIG. 24 shows an example of the processing procedure of the information processing system 1 according to the present embodiment.
  • the step is abbreviated as “S”.
  • First, the program stored in the auxiliary storage device 22 is loaded into the RAM or the like of the control unit 21. The loaded program is then executed by the CPU of the control unit 21. In this way, the information processing system 1 starts processing.
  • In step 100, when the close-up lens 50 attached to the camera module 26 of the user terminal 2 is brought into contact with or close to the subject and the subject is photographed, the image acquisition unit 15 acquires an image in which the mark 44 indicating the predetermined information, which exists at a position in contact with or close to the subject, is included in the shooting range of the camera module 26 and is photographed together with the subject.
  • the user attaches the close-up lens 50 with the panel 40 attached to the camera module 26 of the user terminal 2. Then, the user operates the user terminal 2 with the close-up lens 50 in contact with or close to his / her skin.
  • the control unit 21 controls shooting by the camera module 26 and acquires an image related to the shooting from the camera module 26. Thereby, the information processing system 1 acquires an image in which the mark (44a, 44b) is photographed together with the subject (skin).
  • FIG. 25 illustrates an image 80 captured by the camera module 26 of the user terminal 2.
  • the region 81 corresponds to the region of the opening 43 of the panel 40 and is a region where the subject (skin) is captured.
  • The region 82 corresponds to the region where the indicator portion 41 and the mark 44 of the panel 40 exist, and is the region where the indicator portion 41 and the mark 44 appear.
  • The boundary 83 is the boundary between the region 81 and the region 82, and corresponds to the boundary between the opening 43 and the indicator portion 41.
  • the image acquisition unit 15 may acquire an image in which the index unit 41 is photographed together with the subject and the mark 44.
  • Conversely, the image acquisition unit 15 may acquire an image that does not include the indicator portion 41.
  • That is, the information processing system 1 may process an image that does not include the indicator portion 41.
  • the boundary 83 corresponds to the boundary between the opening 43 and the index portion 41, that is, the edge that defines the opening 43.
  • the panel 40 provided with the opening 43 is mounted so as to be fitted into the upper end portion 53 of the close-up lens 50. Therefore, when viewed from the camera module 26 of the user terminal 2, the position of the edge that defines the opening 43 corresponds to the position where the close-up lens 50 to which the panel 40 is mounted is present. That is, the position of the edge that defines the opening 43 is determined according to the position where the close-up lens 50 exists, and further, the position of the boundary 83 on the acquired image is determined. Therefore, the boundary 83 serves as a mark indicating the position where the close-up lens 50 is attached in the camera module 26 of the user terminal 2.
  • the image acquisition unit 15 acquires an image including a mark indicating a position where the close-up lens 50 is mounted in the camera module 26 of the user terminal 2.
  • However, the mark is not limited to the boundary 83 illustrated in this embodiment.
  • The mark only needs to indicate the positional relationship between the camera module 26 of the user terminal 2 and the close-up lens 50.
  • For example, an edge in the color pattern of the indicator portion 41, the mark 44, or the like may be used as the mark.
  • Alternatively, a predetermined pattern or the like provided at a position on the close-up lens 50 that is included in the shooting range of the camera module 26 may be used.
  • the image processing unit 16 may determine whether or not the boundary 83 is included in a predetermined area on the acquired image. When the image processing unit 16 determines that the boundary 83 is not included in the predetermined region on the acquired image, the image processing unit 16 may notify that the close-up lens 50 is mounted in the wrong position.
  • an area 84 shown in FIG. 25 is an example of the predetermined area. The predetermined area may be arbitrarily set.
  • For example, the control unit 21 may extract the boundary 83 from the image acquired by the camera module 26 by edge extraction or the like, and may determine whether or not the extracted boundary 83 is included in the region 84.
  • When the control unit 21 determines that the boundary 83 is not included in the region 84, it may operate the speaker included in the output device 25 to output a sound notifying the user that the close-up lens 50 is mounted in an incorrect position. In addition, an image informing the user that the close-up lens 50 is mounted in an incorrect position may be displayed on the display 27.
  • Further, the control unit 21 may control the output device 25 to perform audio output or screen output instructing the user on the mounting position of the close-up lens 50, thereby reducing the difficulty of adjusting the mounting position of the close-up lens 50.
  • When the control unit 21 determines that the boundary 83 is included in the region 84, it may operate the speaker included in the output device 25 to output a sound notifying the user that the position of the close-up lens 50 is correct. Further, a screen informing the user that the close-up lens 50 is mounted correctly may be displayed on the display 27. Thus, the user can recognize that the close-up lens 50 is mounted correctly.
  • the process related to determining whether or not the boundary 83 is included in the region 84 may be executed in an analysis process in step 200 described later. Further, the server 3 may execute a process related to a determination as to whether or not the boundary 83 is included in the region 84. For example, the user terminal 2 may transmit the acquired image to the server 3. If the server 3 determines that the boundary 83 is not included in the region 84, the server 3 may notify the user terminal 2 that the close-up lens 50 is mounted incorrectly. The user terminal 2 may execute the above-described audio output, screen output, and the like by receiving the notification.
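  • As a sketch of such a determination (assuming the boundary 83 is circular and modeling region 84 as a tolerance disc around an expected center, which is only one of the settings the patent allows), the boundary could be located with a Hough circle transform and tested against region 84:

```python
import cv2
import numpy as np

def lens_position_ok(image_bgr, expected_center, tolerance_px):
    """Return True if the boundary 83 (the circular edge of opening 43)
    lies within region 84, modeled here as a disc of radius tolerance_px
    around expected_center (an (x, y) pixel coordinate)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=gray.shape[0],   # expect one circle
                               param1=100, param2=40)
    if circles is None:
        return False   # boundary not found: treat as incorrectly mounted
    cx, cy, _r = circles[0][0]
    return np.hypot(cx - expected_center[0],
                    cy - expected_center[1]) <= tolerance_px
```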
  • While the camera module 26 of the user terminal 2 captures the subject, the display control unit 18 may output to the display 27 the image acquired by the camera module 26, including the mark (boundary 83) indicating the position where the close-up lens 50 is mounted on the camera module 26, and may output, on the output image, guidance information indicating a reference for the mounting position of the close-up lens 50 relative to the mark (boundary 83).
  • FIG. 26 illustrates a screen display of the display 27 according to the present embodiment.
  • the control unit 21 outputs images continuously acquired by the camera module 26 during shooting to the display 27.
  • the control unit 21 outputs a guide line 91 to the display 27 as guide information indicating the reference of the mounting position of the close-up lens 50 with respect to the boundary 83 on the output image.
  • Thereby, a guide line 91 indicating the reference position of the boundary 83 is displayed together with the image, as shown in FIG. 26.
  • the user can mount the close-up lens 50 at the correct mounting position by determining the mounting position of the close-up lens 50 so that the imaged boundary 83 is aligned with the guide line 91. That is, by displaying such a guide line 91, the difficulty associated with the adjustment of the mounting position of the close-up lens 50 can be reduced.
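  • A sketch of how such a guide line 91 could be rendered on the live preview (the color, thickness, and circular shape are assumptions matching the circular boundary 83 of this embodiment):

```python
import cv2

def draw_guide_line(frame_bgr, center, radius):
    """Overlay a circular guide line 91 on a preview frame; the user aligns
    the imaged boundary 83 with this circle. center (x, y) and radius give
    the expected position of boundary 83 when the close-up lens 50 is
    mounted correctly."""
    out = frame_bgr.copy()
    cv2.circle(out, center, radius, color=(0, 255, 0), thickness=2)
    return out
```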
  • the shape of the guide line 91 that is guide information indicating the reference of the mounting position of the close-up lens 50 corresponds to the shape of the boundary 83 that is a mark indicating the position where the close-up lens 50 is mounted.
  • the guide line 91 is not limited to a shape corresponding to the shape of the boundary 83. The shape of the guide line 91 may be selected as appropriate.
  • Note that the guidance information according to the present embodiment is not limited to a sign, such as the guide line 91, indicating a position to be aligned with the mark (boundary 83) indicating the position where the close-up lens 50 is mounted.
  • the guidance information may be a sign indicating the direction in which the close-up lens 50 is moved, or may be a character display indicating the mounting position of the close-up lens 50. The guide information is appropriately set according to the embodiment.
  • the close-up lens 50 is a lens used for magnified photographing, a small shift in the mounting position has a great influence on the photographed image.
  • When the mounting position of the close-up lens 50 is not fixed with respect to the camera module 26, the user must adjust the mounting position of the close-up lens 50 so that such a small shift does not occur, which has been a source of difficulty.
  • the guide information reduces the difficulty in such a situation by indicating a reference of the mounting position of the close-up lens 50 with respect to a mark indicating the position where the close-up lens 50 is mounted.
  • an image taken by an apparatus included in the information processing system 1 is acquired.
  • the information processing system 1 may acquire an image taken by another device that is not included in the information processing system 1.
  • the camera that captures an image is not limited to the camera module 26 of the user terminal 2.
  • the information processing system 1 may acquire an image from another device.
  • the information processing system 1 may acquire an image in which the mark 44 is photographed together with the subject from another information processing apparatus connected to the network 5.
  • The image to be processed by the information processing system 1 may be an image continuously acquired during shooting by a camera such as the camera module 26 of the user terminal 2, or may be an image obtained when the shutter of the camera is released.
  • In step 200, the image processing unit 16 executes analysis processing and correction processing.
  • A specific example of the analysis processing and the correction processing is shown in FIG. 27. Below, an example in which the server 3 executes these processes is described. However, these processes may be executed in either the user terminal 2 or the server 3. Further, the user terminal 2 and the server 3 may share and execute these processes. When these processes are shared between the user terminal 2 and the server 3, the user terminal 2 and the server 3 exchange the data to be processed by data communication via the network 5 and switch the processing subject.
  • FIG. 27 is a flowchart illustrating an analysis process and a correction process by the information processing system 1 according to this embodiment.
  • First, the server 3 receives the image acquired in step 100 and transmitted from the user terminal 2. Then, the server 3 executes the analysis processing and the correction processing on the received image. Specifically, the processing is executed as follows.
  • step 201 the control unit 31 extracts the index unit 41 from the image acquired in step 100.
  • the control unit 31 extracts a region 82 in which the index unit 41 is shown from the image 80 shown in FIG. Note that this processing corresponds to an example of image analysis processing according to the present invention.
  • the indicator unit 41 has a color pattern in which white and black are alternately arranged.
  • the control unit 31 estimates a region with a color approximated to white or black as a region 82 in which the index unit 41 is reflected, and extracts the region.
• Specifically, the control unit 31 refers to the RGB value of each pixel of the image 80. If the difference between the referenced RGB value and the RGB value indicating white (255, 255, 255) is within a predetermined threshold, the control unit 31 determines that the pixel is included in the region where a white area of the index unit 41 is captured. Likewise, if the difference between the referenced RGB value and the RGB value indicating black (0, 0, 0) is within a predetermined threshold, the control unit 31 determines that the pixel is included in the region where a black area of the index unit 41 is captured.
• Otherwise, the control unit 31 determines that the pixel is not included in the region where the index unit 41 is captured.
• By making these determinations for each pixel of the image 80, the control unit 31 estimates and extracts the region 82 in which the index unit 41 is captured, as sketched below.
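• The following is a minimal Python sketch of this per-pixel classification; the threshold value is an assumption for illustration, not one given in this disclosure.

```python
# Minimal sketch of the region-82 extraction: classify each pixel by its
# distance to pure white or pure black. The threshold (60.0) is an assumed,
# illustrative value.
import numpy as np

def extract_index_region(image: np.ndarray, threshold: float = 60.0) -> np.ndarray:
    """image: H x W x 3 RGB array with values in [0, 255].
    Returns a boolean H x W mask, True where the pixel is presumed to
    belong to the region in which the index unit 41 is captured."""
    pixels = image.astype(float)
    dist_to_white = np.linalg.norm(pixels - np.array([255.0, 255.0, 255.0]), axis=2)
    dist_to_black = np.linalg.norm(pixels - np.array([0.0, 0.0, 0.0]), axis=2)
    return (dist_to_white <= threshold) | (dist_to_black <= threshold)
```

• The connected pixels of the resulting mask can then be grouped into the region 82; comparing each RGB channel separately, as the description above suggests, is an equally valid realization.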
• In this determination, a pixel in the region in which the subject is captured (region 81) may be erroneously determined to be a pixel in the region in which the index portion 41 is captured (region 82).
• To avoid this, the control unit 31 may store the shape of the opening 43 or the shape of the index unit 41 in advance in the auxiliary storage device 32 or the like. Then, by referring to the stored information such as the shape of the opening 43 and estimating the position of the region 81 or the region 82, the control unit 31 may avoid erroneously determining a pixel of the region 81 to be a pixel of the region 82.
• Alternatively, the control unit 31 may extract the boundary 83 and determine in advance the position of the boundary between the region 81 and the region 82, thereby avoiding the erroneous determination of a pixel of the region 81 as a pixel of the region 82.
• The control unit 31 may also specify the position of the boundary 83 from the extracted region 82, and may determine whether the mounting position of the close-up lens 50 mentioned above is correct. For example, the control unit 31 may determine whether the region 84 includes the specified boundary 83. If the control unit 31 determines that the boundary 83 is not included in the region 84, it may notify the user terminal 2 of a message indicating that the close-up lens 50 is mounted in the wrong position. On the other hand, if it determines that the boundary 83 is included in the region 84, it may notify the user terminal 2 of a message indicating that the close-up lens 50 is mounted correctly.
• In this way, the control unit 31 tries to extract, from the image 80, the region 82 in which the index unit 41 is captured.
• However, the method of extracting the region 82 is not limited to the method described here, and may be selected as appropriate according to the embodiment.
• If the region 82 cannot be extracted from the image 80 ("NO" in step 202), the control unit 31 ends the analysis and correction processing. On the other hand, if the region 82 can be extracted from the image 80 ("YES" in step 202), the control unit 31 advances the processing to step 203.
• In step 203, the control unit 31 performs image blur correction. For example, the control unit 31 determines a region in which the image 80 is out of focus, and then performs defocus correction on that region.
• Specifically, the control unit 31 determines whether or not defocus has occurred using the boundaries (edges) between white and black in the color pattern of the index unit 41 captured in the region 82 extracted in step 201.
• When defocus occurs, the change (gradient) of the pixel values (RGB values) at such an edge becomes gentler. Therefore, the control unit 31 determines whether or not blur has occurred based on the change of the pixel values (RGB values) at the edges.
• The control unit 31 then corrects the defocus by sharpening the image of the region determined to be out of focus, as sketched below.
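• A minimal sketch of this defocus check and correction, assuming grayscale input and an unsharp-mask style sharpening; the gradient threshold and filter parameters are illustrative assumptions.

```python
# Sketch: decide whether the pattern edges are blurred by measuring the
# intensity gradient there, and sharpen with an unsharp mask if they are.
# min_gradient, sigma, and amount are assumed, illustrative parameters.
import numpy as np
from scipy import ndimage

def is_defocused(gray: np.ndarray, edge_mask: np.ndarray,
                 min_gradient: float = 80.0) -> bool:
    """gray: H x W grayscale image; edge_mask: True at pattern-edge pixels."""
    gy, gx = np.gradient(gray.astype(float))
    magnitude = np.hypot(gx, gy)
    # A sharp white/black edge changes steeply; a gentle change means blur.
    return float(magnitude[edge_mask].mean()) < min_gradient

def sharpen(gray: np.ndarray, sigma: float = 2.0, amount: float = 1.5) -> np.ndarray:
    blurred = ndimage.gaussian_filter(gray.astype(float), sigma)
    return np.clip(gray + amount * (gray - blurred), 0, 255)
```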
• However, the method of correcting the defocus of the image is not limited to such a method, and may be selected as appropriate according to the embodiment.
• In step 204, the control unit 31 corrects image color and illumination unevenness. For example, an image having colors different from the actual colors may be obtained depending on the function and performance of the camera. In addition, under the influence of the illumination light and the light source, the illumination may be uneven and an image having colors different from the actual colors may be obtained.
• Therefore, the control unit 31 analyzes the influence of the shooting environment and corrects the image so as to reduce that influence.
• The process of analyzing the influence of the shooting environment corresponds to an example of the image analysis processing of the present invention, and the process of correcting the image so as to reduce that influence corresponds to an example of the correction processing of the present invention.
• As described above, the index unit 41 shown in the region 82 of the image 80 has a color pattern in which white and black are alternately arranged. The pixels included in a white pattern region should therefore have the same RGB value, and the pixels included in a black pattern region should likewise have the same RGB value.
• When the image is affected by the shooting environment, however, the RGB value of each pixel deviates from the actual color according to that influence, so variations in RGB values occur in regions that should have uniform RGB values. The control unit 31 performs color correction and illumination unevenness correction by correcting the RGB values so as to eliminate this variation.
• Specifically, for each of the region containing the white pattern and the region containing the black pattern, the control unit 31 takes the actual color of the area as a reference (reference color) and measures the difference (shift) between the RGB value of each pixel included in the area and the RGB value of the reference color.
• For example, in the region where the white pattern is captured, the control unit 31 obtains the difference between the RGB value of each pixel included in the region and the RGB value indicating white, thereby measuring the difference between the color of each pixel and the reference color. The control unit 31 thus obtains a two-dimensional distribution indicating the difference between the color of each pixel and the reference color in the region where the white pattern appears. The control unit 31 measures the region where the black pattern is captured in the same way and obtains a corresponding two-dimensional distribution.
• Next, from the waveform of the two-dimensional distribution obtained for the pixels included in the region 82, the control unit 31 estimates a two-dimensional distribution of the difference between the color of the pixels included in the region 81 and the reference color. For example, by filling the blank area (region 81) of the two-dimensional distribution created for the region 82 with a predetermined interpolation calculation, the two-dimensional distribution of the difference between the color of the pixels included in the region 81 and the reference color is estimated.
• Then, the control unit 31 corrects the RGB value of each pixel using the estimated two-dimensional distribution.
• Thereby, color correction and illumination unevenness correction are realized; a sketch of this procedure follows.
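• The following minimal sketch measures the shift from the reference color inside the white-pattern region and fills the blank subject region by interpolation; scipy's griddata is one assumed realization of the "predetermined interpolation calculation".

```python
# Sketch: per-channel illumination/color correction. Measure the deviation of
# white-pattern pixels from the reference (255), interpolate that deviation
# across the rest of the image (region 81), and subtract it. The use of
# scipy's griddata is an assumption for illustration.
import numpy as np
from scipy.interpolate import griddata

def correct_unevenness(image: np.ndarray, white_mask: np.ndarray) -> np.ndarray:
    """image: H x W x 3 RGB; white_mask: True where the white pattern appears."""
    h, w, _ = image.shape
    ys, xs = np.nonzero(white_mask)
    grid_y, grid_x = np.mgrid[0:h, 0:w]
    corrected = image.astype(float)
    for c in range(3):
        shift = image[ys, xs, c].astype(float) - 255.0   # deviation from reference
        full = griddata((ys, xs), shift, (grid_y, grid_x), method="linear")
        corrected[..., c] -= np.nan_to_num(full)         # NaN outside the hull -> 0
    return np.clip(corrected, 0, 255).astype(np.uint8)
```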
• The reference color may be set as appropriate. However, from the viewpoint of bringing the color of the subject close to the actual color, it is desirable to use the actual color of each area as the reference, as in the present embodiment.
• The correction in step 204 can bring the color of the subject shown in the image close to the actual color, which improves the accuracy of the skin analysis described later. Moreover, because the color recognition accuracy improves, the number of colors that can be used in, for example, a code system that expresses information with a two-dimensional array of colors can be increased.
• In step 205, the control unit 31 corrects image distortion.
• First, the control unit 31 determines a region of the image 80 in which distortion occurs.
• Then, the control unit 31 corrects the distortion in the determined region.
• The process of determining a region in which distortion occurs corresponds to an example of the image analysis processing of the present invention.
• The process of correcting the distortion corresponds to an example of the correction processing of the present invention.
• Specifically, the control unit 31 determines whether or not distortion has occurred using the edges in the color pattern of the index unit 41 shown in the region 82 extracted in step 201.
• The shape of an edge in the color pattern of the index unit 41 is originally a straight line. Therefore, the control unit 31 determines whether or not distortion has occurred by checking whether the deviation of the edge shape from a straight line exceeds a threshold.
• The control unit 31 corrects the distortion of a region determined to be distorted by transforming the edge whose deviation exceeds the threshold so that its shape becomes a straight line again; a sketch of the straightness test follows.
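• The following minimal sketch fits a least-squares line to the pixels of one edge and thresholds the residuals, which is one assumed way of deciding that an edge has ceased to be a straight line.

```python
# Sketch: flag distortion when the pixels of one pattern edge deviate from a
# fitted straight line by more than a threshold. The 1.5-pixel threshold is
# an assumed, illustrative value. (A near-vertical edge should be fitted with
# the axes swapped; omitted here for brevity.)
import numpy as np

def edge_is_distorted(edge_points: np.ndarray, max_deviation: float = 1.5) -> bool:
    """edge_points: N x 2 array of (x, y) coordinates along one edge."""
    x, y = edge_points[:, 0], edge_points[:, 1]
    slope, intercept = np.polyfit(x, y, deg=1)       # least-squares line fit
    residuals = np.abs(y - (slope * x + intercept))  # distance from the line
    return float(residuals.max()) > max_deviation
```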
• However, the method of correcting image distortion is not limited to such a method, and may be selected as appropriate according to the embodiment.
• In step 206, the control unit 31 extracts the mark 44 from the image 80.
• For example, the control unit 31 uses a pattern of the mark 44 stored in advance in the auxiliary storage device 32 or the like, and extracts the mark 44 from the image 80 by detecting the part of the image that matches the stored pattern (pattern matching), as sketched below.
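• The following minimal sketch uses OpenCV's normalized cross-correlation for the matching; the score threshold is an assumed, illustrative value.

```python
# Sketch: locate the mark 44 by template matching. cv2.TM_CCOEFF_NORMED gives
# a score in [-1, 1]; the 0.8 acceptance threshold is an assumption.
import cv2
import numpy as np

def find_mark(image_gray: np.ndarray, template_gray: np.ndarray,
              threshold: float = 0.8):
    """Return the top-left corner of the best match, or None if none is good."""
    scores = cv2.matchTemplate(image_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc if max_score >= threshold else None
```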
• Here, two types of marks 44, namely the marks 44a and 44b, are extracted.
• The mark 44a indicates that the close-up lens 50 to which the panel 40 is attached is a genuine product.
• The mark 44b indicates the business operator (trader A) of a product sold together with the close-up lens 50 and the panel. Based on the pattern used for the extraction, the control unit 31 recognizes the predetermined information indicated by the mark 44 attached to the panel 40.
• As a variation of the panel 40, there is a panel 40L having steps in the index portion.
• The panel 40L has a step around the opening, and includes an index portion 41K and the marks (44a, 44b) at each step.
• Because the steps differ in position along the optical axis direction, the image of one step is clearer than the image of the other step.
• By using a panel having such steps and changing the place (step) from which an image is acquired according to the scene, the information processing system 1 can obtain a clearer image including the index unit 41 and the mark 44. That is, it is possible to reduce the focus shift between the index portion 41 and the mark 44 when the subject is in focus.
• In step 207, the image processing unit 16 executes the subject analysis processing.
• For example, the control unit 31 analyzes the skin shown in the image 80 (skin analysis). The control unit 31 binarizes and thins the image of the region 81 in which the subject is captured. Then, the control unit 31 detects regularity by frequency analysis or the like, and detects texture, pores, spots, and the like in the image; the three steps are sketched below.
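• The following minimal sketch realizes binarization, thinning, and frequency analysis with scikit-image and a 2-D FFT; these particular functions are assumptions for illustration.

```python
# Sketch: binarize the subject region, thin it to a skeleton, and inspect the
# 2-D spectrum for regularity. Peaks in the spectrum hint at the periodicity
# of the skin texture; the choice of Otsu thresholding is an assumption.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def analyze_skin(gray: np.ndarray):
    """gray: H x W grayscale image of the subject region 81."""
    binary = gray > threshold_otsu(gray)   # binarization
    skeleton = skeletonize(binary)         # thinning
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(skeleton.astype(float))))
    return skeleton, spectrum
```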
• Here, skin analysis is exemplified as the subject analysis processing.
• However, the subject analysis processing is not limited to skin analysis, and may be set as appropriate according to the subject.
• As a variation of the panel 40 shown in FIG. 16, there is a panel 40H in which the indicator portions are colored stepwise.
• In this case, by comparing each color of the index unit 41G with the color of the subject, the information processing system 1 can specify the position (angle) in the index unit 41G that has the same color as the subject. The information processing system 1 can then estimate the color of the subject on the basis of the color attached at the specified position (angle) of the index unit 41G.
• This completes the image analysis processing and the correction processing, and the process proceeds to step 300.
• Note that the control unit 31 may change the order of the correction processes in steps 203 to 205, and may execute the process of step 206 before step 201.
• Processes may be omitted, replaced, or added as appropriate according to the embodiment.
• For example, the indicator portion does not exist in the panel 40G shown in FIG. 15F, so the control unit 31 may omit the processes of steps 201 to 205.
• RGB denotes red, green, and blue.
• In step 300, the image processing unit 16 determines whether the image acquisition unit 15 should acquire an image again. For example, if the index unit 41 could not be extracted in step 201, if the mark 44 could not be extracted in step 206, or if the subject (skin) analysis processing in step 207 did not succeed, the control unit 31 determines that the image acquisition unit 15 should acquire an image again. In that case, the control unit 31 returns the process to step 100 by notifying the user terminal 2 to acquire an image again. Otherwise, the control unit 31 advances the process to step 400.
• In step 400, the display control unit 18 outputs the analysis result of the subject obtained in step 207.
• Specifically, the control unit 31 of the server 3 notifies the user terminal 2 of the analysis result of the subject obtained in step 207.
• The control unit 21 of the user terminal 2 then presents the analysis result to the user by outputting the received result to the display 27.
• The storage unit 17 stores the image on which each correction process has been executed, the predetermined information indicated by the mark 44, and the analysis result of the subject in association with each other.
• For this purpose, the auxiliary storage device 32 of the server 3 includes a database for storing the acquired information.
  • FIG. 28 illustrates a database record used by the information processing system 1 according to the present embodiment for storing images. Specifically, FIG. 28 illustrates a list of records related to the user A.
• The database used to store the acquired information includes an ID field for storing an identifier that identifies each record, a shooting date/time field for storing the shooting date and time of the acquired image, a file name field for storing the file name of the acquired image, an attribute information field for storing the predetermined information indicated by the mark 44, and an analysis result field for storing the analysis result of the subject (skin).
• The control unit 31 creates a record whose fields are filled so as to associate the image on which each correction process has been executed, the predetermined information indicated by the mark 44 extracted in step 206, and the analysis result of the subject obtained in step 207, and adds the created record to the database. Thereby, the control unit 31 accumulates these items in association with each other, as sketched below.
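• The following minimal sketch uses SQLite; the table name, column names, and sample values mirror the fields listed above but are assumptions for illustration.

```python
# Sketch: one record per analyzed image, matching the fields of FIG. 28
# (ID, shooting date/time, file name, attribute information, analysis result).
# All names and sample values below are illustrative assumptions.
import sqlite3

conn = sqlite3.connect("analysis.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS records (
           id INTEGER PRIMARY KEY,
           shot_at TEXT,          -- shooting date/time of the acquired image
           file_name TEXT,        -- file name of the corrected image
           attribute_info TEXT,   -- predetermined information from mark 44
           analysis_result TEXT   -- analysis result of the subject (skin)
       )"""
)
conn.execute(
    "INSERT INTO records (shot_at, file_name, attribute_info, analysis_result) "
    "VALUES (?, ?, ?, ?)",
    ("2012-03-08 10:15", "user_a_0001.png", "genuine; trader A", "texture: fine"),
)
conn.commit()
conn.close()
```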

Abstract

The purpose of one aspect of the present invention is to reduce the impact of the performance and function of a camera, the illumination light, the light source, or the like on an image that a user has acquired using a close-up lens. An indicator according to one aspect of the present invention has indicator sections of specific colors which, when a close-up lens mounted on a camera comes into contact with or approaches a photographic subject and that subject is captured, are located at positions that come into close contact with or closely approach the subject. Said sections are at least partially included within the capture range of the camera, are thus captured together with the photographic subject, and form analysis indicators for the image captured by the camera.
PCT/JP2012/056025 2012-03-08 2012-03-08 Indicateur, et unité de bonnette d'approche WO2013132636A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/056025 WO2013132636A1 (fr) 2012-03-08 2012-03-08 Indicateur, et unité de bonnette d'approche

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/056025 WO2013132636A1 (fr) 2012-03-08 2012-03-08 Indicateur, et unité de bonnette d'approche

Publications (1)

Publication Number Publication Date
WO2013132636A1 true WO2013132636A1 (fr) 2013-09-12

Family

ID=49116150

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/056025 WO2013132636A1 (fr) 2012-03-08 2012-03-08 Indicateur, et unité de bonnette d'approche

Country Status (1)

Country Link
WO (1) WO2013132636A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026472A1 (fr) * 2017-07-31 2019-02-07 Maxell Holdings, Ltd. Conversion lens module and state measurement system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0719839A (ja) * 1993-06-30 1995-01-20 Shiseido Co Ltd Surface condition analysis system
JPH1163943A (ja) * 1997-08-19 1999-03-05 Teijin Ltd Facial shape measuring method and apparatus
JP2003162712A (ja) * 2001-08-29 2003-06-06 L'oreal Sa Method and apparatus for acquiring an image of a part of the human body


Similar Documents

Publication Publication Date Title
US20130113888A1 (en) Device, method and program for determining obstacle within imaging range during imaging for stereoscopic display
GB2533692A (en) Auto-contrast viewfinder for an indicia reader
US20150125030A1 (en) Image processing device, image processing system, image processing method, and program
EP2635019B1 (fr) Image processing method and device, and program
JP2017182038A (ja) Projection system and projection screen correction method
US8988546B2 (en) Image processing device, image processing method, image capturing device, and program
KR102134751B1 (ko) Simple sperm test kit, system, and simple sperm test method
JP2005176230A (ja) Image processing device and print system
EP3627822B1 (fr) Method and apparatus for displaying a focus region, and terminal device
JP6294012B2 (ja) Lens unit
US9554121B2 (en) 3D scanning apparatus and method using lighting based on smart phone
JP2017038162A (ja) Imaging apparatus, image processing method, program, and storage medium
JP5619124B2 (ja) Image processing device, imaging device, image processing program, and image processing method
JPWO2014007268A1 (ja) Lens unit
WO2013132636A1 (fr) Indicator and close-up lens unit
WO2013132637A1 (fr) Information processing system, information processing method, and program
JP2014183565A (ja) Lens information registration system, lens information server used in the lens information registration system, and camera body
WO2021172019A1 (fr) Image processing device and method for controlling an image processing device
CN111314608B (zh) Image focusing prompt method, computer device, and computer-readable storage medium
JP2014116789A (ja) Imaging device, control method therefor, and program
WO2020059157A1 (fr) Display system, program, display method, and head-mounted device
JP2019111004A (ja) Drawing system, drawing device, and terminal device
JP5294100B1 (ja) Lens unit for reading dot patterns, figure with the lens unit mounted on a base, card to be placed on the lens unit, information processing device, and information processing system
JP2021136684A (ja) Image processing device and control method of image processing device
US10372287B2 (en) Headset device and visual feedback method and apparatus thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12870344

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12870344

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP