WO2014103045A1 - Image processing device, image processing method, image processing program, and computer-readable recording medium recording the program - Google Patents
- Publication number
- WO2014103045A1 (PCT/JP2012/084169)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- item
- image
- area
- model image
- region
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
Definitions
- the present invention relates to an image processing apparatus, an image processing method, an image processing program, and a computer-readable recording medium for recording the program.
- There is a known technique for superimposing an image of an item worn by a person, such as an accessory, on an image of that person (see, for example, Non-Patent Document 1). In this technique, the image of the selected accessory is displayed superimposed on the user's image.
- Images of items posted on electronic commerce sites that sell accessories and the like are often taken with the item placed on a horizontal base. For items worn on a part of the body, such as pierced earrings, clip-on earrings, and necklaces, the direction in which gravity acts differs between when the item lies on a horizontal surface and when it is worn on the human body, so the parts constituting the item hang differently.
- Moreover, an image of an item posted on an electronic commerce site or the like may be photographed with the entire item tilted for compositional balance. If such an item image is superimposed on a model image of a user or the like, the result looks unnatural and gives the user a feeling of discomfort.
- Accordingly, an object of the present invention is, when an image of an item to be worn by a model is displayed superimposed on an image of the model, to bring the displayed image closer to the state in which the item is actually worn on the model.
- An image processing apparatus according to one aspect includes: an item image acquisition unit that acquires an item image displaying an item, which is an accessory worn hanging from the body; a model image acquisition unit that acquires a model image representing the part on which the item is worn; a correction unit that corrects the inclination of a correction target region, which is at least a part of the item region in which the item is represented in the item image, so that the direction from the fulcrum position of the correction target region to its center-of-gravity position follows the direction of gravity in the model image; and an output unit that outputs the model image and the item region so that the item region is displayed superimposed on the model image.
- An image processing method according to one aspect is an image processing method executed by a computer, including: an item image acquisition step of acquiring an item image displaying an item, which is an accessory worn hanging from the body; a model image acquisition step of acquiring a model image representing the part on which the item is worn; a correction step of correcting the inclination of a correction target region, which is at least a part of the item region in which the item is represented in the item image, so that the direction from the fulcrum position of the correction target region to its center-of-gravity position follows the direction of gravity in the model image; and an output step of outputting the model image and the item region so that the item region is displayed superimposed on the model image.
- An image processing program according to one aspect causes a computer to realize: an item image acquisition function of acquiring an item image displaying an item, which is an accessory worn hanging from the body; a model image acquisition function of acquiring a model image representing the part on which the item is worn; a correction function of correcting the inclination of a correction target region, which is at least a part of the item region in which the item is represented in the item image, so that the direction from the fulcrum position of the correction target region to its center-of-gravity position follows the direction of gravity in the model image; and an output function of outputting the model image and the item region so that the item region is displayed superimposed on the model image.
- A computer-readable recording medium according to one aspect is a computer-readable recording medium storing an image processing program that causes a computer to realize the same item image acquisition, model image acquisition, correction, and output functions as described above.
- According to these aspects, the inclination of the correction target region in the item region is corrected so that the direction from the fulcrum position of the correction target region to its center of gravity follows the direction of gravity in the model image. Since the corrected item region is output superimposed on the model image, an image closer to the state in which the item is actually worn can be displayed. The displayed image therefore looks natural and causes less discomfort for the user.
- In one aspect, the correction unit corrects the inclination with the entire item region as the correction target region, and the fulcrum position is the position at which the mounting portion of the item is represented in the item region. According to this aspect, when the item consists of a single rigid body, the inclination of the item region is corrected appropriately.
- In another aspect, in an item region displaying an item in which one part and another part are connected to each other via a connecting portion, the correction target region represents at least the other part, and the fulcrum position is the connection end on the one-part side of the correction target region. According to this aspect, the region in which each hanging part is represented is specified as a correction target region, and the fulcrum position is set appropriately.
- In another aspect, the correction unit specifies, as a connecting portion, a region whose length occupied by the item along a predetermined direction in the item image is shorter, by at least a predetermined difference, than that of the neighboring regions within a predetermined range, and specifies the neighboring regions as regions in which parts are represented. In other words, a constricted portion in the item image is specified as a connecting portion, and the regions adjacent to the connecting portion are specified as regions in which parts are represented. The connecting portions in the item image are thus specified appropriately.
- In another aspect, the image processing apparatus further includes a specifying unit that specifies the direction of gravity in the model image based on information obtainable from the model image, and the correction unit corrects the inclination of the correction target region based on the gravity direction specified by the specifying unit.
- the direction of gravity in the model image is appropriately specified based on information acquired from the model image, it is possible to provide a natural composite image that is closer to the state actually worn on the human body.
- In another aspect, when an item image whose vertical direction is specified is acquired by the item image acquisition unit, the correction unit specifies the uppermost portion of the item region as the mounting portion. According to this aspect, the mounting portion in the item region is specified appropriately and easily.
- In another aspect, based on the size of the part on which the item is mounted, obtained from the model image; preset information on the actual size of that part in a human body; the size of the item region, obtained from the item image; and the actual size of the item stored in advance, the correction unit adjusts the size of the item region so that, in the composite image, the ratio between the size of the mounted part and the size of the item region is approximately equal to the ratio between the actual size of the part and the actual size of the item. The superimposed item region is thus adjusted to an appropriate size relative to the model image, and a natural composite image causing little discomfort for the user is provided.
- In another aspect, the output unit specifies the part on which the item is mounted in the model image by searching the model image with a template representing the features of that part, and superimposes the item region on the model image so that the position at which the mounting portion of the item is represented coincides with the part on which the item is mounted in the model image. Since the mounted part in the model image is specified with high accuracy, the item region is superimposed at an appropriate position.
- According to the present invention, when an image of an item to be worn by a model is displayed superimposed on an image of the model, the displayed image can be brought closer to the state in which the item is actually worn on the model.
- FIG. 4A is a diagram illustrating an example of an item image.
- FIG. 4B is a diagram illustrating a specific example of the correction target region.
- FIG. 5 is a diagram showing an example of the configuration of the item information storage unit and the stored data.
- FIG. 6A is a diagram illustrating a specific example of the mounting portion.
- FIG. 6B is a diagram showing another example of specifying the mounting portion.
- FIG. 7 is a diagram showing an example of the correction process for the inclination of the correction target region.
- FIGS. 8A to 8D are diagrams showing preprocessing prior to matching processing using a template.
- FIG. 8E is a diagram showing a template representing the features of a piercing hole.
- FIG. 9 is a diagram showing an example of the composite image output by the output unit.
- FIG. 10A is a flowchart illustrating an example of processing contents of an image processing method in the image processing apparatus.
- FIG. 10B is a flowchart showing the correction process in step S4 of the flowchart of FIG. 10A.
- FIG. 11A is a diagram showing an example of a pierced earring composed of a plurality of parts.
- FIG. 11B is a diagram illustrating an image of an item that has been divided into a plurality of regions by performing the reduction / expansion process.
- FIG. 15A is a flowchart in the case of sequentially correcting inclinations of a plurality of parts included in an item.
- FIG. 15B is a flowchart for the case of correcting the inclinations of a plurality of parts included in an item in parallel.
- A further diagram shows the configuration of the image processing program.
- FIG. 1 is a block diagram showing a functional configuration of the image processing apparatus 1 according to the present embodiment.
- the image processing apparatus 1 is an apparatus that synthesizes an image of an item such as an accessory with a model image and outputs it.
- The item is, for example, an accessory worn hanging from the body, and includes pierced earrings, clip-on earrings, necklace charms, and the like.
- the image processing apparatus 1 of the present embodiment can be applied to an electronic commerce site that sells items such as accessories.
- The image processing apparatus 1 functionally includes an item image acquisition unit 11 (item image acquisition means), a model image acquisition unit 12 (model image acquisition means), a specifying unit 13 (specifying means), a correction unit 14 (correction means), a superimposing unit 15 (output means), and an output unit 16 (output means).
- Each of the functional units 11 to 16 of the image processing apparatus 1 can access storage means such as the item image storage unit 21, the model image storage unit 22, and the item information storage unit 23.
- the image processing apparatus 1 can be configured as a server that can communicate with a user terminal via a network. Further, the image processing apparatus 1 may be configured as a device such as a smartphone or a personal computer.
- FIG. 2 is a hardware configuration diagram of the image processing apparatus 1.
- Physically, the image processing apparatus 1 is configured as a computer system that includes a CPU 101, a main storage device 102 composed of memories such as RAM and ROM, an auxiliary storage device 103 composed of a hard disk or the like, a communication control device 104 such as a network card, an input device 105 such as a keyboard and mouse, and an output device 106 such as a display.
- the image processing apparatus 1 may not include the input device 105 and the output device 106.
- Each function shown in FIG. 1 is realized by reading predetermined computer software (the image processing program) onto hardware such as the CPU 101 and the main storage device 102 shown in FIG. 2, thereby operating the communication control device 104, the input device 105, and the output device 106 under the control of the CPU 101 and reading and writing data in the main storage device 102 and the auxiliary storage device 103. The data and databases necessary for processing are stored in the main storage device 102 and the auxiliary storage device 103.
- the item image storage unit 21 is a storage unit that stores item images.
- the item image is, for example, an image representing an accessory sold on an electronic commerce site.
- the item image storage unit 21 may store item images in advance.
- the model image storage unit 22 is a storage unit that stores a model image including a portion where an item is attached.
- the model image may be stored in advance or may be uploaded by the user.
- the model image represents, for example, a part of a person wearing the accessory.
- FIG. 3 is a diagram illustrating an example of a screen D displayed when the image processing apparatus 1 is applied to an electronic commerce site, for example.
- When the image processing apparatus 1 is configured as a server, as illustrated in FIG. 3, it outputs a plurality of item images PI acquired from the item image storage unit 21 and a model image acquired from the model image storage unit 22 to a user terminal with which it can communicate via a network, and these are displayed on the screen D of the user terminal.
- the item image acquisition unit 11 is a part that acquires an item image for displaying an item.
- An item is an accessory or the like that is worn hanging from the body, attached to a part of the model. An item such as an accessory includes a mounting portion, which is the part attached to the model, and a main body (one or more parts).
- the item image acquisition unit 11 acquires an item image from the item image storage unit 21.
- the model image acquisition unit 12 is a part that acquires a model image in which a part to which an item is attached is represented.
- the model image acquisition unit 12 acquires a model image from the model image storage unit 22.
- the model image may be an image stored in advance, or may be an image uploaded by the user and taken of the user's face or the like.
- the model image may be an image representing a user's face and the like acquired in real time.
- The specifying unit 13 is a part that specifies the direction of gravity in the model image acquired by the model image acquisition unit 12. For example, when the vertical direction of the model image is specified, the specifying unit 13 can take the downward direction as the direction of gravity. Further, when orientation information acquired by an acceleration sensor or the like of the photographing apparatus at the time of photographing is attached to the model image, the specifying unit 13 can specify the direction of gravity in the model image based on that orientation information.
- The specifying unit 13 may also specify the direction of gravity by analyzing the model image. For example, the positions of a plurality of parts of the head (for example, eyes, ears, nose, and mouth) in the model image are specified by a known method, and the direction of gravity is specified based on their relative positional relationship. As a more specific example, the positions of the eyes and the mouth are first specified, and the direction obtained by rotating the eye-to-mouth direction by a predetermined angle is taken as the direction of gravity. The predetermined angle is the angle between the eye-to-mouth direction in a typical profile and the vertical direction of the head.
- Alternatively, a region representing an object that generally hangs along the direction of gravity may be identified by image processing, the extending direction of the object in that region may be determined by image processing, and that direction may be specified as the direction of gravity.
- the specifying unit 13 is not an essential component in the image processing apparatus 1 of the present embodiment.
- a functional unit such as the correction unit 14 may regard a predetermined direction such as a downward direction in the model image as the gravity direction.
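The eye/mouth method described above can be sketched as follows. This is a minimal sketch, not the patented implementation: `gravity_direction` is a hypothetical name, and the 15-degree default for the "predetermined angle" is an assumed illustrative value.

```python
import numpy as np

def gravity_direction(eye_xy, mouth_xy, offset_deg=15.0):
    """Return a unit vector: the eye-to-mouth direction rotated by the
    predetermined angle (offset_deg, a hypothetical value here)."""
    v = np.asarray(mouth_xy, dtype=float) - np.asarray(eye_xy, dtype=float)
    v /= np.linalg.norm(v)                      # normalise eye->mouth direction
    t = np.deg2rad(offset_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],    # 2-D rotation matrix
                    [np.sin(t),  np.cos(t)]])
    return rot @ v
```

With a zero offset the gravity direction is simply the eye-to-mouth direction itself; the offset compensates for the head pose in a typical profile shot.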
- The correction unit 14 is a part that corrects the inclination of a correction target region, which is at least a part of the item region in which the item is represented in the item image, so that the direction from the fulcrum position of the correction target region to its center-of-gravity position follows the direction of gravity in the model image. Specifically, as preprocessing for correcting the inclination, the correction unit 14 first extracts the item region in which the item is represented from the item image, and then specifies the correction target region within the item region.
- the correction unit 14 specifies an item region in the item image acquired by the item image acquisition unit 11 using a known image processing technique.
- FIG. 4A is a diagram illustrating an example of an item image.
- As shown in FIG. 4A, the correction unit 14 specifies the item region (foreground region) in which the item I1 is represented from the item image P1. If two item regions are identified here, the correction unit 14 recognizes that the left and right items of a pair are represented in the item image P1, and specifies either one of the regions as the correction target region. Then, as shown in the item image P2 of FIG. 4B, if the item is composed of a single rigid body (part), the correction unit 14 specifies the entire item region I2 as the correction target region. Whether an item is composed of one part or of a plurality of mutually connected parts can be determined, for example, using a well-known image processing technique such as reduction/expansion processing. This determination process will be described later.
- the correction unit 14 adjusts the size of the item area to be superimposed on the model image.
- This is because, in general, the scale relative to actual size differs between the item image and the model image.
- Specifically, based on the size of the part on which the item is mounted in the model image, preset information on the actual size of that part in the model, the size of the item region in the item image, and information on the actual size of the item stored in advance, the correction unit 14 adjusts the size of the item region so that the ratio between the size of the mounted part and the size of the item region in the model image is approximately equal to the ratio between the actual size of the mounted part and the actual size of the item.
- For example, when the part on which the item is mounted in the model image is an ear, the number of vertical pixels of the rectangle bounding the region in which the ear appears is acquired as the size of the mounted part in the model image. The region in which the ear appears can be specified by known image processing. As preset information on the actual size of the part in the model, information such as "height: 5 cm" is stored, for example.
- FIG. 5 is a diagram illustrating a configuration of the item information storage unit 23 and an example of stored data.
- the item information storage unit 23 stores vertical and horizontal sizes in association with the product ID.
- the product ID is an item identifier.
- the correction unit 14 adjusts the size of the item area using information on either or both of the vertical and horizontal sizes.
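The size-adjustment step above reduces to matching pixel-to-centimetre scales. A minimal sketch, assuming hypothetical names and example numbers (the 5 cm ear height follows the text; the 4 cm item and the pixel counts are invented for illustration):

```python
def item_scale_factor(part_px, part_cm, item_px, item_cm):
    """Scale to apply to the item region so that the pixel ratio between
    the mounted part and the item equals their real-size ratio."""
    target_item_px = part_px * (item_cm / part_cm)  # desired item size in pixels
    return target_item_px / item_px

# Example: an ear 100 px tall that is 5 cm in reality gives 20 px/cm, so a
# 4 cm earring currently rendered at 60 px must be scaled up to 80 px:
assert abs(item_scale_factor(100, 5.0, 60, 4.0) - 80.0 / 60.0) < 1e-9
```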
- the correction unit 14 identifies the mounting unit in the item image.
- The mounting portion is the part of the item to be attached to the model. Specifically, for example, when an item image whose vertical direction is specified is acquired by the item image acquisition unit 11, the correction unit 14 specifies the uppermost portion of the item region of the item image as the mounting portion.
- FIG. 6A is a diagram illustrating a specific example of the mounting portion.
- In the item image P3 shown in FIG. 6A, it is assumed that the vertical direction is specified as an attribute. The item image P3 represents an image of the item I3. The correction unit 14 determines the uppermost portion A3 of the image of the item I3 to be the mounting portion. Specifically, the correction unit 14 identifies the pixel having the maximum y-coordinate value in the image of the item I3 as the mounting portion.
- When the item represented in the item image is a stud-type pierced earring, the correction unit 14 may specify a circular portion of a predetermined diameter (for example, 1.5 mm in actual size) that includes the pixel having the maximum y-coordinate value in the item image, and specify the center of that portion as the mounting portion. Information on whether the item is a stud-type pierced earring may be acquired from text information about the product stored in association with the item image, or from the item information storage unit 23.
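Finding the uppermost point of the item region can be sketched in a few lines. This is an assumption-laden sketch: `mounting_point` is a hypothetical name, and the code uses row/column image coordinates, where the "maximum y coordinate" of the text (a y-up convention) corresponds to the minimum row index.

```python
import numpy as np

def mounting_point(mask):
    """Return (row, col) of the topmost foreground pixel of a binary item mask."""
    rows, cols = np.nonzero(mask)   # coordinates of all item pixels
    i = np.argmin(rows)             # smallest row index = uppermost pixel
    return int(rows[i]), int(cols[i])
```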
- FIG. 6B is a diagram showing another example of specifying the mounting portion.
- The item image P4 shown in FIG. 6B represents an image of the item I4. The item I4 is a pierced earring having a hook portion F, which passes through the piercing hole, and a plurality of parts B41 and B42. The correction unit 14 determines the uppermost portion A4 of the image of the item I4 to be the mounting portion. Alternatively, the correction unit 14 may extract the hook portion F by a known image processing technique such as pattern matching and specify the point of maximum curvature along the curved hook portion F as the mounting portion.
- The correction unit 14 then specifies the identified mounting portion as at least one fulcrum position for when the item is hung from the mounting position on the model.
- Next, the correction unit 14 specifies the center-of-gravity position of the correction target region of the item region. Specifically, for example, assuming that mass is uniformly distributed over the correction target region of the item image, the correction unit 14 specifies the center-of-gravity position of the correction target region from its shape using known image processing and analysis techniques.
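Under the uniform-mass assumption stated above, the centre of gravity of a binary region is just the mean of its foreground pixel coordinates. A minimal sketch (the function name is hypothetical):

```python
import numpy as np

def center_of_gravity(mask):
    """Return the (row, col) centroid of a binary correction-target region,
    assuming mass is uniformly distributed over the region."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```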
- FIG. 7 is a diagram illustrating an example of the correction process for the inclination of the correction target region when the entire item region is the correction target region.
- As shown in FIG. 7A, when the fulcrum position A5 and the center-of-gravity position G5 have been identified and the direction of gravity in the model image is the direction indicated by the arrow DG, the correction unit 14 corrects the inclination of the correction target region, as shown in FIG. 7B, so that the direction from the fulcrum position A5 to the center-of-gravity position G5 follows the gravity direction DG in the model image.
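The tilt correction of FIG. 7 amounts to rotating the region about the fulcrum by the angle between the fulcrum-to-centroid direction and the gravity direction. A sketch under assumed names, using generic 2-D (x, y) coordinates:

```python
import numpy as np

def correction_angle(fulcrum, cog, gravity):
    """Signed angle (radians) that aligns fulcrum->centroid with gravity."""
    v = np.subtract(cog, fulcrum)
    return np.arctan2(gravity[1], gravity[0]) - np.arctan2(v[1], v[0])

def rotate_about(points, fulcrum, angle):
    """Rotate an (N, 2) array of points about the fulcrum by the angle."""
    c, s = np.cos(angle), np.sin(angle)
    rot = np.array([[c, -s], [s, c]])
    p = np.asarray(points, dtype=float) - fulcrum
    return p @ rot.T + fulcrum
```

Applying `rotate_about` to every pixel coordinate of the correction target region (or, in practice, warping the region image by the same angle) yields the corrected region of FIG. 7B.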
- the superimposing unit 15 superimposes the item region whose correction target region is corrected by the correcting unit 14 on the model image.
- Specifically, the superimposing unit 15 can align and superimpose the item region on the model image so that the mounting portion in the item region is positioned on the part of the model image where the item is mounted.
- Prior to superimposing the item region, the superimposing unit 15 specifies the part on which the item is mounted in the model image.
- the superimposing unit 15 can specify a part on which the item is attached in the model image by searching (matching) the model image using a template indicating the feature of the part on which the item is attached.
- FIG. 8 is a diagram for explaining an example of processing for specifying a part to which an item is attached.
- FIGS. 8A to 8D show preprocessing prior to matching processing using a template.
- the superimposing unit 15 extracts an image of a part to which an item is attached from the model image. In the example of FIG. 8A, an image of an ear part where a piercing is worn is extracted. Subsequently, as illustrated in FIG. 8B, the superimposing unit 15 performs edge extraction processing on the image of the ear portion by a known image processing technique.
- Further, the superimposing unit 15 performs a dilation process, a well-known image processing technique, as shown in FIG. 8C, and an erosion process, as shown in FIG. 8D, to make the piercing hole apparent. The dilation and erosion processes are repeated according to the characteristics of the image.
- Then, the superimposing unit 15 specifies the part where the pierced earring is worn by performing matching processing on the model image in which the piercing hole has been made apparent, using the template representing the features of a piercing hole shown in FIG. 8E.
- the superimposing unit 15 may specify a preset relative position as a position where the item is worn in the extracted image of the ear part as illustrated in FIG. 8A, for example.
- In this case, the superimposing unit 15 can specify, for example, the position located 10% of the ear-image height from the lower end of the extracted ear image and 30% of the ear-image width from its left end as the position where the item is worn.
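The matching step can be sketched as brute-force template matching; this is an illustrative stand-in, not the patented implementation (a practical system would use an optimized routine such as OpenCV's `matchTemplate`), and it minimizes the sum of squared differences:

```python
import numpy as np

def match_template(img, tmpl):
    """Slide tmpl over img and return the (row, col) of the best match,
    scored by sum of squared differences (lower is better)."""
    H, W = img.shape
    h, w = tmpl.shape
    best, pos = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = np.sum((img[r:r + h, c:c + w] - tmpl) ** 2)
            if ssd < best:
                best, pos = ssd, (r, c)
    return pos
```

The returned position would correspond to the piercing hole located by the template of FIG. 8E.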
- The superimposing unit 15 then superimposes the item region, whose correction target region has had its inclination corrected, on the model image so that the mounting portion of the item in the item region is positioned at the position where the item is worn in the model image, specified as described above.
- the output unit 16 is a part that outputs a model image and an item area so that the item area is superimposed on the model image. Specifically, the output unit 16 outputs a model image in which the item region is superimposed by the superimposing unit 15. For example, the output unit 16 outputs an image to the user terminal.
- the output unit 16 may output an image on a display.
- FIG. 9 is a diagram illustrating an example of an image output by the output unit 16. As shown in FIG. 9, the inclination of the correction target region in the item region is corrected so that the direction from the fulcrum position of the correction target region to its center of gravity follows the direction of gravity in the model image, and the item region is aligned with the mounting position and superimposed on the model image, so that an image closer to the state of the item actually being worn on the model is output.
- FIG. 10A is a flowchart showing an example of processing contents of the image processing method in the image processing apparatus 1 shown in FIG.
- FIG. 10B is a flowchart showing the correction process in step S4 of the flowchart of FIG. 10A, and shows an example of the process when the item is an accessory made of one rigid body.
- the item image acquisition unit 11 acquires an item image from the item image storage unit 21.
- the model image acquisition unit 12 acquires a model image from the model image storage unit 22 (S1).
- the specifying unit 13 may specify the direction of gravity in the model image acquired by the model image acquiring unit 12.
- the correction unit 14 extracts an item area from the item image (S2). Subsequently, the correction unit 14 performs, as preprocessing, adjustment processing of the size of the item area and identification processing of the mounting unit in the item image (S3).
- the correcting unit 14 specifies the specified mounting unit as a fulcrum position when the mounting unit is mounted so as to be hung from the model.
- the superimposing unit 15 specifies a portion where the item is attached in the model image (S3).
- the correction unit 14 corrects the inclination of the correction target area (S4).
- the correction unit 14 extracts the entire item area as a correction target area that is a target for tilt correction.
- the correction unit 14 specifies the position of the center of gravity of the correction target region using a known image processing technique and analysis technique (S11). Then, the correction unit 14 corrects the inclination of the correction target region so that the direction of the center of gravity position with respect to the fulcrum position in the correction target region is along the direction of gravity in the model image (S12).
- the superimposing unit 15 superimposes the item region in which the correction target region is corrected, on the model image so that the mounting unit is positioned at a part where the item in the model image is mounted (S5).
- the output unit 16 outputs the image superimposed by the superimposing unit 15 (S6).
- FIG. 11A is a diagram showing an example of an item, a pierced earring composed of a plurality of parts.
- As shown in FIG. 11A, the item includes a plurality of parts B11, B12, B13, and B14. The part B11 includes the mounting portion A. The part B12 is connected to the part B11 via the connecting portion C12 so as to be swingable about the connecting portion C12 as a fulcrum. Likewise, the part B13 is connected to the part B12 via the connecting portion C13 so as to be swingable about the connecting portion C13 as a fulcrum.
- In FIG. 11A each connecting portion is shown as a small circle, but the actual shape of a connecting portion is more complicated; for example, it consists of a ring-shaped part fixed to one part and a ring-shaped part fixed to the other part, with the two ring-shaped parts linked together.
- The correction unit 14 can determine whether an item is composed of one part or of a plurality of mutually connected parts using, for example, a well-known image processing technique such as reduction/expansion processing. As illustrated in FIG. 11B, when the item region of the item image is divided into a plurality of regions by the reduction/expansion processing, the correction unit 14 can determine that the item is composed of a plurality of parts.
- The correction unit 14 can specify a region that disappears through the reduction/expansion processing as a connecting portion, as shown in FIG. 11B. More specifically, the correction unit 14 specifies the difference between the image illustrated in FIG. 11A and the image illustrated in FIG. 11B as the connecting portions. In addition, the correction unit 14 specifies each region separated by the reduction/expansion processing as a region in which a part is represented, and specifies each such region as a correction target region. That is, when an item is composed of a plurality of parts, the item image has a plurality of correction target regions.
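The reduction/expansion determination above can be sketched with a single binary erosion followed by connected-component counting: thin connecting portions vanish under erosion, so the component count tells single-part from multi-part items. This is a simplified stand-in with hypothetical names, not the patented processing.

```python
import numpy as np

def erode3x3(mask):
    """Binary erosion with a 3x3 structuring element (zero padding)."""
    m = np.pad(mask.astype(bool), 1)
    out = np.ones_like(mask, dtype=bool)
    H, W = mask.shape
    for dr in (0, 1, 2):
        for dc in (0, 1, 2):
            out &= m[dr:dr + H, dc:dc + W]   # AND over the 3x3 neighbourhood
    return out

def count_components(mask):
    """Count 4-connected foreground components with a simple flood fill."""
    m = mask.astype(bool).copy()
    H, W = m.shape
    n = 0
    for r0 in range(H):
        for c0 in range(W):
            if m[r0, c0]:
                n += 1
                stack = [(r0, c0)]
                m[r0, c0] = False
                while stack:
                    r, c = stack.pop()
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                        if 0 <= rr < H and 0 <= cc < W and m[rr, cc]:
                            m[rr, cc] = False
                            stack.append((rr, cc))
    return n
```

If the component count increases after erosion, the item can be judged to consist of multiple parts, and the difference between the original and the eroded mask marks the connecting portions.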
- FIG. 12 is a diagram illustrating another example of the process by which the correction unit 14 identifies the connecting portions.
- The correction unit 14 can identify, as a connecting portion, a region in which the length occupied by the item along a predetermined direction in the item image is shorter, by at least a predetermined difference, than that of neighboring regions within a predetermined range, and can identify those neighboring regions as regions in which parts are represented.
- For example, the correction unit 14 generates a histogram representing the number of pixels in the item region distributed along the direction orthogonal to the vertical direction (arrow V in FIG. 12), and identifies, as the connecting portions C12, C13, and C14 of the item, the portions that correspond to local minima of the histogram and whose pixel count is equal to or less than a predetermined number.
- The correction unit 14 then identifies the regions adjacent to the identified connecting portions as the regions B11, B12, B13, and B14 in which the respective parts are represented in the item image.
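The histogram-based search can be sketched as follows. The threshold `max_width`, the local-minimum test, and the toy mask are illustrative assumptions, not values from the patent.

```python
import numpy as np

def find_connecting_rows(item_mask, max_width=4):
    """Rows whose item-pixel count is a local minimum at or below
    max_width are treated as connecting portions."""
    counts = item_mask.sum(axis=1)          # pixels per row, along arrow V
    rows = []
    for y in range(1, len(counts) - 1):
        is_min = counts[y] <= counts[y - 1] and counts[y] <= counts[y + 1]
        if counts[y] > 0 and counts[y] <= max_width and is_min:
            rows.append(y)
    return rows

mask = np.zeros((12, 10), dtype=bool)
mask[0:4, 2:8] = True    # upper part (6 px wide)
mask[4:6, 4:6] = True    # connecting portion (2 px wide)
mask[6:10, 2:8] = True   # lower part
print(find_connecting_rows(mask))   # prints [4, 5]
```

The rows adjacent to the returned connecting rows would then be grouped into per-part regions, mirroring the identification of B11 through B14 above.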
- As shown in FIG. 13(a), when one part BA and another part BB are connected to each other via a connecting portion CAB, the correction unit 14 sets at least the region in which the other part BB is represented (the shaded region in FIG. 13(a)) as the correction target region.
- The correction target region can include the region in which the connecting portion CAB is represented, as illustrated in FIG. 13(a).
- In this case, the correction unit 14 identifies, as the fulcrum position, the connecting end portion on the part BA side in the correction target region.
- Any position within the region of the connecting portion CAB can be set as the fulcrum position.
- The fulcrum position may be, for example, the end portion E1 on the part BA side of the connecting portion CAB, or may be the center position of the region of the connecting portion CAB.
- On the other hand, when the correction target region (the hatched portion in FIG. 13(b)) does not include the connecting portion CAB, the correction unit 14 identifies, as the fulcrum position, the end portion E2 of the other part BB on the part BA side.
- One part may have a plurality of connecting end portions.
- When the straight line connecting the connecting portions at both ends of such a part does not pass through the position of its center of gravity, setting the fulcrum position to the lower of the plurality of connecting end portions and correcting the inclination would correct the part to a state different from the state in which the item is actually worn. Therefore, in this case, it is preferable that the fulcrum position for the correction target region in which that part is represented be set to the connecting end portion located highest among the plurality of connecting end portions.
- FIG. 14 is a diagram showing the identification of the fulcrum position and the center-of-gravity position of each part serving as a correction target region, and the correction of the inclination of each correction target region, when an item is composed of a plurality of parts.
- In the example shown in FIG. 14, the center position of each connecting portion is identified as the fulcrum position; that is, the example of FIG. 14 treats each connecting portion as a point.
- For a part having the mounting portion, the correction unit 14 identifies the mounting portion as the fulcrum position of that part.
- Since the part B11 has the mounting portion A, the correction unit 14 identifies the mounting portion A as the fulcrum position of the part B11.
- The correction unit 14 identifies the center-of-gravity position G11 of the part B11 by a well-known image analysis technique. Then, the correction unit 14 corrects the inclination of the part B11 so that the direction of the center-of-gravity position with respect to the fulcrum position (the mounting portion A) is along the direction of gravity in the model image.
- For a part having a plurality of connecting portions, the correction unit 14 identifies, as the fulcrum position, the connecting portion located uppermost when the mounting portion is attached to the model so that the item hangs down.
- For the part B12, which has the plurality of connecting portions C12 and C13, the correction unit 14 identifies the connecting portion C12, located closer to the mounting portion, as the fulcrum position.
- The correction unit 14 identifies the center-of-gravity position G12 of the part B12 by a well-known image analysis technique.
- In doing so, it can be assumed that the mass is uniformly distributed over the region in which the part is represented (the correction target region).
- It can further be assumed that the mass of all the parts supported by a connecting portion that is not identified as the fulcrum position is concentrated at that connecting portion.
- For example, it can be assumed that the masses of all the other parts B13 and B14 supported by the connecting portion C13, which is not identified as the fulcrum position of the part B12, are concentrated at the connecting portion C13.
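The mass assumptions just described (uniform mass over the part's own pixels, plus the mass of everything hanging below concentrated at the non-fulcrum connecting portion) can be sketched as a weighted centroid. The function name, the one-unit-per-pixel convention, and the toy values are illustrative assumptions.

```python
import numpy as np

def part_centroid(part_mask, hanging_loads=()):
    """part_mask: boolean pixel region of one part.
    hanging_loads: iterable of ((y, x), mass) point masses placed at the
    non-fulcrum connecting portions."""
    ys, xs = np.nonzero(part_mask)
    own_mass = float(len(ys))               # one unit of mass per pixel
    my = float(ys.sum()) + sum(p[0] * m for p, m in hanging_loads)
    mx = float(xs.sum()) + sum(p[1] * m for p, m in hanging_loads)
    total = own_mass + sum(m for _, m in hanging_loads)
    return (float(my / total), float(mx / total))

part = np.zeros((4, 4), dtype=bool)
part[0:4, 0:4] = True   # uniform 4x4 part, own centroid at (1.5, 1.5)
# A hanging load of equal mass (16) at pixel (3, 3) pulls the combined
# centroid halfway toward that point.
print(part_centroid(part, hanging_loads=[((3, 3), 16.0)]))   # → (2.25, 2.25)
```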
- Then, the correction unit 14 corrects the inclination of the part B12 so that the direction of the center-of-gravity position G12 with respect to the fulcrum position (the connecting portion C12) is along the direction of gravity in the model image.
- At this time, the inclinations of the not-yet-corrected parts B13 and B14 are also provisionally corrected in accordance with the correction of the inclination of the part B12.
- The correction unit 14 identifies the connecting portion C13 as the fulcrum position of the part B13, and identifies the center-of-gravity position G13 of the part B13 by a well-known image analysis technique. Then, as shown in FIG. 14(c), the correction unit 14 corrects the inclination of the part B13 so that the direction of the center-of-gravity position G13 with respect to the fulcrum position (the connecting portion C13) is along the direction of gravity in the model image.
- The correction unit 14 identifies the connecting portion C14 as the fulcrum position of the part B14, and identifies the center-of-gravity position G14 of the part B14 by a well-known image analysis technique. Then, as shown in FIG. 14(d), the correction unit 14 corrects the inclination of the part B14 so that the direction of the center-of-gravity position G14 with respect to the fulcrum position (the connecting portion C14) is along the direction of gravity in the model image.
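The per-part inclination correction amounts to rotating the part about its fulcrum until the fulcrum-to-centroid vector points along the image's direction of gravity (+y, i.e. downward in image coordinates). Below is a point-geometry sketch under that assumption; rotating actual pixels would instead use an image-warp routine, and all names here are illustrative.

```python
import math

def correct_inclination(fulcrum, centroid, points):
    """Return `points` rotated about `fulcrum` so that the
    fulcrum->centroid vector becomes vertical (+y). Coordinates: (x, y)."""
    fx, fy = fulcrum
    gx, gy = centroid
    # Angle of the fulcrum->centroid vector measured from the +y axis.
    theta = math.atan2(gx - fx, gy - fy)
    cos_a, sin_a = math.cos(theta), math.sin(theta)
    rotated = []
    for x, y in points:
        dx, dy = x - fx, y - fy
        rotated.append((fx + dx * cos_a - dy * sin_a,
                        fy + dx * sin_a + dy * cos_a))
    return rotated

# A centroid hanging to the lower right of the fulcrum is rotated so that
# it ends up directly below the fulcrum (x ~ 0, y ~ sqrt(2)).
print(correct_inclination(fulcrum=(0, 0), centroid=(1, 1), points=[(1, 1)]))
```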
- FIGS. 15(a) and 15(b) are flowcharts showing in detail the correction processing in step S4 of the flowchart of FIG. 10(a).
- FIG. 15(a) is a flowchart for the case where the inclinations of the plurality of parts included in an item are corrected sequentially from the uppermost part.
- First, the correction unit 14 identifies, in the item image, the connecting portions and the regions in which the parts serving as correction target regions are represented (S21). Subsequently, the correction unit 14 extracts, from the plurality of parts, the part that includes the mounting portion, sets the region in which that part is represented as the correction target region, and identifies the mounting portion of that part as the fulcrum position of the part (S22).
- Next, the correction unit 14 identifies the center-of-gravity position in the correction target region using a known image analysis technique (S23). Then, the correction unit 14 corrects the inclination of the correction target region so that the direction of the center-of-gravity position with respect to the fulcrum position is along the direction of gravity in the model image (S24).
- Next, the correction unit 14 determines whether the inclination correction has been completed for all the parts included in the item image (S25). If it is determined that the inclinations of all the parts have been corrected, the correction processing ends. Otherwise, the processing proceeds to step S26.
- In step S26, the correction unit 14 sets, as the correction target region, the region in which a part connected via a connecting portion to a part whose inclination has already been corrected is represented, and identifies that connecting portion as the fulcrum position (S26). The processing then returns to step S23, and steps S23 to S26 are repeated until it is determined in step S25 that the inclinations of all the parts have been corrected.
- In the flowchart of FIG. 15(a), the correction target region is set sequentially from the uppermost of the plurality of parts and the inclination correction is performed; however, the processing may instead be performed sequentially from the lowermost part.
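The top-down procedure of steps S21 to S26 can be sketched as follows, assuming each part record carries a fulcrum and centroid and a `children` map lists the parts hanging below it. The data structure and all names are illustrative, not the patent's; each part's rotation is also applied provisionally to everything hanging below it, as with B13 and B14 above.

```python
import math

def rotate_about(p, pivot, theta):
    dx, dy = p[0] - pivot[0], p[1] - pivot[1]
    return (pivot[0] + dx * math.cos(theta) - dy * math.sin(theta),
            pivot[1] + dx * math.sin(theta) + dy * math.cos(theta))

def _subtree(name, children):
    out = [name]
    for c in children.get(name, []):
        out.extend(_subtree(c, children))
    return out

def correct_sequentially(parts, children, root):
    """Correct inclinations from the mounting part downward (S22-S26).
    parts: {name: {'fulcrum': (x, y), 'centroid': (x, y)}}, where +y is
    the model image's direction of gravity."""
    queue = [root]
    while queue:                                  # S25: until all parts done
        name = queue.pop(0)
        f, g = parts[name]['fulcrum'], parts[name]['centroid']
        # S23-S24: angle bringing the fulcrum->centroid vector onto +y.
        theta = math.atan2(g[0] - f[0], g[1] - f[1])
        for other in _subtree(name, children):    # rotate part + descendants
            rec = parts[other]
            rec['centroid'] = rotate_about(rec['centroid'], f, theta)
            if other != name:
                rec['fulcrum'] = rotate_about(rec['fulcrum'], f, theta)
        queue.extend(children.get(name, []))      # S26: next connected part

parts = {'B11': {'fulcrum': (0.0, 0.0), 'centroid': (1.0, 1.0)},
         'B12': {'fulcrum': (2.0, 2.0), 'centroid': (3.0, 2.0)}}
children = {'B11': ['B12']}
correct_sequentially(parts, children, 'B11')
# After correction, each centroid hangs directly below its fulcrum.
```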
- FIG. 15(b) is a flowchart for the case where the inclinations of the plurality of parts included in an item are corrected in parallel.
- First, the correction unit 14 identifies, in the item image, the plurality of parts and the connecting portions of each part (S31). Subsequently, the correction unit 14 sets each region in which a part is represented as a correction target region, and identifies the mounting portion or the upper connecting portion of each part as the fulcrum position of the corresponding correction target region (S32).
- Next, the correction unit 14 identifies the center-of-gravity position in each correction target region using a known image analysis technique (S33). Then, the correction unit 14 corrects the inclination of each correction target region so that the direction of the center of gravity with respect to the fulcrum position is along the direction of gravity (S34). Performing the correction in this way shortens the processing time compared to correcting the plurality of correction target regions sequentially.
- The image processing program 1P includes a main module P10, an item image acquisition module P11, a model image acquisition module P12, an identification module P13, a correction module P14, a superimposition module P15, and an output module P16.
- the main module P10 is a part that comprehensively controls image processing.
- The functions realized by executing the item image acquisition module P11, the model image acquisition module P12, the identification module P13, the correction module P14, the superimposition module P15, and the output module P16 are the same as the functions of the item image acquisition unit 11, the model image acquisition unit 12, the identification unit 13, the correction unit 14, the superimposition unit 15, and the output unit 16, respectively, of the image processing apparatus 1 shown in FIG. 1.
- the image processing program 1P is provided by a storage medium 1D such as a CD-ROM, a DVD, or a ROM, or a semiconductor memory, for example. Further, the image processing program 1P may be provided as a computer data signal superimposed on a carrier wave via a communication network.
- As described above, in the image processing apparatus 1 of the present embodiment, the inclination of the correction target region in the item region is corrected so that the direction of the center-of-gravity position of the correction target region with respect to the fulcrum position is along the direction of gravity in the model image, and the item region whose inclination has been corrected is output so as to be superimposed on the model image. An image closer to the state in which the item is actually worn can therefore be displayed, so the displayed image is not unnatural and causes less discomfort for the user.
- the storage area can be reduced.
- In the present embodiment, an accessory such as a pierced earring has been shown and described as an example of the item. Since a pierced earring is attached to a hole (pierced hole) opened in a person's earlobe, it cannot be tried on at an actual store or the like. It is therefore very useful that the image processing apparatus 1 of the present embodiment lets the user virtually try the item on and check the wearing state.
- Further, since the parts account for a large proportion of the overall size of a pierced earring, combining an item image with unnaturally tilted parts into a model image makes the combined image very unnatural. The image processing apparatus 1 according to the present embodiment corrects the inclination of the item image to a natural state, and is therefore particularly useful when the item is a pierced earring.
- As described above, the image processing apparatus 1 of the present embodiment is particularly effective when the item is a pierced earring, but the item is not limited to pierced earrings.
- The item may be any accessory worn in a hanging state, such as an earring or a necklace charm.
- In the embodiment described above, the center-of-gravity position of the item is determined by a known image analysis technique on the assumption that the mass is uniformly distributed over the correction target region, but the center-of-gravity position may be determined in other ways.
- For example, a database may be prepared in advance in which a material used for an item such as an accessory, its mass per unit volume, and the color of the material are stored in association with one another, and the correction unit 14 may identify the material and mass of each part based on color information of each part acquired from the item image and calculate the center-of-gravity position accordingly. Alternatively, the correction unit 14 may extract a keyword indicating a material from the web page on which the item image is published, identify the color of that material based on the database, identify a part in the item image having that color as an object made of that material, and identify the center-of-gravity position using information on the mass of the material.
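A minimal sketch of such a material database lookup follows. The colors, densities, nearest-color matching rule, and all names are illustrative assumptions, not contents of the patent.

```python
# Illustrative database: color (R, G, B) -> (material, mass per unit volume
# in g/cm^3). A real database would also associate materials with keywords
# extracted from the item's web page.
MATERIAL_DB = {
    (212, 175, 55):  ('gold',   19.3),
    (192, 192, 192): ('silver', 10.5),
}

def guess_material(part_color):
    """Pick the database entry whose color is nearest the part's color."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, part_color))
    color = min(MATERIAL_DB, key=dist)
    return MATERIAL_DB[color]

print(guess_material((200, 180, 60)))   # near the gold entry
```

The returned mass per unit volume would then weight each part's pixels when computing the item's center of gravity, instead of the uniform-mass assumption.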
Claims (11)
- 1. An image processing device comprising:
item image acquisition means for acquiring an item image displaying an item that is an accessory worn in a hanging state;
model image acquisition means for acquiring a model image in which a portion on which the item is to be worn is represented;
correction means for correcting an inclination of a correction target region, which is at least a part of an item region that is a region of the item image in which the item is represented, so that a direction of a center-of-gravity position of the correction target region with respect to a fulcrum position of the correction target region is along a direction of gravity in the model image; and
output means for outputting the model image and the item region so as to display the item region superimposed on the model image.
- 2. The image processing device according to claim 1, wherein the correction means corrects the inclination with the entirety of the item region as the correction target region, and the fulcrum position is a position in the item region at which a mounting portion of the item is represented.
- 3. The image processing device according to claim 1 or 2, wherein, in a case where the item includes a plurality of parts swingably connected to each other via connecting portions, the correction target region is a region in which at least the other of one part and another part connected to each other via a connecting portion is represented, within the item region in which the item including the one part and the other part is displayed, and the fulcrum position is a connecting end portion on the one part side in the correction target region.
- 4. The image processing device according to claim 3, wherein the correction means identifies, as a connecting portion, a region in which the length occupied by the region representing the item along a predetermined direction in the item image is shorter, by at least a predetermined difference, than that of neighboring regions within a predetermined range, and identifies the neighboring regions as regions in which parts are represented.
- 5. The image processing device according to any one of claims 1 to 4, further comprising identification means for identifying the direction of gravity in the model image based on information obtainable from the model image, wherein the correction means corrects the inclination of the correction target region based on the direction of gravity identified by the identification means.
- 6. The image processing device according to any one of claims 1 to 5, wherein, when the item image acquisition means acquires an item image whose vertical direction has been identified, the correction means identifies the uppermost portion of the item region as the mounting portion.
- 7. The image processing device according to any one of claims 1 to 6, wherein the correction means adjusts the size of the item region, based on the size, acquired from the model image, of the portion of the model image on which the item is to be worn, preset information on the actual size of that portion, the size, acquired from the item image, of the item region in the item image, and prestored information on the actual size of the item, so that the ratio of the size of the portion on which the item is worn in the model image to the size of the item region is approximately equal to the ratio of the actual size of the portion on which the item is worn to the actual size of the item.
- 8. The image processing device according to any one of claims 1 to 7, wherein the output means identifies the portion of the model image on which the item is to be worn by searching the model image using a template representing features of that portion, and superimposes the item region on the model image so that the position at which the mounting portion of the item is represented is located at the portion of the model image on which the item is worn.
- 9. An image processing method executed by a computer, the method comprising: an item image acquisition step of acquiring an item image displaying an item that is an accessory worn in a hanging state; a model image acquisition step of acquiring a model image in which a portion on which the item is to be worn is represented; a correction step of correcting an inclination of a correction target region, which is at least a part of an item region that is a region of the item image in which the item is represented, so that a direction of a center-of-gravity position of the correction target region with respect to a fulcrum position of the correction target region is along a direction of gravity in the model image; and an output step of outputting the model image and the item region so as to display the item region superimposed on the model image.
- 10. An image processing program causing a computer to realize: an item image acquisition function of acquiring an item image displaying an item that is an accessory worn in a hanging state; a model image acquisition function of acquiring a model image in which a portion on which the item is to be worn is represented; a correction function of correcting an inclination of a correction target region, which is at least a part of an item region that is a region of the item image in which the item is represented, so that a direction of a center-of-gravity position of the correction target region with respect to a fulcrum position of the correction target region is along a direction of gravity in the model image; and an output function of outputting the model image and the item region so as to display the item region superimposed on the model image.
- 11. A computer-readable recording medium storing an image processing program that causes a computer to realize: an item image acquisition function of acquiring an item image displaying an item that is an accessory worn in a hanging state; a model image acquisition function of acquiring a model image in which a portion on which the item is to be worn is represented; a correction function of correcting an inclination of a correction target region, which is at least a part of an item region that is a region of the item image in which the item is represented, so that a direction of a center-of-gravity position of the correction target region with respect to a fulcrum position of the correction target region is along a direction of gravity in the model image; and an output function of outputting the model image and the item region so as to display the item region superimposed on the model image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013509769A JP5254507B1 (ja) | 2012-12-28 | 2012-12-28 | Image processing device, image processing method, image processing program, and computer-readable recording medium recording the program |
PCT/JP2012/084169 WO2014103045A1 (ja) | 2012-12-28 | 2012-12-28 | Image processing device, image processing method, image processing program, and computer-readable recording medium recording the program |
US14/129,501 US9396570B2 (en) | 2012-12-28 | 2012-12-28 | Image processing method to superimpose item image onto model image and image processing device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014103045A1 true WO2014103045A1 (ja) | 2014-07-03 |