WO2020003510A1 - Specifying method, determining method, specifying program, determining program, and information processing apparatus - Google Patents
Specifying method, determining method, specifying program, determining program, and information processing apparatus
- Publication number
- WO2020003510A1 (PCT/JP2018/024894)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- contour
- data
- subject
- outline
- captured image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/564—Depth or shape recovery from multiple images from contours
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/422—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation for representing the structure of the pattern or shape of an object therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/48—Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/752—Contour matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/513—Sparse representations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/759—Region-based matching
Definitions
- The present invention relates to a specifying method and the like.
- There is a conventional technique for extracting edges from an image, specifying a contour (outline), and recognizing an object. For example, a plurality of edges extending between blocks are extracted from each image block of captured image data. An outline is then extracted from the plurality of edges, object models are narrowed down according to the captured angle, and object recognition is performed.
- An object of the present invention is to provide a specifying method, a determining method, a specifying program, a determining program, and an information processing apparatus capable of reducing the calculation load required for specifying a subject included in a captured image.
- The computer acquires a captured image captured by an imaging device.
- The computer refers to a storage unit that stores a plurality of pieces of contour data indicating the contour shapes of a plurality of objects, and determines whether the plurality of pieces of contour data include contour data corresponding to a subject in the acquired captured image. When the determination result is affirmative, the computer refers to the storage unit, which stores part data indicating the shape or pattern of a part located inside the contour of an object in association with the contour data of that object, and acquires the pieces of part data respectively associated with the pieces of contour data corresponding to the contour of the subject.
- The computer specifies the object corresponding to the subject among the plurality of objects based on the acquired pieces of part data.
- FIG. 1 is a diagram for explaining the process of the information processing apparatus according to the first embodiment.
- FIG. 2 is a functional block diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
- FIG. 3 is a diagram illustrating an example of the data structure of the image buffer according to the first embodiment.
- FIG. 4 is a diagram illustrating an example of PostScript data.
- FIG. 5 is a diagram for explaining the relationship between the outline and the script data.
- FIG. 6 is a flowchart illustrating the processing procedure of the information processing apparatus according to the first embodiment.
- FIG. 7 is a diagram for explaining processing of the information processing apparatus according to the second embodiment.
- FIG. 8 is a diagram illustrating an example of the data structure of the outline information used in the second embodiment.
- FIG. 9 is a functional block diagram illustrating the configuration of the information processing apparatus according to the second embodiment.
- FIG. 10 is a diagram for explaining the process of the specifying unit according to the second embodiment.
- FIG. 11 is a flowchart illustrating the processing procedure of the information processing apparatus according to the second embodiment.
- FIG. 12 is a diagram for explaining another process of the information processing device.
- FIG. 13 is a diagram illustrating an example of a hardware configuration of a computer that realizes functions similar to those of the information processing apparatus according to the first embodiment.
- FIG. 14 is a diagram illustrating an example of a hardware configuration of a computer that realizes the same function as the information processing apparatus according to the second embodiment.
- FIG. 1 is a diagram for explaining the processing of the information processing apparatus according to the first embodiment.
- the information processing device acquires captured image data 10 captured by a stereo camera.
- the information processing device specifies the shape of the contour of the subject from the captured image data using the parallax of the stereo camera.
- the outline shape of the subject specified by the information processing apparatus is defined as an outline 10a.
- The information processing device compares the outline 10a with the outline information 142 and specifies which of the plurality of objects registered in the outline information 142 the subject included in the captured image data 10 corresponds to.
- the outline information 142 holds outline information of an object and outline information of parts included in the object. For example, the outline information 142 associates the outline 20a of the vehicle A with the outline 21a of the parts of the vehicle A. The outline information 142 associates the outline 20b of the vehicle B with the outline 21b of the parts of the vehicle B. The outline information 142 associates the outline 20c of the vehicle C with the outline 21c of the parts of the vehicle C.
- the information processing device compares the outline 10a of the subject with the outlines 20a, 20b, and 20c of the outline information 142, and specifies an outline similar to the outline 10a from the outlines 20a, 20b, and 20c.
- Assume here that the outlines similar to the outline 10a are the outlines 20a and 20c.
- The information processing device compares the edges 21a and 21c of the parts associated with those outlines with the image edge 10b.
- the image edge 10b is an image obtained by extracting an edge from a region inside the contour of the subject in the entire region of the captured image data 10.
- The information processing apparatus compares the image edge 10b with the outlines 21a and 21c. If the outline 21a is more similar to the image edge 10b than the outline 21c, the object corresponding to the subject is specified as "vehicle A".
- The information processing apparatus identifies the outline 10a from the captured image data 10 and narrows down the candidate objects by comparing the outline 10a with the outlines 20a to 20c of the objects in the outline information 142. After the narrowing down, the information processing device compares the image edge 10b of the captured image data 10 with the edges of the parts and specifies the object corresponding to the subject. As a result, the calculation load required for specifying the subject included in the captured image data can be reduced.
- FIG. 2 is a functional block diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
- the information processing apparatus 100 includes a camera 105, a communication unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.
- The camera 105 is a stereo camera (binocular camera) that simultaneously captures an object from two different directions, or a monocular camera that can move left and right; the outline of the object can be identified by parallax.
- the camera 105 outputs, to the information processing apparatus 100, first captured image data captured in a first direction and second captured image data captured in a second direction different from the first direction.
- first captured image data and the second captured image data are collectively referred to as “captured image data” as appropriate.
- the communication unit 110 is a processing unit that executes data communication with an external device via a network.
- the communication unit 110 is an example of a communication device.
- the information processing apparatus 100 may be connected to the camera 105 via a network, and may receive the captured image data via the network.
- the input unit 120 is an input device for inputting various types of information to the information processing device 100.
- the input unit 120 corresponds to a keyboard, a mouse, a touch panel, or the like.
- the display unit 130 is a display device for displaying various information output from the control unit 150.
- the display unit 130 corresponds to a liquid crystal display, a touch panel, or the like.
- the storage unit 140 has an image buffer 141 and outline information 142.
- the storage unit 140 corresponds to a semiconductor memory device such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory (Flash Memory), and a storage device such as an HDD (Hard Disk Drive).
- the image buffer 141 is a buffer that stores the image data captured by the camera 105.
- FIG. 3 is a diagram illustrating an example of the data structure of the image buffer according to the first embodiment. As shown in FIG. 3, the image buffer 141 associates time with captured image data. The time is the time at which the captured image data was captured. The captured image data is image data captured by the camera 105.
- the outline information 142 is information that holds the orientation information of the object, the outline of the object, and the information of each part included in the object.
- the outline information 142 has PostScript data as information of each part included in the object.
- PostScript data is PostScript data for drawing outlines and edges of a plurality of parts.
- FIG. 4 is a diagram illustrating an example of PostScript data. As shown in FIG. 4, the PostScript data 40 includes a plurality of PostScript data 40a, 40b, 40c, and 40d. As an example, PostScript data 40a to 40d are shown, but PostScript data 40 may include other PostScript data.
- The PostScript data 40a in the top layer is PostScript data for drawing the outline of the contour of the entire object.
- This contour of the entire object corresponds to the outline 20a described with reference to FIG. 1.
- Hereinafter, the outline drawn by the PostScript data in the top layer is referred to as the "top-level outline" as appropriate.
- Each of the PostScript data 40b to 40d under the PostScript data 40a is PostScript data for drawing the edge of each part included in the outline of the object.
- the PostScript data 40b is PostScript data for drawing the edge of the right turn signal of the object.
- the PostScript data 40c is PostScript data for drawing the edge of the left turn signal of the object.
- the PostScript data 40d is PostScript data for drawing the edge of another part in the outline of the object.
- Hereinafter, an edge drawn by the PostScript data in the lower layers is referred to as a "part edge" as appropriate.
- the coordinates specified in each of the PostScript data 40b to 40d under the PostScript data 40a may be relative coordinates based on the coordinates specified in the PostScript data in the uppermost layer.
- FIG. 5 is a diagram for explaining the relationship between the outline and the script data.
- FIG. 5 shows PostScript data 6 corresponding to the outline 5 as an example.
- the outline 5 can be drawn by the PostScript data 6.
- the outline 5 includes a straight line 5AB, a curved line 5BC, a straight line 5CD, and a straight line 5DA.
- the straight line 5AB is a straight line connecting the control point A and the control point B.
- the straight line 5CD is a straight line connecting the control point C and the control point D.
- the straight line 5DA is a straight line connecting the control point D and the control point A.
- The curve 5BC is a curve connecting the control point B and the control point C, and the shape of the curve is determined by the control points α and β and the control points (end points) B and C.
- The PostScript data 6 of the outline 5 is generated based on the control points A, B, C, and D of the outline 5 and the control points α and β.
- “Xa, Ya” included in the PostScript data 6 indicates the coordinates of the control point A.
- “Xb, Yb” indicates the coordinates of the control point B.
- “Xc, Yc” indicates the coordinates of the control point C.
- “Xd, Yd” indicates the coordinates of the control point D.
- “X ⁇ , Y ⁇ ” indicates the coordinates of the control point ⁇ .
- X ⁇ , Y ⁇ ” indicates the coordinates of the control point ⁇ .
- the PostScript data 6 includes various commands “newpath ⁇ moveto ⁇ lineto ⁇ curveto ⁇ stroke ⁇ showpage”.
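The command sequence above can be assembled mechanically from the control points. As a rough illustration (the coordinate values and the function name are placeholders, not taken from the patent), the following Python sketch builds PostScript text like the PostScript data 6 for outline 5:

```python
def outline_to_postscript(a, b, c, d, alpha, beta):
    """Build PostScript text that draws outline 5: straight lines 5AB, 5CD,
    5DA and the Bezier curve 5BC shaped by the control points alpha and beta."""
    lines = [
        "newpath",
        f"{a[0]} {a[1]} moveto",    # start at control point A
        f"{b[0]} {b[1]} lineto",    # straight line 5AB
        f"{alpha[0]} {alpha[1]} {beta[0]} {beta[1]} {c[0]} {c[1]} curveto",  # curve 5BC
        f"{d[0]} {d[1]} lineto",    # straight line 5CD
        f"{a[0]} {a[1]} lineto",    # straight line 5DA closes the outline
        "stroke",
        "showpage",
    ]
    return "\n".join(lines)

# Hypothetical coordinates, chosen only for illustration.
ps = outline_to_postscript((0, 0), (0, 10), (12, 10), (12, 0), (4, 14), (8, 14))
print(ps)
```

A lower-layer part edge would be emitted the same way, optionally with its coordinates expressed relative to the top-level outline's control points.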
- the control unit 150 includes an acquisition unit 151, a determination unit 152, a specification unit 153, and an output unit 154.
- the control unit 150 can be realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like.
- the control unit 150 can also be realized by hard wired logic such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
- the acquisition unit 151 is a processing unit that acquires captured image data from the camera 105.
- the acquisition unit 151 stores the acquired captured image data in the image buffer 141 in association with the time.
- the camera 105 may add time information to the captured image data at the time of shooting, or the obtaining unit 151 may obtain time information from a timer (not shown).
- the determination unit 152 is a processing unit that determines, based on the captured image data stored in the image buffer 141 and the outline information 142, the top-level outline similar to the outline of the subject included in the captured image data. Hereinafter, an example of the process of the determination unit 152 will be described.
- the determination unit 152 extracts a contour shape of a subject on a captured image based on the captured image data (first captured image data and second captured image data) based on the principle of a stereo camera.
- the determination unit 152 specifies the contour shape as the outline of the subject.
- Hereinafter, the outline of the subject extracted from the captured image data is referred to as an "image outline".
- the determination unit 152 draws the top-level outline for each of the PostScript data in the outline information 142 based on the top-level PostScript data.
- the determination unit 152 calculates the similarity by comparing each drawn top outline with the image outline.
- the determination unit 152 determines an uppermost-layer outline whose similarity with the image outline is equal to or greater than a predetermined similarity.
- the determination unit 152 outputs, to the identification unit 153, information on the top-level outline whose similarity to the image outline is equal to or more than a predetermined similarity among the top-level outlines.
- the top-level outline whose similarity to the image outline is equal to or higher than a predetermined similarity is referred to as “candidate outline”.
- the determination unit 152 may calculate the similarity between the top outline and the image outline in any manner. For example, the determination unit 152 may calculate, as a similarity, a coincidence rate between a region surrounded by the top-level outline and a region surrounded by the image outline.
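The coincidence rate is not defined further in the text; one plausible reading, sketched below in Python, is the overlap of the two regions relative to their union (an IoU-style measure, where the regions are given as sets of pixel coordinates):

```python
def coincidence_rate(region_a, region_b):
    """Similarity between two regions given as sets of (x, y) pixels:
    the overlap divided by the union (1.0 means identical regions)."""
    if not region_a and not region_b:
        return 1.0
    return len(region_a & region_b) / len(region_a | region_b)

# Two hypothetical 3x3 regions, one shifted right by a single column:
# they share 6 pixels out of 12 in total, giving a similarity of 0.5.
a = {(x, y) for x in range(3) for y in range(3)}
b = {(x, y) for x in range(1, 4) for y in range(3)}
print(coincidence_rate(a, b))
```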
- the determination unit 152 extracts a region surrounded by the image outline from the entire region of the captured image data, extracts an edge from the image of the extracted region, and generates an image edge. For example, the determination unit 152 generates an image edge using Hough transform or the like.
- the determination unit 152 executes the above processing, and outputs information on the candidate outline and information on the image edge to the specifying unit 153.
- the information on the candidate outline output from the determination unit 152 to the specification unit 153 is information that can specify the candidate outline, and includes identification information and an angle (direction information).
- Upon receiving the information on the candidate outlines, the specifying unit 153 is a processing unit that specifies the object corresponding to the subject of the captured image data based on the shapes or patterns of the parts located inside the candidate outlines and the image edge of the captured image data. Hereinafter, an example of the process of the specifying unit 153 will be described.
- The specifying unit 153 selects one candidate outline, draws the edge of each part from the PostScript data under the selected candidate outline, and compares each edge shape included in the image edge with the edge of each part.
- the specifying unit 153 counts the number of edges of the part having the shape of the edge whose similarity is equal to or larger than the threshold among the edges of the compared parts. In the following description, the number of edges of a part having an edge shape whose similarity is equal to or larger than a threshold is referred to as a “number of matches”.
- the specifying unit 153 may calculate the similarity between the edge of the part and the shape of the image edge in any manner.
- For example, the specifying unit 153 may calculate a coincidence rate between the region surrounded by the edge of the part and the region formed by the image edge as the similarity.
- the identifying unit 153 repeatedly executes the process of counting the number of matches for other candidate outlines.
- the specifying unit 153 specifies the identification information corresponding to the candidate outline having the largest number of matches among the candidate outlines.
- the identification information specified by the specifying unit 153 becomes the identification information of the object corresponding to the subject in the captured image data.
- the specifying unit 153 outputs the specified identification information to the output unit 154.
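The match counting and candidate selection described above might be sketched as follows in Python (the part labels, the similarity function, and the threshold value are hypothetical stand-ins, not from the patent):

```python
def count_matches(part_edges, image_edges, similarity, threshold=0.8):
    """Count the part edges whose similarity to some edge shape in the
    image edge meets the threshold -- the "number of matches" above."""
    return sum(
        1
        for part in part_edges
        if any(similarity(part, edge) >= threshold for edge in image_edges)
    )

def pick_candidate(candidates, image_edges, similarity):
    """Return the identification information of the candidate outline
    with the largest number of matches."""
    return max(
        candidates,
        key=lambda ident: count_matches(candidates[ident], image_edges, similarity),
    )

# Toy data: string labels stand in for drawn part-edge shapes.
exact = lambda a, b: 1.0 if a == b else 0.0
candidates = {
    "vehicle A": ["right-turn-signal", "left-turn-signal", "grille"],
    "vehicle C": ["round-headlight"],
}
image_edges = ["right-turn-signal", "left-turn-signal"]
print(pick_candidate(candidates, image_edges, exact))  # vehicle A: 2 matches
```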
- the output unit 154 is a processing unit that performs a process of outputting the identification information specified by the specifying unit 153. For example, the output unit 154 outputs the identification information to another external device via the communication unit 110. The output unit 154 may output information in which captured image data from which an image outline has been extracted is associated with identification information.
- FIG. 6 is a flowchart illustrating the processing procedure of the information processing apparatus according to the first embodiment.
- the acquisition unit 151 of the information processing apparatus 100 acquires captured image data and stores the captured image data in the image buffer 141 (Step S101).
- the determination unit 152 of the information processing device 100 extracts an image outline of the subject (Step S102).
- the determination unit 152 generates an image edge in the image outline (Step S103).
- the determination unit 152 compares the image outline with the uppermost outline of the outline information 142, and specifies an uppermost outline (candidate outline) similar to the image outline (step S104).
- the specifying unit 153 of the information processing apparatus 100 selects a candidate outline (Step S105).
- the specifying unit 153 draws the edge of the part from the PostScript data under the selected candidate outline (step S106).
- the identification unit 153 compares the image edge with the edge of the part and counts the number of matches (step S107). The specifying unit 153 determines whether all candidate outlines have been selected (Step S108). If all the candidate outlines have not been selected (No at Step S108), the specifying unit 153 selects an unselected candidate outline (Step S109), and proceeds to Step S106.
- the specifying unit 153 specifies the identification information corresponding to the candidate outline having the maximum number of matches (Step S110).
- the output unit 154 of the information processing device 100 outputs the specified identification information and the captured image data to the external device (or the display unit 130) (Step S111).
- the information processing apparatus 100 specifies an image outline from the captured image data, and narrows down the top outline by comparing the image outline with the outline information 142. After performing the narrowing down, the information processing apparatus performs a process of comparing the image edge of the captured image data with the edge of the part and specifying an object corresponding to the subject. As a result, the calculation load required for specifying the subject included in the captured image data can be reduced.
- the outline shape (image outline) of the subject on the captured image is extracted based on the principle of the stereo camera, and the outline of the uppermost layer is narrowed down. For this reason, compared with the case where the edge extracted from the image is used, the uppermost layer outline can be narrowed down more easily.
- FIG. 7 is a diagram for explaining the processing of the information processing apparatus according to the second embodiment.
- the information processing device acquires photographed image data 31, 32, 33, and 34 photographed by a stereo camera.
- The information processing apparatus specifies the shape of the contour of the subject from each of the captured image data 31 to 34 using the parallax of the stereo camera.
- The outline shape of the subject included in the captured image data 31 is defined as an outline 31a.
- the outline shape of the subject included in the captured image data 32 is defined as an outline 32a.
- the outline shape of the subject included in the captured image data 33 is defined as an outline 33a.
- the outline shape of the subject included in the captured image data 34 is defined as an outline 34a.
- the outline information 242 holds the orientation information of the object and the outline information of the object in association with each other.
- the direction information of the object indicates the angle of the object and is arranged in ascending order.
- the outline 43a is the shape of the contour of the vehicle A in the direction “0 °”.
- the outline 41a is the shape of the contour of the vehicle A in the direction “60 °”.
- the outline 42a is a contour shape of the vehicle A in the direction “120 °”. Illustration of outlines of the vehicle A in other directions is omitted.
- the outline 43b is the shape of the contour of the vehicle B in the direction “0 °”.
- the outline 41b is the shape of the contour of the vehicle B in the direction “60 °”.
- the outline 42b is a contour shape of the vehicle B in the direction “120 °”. Illustration of outlines of the vehicle B in other directions is omitted.
- the outline 43c is the shape of the contour of the vehicle C in the direction “0 °”.
- the outline 41c is the shape of the contour of the vehicle C in the direction “60 °”.
- the outline 42c is the shape of the contour of the vehicle C in the direction “120 °”. Illustration of outlines of the vehicle C in other directions is omitted.
- FIG. 8 is a diagram illustrating an example of a data structure of outline information used in the second embodiment.
- The outline information 242 has identification information and PostScript data for each angle (the orientation of the object).
- the identification information is information for uniquely identifying an object.
- The information processing device compares each of the outlines 31a to 34a with the outline information 242, and identifies which of the plurality of objects registered in the outline information 242 the subject included in the captured image data 31 to 34 corresponds to.
- the information processing apparatus repeatedly performs a process of comparing each of the outlines 31a to 34a with the outline information 242 and identifying an outline similar to each of the outlines 31a to 34a.
- the information processing device compares the outline 31a with the outlines 43a to 43c of the direction information “0 °” and specifies outlines 43a and 43c similar to the outline 31a.
- the information processing apparatus compares the outline 32a with each outline of the direction information “30 °” (not shown), and specifies a similar outline.
- the information processing device compares the outline 33a with the outlines 41a to 41c of the direction information “60 °”, and specifies the outline 41a similar to the outline 33a.
- the information processing device compares the outline 34a with each outline of the direction information “90 °” (not shown), and specifies a similar outline.
- the information processing device executes the above processing, and totals the number of similar outlines for each vehicle in the outline information 242.
- the information processing device determines that the vehicle having the largest number of similar outlines is the vehicle corresponding to the subject in the captured image data.
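The per-angle tally can be sketched as a simple vote (the outline labels and the similarity function below are hypothetical stand-ins):

```python
from collections import Counter

def identify_by_voting(frames, catalog, similarity, threshold=0.8):
    """frames: (angle, subject_outline) pairs, one per captured image.
    catalog: {vehicle_id: {angle: stored_outline}}, like outline information 242.
    Each frame votes for every vehicle whose stored outline at that angle is
    similar enough; the vehicle with the most votes is the answer."""
    votes = Counter()
    for angle, outline in frames:
        for vehicle_id, per_angle in catalog.items():
            stored = per_angle.get(angle)
            if stored is not None and similarity(outline, stored) >= threshold:
                votes[vehicle_id] += 1
    return votes.most_common(1)[0][0] if votes else None

# Toy data mirroring the example in the text: at 0 degrees both vehicle A
# and vehicle C look similar, but the 60-degree frame breaks the tie.
exact = lambda a, b: 1.0 if a == b else 0.0
catalog = {
    "vehicle A": {0: "front-view", 60: "front-left-view"},
    "vehicle C": {0: "front-view", 60: "boxy-front-left"},
}
frames = [(0, "front-view"), (60, "front-left-view")]
print(identify_by_voting(frames, catalog, exact))  # vehicle A: 2 votes to 1
```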
- The information processing apparatus extracts an outline from each piece of captured image data, and specifies the object corresponding to the subject based on the correspondence between the per-angle outlines stored in the outline information 242 and the extracted outlines. As a result, even if the shape of the contour of the subject changes continuously, it is possible to prevent the accuracy of determining the object corresponding to the subject from decreasing.
- FIG. 9 is a functional block diagram illustrating the configuration of the information processing apparatus according to the second embodiment.
- the information processing device 200 includes a camera 205, a communication unit 210, an input unit 220, a display unit 230, a storage unit 240, and a control unit 250.
- the camera 205 is a stereo camera (binocular camera) that simultaneously captures an object from two different directions or a monocular camera that can move left and right.
- the camera 205 outputs the captured image data to the information processing device 200.
- the other description of the camera 205 is the same as the description of the camera 105.
- the communication unit 210 is a processing unit that performs data communication with an external device via a network.
- the communication unit 210 is an example of a communication device.
- the information processing device 200 may be connected to the camera 205 via a network, and may receive the captured image data via the network.
- the input unit 220 is an input device for inputting various types of information to the information processing device 200.
- the input unit 220 corresponds to a keyboard, a mouse, a touch panel, and the like.
- the display unit 230 is a display device for displaying various information output from the control unit 250.
- the display unit 230 corresponds to a liquid crystal display, a touch panel, or the like.
- the storage unit 240 has an image buffer 241 and outline information 242.
- the storage unit 240 corresponds to a semiconductor memory device such as a RAM, a ROM, and a flash memory, and a storage device such as an HDD.
- the image buffer 241 is a buffer that stores the image data captured by the camera 205.
- the data structure of the image buffer 241 is the same as the data structure of the image buffer 141 described with reference to FIG.
- the outline information 242 is information that holds the orientation information of the object, the outline of the object, and the information of each part included in the object.
- the data structure of the outline information 242 is the same as that described with reference to FIG.
- the control unit 250 includes an acquisition unit 251, an identification unit 252, a determination unit 253, and an output unit 254.
- the control unit 250 can be realized by a CPU, an MPU, or the like. Further, the control unit 250 can also be realized by hard wired logic such as an ASIC or an FPGA.
- the acquisition unit 251 is a processing unit that acquires captured image data from the camera 205.
- the acquisition unit 251 stores the acquired captured image data in the image buffer 241 in association with the time.
- the camera 205 may add time information to the captured image data at the time of capturing, or the acquisition unit 251 may obtain the time information from a timer (not shown).
- the identification unit 252 is a processing unit that identifies the top-level outline similar to the image outline of the captured image data based on the captured image data stored in the image buffer 241 and the outline information 242.
- the uppermost layer outline is an outline drawn by the uppermost layer PostScript data as described with reference to FIG.
- the identification unit 252 generates identification information including the time of the captured image data, identification information corresponding to a similar top-level outline, orientation information, and the like, and outputs the generated identification information to the determination unit 253.
- when there are N top-level outlines similar to the outline of the subject included in one piece of captured image data, the specifying unit 252 generates N pieces of specific information.
- FIG. 10 is a diagram for explaining the process of the specifying unit according to the second embodiment.
- the specifying unit 252 extracts an image outline (contour) 31a from the captured image data 31 at time t1.
- the process in which the specifying unit 252 extracts the image outline is the same as the process in which the determination unit 152 extracts the image outline.
- the specifying unit 252 draws the top outline from PostScript data stored in each area of the outline information 242, and calculates the similarity between the drawn top outline and the image outline 31a.
- the specifying unit 252 specifies the information of the uppermost outline whose similarity with the image outline is equal to or higher than a predetermined similarity, and generates specific information.
- the specifying unit 252 may perform the same processing as the determination unit 152 of the first embodiment, and may count the number of matches based on the image edge and the edge of the part.
- for example, suppose that the similarity between the top-level outlines stored in the areas 242A and 242B of the outline information 242 and the image outline 31a is equal to or higher than the predetermined similarity.
- the identification unit 252 generates identification information 242a corresponding to the area 242A.
- the time “t1”, the identification information “C001”, the direction information “0°”, and the number of matches “M1” are associated with the specific information 242a.
- the identification unit 252 generates identification information 242b corresponding to the area 242B.
- the time “t1”, the identification information “C003”, the direction information “0°”, and the number of matches “M2” are associated with the specific information 242b.
- the identifying unit 252 repeatedly performs the same process on other captured image data (for example, the captured image data 32 to 34 in FIG. 8) subsequent to the captured image data 31, and generates specific information.
- the specifying unit 252 outputs the generated specific information to the determining unit 253.
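The generation of specific information described above can be sketched in code. This is a minimal illustration, not the patented implementation: the record fields follow the description (time, identification, direction, number of matches), but the function names, the similarity measure, and the threshold value are assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SpecificInfo:
    time: str            # capture time of the image, e.g. "t1"
    identification: str  # object identification, e.g. "C001"
    direction: int       # orientation information in degrees
    matches: int         # number of matching part edges

def specify(image_outline, outline_db: List[Dict], time: str,
            similarity: Callable, threshold: float = 0.8) -> List[SpecificInfo]:
    """Generate one piece of specific information per stored top-level
    outline whose similarity to the image outline meets the threshold."""
    records = []
    for entry in outline_db:  # entry: one area of the outline information
        if similarity(image_outline, entry["outline"]) >= threshold:
            records.append(SpecificInfo(time, entry["id"],
                                        entry["direction"], entry["matches"]))
    return records
```

With a toy database in which two stored outlines match the extracted one, the function returns two records, mirroring the example of the areas 242A and 242B above.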
- the determination unit 253 is a processing unit that determines identification information of an object corresponding to a subject in each piece of captured image data based on each piece of specific information acquired from the specifying unit 252. The determination unit 253 outputs the determined identification information to the output unit 254.
- the determination unit 253 classifies a plurality of pieces of specific information for each piece of identification information.
- the determination unit 253 counts the number of pieces of specific information classified into each piece of identification information, and determines identification information corresponding to a group having the largest number of pieces of specific information.
- for example, suppose that the number of pieces of specific information having the identification information “C001” is “l”, the number having “C002” is “m”, and the number having “C003” is “n”. When l > m > n, the determination unit 253 determines that the identification information of the object corresponding to the subject included in the captured image data is “C001”.
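The majority rule described above (classify the specific information by identification and pick the largest group) can be sketched as follows; this is an illustrative reading of the description, with hypothetical names, not the claimed implementation.

```python
from collections import Counter

def majority_identification(specific_infos):
    """Classify specific information by identification and return the
    identification whose group is largest (the l > m > n rule)."""
    counts = Counter(info["id"] for info in specific_infos)
    return counts.most_common(1)[0][0]
```

For instance, with three records for "C001", two for "C002", and one for "C003", the function returns "C001".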
- when a plurality of groups have the same number of pieces of specific information, the determination unit 253 may determine one piece of identification information by executing the following processing. As an example, the process of the determination unit 253 will be described assuming that the identification information “C001” and “C003” have the same number of pieces of specific information.
- the determination unit 253 arranges the specific information including the identification information “C001” in chronological order.
- the determination unit 253 scans the specific information arranged in chronological order and determines whether the overall tendency of the change in the direction information is ascending or descending. When the tendency is ascending, the determination unit 253 compares the direction information of each pair of consecutive pieces of specific information, adds 1 to an evaluation value if the direction information increases, and adds 0 if it does not. By repeating this process, the determination unit 253 calculates the evaluation value of the identification information “C001”.
- conversely, when the overall tendency of the change in the direction information is descending, the determination unit 253 compares the direction information of each pair of consecutive pieces of specific information, adds 1 to the evaluation value if the direction information decreases, and adds 0 if it does not, thereby calculating the evaluation value of the identification information “C001”.
- the determination unit 253 calculates the evaluation value of the identification information “C003” by executing the above-described processing also for the identification information “C003”.
- the determination unit 253 compares the evaluation value of the identification information “C001” with that of “C003”, and determines the identification information with the larger evaluation value as the identification information of the object corresponding to the subject included in the captured image data.
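The evaluation-value calculation above can be sketched as a small scoring function. This is a hedged illustration: how the "overall tendency" is detected is not fully specified, so comparing the first and last direction values is an assumption of this sketch.

```python
def evaluation_value(directions):
    """Score how consistently the orientation changes along the overall
    trend: +1 for each consecutive pair that follows the trend, +0 otherwise."""
    if len(directions) < 2:
        return 0
    # Overall tendency: a simple proxy comparing the first and last values.
    ascending = directions[-1] >= directions[0]
    score = 0
    for prev, nxt in zip(directions, directions[1:]):
        if (nxt > prev) if ascending else (nxt < prev):
            score += 1
    return score
```

A strictly increasing sequence such as 0°, 30°, 60°, 90° scores the maximum, while a sequence with one reversal scores less, so the candidate whose direction information changes most consistently wins the tie-break.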
- the determination unit 253 may also use the number of matches included in the specific information. For example, before performing the above processing, the determination unit 253 excludes specific information whose number of matches is less than a threshold, and then performs the classification and the evaluation-value calculation on the remaining specific information.
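The match-count pre-filter (also step S208 of the flowchart) amounts to a one-line screen over the specific information; the field name and threshold here are illustrative assumptions.

```python
def filter_by_matches(specific_infos, threshold):
    """Exclude specific information whose number of matches falls below
    the threshold, before classification or evaluation-value calculation."""
    return [s for s in specific_infos if s["matches"] >= threshold]
```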
- the output unit 254 is a processing unit that performs a process of outputting the identification information specified by the determination unit 253.
- for example, the output unit 254 outputs the identification information to another external device via the communication unit 210.
- the output unit 254 may output information in which captured image data from which an image outline has been extracted is associated with identification information.
- FIG. 11 is a flowchart illustrating the processing procedure of the information processing apparatus according to the second embodiment.
- the acquisition unit 251 of the information processing device 200 acquires captured image data from the camera 205 and stores the captured image data in the image buffer 241 (step S201).
- the specifying unit 252 of the information processing apparatus 200 selects unselected captured image data from the image buffer 241 (Step S202).
- the specifying unit 252 extracts an image outline from the captured image data (Step S203).
- the specifying unit 252 compares the image outline with each top-level outline of the outline information 242 (step S204).
- the specifying unit 252 specifies the top-level outline similar to the image outline, and generates specifying information (Step S205). The specifying unit 252 determines whether or not unselected captured image data exists in the image buffer 241 (Step S206).
- if there is unselected captured image data (step S206, Yes), the specifying unit 252 selects unselected captured image data (step S207) and proceeds to step S203. On the other hand, when there is no unselected captured image data (step S206, No), the specifying unit 252 proceeds to step S208.
- the determination unit 253 of the information processing device 200 excludes, from the specific information, specific information whose number of matches is less than the threshold (step S208).
- the determination unit 253 classifies the specific information and determines identification information corresponding to the captured image data (step S209).
- the output unit 254 of the information processing device 200 outputs the specified identification information and the captured image data to the external device (or the display unit 230) (Step S210).
- the information processing device 200 extracts an outline from each piece of captured image data, and specifies the object corresponding to the subject from the correspondence between the outline stored for each angle in the outline information 242 and the extracted outline. As a result, even if the shape of the contour of the subject continuously changes, it is possible to prevent the accuracy of determining the object corresponding to the subject from decreasing.
- the processing of the information processing apparatus 200 described above is not limited to the above processing.
- for example, the specifying unit 252 extracts the image outline from the captured image data, specifies a plurality of top-level outlines whose similarity is equal to or greater than the threshold, and then narrows down the candidates by comparing the top-level outlines having a predetermined order relationship with the image outline of the subsequent captured image data.
- FIG. 12 is a diagram for explaining another process of the information processing device.
- the specifying unit 252 extracts an image outline (contour) 31a from the captured image data 31 at time t1.
- the process in which the specifying unit 252 extracts the image outline is the same as the process in which the determination unit 152 extracts the image outline.
- the specifying unit 252 draws the top outline from PostScript data stored in each area of the outline information 242, and calculates the similarity between the drawn top outline and the image outline 31a.
- the specifying unit 252 specifies information on the top-level outlines whose similarity with the image outline is equal to or greater than a predetermined similarity. For example, suppose that the similarity between the top-level outlines stored in the areas 242A and 242B of the outline information 242 and the image outline 31a is equal to or higher than the predetermined similarity.
- the specifying unit 252 then identifies the top-level outlines of the areas 242C and 242D, which correspond to the areas 242A and 242B of the outline information 242 and are associated with the direction information “30°” having a predetermined order relationship with the direction information “0°”.
- the determination unit 253 calculates the similarity R1 between the top outline of the region 242C specified by the specifying unit 252 and the image outline 32a.
- the determination unit 253 calculates the similarity R2 between the outline of the uppermost layer of the region 242D and the image outline 32a.
- when the similarity R1 is greater than the similarity R2, the determination unit 253 determines the identification information “C001” for the subject included in the captured image data 31 and 32. Conversely, when the similarity R2 is greater than the similarity R1, the determination unit 253 determines the identification information “C003” for the subject included in the captured image data 31 and 32.
- the information processing apparatus can reduce the amount of calculation by narrowing down the top-level outlines to be compared with the image outline of the captured image data based on the order relation. Further, even if the shape of the contour of the subject continuously changes, it is possible to prevent the accuracy of determining the object corresponding to the subject from decreasing.
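The order-relation narrowing above can be sketched as follows. This is a simplified illustration under stated assumptions: the 30° angular step, the dictionary-style outline database keyed by (identification, direction), and the similarity callback are all hypothetical, and only the single next expected direction is checked rather than both rotation senses.

```python
def narrow_candidates(candidates, next_outline, outline_db, similarity, step=30):
    """candidates: (identification, direction) pairs surviving the first frame.
    Look up each candidate's stored outline at the next expected direction and
    keep the candidate whose outline best matches the second frame's outline."""
    best_id, best_score = None, float("-inf")
    for ident, direction in candidates:
        expected = (direction + step) % 360          # predetermined order relation
        stored = outline_db[(ident, expected)]       # e.g. areas 242C / 242D
        score = similarity(next_outline, stored)     # R1, R2 in the example
        if score > best_score:
            best_id, best_score = ident, score
    return best_id
```

In the example of FIG. 12, only the two outlines at 30° are compared against the image outline 32a, instead of every stored outline, which is where the computational saving comes from.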
- in the above description, the top-level outlines of the areas 242C and 242D corresponding to the areas 242A and 242B of the outline information 242 are identified using the direction information “30°”. However, the direction information having a predetermined order relationship with the direction information “0°” may be, for example, the direction information “−30° (330°)” (not shown), or both “30°” and “330°”.
- FIG. 13 is a diagram illustrating an example of a hardware configuration of a computer that realizes functions similar to those of the information processing apparatus according to the first embodiment.
- the computer 300 includes a CPU 301 that executes various arithmetic processing, an input device 302 that receives input of data from a user, and a display 303.
- the computer 300 includes a reading device 304 that reads a program or the like from a storage medium, and an interface device 305 that exchanges data with an external device, the camera 105, or the like via a wired or wireless network.
- the computer 300 includes a RAM 306 for temporarily storing various information, and a hard disk device 307. Each of the devices 301 to 307 is connected to the bus 308.
- the hard disk device 307 has an acquisition program 307a, a determination program 307b, a specific program 307c, and an output program 307d.
- the CPU 301 reads the acquisition program 307a, the determination program 307b, the specific program 307c, and the output program 307d, and expands them on the RAM 306.
- the acquisition program 307a functions as the acquisition process 306a.
- the determination program 307b functions as a determination process 306b.
- the specific program 307c functions as the specifying process 306c.
- the output program 307d functions as an output process 306d.
- the processing of the acquisition process 306a corresponds to the processing of the acquisition unit 151.
- the processing of the determination process 306b corresponds to the processing of the determination unit 152.
- the processing of the specifying process 306c corresponds to the processing of the specifying unit 153.
- the processing of the output process 306d corresponds to the processing of the output unit 154.
- each program does not necessarily need to be stored in the hard disk device 307 from the beginning.
- for example, each program may be stored in a “portable physical medium” such as a flexible disk (FD), a CD-ROM, a DVD, a magneto-optical disk, or an IC card inserted into the computer 300, and the computer 300 may read out and execute each of the programs 307a to 307d.
- FIG. 14 is a diagram illustrating an example of a hardware configuration of a computer that realizes the same function as the information processing apparatus according to the second embodiment.
- the computer 400 includes a CPU 401 that executes various types of arithmetic processing, an input device 402 that receives input of data from a user, and a display 403. Further, the computer 400 includes a reading device 404 that reads a program or the like from a storage medium, and an interface device 405 that exchanges data with an external device, the camera 205, or the like via a wired or wireless network.
- the computer 400 includes a RAM 406 for temporarily storing various information, and a hard disk device 407. The devices 401 to 407 are connected to a bus 408.
- the hard disk device 407 has an acquisition program 407a, a specific program 407b, a determination program 407c, and an output program 407d.
- the CPU 401 reads the acquisition program 407a, the specific program 407b, the determination program 407c, and the output program 407d, and expands them on the RAM 406.
- the acquisition program 407a functions as the acquisition process 406a.
- the specific program 407b functions as the specifying process 406b.
- the determination program 407c functions as a determination process 406c.
- the output program 407d functions as an output process 406d.
- the processing of the acquisition process 406a corresponds to the processing of the acquisition unit 251.
- the processing of the specifying process 406b corresponds to the processing of the specifying unit 252.
- the processing of the determination process 406c corresponds to the processing of the determination unit 253.
- the processing of the output process 406d corresponds to the processing of the output unit 254.
- each program does not necessarily have to be stored in the hard disk device 407 from the beginning.
- for example, each program may be stored in a “portable physical medium” such as a flexible disk (FD), a CD-ROM, a DVD, a magneto-optical disk, or an IC card inserted into the computer 400, and the computer 400 may read out and execute each of the programs 407a to 407d.
Abstract
Description
105, 205 Camera
110, 210 Communication unit
120, 220 Input unit
130, 230 Display unit
140, 240 Storage unit
141, 241 Image buffer
142, 242 Outline information
150, 250 Control unit
151, 251 Acquisition unit
152, 253 Determination unit
153, 252 Specifying unit
154, 254 Output unit
Claims (18)
- A specifying method in which a computer executes processing of:
acquiring a captured image captured by an imaging device;
referring to a storage unit that stores a plurality of pieces of contour data indicating the shapes of the contours of a plurality of objects, and determining whether the plurality of pieces of contour data include a plurality of pieces of contour data corresponding to the contour of a subject included in the acquired captured image;
when the determination result is positive, referring to a storage unit that stores part data indicating the shape or pattern of a part located inside the contour of an object in association with the contour data indicating the shape of the contour of the object, and acquiring a plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject; and
specifying, based on the acquired plurality of pieces of part data, the object corresponding to the subject among the plurality of objects.
- The specifying method according to claim 1, wherein, when acquiring the plurality of pieces of part data, a storage unit that stores the position, relative to the object, of a part or pattern located inside the contour of the object in association with the contour data indicating the contour of the object is referred to, and, among the plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject, part data corresponding to a plurality of positions respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject is acquired.
- The specifying method according to claim 1 or 2, wherein the imaging device simultaneously photographs the subject from different directions, and the computer further executes processing of extracting the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
- The specifying method according to claim 3, wherein processing of extracting a plurality of edges from a region inside the extracted contour of the subject in the captured image is further executed, and the specifying processing specifies the object corresponding to the subject among the plurality of objects based on the extracted plurality of edges and the plurality of pieces of part data.
- A determination method in which a computer executes processing of:
upon acquiring a first captured image captured by an imaging device, referring to a storage unit that stores a plurality of contour data groups each indicating the shapes of a plurality of contours of each of a plurality of objects, and specifying, among the plurality of contour data groups, a plurality of contour data groups that include contour data corresponding to the contour of a subject included in the acquired first captured image;
referring to a storage unit that stores the plurality of pieces of contour data included in each of the specified contour data groups in association with the order of the contour data, and newly specifying, among the plurality of pieces of contour data included in each of the specified contour data groups, a plurality of pieces of contour data having a specific order relationship with the contour data corresponding to the contour of the subject; and
upon acquiring a second captured image newly captured by the imaging device, determining which object the subject corresponds to based on the correspondence between the newly specified plurality of pieces of contour data and the contour of the subject included in the second captured image.
- The determination method according to claim 5, wherein the imaging device simultaneously photographs the subject from different directions, and the computer further executes processing of extracting the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
- A specifying program that causes a computer to execute processing of:
acquiring a captured image captured by an imaging device;
referring to a storage unit that stores a plurality of pieces of contour data indicating the shapes of the contours of a plurality of objects, and determining whether the plurality of pieces of contour data include a plurality of pieces of contour data corresponding to the contour of a subject included in the acquired captured image;
when the determination result is positive, referring to a storage unit that stores part data indicating the shape or pattern of a part located inside the contour of an object in association with the contour data indicating the shape of the contour of the object, and acquiring a plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject; and
specifying, based on the acquired plurality of pieces of part data, the object corresponding to the subject among the plurality of objects.
- The specifying program according to claim 7, wherein, when acquiring the plurality of pieces of part data, a storage unit that stores the position, relative to the object, of a part or pattern located inside the contour of the object in association with the contour data indicating the contour of the object is referred to, and, among the plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject, part data corresponding to a plurality of positions respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject is acquired.
- The specifying program according to claim 7 or 8, wherein the imaging device simultaneously photographs the subject from different directions, and the program causes the computer to further execute processing of extracting the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
- The specifying program according to claim 9, wherein processing of extracting a plurality of edges from a region inside the extracted contour of the subject in the captured image is further executed, and the specifying processing specifies the object corresponding to the subject among the plurality of objects based on the extracted plurality of edges and the plurality of pieces of part data.
- A determination program that causes a computer to execute processing of:
upon acquiring a first captured image captured by an imaging device, referring to a storage unit that stores a plurality of contour data groups each indicating the shapes of a plurality of contours of each of a plurality of objects, and specifying, among the plurality of contour data groups, a plurality of contour data groups that include contour data corresponding to the contour of a subject included in the acquired first captured image;
referring to a storage unit that stores the plurality of pieces of contour data included in each of the specified contour data groups in association with the order of the contour data, and newly specifying, among the plurality of pieces of contour data included in each of the specified contour data groups, a plurality of pieces of contour data having a specific order relationship with the contour data corresponding to the contour of the subject; and
upon acquiring a second captured image newly captured by the imaging device, determining which object the subject corresponds to based on the correspondence between the newly specified plurality of pieces of contour data and the contour of the subject included in the second captured image.
- The determination program according to claim 11, wherein the imaging device simultaneously photographs the subject from different directions, and the program causes the computer to further execute processing of extracting the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
- An information processing apparatus comprising:
an acquisition unit that acquires a captured image captured by an imaging device;
a determination unit that refers to a storage unit storing a plurality of pieces of contour data indicating the shapes of the contours of a plurality of objects, and determines whether the plurality of pieces of contour data include a plurality of pieces of contour data corresponding to the contour of a subject included in the acquired captured image; and
a specifying unit that, when the determination result by the determination unit is positive, refers to a storage unit storing part data indicating the shape or pattern of a part located inside the contour of an object in association with the contour data indicating the shape of the contour of the object, acquires a plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject, and specifies, based on the acquired plurality of pieces of part data, the object corresponding to the subject among the plurality of objects.
- The information processing apparatus according to claim 13, wherein, when acquiring the plurality of pieces of part data, the specifying unit refers to a storage unit that stores the position, relative to the object, of a part or pattern located inside the contour of the object in association with the contour data indicating the contour of the object, and acquires, among the plurality of pieces of part data respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject, part data corresponding to a plurality of positions respectively associated with the plurality of pieces of contour data corresponding to the contour of the subject.
- The information processing apparatus according to claim 13 or 14, wherein the imaging device simultaneously photographs the subject from different directions, and the determination unit extracts the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
- The information processing apparatus according to claim 15, wherein the specifying unit extracts a plurality of edges from a region inside the extracted contour of the subject in the captured image, and specifies the object corresponding to the subject among the plurality of objects based on the extracted plurality of edges and the plurality of pieces of part data.
- An information processing apparatus comprising:
a specifying unit that, upon acquiring a first captured image captured by an imaging device, refers to a storage unit storing a plurality of contour data groups each indicating the shapes of a plurality of contours of each of a plurality of objects, specifies, among the plurality of contour data groups, a plurality of contour data groups that include contour data corresponding to the contour of a subject included in the acquired first captured image, refers to a storage unit that stores the plurality of pieces of contour data included in each of the specified contour data groups in association with the order of the contour data, and newly specifies, among the plurality of pieces of contour data included in each of the specified contour data groups, a plurality of pieces of contour data having a specific order relationship with the contour data corresponding to the contour of the subject; and
a determination unit that, upon acquiring a second captured image newly captured by the imaging device, determines which object the subject corresponds to based on the correspondence between the newly specified plurality of pieces of contour data and the contour of the subject included in the second captured image.
- The information processing apparatus according to claim 17, wherein the imaging device simultaneously photographs the subject from different directions, and the specifying unit extracts the contour of the subject based on a first-direction captured image and a second-direction captured image simultaneously captured by the imaging device.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18923787.8A EP3816931A4 (en) | 2018-06-29 | 2018-06-29 | IDENTIFICATION PROCEDURES, DETERMINATION PROCEDURES, IDENTIFICATION PROGRAM, DETERMINATION PROGRAM AND INFORMATION PROCESSING DEVICE |
PCT/JP2018/024894 WO2020003510A1 (ja) | 2018-06-29 | 2018-06-29 | Specifying method, determination method, specifying program, determination program, and information processing apparatus |
AU2018429247A AU2018429247B2 (en) | 2018-06-29 | 2018-06-29 | Specifying method, determination method, specifying program, determination program, and information processing apparatus |
JP2020527141A JP7264163B2 (ja) | 2018-06-29 | 2018-06-29 | 判定方法、判定プログラムおよび情報処理装置 |
US17/130,169 US20210110558A1 (en) | 2018-06-29 | 2020-12-22 | Specifying method, determination method, non-transitory computer readable recording medium, and information processing apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/024894 WO2020003510A1 (ja) | 2018-06-29 | 2018-06-29 | Specifying method, determination method, specifying program, determination program, and information processing apparatus |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/130,169 Continuation US20210110558A1 (en) | 2018-06-29 | 2020-12-22 | Specifying method, determination method, non-transitory computer readable recording medium, and information processing apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020003510A1 true WO2020003510A1 (ja) | 2020-01-02 |
Family
ID=68986332
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/024894 WO2020003510A1 (ja) | 2018-06-29 | 2018-06-29 | Specifying method, determination method, specifying program, determination program, and information processing apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US20210110558A1 (ja) |
EP (1) | EP3816931A4 (ja) |
JP (1) | JP7264163B2 (ja) |
AU (1) | AU2018429247B2 (ja) |
WO (1) | WO2020003510A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05265547A (ja) * | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | Exterior monitoring device for vehicles |
JP2007232389A (ja) * | 2006-02-27 | 2007-09-13 | Toyota Motor Corp | Three-dimensional shape detection device |
JP2012212322A (ja) | 2011-03-31 | 2012-11-01 | Sony Corp | Image processing device and method, program, and recording medium |
JP2014229010A (ja) * | 2013-05-21 | 2014-12-08 | Denso Corp | Object detection device |
JP2017091202A (ja) | 2015-11-10 | 2017-05-25 | Shizuoka University | Object recognition method and object recognition device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012161346A1 (ja) * | 2011-05-24 | 2012-11-29 | NEC Corporation | Information processing device, information processing method, and information processing program |
JP5596628B2 (ja) * | 2011-06-17 | 2014-09-24 | Toyota Motor Corporation | Object identification device |
US9866900B2 (en) * | 2013-03-12 | 2018-01-09 | The Nielsen Company (Us), Llc | Methods, apparatus and articles of manufacture to detect shapes |
US9582739B2 (en) * | 2014-11-18 | 2017-02-28 | Harry Friedbert Padubrin | Learning contour identification system using portable contour metrics derived from contour mappings |
-
2018
- 2018-06-29 AU AU2018429247A patent/AU2018429247B2/en active Active
- 2018-06-29 JP JP2020527141A patent/JP7264163B2/ja active Active
- 2018-06-29 WO PCT/JP2018/024894 patent/WO2020003510A1/ja active Application Filing
- 2018-06-29 EP EP18923787.8A patent/EP3816931A4/en not_active Withdrawn
-
2020
- 2020-12-22 US US17/130,169 patent/US20210110558A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP3816931A4 |
Also Published As
Publication number | Publication date |
---|---|
AU2018429247A9 (en) | 2021-05-13 |
AU2018429247A1 (en) | 2021-01-28 |
EP3816931A1 (en) | 2021-05-05 |
AU2018429247B2 (en) | 2022-07-07 |
JP7264163B2 (ja) | 2023-04-25 |
EP3816931A4 (en) | 2021-07-07 |
JPWO2020003510A1 (ja) | 2021-06-24 |
US20210110558A1 (en) | 2021-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9767604B2 (en) | Image analysis method by analyzing point cloud using hierarchical search tree | |
JP6176388B2 (ja) | Image identification device, image sensor, and image identification method | |
JP6525635B2 (ja) | Image processing apparatus, image processing method, and program | |
US10679358B2 (en) | Learning image automatic sorting device, learning image automatic sorting method, and learning image automatic sorting program | |
JP2018508888A5 (ja) | ||
US9082000B2 (en) | Image processing device and image processing method | |
JP6487642B2 (ja) | Finger shape detection method, program therefor, storage medium for the program, and system for detecting finger shape | |
JP2016031679A (ja) | Object identification device, object identification method, and program | |
JP2012053606A (ja) | Information processing device and method, and program | |
CN109446364A (zh) | Snapshot retrieval method, image processing method, apparatus, device, and storage medium | |
JP2018151830A (ja) | Image processing apparatus, image processing method, and program | |
JP6278108B2 (ja) | Image processing device, image sensor, and image processing method | |
JP2018088049A (ja) | Image processing device, image processing method, and program | |
JP2014182754A (ja) | Learning device, learning method, and program | |
JP2010117981A (ja) | Face detection device | |
CN109711287B (zh) | Face collection method and related products | |
JP2015045919A (ja) | Image recognition method and robot | |
JP6393495B2 (ja) | Image processing device and object recognition method | |
US20220156977A1 (en) | Calibration apparatus, calibration method, and non-transitory computer readable medium storing program | |
WO2020003510A1 (ja) | Specifying method, determination method, specifying program, determination program, and information processing apparatus | |
KR20160148806A (ko) | Method for generating an object detector using direction information, and object detection apparatus and method using the same | |
JP2016081472A (ja) | Image processing device, image processing method, and program | |
JP6116916B2 (ja) | Image detection device, control program, and image detection method | |
KR101884874B1 (ko) | Method and apparatus for object discrimination based on partial images | |
JP2015187770A (ja) | Image recognition device, image recognition method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18923787 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020527141 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018429247 Country of ref document: AU Date of ref document: 20180629 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018923787 Country of ref document: EP |