CN108140255B - The method and system of reflecting surface in scene for identification - Google Patents


Info

Publication number
CN108140255B
CN108140255B (application CN201680056908.1A)
Authority
CN
China
Prior art keywords
reflecting surface
scene
identification
candidate
method described
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201680056908.1A
Other languages
Chinese (zh)
Other versions
CN108140255A (en
Inventor
Matan Protter
Motti Kushnir
Felix Goldberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Damo Institute Hangzhou Technology Co Ltd
Original Assignee
Infinity Augmented Reality Israel Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Infinity Augmented Reality Israel Ltd filed Critical Infinity Augmented Reality Israel Ltd
Publication of CN108140255A publication Critical patent/CN108140255A/en
Application granted granted Critical
Publication of CN108140255B publication Critical patent/CN108140255B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06F18/24 — Pattern recognition; classification techniques
    • G06T7/0004 — Image analysis; inspection of images; industrial image inspection
    • G06T7/593 — Depth or shape recovery from multiple images; from stereo images
    • G06T7/75 — Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06V10/60 — Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • G06V20/64 — Scenes; scene-specific elements; three-dimensional objects
    • G06T2200/04 — Indexing scheme involving 3D image data
    • G06T2207/10012 — Image acquisition modality: stereo images
    • G06T2207/10028 — Image acquisition modality: range image; depth image; 3D point clouds
    • G06T2207/30108 — Subject of image: industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Data Mining & Analysis (AREA)
  • Quality & Reliability (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

Provided herein are methods and systems for identifying reflecting surfaces in a scene. The system may include a sensing device configured to capture the scene. The system further includes a storage device configured to store the three-dimensional positions of at least some of the objects in the scene. The system further includes a computer processor configured to attempt to obtain a reflecting-surface representation of one or more candidate surfaces selected from the surfaces in the scene. In a case that the attempted obtaining is successful, the computer processor is further configured to determine that the candidate reflecting surface is indeed a reflecting surface defined by the obtained surface representation. According to some embodiments of the present invention, in a case that the attempted calculation is unsuccessful, the identified part of the object is determined to be an object unrelated to the stored objects.

Description

Method and system for identifying reflecting surfaces in a scene
Technical field
The present invention relates generally to the field of image processing, and more particularly to detecting reflecting surfaces in a captured scene.
Background
Before setting forth the background of the invention, it may be helpful to define certain terms that will be used hereinafter.
As used herein, the term "sensing device" (sometimes called a "camera" in computer vision) is broadly defined as any combination of one or more sensors of any kind, not necessarily optical (it may include radar, ultrasound, etc.). The sensing device is configured to capture images of a scene and to obtain, or allow derivation of, some three-dimensional data of the scene. An exemplary sensing device may include a pair of cameras configured as a passive stereo rig, which can derive depth data by comparing images acquired from different positions. Another example of a sensing device is a structured-light sensor, configured to receive and analyze the reflection of a predefined light pattern projected onto the scene. Yet another important example is a 2D sensing device that captures multiple 2D images of the scene and further provides data on the spatial relationship between the capture positions of the 2D images. It is noted that, for the purposes of this application, all dimensions in the scene may be relative (e.g., relative motion is sufficient, as long as the scale is provided by, or derivable from, the camera).
As used herein, the term "reflecting surface" is defined as a surface at the interface between two different media that changes the direction of a wavefront (e.g., light or sound) so that the wavefront returns to the medium from which it originated. Specular reflection is the mirror-like reflection of light (or other kinds of waves) from a surface, in which light from a single incoming direction is reflected into a single outgoing direction. This behavior is described by the law of reflection, which states that the direction of the incoming light (the incident ray) and the direction of the outgoing reflected light (the reflected ray) make the same angle with respect to the surface normal; thus the angle of incidence equals the angle of reflection, and the incident, normal, and reflected directions are coplanar. A partially reflecting surface may be of either of two types: type one — not all of the surface is reflective; type two — the degree of specular reflection may vary, and the surface may be considered "reflective" above an appropriate threshold.
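The law of reflection stated above has a standard vector form, r = i − 2(i·n)n, which the following minimal sketch illustrates. This is an illustration of the background optics only, not part of the claimed method:

```python
import numpy as np

def reflect(incident, normal):
    """Reflect an incident direction about a surface normal (law of reflection)."""
    incident = np.asarray(incident, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)  # ensure unit normal
    # r = i - 2 (i . n) n : equal angles about the normal, coplanar directions
    return incident - 2.0 * np.dot(incident, n) * n

# A ray travelling straight down onto a horizontal mirror bounces straight up:
r = reflect([0.0, -1.0, 0.0], [0.0, 1.0, 0.0])
```

Note that only the component of the incident direction along the normal is negated, which is exactly the "angle of incidence equals angle of reflection, coplanar directions" statement above.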
One of the challenges of computer vision is detecting the presence of reflecting surfaces in a scene and acquiring knowledge about them. In the case of specular reflection, and particularly of mirrors, the risk is that a computer-based analysis of the scene will mistake the image captured in the reflection for a real object.
It would therefore be advantageous to provide logic or a process that would enable a computerized vision system to distinguish real objects from their reflected images, to automatically detect reflecting surfaces in a captured scene, and, more specifically, to generate a spatial representation of those reflecting surfaces.
Summary of the invention
Some embodiments of the present invention provide methods and systems for identifying reflecting surfaces in a scene. The system may include a sensing device configured to capture the scene. The system may also include a storage device configured to store the three-dimensional positions of at least some of the objects in the scene. The system may further include a computer processor configured to attempt to obtain a reflecting-surface representation of one or more candidate surfaces selected from the surfaces in the scene. In a case that the attempted obtaining is successful, the computer processor is further configured to determine that the candidate reflecting surface is indeed a reflecting surface defined by the obtained surface representation. According to some embodiments of the present invention, in a case that the attempted calculation is unsuccessful, the identified part of the object is determined to be an object unrelated to the stored objects.
These, additional, and/or other aspects and/or advantages of the present invention are set forth in the detailed description that follows; are possibly inferable from the detailed description; and/or are learnable by practice of the present invention.
Brief description of the drawings
The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings, in which:
Fig. 1 is a block diagram illustrating a non-limiting exemplary system architecture in accordance with some embodiments of the present invention;
Fig. 2 is a high-level flowchart illustrating a non-limiting exemplary method in accordance with some embodiments of the present invention;
Fig. 3 is a ray diagram illustrating some optical-path aspects in accordance with some embodiments of the present invention; and
Fig. 4 is an exemplary captured image of a real scene illustrating several aspects in accordance with some embodiments of the present invention.
It will be appreciated that, for simplicity and clarity of illustration, elements shown in the figures are not necessarily drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
Detailed description
In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well-known features may be omitted or simplified in order not to obscure the present invention.
Unless specifically stated otherwise, as is apparent from the following discussions, it is appreciated that throughout the specification, discussions utilizing terms such as "processing", "computing", "calculating", or "determining" refer to the actions and/or processes of a computer or computing system, or a similar electronic computing device, that manipulates and/or transforms data represented as physical quantities (e.g., electronic quantities) within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers, or other such information storage, transmission, or display devices.
Fig. 1 is a block diagram illustrating an exemplary architecture on which embodiments of the present invention may be implemented. System 100 may include a sensing device 110 configured to capture a scene that may include objects (such as 14) and surfaces (such as 10). System 100 may also include a storage device 120 configured to maintain a database of the scene, the database storing the approximate positions of at least some of the objects and/or surfaces in the scene (including, for example, object 14). It is important to note that database 120 may also indicate which objects and surfaces are themselves reflective, so that known reflecting surfaces in the scene can be taken into account when backward ray tracing is subsequently carried out.
It should be noted that database 120 need not itself be 3D; it may in practice take the form of any data structure holding data from which the relative positions of the objects in the scene can be derived. Thus, the 3D positions of points need not actually be stored; for practical purposes it is sufficient to store data from which the 3D positions can be inferred. One non-limiting example is a depth map together with the position and angle from which that depth map was captured. Such a depth map does not directly provide the 3D positions of points, but the 3D positions can be inferred from it.
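As a non-limiting illustration of the point above — that a depth map plus its capture pose suffices to infer 3D positions — the following sketch back-projects a depth map into world coordinates. The pinhole camera model and all parameter names (`fx`, `fy`, `cx`, `cy`, `cam_to_world`) are illustrative assumptions, not details from the patent:

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, cam_to_world):
    """Back-project a depth map (pinhole model) into 3D world points,
    showing that a depth map plus the capture pose implies 3D positions."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))       # pixel coordinates
    x = (u - cx) * depth / fx                            # camera-frame X
    y = (v - cy) * depth / fy                            # camera-frame Y
    pts_cam = np.stack([x, y, depth, np.ones_like(depth)], axis=-1)  # homogeneous
    pts_world = pts_cam.reshape(-1, 4) @ cam_to_world.T  # apply capture pose
    return pts_world[:, :3]

# With an identity pose, a unit-depth map maps pixels straight onto the z=1 plane:
pts = depth_to_points(np.ones((2, 2)), 1.0, 1.0, 0.0, 0.0, np.eye(4))
```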
Once a reflecting surface has been so recognized, it is added to the database, so that it is available to the various applications that require awareness of the reflecting surfaces in the scene, or, at a minimum, so as to distinguish a reflecting surface from a "well" or other "recess" in an otherwise flat surface.
According to some embodiments, the stored data may also be a 3D model of an object — not necessarily an actual scan of the specific object, but a model of an object located in the room.
System 100 can also include computer processor 130, and computer processor is configured to attempt to obtain from scene Surface (for example, surface 10) selection one or more candidate surfaces reflecting surface indicate.It is successful attempting to obtain In the case of, computer processor 130 is additionally configured to determine that candidate reflecting surface 10 is strictly to indicate 134 by surface obtained The reflecting surface of restriction.Alternatively, in the case where attempting to calculate and being unsuccessful situation, determine that the identification part of object is and stores pair As unrelated new object 132, and it can be used as new entry and be added to storage device 120.
According to some embodiments of the present invention, the candidate reflecting surfaces in the scene are identified based on the sensed image prior to the attempt, and the attempt is then performed on the identified candidates.
According to some embodiments of the present invention, the awareness and representation of a reflecting surface may also be used to re-analyze previously examined surfaces, because a newly found surface may change the (probabilistic) understanding of surfaces already analyzed in the scene. An iterative process of refining and verifying the data related to the reflecting surfaces in the scene is therefore performed.
According to some embodiments of the present invention, identifying a candidate reflecting surface is performed by recognizing at least part of one of the objects stored on the database, wherein the recognized part of the object is not located at the position associated with the stored object. For example, some features of image 12 are recognized as being similar to object 14 (which is registered in database 120 and whose approximate position in the scene is known), up to some spatial tilt or translation.
According to some embodiments of the present invention, identifying a candidate reflecting surface is performed by recognizing a 3D depth pattern indicative of a reflecting surface. More specifically, the primary "suspicion" that a surface is a reflecting surface is that, based on image-analysis depth estimation, it resembles a distinct recess or "well" bounded by otherwise flat surfaces. A reflecting surface or mirror gives rise to a depth impression similar to such a recess, and the reflecting surface is distinguished from a true recess by analyzing the objects appearing in the suspect surface and back-tracing rays from them. Were the candidate a true recess in a concrete surface, the objects would be located at their "true" positions; only in the case of reflection is the real object at a different position, with the sensing device actually pointed at the image of the real object.
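A minimal sketch of how "well"-like depth discontinuities might be flagged as candidate regions follows. The fixed jump threshold and the single-axis gradient are illustrative assumptions only — the patent does not specify this test:

```python
import numpy as np

def find_well_candidates(depth, jump=0.5):
    """Flag pixels where depth jumps sharply relative to the left neighbour —
    the recessed regions that may be either real wells or reflecting surfaces."""
    dz = np.abs(np.diff(depth, axis=1))      # horizontal depth gradient
    mask = np.zeros_like(depth, dtype=bool)
    mask[:, 1:] = dz > jump                  # mark sharp depth steps
    return mask
```

Such a mask only nominates candidates; per the description above, telling a mirror from a true recess still requires back-tracing rays against known object positions.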
According to some embodiments of the present invention, in a case that the candidate reflecting surface is determined to be a reflecting surface, the computer processor is further configured to generate, based on the reflecting-surface representation, a virtual image of a virtual object located in the scene.
According to some embodiments of the present invention, the requirement for the 3D positions of the aforementioned objects may be replaced by relative images, wherein at least one image serves as an anchor. In this way, a bounded volume within which the reflecting surface is located may be derived. This is achieved by applying backward ray tracing of the aforementioned light rays. In such a case, an exact surface representation is not always fully derived; instead, some tolerance or volume range is provided for its position, which may still be beneficial for many applications. For example, in a path-planning application it is sometimes sufficient to know that a particular region of the scene is bounded, without needing the exact position of the bounding (reflecting) surface.
Fig. 2 is a high-level flowchart illustrating a method 200 for identifying reflecting surfaces such as mirrors, as well as other planar and non-planar reflecting surfaces. Method 200 may include sensing at least one image of a scene containing surfaces and objects (step 210); concurrently, the method may maintain a three-dimensional database of the scene storing the three-dimensional positions of at least some of the objects (step 220). Then, in an iterative sub-process, method 200 attempts to obtain a reflecting-surface representation of one or more candidate surfaces selected from the surfaces in the scene (step 230). A check is performed as to whether the attempt was successful (step 240). Then, in a case that the attempted obtaining is successful, the candidate reflecting surface is determined to be a reflecting surface defined by the obtained surface representation (step 250). According to some embodiments, the surface representation is realized by a numerical approximation of the surface equation. It should be noted that the scene may already contain known mirrors, so the computation for new potential reflecting surfaces may take them into account and may store them in the database.
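The control flow of method 200 (steps 230 through 260) can be sketched at a high level as follows. `try_obtain_representation` and the database interface are placeholder assumptions standing in for the attempt of step 230 and storage device 120; this is a sketch of the claimed flow, not an implementation of it:

```python
def identify_reflecting_surfaces(candidates, try_obtain_representation, db):
    """For each candidate surface: try to derive a reflecting-surface
    representation (step 230/240); on success record a reflecting surface
    (step 250), on failure record a new, unrelated object (step 260)."""
    mirrors, new_objects = [], []
    for cand in candidates:
        rep = try_obtain_representation(cand)   # e.g. backward ray tracing
        if rep is not None:                      # attempt succeeded
            mirrors.append((cand, rep))
            db.add_surface(cand, rep)            # known mirrors aid later passes
        else:                                    # attempt failed
            new_objects.append(cand)
            db.add_object(cand)                  # store as a new object
    return mirrors, new_objects
```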
According to some embodiments of the present invention, in a case that the attempted calculation is unsuccessful, the identified part of the object is determined to be an object unrelated to the stored objects (step 260).
According to some embodiments of the present invention, the candidate reflecting surfaces in the scene are identified based on the sensed image prior to the attempt, and the attempt is performed on the identified candidates.
According to some embodiments of the present invention, identifying a candidate reflecting surface is performed by recognizing at least part of one of the objects stored on the database, wherein the recognized part of the object is not located at the position associated with the stored object.
According to some embodiments of the present invention, the 3D depth pattern indicative of a reflecting surface comprises a sharp depth step bounding a well.
According to some embodiments of the present invention, in a case that the candidate reflecting surface is determined to be a reflecting surface, reflectivity parameters of the reflecting surface are derived by applying image-processing algorithms to the identified part of the object and to the corresponding object stored in the database.
According to some embodiments of the present invention, the reflectivity parameters further include identifying non-reflective portions of the reflecting surface.
According to some embodiments of the present invention, the reflectivity parameters include the level and the type of reflectivity.
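One hedged illustration of deriving a crude reflectivity level by comparing the reflected appearance of an object with its stored direct appearance — a simple intensity ratio. A real system would additionally need registration and lighting compensation; none of the specifics below come from the patent:

```python
import numpy as np

def estimate_reflectivity(reflected_patch, direct_patch):
    """Rough scalar reflectivity level: mean intensity ratio between an
    object's observed reflection and its directly-observed appearance."""
    r = np.asarray(reflected_patch, dtype=float)
    d = np.asarray(direct_patch, dtype=float)
    ratio = np.where(d > 0, r / np.maximum(d, 1e-9), 0.0)  # avoid divide-by-zero
    return float(np.clip(ratio.mean(), 0.0, 1.0))          # clamp to [0, 1]
```

A per-pixel version of the same ratio could likewise flag non-reflective portions of the surface (ratio near zero), in the spirit of the embodiment above.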
According to some embodiments of the present invention, in a case that the candidate reflecting surface is determined to be a reflecting surface, a virtual image of a virtual object located in the scene is generated based on the reflecting-surface representation. Moreover, the reflection characteristics derived through the analysis are used to generate a photorealistic image of the virtual object integrated into the scene.
Fig. 3 is a ray diagram 300 illustrating some optical-path aspects of embodiments of the present invention. Specifically, a sensor array 50 of a sensing device is shown in cross-section, with a portion 60 of the projection representing a suspected image of an object in the scene. In attempting to locate a reflecting-surface representation for the suspected image of a real object 70 whose position in the scene is known, a ray 310 may be back-traced from the sensor array through the focal point 40 of the sensing device to a potential reflecting surface 330B and on to real object 70, assuming the law of reflection is obeyed with respect to the surface normal 340B of potential reflecting surface 330B. The process may be repeated iteratively for another possible reflecting surface 330A with surface normal 340A.
The above process is used to map the candidate reflecting surface: with the known position of real object 70, the surface normals, and the law of reflection serving as constraints, the potential reflecting surface is gradually generated from surfaces such as 330A.
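The backward-ray-tracing constraint illustrated in Fig. 3 can be sketched as a consistency test: a hypothesized mirror plane is accepted only if mirroring the known real object across it lands on the back-traced camera ray. This is a simplified planar sketch under assumed names, not the patent's full iterative procedure:

```python
import numpy as np

def mirror_point(p, plane_point, n):
    """Reflect point p across the plane through plane_point with normal n."""
    n = n / np.linalg.norm(n)
    d = np.dot(p - plane_point, n)       # signed distance to the plane
    return p - 2.0 * d * n

def consistent_with_reflection(cam, ray_dir, real_obj, plane_point, n, tol=1e-6):
    """A hypothesized mirror plane is consistent if the virtual image of the
    real object lies on the back-traced camera ray (law of reflection)."""
    virt = mirror_point(real_obj, plane_point, n)
    v = virt - cam
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    # virt must lie on the ray: near-zero cross product, in front of the camera
    return bool(np.linalg.norm(np.cross(v, ray_dir)) < tol and np.dot(v, ray_dir) > 0)
```

Iterating this test over candidate plane positions and normals, keeping only the consistent ones, corresponds to the "gradual generation" of the potential reflecting surface described above.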
Fig. 4 is an exemplary captured image of a real scene illustrating several aspects of embodiments of the present invention. Because some objects, such as lamp 430A, are detected as images such as 430B, the scene appears to contain a flat mirror 410. In addition, other objects not captured in the image, such as picture 420 (depicting an eye), may be stored in the database with their exact 3D positions. When attempting to derive the representation of the reflecting surface, backward ray tracing is performed as described above from picture 420 to the sensor array of the camera's sensing device.
Once the reflecting surface has been derived as a numerical approximation of the surface, or as the plane equation of the mirror, its representation can be used to reflect the images of virtual objects introduced into the scene. For example, a cylinder 450A may be introduced into the scene as an augmented-reality object. To make the user perceive the virtual object more realistically, its respective reflection 450B is generated while obeying the law of reflection for the detected reflecting surface, as well as its other optical properties.
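The "numerical approximation of the surface" mentioned above could, for a planar mirror, be obtained by a least-squares plane fit to back-traced surface points. This SVD-based fit is an illustrative sketch of one standard way to do it, not the claimed computation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD: returns (centroid, unit normal),
    a numerical approximation of the mirror's plane equation from points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]  # normal = direction of least variance
```

The resulting (point, normal) pair defines the plane equation n·(x − c) = 0, which is exactly the representation a renderer needs to mirror a virtual object such as cylinder 450A.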
In the above description, embodiment is example or realization of the invention." embodiment ", " embodiment " or The various appearances of " some embodiments " are not necessarily all referring to identical embodiment.
Although various features that can be of the invention in the described in the text up and down of single embodiment, feature can also be single Solely or in any suitable combination provide.On the contrary, although for the sake of clarity the present invention can be in individual embodiment Upper and lower described in the text, but the present invention can also realize in single embodiment.
To " some embodiments ", " embodiment ", " embodiment " or " other embodiments " in specification Reference means that a particular feature, structure, or characteristic for combining embodiment description is included at least some embodiments but not It must be all of the embodiments of the present invention.
It should be understood that wording as used herein and term are not necessarily to be construed as limiting, and it is only used for descriptive mesh 's.
With reference to it is appended description, drawings and embodiments may be better understood present invention teach that principle and purposes.
It should be understood that details set forth herein does not explain the limitation to application of the invention.
Further, it is understood that the present invention can be practiced or carried out in various ways, and the present invention can divided by It is realized in embodiment except the embodiment summarized in upper description.
Should be understood that term " includes ", "comprising", " by ... form " and its grammatical variants be not excluded for addition one A or multiple components, feature, step or its entirety or group, and term should be interpreted assignment component, feature, step or whole Body.
If specification or claim are related to " adding " element, more than one add ons are not excluded the presence of.
It should be understood that in the case where claim or specification mention "one" or "one" element, it is such With reference to not being interpreted that there is only one in the element.
It, " can with " or " can it should be understood that in specification Statement component, feature, structure or characteristic " possibility " " perhaps " In the case that energy " is included, do not need to include specific component, feature, structure or characteristic.
Under applicable circumstances, although state diagram, flow chart or both can be used to describe embodiment, this hair It is bright to be not limited to these figures or corresponding description.For example, process does not need the frame or state by each diagram, or with diagram It is mobile with the identical sequence of sequence of description.
Method of the invention can execute or complete selected step or task by manual, automatic or combinations thereof come real It is existing.
Description, example, method and the material presented in claims and specification be not necessarily to be construed as it is restrictive, And it is merely illustrative.
Unless otherwise defined, the meaning of technical and scientific terms used herein is usually by of the art common Technical staff is generally understood.
The present invention can in test or practice with method described herein and material be equivalent or similar method and material Material is to implement.
Although the embodiment for being directed to limited quantity describes the present invention, these are not necessarily to be construed as to this hair The limitation of bright range, but the example as some preferred embodiments.Other possible variations modify and apply also at this In the range of invention.Therefore, the scope of the present invention should not be limited by content described so far, but by appended power Benefit requires and its legal equivalents limit.

Claims (18)

1. A method of identifying a reflecting surface in a scene, comprising:
sensing at least one image of a scene containing surfaces and objects;
maintaining a database of the objects of the scene, the database storing the positions of at least parts of some of the objects in the scene;
attempting to obtain a reflecting-surface representation of one or more candidate reflecting surfaces selected from the surfaces in the scene, by back-tracing, via backward ray tracing, at least one optical path from an identified part of at least one of the objects in the scene to the stored position associated with the part of the object,
wherein, in a case that the attempted obtaining is successful, determining that at least one candidate reflecting surface is a reflecting surface defined by the obtained reflecting-surface representation, and
generating an image of an object introduced into the scene whose reflection is represented by the reflecting surface.
2. according to the method described in claim 1, wherein, the numerical value by the surface equation based on the backward ray tracing is close Realize that the reflecting surface is indicated like value.
3. according to the method described in claim 1, wherein, before the trial, being identified based on the image sensed described The candidate reflecting surface in scene, and wherein, the trial is executed to the candidate identified.
4. according to the method described in claim 1, wherein, storing one of described object on the database at least by identification A part executes the identification to the candidate reflecting surface, wherein the part of the object recognized is not located at and is deposited At the associated position of the object of storage.
5. according to the method described in claim 4, wherein, indicating the 3D depth pattern of the reflecting surface by identification to execute Identification to the candidate reflecting surface.
6. according to the method described in claim 5, wherein, indicating that the 3D depth pattern of the reflecting surface includes that boundary is specific Well.
7. according to the method described in claim 1, wherein, in the case where the calculating attempted is unsuccessful situation, it is described right to determine The identification part of elephant is the object unrelated with the object stored.
8. according to the method described in claim 7, the object newly recognized is also added to database.
9. according to the method described in claim 1, wherein, in the feelings that the candidate reflecting surface is determined as to the reflecting surface Under condition, by the way that image processing techniques to be applied to the identification part of the object and is stored in corresponding on the database Object exports the albedo parameters of the reflecting surface.
10. The method according to claim 9, wherein the reflectivity parameters further include an identification of non-reflective portions of the reflective surface.
11. The method according to claim 9, wherein the reflectivity parameters include a level and a type of reflectivity.
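A crude illustration of the reflectivity parameters of claims 9–11: compare the brightness of the object portion as seen in the candidate surface with its stored, directly observed appearance, yielding a reflectivity level and a coarse type. This toy sketch uses assumed grayscale patches and an arbitrary classification threshold; the claims do not specify any particular image-processing technique.

```python
def estimate_reflectivity(reflected_patch, stored_patch):
    """Crude reflectivity estimate: ratio of the mean brightness of the
    patch seen in the reflection to the mean brightness of the stored
    (directly observed) appearance of the same object portion.
    Both patches are flat lists of grayscale values in [0, 255]."""
    mean_reflected = sum(reflected_patch) / len(reflected_patch)
    mean_stored = sum(stored_patch) / len(stored_patch)
    level = mean_reflected / mean_stored
    # Coarse type: near 1.0 suggests a specular mirror; much lower values
    # suggest a partially reflective surface such as window glass.
    kind = "mirror" if level > 0.8 else "partial"
    return level, kind

level, kind = estimate_reflectivity([90, 100, 110], [180, 200, 220])
print(round(level, 2), kind)  # 0.5 partial
```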
12. The method according to claim 1, wherein, in a case where the candidate reflective surface is determined to be a reflective surface, a virtual image of a virtual object located in the scene is rendered based on the reflective-surface representation.
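The rendering step of claim 12 can be illustrated, again for a planar surface representation, by mirroring a virtual object's position across the identified plane to place its reflection. The plane, normal, and object coordinates below are made-up example values.

```python
def mirror_across_plane(p, plane_pt, n):
    """Mirror a 3D point across the plane through plane_pt with unit normal n."""
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_pt, n))
    return tuple(pi - 2 * d * ni for pi, ni in zip(p, n))

# A virtual object placed at z = 0.5 in front of a detected mirror
# occupying the plane z = 2: its reflection is rendered at z = 3.5.
plane_pt, plane_n = (0.0, 0.0, 2.0), (0.0, 0.0, 1.0)
virtual_obj = (0.3, 1.0, 0.5)
print(mirror_across_plane(virtual_obj, plane_pt, plane_n))  # (0.3, 1.0, 3.5)
```

A renderer would draw the virtual object normally and additionally draw its mirrored copy clipped to the identified surface, attenuated by the reflectivity parameters of claims 9–11.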
13. A system for identifying reflective surfaces in a scene, comprising:
a camera configured to sense at least one image of a scene containing surfaces and objects;
a memory configured to maintain a database of the scene, the database storing positions of at least portions of some of the objects in the scene; and
a computer processor configured to attempt, by backward ray tracing from a recognized portion of at least one of the objects in the scene, to obtain a reflective-surface representation of one or more candidate reflective surfaces selected from the surfaces in the scene, based on at least one optical path from the scene to the stored position associated with the portion of the object;
wherein, in a case where the attempted obtaining is successful, the computer processor is further configured to determine that the candidate reflective surface is a reflective surface defined by the obtained surface representation; and
wherein the computer processor is further configured to generate an image of an object in the scene to be introduced as a reflection by the reflective-surface representation.
14. The system according to claim 13, wherein, prior to the attempting, the candidate reflective surfaces in the scene are identified based on the sensed image, and wherein the attempting is performed on the identified candidates.
15. The system according to claim 14, wherein the identification of the candidate reflective surfaces is performed by recognizing at least a portion of one of the objects stored on the database, wherein the recognized portion of the object is not located at a position associated with the stored object.
16. The system according to claim 14, wherein the identification of the candidate reflective surfaces is performed by identifying a 3D depth pattern indicative of a reflective surface.
17. The system according to claim 13, wherein, in a case where the attempted calculation is unsuccessful, the recognized portion of the object is determined to be an object unrelated to the objects stored.
18. The system according to claim 13, wherein, in a case where the candidate reflective surface is determined to be the reflective surface, the computer processor is further configured to generate a virtual image of a virtual object located in the scene based on the reflective-surface representation.
CN201680056908.1A 2015-10-01 2016-09-27 Method and system for identifying reflective surfaces in a scene Active CN108140255B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/872,160 2015-10-01
US14/872,160 US10049303B2 (en) 2015-10-01 2015-10-01 Method and a system for identifying reflective surfaces in a scene
PCT/IL2016/051062 WO2017056089A2 (en) 2015-10-01 2016-09-27 Method and a system for identifying reflective surfaces in a scene

Publications (2)

Publication Number Publication Date
CN108140255A CN108140255A (en) 2018-06-08
CN108140255B true CN108140255B (en) 2019-09-10

Family

ID=58422968

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680056908.1A Active CN108140255B (en) Method and system for identifying reflective surfaces in a scene

Country Status (3)

Country Link
US (3) US10049303B2 (en)
CN (1) CN108140255B (en)
WO (1) WO2017056089A2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108780228B * 2016-01-19 2021-04-20 Magic Leap, Inc. Augmented reality system and method using images
US9868212B1 (en) * 2016-02-18 2018-01-16 X Development Llc Methods and apparatus for determining the pose of an object based on point cloud data
KR102522502B1 2016-04-26 2023-04-17 Magic Leap, Inc. Electromagnetic tracking with augmented reality systems
US10594920B2 (en) * 2016-06-15 2020-03-17 Stmicroelectronics, Inc. Glass detection with time of flight sensor
CN108416837A * 2018-02-12 2018-08-17 Tianjin University Three-dimensional vector database modeling method for ray tracing
KR20200018207A * 2018-08-10 2020-02-19 Electronic Arts Inc. Systems and methods for rendering reflections
US11043025B2 (en) 2018-09-28 2021-06-22 Arizona Board Of Regents On Behalf Of Arizona State University Illumination estimation for captured video data in mixed-reality applications
US10839560B1 (en) * 2019-02-26 2020-11-17 Facebook Technologies, Llc Mirror reconstruction
US11462000B2 (en) * 2019-08-26 2022-10-04 Apple Inc. Image-based detection of surfaces that provide specular reflections and reflection modification

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1630892A * 2002-02-12 2005-06-22 NHK Spring Co., Ltd. Identifying medium and identifying method for object

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7973790B2 (en) 2005-08-11 2011-07-05 Realtime Technology Ag Method for hybrid rasterization and raytracing with consistent programmable shading
US8432395B2 (en) * 2009-06-16 2013-04-30 Apple Inc. Method and apparatus for surface contour mapping
US20150185857A1 (en) 2012-06-08 2015-07-02 Kmt Global Inc User interface method and apparatus based on spatial location recognition
US9600927B1 (en) * 2012-10-21 2017-03-21 Google Inc. Systems and methods for capturing aspects of objects using images and shadowing
US9874749B2 (en) * 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US9035946B1 (en) 2014-02-13 2015-05-19 Raycast Systems, Inc. Computer hardware architecture and data structures for triangle binning to support incoherent ray traversal
US10210628B2 (en) 2014-03-03 2019-02-19 Mitsubishi Electric Corporation Position measurement apparatus for measuring position of object having reflective surface in the three-dimensional space

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1630892A * 2002-02-12 2005-06-22 NHK Spring Co., Ltd. Identifying medium and identifying method for object

Also Published As

Publication number Publication date
US10049303B2 (en) 2018-08-14
WO2017056089A3 (en) 2017-07-27
US20180357516A1 (en) 2018-12-13
CN108140255A (en) 2018-06-08
US20190370612A1 (en) 2019-12-05
US10719740B2 (en) 2020-07-21
US10395142B2 (en) 2019-08-27
WO2017056089A2 (en) 2017-04-06
US20170098139A1 (en) 2017-04-06

Similar Documents

Publication Publication Date Title
CN108140255B (en) Method and system for identifying reflective surfaces in a scene
US9303525B2 (en) Method and arrangement for multi-camera calibration
JP5024067B2 (en) Face authentication system, method and program
CN113196296A (en) Detecting objects in a crowd using geometric context
MacLean et al. Recovery of egomotion and segmentation of independent object motion using the EM-algorithm
CN110998658A (en) Depth map using structured light and flood light
Takimoto et al. 3D reconstruction and multiple point cloud registration using a low precision RGB-D sensor
US9990710B2 (en) Apparatus and method for supporting computer aided diagnosis
CN104662561A (en) Skin-based user recognition
CN108876835A (en) Depth information detection method, device and system and storage medium
CN103562934A (en) Face location detection
US10474232B2 (en) Information processing method, information processing apparatus and user equipment
Canessa et al. A dataset of stereoscopic images and ground-truth disparity mimicking human fixations in peripersonal space
Wang et al. Accuracy of monocular gaze tracking on 3d geometry
CN108475434A Method and system for determining radiation source characteristics in a scene based on shadowing analysis
Governi et al. Improving surface reconstruction in shape from shading using easy-to-set boundary conditions
EP2120206A1 (en) Object shape generating method, object shape generating device and program
CN103268474A (en) Three-dimensional scanning imaging device of mobile phone or tablet personal computer
US10048752B2 (en) Information processing method, information processing apparatus and user equipment
Hirofuji et al. 3D reconstruction of specular objects with occlusion: A shape-from-scattering approach
Rodríguez A methodology to develop computer vision systems in civil engineering: Applications in material testing and fish tracking
Burns Matching two-dimensional images to multiple three-dimensional objects using view description networks
Körtgen Robust automatic registration of range images with reflectance
US20100054608A1 (en) Surface Extraction Method, Surface Extraction Device, and Program
CN117456396A (en) Infrared light spot identification method, device, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200131

Address after: 3/F, No. 20, Galia Rehapura, Herzliya, Israel

Patentee after: Alibaba (Israel) Technology Co.,Ltd.

Address before: Petah Tikva, Israel

Patentee before: Infinity Augmented Reality Israel Ltd.

TR01 Transfer of patent right

Effective date of registration: 20230410

Address after: Room 516, floor 5, building 3, No. 969, Wenyi West Road, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: Alibaba Dharma Institute (Hangzhou) Technology Co.,Ltd.

Address before: 3/F, No. 20, Galia Rehapura, Herzliya, Israel

Patentee before: Alibaba (Israel) Technology Co.,Ltd.