CN206946068U - Vision sensing device for an unmanned machine and unmanned machine having the same - Google Patents
- Publication number
- CN206946068U CN206946068U CN201720800827.XU CN201720800827U CN206946068U CN 206946068 U CN206946068 U CN 206946068U CN 201720800827 U CN201720800827 U CN 201720800827U CN 206946068 U CN206946068 U CN 206946068U
- Authority
- CN
- China
- Prior art keywords
- image acquisition device
- unmanned machine
- support
- sensing device
- image
- Prior art date
- 2017-07-04
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
This application discloses a vision sensing device for an unmanned machine, and an unmanned machine having the same. The vision sensing device includes: a support; four image acquisition devices arranged on the support, two on each side, wherein a first image acquisition device and a second image acquisition device are arranged on a first side of the support, and a third image acquisition device and a fourth image acquisition device are arranged on a second side of the support; and a control unit connected to each of the four image acquisition devices, the control unit being configured to receive the images acquired by the four image acquisition devices and to determine the distance between a target location and the vision sensing device from those images. The device can thus perform wide-angle depth vision measurement with a simple structure and at low cost.
Description
Technical field
The application relates to the field of unmanned technology, and more particularly to a vision sensing device for an unmanned machine and to an unmanned machine having such a device.
Background art
With the rise of robotics and unmanned driving, enabling robots and vehicles to perceive the world around them has become a hot topic in both academic research and industry innovation. The purpose of perception is to let devices such as robots and unmanned vehicles know where they are and understand their surroundings, providing the basis for the device's subsequent actions. In the related art, the main sensing solutions in the robotics and unmanned-driving fields fall into two broad classes: lidar and vision. Lidar uses a laser as its light source and performs remote measurement by emitting laser light and detecting the returned light signal; it can be used for mapping, localization and obstacle avoidance. Vision refers to cameras, which are widely used in scenarios such as object recognition and object tracking.
However, the related art has the following problems. First, lidar yield is low and manufacturing cost is high, and in order to cover a wide angular range a rotating mechanical part must be provided in the lidar, which limits its service life. Second, a monocular or binocular camera cannot acquire a panoramic image, while a circular multi-camera layout cannot be standardized across different platforms, making the amount and difficulty of inter-camera calibration enormous.
Therefore, the vision measurement technology of unmanned machines needs to be improved.
Summary
The application is intended to solve, at least to some extent, one of the technical problems in the related art. Accordingly, one object of the application is to propose a vision sensing device for an unmanned machine, and an unmanned machine having the same, capable of wide-range vision measurement with a simple structure and low cost.
A further object is to propose an unmanned machine.
To achieve the above objects, an embodiment of one aspect of the application proposes a vision sensing device for an unmanned machine, including: a support; four image acquisition devices arranged on the support, two on each side of the support, wherein a first image acquisition device and a second image acquisition device are arranged on a first side of the support, and a third image acquisition device and a fourth image acquisition device are arranged on a second side of the support; and a control unit connected to each of the four image acquisition devices, the control unit being configured to receive the images acquired by the four image acquisition devices and to determine the distance between a target location and the vision sensing device from those images.
According to the vision sensing device of the unmanned machine proposed by the embodiments of the application, the four image acquisition devices are arranged on the two sides of the support, and the control unit receives the images they acquire and determines the distance between a target location and the vision sensing device from those images. The embodiments of the application can therefore perform wide-angle depth vision measurement with a simple device structure at low cost.
According to one embodiment of the application, the first side of the support may be the front side of the support, and the second side of the support may be the rear side of the support.
According to one embodiment of the application, the first image acquisition device and the second image acquisition device are respectively arranged at a first end and a second end of the first side of the support; and the third image acquisition device and the fourth image acquisition device are respectively arranged at a first end and a second end of the second side of the support.
According to one embodiment of the application, the horizontal viewing angle of the lens of each of the four image acquisition devices is greater than 180 degrees.
According to one embodiment of the application, the four image acquisition devices may be fisheye lenses.
According to one embodiment of the application, the support may be a cuboid or a cylinder.
According to one embodiment of the application, the distance to the target location may be determined from the images acquired by the four image acquisition devices by triangulation.
To achieve the above objects, an embodiment of another aspect of the application proposes an unmanned machine including the vision sensing device of the unmanned machine described above.
According to the unmanned machine proposed by the embodiments of the application, wide-angle depth vision measurement can be performed by the vision sensing device of the unmanned machine, with a simple device structure and low cost.
According to one embodiment of the application, the vision sensing device of the unmanned machine is placed vertically or horizontally.
According to one embodiment of the application, the unmanned machine is a robot or a self-driving vehicle.
Brief description of the drawings
Fig. 1 is a block diagram of a vision sensing device of an unmanned machine according to an embodiment of the application;
Fig. 2 is a front view of the vision sensing device of the unmanned machine according to one embodiment of the application;
Fig. 3 is a top view of the vision sensing device of the unmanned machine according to one embodiment of the application;
Fig. 4 is a side view of the vision sensing device of the unmanned machine according to one embodiment of the application;
Fig. 5 is a front view of the vision sensing device of the unmanned machine according to another embodiment of the application;
Fig. 6 is a top view of the vision sensing device of the unmanned machine according to another embodiment of the application;
Fig. 7 is a block diagram of an unmanned machine according to an embodiment of the application.
Detailed description of the embodiments
Embodiments of the application are described in detail below, and examples of the embodiments are shown in the drawings, in which the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the drawings are exemplary and are intended to explain the application; they should not be construed as limiting the application.
The vision sensing device of the unmanned machine and the unmanned machine of the embodiments of the application are described below with reference to the drawings.
Fig. 1 is a block diagram of a vision sensing device of an unmanned machine according to an embodiment of the application.
As shown in Fig. 1, the vision sensing device of the unmanned machine of the embodiment of the application includes: a support 10, four image acquisition devices (namely a first image acquisition device 21, a second image acquisition device 22, a third image acquisition device 23 and a fourth image acquisition device 24), and a control unit 30.
The four image acquisition devices are arranged on the support 10, two on each side of the support 10: the first image acquisition device 21 and the second image acquisition device 22 are arranged on a first side of the support 10, and the third image acquisition device 23 and the fourth image acquisition device 24 are arranged on a second side of the support 10. The control unit 30 is connected to each of the four image acquisition devices; it receives the images acquired by the four image acquisition devices and determines the distance between a target location and the vision sensing device from those images.
It should be noted that arranging the first image acquisition device 21 and the second image acquisition device 22 on the first side of the support 10 means that vision measurement on the first side of the support 10 is performed by the first image acquisition device 21 and the second image acquisition device 22; likewise, arranging the third image acquisition device 23 and the fourth image acquisition device 24 on the second side of the support 10 means that vision measurement on the second side of the support 10 is performed by the third image acquisition device 23 and the fourth image acquisition device 24. In other words, each of the first side and the second side of the support 10 is measured by its own pair of image acquisition devices.
Specifically, the first image acquisition device 21 and the second image acquisition device 22 are arranged on the first side of the support 10, the third image acquisition device 23 and the fourth image acquisition device 24 are arranged on the second side of the support 10, and the control unit 30 receives the images acquired by the four image acquisition devices and determines the distance to the target location from those images.
Thus, the vision sensing device of the unmanned machine can perform wide-angle depth vision measurement on both sides of the support, with a simple device structure and low cost. This solves the problem that a single camera cannot measure object depth and that a pair of cameras can only acquire images in a single direction, while also dispensing with the rotating mechanical part built into a lidar, effectively saving cost.
According to one embodiment of the application, as shown in Fig. 4 and Fig. 6, the first side of the support 10 is the front side of the support 10, and the second side of the support 10 is the rear side of the support 10.
It should be noted that the front side and the rear side of the support 10 correspond respectively to the front and the rear of the unmanned machine along its direction of travel, so as to ensure vision measurement during the machine's main movements (moving forward and backward). In other words, the orientation of the vision sensing device can be aligned with the direction of travel of the unmanned machine. For example, when the unmanned machine is a robot, the direction the robot's front faces can be the front side of the support 10 and the direction its back faces can be the rear side of the support 10; similarly, when the unmanned machine is a self-driving vehicle, the direction the vehicle head faces can be the front side of the support 10 and the direction the vehicle tail faces can be the rear side of the support 10. By arranging two image acquisition devices on each of the front side and the rear side of the support 10, the coverage of visual sensing is ensured and its accuracy is improved.
According to one embodiment of the application, the first image acquisition device 21 and the second image acquisition device 22 are respectively arranged at a first end and a second end of the first side of the support 10; and the third image acquisition device 23 and the fourth image acquisition device 24 are respectively arranged at a first end and a second end of the second side of the support.
It should be noted that the distance between the first end and the second end of the support 10 can be adjusted according to the application scenario, so that the visual ranges of the image acquisition devices arranged at the two ends overlap as much as possible over the region to be measured.
According to one embodiment of the application, the horizontal viewing angle of the lens of each of the four image acquisition devices is greater than 180 degrees.
Specifically, according to one embodiment of the application, the four image acquisition devices are fisheye lenses.
Specifically, when the vision sensing device is placed vertically (as shown in Fig. 2), then, as shown in Fig. 3, the second image acquisition device 22 can obtain images of visual regions C, A and D, and the fourth image acquisition device 24 can obtain images of visual regions C, B and D; that is, the second image acquisition device 22 and the fourth image acquisition device 24 can both obtain images of visual regions C and D.
When the vision sensing device is placed horizontally (as shown in Fig. 5), then, as shown in Fig. 6, the first image acquisition device 21 and the second image acquisition device 22 can obtain images of visual regions C, A and D, and the third image acquisition device 23 and the fourth image acquisition device 24 can obtain images of visual regions C, B and D; that is, the first image acquisition device 21 and the third image acquisition device 23 can both obtain images of visual region C, and the second image acquisition device 22 and the fourth image acquisition device 24 can both obtain images of visual region D.
In other words, by making the horizontal viewing angle of each camera lens greater than 180 degrees, the vision measurement range of the vision sensing device can be extended to a full 360 degrees. Visual regions E and F are the blind spots of the four image acquisition devices; since these regions lie to the sides of the vision sensing device and are very close to it, visual regions E and F do not affect the vision measurement performed by the device. The extent of visual regions E and F is determined by the horizontal viewing angle of the camera lenses.
According to one embodiment of the application, the support 10 is a cuboid or a cylinder, so that each pair of adjacent image acquisition devices among the four shares a common visual region, on which vision measurement can then be performed.
According to one embodiment of the application, the distance to the target location is determined from the images acquired by the four image acquisition devices by triangulation.
It should be noted that, as shown in Fig. 4 and Fig. 6, a measured point X can be imaged by the first image acquisition device 21 and also by the second image acquisition device 22; that is, the first image acquisition device 21 and the second image acquisition device 22 can acquire images of the measured point X at the same time, and the distance between the measured point X and the vision sensing device is calculated by triangulation. Likewise, the measured point X may lie anywhere in a region that two adjacent image acquisition devices can both image; in other words, the four image acquisition devices can measure any point in visual regions A, B, C and D, and the distance between the measured point X and the vision sensing device is determined from the images acquired by the four image acquisition devices. The measured point X can be the target location of the vision measurement.
It should be understood that there can be multiple measured points X, i.e. the control unit 30 determines the distances between multiple measured points X and the vision sensing device from the images acquired by the image acquisition devices. The control unit 30 can further use the distances between the multiple measured points X and the vision sensing device to run VO (visual odometry) algorithms and SLAM (Simultaneous Localization and Mapping) algorithms, and to perform deep learning to recognize objects in the visual regions of the vision sensing device.
In summary, according to the vision sensing device of the unmanned machine proposed by the embodiments of the application, the four image acquisition devices are arranged on the two sides of the support, and the control unit receives the images acquired by the four image acquisition devices and determines the distance between a target location and the vision sensing device from those images. The embodiments of the application can therefore perform wide-angle depth vision measurement with a simple device structure at low cost.
The embodiments of the application also propose an unmanned machine.
Fig. 7 is a block diagram of an unmanned machine according to an embodiment of the application.
As shown in Fig. 7, the unmanned machine 200 includes the vision sensing device 100 of the unmanned machine described above.
According to one embodiment of the application, the vision sensing device 100 of the unmanned machine is placed vertically or horizontally; that is, the vision sensing device 100 of the unmanned machine can be mounted vertically on the unmanned machine or horizontally on the unmanned machine, but is not limited to these orientations.
According to one embodiment of the application, the unmanned machine 200 can be a robot or a self-driving vehicle.
According to the unmanned machine proposed by the embodiments of the application, wide-angle depth vision measurement can be performed by the vision sensing device of the unmanned machine, with a simple device structure and low cost.
In the description of the application, it is to be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial" and "circumferential", are based on the orientations or positional relationships shown in the drawings. They are used only to facilitate and simplify the description of the application, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they therefore cannot be construed as limiting the application.
In addition, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the application, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
In the application, unless otherwise expressly specified and limited, the terms "mounted", "connected", "coupled", "fixed" and the like are to be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediary; and it may be a communication between the interiors of two elements or an interaction between two elements, unless otherwise expressly limited. For those of ordinary skill in the art, the specific meanings of the above terms in the application can be understood according to the particular circumstances.
In the application, unless otherwise expressly specified and limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediary. Moreover, a first feature being "on", "above" or "over" a second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature; and a first feature being "under", "below" or "beneath" a second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" means that a particular feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, provided they do not conflict with each other, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of the different embodiments or examples.
Although the embodiments of the application have been shown and described above, it is to be understood that the above embodiments are exemplary and cannot be construed as limiting the application; those of ordinary skill in the art may make changes, modifications, substitutions and variations to the above embodiments within the scope of the application.
Claims (10)
- 1. A vision sensing device of an unmanned machine, characterized by including: a support; four image acquisition devices arranged on the support, two on each side of the support, wherein a first image acquisition device and a second image acquisition device are arranged on a first side of the support, and a third image acquisition device and a fourth image acquisition device are arranged on a second side of the support; and a control unit connected to each of the four image acquisition devices, the control unit being configured to receive the images acquired by the four image acquisition devices and to determine the distance between a target location and the vision sensing device from the images acquired by the four image acquisition devices.
- 2. The vision sensing device of the unmanned machine according to claim 1, characterized in that the first side of the support is the front side of the support, and the second side of the support is the rear side of the support.
- 3. The vision sensing device of the unmanned machine according to claim 1, characterized in that the first image acquisition device and the second image acquisition device are respectively arranged at a first end and a second end of the first side of the support; and the third image acquisition device and the fourth image acquisition device are respectively arranged at a first end and a second end of the second side of the support.
- 4. The vision sensing device of the unmanned machine according to claim 1 or 3, characterized in that the horizontal viewing angle of the lens of each of the four image acquisition devices is greater than 180 degrees.
- 5. The vision sensing device of the unmanned machine according to claim 4, characterized in that the four image acquisition devices are fisheye lenses.
- 6. The vision sensing device of the unmanned machine according to claim 1, characterized in that the support is a cuboid or a cylinder.
- 7. The vision sensing device according to claim 1, characterized in that the distance to the target location is determined from the images acquired by the four image acquisition devices by triangulation.
- 8. An unmanned machine, including the vision sensing device of the unmanned machine according to any one of claims 1 to 7.
- 9. The unmanned machine according to claim 8, characterized in that the vision sensing device of the unmanned machine is placed vertically or horizontally.
- 10. The unmanned machine according to claim 9, characterized in that the unmanned machine is a robot or a self-driving vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201720800827.XU (CN206946068U) | 2017-07-04 | 2017-07-04 | Vision sensing device for an unmanned machine and unmanned machine having the same |
Publications (1)
Publication Number | Publication Date |
---|---|
CN206946068U (en) | 2018-01-30 |
Family
ID=61367633
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201720800827.XU (granted as CN206946068U, active) | Vision sensing device for an unmanned machine and unmanned machine having the same | 2017-07-04 | 2017-07-04 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN206946068U (en) |
- 2017-07-04: CN application CN201720800827.XU granted as CN206946068U (status: active)
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10769440B1 (en) | 2016-08-29 | 2020-09-08 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US11900536B2 (en) | 2016-08-29 | 2024-02-13 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US10571925B1 (en) | 2016-08-29 | 2020-02-25 | Trifo, Inc. | Autonomous platform guidance systems with auxiliary sensors and task planning |
US10354396B1 (en) | 2016-08-29 | 2019-07-16 | PerceptIn Shenzhen Limited | Visual-inertial positional awareness for autonomous and non-autonomous device |
US10366508B1 (en) | 2016-08-29 | 2019-07-30 | Perceptin Shenzhen Limited | Visual-inertial positional awareness for autonomous and non-autonomous device |
US10390003B1 (en) | 2016-08-29 | 2019-08-20 | PerceptIn Shenzhen Limited | Visual-inertial positional awareness for autonomous and non-autonomous device |
US11948369B2 (en) | 2016-08-29 | 2024-04-02 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous mapping |
US10402663B1 (en) | 2016-08-29 | 2019-09-03 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous mapping |
US10410328B1 (en) | 2016-08-29 | 2019-09-10 | Perceptin Shenzhen Limited | Visual-inertial positional awareness for autonomous and non-autonomous device |
US10423832B1 (en) | 2016-08-29 | 2019-09-24 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US10571926B1 (en) | 2016-08-29 | 2020-02-25 | Trifo, Inc. | Autonomous platform guidance systems with auxiliary sensors and obstacle avoidance |
US10453213B2 (en) | 2016-08-29 | 2019-10-22 | Trifo, Inc. | Mapping optimization in autonomous and non-autonomous platforms |
US11842500B2 (en) | 2016-08-29 | 2023-12-12 | Trifo, Inc. | Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness |
US10496103B2 (en) | 2016-08-29 | 2019-12-03 | Trifo, Inc. | Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness |
US11953910B2 (en) | 2016-08-29 | 2024-04-09 | Trifo, Inc. | Autonomous platform guidance systems with task planning and obstacle avoidance |
US10162362B2 (en) | 2016-08-29 | 2018-12-25 | PerceptIn, Inc. | Fault tolerance to provide robust tracking for autonomous positional awareness |
US10395117B1 (en) | 2016-08-29 | 2019-08-27 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US11544867B2 (en) | 2016-08-29 | 2023-01-03 | Trifo, Inc. | Mapping optimization in autonomous and non-autonomous platforms |
US10832056B1 (en) | 2016-08-29 | 2020-11-10 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US10929690B1 (en) | 2016-08-29 | 2021-02-23 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous mapping |
US10943361B2 (en) | 2016-08-29 | 2021-03-09 | Trifo, Inc. | Mapping optimization in autonomous and non-autonomous platforms |
US10983527B2 (en) | 2016-08-29 | 2021-04-20 | Trifo, Inc. | Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness |
US11314262B2 (en) | 2016-08-29 | 2022-04-26 | Trifo, Inc. | Autonomous platform guidance systems with task planning and obstacle avoidance |
US11328158B2 (en) | 2016-08-29 | 2022-05-10 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
US11398096B2 (en) | 2016-08-29 | 2022-07-26 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous mapping |
US11501527B2 (en) | 2016-08-29 | 2022-11-15 | Trifo, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous tracking |
CN107153247A (en) * | 2017-07-04 | 2017-09-12 | Shenzhen PerceptIn Technology Co., Ltd. | Vision sensing device for an unmanned machine and unmanned machine having the same |
US10496104B1 (en) | 2017-07-05 | 2019-12-03 | Perceptin Shenzhen Limited | Positional awareness with quadocular sensor in autonomous platforms |
US10192113B1 (en) | 2017-07-05 | 2019-01-29 | PerceptIn, Inc. | Quadocular sensor design in autonomous platforms |
US10794710B1 (en) | 2017-09-08 | 2020-10-06 | Perceptin Shenzhen Limited | High-precision multi-layer visual and semantic map by autonomous units |
US10437252B1 (en) | 2017-09-08 | 2019-10-08 | PerceptIn Shenzhen Limited | High-precision multi-layer visual and semantic map for autonomous driving |
US11774983B1 (en) | 2019-01-02 | 2023-10-03 | Trifo, Inc. | Autonomous platform guidance systems with unknown environment mapping |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN206946068U (en) | Vision sensing device for an unmanned machine and unmanned machine having the same | |
CN107153247A (en) | Vision sensing device for an unmanned machine and unmanned machine having the same | |
CN105607635B (en) | Automatic guided vehicle panoramic optical vision navigation control system and omnidirectional's automatic guided vehicle | |
WO2020258721A1 (en) | Intelligent navigation method and system for cruiser motorcycle | |
CN104217439B (en) | Indoor visual positioning system and method | |
US10152120B2 (en) | Information provision device and information provision method | |
RU2734643C1 (en) | Parking assistance method for parking assistance device and parking assistance device | |
KR101703177B1 (en) | Apparatus and method for recognizing position of vehicle | |
CN111201879A (en) | Grain harvesting and transporting integrated loading device/method based on image recognition | |
CN109360245A (en) | The external parameters calibration method of automatic driving vehicle multicamera system | |
CN109634279A (en) | Object positioning method based on laser radar and monocular vision | |
JP4843190B2 (en) | Image sensor system calibration method and apparatus | |
CN107627957A (en) | Operation Van | |
CN108888187A (en) | A kind of sweeping robot based on depth camera | |
CN109373975A (en) | Detect vehicle control apparatus, control method and computer program | |
CN109085598A (en) | Detection system for obstacle for vehicle | |
CN106291535A (en) | A kind of obstacle detector, robot and obstacle avoidance system | |
US20120162360A1 (en) | Wide-Angle Image Pickup Unit And Measuring Device | |
CN106384382A (en) | Three-dimensional reconstruction system and method based on binocular stereoscopic vision | |
US20050030378A1 (en) | Device for image detecting objects, people or similar in the area surrounding a vehicle | |
CN109444916A (en) | The unmanned travelable area determining device of one kind and method | |
CN106292684A (en) | Carry the vehicle of aircraft | |
US11774981B2 (en) | Driver aid and autonomous tractor-trailer parking and loading dock alignment system | |
CN111243029A (en) | Calibration method and device of vision sensor | |
CN109375629A (en) | A kind of cruiser and its barrier-avoiding method that navigates |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||