CN108646727A - Vision cradle and positioning and recharging methods therefor - Google Patents

Vision cradle and positioning and recharging methods therefor

Info

Publication number
CN108646727A
CN108646727A (application CN201810457865.9A)
Authority
CN
China
Prior art keywords
robot
vision
cradle
camera
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810457865.9A
Other languages
Chinese (zh)
Inventor
赖钦伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN201810457865.9A priority Critical patent/CN108646727A/en
Publication of CN108646727A publication Critical patent/CN108646727A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0225 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0005 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with arrangements to save energy
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/10 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Abstract

The present invention discloses a vision cradle equipped with a camera, together with a robot positioning method and a recharging method based on that cradle. The image features captured by the cradle's camera assist the robot in positioning and in returning to the cradle to charge. Specifically, from the robot's position in the image captured by the camera and the robot's body length in that image, combined with the pinhole imaging model and the geometric proportions of similar triangles, the vision cradle computes the position coordinates of the robot relative to itself. During recharging, the cradle feeds the deviation angle it has determined back to the robot, so that the robot continuously corrects its recharge route and can dock head-on with the cradle. Because the cradle assists the robot in positioning, the robot's battery consumption is reduced and the recharging performance is improved.

Description

Vision cradle and positioning and recharging methods therefor
Technical field
The present invention relates to the field of automation, in particular to a positioning method based on vision technology, and specifically to a vision cradle and positioning and recharging methods for a robot.
Background technology
Autonomous robots, such as sweeping robots, home-security robots and commercial-consultation robots, are used ever more widely, and one of their most important features is automatic positioning and automatic recharging. Automatic positioning currently covers several technologies, such as inertial navigation, visual navigation and laser navigation; their common characteristic is autonomous navigation that does not depend on the assistance of external devices, which gives them strong applicability. Charging docks, for their part, mostly emit guidance signals actively, e.g. infrared or ultrasonic, so that when the robot is near the dock it can fairly easily be guided to the area directly in front of the dock and then return to charge. In the prior art, a robot of this type realizes localization and navigation entirely by itself, while the charging dock only provides signal guidance; the two are mutually independent, and the robot needs considerable computing resources for the navigation calculations, so localization and navigation consume a large amount of battery power.
Summary of the invention
A vision cradle is provided. Charging electrode plates and a wireless communication capability are provided on the vision cradle, and the cradle is additionally provided with a camera and a data-processing-and-wireless-communication module, wherein the data-processing-and-wireless-communication module obtains the position information of the robot by analyzing the images captured by the camera and, according to the robot's position information, controls the robot by wireless communication to perform positioning and/or recharging.
Further, the camera is fixedly arranged directly above the panel of the vision cradle.
Further, the camera has a fixed angle of view and is used for positioning and/or recharging a robot that appears within the field of view of the vision cradle.
A positioning method for a robot. The positioning method is based on the vision cradle and is applicable to positioning a robot that appears within the field of view of the vision cradle. It comprises the following steps:
Step 1: the vision cradle identifies the robot in the image captured by its camera;
Step 2: after the robot is successfully identified, the angle between the robot and the vision cradle is calculated from the robot's position in the captured image, using the pinhole imaging model and the geometric proportions of similar triangles; and the distance of the robot from the vision cradle is calculated from the robot's body length in the captured image, likewise using the pinhole imaging model and the geometric proportions of similar triangles;
Step 3: from the angle between the robot and the vision cradle and the distance of the robot from the vision cradle, the position coordinates of the robot relative to the vision cradle are calculated;
Wherein, the angle between the robot and the vision cradle is the angle by which the robot's body center deviates from the perpendicular direction of the vision cradle on the level ground.
Further, in Step 2, from the geometric proportions of the pinhole imaging model and similar triangles, the ratio of the robot's length in the captured image to the robot's body length equals the ratio of the camera's focal length to the distance between the robot and the vision cradle; the distance of the robot from the vision cradle can thus be obtained from this proportional relationship;
Wherein, the robot's body length is obtained by measurement, the camera's focal length is an intrinsic parameter of the camera, and the robot's length in the captured image is obtained from the camera's image sensor.
Further, in Step 2, according to the geometric proportions of the pinhole imaging model and similar triangles, the angle by which the robot's body center deviates from the central axis of the camera lens in the camera's imaging plane equals the angle between the robot and the vision cradle; the angle between the robot and the vision cradle is therefore obtained from the position of the robot's body center in the camera's imaging plane.
Further, in Step 3, the position coordinates of the robot relative to the vision cradle are expressed in a world coordinate system whose origin is the position of the camera on the vision cradle.
Further, Step 1 specifically comprises the following steps:
First, side-view photos of the robot taken by the camera are selected as training samples;
Then the training samples are preprocessed and image feature values are extracted from them;
Then a classifier is designed from the image feature values, and a new classifier is obtained by training the classifier with the training samples;
Finally, a detection window is generated by the new classifier and used to identify the robot as the target;
Wherein, the detection window is a rectangular region containing the target object.
A recharging method for a robot. The recharging method is based on the vision cradle and the positioning method. After the position coordinates of the robot relative to the vision cradle and the angle between the robot and the vision cradle are determined, the robot corrects its recharge route according to a preset recharge route lying along the perpendicular direction of the vision cradle on the level ground, so that it returns to the cradle to charge along the preset recharge route;
Wherein, the angle between the robot and the vision cradle is the angle by which the robot's body center deviates from the perpendicular direction of the vision cradle on the level ground.
Further, the process of correcting the recharge route comprises: when the vision cradle determines, from the angle between the robot and the vision cradle, that the robot's current position coordinates deviate to one side of the preset recharge route, the vision cradle sends an instruction to the robot, controlling it to move back in the opposite direction onto the preset recharge route, and then controls the robot to return to the cradle to charge along the preset recharge route.
Compared with the prior art, the vision cradle, being provided with a camera, has a vision-based positioning function, so the cradle assists the robot in positioning and reduces the robot's battery consumption; on the basis of the cradle's positioning, the robot's whole recharge route is improved, which improves the recharging performance.
Description of the drawings
Fig. 1 is the structural schematic diagram of vision cradle provided by the invention;
Fig. 2 is a plan distribution view of the camera's field of view in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the geometric model for calculating the distance between the robot and the vision cradle in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the geometric model for calculating the angle between the robot and the vision cradle in an embodiment of the present invention;
Fig. 5 is the flow chart of the localization method of robot provided by the invention.
Detailed description of the embodiments
The specific embodiments of the present invention are further described below with reference to the accompanying drawings:
In the description of the invention, it should be understood that terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and brevity of description and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore they must not be understood as limiting the invention.
As shown in Fig. 1, Fig. 1 is a structural schematic diagram of the vision cradle. In Fig. 1 the vision cradle comprises a camera 101, a housing 102, contact plates 103, and a data-processing-and-wireless-communication module 104. The contact plates 103 serve as the charging electrode plates and are arranged on the base plate of the vision cradle. The data-processing-and-wireless-communication module 104 communicates with the robot: by sending and receiving instructions it transfers the visual-processing data to the robot, and the robot passes its motion-state information back to the cradle. The module 104 obtains the robot's position information by analyzing the images captured by the camera 101 and, according to that position information, controls the robot by wireless communication to perform positioning and/or recharging. The camera 101 is arranged directly above the panel of the vision cradle, so that the position of this visual sensor remains fixed.
As shown in Fig. 2, Fig. 2 is a plan distribution view of the camera's field of view. In Fig. 2 the distance between the robot 201 and the vision cradle 203 is labeled 205; the view-angle lines of the camera 101 on the vision cradle 203 are labeled 202; the angle of the dashed line 205, which connects the robot 201 to the vision cradle, relative to the perpendicular direction 206 of the vision cradle 203 is labeled 204; and the straight line in the perpendicular direction of the vision cradle is the preset recharge route 206.
Preferably, the camera 101 has a fixed angle of view, whose field of view is bounded by the view-angle lines 202 in Fig. 2. Only when the robot 201 appears within the field of view bounded by the view-angle lines 202 can its image information be captured by the camera 101, enabling positioning and/or recharging of the robot 201.
Based on the above vision cradle, an embodiment of the present invention provides a positioning method for a robot, applicable to positioning a robot that appears within the field of view of the vision cradle. As shown in the method flowchart of Fig. 5, the positioning method comprises the following steps. In Step 1, the vision cradle performs feature recognition of the robot in the image captured by its camera. In Step 2, after the robot is successfully identified, the camera is first calibrated (the image-coordinate-system data are transformed into the world coordinate system); then, from the pixel position of the robot in the captured image, the angle between the robot and the vision cradle (the angle 204 of the dashed line 205 connecting the robot 201 to the vision cradle, relative to the perpendicular direction 206 of the vision cradle 203) is calculated using the pinhole imaging model and the geometric proportions of similar triangles; at the same time, from the robot's body length in pixels in the captured image, the distance of the robot from the vision cradle is calculated using the pinhole imaging model and the geometric proportions of similar triangles. In Step 3, the position coordinates of the robot relative to the vision cradle are calculated from the angle between the robot and the vision cradle and the distance of the robot from the vision cradle. Here, the angle between the robot and the vision cradle is the angle by which the robot's body center deviates from the perpendicular direction of the vision cradle on the level ground.
Preferably, in Step 2, from the geometric proportions of the pinhole imaging model and similar triangles, the ratio of the robot's length in the captured image to the robot's body length equals the ratio of the camera's focal length to the distance between the robot and the vision cradle, from which the distance of the robot from the vision cradle can be obtained. As shown in Fig. 3, the body length of the robot 201 is D (obtainable by measurement); the size of the robot 201 in the captured image is L, where L is output by feature-based image processing of the robot's side view on the image sensor and, since each pixel on the image represents a fixed length scale, L must be converted from pixel units into length units; position O is the position of the camera; position P marks the body-center point of the robot; the focal length of the camera lens is f (an intrinsic parameter of the camera); the distance of the robot 201 from the vision cradle 203 is labeled 205, and its length is denoted OP. From the basic pinhole imaging model, the geometric proportions of similar triangles give

L / D = f / OP
From this proportional equation it follows that the ratio of the robot's size in the captured image to the robot's body length is determined by the distance of the robot from the vision cradle, the coefficient of proportionality being the camera's focal length f. So in Step 2, from the robot's body length in pixels in the captured image, the distance of the robot from the vision cradle is calculated as OP = f · D / L, using the pinhole imaging model and the geometric proportions of similar triangles.
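As an illustration, the similar-triangles distance relation of the pinhole model (image length L, body length D, focal length f, as in Fig. 3) can be sketched as a small helper. This is a sketch only, not the patent's implementation; the millimetre units and the pixel-to-length conversion factor `mm_per_px` are assumptions introduced for the example.

```python
def distance_to_cradle(f_mm, body_len_mm, image_len_px, mm_per_px):
    """Pinhole-model range: L / D = f / OP  =>  OP = f * D / L.

    The robot's length in the image is first converted from pixel
    units to length units, as the description requires, before the
    proportional relationship is applied.
    """
    image_len_mm = image_len_px * mm_per_px  # pixel units -> length units
    return f_mm * body_len_mm / image_len_mm

# Example: f = 4 mm lens, robot body D = 300 mm, robot spans 100 px
# on a sensor with 0.006 mm pixels -> OP = 4 * 300 / 0.6 = 2000 mm
op = distance_to_cradle(4.0, 300.0, 100.0, 0.006)
```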
As one embodiment of the present invention, in Step 2, according to the geometric proportions of the pinhole imaging model and similar triangles, the angle by which the robot's body center deviates from the central axis of the camera lens in the camera's imaging plane equals the angle between the robot and the vision cradle, so the angle between the robot and the vision cradle is obtained from the position of the robot's body center in the camera's imaging plane. As shown in Fig. 4, a world coordinate system is established with the position O of the camera on the vision cradle as the origin and the direction perpendicular to the vision cradle as the y-axis direction. The angle 204 of the robot 201 relative to the perpendicular direction of the vision cradle 203, i.e. the angle between line segment OP and the y-axis, is denoted α and is obtained from the geometric relationships as follows. The focal length of the camera lens is known to be f; the position of the robot in the captured image is recorded by the image sensor as a length m, i.e. the distance on the imaging plane by which the projection of the robot's body-center point P through the camera lens deviates from the focal point. Using the pinhole imaging model and the geometric proportions of similar triangles, the angle between the robot 201 and the vision cradle 203 (the angle between segment OP and the y-axis) is

tan α = m / f, i.e. α = arctan(m / f)
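The angle relation just described, tan α = m / f with image-plane offset m and focal length f, can be sketched as follows; `atan2` is used so that the sign of m also encodes the side of the deviation. The millimetre units are an assumption for the example.

```python
import math

def deviation_angle(m_mm, f_mm):
    """Angle between segment OP and the y-axis: tan(alpha) = m / f.

    A positive/negative offset m yields a positive/negative alpha,
    so the result also tells which side of the optical axis the
    robot's body center lies on.
    """
    return math.atan2(m_mm, f_mm)

# Example: body-center offset m = 4 mm on the imaging plane, f = 4 mm
alpha = deviation_angle(4.0, 4.0)  # pi/4 radians, i.e. 45 degrees
```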
Specifically, in Step 3, from the angle α between the robot and the vision cradle 203 (the angle between segment OP and the y-axis) and the distance OP of the robot 201 from the vision cradle 203, triangle geometry gives the abscissa of the robot 201 in the world coordinate system as

x = OP · sin α

the ordinate of the robot 201 in the world coordinate system as

y = OP · cos α

and the position coordinates of the robot 201 relative to the vision cradle 203 as (x, y) = (OP · sin α, OP · cos α).
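Combining the distance OP and the angle α, the relative coordinates can be computed as below. This is an illustrative sketch under the conventions of the description (camera at the world origin, y-axis perpendicular to the cradle).

```python
import math

def relative_coordinates(op, alpha_rad):
    """World coordinates of the robot, with the camera at the origin
    and the y-axis perpendicular to the cradle:
    x = OP * sin(alpha), y = OP * cos(alpha)."""
    return op * math.sin(alpha_rad), op * math.cos(alpha_rad)

# Example: OP = 2000 mm, alpha = 0 -> the robot sits straight ahead
# of the cradle at (0, 2000)
x, y = relative_coordinates(2000.0, 0.0)
```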
Preferably, in Step 1, before the robot is identified, side-view photos of the robot taken by the camera are first selected as training samples, comprising 500 valid side-view photos and 500 invalid side-view photos. The training samples are then preprocessed and image feature values are extracted from them; the preprocessing includes grayscale conversion and histogram equalization. A classifier is then designed for these feature samples, its feature values being taken from the extracted image feature values, and the classifier is trained with the training samples, the training being the process of combining the image feature values extracted from the training samples into a new classifier by weighted averaging. Finally, a detection window is generated by the new, trained classifier; the non-target regions of the captured image are excluded, so that only the target region selected by the detection-window-based search window is used, which improves the detection speed. Side-view photos of the robot are favorable for distinguishing the structural features of the robot's body; the detection window is a rectangular region containing the target object.
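The preprocessing named above, grayscale conversion followed by histogram equalization, can be sketched in plain Python as below. This is a generic sketch, not the patent's implementation; the BT.601 luma weights are a common convention assumed here, and the patent does not specify which grayscale formula is used.

```python
def to_gray(rgb_pixels):
    """Grayscale conversion using the common ITU-R BT.601 luma weights
    (an assumed convention): gray = 0.299 R + 0.587 G + 0.114 B."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in rgb_pixels]

def equalize_histogram(gray, levels=256):
    """Classic histogram equalization over a flat list of gray values:
    remap each value through the normalized cumulative histogram so the
    output spreads over the full [0, levels-1] range."""
    hist = [0] * levels
    for v in gray:
        hist[v] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(gray)
    if n == cdf_min:  # constant image: nothing to equalize
        return list(gray)
    return [round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1)) for v in gray]
```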
An embodiment of the present invention provides a recharging method for a robot. The recharging method is based on the vision cradle and the positioning method. After the position coordinates of the robot relative to the vision cradle and the angle between the robot and the vision cradle are determined, the robot corrects its recharge route according to the preset recharge route lying along the perpendicular direction of the vision cradle on the level ground, so that it returns to the cradle to charge along the preset recharge route.
Preferably, the process of correcting the recharge route comprises: when the vision cradle determines, from the angle between the robot and the vision cradle, that the robot's current position coordinates deviate to one side of the preset recharge route, the vision cradle sends an instruction to the robot, controlling it to move back in the opposite direction onto the preset recharge route, and then, using the robot's corrected coordinate position, controls the robot to return to the cradle to charge along the preset recharge route. In Fig. 2, the straight line in the perpendicular direction of the vision cradle 203 is the preset recharge route 206. A nonzero value of the angle 204 between the robot 201 and the vision cradle indicates that the position of the robot 201 deviates from the preset recharge route 206. Suppose that, after the vision cradle 203 completes the positioning of the robot 201, the angle 204 of the robot 201 relative to the preset recharge route 206 is found to be nonzero, e.g. the robot deviates by angle 204 to the left of the preset recharge route 206. Then, when the robot 201 returns toward the vision cradle 203 along the preset recharge route 206 to charge, the data-processing-and-wireless-communication module 104 controls the robot 201 to move to the right, so that the robot 201 turns right relative to the vision cradle 203 onto the direction of the preset recharge route 206; the module 104 then controls the robot 201 to return in a straight line along the preset recharge route 206 to the vision cradle 203 to charge. This reduces the error of the recharging process and improves the recharging efficiency.
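The correction logic of this paragraph (a nonzero deviation angle means the robot is off the preset recharge route, so it is first moved back toward the route and then driven straight in) can be sketched as a small decision helper. The command names and the sign convention (positive α meaning the robot is left of the route) are assumptions introduced for illustration, not terms from the patent.

```python
def correction_command(alpha_rad, tolerance_rad=0.02):
    """Decide the robot's next move from the deviation angle alpha
    fed back by the cradle. Convention (assumed): positive alpha means
    the robot is left of the preset recharge route, so it must move
    right, and vice versa; within tolerance it docks straight in.
    """
    if alpha_rad > tolerance_rad:
        return "move_right"    # robot deviates to the left of the route
    if alpha_rad < -tolerance_rad:
        return "move_left"     # robot deviates to the right of the route
    return "drive_straight"    # on the route: return straight to the cradle
```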
The above embodiments are provided only as a full disclosure and do not limit the present invention; any substitution of equivalent technical features that is based on the inventive concept of the present invention and involves no creative labor shall be considered within the disclosed scope of this application.

Claims (10)

1. A vision cradle, with charging electrode plates and wireless communication provided on the vision cradle, characterized in that a camera and a data-processing-and-wireless-communication module are also provided on the vision cradle, wherein the data-processing-and-wireless-communication module obtains the position information of a robot by analyzing the images captured by the camera and, according to the robot's position information, controls the robot by wireless communication to perform positioning and/or recharging.
2. The vision cradle according to claim 1, characterized in that the camera is fixedly arranged directly above the panel of the vision cradle.
3. The vision cradle according to claim 1, characterized in that the camera has a fixed angle of view and is used for positioning and/or recharging a robot that appears within the field of view of the vision cradle.
4. A positioning method for a robot, the positioning method being based on the vision cradle according to claim 1 and being applicable to positioning a robot that appears within the field of view of the vision cradle, characterized by comprising the following steps:
Step 1: the vision cradle identifies the robot in the image captured by its camera;
Step 2: after the robot is successfully identified, the angle between the robot and the vision cradle is calculated from the robot's position in the captured image, using the pinhole imaging model and the geometric proportions of similar triangles; and the distance of the robot from the vision cradle is calculated from the robot's body length in the captured image, likewise using the pinhole imaging model and the geometric proportions of similar triangles;
Step 3: from the angle between the robot and the vision cradle and the distance of the robot from the vision cradle, the position coordinates of the robot relative to the vision cradle are calculated;
wherein the angle between the robot and the vision cradle is the angle by which the robot's body center deviates from the perpendicular direction of the vision cradle on the level ground.
5. The positioning method according to claim 4, characterized in that, in Step 2, from the geometric proportions of the pinhole imaging model and similar triangles, the ratio of the robot's length in the captured image to the robot's body length equals the ratio of the camera's focal length to the distance between the robot and the vision cradle, so that the distance of the robot from the vision cradle can be obtained from this proportional relationship;
wherein the robot's body length is obtained by measurement, the camera's focal length is an intrinsic parameter of the camera, and the robot's length in the captured image is obtained from the camera's image sensor.
6. The positioning method according to claim 4, characterized in that, in Step 2, according to the geometric proportions of the pinhole imaging model and similar triangles, the angle by which the robot's body center deviates from the central axis of the camera lens in the camera's imaging plane equals the angle between the robot and the vision cradle, and the angle between the robot and the vision cradle is therefore obtained from the position of the robot's body center in the camera's imaging plane.
7. The positioning method according to claim 4, characterized in that, in Step 3, the position coordinates of the robot relative to the vision cradle are expressed in a world coordinate system whose origin is the position of the camera on the vision cradle.
8. The positioning method according to claim 4, characterized in that Step 1 specifically comprises the following steps:
first, selecting side-view photos of the robot taken by the camera as training samples;
then preprocessing the training samples and extracting image feature values from them;
then designing a classifier from the image feature values and training the classifier with the training samples to obtain a new classifier;
finally, generating a detection window with the new classifier, the detection window being used to identify the robot as the target;
wherein the detection window is a rectangular region containing the target object.
9. A recharging method for a robot, the recharging method being based on the vision cradle according to claim 1 and the positioning method according to claim 4, characterized in that, after the position coordinates of the robot relative to the vision cradle and the angle between the robot and the vision cradle are determined, the robot corrects its recharge route according to a preset recharge route lying along the perpendicular direction of the vision cradle on the level ground, so that it returns to the cradle to charge along the preset recharge route;
wherein the angle between the robot and the vision cradle is the angle by which the robot's body center deviates from the perpendicular direction of the vision cradle on the level ground.
10. The recharging method according to claim 9, characterized in that the process of correcting the recharge route comprises: when the vision cradle determines, from the angle between the robot and the vision cradle, that the robot's current position coordinates deviate to one side of the preset recharge route, the vision cradle sends an instruction to the robot, controlling it to move back in the opposite direction onto the preset recharge route, and then controls the robot to return to the cradle to charge along the preset recharge route.
CN201810457865.9A 2018-05-14 2018-05-14 Vision cradle and positioning and recharging methods therefor Pending CN108646727A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810457865.9A CN108646727A (en) 2018-05-14 2018-05-14 Vision cradle and positioning and recharging methods therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810457865.9A CN108646727A (en) 2018-05-14 2018-05-14 Vision cradle and positioning and recharging methods therefor

Publications (1)

Publication Number Publication Date
CN108646727A true CN108646727A (en) 2018-10-12

Family

ID=63755336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810457865.9A Pending CN108646727A (en) 2018-05-14 2018-05-14 Vision cradle and positioning and recharging methods therefor

Country Status (1)

Country Link
CN (1) CN108646727A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060085105A1 (en) * 2004-10-20 2006-04-20 Infinite Electronics Inc. Automatic charging station for autonomous mobile machine
CN101033958A (en) * 2007-02-01 2007-09-12 华中科技大学 Mechanical vision locating method
CN106443650A (en) * 2016-09-12 2017-02-22 电子科技大学成都研究院 Monocular vision range finding method based on geometric relation
CN106647747A (en) * 2016-11-30 2017-05-10 北京智能管家科技有限公司 Robot charging method and device
CN107070000A (en) * 2017-04-27 2017-08-18 联想(北京)有限公司 Wireless charging method and equipment
CN107147873A (en) * 2017-05-27 2017-09-08 上海挚达科技发展有限公司 Intelligent charging spot with camera and charging pile operation method
CN107703933A (en) * 2016-08-09 2018-02-16 深圳光启合众科技有限公司 Charging method, device and the equipment of robot


Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109510266A (en) * 2018-11-30 2019-03-22 广东银狐医疗科技股份有限公司 A kind of electric wheelchair charging unit, charging system
CN109510266B (en) * 2018-11-30 2024-02-06 广东银狐医疗科技股份有限公司 Electric wheelchair charging device and charging system
CN109900275A (en) * 2019-04-01 2019-06-18 珠海市一微半导体有限公司 The control method of the guidance signal of seat is found back by robot
CN109991980A (en) * 2019-04-01 2019-07-09 珠海市一微半导体有限公司 The forming method of the signal quantization distribution map of cradle
CN109900275B (en) * 2019-04-01 2020-11-17 珠海市一微半导体有限公司 Control method for guiding signal of robot for finding back seat
CN109991980B (en) * 2019-04-01 2022-07-08 珠海一微半导体股份有限公司 Method for forming signal quantization distribution diagram of charging seat
CN111880524B (en) * 2020-06-12 2024-05-07 珠海一微半导体股份有限公司 Charging seat, recharging docking system and laser docking method
CN111880524A (en) * 2020-06-12 2020-11-03 珠海市一微半导体有限公司 Charging seat, recharging docking system and laser docking method
CN112327940A (en) * 2020-07-21 2021-02-05 追创科技(苏州)有限公司 Automatic recharging method, device, storage medium, charging base and system
WO2022017341A1 (en) * 2020-07-21 2022-01-27 追觅创新科技(苏州)有限公司 Automatic recharging method and apparatus, storage medium, charging base, and system
US11865937B2 (en) 2020-07-21 2024-01-09 Dreame Innovation Technology (Suzhou) Co., Ltd. Automatic recharging method, device, storage medium and system
CN111596694B (en) * 2020-07-21 2020-11-17 追创科技(苏州)有限公司 Automatic recharging method, device, storage medium and system
CN111596694A (en) * 2020-07-21 2020-08-28 追创科技(苏州)有限公司 Automatic recharging method, device, storage medium and system
CN111987768B (en) * 2020-08-19 2022-06-07 创新奇智(重庆)科技有限公司 Automatic charging equipment and automatic charging method of equipment
CN111987768A (en) * 2020-08-19 2020-11-24 创新奇智(重庆)科技有限公司 Automatic charging equipment and automatic charging method of equipment
CN113359712A (en) * 2021-05-25 2021-09-07 深圳优地科技有限公司 Charging docking method and device and charging pile
CN113534796A (en) * 2021-07-07 2021-10-22 安徽淘云科技股份有限公司 Control method for electric equipment, storage medium and electric equipment

Similar Documents

Publication Publication Date Title
CN108646727A (en) A kind of vision cradle and its localization method and recharging method
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
Veľas et al. Calibration of rgb camera with velodyne lidar
CN108919838B (en) Binocular vision-based automatic tracking method for power transmission line of unmanned aerial vehicle
CN110142785A (en) A kind of crusing robot visual servo method based on target detection
CN105302151B (en) A kind of system and method for aircraft docking guiding and plane type recognition
CN102650886B (en) Vision system based on active panoramic vision sensor for robot
CN109238240A (en) A kind of unmanned plane oblique photograph method that taking landform into account and its camera chain
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN109872324A (en) Ground obstacle detection method, device, equipment and storage medium
CN108508916B (en) Control method, device and equipment for unmanned aerial vehicle formation and storage medium
CN106625673A (en) Narrow space assembly system and assembly method
CN110434516A (en) A kind of Intelligent welding robot system and welding method
CN101839692A (en) Method for measuring three-dimensional position and stance of object with single camera
CN109917420A (en) A kind of automatic travelling device and robot
KR20200001471A (en) Apparatus and method for detecting lane information and computer recordable medium storing computer program thereof
CN109737981A (en) Unmanned vehicle target-seeking device and method based on multisensor
CN106096207B (en) A kind of rotor wing unmanned aerial vehicle wind resistance appraisal procedure and system based on multi-vision visual
CN111968048A (en) Method and system for enhancing image data of few samples in power inspection
CN114004977A (en) Aerial photography data target positioning method and system based on deep learning
CN110998241A (en) System and method for calibrating an optical system of a movable object
Li et al. 3D autonomous navigation line extraction for field roads based on binocular vision
CN107272037A (en) A kind of road equipment position, image information collecting device and the method for gathering information
CN114659499B (en) Smart city 3D map model photography establishment method based on unmanned aerial vehicle technology
CN111402324B (en) Target measurement method, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.