WO2015101311A1 - Light spot indicating robot and light spot indicating method thereof - Google Patents

Light spot indicating robot and light spot indicating method thereof

Info

Publication number
WO2015101311A1
Authority
WO
WIPO (PCT)
Prior art keywords
motor
laser
image plane
module
projection position
Prior art date
Application number
PCT/CN2014/095772
Other languages
English (en)
French (fr)
Inventor
汤进举
Original Assignee
科沃斯机器人有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 科沃斯机器人有限公司
Priority to US15/109,374 (granted as US10639795B2)
Publication of WO2015101311A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/30: Nc systems
    • G05B 2219/31: From computer integrated manufacturing till monitoring
    • G05B 2219/31048: Project on workpiece, image of finished workpiece, info or a spot
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/30: Nc systems
    • G05B 2219/40: Robotics, robotics mapping to robotics vision
    • G05B 2219/40103: Show object with laser pointer, give oral command for action on, with object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03543: Mice or pucks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16: Sound input; Sound output
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10: TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S: TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00: Robots
    • Y10S 901/02: Arm motion controller
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10: TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S: TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00: Robots
    • Y10S 901/30: End effector

Definitions

  • The invention relates to a light spot indicating robot and a light spot indicating method thereof, and belongs to the technical field of small household appliance manufacturing.
  • A shopping guide robot is one of the most common kinds of self-moving robot and interacts strongly with its users.
  • An existing shopping guide robot usually indicates an object with a light spot: a laser pointer mounted on the body projects a laser spot onto the designated object, thereby completing the shopping-guide action of indicating the object.
  • The specific process is as follows: the robot first acquires the three-dimensional coordinates of the target object; the control unit then moves the laser pointer according to those coordinates until the laser spot reaches the target object's three-dimensional coordinates, completing the robot's full object-indicating action.
  • In practice, however, the three-dimensional coordinates of the target object are not easy to acquire; moreover, controlling the laser pointer to rotate through a certain angle in a certain direction so that the spot reaches the specified position involves a large amount of computation for the whole motion and places high demands on the control unit.
  • The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art by providing a light spot indicating robot and a light spot indicating method thereof.
  • The present invention converts the positioning of a target object from three-dimensional space into two-dimensional space, so the amount of computation is small and the indicated position is obtained quickly and accurately.
  • A light spot indicating robot includes a robot body provided with a control module, a camera module and a laser indicating module. The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane; the laser beam and the objects to be indicated are each projected onto the image plane, forming a laser spot projection position and object projection positions. The robot further comprises a signal input module: based on the content displayed in the image plane captured by the camera module, the target object among the objects to be indicated is determined from the information entered through the signal input module, and the control module moves the laser indicating module so that the laser spot projection position on the image plane coincides with the target object projection position.
  • The signal input module can take various forms. It may be a mouse module or a touch screen module, with the target object selected by mouse or touch-screen click; or it may be a keyboard module or an audio input module, in which case the image plane is divided into cells with specific position codes and the target object is determined by entering, via keyboard or voice, the code of the cell in which it is located.
  • The laser indicating module includes a laser pointer and a driving device.
  • The driving device includes a first motor and a second motor; the second motor is fixed on the output shaft of the first motor, and the output shafts of the first and second motors are perpendicular to each other.
  • The laser pointer includes a fixed end and a free end. The fixed end is fixed on the output shaft of the second motor, at the intersection of the output shafts of the first and second motors; the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.
  • The output shaft of the first motor is perpendicular to the image plane.
  • Alternatively, the laser indicating module may include a laser pointer and a driving device comprising a third motor and a fourth motor, which drive the laser pointer to swing in mutually perpendicular directions.
  • The light spot indicating robot may be a shopping guide robot or a direction-indicating robot.
  • The present invention also provides a light spot indicating method for a light spot indicating robot comprising a camera module, a laser indicating module and a signal input module. The method includes the following steps:
  • Step 1: The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined;
  • Step 2: Based on the content displayed in the image plane captured by the camera module, determine the target object among the objects to be indicated from the information entered through the signal input module, and acquire the target object projection position Q' on the image plane;
  • Step 3: Move the laser indicating module so that the real-time laser spot projection position P' on the image plane coincides with the target object projection position Q'.
  • In this method, the laser indicating module includes a laser pointer and a driving device.
  • The driving device includes a first motor and a second motor; the second motor is fixed on the output shaft of the first motor, and the output shafts of the first and second motors are perpendicular to each other and coplanar.
  • The laser pointer includes a fixed end and a free end. The fixed end is fixed on the output shaft of the second motor, with the intersection of the output shafts of the first and second motors as the fixed point; the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.
  • When the center line of the laser pointer is collinear with the output shaft of the first motor, the projection position of the laser spot on the image plane is defined as O'; when the output shaft of the second motor is parallel to the X axis, the angle between the second motor and the X axis is defined as 0°.
  • Step 3 specifically includes:
  • Step 3-1: Calculate the angle θ between the straight line O'Q' and the X axis in the image plane A'; the first motor drives the second motor to rotate to the θ±90° angular position;
  • Step 3-2: The camera module captures the projection position P' of the laser spot on the image plane in real time and compares it with the target object projection position Q'; the second motor drives the free end of the laser pointer to swing about the fixed end until the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
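The two sub-steps above amount to a simple visual-servoing loop: θ is computed once to orient the pen's swing plane through Q', then the pen is nudged until the camera sees P' on Q'. A minimal sketch in Python, where `read_spot`, `swing_pen` and `rotate_mount` are hypothetical hardware hooks not named in the patent:

```python
import math

def angle_to_x_axis(o, q):
    """Angle theta (degrees) of the line O'Q' relative to the image-plane X axis."""
    return math.degrees(math.atan2(q[1] - o[1], q[0] - o[0]))

def coincide_spot(o, q, read_spot, swing_pen, rotate_mount, tol=1.0, step=0.5):
    """Step 3-1/3-2 sketch: rotate the second motor to theta+90 deg so the
    pen's swing plane passes through Q', then swing until P' coincides with Q'."""
    rotate_mount(angle_to_x_axis(o, q) + 90.0)    # Step 3-1 (the theta±90° position)
    for _ in range(10000):                        # Step 3-2: closed loop on camera feedback
        p = read_spot()                           # current laser projection P'
        if math.hypot(p[0] - q[0], p[1] - q[1]) <= tol:
            return p                              # P' coincides with Q' within tolerance
        # swing outward while short of Q', back off if the spot has passed it
        past = math.hypot(p[0] - o[0], p[1] - o[1]) > math.hypot(q[0] - o[0], q[1] - o[1])
        swing_pen(-step if past else step)
    raise RuntimeError("laser spot did not converge on Q'")
```

Because Step 3-2 is feedback-driven, no camera calibration is needed; the loop only assumes that after Step 3-1 the swing plane actually passes through Q'.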
  • In another embodiment, the laser indicating module includes a laser pointer and a driving device; the driving device includes a first motor and a second motor, the second motor is fixed on the output shaft of the first motor, and the output shafts of the first and second motors are perpendicular to each other.
  • The laser pointer includes a fixed end and a free end.
  • The fixed end is fixed on the output shaft of the second motor, with the intersection of the output shafts of the first and second motors as the fixed point; the free end rotates about the fixed end, and the rotation plane of the laser pointer is perpendicular to the output shaft of the second motor.
  • When the center line of the laser pointer is collinear with the output shaft of the first motor, the projection position of the laser spot on the image plane is O'; the distance between the focus N of the lens in the camera module and the image plane A' is d.
  • When the output shaft of the second motor is parallel to the X axis, the angle between the second motor and the X axis is defined as 0°.
  • Step 3 specifically includes:
  • Step 3-1': Calculate the angle θ between the straight line O'Q' and the X axis in the image plane A'; given that the distance between the focus N and the image plane A' is d, calculate the angle α = ∠O'NQ' in the plane O'NQ';
  • Step 3-2': Drive the first and second motors separately or simultaneously: the second motor rotates to the θ±90° angle and the laser pointer swings to the angle α, so that the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
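Under the stated geometry, both angles of Step 3-1' can be computed in closed form. The sketch below assumes, beyond what the patent states explicitly, that O' is the foot of the perpendicular dropped from the lens focus N onto the image plane A', so that tan α = |O'Q'|/d:

```python
import math

def open_loop_angles(o, q, d):
    """Step 3-1' sketch: theta = angle of the line O'Q' to the X axis in the
    image plane; alpha = angle O'NQ' subtended at the lens focus N, assumed
    to sit at distance d directly above O'."""
    dx, dy = q[0] - o[0], q[1] - o[1]
    theta = math.degrees(math.atan2(dy, dx))
    alpha = math.degrees(math.atan2(math.hypot(dx, dy), d))  # tan(alpha) = |O'Q'| / d
    return theta, alpha
```

In Step 3-2' the second motor would then be commanded to θ±90° and the pen swung to α with no camera feedback, at the cost of having to know d.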
  • In a further embodiment, the laser indicating module includes a laser pointer and a driving device comprising a third motor and a fourth motor, which drive the laser pointer to swing in the mutually perpendicular X-axis and Y-axis directions respectively.
  • The camera module captures the projection position P' of the laser spot on the image plane in real time and compares it with the target object projection position Q'.
  • Step 3 specifically includes:
  • swinging the laser pointer in the X-axis direction until the projection position P' of the laser spot on the image plane has the same X coordinate as the target object projection position Q';
  • swinging the laser pointer in the Y-axis direction until the projection position P' of the laser spot on the image plane has the same Y coordinate as the target object projection position Q'.
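This variant decouples the two image axes, so each motor can be servoed independently on camera feedback. A minimal sketch, with `read_spot` and the two `swing_*` callbacks as hypothetical hardware hooks:

```python
def align_axis(read_spot, swing, q, axis, tol=0.5, step=0.2, max_iter=10000):
    """Swing one motor until the laser projection P' matches the target
    projection Q' on the given axis (0 = X, 1 = Y)."""
    for _ in range(max_iter):
        p = read_spot()
        err = q[axis] - p[axis]
        if abs(err) <= tol:
            return p
        swing(step if err > 0 else -step)  # move the spot toward Q' along this axis
    raise RuntimeError("axis did not converge")

def step3_two_motor(read_spot, swing_x, swing_y, q):
    """Step 3 of this embodiment: match the X coordinate first, then the Y."""
    align_axis(read_spot, swing_x, q, axis=0)
    return align_axis(read_spot, swing_y, q, axis=1)
```

Provided one motor's swing moves the spot mostly along X and the other mostly along Y, the two loops converge independently; choosing step ≤ 2·tol keeps the loop from oscillating past the tolerance band.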
  • In summary, the present invention converts the positioning of the target object from three-dimensional space into two-dimensional space; the amount of computation is small, and the indicated position is obtained quickly and accurately.
  • FIG. 1 is a first schematic structural diagram of the laser indicating module according to Embodiment 1 of the present invention.
  • FIG. 2 is a second schematic structural diagram of the laser indicating module according to Embodiment 1 of the present invention.
  • FIG. 3 is a schematic diagram of the projection relationship of any laser beam on the image plane and the object plane according to the present invention.
  • FIG. 4 is a schematic diagram showing the coincidence of the real-time laser spot projection position P' and the target object projection position Q' on the image plane of the present invention.
  • The invention provides a light spot indicating robot comprising a robot body provided with a control module, a camera module and a laser indicating module. The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane; the laser beam and the objects to be indicated are each projected onto the image plane, forming a laser spot projection position and object projection positions. The robot is further provided with a signal input module: based on the content displayed in the image plane captured by the camera module, the target object among the objects to be indicated is determined from the information entered through the signal input module, and the control module moves the laser indicating module so that the laser spot projection position on the image plane coincides with the target object projection position.
  • Depending on the user's needs, the signal input module can take various forms: a mouse module or a touch screen module, with the target object selected by mouse or touch-screen click; or a keyboard module or an audio input module.
  • In the latter case, the image plane is divided into cells with specific position codes, and the target object is determined by entering, via keyboard or voice, the code of the cell in which it is located.
  • Whichever form is used, its purpose is to determine the target object among the objects to be indicated. For example, when the light spot indicating robot is a shopping guide robot, the objects to be indicated are all the goods placed on a shelf, and the target object is the item the user actually wants to purchase.
  • The signal input module on the robot can receive input signals directly on the body, or from a remote terminal, e.g. signals delivered by wire or wirelessly (broadband, Bluetooth, infrared, GPRS, 3G, WiFi, etc.).
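For the keyboard or audio input path, the cell coding can be as simple as a spreadsheet-style grid over the image plane. The grid shape and the 'B3'-style code format below are illustrative choices, not specified by the patent:

```python
def cell_center(code, img_w, img_h, cols=8, rows=6):
    """Map a cell code like 'B3' (column letter, row number) to the pixel
    coordinates of the cell's center on the image plane; this center can
    then serve as the target projection position Q'."""
    col = ord(code[0].upper()) - ord("A")
    row = int(code[1:]) - 1
    if not (0 <= col < cols and 0 <= row < rows):
        raise ValueError(f"code {code!r} is outside the {cols}x{rows} grid")
    cw, ch = img_w / cols, img_h / rows
    return ((col + 0.5) * cw, (row + 0.5) * ch)
```

Any input channel that can deliver a short token (keypad, speech recognizer) can then select a target without needing pointing hardware.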
  • FIG. 1 and FIG. 2 are a first and a second schematic structural diagram of the laser indicating module according to Embodiment 1 of the present invention.
  • As shown in FIG. 1 in conjunction with FIG. 2, the laser indicating module 100 includes a laser pointer 110 and a driving device. The driving device includes a first motor 120 and a second motor 130; the second motor 130 is fixed on the output shaft M of the first motor 120, and the output shaft M of the first motor 120 and the output shaft N of the second motor 130 are perpendicular to each other.
  • The laser pointer 110 includes a fixed end 111 and a free end 112.
  • The fixed end 111 is fixed on the output shaft N of the second motor 130, with the intersection of the output shaft M of the first motor 120 and the output shaft N of the second motor 130 as the fixed point; the free end 112 swings about the fixed end 111.
  • The swing plane of the laser pointer 110 is perpendicular to the output shaft N of the second motor 130.
  • To ensure normal operation of the laser indicating module, the output shaft M of the first motor 120 is perpendicular to the image plane.
  • The light spot indicating method of the light spot indicating robot of the present invention includes the following steps. Step 1: The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane A'.
  • Mutually perpendicular X and Y axes are defined on the image plane A'.
  • Step 2: Based on the content displayed in the image plane A' captured by the camera module, determine the target object a' among the objects to be indicated a', b', c' and d' from the information entered through the signal input module, and acquire the target object projection position Q' on the image plane A'.
  • Step 3: Move the laser indicating module so that the real-time laser spot projection position P' on the image plane A' coincides with the target object projection position Q'; as shown in FIG. 4, the coincident indicated position is the center of the target object a'.
  • Step 3 above specifically includes:
  • Step 3-1: Calculate the angle θ between the straight line O'Q' and the X axis in the image plane A'; the first motor drives the second motor to rotate to the θ±90° angular position;
  • Step 3-2: The camera module captures the projection position P' of the laser spot on the image plane in real time and compares it with the target object projection position Q'; the second motor drives the free end of the laser pointer to swing about the fixed end until the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
  • In other words, the real-time laser spot projection position P' on the image plane A' is made to coincide with the target object projection position Q' by first calculating the angle θ between the straight line O'Q' and the X axis, and then driving the free end of the laser pointer to swing.
  • In this embodiment, the light spot indicating robot is a shopping guide robot.
  • Its working process is as follows. First, the laser indicating module provided in the shopping guide robot emits a laser beam; at the same time, the camera module provided in the robot photographs the objects to be indicated, e.g. the various goods on a supermarket shelf, forming an image plane A' on which mutually perpendicular X and Y axes are set.
  • Based on the items on the supermarket shelf photographed by the camera module, the user determines the target object a' among the objects to be indicated a', b', c' and d', i.e. the item to be purchased, through the input information of the signal input module, and the projection position Q' of that item on the image plane A' is acquired.
  • Depending on the user's needs or habits, the input module can take various forms: the item can be selected by mouse or touch-screen click, or the image plane can be divided into position-coded cells and the cell code of the desired item entered by keyboard or voice.
  • Once the desired item has been determined, the laser indicating module is moved so that the real-time laser spot projection position P' on the image plane A' coincides with the projection position Q' of the item, so that the laser indicates the item to be purchased.
  • The usual indicated position is the center of the item.
  • This indication method may involve some error, but since the goods themselves have a certain volume, the correct item is still indicated even if some error exists.
  • Furthermore, if the item itself is small, the laser indication position can be fine-tuned through the signal input module, e.g. by the up, down, left and right keys of a keyboard module on the robot body or a remote control terminal, with the control module moving the laser spot accordingly to ensure that it indicates the item to be purchased.
  • The light spot indicating robot may also be a direction-indicating robot.
  • Embodiment 2 likewise makes the real-time laser spot projection position P' on the image plane A' coincide with the projection position Q' of the item to be purchased; however, instead of first calculating the angle θ between the line O'Q' and the X axis and then driving the laser pointer in a feedback loop as above, this embodiment provides another mode of operation.
  • When the center line of the laser pointer is collinear with the output shaft of the first motor, the projection position of the laser spot on the image plane is O'.
  • The distance between the focus N of the lens in the camera module and the image plane A' is d.
  • When the output shaft of the second motor is parallel to the X axis, the angle between the second motor and the X axis is defined as 0°.
  • Step 3 specifically includes:
  • Step 3-1': Calculate the angle θ between the straight line O'Q' and the X axis in the image plane A'; given that the distance between the focus N and the image plane A' is d, calculate the angle α = ∠O'NQ' in the plane O'NQ';
  • Step 3-2': Drive the first and second motors separately or simultaneously: the second motor rotates to the θ±90° angle and the laser pointer swings to the angle α, so that the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
  • In other words, in this embodiment the real-time laser spot projection position P' on the image plane A' is made to coincide with the target object projection position Q' by first calculating the angle α from the distance d between the focus N and the image plane A', and then swinging the laser pointer directly to the angle α.
  • As for the structure of the driving device in the laser indicating module, besides the structures described in Embodiments 1 and 2, the driving device may also include a third motor and a fourth motor, which drive the laser pointer to swing in the mutually perpendicular X-axis and Y-axis directions respectively.
  • The working process of a robot using such a driving device likewise differs from the two embodiments above.
  • The first two steps of the indicating method are the same as in Embodiments 1 and 2, while Step 3 differs. Specifically, the camera module captures the projection position P' of the laser spot on the image plane in real time and compares it with the target object projection position Q'.
  • Step 3 includes: swinging the laser pointer in the X-axis direction until the projection position P' of the laser spot on the image plane has the same X coordinate as the target object projection position Q'; then swinging the laser pointer in the Y-axis direction until P' has the same Y coordinate as Q'. That is, in this embodiment two motors drive the laser pointer to swing in the X and Y directions respectively until the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
  • In summary, the present invention provides a light spot indicating robot and a light spot indicating method thereof.
  • The laser indicating module emits a laser beam; the camera module photographs the objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined; based on the content displayed in the image plane, the target object among the objects to be indicated is determined through the input information of the signal input module and its projection position Q' on the image plane is acquired; the laser indicating module is then moved so that the real-time laser spot projection position P' on the image plane coincides with Q'.
  • The invention converts the positioning of the target object from three-dimensional space into two-dimensional space, including acquiring the two-dimensional coordinates of the target object and the laser spot and computing the motion that moves the laser spot to the target object; the amount of computation is small, and the indicated position is obtained quickly and accurately.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)
  • Position Input By Displaying (AREA)

Abstract

A light spot indicating robot and a light spot indicating method thereof. The light spot indicating robot includes a robot body provided with a control module, a camera module and a laser indicating module (100). The laser indicating module (100) emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane; the laser beam and the objects to be indicated are each projected onto the image plane, forming a laser spot projection position and object projection positions. The robot is further provided with a signal input module: based on the content displayed in the image plane captured by the camera module, the target object among the objects to be indicated is determined from information entered through the signal input module, and the control module moves the laser indicating module so that the laser spot projection position on the image plane coincides with the target object projection position. The robot converts the positioning of the target object from three-dimensional space into two-dimensional space; the amount of computation is small, and the indicated position is obtained quickly and accurately.

Description

Light spot indicating robot and light spot indicating method thereof

Technical Field

The present invention relates to a light spot indicating robot and a light spot indicating method thereof, and belongs to the technical field of small household appliance manufacturing.

Background Art

A shopping guide robot is one of the most common kinds of self-moving robot and interacts strongly with its users. An existing shopping guide robot usually indicates an object with a light spot: a laser pointer mounted on the body projects a laser spot onto the designated object, thereby completing the shopping-guide action of indicating the object. The specific process is as follows: the robot first needs to acquire the three-dimensional coordinates of the target object; the control unit then moves the laser pointer according to the acquired coordinates so that the laser spot reaches the three-dimensional coordinates of the target object, completing the robot's full object-indicating action. In practice, however, the three-dimensional coordinates of the target object are not easy to acquire. Moreover, controlling the laser pointer, based on those coordinates, to rotate through a certain angle in a certain direction so that the spot reaches the specified position involves a large amount of computation for the whole motion and places high demands on the control unit.
Summary of the Invention

The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art by providing a light spot indicating robot and a light spot indicating method thereof. The present invention converts the positioning of a target object from three-dimensional space into two-dimensional space, so the amount of computation is small and the indicated position is obtained quickly and accurately.

The technical problem to be solved by the present invention is achieved by the following technical solutions:

A light spot indicating robot includes a robot body provided with a control module, a camera module and a laser indicating module. The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane; the laser beam and the objects to be indicated are each projected onto the image plane, forming a laser spot projection position and object projection positions. The robot is further provided with a signal input module: based on the content displayed in the image plane captured by the camera module, the target object among the objects to be indicated is determined from the information entered through the signal input module, and the control module moves the laser indicating module so that the laser spot projection position on the image plane coincides with the target object projection position.

Depending on the user's needs, the signal input module can take various forms. It may be a mouse module or a touch screen module, with the target object selected by mouse or touch-screen click; or it may be a keyboard module or an audio input module, in which case the image plane is divided into cells with specific position codes and the target object is determined by entering, via keyboard or voice, the code of the cell in which it is located.

More specifically, the laser indicating module includes a laser pointer and a driving device. The driving device includes a first motor and a second motor; the second motor is fixed on the output shaft of the first motor, and the output shafts of the first and second motors are perpendicular to each other and coplanar. The laser pointer includes a fixed end and a free end; the fixed end is fixed on the output shaft of the second motor, with the intersection of the two output shafts as the fixed point; the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.

To ensure normal operation of the laser indicating module, the output shaft of the first motor is perpendicular to the image plane.

Besides the above driving arrangement, the laser indicating module may instead include a laser pointer and a driving device comprising a third motor and a fourth motor, which drive the laser pointer to swing in mutually perpendicular directions.

The light spot indicating robot is a shopping guide robot or a direction-indicating robot.
The present invention also provides a light spot indicating method for the light spot indicating robot, which comprises a camera module, a laser indicating module and a signal input module. The method includes the following steps:

Step 1: The laser indicating module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined;

Step 2: Based on the content displayed in the image plane captured by the camera module, determine the target object among the objects to be indicated from the information entered through the signal input module, and acquire the target object projection position Q' on the image plane;

Step 3: Move the laser indicating module so that the real-time laser spot projection position P' on the image plane coincides with the target object projection position Q'.

Here, the laser indicating module includes a laser pointer and a driving device. The driving device includes a first motor and a second motor; the second motor is fixed on the output shaft of the first motor, and the output shafts of the first and second motors are perpendicular to each other and coplanar. The laser pointer includes a fixed end and a free end; the fixed end is fixed on the output shaft of the second motor, with the intersection of the two output shafts as the fixed point; the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.

When the center line of the laser pointer is collinear with the output shaft of the first motor, the projection position of the laser spot on the image plane is defined as O'; when the output shaft of the second motor is parallel to the X axis, the angle between the second motor and the X axis is defined as 0°.

Step 3 specifically includes:

Step 3-1: Calculate the angle θ between the straight line O'Q' and the X axis in the image plane A'; the first motor drives the second motor to rotate to the θ±90° angular position;

Step 3-2: The camera module captures the projection position P' of the laser spot on the image plane in real time and compares it with the target object projection position Q'; the second motor drives the free end of the laser pointer to swing about the fixed end until the real-time projection position P' of the laser spot on the image plane coincides with the target object projection position Q'.
在另一实施例中,所述激光指示模块包含激光笔和驱动装置,所述驱动装置包括第一电机和第二电机,所述第二电机固设在所述第一电机的输出轴上,所述第一电机与第二电机输出轴相互垂直共面设置;
所述激光笔包括固定端与自由端,固定端以第一电机与第二电机输出轴交点为固定点固设在所述第二电机输出轴上,自由端以固定端为中心转动,所述激光笔的转动平面与所述第二电机输出轴相互垂直;
设当激光笔的中心线与第一电机输出轴共线时,激光点在图像平面上的投影位置为O’;摄像模块中镜头的焦点N与图像平面A’之间的距离为d;定义第二电机输出轴与X轴平行时,第二电机与X轴之间的夹角为0°,
所述步骤3具体包含:
步骤3-1’:在图像平面A’中计算出直线O’Q’与X轴之间夹角θ的大小;根据焦点N与图像平面A’之间的距离为d,在平面O’NQ’中计算出∠O’NQ’的大小为α;
步骤3-2’:分别或同时驱动第一、第二电机,第二电机旋转至θ±90°角,激光笔摆动至α角,使激光点在图像平面上的实时投影位置P’和目标物体投影位置Q’重合。
另外,所述激光指示模块包含激光笔和驱动装置,所述驱动装置包括第三电机和第四电机,第三电机和第四电机分别驱动激光笔沿相互垂直的X轴方向和Y轴方向摆动,摄像模块实时拍摄获取激光点在图像平面上的投影位置P’并与目标物体投影位置Q’进行比较,步骤3具体包含:
沿X轴方向摆动激光笔,直至激光点在图像平面上的投影位置P’点与目标物体投影位置Q’点的X轴坐标相同;
沿Y轴方向摆动激光笔,直至激光点在图像平面上的投影位置P’点与目标物体投影位置Q’点的Y轴坐标相同。
In summary, the present invention converts the localization of the target object in three-dimensional space into a two-dimensional problem, so that little computation is required and the target position is indicated quickly and accurately.
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.
Brief Description of the Drawings
Fig. 1 is a first structural schematic view of the laser indication module in Embodiment 1 of the present invention;
Fig. 2 is a second structural schematic view of the laser indication module in Embodiment 1 of the present invention;
Fig. 3 is a schematic view of the projection relationship of an arbitrary laser beam of the present invention on the image plane and on the object plane;
Fig. 4 is a schematic view of the coincidence of the real-time laser point projection position P' with the target object projection position Q' on the image plane of the present invention.
Detailed Description of the Embodiments
Embodiment 1
The present invention provides a light spot indication robot comprising a robot body on which a control module, a camera module and a laser indication module are provided. The laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane; the laser beam and the objects to be indicated are each projected onto the image plane, forming a laser point projection position and object projection positions. The light spot indication robot is further provided with a signal input module: based on the content of the image plane of the objects to be indicated captured by the camera module, information is input through the signal input module to determine a target object among the objects to be indicated, and the control module controls the laser indication module to move until the laser point projection position coincides with the target object projection position on the image plane.
According to different user needs, the signal input module may take several forms. It may be a mouse module or a touch-screen module, the target object among the objects to be indicated being determined by a mouse or touch-screen click; it may also be a keyboard module or an audio input module, in which case the image plane is divided into cells carrying specific position codes and the target object is determined by entering, via keyboard or voice, the code of the cell in which it is located. Whichever form is used, its purpose is to determine the target object among the objects to be indicated. For example, when the light spot indication robot is a shopping-guide robot, the objects to be indicated are all the goods placed on a shelf, while the target object is the item the user actually wants to buy. It should be noted that the signal input module of the light spot indication robot may receive input signals directly from the robot body, or from a remote terminal over a wired or wireless link (broadband, Bluetooth, infrared, GPRS, 3G, WIFI, etc.).
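As an illustration of the keyboard or voice cell-code input described above, the sketch below maps a keyed-in cell code to the pixel coordinates of that cell's centre on the image plane. The grid size, the lettering scheme and the image resolution are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical sketch: turning a cell code such as "B3" into image-plane
# coordinates. Columns are lettered A, B, C, ... left to right; rows are
# numbered 1, 2, 3, ... top to bottom, so "A1" names the top-left cell.

def cell_code_to_pixel(code, image_w=640, image_h=480, cols=8, rows=6):
    """Return the (x, y) centre of the grid cell named by `code`."""
    col = ord(code[0].upper()) - ord("A")    # "B" -> column index 1
    row = int(code[1:]) - 1                  # "3" -> row index 2
    if not (0 <= col < cols and 0 <= row < rows):
        raise ValueError(f"cell {code!r} outside the {cols}x{rows} grid")
    cell_w, cell_h = image_w / cols, image_h / rows
    return ((col + 0.5) * cell_w, (row + 0.5) * cell_h)

print(cell_code_to_pixel("A1"))   # centre of the top-left cell: (40.0, 40.0)
print(cell_code_to_pixel("B3"))   # (120.0, 200.0)
```

The returned pixel position would then serve as the target object projection position Q' for the coincidence step.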
Fig. 1 and Fig. 2 are, respectively, the first and second structural schematic views of the laser indication module in Embodiment 1 of the present invention. As shown in Fig. 1 in combination with Fig. 2, the laser indication module 100 comprises a laser pointer 110 and a driving device. The driving device comprises a first motor 120 and a second motor 130; the second motor 130 is fixed on the output shaft M of the first motor 120, and the output shaft M of the first motor 120 and the output shaft N of the second motor 130 are arranged mutually perpendicular and coplanar. The laser pointer 110 has a fixed end 111 and a free end 112: the fixed end 111 is fixed on the output shaft N of the second motor 130 at the intersection point of the output shafts of the first motor 120 and the second motor 130, the free end 112 swings about the fixed end 111, and the swing plane of the laser pointer 110 is perpendicular to the output shaft N of the second motor 130. To ensure normal operation of the laser indication module, the output shaft M of the first motor 120 is perpendicular to the image plane.
Fig. 3 is a schematic view of the projection relationship of an arbitrary laser beam of the present invention on the image plane and on the object plane; Fig. 4 is a schematic view of the coincidence of the real-time laser point projection position P' with the target object projection position Q' on the image plane. As shown in Fig. 3 in combination with Fig. 4, the light spot indication method of the light spot indication robot of the present invention comprises the following steps. Step 1: the laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane A' on which mutually perpendicular X and Y axes are defined. Step 2: based on the content of the image plane A' captured by the camera module, information is input through the signal input module to determine the target object a' among the objects to be indicated a', b', c' and d', and the projection position Q' of the target object on the image plane A' is obtained. Step 3: the laser indication module is moved until the real-time projection position P' of the laser point coincides with the target object projection position Q' on the image plane A'; as shown in Fig. 4, the coincident indication position is the center of the target object a'.
As shown in Figs. 1 to 4, let O' be the projection position of the laser point on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor, and define the angle between the second motor and the X axis as 0° when the output shaft of the second motor is parallel to the X axis.
The above Step 3 specifically comprises:
Step 3-1: computing, in the image plane A', the angle θ between the line O'Q' and the X axis, the first motor driving the second motor to rotate to the θ±90° position;
Step 3-2: the camera module capturing in real time the projection position P' of the laser point on the image plane and comparing it with the target object projection position Q', the second motor driving the free end of the laser pointer to swing about the fixed end until the real-time projection position P' of the laser point on the image plane coincides with the target object projection position Q'.
In other words, in this embodiment, bringing the real-time laser point projection position P' into coincidence with the target object projection position Q' on the image plane A' is achieved by first computing the angle θ between the line O'Q' and the X axis and then swinging the free end of the laser pointer. In this embodiment the light spot indication robot is a shopping-guide robot, which works as follows. First, the laser indication module of the shopping-guide robot emits a laser beam while the camera module photographs the objects to be indicated, for example the various goods on a supermarket shelf, forming an image plane A' on which mutually perpendicular X and Y axes are defined. Based on the shelf goods captured by the camera module, the user inputs information through the signal input module to select, among the objects to be indicated a', b', c' and d', the target object a', that is, the item to be purchased, and the projection position Q' of that item on the image plane A' is obtained. Depending on the user's needs or habits, the input module may take several forms: the desired item may be selected by a mouse or touch-screen click, or the image plane may be divided into cells carrying specific position codes and the code of the cell containing the item entered by keyboard or voice. Once the desired item has been determined through any of these input modules, the laser indication module is moved until the real-time laser point projection position P' coincides with the item's projection position Q' on the image plane A', so that the laser indicates the item to be purchased. The indicated position is usually the center of the item. This indication method may of course involve some error, but since every item has a certain volume, the desired item is still correctly indicated even with some indication error. Furthermore, if the item itself is small, the laser indication position can be fine-tuned through the signal input module, for example with the up, down, left and right keys of a keyboard module on the robot body or on a remote control terminal, the control module moving the laser point accordingly to ensure that it lands on the item to be purchased.
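The two-stage move just described (orient with the first motor, then swing under camera feedback) can be sketched as a small control loop. The interfaces `capture_laser_point`, `set_first_motor` and `step_second_motor` are hypothetical placeholders for the robot's real camera and motor drivers, simulated here so the loop can run; the tolerance and step size are likewise assumptions.

```python
import math

def indicate(O, Q, capture_laser_point, set_first_motor, step_second_motor,
             tol=1.0, max_steps=1000):
    """Drive the laser projection P' onto the target projection Q' (Embodiment 1 style)."""
    # Step 3-1: angle theta of line O'Q' with the X axis; the first motor turns
    # the second motor's shaft to theta + 90 deg so the pointer swings along O'Q'.
    theta = math.degrees(math.atan2(Q[1] - O[1], Q[0] - O[0]))
    set_first_motor(theta + 90.0)
    # Step 3-2: swing the free end until the photographed spot P' meets Q'.
    for _ in range(max_steps):
        P = capture_laser_point()
        if math.dist(P, Q) <= tol:
            return P
        step_second_motor()          # one small swing increment toward Q'
    raise RuntimeError("laser point failed to converge on target")

# Toy simulation standing in for the camera and motors: each motor step moves
# the simulated spot up to 2 image-plane units along the line O'Q'.
O, Q = (0.0, 0.0), (30.0, 40.0)
spot = list(O)

def capture_laser_point():
    return tuple(spot)

def set_first_motor(angle_deg):
    pass                             # orientation only; no feedback modelled here

def step_second_motor():
    d = math.dist(spot, Q)
    if d > 0:
        f = min(2.0, d) / d
        spot[0] += (Q[0] - spot[0]) * f
        spot[1] += (Q[1] - spot[1]) * f

P = indicate(O, Q, capture_laser_point, set_first_motor, step_second_motor)
print(P)                             # lands within 1 unit of (30, 40)
```

The ±90° offset orients the swing plane along the line O'Q'; a real controller would also choose the swing direction from the sign of the remaining error between P' and Q'.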
In addition to the shopping-guide robot described in this embodiment, the light spot indication robot may also be a way-finding robot.
Embodiment 2
Besides the working mode described above, in which the real-time laser point projection position P' is brought into coincidence with the projection position Q' of the item to be purchased on the image plane A' by first computing the angle θ between the line O'Q' and the X axis and then swinging the free end of the laser pointer, this embodiment provides another working mode.
Specifically, as shown in Figs. 3 and 4, in this embodiment let O' be the projection position of the laser point on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor; let d be the distance between the focal point N of the lens in the camera module and the image plane A'; and define the angle between the second motor and the X axis as 0° when the output shaft of the second motor is parallel to the X axis.
Step 3 specifically comprises:
Step 3-1': computing, in the image plane A', the angle θ between the line O'Q' and the X axis, and, from the distance d between the focal point N and the image plane A', computing in the plane O'NQ' the angle α = ∠O'NQ';
Step 3-2': driving the first and second motors separately or simultaneously, the second motor rotating to the θ±90° angle and the laser pointer swinging to the angle α, so that the real-time projection position P' of the laser point on the image plane coincides with the target object projection position Q'.
In other words, in Embodiment 2, bringing the real-time laser point projection position P' into coincidence with the target object projection position Q' on the image plane A' is achieved by first computing the angle α from the distance d between the focal point N and the image plane A', and then simply swinging the laser pointer to the angle α.
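Numerically, steps 3-1' and 3-2' reduce to two arctangents. The minimal sketch below assumes, as the use of the focal distance d suggests, that the line NO' is perpendicular to the image plane, so that tan α = |O'Q'| / d; the coordinates and the distance are illustrative image-plane units, not values from the patent.

```python
import math

def aim_angles(O, Q, d):
    """Return (theta, alpha) in degrees for neutral projection O', target
    projection Q' and focal-point distance d (steps 3-1' and 3-2')."""
    dx, dy = Q[0] - O[0], Q[1] - O[1]
    theta = math.degrees(math.atan2(dy, dx))                 # slope of O'Q' vs the X axis
    alpha = math.degrees(math.atan2(math.hypot(dx, dy), d))  # swing angle at the focal point N
    return theta, alpha

theta, alpha = aim_angles(O=(0.0, 0.0), Q=(30.0, 40.0), d=50.0)
print(theta, alpha)   # |O'Q'| = 50 = d, so alpha is 45 deg; theta is about 53.13 deg
```

With θ and α known, the second motor is driven to θ±90° and the pointer swung through α in a single open-loop move, without the step-by-step camera feedback of Embodiment 1.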
The other technical features of this embodiment are the same as those of Embodiment 1; reference is made to the foregoing description, which is not repeated here.
Embodiment 3
In addition to the structures described in Embodiments 1 and 2, the driving device of the laser indication module may comprise a third motor and a fourth motor that respectively drive the laser pointer to swing along the mutually perpendicular X-axis and Y-axis directions. The operation of a robot using this driving device also differs from the two preceding embodiments: the first two steps of the indication method are the same as in Embodiments 1 and 2, while Step 3 differs. Specifically, the camera module captures in real time the projection position P' of the laser point on the image plane and compares it with the target object projection position Q', and Step 3 comprises: swinging the laser pointer along the X-axis direction until the X coordinate of the projection position P' of the laser point on the image plane equals that of the target object projection position Q'; and swinging the laser pointer along the Y-axis direction until the Y coordinate of the projection position P' equals that of the target object projection position Q'. That is, in this embodiment two motors drive the laser pointer along the X and Y directions respectively, completing the indication process of bringing the real-time laser point projection position P' into coincidence with the target object projection position Q' on the image plane.
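This axis-by-axis alignment can be sketched as two one-dimensional loops. The motors are modelled as simply nudging the laser point's image-plane coordinate by a fixed step per iteration; the step size, tolerance and function names are illustrative assumptions, and real hardware would re-photograph the spot after every nudge.

```python
def align_axis(p, q, step=1.0, tol=0.5):
    """Swing along one axis until the laser point's coordinate matches the target's."""
    while abs(p - q) > tol:
        p += step if q > p else -step    # the third/fourth motor nudges the pointer
    return p

def indicate_xy(P, Q):
    x = align_axis(P[0], Q[0])   # first match the X coordinates of P' and Q'
    y = align_axis(P[1], Q[1])   # then match the Y coordinates
    return (x, y)

print(indicate_xy(P=(0.0, 0.0), Q=(12.0, -7.0)))   # converges to (12.0, -7.0)
```

Because each axis is handled independently, no trigonometry is needed; the trade-off against Embodiment 2 is that convergence takes one camera-and-motor iteration per step rather than a single computed move.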
In summary, in the light spot indication robot and light spot indication method provided by the present invention, the laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined; based on the content of the image plane of the objects to be indicated captured by the camera module, information is input through the signal input module to determine a target object among the objects to be indicated and to obtain its projection position Q' on the image plane; and the laser indication module is moved until the real-time projection position P' of the laser point coincides with the target object projection position Q' on the image plane. The present invention converts the localization of the target object in three-dimensional space into a two-dimensional problem, comprising the acquisition of the two-dimensional coordinates of the target object and of the laser point and the computation needed to move the laser point to the target object, so that little computation is required and the target position is indicated quickly and accurately.

Claims (11)

  1. A light spot indication robot, comprising a robot body provided with a control module, a camera module and a laser indication module (100), the laser indication module emitting a laser beam, the camera module photographing objects to be indicated to form an image plane, the laser beam and the objects to be indicated each being projected onto the image plane to form a laser point projection position and object projection positions, characterized in that the light spot indication robot is further provided with a signal input module, wherein, based on the content of the image plane of the objects to be indicated captured by the camera module, information is input through the signal input module to determine a target object among the objects to be indicated, and the control module controls the laser indication module to move so that the laser point projection position coincides with the target object projection position on the image plane.
  2. The light spot indication robot according to claim 1, characterized in that the signal input module is a mouse module or a touch-screen module, the target object among the objects to be indicated being determined by a mouse or touch-screen click.
  3. The light spot indication robot according to claim 1, characterized in that the signal input module is a keyboard module or an audio input module, the image plane being divided into cells carrying specific position codes, and the target object among the objects to be indicated being determined by entering, via keyboard or voice, the code of the cell in which the target object is located.
  4. The light spot indication robot according to claim 1, characterized in that the laser indication module comprises a laser pointer (110) and a driving device, the driving device comprising a first motor (120) and a second motor (130), the second motor being fixed on the output shaft (M) of the first motor, the output shafts of the first and second motors being arranged mutually perpendicular and coplanar;
    the laser pointer comprises a fixed end (111) and a free end (112), the fixed end being fixed on the output shaft (N) of the second motor at the intersection point of the output shafts of the first and second motors, the free end swinging about the fixed end, and the swing plane of the laser pointer being perpendicular to the output shaft of the second motor.
  5. The light spot indication robot according to claim 4, characterized in that the output shaft of the first motor is perpendicular to the image plane.
  6. The light spot indication robot according to claim 1, characterized in that the laser indication module comprises a laser pointer and a driving device, the driving device comprising a third motor and a fourth motor that respectively drive the laser pointer to swing in mutually perpendicular directions.
  7. The light spot indication robot according to claim 1, characterized in that the light spot indication robot is a shopping-guide robot or a way-finding robot.
  8. A light spot indication method of a light spot indication robot, the light spot indication robot comprising a camera module, a laser indication module and a signal input module, characterized in that the light spot indication method comprises the following steps:
    Step 1: the laser indication module emits a laser beam, and the camera module photographs objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined;
    Step 2: based on the content of the image plane of the objects to be indicated captured by the camera module, information is input through the signal input module to determine a target object among the objects to be indicated, and the projection position Q' of the target object on the image plane is obtained;
    Step 3: the laser indication module is moved until the real-time projection position P' of the laser point coincides with the target object projection position Q' on the image plane.
  9. The light spot indication method according to claim 8, wherein the laser indication module (100) comprises a laser pointer (110) and a driving device, the driving device comprising a first motor (120) and a second motor (130), the second motor being fixed on the output shaft of the first motor, the output shafts of the first and second motors being arranged mutually perpendicular and coplanar;
    the laser pointer comprises a fixed end (111) and a free end (112), the fixed end being fixed on the output shaft of the second motor at the intersection point of the output shafts of the first and second motors, the free end swinging about the fixed end, and the swing plane of the laser pointer being perpendicular to the output shaft of the second motor;
    O' being the projection position of the laser point on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor, and the angle between the second motor and the X axis being defined as 0° when the output shaft of the second motor is parallel to the X axis;
    characterized in that Step 3 specifically comprises:
    Step 3-1: computing, in the image plane (A'), the angle θ between the line O'Q' and the X axis, the first motor driving the second motor to rotate to the θ±90° position;
    Step 3-2: the camera module capturing in real time the projection position P' of the laser point on the image plane and comparing it with the target object projection position Q', the second motor driving the free end of the laser pointer to swing about the fixed end until the real-time projection position P' of the laser point on the image plane coincides with the target object projection position Q'.
  10. The light spot indication method according to claim 8, wherein the laser indication module (100) comprises a laser pointer (110) and a driving device, the driving device comprising a first motor (120) and a second motor (130), the second motor being fixed on the output shaft of the first motor, the output shafts of the first and second motors being arranged mutually perpendicular and coplanar;
    the laser pointer (110) comprises a fixed end (111) and a free end (112), the fixed end being fixed on the output shaft of the second motor at the intersection point of the output shafts of the first and second motors, the free end rotating about the fixed end, and the rotation plane of the laser pointer being perpendicular to the output shaft of the second motor;
    O' being the projection position of the laser point on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor, d being the distance between the focal point N of the lens in the camera module and the image plane A', and the angle between the second motor and the X axis being defined as 0° when the output shaft of the second motor is parallel to the X axis,
    characterized in that Step 3 specifically comprises:
    Step 3-1': computing, in the image plane (A'), the angle θ between the line O'Q' and the X axis, and, from the distance d between the focal point N and the image plane A', computing in the plane O'NQ' the angle α = ∠O'NQ';
    Step 3-2': driving the first and second motors separately or simultaneously, the second motor rotating to the θ±90° angle and the laser pointer swinging to the angle α, so that the real-time projection position P' of the laser point on the image plane coincides with the target object projection position Q'.
  11. The light spot indication method according to claim 8, wherein the laser indication module comprises a laser pointer and a driving device, the driving device comprising a third motor and a fourth motor that respectively drive the laser pointer to swing along the mutually perpendicular X-axis and Y-axis directions, characterized in that the camera module captures in real time the projection position P' of the laser point on the image plane and compares it with the target object projection position Q', and Step 3 specifically comprises:
    swinging the laser pointer along the X-axis direction until the X coordinate of the projection position P' of the laser point on the image plane equals that of the target object projection position Q';
    swinging the laser pointer along the Y-axis direction until the Y coordinate of the projection position P' of the laser point on the image plane equals that of the target object projection position Q'.
PCT/CN2014/095772 2014-01-03 2014-12-31 Light spot indication robot and light spot indication method thereof WO2015101311A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/109,374 US10639795B2 (en) 2014-01-03 2014-12-31 Light spot indication robot and light spot indication method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410003146.1A CN104765380B (zh) 2014-01-03 2014-01-03 光点指示机器人及其光点指示方法
CN201410003146.1 2014-01-03

Publications (1)

Publication Number Publication Date
WO2015101311A1 true WO2015101311A1 (zh) 2015-07-09

Family

ID=53493257

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/095772 WO2015101311A1 (zh) 2014-01-03 2014-12-31 光点指示机器人及其光点指示方法

Country Status (3)

Country Link
US (1) US10639795B2 (zh)
CN (1) CN104765380B (zh)
WO (1) WO2015101311A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI734867B (zh) * 2017-11-20 2021-08-01 達明機器人股份有限公司 機器手臂作業軌跡的教導系統及方法
CN108181610B (zh) * 2017-12-22 2021-11-19 鲁东大学 室内机器人定位方法和系统
CN108363603B (zh) * 2018-01-29 2022-04-01 上海闻泰电子科技有限公司 信息指引方法、装置、移动终端以及存储装置
GB2574205B (en) * 2018-05-29 2021-06-16 Sony Interactive Entertainment Inc Robot interaction system and method
CN109141451B (zh) * 2018-07-13 2023-02-10 京东方科技集团股份有限公司 购物定位系统及方法、智能购物车、电子设备
CN110186451B (zh) * 2019-06-12 2023-04-18 英业达科技有限公司 适用于仓储系统的导航系统与物料输送载具的导航方法
CN112975940A (zh) * 2019-12-12 2021-06-18 科沃斯商用机器人有限公司 机器人控制方法、信息生成方法及机器人
CN112880560B (zh) * 2021-01-19 2022-12-30 广东博智林机器人有限公司 一种激光位置检测装置及设备
CN113771688B (zh) * 2021-09-28 2024-04-02 安徽绿舟科技有限公司 基于视觉引导电池定位的新能源汽车换电方法和装置
CN114457983B (zh) * 2022-02-28 2023-09-22 广东博智林机器人有限公司 交互系统、抹灰机器人及抹灰方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997024206A1 (fr) * 1995-12-27 1997-07-10 Fanuc Ltd Systeme robotique composite de detection
US20090234502A1 (en) * 2008-03-12 2009-09-17 Denso Wave Incorporated Apparatus for determining pickup pose of robot arm with camera
CN102323829A (zh) * 2011-07-29 2012-01-18 青岛海信电器股份有限公司 一种显示屏视角调整方法及显示设备
US20120194651A1 (en) * 2011-01-31 2012-08-02 Nikon Corporation Shape measuring apparatus
CN203673356U (zh) * 2014-01-03 2014-06-25 科沃斯机器人科技(苏州)有限公司 光点指示机器人

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002172575A (ja) * 2000-12-07 2002-06-18 Fanuc Ltd 教示装置
JP3782679B2 (ja) * 2001-05-09 2006-06-07 ファナック株式会社 干渉回避装置
US20060111814A1 (en) * 2004-11-19 2006-05-25 Shuji Hachitani Mobile robot
US9002511B1 (en) * 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
JP5028568B2 (ja) * 2006-01-24 2012-09-19 株式会社国際電気通信基礎技術研究所 ロボット制御システム
JP4871160B2 (ja) * 2007-02-16 2012-02-08 株式会社東芝 ロボットおよびその制御方法
JP5164811B2 (ja) * 2008-11-26 2013-03-21 キヤノン株式会社 作業システム及び情報処理方法
KR101590331B1 (ko) * 2009-01-20 2016-02-01 삼성전자 주식회사 이동 가능한 디스플레이 장치와 이를 구비한 로봇 및 그 디스플레이 방법
CN102411749A (zh) * 2011-08-24 2012-04-11 厦门市鼎朔信息技术有限公司 一种基于定位信息和网络显示终端的虚拟引导系统
CN102645932A (zh) * 2012-04-27 2012-08-22 北京智能佳科技有限公司 一种远程控制的导购机器人


Also Published As

Publication number Publication date
US20160368143A1 (en) 2016-12-22
US10639795B2 (en) 2020-05-05
CN104765380B (zh) 2017-04-19
CN104765380A (zh) 2015-07-08

Similar Documents

Publication Publication Date Title
WO2015101311A1 (zh) 光点指示机器人及其光点指示方法
CN109643127B (zh) 构建地图、定位、导航、控制方法及系统、移动机器人
JP5474202B2 (ja) 顔検出および画像測定に基づいて注視点を検出する方法および装置
EP1629366B1 (en) Single camera system for gesture-based input and target indication
US10888998B2 (en) Method and device for verifying one or more safety volumes for a movable mechanical unit
CN203673356U (zh) 光点指示机器人
US20150377606A1 (en) Projection system
TW201508561A (zh) 用於移動追蹤的光斑感測
US11669173B2 (en) Direct three-dimensional pointing using light tracking and relative position detection
CN107003744B (zh) 视点确定方法、装置和电子设备
Placidi et al. A Virtual Glove System for the Hand Rehabilitation based on Two Orthogonal LEAP Motion Controllers.
US9310851B2 (en) Three-dimensional (3D) human-computer interaction system using computer mouse as a 3D pointing device and an operation method thereof
TW201944047A (zh) 旋轉軸多自由度誤差量測系統及其方法
CN113784767A (zh) 热电堆阵列融合跟踪
JP2008204196A (ja) 情報入力システム
US20150213309A1 (en) Measurement method, measurement device, projection apparatus, and computer-readable recording medium
CN106774941B (zh) 触屏终端3d虚拟角色与场景摄像机运动冲突的解决方法
KR102460361B1 (ko) 캘리브레이션 시스템 및 방법
TWI419012B (zh) A method of positioning an optical beacon device for interaction of a large display device
WO2021190421A1 (zh) 基于虚拟现实的控制器光球追踪方法和虚拟现实设备
JP2011215692A (ja) 3次元3自由度回転パラメータ処理装置
JP2019507349A (ja) レーザー・トラッカー・システム
Vincze et al. What-You-See-Is-What-You-Get Indoor Localization for Physical Human-Robot Interaction Experiments
CN103425270B (zh) 光标控制系统
TWI446214B (zh) Remote control system based on user action and its method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14876607

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 15109374

Country of ref document: US

122 Ep: pct application non-entry in european phase

Ref document number: 14876607

Country of ref document: EP

Kind code of ref document: A1