CN203673356U - Light spot indication robot - Google Patents


Info

Publication number
CN203673356U
Authority
CN
China
Prior art keywords
module
motor
laser
robot
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn - After Issue
Application number
CN201420003872.9U
Other languages
Chinese (zh)
Inventor
汤进举 (Tang Jinju)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Ecovacs Commercial Robotics Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN201420003872.9U priority Critical patent/CN203673356U/en
Application granted granted Critical
Publication of CN203673356U publication Critical patent/CN203673356U/en
Anticipated expiration legal-status Critical
Withdrawn - After Issue legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

Provided is a light spot indication robot comprising a robot body on which a control module, a camera module and a laser indication module are arranged. The laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane. The laser beam and the objects to be indicated are projected onto the image plane, forming a laser spot projection position and object projection positions respectively. The robot is further provided with a signal input module: based on the content displayed on the image plane, information is input through the signal input module to determine the target object among the objects to be indicated, and the control module then drives the laser indication module so that the laser spot projection position on the image plane coincides with the target object projection position. The robot thus converts the three-dimensional localization of the target object into a two-dimensional problem; the amount of computation is small, and positions are indicated quickly and accurately.

Description

Light spot indication robot
Technical field
The utility model relates to a light spot indication robot, and belongs to the technical field of small household electrical appliance manufacturing.
Background art
The shopping guide robot is a common type of self-moving robot with strong user interactivity. Existing shopping guide robots usually indicate an object with a light spot: a laser pointer mounted on the robot body projects a laser spot onto the specified object, completing the guiding action. The detailed process is as follows: the robot first obtains the three-dimensional coordinates of the target object, and the control module then drives the laser pointer according to those coordinates so that the laser spot moves to the target object's position. However, the three-dimensional coordinates of the target object are often difficult to obtain in practice. Moreover, rotating the laser pointer by a certain angle in a certain direction according to three-dimensional coordinates requires a large amount of computation over the whole control procedure, and places high demands on the control module.
Summary of the utility model
The technical problem to be solved by the utility model is to overcome the deficiencies of the prior art by providing a light spot indication robot that converts the three-dimensional localization of a target object into a two-dimensional problem, requires little computation, and indicates positions quickly and accurately.
This technical problem is solved by the following technical solution:
A light spot indication robot comprises a robot body provided with a control module, a camera module and a laser indication module. The laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane. The laser beam and the objects to be indicated are projected onto the image plane, forming a laser spot projection position and object projection positions respectively. The robot is further provided with a signal input module: based on the content displayed on the image plane photographed by the camera module, information is input through the signal input module to determine the target object among the objects to be indicated, and the control module drives the laser indication module so that the laser spot projection position on the image plane coincides with the target object projection position.
To suit different user needs, the signal input module may take various forms. It may be a mouse module or a touch screen module, with which the target object among the objects to be indicated is selected by clicking or tapping; it may also be a keyboard module or an audio input module, in which case the image plane is divided into cells with position codes, and the target object is determined by entering, via keyboard or voice, the code of the cell containing it.
More specifically, the laser indication module comprises a laser pointer and a driving device. The driving device comprises a first motor and a second motor; the second motor is mounted on the output shaft of the first motor, and the output shafts of the two motors are mutually perpendicular and coplanar. The laser pointer comprises a fixed end and a free end: the fixed end is mounted on the output shaft of the second motor with the intersection point of the two output shafts as its fixed point, the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.
To ensure normal operation of the laser indication module, the output shaft of the first motor is perpendicular to the image plane.
Besides the above driving arrangement, the laser indication module may alternatively comprise a laser pointer and a driving device in which a third motor and a fourth motor drive the laser pointer to swing along two mutually perpendicular directions.
The light spot indication robot may be a shopping guide robot or a way-guiding robot.
In summary, the utility model converts the three-dimensional localization of the target object into a two-dimensional problem; the amount of computation is small, and positions are indicated quickly and accurately.
The technical solution of the utility model is described in detail below with reference to the drawings and specific embodiments.
Brief description of the drawings
Fig. 1 is a first structural schematic diagram of the laser indication module in Embodiment 1 of the utility model;
Fig. 2 is a second structural schematic diagram of the laser indication module in Embodiment 1 of the utility model;
Fig. 3 is a schematic diagram of the projection relationship of an arbitrary laser beam between the image plane and the physical plane;
Fig. 4 is a schematic diagram showing the real-time laser spot projection position P' coinciding with the target object projection position Q' on the image plane.
Detailed description of the embodiments
Embodiment 1
The utility model provides a light spot indication robot comprising a robot body on which a control module, a camera module and a laser indication module are arranged. The laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane. The laser beam and the objects to be indicated are projected onto the image plane, forming a laser spot projection position and object projection positions respectively. The robot is further provided with a signal input module: based on the content displayed on the image plane photographed by the camera module, information is input through the signal input module to determine the target object among the objects to be indicated, and the control module drives the laser indication module so that the laser spot projection position on the image plane coincides with the target object projection position.
To suit different user needs, the signal input module may take various forms. It may be a mouse module or a touch screen module, with which the target object is selected by clicking or tapping; it may also be a keyboard module or an audio input module, in which case the image plane is divided into cells with position codes and the target object is determined by entering, via keyboard or voice, the code of the cell containing it. Whichever form is used, the purpose is the same: to determine the target object among the objects to be indicated. For example, when the light spot indication robot is a shopping guide robot, the objects to be indicated are all the goods placed on a shelf, and the target object is the specific item the user wishes to buy. It should be noted that the signal input module may receive input directly on the robot body, or from a remote terminal over a wired or wireless link (broadband, Bluetooth, infrared, GPRS, 3G, WiFi, etc.).
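The cell-coded input described above can be sketched as follows. This is an illustrative sketch only: the grid size, image dimensions and code format (column letter plus row number) are assumptions for the example, not values specified by the patent.

```python
def cell_center(code, img_w=640, img_h=480, cols=8, rows=6):
    """Return the pixel center (x, y) of a coded cell such as 'B3',
    where the letter selects the column and the number the row.
    Grid and image dimensions here are illustrative assumptions."""
    col = ord(code[0].upper()) - ord('A')   # 'A' -> column 0
    row = int(code[1:]) - 1                 # '1' -> row 0
    cell_w, cell_h = img_w / cols, img_h / rows
    return ((col + 0.5) * cell_w, (row + 0.5) * cell_h)
```

A keyboard or voice input of "B3" would then resolve to the center of that cell on the image plane, which serves as the target projection position Q'.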
Fig. 1 and Fig. 2 are two structural schematic diagrams of the laser indication module in Embodiment 1 of the utility model. As shown in Fig. 1 in combination with Fig. 2, the laser indication module 100 comprises a laser pointer 110 and a driving device. The driving device comprises a first motor 120 and a second motor 130; the second motor 130 is mounted on the output shaft M of the first motor 120, and the output shaft M of the first motor 120 and the output shaft N of the second motor 130 are mutually perpendicular and coplanar. The laser pointer 110 comprises a fixed end 111 and a free end 112: the fixed end 111 is mounted on the output shaft N of the second motor 130 with the intersection point of the two output shafts as its fixed point, the free end 112 swings about the fixed end 111, and the swing plane of the laser pointer 110 is perpendicular to the output shaft N of the second motor 130. To ensure normal operation of the laser indication module, the output shaft M of the first motor 120 is perpendicular to the image plane.
Fig. 3 is a schematic diagram of the projection relationship of an arbitrary laser beam between the image plane and the physical plane; Fig. 4 is a schematic diagram showing the real-time laser spot projection position P' coinciding with the target object projection position Q' on the image plane. As shown in Fig. 3 in combination with Fig. 4, the light spot indication method of the robot comprises the following steps. Step 1: the laser indication module emits a laser beam, and the camera module photographs the objects to be indicated and forms an image plane A', on which mutually perpendicular X and Y axes are defined. Step 2: based on the content displayed on the image plane A', information is input through the signal input module to determine the target object a' among the objects to be indicated a', b', c' and d', giving the target object projection position Q' on the image plane A'. Step 3: the laser indication module is moved so that the real-time laser spot projection position P' on the image plane A' coincides with the target object projection position Q'; as shown in Fig. 4, the resulting indication position is the center of the target object a'.
As shown in Figs. 1 to 4, let O' be the projection position of the laser spot on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor, and define the angle between the second motor and the X axis as 0° when the output shaft of the second motor is parallel to the X axis.
The above step 3 specifically comprises:
Step 3-1: calculate the angle θ between the line O'Q' and the X axis in the image plane A'; the first motor then rotates the second motor to the angular position θ ± 90°;
Step 3-2: the camera module continuously captures the laser spot projection position P' on the image plane and compares it with the target object projection position Q'; the second motor swings the free end of the laser pointer about the fixed end until the real-time projection position P' coincides with Q'.
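Steps 3-1 and 3-2 can be sketched as follows. The pan angle θ is a direct two-dimensional calculation in the image plane, while the swing is a closed loop driven by camera feedback. The step size, tolerance and simulated spot motion are illustrative assumptions, not values from the patent.

```python
import math

def pan_angle_deg(O, Q):
    """Step 3-1: angle theta between the line O'Q' and the X axis
    of the image plane, computed purely in 2-D image coordinates."""
    return math.degrees(math.atan2(Q[1] - O[1], Q[0] - O[0]))

def align_spot(P0, Q, step=1.0, tol=0.5):
    """Step 3-2 as a simulated closed loop: once the first motor has
    turned the swing plane onto the line O'Q', each camera frame nudges
    the spot one step along that line until P' coincides with Q'."""
    P = list(P0)
    for _ in range(100000):              # safety bound on iterations
        dx, dy = Q[0] - P[0], Q[1] - P[1]
        dist = math.hypot(dx, dy)
        if dist <= tol:                  # P' and Q' coincide (within tol)
            return tuple(P)
        s = min(step, dist)              # do not overshoot the target
        P[0] += s * dx / dist
        P[1] += s * dy / dist
    raise RuntimeError("alignment did not converge")
```

Note that both computations involve only image-plane coordinates, which is the two-dimensional simplification the utility model claims.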
In other words, in this embodiment, bringing the real-time laser spot projection position P' into coincidence with the target object projection position Q' on the image plane A' is done by first calculating the angle θ between the line O'Q' and the X axis, and then swinging the free end of the laser pointer. In this embodiment the light spot indication robot is a shopping guide robot, whose working process is as follows. First, the laser indication module mounted on the robot emits a laser beam while the camera module photographs the objects to be indicated, such as the various goods on a supermarket shelf, forming an image plane A' on which mutually perpendicular X and Y axes are defined. Based on the shelf goods captured by the camera module, the user inputs information through the signal input module to determine the target object a' among the objects a', b', c' and d', that is, the item to be purchased, giving its projection position Q' on the image plane A'. Depending on the user's needs or habits, the input module may take various forms: the desired item may be selected by mouse click or touch screen, or the image plane may be divided into cells with position codes and the code of the cell containing the item entered by keyboard or voice. Once the desired item has been determined through any of these input modules, the laser indication module is moved so that the real-time laser spot projection position P' on the image plane A' coincides with the item's projection position Q', and the laser then indicates the item to be purchased. The indicated position is usually the center of the item. This indication method may of course have some error, but since goods have a certain volume, the correct item is still indicated even with small errors. Furthermore, if the item is small, the laser indication position can be fine-tuned through the signal input module: for example, by pressing the directional keys of a keyboard module on the robot body or on a remote terminal, the control module moves the laser spot accordingly, ensuring that it lands on the item to be purchased.
In addition to the shopping guide robot of this embodiment, the light spot indication robot may also be a way-guiding robot.
Embodiment 2
The real-time laser spot projection position P' on the image plane A' can be brought into coincidence with the target projection position Q' not only by the method of Embodiment 1, which first calculates the angle θ between the line O'Q' and the X axis and then swings the free end of the laser pointer, but also by the alternative method provided in this embodiment.
As shown in Figs. 3 and 4, in this embodiment let O' be the projection position of the laser spot on the image plane when the center line of the laser pointer is collinear with the output shaft of the first motor; let d be the distance between the focal point N of the lens in the camera module and the image plane A'; and define the angle between the second motor and the X axis as 0° when the output shaft of the second motor is parallel to the X axis.
Step 3 then specifically comprises:
Step 3-1': calculate the angle θ between the line O'Q' and the X axis in the image plane A'; then, from the distance d between the focal point N and the image plane A', calculate the angle α = ∠O'NQ' in the plane O'NQ';
Step 3-2': drive the first and second motors, in turn or simultaneously, so that the second motor rotates to the angle θ ± 90° and the laser pointer swings to the angle α, bringing the real-time laser spot projection position P' on the image plane into coincidence with the target object projection position Q'.
In other words, in Embodiment 2, the laser spot is brought onto the target projection position Q' by first calculating the angle α from the distance d between the focal point N and the image plane A', and then swinging the laser pointer directly to the angle α.
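Under a pinhole model, and assuming the ray from the focal point N to O' is normal to the image plane (consistent with the first motor's output shaft being perpendicular to the image plane), the open-loop angles of steps 3-1' and 3-2' reduce to two arctangents. This is a sketch of that assumption, not code given by the patent.

```python
import math

def open_loop_angles(O, Q, d):
    """Step 3-1' sketch: pan angle theta of the line O'Q' versus the
    X axis, and swing angle alpha = angle O'NQ', where N is the lens
    focal point at distance d from the image plane and the ray NO'
    is assumed normal to the image plane."""
    theta = math.degrees(math.atan2(Q[1] - O[1], Q[0] - O[0]))
    alpha = math.degrees(math.atan(math.hypot(Q[0] - O[0], Q[1] - O[1]) / d))
    return theta, alpha
```

Step 3-2' then rotates the second motor to θ ± 90° and swings the pointer to α in a single move, with no camera feedback loop required.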
The other technical features of this embodiment are the same as in Embodiment 1; please refer to the foregoing content, which is not repeated here.
Embodiment 3
Besides the structures described in Embodiments 1 and 2, the driving device of the laser indication module may also comprise a third motor and a fourth motor which drive the laser pointer to swing along the mutually perpendicular X and Y directions respectively. With this driving device the robot's course of action also differs from the previous two embodiments: the first two steps of the indication method are the same as in Embodiments 1 and 2, but step 3 differs. Specifically, the camera module continuously captures the laser spot projection position P' on the image plane and compares it with the target object projection position Q', and step 3 comprises: swinging the laser pointer along the X direction until the X coordinate of the projection position P' equals that of Q', then swinging it along the Y direction until the Y coordinate of P' equals that of Q'. That is, in this embodiment two motors drive the laser pointer along the X and Y directions respectively to bring the real-time projection position P' into coincidence with the target projection position Q'.
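The axis-by-axis alignment of Embodiment 3 can be sketched as two independent feedback loops, one per motor. The unit step and tolerance below are illustrative stand-ins for one motor increment and the coincidence threshold.

```python
def align_axiswise(P0, Q, step=1.0, tol=0.5):
    """Embodiment 3 sketch: the third motor swings the pointer along X
    until the spot's X coordinate matches that of Q', then the fourth
    motor repeats the process along Y. Each loop iteration stands in
    for one camera frame comparing P' with Q'."""
    P = list(P0)
    for axis in (0, 1):                          # X direction first, then Y
        while abs(Q[axis] - P[axis]) > tol:
            delta = Q[axis] - P[axis]
            P[axis] += max(-step, min(step, delta))  # bounded motor step
    return tuple(P)
```

Compared with Embodiment 1, no pan angle θ needs to be computed; the trade-off is that the spot travels an L-shaped path rather than a straight line toward the target.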
In summary, in the light spot indication robot provided by the utility model, the laser indication module emits a laser beam, and the camera module photographs the objects to be indicated to form an image plane on which mutually perpendicular X and Y axes are defined. Based on the content displayed on the image plane, information is input through the signal input module to determine the target object and obtain its projection position Q' on the image plane; the laser indication module is then moved so that the real-time laser spot projection position P' coincides with Q'. The utility model thus converts the three-dimensional localization of the target object into a two-dimensional problem: all computation, from obtaining the two-dimensional coordinates of the target object and the laser spot to moving the spot onto the target, is carried out in the image plane, so the amount of computation is small and positions are indicated quickly and accurately.

Claims (7)

1. A light spot indication robot, comprising a robot body provided with a control module, a camera module and a laser indication module, wherein the laser indication module emits a laser beam, the camera module photographs objects to be indicated and forms an image plane, and the laser beam and the objects to be indicated are respectively projected onto the image plane to form a laser spot projection position and object projection positions, characterized in that the robot is further provided with a signal input module, through which, based on the content displayed on the image plane photographed by the camera module, information is input to determine the target object among the objects to be indicated, and the control module drives the laser indication module so that the laser spot projection position on the image plane coincides with the target object projection position.
2. The light spot indication robot of claim 1, characterized in that the signal input module is a mouse module or a touch screen module, and the target object among the objects to be indicated is determined by clicking with the mouse or tapping the touch screen.
3. The light spot indication robot of claim 1, characterized in that the signal input module is a keyboard module or an audio input module, the image plane is divided into cells with position codes, and the target object among the objects to be indicated is determined by entering, via keyboard or voice, the code of the cell containing it.
4. The light spot indication robot of claim 1, characterized in that the laser indication module comprises a laser pointer and a driving device, the driving device comprises a first motor and a second motor, the second motor is mounted on the output shaft of the first motor, and the output shafts of the first motor and the second motor are mutually perpendicular and coplanar;
the laser pointer comprises a fixed end and a free end, the fixed end is mounted on the output shaft of the second motor with the intersection point of the two output shafts as its fixed point, the free end swings about the fixed end, and the swing plane of the laser pointer is perpendicular to the output shaft of the second motor.
5. The light spot indication robot of claim 4, characterized in that the output shaft of the first motor is perpendicular to the image plane.
6. The light spot indication robot of claim 1, characterized in that the laser indication module comprises a laser pointer and a driving device, the driving device comprises a third motor and a fourth motor, and the third motor and the fourth motor drive the laser pointer to swing along two mutually perpendicular directions respectively.
7. The light spot indication robot of claim 1, characterized in that the light spot indication robot is a shopping guide robot or a way-guiding robot.
CN201420003872.9U 2014-01-03 2014-01-03 Light spot indication robot Withdrawn - After Issue CN203673356U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201420003872.9U CN203673356U (en) 2014-01-03 2014-01-03 Light spot indication robot


Publications (1)

Publication Number Publication Date
CN203673356U 2014-06-25

Family

ID=50969629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201420003872.9U Withdrawn - After Issue CN203673356U (en) 2014-01-03 2014-01-03 Light spot indication robot

Country Status (1)

Country Link
CN (1) CN203673356U (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104765380A (en) * 2014-01-03 2015-07-08 科沃斯机器人科技(苏州)有限公司 Light spot indication robot and light spot indication method thereof
WO2015101311A1 (en) * 2014-01-03 2015-07-09 科沃斯机器人有限公司 Light spot indication robot and light spot indication method thereof
CN104765380B (en) * 2014-01-03 2017-04-19 科沃斯商用机器人有限公司 Light spot indication robot and light spot indication method thereof
US10639795B2 (en) 2014-01-03 2020-05-05 Ecovacs Robotics Co., Ltd. Light spot indication robot and light spot indication method thereof
CN104360633A (en) * 2014-10-10 2015-02-18 南开大学 Human-computer interaction system for service robot
CN109213202A (en) * 2018-08-17 2019-01-15 深圳蓝胖子机器人有限公司 Cargo arrangement method, device, equipment and storage medium based on optical servo
CN108858132A (en) * 2018-08-24 2018-11-23 安徽爱依特科技有限公司 A kind of suspension type shopping guide robot and its track
CN108858132B (en) * 2018-08-24 2020-09-01 安徽爱依特科技有限公司 Suspension type shopping guide robot and track thereof
CN113639588A (en) * 2021-07-29 2021-11-12 彩虹无人机科技有限公司 Laser indication and detection modularized integrated system

Similar Documents

Publication Publication Date Title
CN104765380A (en) Light spot indication robot and light spot indication method thereof
CN203673356U (en) Light spot indication robot
CN109352658B (en) Industrial robot positioning control method, system and computer readable storage medium
Pérez et al. Robot guidance using machine vision techniques in industrial environments: A comparative review
EP1629366B1 (en) Single camera system for gesture-based input and target indication
US10888998B2 (en) Method and device for verifying one or more safety volumes for a movable mechanical unit
US20180173318A1 (en) Method and apparatus for detecting gesture in user-based spatial coordinate system
US20150377606A1 (en) Projection system
CN104360891B (en) Visible images seeker emulates simple target simulation system and its analogy method
US20230057965A1 (en) Robot and control method therefor
CN101504275A (en) Hand-hold line laser three-dimensional measuring system based on spacing wireless location
KR20120067013A (en) Apparatus and method for indoor localization based on camera
CN103824282A (en) Touch and motion detection using surface map, object shadow and a single camera
CN102508578A (en) Projection positioning device and method as well as interaction system and method
EP3482340A1 (en) Laser marking system with through-the-lens autofocus
CN105300310A (en) Handheld laser 3D scanner with no requirement for adhesion of target spots and use method thereof
US20230224576A1 (en) System for generating a three-dimensional scene of a physical environment
TWI677671B (en) Rotary shaft multi-degree-of-freedom error measurement system and method
CN104697466B (en) Fountain type 3-D measuring apparatus
US20220191995A1 (en) Systems and methods for determining lighting fixture arrangement information
CN111780715A (en) Visual ranging method
TW200422754A (en) Method for determining the optical parameters of a camera
JP6693616B1 (en) Surveying system and surveying method
CN102551724B (en) Intelligent laser projection positioning device
KR101535801B1 (en) Process inspection device, method and system for assembling process in product manufacturing using depth map sensors

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
C56 Change in the name or address of the patentee
CP01 Change in the name or title of a patent holder

Address after: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee after: ECOVACS ROBOTICS Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee before: ECOVACS ROBOTICS (SUZHOU) Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20160511

Address after: Wuzhong Economic Development Zone in Suzhou City, Jiangsu Province, the River Street 215104 Youxiang Road No. 18 building 3

Patentee after: ECOVACS COMMERCIAL ROBOTICS Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee before: ECOVACS ROBOTICS Co.,Ltd.

AV01 Patent right actively abandoned

Granted publication date: 20140625

Effective date of abandoning: 20170419
