CN203706137U - Forming equipment with gesture control device - Google Patents
- Publication number
- CN203706137U CN203706137U CN201420071918.0U CN201420071918U CN203706137U CN 203706137 U CN203706137 U CN 203706137U CN 201420071918 U CN201420071918 U CN 201420071918U CN 203706137 U CN203706137 U CN 203706137U
- Authority
- CN
- China
- Prior art keywords
- former
- control device
- gesture
- user
- gesture control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Landscapes
- Numerical Control (AREA)
Abstract
The utility model relates to forming equipment (12) provided with a device for manufacturing a molded part and with an operating unit (2). The operating unit (2) of the forming equipment (12) is designed such that at least one user logged onto the operating unit can operate it contactlessly, via the user's gestures, by means of a gesture control device (7), wherein the gesture control device (7) comprises at least one recognition device (1), a resolver (5) and a converter (6).
Description
Technical field
The utility model relates to forming equipment comprising a device for manufacturing a molded part and an operating unit.
Background technology
Forming equipment consists of a shaping device (for example an injection molding machine, die-casting machine, press or similar device), optionally at least one robot and optionally at least one peripheral unit, which are connected to an operating unit and a control computer for controlling and adjusting the equipment. On the operating unit, various parameters, instructions and other data can, for example, be visualized, created and entered.
Summary of the utility model
The object of the utility model is to provide forming equipment that is improved over the prior art. In particular, the operation of the forming equipment should be ergonomic for one or more users and thereby bring improvements in manufacturing time and manufacturing cost.
According to the utility model, this is achieved in that the operating unit of the forming equipment is designed to be operated contactlessly, via gestures, by at least one logged-in user by means of a gesture control device, wherein the gesture control device comprises at least one recognition device, a resolver and a converter. A particularly preferred example of forming equipment according to the utility model is an injection molding installation with an injection molding machine, die-casting machine, press or similar device.
Brief description of the drawings
Further details and advantages of the utility model are explained below with reference to the embodiments illustrated in the accompanying drawings.
Fig. 1 shows, as an example, forming equipment 12 in the form of an injection molding installation, with a forming machine 3 (designed as an injection molding machine, die-casting machine, press or similar device), a corresponding peripheral unit 9, a robot 8, a gesture control device 7 (having an optical recognition device 1 coupled to the operating unit 2) and a control computer 10 (connected, for example, to the forming machine 3);
Fig. 2 shows a gesture control device 7 having an optical recognition device 1, a memory device 4, a resolver 5 and a converter 6.
Embodiment
Fig. 1 schematically shows the gesture control device 7 (having the optical recognition device 1 coupled to the operating unit 2) and the control computer 10, which is connected, for example, to the forming machine 3. In addition, at least one robot 8 and/or a peripheral unit 9 can optionally be coupled to the operating unit 2. The peripheral unit 9 can be, for example, a conveying or removal device, used for instance to transport molded parts, tools or the like.
For recognition and resolution by the gesture control device 7, at least one optical recognition device 1 is provided, which supplies the detected motion to the resolver 5. The converter 6 converts the data thus obtained and forwards it to the operating unit 2. The operating unit 2 processes the data entered in this way and forwards it via the control computer 10 to the forming machine 3, the robot 8 or the peripheral unit 9, or optionally shows the information on a display.
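The recognition/resolver/converter chain just described can be sketched in code. This is a minimal illustration only; the data layout, function names, units and scale factor are assumptions made for the sketch and are not part of the utility model:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One detected hand position (metres) with a timestamp (seconds)."""
    t: float
    x: float
    y: float
    z: float

def resolve(samples):
    """Resolver (5): reduce raw motion samples to a net displacement vector."""
    first, last = samples[0], samples[-1]
    return (last.x - first.x, last.y - first.y, last.z - first.z)

def convert(displacement, scale=100.0):
    """Converter (6): map the X component of a displacement to an input
    value for the operating unit (2). The scale factor is an assumed
    calibration, not a value taken from the patent."""
    dx, _, _ = displacement
    return round(dx * scale, 2)

# A hand moving 0.25 m along X over one second:
samples = [MotionSample(0.0, 0.00, 0.0, 0.0),
           MotionSample(1.0, 0.25, 0.0, 0.0)]
value = convert(resolve(samples))
```

In this sketch the operating unit would receive `value` and either forward it via the control computer or show it on the display.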
The arrangement of the optical recognition device 1, the operating unit 2, the forming machine 3 and the control computer 10 in Fig. 1 is only schematic; no conclusion about the mounting position of the individual components within the forming equipment 12 can be drawn from it. For example, the optical recognition device 1 can be fixed directly to the operating unit 2 and/or the forming machine 3, and/or optionally be designed to be movable. The gesture control device 7 can therefore be used at any position of the forming equipment 12. With this arrangement, the user issues instructions to the gesture control device 7 by motion in three-dimensional space.
The gesture control device 7 can be used, for example, for the following actions: navigating in the output unit; selecting input fields; setting or changing parameters; triggering actions; and controlling/adjusting movements. These actions are detailed below.
Navigation: The operating unit 2 can be, for example, a display or a touchscreen. Individual pages can be selected in the menu navigation on the operating unit 2. Basic functions familiar from individual views, such as scrolling, paging, zooming in and zooming out, can also be navigated, and cursor movement is realized by means of gesture control.
Selection: In the menu navigation of the operating unit 2, setting parameters, input fields, numerical values such as setpoints and actual values, icons and/or elements of any other type can be selected, in order to view them or to change them later.
Setting or changing parameters: A selected setting parameter or other element mentioned above can be changed on the operating unit 2 via gesture control. Setting parameters can be, for example, setpoints (position, speed and pressure), setpoint curves, screen brightness, or time and date. The position change, speed or acceleration of a gesture performed by the user can be converted either into an absolute setpoint value or into a change of the setpoint relative to an old value.
Example: The user holds a hand over the optical recognition device 1 for a predetermined minimum time (for example 1 second). The user then moves the hand a certain distance and holds the end position for a given minimum time. The optical recognition device 1 detects the displacement travelled by the user's hand. The detected motion is analyzed by the resolver 5, and by means of the converter 6 the data analyzed by the resolver 5 is converted into an input value for the operating unit 2. A measured distance of zero can thus correspond to the input value "zero" on the operating unit 2, and a distance of one meter measured by the optical recognition device 1 can correspond to the maximum value for the operating unit 2. Between the values zero and one meter, the desired input value for the operating unit 2 can be calculated by linear interpolation. While the operator performs this motion, the input value corresponding to the current hand position is shown on the operating unit 2 in real time. This helps the user to reach the desired input value very precisely via the gesture.
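The linear interpolation in this example can be sketched as follows (the maximum value of 100 is an illustrative assumption; the patent only fixes the endpoints zero and one meter):

```python
def displacement_to_input(distance_m, max_distance_m=1.0, max_value=100.0):
    """Map a measured hand displacement to an operating-unit input value:
    0 m -> 0, max_distance_m -> max_value, linear in between.
    Values outside the measurable range are clamped."""
    clamped = min(max(distance_m, 0.0), max_distance_m)
    return clamped / max_distance_m * max_value

displacement_to_input(0.5)  # half the range -> half the maximum value
```

Clamping mirrors the description: hand positions beyond the one-meter range cannot produce values above the operating unit's maximum.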
Alternatively and/or additionally, input values can be incremented and/or decremented on the gesture control device as a function of speed. This can mean that a comparatively fast motion changes the setpoint in larger strides, while the stride is reduced for slower motion.
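One way to realize such speed-dependent stepping is sketched below; the speed thresholds and stride sizes are illustrative assumptions, not values from the patent:

```python
def setpoint_step(speed_m_s, slow=0.2, fast=1.0, small_step=0.1, big_step=5.0):
    """Return the stride by which a setpoint changes for a gesture of the
    given speed: fine strides for slow motion, coarse strides for fast
    motion, linear interpolation in between."""
    if speed_m_s <= slow:
        return small_step
    if speed_m_s >= fast:
        return big_step
    frac = (speed_m_s - slow) / (fast - slow)
    return small_step + frac * (big_step - small_step)
```

A slow, deliberate hand motion would then nudge the setpoint by 0.1 per update, while a fast sweep changes it by 5.0 per update.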
Instead of changing a speed curve or other setpoint curve value by value via numerical input, a whole curve can also be resolved from a gesture. Different movement speeds of the user's gesture can, for example, be converted directly into a speed setpoint curve via the converter 6. It is also conceivable that the user "traces" the setpoint curve in front of the recognition device 1, the setpoint curve being drawn in three-dimensional space. The motion in the first dimension, along the X axis, and in the second dimension, along the Y axis, can then be transmitted via the resolver 5 and the converter 6 to the operating unit 2 as the new setpoints for the curve.
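Reading the traced X motion as the curve's time axis and the Y motion as its setpoint value can be sketched like this (normalization to a 0..1 time axis is an assumption for the sketch; real scaling would come from machine configuration):

```python
def trace_to_curve(samples):
    """Convert (x, y) hand positions from a traced gesture into a
    setpoint curve: X is read as the time axis (normalized to 0..1),
    Y as the setpoint value."""
    xs = [x for x, _ in samples]
    x0, x1 = min(xs), max(xs)
    span = x1 - x0
    return [((x - x0) / span, y) for x, y in samples]
```

Each returned pair is one point of the new setpoint curve handed to the operating unit.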
Triggering actions: Instructions that were previously entered via operating hardware (for example a mouse button, a key on a keyboard or an on-screen button on a touchscreen) can now be entered by gesture. Individual motions in the entire automation cycle of the forming equipment 12, for example the opening and/or closing of a protective device or the start of a data-record storage operation, can now be triggered by gesture.
Controlling and adjusting movements: In this case, not only can movements in the forming machine 3, the peripheral unit 9 or the robot 8 be triggered by gesture; the position, speed or acceleration of the operator's gesture can also be resolved and then produce a corresponding reaction on the forming machine 3, the robot 8 and/or the peripheral unit 9. It is irrelevant here whether the reaction on the machine occurs simultaneously with the gesture, or only after the gesture has finished and has been confirmed by the user. An application example is the "teaching" of robot movements.
The activation and deactivation of the gesture control device 7 (taking the gesture control device out of operation) and the identification of the user are important components of the system, in order to avoid unintentionally triggering actions. Identification establishes which user is authorized to operate the machine. Any identification method known today can be used here. This task can also be performed by the optical recognition device 1 itself, with the user being identified by biometric features. The operator can also be identified by special gestures, for example a "login and logout gesture".
The resolution of gestures by the optical recognition device 1 should only be started and stopped as intended. That is, the gesture recognition process is started or ended by means of, for example, a "start/stop gesture". Two schemes can serve as examples for preventing the unintentional start of actions in the injection molding installation 12: a two-hand scheme in which the position of the left hand indicates during which period the gestures of the right hand are to be resolved, and a second two-hand scheme in which both hands perform the same gesture.
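The first two-hand scheme amounts to gating the right-hand gesture stream on the left-hand pose. A minimal sketch, with pose and gesture names chosen purely for illustration:

```python
def gated_right_hand_gestures(frames, enable_pose="open_palm"):
    """Resolve a right-hand gesture only while the left hand holds the
    enable pose. Each frame is a dict with the detected left-hand pose
    and right-hand gesture; all names here are illustrative."""
    return [f["right"] for f in frames if f["left"] == enable_pose]

frames = [
    {"left": "open_palm", "right": "swipe_up"},
    {"left": "fist",      "right": "swipe_down"},  # ignored: not enabled
    {"left": "open_palm", "right": "circle"},
]
gated_right_hand_gestures(frames)  # ['swipe_up', 'circle']
```

Gestures made while the left hand is lowered or closed are simply never passed to the resolver, which is what prevents unintentional triggering.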
A further possibility is, of course, a conventional on/off switch that activates or deactivates the gesture control device.
The user preferably receives feedback from the gesture control device for orientation. Unlike with a touchscreen, the user lacks a surface to assist with orientation: when moving in three-dimensional space, visual or acoustic feedback can be employed to assist the user. The simplest possibility is feedback via the operating unit 2. Another possibility is projecting the feedback onto a surface, for example onto a machine cover or a protective fence, or into special glasses.
The system can also have additional functions that make working with the gesture control device even easier. For example, a predefined "quick gesture" can invoke a predefined action and so reduce the required hand travel.
A further convenience is for the optical recognition device 1 to "track" the user as the user moves in space. For this, the optical recognition device 1 must be designed to be movable and/or have a corresponding drive mechanism.
Fig. 2 schematically shows another example of the gesture control device 7, having an optical recognition device 1 and an internal memory device 4. The function of the gesture control device 7 essentially corresponds to the description of Fig. 1, with the difference that the information about the measured motion obtained by the recognition device 1 is not passed directly to the resolver 5 and the converter 6, but is temporarily held in the internal memory device 4, so that the operator's gesture, as the actual value, can be compared with the data stored in the memory device 4 as the setpoint. The gesture control device is connected to the operating unit 2 and the control computer 10 via an interface 11.
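Such a comparison of the buffered actual gesture against a stored setpoint gesture could look like the following sketch; the distance metric, the tolerance and the assumption that both traces have already been resampled to equal length are illustrative choices, not specified by the patent:

```python
import math

def gesture_matches(actual, template, tolerance=0.05):
    """Compare a buffered actual gesture (list of (x, y) points) against a
    stored setpoint gesture point by point, using the mean Euclidean
    point-to-point distance. Both traces are assumed to be resampled to
    the same length beforehand."""
    if len(actual) != len(template):
        return False
    mean_err = sum(math.dist(a, t)
                   for a, t in zip(actual, template)) / len(actual)
    return mean_err <= tolerance
```

If the match succeeds, the gesture control device would forward the corresponding command via interface 11 to the operating unit 2 or control computer 10.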
Claims (14)
1. Forming equipment (12), comprising a device for manufacturing a molded part and an operating unit (2), characterized in that the operating unit (2) of the forming equipment (12) is designed to be operated contactlessly, via the user's gestures, by at least one logged-in user by means of a gesture control device (7), wherein the gesture control device (7) comprises at least one optical recognition device (1), a resolver (5) and a converter (6).
2. The forming equipment (12) as claimed in claim 1, characterized in that at least one robot (8) and/or peripheral unit (9) of the forming equipment (12) can be operated via the gesture control device (7).
3. The forming equipment (12) as claimed in claim 1 or 2, characterized in that the at least one optical recognition device (1) is fixed to the forming machine (3) or the operating unit (2).
4. The forming equipment (12) as claimed in claim 1 or 2, characterized in that the gesture control device (7) can be used in a mobile manner.
5. The forming equipment (12) as claimed in claim 4, characterized in that the gesture control device (7) is designed to be movable and, by means of a drive unit, follows the user as the user moves in space, in order to detect the user's gestures.
6. The forming equipment (12) as claimed in claim 1 or 2, characterized in that the gesture control device (7) can be activated or deactivated by a start/stop gesture and/or by a switch.
7. The forming equipment (12) as claimed in claim 1, characterized in that the gesture control device (7) can identify the user by biometric features, by which the user can log onto the operating unit (2) of the forming equipment (12).
8. The forming equipment (12) as claimed in claim 1, characterized in that the gesture control device (7) can detect an identification gesture of the user, by which the user logs onto the operating unit (2) of the forming equipment (12).
9. The forming equipment (12) as claimed in claim 1 or 2, characterized in that the forming equipment (12) can only be operated by a logged-in user.
10. The forming equipment (12) as claimed in claim 1, characterized in that parameters important for the forming machine (3), the operating unit (2), the robot (8) and the peripheral unit (9) can be changed by the gesture control device (7).
11. The forming equipment (12) as claimed in claim 1, characterized in that parameters important for identifying the user can be changed by the gesture control device (7).
12. The forming equipment (12) as claimed in claim 1, characterized in that, after a gesture is recognized, a characteristic signal from the forming equipment (12) is perceptible to the user.
13. The forming equipment (12) as claimed in claim 1 or 2, characterized in that the gesture control device (7) has a memory device (4), preferably within the gesture control device (7), for temporarily storing the information detected via gestures.
14. The forming equipment (12) as claimed in claim 3, characterized in that the optical recognition device (1) is fixed to a cover element of the forming machine (3).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201420071918.0U CN203706137U (en) | 2014-02-20 | 2014-02-20 | Forming equipment with gesture control device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN203706137U | 2014-07-09 |
Family
ID=51056543
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201420071918.0U | Forming equipment with gesture control device | 2014-02-20 | 2014-02-20 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN203706137U (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105058396A (en) * | 2015-07-31 | 2015-11-18 | 深圳先进技术研究院 | Robot teaching system and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C14 | Grant of patent or utility model | |
| GR01 | Patent grant | |
| CX01 | Expiry of patent term | Granted publication date: 20140709 |