WO2019044400A1 - Motor drive system - Google Patents

Motor drive system

Info

Publication number
WO2019044400A1
WO2019044400A1 (PCT/JP2018/029436)
Authority
WO
WIPO (PCT)
Prior art keywords
operation command
user
motor
gesture
unit
Application number
PCT/JP2018/029436
Other languages
English (en)
Japanese (ja)
Inventor
山崎 正裕
憲一 相馬
Original Assignee
株式会社日立産機システム
Application filed by 株式会社日立産機システム (Hitachi Industrial Equipment Systems Co., Ltd.)
Priority to CN201880048127.7A (published as CN110945778B)
Priority to JP2019539123A (published as JP6915066B2)
Publication of WO2019044400A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H02 GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02P CONTROL OR REGULATION OF ELECTRIC MOTORS, ELECTRIC GENERATORS OR DYNAMO-ELECTRIC CONVERTERS; CONTROLLING TRANSFORMERS, REACTORS OR CHOKE COILS
    • H02P27/00 Arrangements or methods for the control of AC motors characterised by the kind of supply voltage
    • H02P27/04 Arrangements or methods for the control of AC motors characterised by the kind of supply voltage using variable-frequency supply voltage, e.g. inverter or converter supply voltage
    • H02P27/06 Arrangements or methods for the control of AC motors characterised by the kind of supply voltage using variable-frequency supply voltage, e.g. inverter or converter supply voltage using dc to ac converters or inverters
    • H02P31/00 Arrangements for regulating or controlling electric motors not provided for in groups H02P1/00 - H02P5/00, H02P7/00 or H02P21/00 - H02P29/00

Definitions

  • The present invention relates to a technique for operating and configuring, by gesture, a motor drive system provided with a motor, as typified by industrial equipment.
  • In Patent Document 1, a technique for performing complicated operations by combining a touch panel with gesture recognition has been proposed.
  • An object of the present invention is to improve safety when operating and configuring the motor drive system by gesture.
  • To this end, the operating state of the motor is managed, and the operation command is confirmed based on the relationship between the recognition result of the gesture recognition unit and the motor's operating state.
  • As a result, safety when operating and configuring the motor drive system by gesture can be improved.
  • FIG. 1 is a configuration diagram showing a configuration of a system according to a first embodiment.
  • FIG. 2 is a block diagram showing an example of a functional configuration of the system according to the first embodiment.
  • FIG. 3 is a block diagram showing a hardware configuration of the control unit according to the first embodiment.
  • FIG. 4 is a diagram showing the configuration of an operation command correspondence table provided in the control unit according to the first embodiment.
  • FIG. 5 is a diagram showing the configuration of a user confirmation necessity list provided in the control unit according to the first embodiment.
  • FIG. 6 is a diagram showing the configuration of a user confirmation method list provided in the control unit according to the first embodiment.
  • FIG. 7 is a diagram showing the configuration of a parameter generation table provided in the control unit according to the first embodiment.
  • FIG. 12 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a hand.
  • FIG. 13 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a rotation gesture.
  • FIG. 14 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a rotation gesture and confirms that the gesture has continued for a predetermined time.
  • FIG. 15 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a frequency change gesture.
  • FIG. 16 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a lock release gesture.
  • FIG. 17 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes a frequency change gesture in the unlocked state.
  • FIG. 18 is a diagram showing a display example of the display unit when the system according to the first embodiment recognizes an emergency stop gesture.
  • FIG. 19 is a block diagram illustrating an example of a functional configuration of a system according to a second embodiment.
  • A configuration diagram shows the configurations of a drive unit and an emergency stop reception unit included in an inverter according to the second embodiment.
  • A further diagram shows the processing flow when the system according to the second embodiment acquires a recognition result and transmits an operation command and an emergency stop signal to the inverter.
  • Another diagram illustrates the processing flow when the system according to the third embodiment acquires an error from the gesture recognition unit.
  • FIG. 1 is a block diagram showing the configuration of a system (motor drive system) according to the first embodiment.
  • FIG. 1 shows a gesture recognition unit 10, a control unit 20, an inverter 30, a display unit 40, a motor 50, and a load 60.
  • the gesture recognition unit 10, the control unit 20, and the display unit 40 are arranged near the inverter 30, the motor 50, and the load 60, respectively.
  • With the gesture recognition unit 10 arranged in this way, the user can issue an operation command to the motor 50 simply by holding a hand over the motor 50 and moving it intuitively.
  • the gesture recognition unit 10 according to the first embodiment is installed so that when the user makes a gesture representing the rotation of the motor 50 from the direction of the rotation axis of the motor 50, the rotation direction can be recognized.
  • Alternatively, the motor 50 may be shown on a display unit 40 disposed far from the motor 50, and remote operation may be performed at a location away from the motor 50 while looking at the display unit 40.
  • In that case, the gesture recognition unit 10 is installed so that the user's gestures in front of the display unit 40 can be recognized.
  • Although the example shows the display unit 40 and the gesture recognition unit 10 connected to the control unit 20 by wired communication, data may be exchanged wirelessly.
  • the display unit 40 may be a glasses-type wearable display device.
  • the display unit 40 and the gesture recognition unit 10 may be integrated.
  • FIG. 2 is a block diagram illustrating an example of a functional configuration of the system according to the first embodiment.
  • The gesture recognition unit 10 recognizes the position and/or movement of a part of the human body, such as a hand, an arm, a finger, the face, or the trunk.
  • It includes an imaging unit 101, a matching processing unit 102, and a feature amount database 103, and outputs gesture information including the type of gesture.
  • the imaging unit 101 includes, for example, an infrared camera and an infrared LED (Light Emitting Diode) that emits infrared light.
  • The infrared LED emits infrared light, and the infrared camera detects the light reflected by an object near the imaging unit 101; the detected infrared image is then output.
  • The matching processing unit 102 performs image processing on the gesture image output by the imaging unit 101 and extracts data corresponding to the data items stored in the feature amount database 103 (for example, the position of a hand or finger in the image, the coordinates of its movement, direction vectors, and so on).
  • Position extraction proceeds, in the case of an infrared image, by extracting the white area, then the hand area, then the fingertip coordinates, and so on. In the case of an image from a visible-light camera, the area showing human skin color is extracted first, after which the same processing as for the infrared image yields the position of the hand, fingers, or the like.
  • Movement can be extracted from the time-series position information of the gesturing body part, such as a hand, across multiple images captured at consecutive times. For example, comparing the position of the hand in the image at time t with its position at time t+1 reveals the direction in which the hand is moving; further, when the hand position traces a circle across the images from time t to time t+n, it can be determined that the hand is being "rotated".
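The time-series comparison described above can be sketched in a few lines. The function names, the image-coordinate convention (y grows downward), and the centroid-angle rotation test are illustrative assumptions, not the patent's implementation:

```python
import math

def movement_direction(pos_t, pos_t1):
    """Dominant direction of hand motion between two frames (image coords: y grows downward)."""
    dx = pos_t1[0] - pos_t[0]
    dy = pos_t1[1] - pos_t[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def is_rotation(positions, min_angle=2 * math.pi * 0.8):
    """Heuristic: positions from time t to t+n trace a circle if the accumulated
    angle around their centroid covers most of a full turn."""
    cx = sum(p[0] for p in positions) / len(positions)
    cy = sum(p[1] for p in positions) / len(positions)
    angles = [math.atan2(p[1] - cy, p[0] - cx) for p in positions]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap jumps across the -pi / +pi boundary
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return abs(total) >= min_angle
```

Feeding twelve points evenly spaced on a circle satisfies `is_rotation`, while points along a straight line do not.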
  • The matching processing unit 102 refers to the feature amount database 103, identifies the gesture type from the degree of correlation between the position and/or movement extracted from the captured image and the entries of the feature amount database 103, and outputs it as gesture information.
  • The feature amount database 103 stores feature amounts of hand position and/or movement and is referred to by the matching processing unit 102. Specifically, for each gesture type such as "rotation" or "flick", it accumulates position and/or movement features (for example, the relative coordinates of the index fingertip, and their change over time, with the thumb's fingertip as the origin). The position may instead be a relative distance. Reference to the feature amount database 103 may also be limited to cases where the gesture type cannot be identified by the processing in the matching processing unit 102 alone.
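The "degree of correlation" matching against the feature amount database can be sketched as follows. Cosine similarity, the dictionary layout, and the 0.9 threshold are illustrative assumptions standing in for whatever correlation measure the real unit uses:

```python
def match_gesture(features, feature_db, threshold=0.9):
    """Pick the gesture type whose stored template correlates best with the
    observed feature vector; return None when nothing is similar enough."""
    def similarity(a, b):
        # cosine similarity as a simple stand-in for "degree of correlation"
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    best_type, best_score = None, 0.0
    for gesture_type, template in feature_db.items():
        s = similarity(features, template)
        if s > best_score:
            best_type, best_score = gesture_type, s
    return best_type if best_score >= threshold else None
```

Returning `None` below the threshold models the case where the gesture type cannot be identified from the database alone.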
  • The control unit 20 generates inverter parameters from the gesture information, including the gesture type, acquired from the gesture recognition unit 10, and transmits the parameter signal to the inverter, obtaining the user's confirmation where needed.
  • Through these parameters, the operation of the motor 50 can be configured in various ways; for example, its rotation can be made faster or slower. Conversely, by reading a parameter's value, the operating state, such as the rotational speed of the motor 50, can be obtained.
  • the gesture recognition unit 10 may be a wearable device.
  • a device such as a ring worn on the user's hand or toes, or a device worn on an arm may be used.
  • the gesture recognition unit 10 may include, for example, an acceleration sensor or a gyro sensor that can recognize the position and / or the movement of a hand or a finger instead of the imaging unit 101.
  • the display unit 40 is a display device provided with a touch panel, and the user may recognize an operation of tracing the touch panel with a hand or a finger as a gesture.
  • The control unit 20 includes a motor operation state management unit 201, an operation command generation unit 202, a parameter generation unit 203, a communication unit 204, and databases (an operation command correspondence table 205, a user confirmation necessity list 206, a user confirmation method list 207, and a parameter generation table 208).
  • the motor operation state management unit 201 manages the operation state of the motor 50.
  • Whether the motor 50 is at rest, rotating, accelerating, or decelerating is stored as an internal variable.
  • The operation command generation unit 202 acquires gesture information from the gesture recognition unit 10 and determines the operation command to the motor 50 from the acquired gesture information.
  • Based on the operating state of the motor 50 and the operation command, it is first confirmed, where necessary, whether the user really intends the operation command.
  • The operation command is then converted into a parameter of the inverter 30 and transmitted.
  • For the conversion, an instruction is issued to the parameter generation unit 203 described later.
  • For the transmission, an instruction is issued to the communication unit 204 described later.
  • the parameter generation unit 203 generates a parameter to be transmitted to the inverter from the type and data of the operation command.
  • the type of operation command is, for example, operation start, operation stop, frequency change, and the like.
  • the data of the operation command indicates, for example, the value of the frequency to be changed if the type of the operation command is “frequency change”.
  • the communication unit 204 communicates with the inverter 30 and an external device. For example, communication is performed using the Modbus communication protocol.
  • the operation command correspondence table 205 is a table in which the correspondence between the gesture recognition result and the operation command is described.
  • FIG. 4 shows a configuration example of the operation command correspondence table 205.
  • In the table, an operation command 2052 is described for each gesture recognition result 2051. For example, if the gesture recognition result is "rotate a hand", the operation command is "operation start". Note that the gesture recognition results of "locking" and "opening a key" indicate not operation commands to the motor 50 but the locking and unlocking of the "locked state" described later.
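As a rough sketch, the operation command correspondence table 205 behaves like a simple lookup. The gesture and command identifiers below are made-up names standing in for the entries of FIG. 4:

```python
# Illustrative recreation of the operation command correspondence table 205.
# Gesture/command names are assumptions; "unlock"/"lock" map to changes of the
# "locked state" rather than to motor operation commands, as the text notes.
OPERATION_COMMAND_TABLE = {
    "rotate_hand": "operation_start",
    "both_hands_up": "emergency_stop",
    "hand_up_down": "frequency_change",
    "opening_a_key": "unlock",
    "locking": "lock",
}

def gesture_to_command(gesture_recognition_result):
    """Look up the operation command for a recognition result (None if unknown)."""
    return OPERATION_COMMAND_TABLE.get(gesture_recognition_result)
```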
  • the user confirmation necessity list 206 is a list indicating whether or not confirmation to the user is necessary. Whether the confirmation is necessary or not depends on the operation command and the operating state of the motor 50.
  • the structural example of the user confirmation necessity list 206 is shown in FIG.
  • the user confirmation necessity list 206 includes an operation command 2061 and an operation state of the motor 50.
  • the operating state of the motor 50 is, for example, stopping 2062, rotating (constant speed) 2063, rotating (accelerating) 2064, rotating (decelerating) 2065 and the like.
  • For example, when the operation command 2061 is "operation start" and the operating state of the motor 50 is "stopping" 2062, user confirmation is "necessary". Misrecognition here would be dangerous, because an operation start would set the motor 50 rotating from a stopped state; confirmation from the user is therefore required.
  • When the operation command 2061 is "emergency stop", user confirmation is "not required" regardless of the operating state of the motor 50. An emergency stop is the operation command to be executed with the highest priority, so it is transmitted to the inverter immediately, without user confirmation.
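The necessity lookup keyed by both the operation command and the motor's operating state might be sketched like this; the specific (command, state) entries and the fail-safe default are assumptions, while the emergency-stop rule follows the text:

```python
# Sketch of the user confirmation necessity list 206: whether confirmation is
# needed depends on both the operation command and the motor's operating state.
# The (command, state) entries below are illustrative assumptions.
CONFIRMATION_REQUIRED = {
    ("operation_start", "stopped"): True,        # misrecognition would start the motor
    ("operation_stop", "rotating_constant"): False,
    ("frequency_change", "rotating_constant"): True,
}

def needs_user_confirmation(command, motor_state):
    if command == "emergency_stop":
        # Per the text: always executed immediately, never held for confirmation.
        return False
    # Defaulting unknown combinations to "confirmation required" is a fail-safe
    # choice made here, not stated in the source.
    return CONFIRMATION_REQUIRED.get((command, motor_state), True)
```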
  • the user confirmation method list 207 is a list showing how to confirm when confirmation to the user is required.
  • FIG. 6 shows a configuration example of the user confirmation method list 207.
  • the user confirmation method list 207 includes an operation command 2071, a confirmation method 2072 and the like.
  • For example, the confirmation method 2072 "continue for a predetermined time" indicates that the user's confirmation is considered obtained when the hand-rotation gesture that starts operation is performed continuously for a predetermined fixed time. That is, when the user continues the rotation gesture for a predetermined time such as 3 seconds, the control unit 20 determines that the user truly intends to start operation.
  • Another confirmation method 2072 is "lock release". Unlocking is, for example, a gesture such as turning a door key, and is distinct from the other operation command gestures. When the unlocking gesture is detected, the control unit 20 determines that confirmation by the user has been obtained.
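The "continue for a predetermined time" confirmation can be sketched as a pure function over timestamped recognition samples (a real system would run this in a loop against live recognition results); the 3-second default follows the example in the text, the sample format is an assumption:

```python
def confirmed_by_continuation(samples, required_gesture, hold_seconds=3.0):
    """samples: list of (timestamp, gesture) pairs in time order.
    Confirmation succeeds once the required gesture has been recognized
    continuously for hold_seconds; any interruption resets the timer."""
    start = None
    for ts, gesture in samples:
        if gesture == required_gesture:
            if start is None:
                start = ts
            if ts - start >= hold_seconds:
                return True
        else:
            start = None  # gesture interrupted: restart the count
    return False
```

Resetting on interruption matches the gauge behavior described later, where the gauge fills only while the gesture operation continues.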
  • the parameter generation table 208 is a list for generating parameters to be transmitted to the inverter based on the type and data of the operation command.
  • FIG. 7 shows a configuration example of the parameter generation table (operation command-parameter relationship) 208. It comprises an operation command type 2081, operation command data 2082, parameter address 2083, parameter data 2084 and the like.
  • the operation command type 2081 indicates the type of operation command such as “operation start” or “frequency change”.
  • the operation command data 2082 indicates data such as how much the frequency is to be changed, for example, when the operation command type is “frequency change”.
  • For example, when the parameter data 2084 is "current value + 1" and the current frequency value is 50, the resulting operation command sets the frequency to 51.
  • Alternatively, a value obtained by multiplying the operation command data by a predetermined coefficient may be used: increase the coefficient to change the parameter greatly with small hand movements, or decrease it for finer adjustment.
  • the parameter address 2083 is an address for identifying a parameter managed by the inverter 30.
  • the parameter indicating the value of frequency is stored at the address “0x0006”.
  • the parameter value can be changed by changing the value of the data stored in this address.
  • the parameter of the parameter address “0x0001” indicates the operating state, and when the parameter data is 1, it indicates the start of operation, and when the parameter data is 0, it indicates the stop of the operation.
  • To stop the operation, therefore, the parameter address "0x0001" is written with parameter data "0".
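Putting the addresses together, parameter generation (table 208) can be sketched as a mapping from command type and data to an (address, data) pair. Addresses 0x0001 and 0x0006 follow the text; the coefficient handling is the illustrative scaling described above:

```python
# Sketch of parameter generation from the parameter generation table 208.
# Addresses 0x0001 (operating state: 1 = start, 0 = stop) and 0x0006
# (frequency) follow the text; everything else is an illustrative assumption.
def generate_parameter(command_type, command_data=0, current_frequency=0, coefficient=1):
    if command_type == "operation_start":
        return 0x0001, 1
    if command_type == "operation_stop":
        return 0x0001, 0
    if command_type == "frequency_change":
        # a larger coefficient turns small hand movements into larger changes
        return 0x0006, current_frequency + command_data * coefficient
    raise ValueError(f"unknown operation command: {command_type}")
```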
  • the control unit 20 acquires gesture information as a recognition result from the gesture recognition unit 10 (step S101).
  • The control unit 20 refers to the operation command correspondence table 205 to obtain the operation command corresponding to the gesture information (step S102).
  • whether or not the acquired operation command is executable is confirmed (step S103).
  • the detailed process flow of confirmation will be described later with reference to FIG.
  • As a result of the execution confirmation processing, if execution of the operation command is OK (step S104; Yes), the parameter is notified to the inverter 30 (step S105). If execution of the operation command is NG (step S104; No), the process ends without doing anything.
  • the detailed process flow of the parameter notification will be described later with reference to FIG.
  • the control unit 20 acquires the operation state of the motor 50 managed by the motor operation state management unit 201 (step S201).
  • The motor operation state management unit 201 inquires of the inverter 30 about the operating state of the motor 50 and acquires it. Specifically, the value at the address where the operating state is stored is acquired from the storage unit in the inverter 30 using Modbus communication or the like. For example, the address 0x0021 holds the operating state: if the value is "1", the motor is determined to be operating, and if it is "0", it is stopped.
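Reading the operating state might be sketched as below. The address 0x0021 and the 1/0 encoding follow the text; the register read is abstracted as a callable rather than tied to a particular Modbus library, so the sketch stays self-contained:

```python
# Sketch of querying the motor's operating state from the inverter. A real
# implementation would issue a Modbus register read over the wire; here the
# transport is passed in as a function of the register address.
OPERATING_STATE_ADDRESS = 0x0021  # per the text: 1 = operating, 0 = stopped

def get_motor_state(read_register):
    """read_register(address) -> int, e.g. backed by a Modbus holding-register read."""
    value = read_register(OPERATING_STATE_ADDRESS)
    return "operating" if value == 1 else "stopped"
```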
  • the control unit 20 can obtain the operating state of the motor 50.
  • control unit 20 refers to the user confirmation necessity list 206 and acquires the necessity of confirmation of the operation command (step S202).
  • Where confirmation is needed, the control unit 20 obtains the user's confirmation to verify that the operation command is intended by the user.
  • When the operation command is "emergency stop", user confirmation is "not required" regardless of the operating state of the motor 50.
  • For an emergency stop, it is important to issue the operation command immediately and bring the system to a safe state without waiting for confirmation from the user.
  • In this way, the control unit 20 can reliably confirm potentially dangerous operation commands such as operation start, while transmitting the operation command for an emergency stop to the inverter 30 immediately.
  • the user can use the gesture to intuitively and safely issue an operation command to the motor 50.
  • If user confirmation is necessary (step S203; Yes), the confirmation is notified to the user (step S204). The detailed processing flow of the notification will be described later with reference to FIG.
  • If user confirmation is not necessary (step S203; No), it is determined that operation command execution is OK, and this executability confirmation process ends. The process then returns to step S104 shown in FIG. 8 and continues.
  • After notifying the user, if the user's confirmation is detected (step S205; Yes), it is determined that operation command execution is OK, and this executability confirmation process ends. If the user's confirmation is not detected (step S205; No), it is determined that operation command execution is NG, and this executability confirmation process ends.
  • control unit 20 acquires the operation command and the operation state of the motor 50 (step S301). Then, referring to the user confirmation method list 207, a confirmation method for the user is acquired (step S302).
  • When the confirmation method is "continue for a predetermined time", a gauge is displayed on the display unit 40 (step S303).
  • the gauge is, for example, a horizontally long rectangle or a tachometer, and is displayed with animation that approaches 100% over time only while the gesture operation continues. By looking at this gauge, the user recognizes that the gesture needs to be continued.
  • FIG. 13 and the figure that follows show examples of the display screen that displays a gauge on the display unit 40.
  • When the gesture operation continues for the predetermined time (step S304; Yes), it is determined that confirmation by the user has been obtained (step S305). If the gesture operation is interrupted before the predetermined time elapses (step S304; No), it is determined that confirmation by the user has not been obtained (step S310).
  • When the confirmation method is "lock release", the display unit 40 displays the unlocking operation method (step S306).
  • The unlocking operation method is shown, for example, as an image or animation of a key being turned by hand.
  • A warning message such as "The motor is rotating. Confirm that this is not a misrecognition and release the lock" may also be displayed. By looking at these displays, the user confirms that the gesture he or she made matches the gesture recognized by the machine, and also recognizes that unlocking is necessary before an operation command is transmitted.
  • FIGS. 15 and 16 show examples of the display screen on which the operation method of the lock release is displayed on the display unit 40.
  • When the unlocking operation is detected (step S307; Yes), the process proceeds to step S305.
  • If it is not detected (step S307; No), the process proceeds to step S310.
  • When the confirmation method is operation by multiple people, that operation method is displayed on the display unit 40 (step S308). For example, words such as "Please perform the gesture with multiple people at the same time" are displayed on the display unit 40. By looking at this display, the user recognizes that multiple people need to make the gesture simultaneously. For example, when the gesture recognition unit 10 detects two right hands, the gesture is recognized as one performed by multiple people.
  • If the motion of multiple people is detected (step S309; Yes), the process proceeds to step S305. If it cannot be detected (step S309; No), the process proceeds to step S310.
  • the control unit 20 acquires the gesture recognition result from the gesture recognition unit 10, and acquires the type and data of the operation command (step S401).
  • the control unit 20 refers to the parameter generation table 208, and acquires parameter addresses and parameter data (step S402).
  • an error check of parameter data is performed (step S403). If there is no error in the parameter data (step S404; Yes), the communication unit 204 of the control unit 20 transmits the parameter address and parameter data to the communication unit 301 of the inverter 30 (step S405). If there is an error in the parameter data (step S404; No), an error is notified (step S406), and the process is ended.
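The check-then-transmit portion of this flow (steps S403 to S406) can be sketched as follows; the numeric range check and the send callback are illustrative assumptions, since the source does not specify what the error check inspects:

```python
# Sketch of parameter notification: range-check the parameter data, then either
# transmit it to the inverter's communication unit or report an error.
# The 0..400 range and the send() signature are illustrative assumptions.
def notify_parameter(address, data, send, min_data=0, max_data=400):
    """send(address, data) stands in for the control unit's communication unit 204
    transmitting to the inverter's communication unit 301 (step S405)."""
    if not (min_data <= data <= max_data):  # error check of parameter data (S403/S404)
        return "error"                       # notify error and end (step S406)
    send(address, data)                      # transmit address and data (step S405)
    return "sent"
```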
  • the inverter 30 includes a communication unit 301, a parameter management unit 302, and a drive unit 303, and supplies power to the motor 50 for operation.
  • The communication unit 301 exchanges data with the control unit 20 and other external devices; for example, it transmits and receives parameter data. Multiple communication protocols, such as Modbus, EtherCAT, and TCP/IP, may be supported.
  • the parameter management unit 302 manages parameters stored inside the inverter 30.
  • Parameter information acquired by the communication unit 301 is reflected in the parameters stored in the inverter 30; conversely, information on the parameters stored in the inverter 30 is output to the communication unit 301 and transmitted to external devices.
  • the drive unit 303 supplies power for applying torque to the motor 50.
  • the control unit 20 includes a processing unit 211, a storage unit 212, an operation unit 213, a communication unit 214, a display unit 215, and the like.
  • the processing unit 211 is configured by, for example, a CPU (Central Processing Unit) and a program operating on the CPU. For example, processing programs of the motor operation state management unit 201, the operation command generation unit 202, and the parameter generation unit 203 are executed.
  • the storage unit 212 is configured by an appropriate storage device such as a read only memory (ROM), a random access memory (RAM), an auxiliary storage device, and the like, and stores various data and various programs. For example, the operation command correspondence table 205 and the user confirmation necessity list 206 are stored.
  • the storage unit 212 also has a work area of the processing unit 211.
  • The storage unit 212 may be configured from at least one of a built-in memory and a removable external memory. Part of its function may also be provided externally, as long as the processing unit 211 can access it.
  • the operation unit 213 includes, for example, a touch panel, a keyboard, a microphone, and the like, and receives user input.
  • The communication unit 214 exchanges data with other information processing apparatuses.
  • the control unit 20 performs communication processing and the like for accessing the inverter 30, the Internet, and other information processing apparatuses.
  • The communication unit 214 is not limited to a single communication method; for example, it may support several, such as Modbus, EtherCAT, Code Division Multiple Access (CDMA), Long Term Evolution (LTE), and wireless LAN.
  • The display unit 215 is configured of, for example, a liquid crystal display (LCD) or an organic electroluminescence (EL) display, and displays images and information.
  • An example of the display screen of the display unit 40 is shown in FIG. 12.
  • the display unit 40 displays a state display 400 indicating the operation state of the motor 50, icons 401 to 405 indicating the operation method of the gesture, a recognition result 406 of the user's hand, and the like.
  • FIG. 13 shows an example of a display screen when a gauge is displayed on the display unit 40.
  • a frame icon 407 is displayed around the “start of rotation” icon to indicate that the user has made a start of rotation gesture.
  • a gauge icon 408 indicates the operation duration time.
  • the black part of the gauge icon 408 displays an animation in which the area expands in the right direction.
  • the notification icon 409 is a notification displayed to urge the user to continue the gesture.
  • FIG. 14 shows an example of the display screen of the display unit 40 when the user continues the gesture and the gauge becomes full.
  • the black part of the gauge icon 408 is filled in the entire area.
  • the notification icon 409 displays that the continuation has been confirmed.
  • “motor operation state” of the state display 400 of the motor 50 is displayed as “during rotation”.
  • FIG. 15 shows an example of a display screen of the display unit 40 when a gesture for raising the frequency is made by the user.
  • a frame icon 407 is displayed around the "frequency up" icon to indicate that the user has made a gesture to raise the frequency.
  • the notification icon 409 displays that the motor 50 is rotating, and the user is urged to release the lock.
  • an icon 410 indicating the unlocking operation is displayed near the icon. The user can know how to unlock by looking at the icon 410.
  • the notification icon 409 displays a message that the lock has been released.
  • “motor operation state” of the state display 400 of the motor 50 is displayed as “during rotation (lock release)”, and a display indicating that the lock is released is performed.
  • the icon 411 displays the value of the frequency after the setting change. This screen shows that the frequency is set to 56 Hz. Each time the user moves a hand up or down, the frequency value changes according to the position of the hand.
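The behaviour in the items above (the frequency follows the hand's vertical position, but while the motor rotates a change first requires the unlock gesture) can be sketched as below. The normalized hand coordinate, the 0–60 Hz range, and the function names are illustrative assumptions, not taken from the patent.

```python
def hand_y_to_frequency(hand_y: float, f_min: float = 0.0, f_max: float = 60.0) -> float:
    """Map a normalized hand height (0.0 = bottom of frame, 1.0 = top) to a frequency in Hz."""
    hand_y = max(0.0, min(1.0, hand_y))  # clamp to the camera frame
    return round(f_min + hand_y * (f_max - f_min), 1)

def apply_frequency_command(motor_rotating: bool, lock_released: bool, hand_y: float):
    """While the motor rotates, a frequency change requires the lock to be released first."""
    if motor_rotating and not lock_released:
        return None  # caller should prompt the unlock gesture instead
    return hand_y_to_frequency(hand_y)
```

With this mapping, a hand held near the top of the frame yields a setpoint close to `f_max`, matching the 56 Hz example on the screen.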
  • FIG. 18 shows an example of a display screen of the display unit 40 when a gesture indicating an emergency stop is made by the user.
  • both of the user's hands are recognized, and the recognition results 406 and 412 are displayed.
  • the notification icon 409 also displays that an emergency stop is to be performed.
  • “motor operation state” of the state display 400 of the motor 50 is displayed as “during stop”.
  • the icons indicating the gesture operation methods allow the user to issue operation commands to the motor 50 intuitively. In addition, since the operating state of the motor 50 can be grasped easily, the motor 50 can be operated intuitively.
  • a gesture instructing normal operation of the motor 50 is usually composed of a one-hand gesture.
  • the gesture for the emergency stop operation command uses both hands, clearly separating it from the normal operation commands.
  • this is intuitive for the user, who simply holds up both hands to command an emergency stop.
  • since the gesture recognition unit 10 only has to distinguish whether two hands or one hand are present, the difference between the two cases is clear and the possibility of misrecognition can be reduced.
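The one-hand/two-hand separation described above can be sketched as a trivial classifier. The function and command names are illustrative; the patent does not specify this code.

```python
from typing import Optional

def classify_command(hands_detected: int, one_hand_gesture: Optional[str]) -> str:
    """Two raised hands always mean emergency stop; one hand carries normal commands."""
    if hands_detected >= 2:
        return "emergency stop"   # clearly separated from normal operation
    if hands_detected == 1 and one_hand_gesture:
        return one_hand_gesture   # e.g. "start rotation", "frequency up"
    return "none"
```

Because the branch on hand count is evaluated first, a two-hand pose can never be misread as a normal command.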
  • a safety stop function mounted on the inverter 30, such as STO (Safe Torque Off) or SS1 (Safe Stop 1), is used.
  • FIG. 19 is a block diagram illustrating an example of a functional configuration of a system according to a second embodiment.
  • the control unit 20 includes an emergency stop transmission unit 220 in addition to the components described in the first embodiment. Further, the inverter 30 includes an emergency stop reception unit 320 in addition to the components described in the first embodiment.
  • the emergency stop reception unit 320 is connected to the drive unit 303.
  • the emergency stop reception unit 320 cuts off the power that the drive unit 303 supplies to generate torque in the motor 50. As a result, the motor 50 receives no torque and stops rotating.
  • FIG. 20 is a diagram showing an example of the configuration of the drive unit 303 and the emergency stop reception unit 320 included in the inverter 30 according to the second embodiment. The mechanism by which the motor 50 is driven and brought to an emergency stop will be described with reference to FIG. 20.
  • the emergency stop reception unit 320 includes an STO signal reception unit 321 and an STO signal reception unit 322.
  • the STO signal reception unit 321 is a terminal for receiving an emergency stop signal; it passes the emergency stop signal output from the emergency stop transmission unit 220 to the drive unit 303.
  • the STO signal reception unit 322 is configured in the same way.
  • the driving unit 303 includes a gate driving unit 3035, a rectification circuit unit 3032, a DC smoothing circuit unit 3033, an inverter unit 3034, and a main body control unit 3030.
  • the rectifier circuit unit 3032 is formed of, for example, a diode bridge, and converts an AC voltage supplied from the external AC power supply 70 into a DC voltage.
  • the DC smoothing circuit unit 3033 is formed of, for example, a capacitor, and smoothes the DC voltage converted by the rectifier circuit unit 3032.
  • the inverter unit 3034 includes, for example, six IGBTs (Insulated Gate Bipolar Transistors), and converts the DC voltage smoothed by the DC smoothing circuit unit 3033 into an AC voltage.
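As a rough numerical illustration of the rectifier and smoothing stages above (not from the patent): an ideal diode bridge followed by a smoothing capacitor produces a DC bus voltage near the AC peak, √2 × V_RMS.

```python
import math

def dc_bus_voltage(v_rms: float) -> float:
    """Ideal smoothed output of a diode bridge rectifier: the AC peak value."""
    return math.sqrt(2) * v_rms

# A 200 V RMS supply yields a DC bus of roughly 283 V, which the
# six-IGBT inverter stage then chops back into a PWM AC waveform.
```

Real bus voltages sit somewhat lower because of diode drops and ripple under load.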
  • the gate drive unit 3035 is formed of, for example, a gate driver IC (Integrated Circuit), and drives the IGBT of the inverter unit 3034.
  • the main body control unit 3030 outputs a PWM (Pulse Width Modulation) control signal; it is realized, for example, as a program running on a CPU.
  • the main body control unit 3030 outputs the PWM control signal to the gate drive unit 3035, and the gate drive unit 3035 outputs the received PWM control signal to the inverter unit 3034.
  • the inverter unit 3034 generates a PWM controlled AC voltage using the received PWM control signal, and supplies the AC voltage to the motor 50. Thereby, drive control of the motor 50 is possible.
  • in an emergency stop, the gate drive unit 3035 receives an emergency stop signal from the STO signal reception units 321 and 322.
  • the gate drive unit 3035 that has received the emergency stop signal shuts off the PWM control signal supplied to the inverter unit 3034.
  • the PWM-controlled voltage supplied to the motor 50 by the inverter unit 3034 is thereby also cut off, and no torque is generated in the motor 50.
  • the drive unit 303 stops the torque supply to the motor 50 and stops the motor 50.
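The shutoff path described above can be sketched as follows, assuming two STO channels either of which blocks the PWM path to the inverter stage. The class and channel names are illustrative, not from the patent.

```python
class GateDrive:
    """Passes PWM to the inverter stage unless an STO input is asserted."""

    def __init__(self):
        # Two independent STO channels, matching reception units 321 and 322.
        self.sto_inputs = {"sto1": False, "sto2": False}

    def assert_sto(self, channel: str) -> None:
        self.sto_inputs[channel] = True  # emergency stop received on this channel

    def output(self, pwm_duty: float) -> float:
        # Either asserted STO channel cuts the PWM path, so no torque is produced.
        if any(self.sto_inputs.values()):
            return 0.0
        return pwm_duty
```

Asserting either channel alone is enough to remove torque, mirroring the redundant wiring of a real STO input pair.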
  • first, processing similar to that of FIG. 8 is performed: a gesture recognition result is obtained (step S101) and an operation command is obtained (step S102).
  • next, it is determined whether the type of the acquired operation command is "emergency stop"; if it is (step S501; Yes), the emergency stop transmission unit 220 transmits an emergency stop signal (step S502). If the operation command type is other than "emergency stop" (step S501; No), nothing is done. Thereafter, the same processing as step S103 and subsequent steps in FIG. 8 is executed.
  • in this way the motor 50 can be reliably stopped, which further improves safety.
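The dispatch in steps S501 and S502 can be sketched as below; the callback names are hypothetical helpers, not part of the patent.

```python
def dispatch_operation_command(command_type: str, send_emergency_stop, process_normally) -> None:
    """Route an operation command: emergency stop bypasses normal processing."""
    if command_type == "emergency stop":
        send_emergency_stop()            # corresponds to step S502
    else:
        process_normally(command_type)   # continue with step S103 onwards
```

Keeping the emergency-stop branch first ensures it is never delayed behind parameter generation.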
  • the control unit 20 inquires of the gesture recognition unit 10 about the recognition result (step S600). If the recognition result can be acquired (step S601; Yes), the process proceeds to step S102, and the same processing as step S102 and subsequent steps in FIG. 8 is executed.
  • if the recognition result cannot be obtained (step S601; No), it is checked whether a timeout has occurred (step S602). If a timeout has occurred (step S602; Yes), the emergency stop transmission unit 220 transmits an emergency stop signal (step S605).
  • if a timeout has not occurred (step S602; No), it is checked whether an error has been acquired from the gesture recognition unit 10 (step S603).
  • errors output from the gesture recognition unit 10 include, for example, a "camera contamination error" indicating that the camera lens of the imaging unit 101 is dirty, and a "communication speed decrease error" indicating that data communication between the gesture recognition unit 10 and the control unit 20 is extremely slow.
  • while such an error is occurring, the gesture recognition result cannot be correctly acquired from the gesture recognition unit 10; even if the user makes an emergency stop gesture, the control unit 20 cannot receive it, so the motor 50 cannot be stopped. To avoid this dangerous state, when an error of the gesture recognition unit 10 is detected, an emergency stop signal is sent to stop the motor 50 and keep the system in a safe state.
  • accordingly, if an error is acquired (step S603; Yes), the emergency stop transmission unit 220 transmits an emergency stop signal (step S605).
  • if no error can be acquired from the gesture recognition unit 10 (step S603; No), the process waits for a predetermined time interval (step S604) and then inquires of the gesture recognition unit 10 about the recognition result again (step S600).
  • the motor 50 can be stopped by transmitting the emergency stop signal, so that the safety can be further improved.
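The polling loop in steps S600 to S605 can be sketched as follows. The injectable clock, sleep, and callback parameters are illustrative assumptions added so the fail-safe behaviour is easy to exercise; they are not part of the patent.

```python
import time

def poll_recognition(get_result, get_error, send_emergency_stop,
                     timeout_s=1.0, interval_s=0.1,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll the gesture recognizer; fail safe on timeout (S602) or error (S603)."""
    deadline = clock() + timeout_s
    while True:
        result = get_result()            # inquire about the recognition result (S600)
        if result is not None:
            return result                # hand off to normal command processing (S102)
        if clock() > deadline:
            send_emergency_stop()        # recognizer silent too long: fail safe (S605)
            return None
        if get_error() is not None:
            send_emergency_stop()        # e.g. camera contamination error (S605)
            return None
        sleep(interval_s)                # wait a fixed interval, then query again (S604)
```

Both failure branches converge on the emergency stop, so the motor is stopped whether the recognizer reports an error or simply goes silent.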
  • alternatively, a notification may be displayed on the display unit 40 to inform the user of the error. By doing this, operation of the motor 50 can be continued, and the availability of the system is improved.
  • control unit 20 may be provided in the inverter 30.
  • 10 gesture recognition unit; 20 control unit; 30 inverter; 40 display unit; 50 motor; 60 load; 101 imaging unit; 102 matching processing unit; 103 feature amount database; 201 motor operation state management unit; 202 operation command generation unit; 203 parameter generation unit; 204 communication unit; 205 operation command correspondence table; 206 user confirmation necessity list; 207 user confirmation method list; 208 parameter generation table; 211 CPU; 212 storage unit; 213 operation unit; 214 communication unit; 215 display unit; 301 communication unit; 302 parameter setting unit; 303 drive unit; 320 emergency stop reception unit; 321, 322 STO signal reception units; 3030 main body control unit; 3032 rectification circuit unit; 3033 DC smoothing circuit unit; 3034 inverter unit; 3035 gate drive unit; 400 status display; 401 to 405 icons; 406 recognition result of hand; 407 frame icon; 408 gauge icon; 409 notification icon; 410 icon indicating unlocking operation; 411 frequency value display; 412 recognition result of hand

Abstract

According to the present invention, an operation command is issued to a motor intuitively and safely using gesture recognition. The motor drive system of the present invention is characterized in that it comprises: an inverter that supplies power to a motor; a gesture recognition unit that recognizes a gesture; a motor operation state management unit that manages the operation state of the motor; an operation command generation unit that generates an operation command from a gesture recognized by the gesture recognition unit; a parameter generation unit that generates a parameter of the inverter on the basis of the operation command generated by the operation command generation unit; and a database holding a user confirmation necessity list that indicates, from the relationship between the operation command and the operation state of the motor, whether or not it is necessary to confirm with the user whether the operation command may be executed; the operation command generation unit refers to the user confirmation necessity list to determine whether to confirm with the user whether the generated operation command may be executed.
PCT/JP2018/029436 2017-08-28 2018-08-06 Motor drive system WO2019044400A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880048127.7A CN110945778B (zh) 2017-08-28 2018-08-06 电机驱动系统
JP2019539123A JP6915066B2 (ja) 2017-08-28 2018-08-06 モータドライブシステム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017163020 2017-08-28
JP2017-163020 2017-08-28

Publications (1)

Publication Number Publication Date
WO2019044400A1 true WO2019044400A1 (fr) 2019-03-07

Family

ID=65525271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029436 WO2019044400A1 (fr) 2017-08-28 2018-08-06 Système d'entraînement de moteur

Country Status (3)

Country Link
JP (1) JP6915066B2 (fr)
CN (1) CN110945778B (fr)
WO (1) WO2019044400A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114253187B (zh) * 2021-12-06 2023-11-07 中国煤炭科工集团太原研究院有限公司 矿用急停控制设备及方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016018264A (ja) * 2014-07-04 2016-02-01 株式会社リコー 画像形成装置、画像形成方法、及びプログラム
JP2017046368A (ja) * 2015-08-24 2017-03-02 株式会社リコー モータ過負荷異常検出装置、モータ駆動制御装置、画像形成装置、およびモータ過負荷異常検出方法
JP2017123179A (ja) * 2012-12-29 2017-07-13 アップル インコーポレイテッド 複数接触ジェスチャのために触知出力の生成を見合わせるためのデバイス、方法、及びグラフィカルユーザインタフェース

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5776544B2 (ja) * 2011-12-28 2015-09-09 トヨタ自動車株式会社 ロボットの制御方法、ロボットの制御装置、及びロボット
JP6021488B2 (ja) * 2012-07-19 2016-11-09 キヤノン株式会社 制御装置、制御方法、および制御プログラム
WO2015174526A1 (fr) * 2014-05-16 2015-11-19 エイディシーテクノロジー株式会社 Système de commande de véhicule

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021060824A (ja) * 2019-10-07 2021-04-15 株式会社Fuji 基板作業機
JP2022071796A (ja) * 2020-10-28 2022-05-16 株式会社日本総合研究所 情報処理システム、コンピュータプログラム、及び表示方法
JP7236478B2 (ja) 2020-10-28 2023-03-09 株式会社日本総合研究所 情報処理システム、コンピュータプログラム、及び表示方法

Also Published As

Publication number Publication date
JP6915066B2 (ja) 2021-08-04
JPWO2019044400A1 (ja) 2020-07-16
CN110945778A (zh) 2020-03-31
CN110945778B (zh) 2023-08-15

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18851039

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019539123

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18851039

Country of ref document: EP

Kind code of ref document: A1