WO2014141100A1 - Automatic machine management operator support system and corresponding method and automatic machine - Google Patents


Info

Publication number
WO2014141100A1
WO2014141100A1 (PCT/IB2014/059690)
Authority
WO
WIPO (PCT)
Prior art keywords
automatic machine
multisensorial
machine
operator
portions
Prior art date
Application number
PCT/IB2014/059690
Other languages
French (fr)
Inventor
Gilberto Spirito
Stefano Negrini
Original Assignee
G.D Societa' Per Azioni
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by G.D Societa' Per Azioni filed Critical G.D Societa' Per Azioni
Priority to EP14722730.0A priority Critical patent/EP2972613A1/en
Publication of WO2014141100A1 publication Critical patent/WO2014141100A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling

Definitions

  • the present invention relates to an automatic machine management operator support system, and to a corresponding support method and automatic machine for producing and/or packing consumer goods.
  • the present invention relates to a support system, and corresponding method, by which to identify portions of an automatic machine on the basis of the position and/or orientation of a multisensorial device with respect to the automatic machine, to provide an operator with information relating to the identified automatic machine portion.
  • the present invention also relates to an automatic machine for producing and/or packing consumer goods, wherein the human-machine interface means comprise a multisensorial device for identifying portions of the automatic machine on the basis of the position and/or orientation of the multisensorial device with respect to the machine, to provide an operator with information relating to the identified automatic machine portion.
  • Consumer goods are normally produced and packaged on specially designed automatic machines. For example, producing a packet of cigarettes calls for at least two in-line automatic machines, one for producing the individual cigarettes, and a follow-up machine for producing the packet.
  • the main job of automatic machine operators is to supervise the machine to monitor correct operation and the actual quality of the end product, by monitoring wear and any malfunctioning of the component parts of the machine.
  • These screens are an integral part of the automatic machine, and enable the operator, by means of a user-friendly interface, to monitor the efficiency of the machine, determine any anomalies, and intervene when necessary.
  • the very size of the machine, which may range from a few metres to tens of metres in length, poses a serious operator-machine interaction problem. Since the longitudinal travel of the product substantially corresponds to the sequence of operations performed on it, the operator may have to move repeatedly along the whole length of the machine in the course of processing the product, to access the various component units or portions of the machine.
  • the operator must also access machine documentation, such as handbooks, history and maintenance data, and repair reports. Though in electronic form, these are not normally accessible from the machine interface, and must be acquired beforehand by the operator. Even when such electronic data is accessible from the HMI, searching for it is a painstaking, time-consuming job owing to the complex nature of the access menus.
  • automatic machines are often equipped with a number of display screens connected to the HMI and located at various points along the length of the machine to minimize operator walking distance between the portion of the machine for which further information is required, and the user interface.
  • the display screens may also be connected to the machine by movable supporting structures, such as an articulated arm, that can be moved or rotated within a given range.
  • the machine may be provided with a number of touch display screens with simplified, multilevel menus surfable by the operator to access information and machine interaction commands.
  • the operator is also provided with one or more additional devices - such as one or more keyboards, indicator lights, and even additional alphanumeric display screens - located at different portions of the machine, to speed up more frequent routine jobs, and to enable the operator to work on the machine without having to look away and, more generally speaking, without having to shut down the machine.
  • Additional devices such as keyboards, indicator lights, additional screens and the like, however, have a serious economic impact on the machine. Expenditure on component parts and wiring increases the end cost of the machine, while the additional devices themselves increase the size of the machine and often call for customized design.
  • An automatic machine management operator support system is therefore needed, and in particular one allowing simple access to information at any portion of the machine.
  • Said system should allow management of any portion of the automatic machine. More specifically, it should make it possible to receive information and data relating to any portion of the machine, and to input commands to the machine without the operator having to move from the machine portion he is interacting with.
  • An automatic machine is needed, in which the user interface allows the operator to work at any portion of the machine.
  • an automatic machine operator support method is needed, and in particular one allowing control of the automatic machine from the machine portion the operator is interacting with.
  • a further object of the present invention is to provide an automatic machine comprising human-machine interface means designed to minimize the aforementioned drawbacks.
  • Yet a further object of the present invention is to provide an operator support method for managing an automatic machine for producing consumer goods.
  • At least a portable wireless multisensorial device with which the operator of the automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device;
  • the multisensorial device is programmed to:
  • the database is integrated in the memory of the multisensorial device, or is accessible remotely by the multisensorial device. Even more preferably, the database is located on the automatic machine and accessible by the multisensorial device via a wireless connection. Alternatively, the database is located outside the multisensorial device and the automatic machine, and is accessible by the multisensorial device over a network connection.
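The three database placements described above (in the device's own memory, on the machine over a wireless connection, or outside both over a network connection) can be sketched as a lookup with ordered fallbacks. This is an illustrative sketch only; the class and portion names are assumptions, not part of the patent.

```python
# Sketch of the portion-information database with ordered fallbacks:
# local device memory first, then the machine's own database over
# wireless, then a remote network database. All names are illustrative.

class PortionDatabase:
    def __init__(self, local=None, on_machine=None, remote=None):
        # Each source maps a machine-portion ID to its information record.
        self.sources = [s for s in (local, on_machine, remote) if s is not None]

    def lookup(self, portion_id):
        """Return the first record found, searching sources in order."""
        for source in self.sources:
            record = source.get(portion_id)
            if record is not None:
                return record
        return None

# Example: info for a feed assembly held only on the machine itself.
db = PortionDatabase(
    local={},
    on_machine={"feed_assembly_10": {"doc": "handbook.pdf", "status": "running"}},
)
print(db.lookup("feed_assembly_10")["status"])  # running
```

The ordering simply encodes the patent's stated preference (device memory, then machine, then remote); a real system would also handle connection loss and caching.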
  • the multisensorial device is a tablet or heads-up device. More specifically, the heads-up multisensorial device is a helmet with a transparent visor equipped with the display screen. Alternatively, the heads-up multisensorial device is a pair of glasses with transparent lenses equipped with the display screen.
  • the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
  • an automatic machine for producing and/or packing consumer goods, in particular of the tobacco industry
  • the automatic machine comprising human-machine interface means for monitoring and/or controlling production and/or packing stages and/or portions of the automatic machine;
  • the human-machine interface means comprising at least a portable wireless multisensorial device, with which the operator of the automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device;
  • the human-machine interface means being connected operatively to at least a database containing information relating to individual portions of the automatic machine;
  • the automatic machine also comprises a number of work units connected operatively to perform stages in the production and/or packing of said goods, and has one or more identification markers at the work units and/or automatic machine portions.
  • the human-machine interface means are designed to identify the identification markers, and/or to receive data from the identification markers to display user information relating to the production and/or packing stages, and/or to the work units, and/or to the automatic machine portions.
  • the user information comprises maintenance and/or help data.
  • the maintenance data comprises data relating to the efficiency of the automatic machine and/or the work units and/or the automatic machine portions.
  • the maintenance data comprises malfunction repair time data.
  • the help data comprises operator-assist data.
  • the multisensorial device is a tablet-type device.
  • the multisensorial device is a heads-up device.
  • the heads-up multisensorial device is a helmet with a transparent visor equipped with the display screen.
  • the heads-up multisensorial device is a pair of glasses with transparent lenses equipped with the display screen.
  • the multisensorial device is designed to identify the identification markers, and/or to receive data from the identification markers to display user information, when the acquired real image of the automatic machine and/or the work unit and/or the automatic machine portion comprises at least one of the identification markers.
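The behaviour described here — displaying user information only when the acquired real image contains at least one known identification marker — can be sketched as a filter over the marker IDs detected in the current frame. The marker decoder itself (optical code recognition, RFID read, etc.) is abstracted away, and all IDs and texts are invented for illustration.

```python
# Sketch: show user information only when the current camera frame
# contains at least one recognized identification marker.
# Marker IDs and info strings are hypothetical examples.

MARKER_INFO = {
    "MK-001": "Feed assembly 10: packing material flow nominal",
    "MK-002": "Work unit 2: blade wear at 70%, schedule replacement",
}

def info_for_frame(detected_marker_ids):
    """Return info strings for every known marker seen in the frame,
    or an empty list when no recognized marker is in view."""
    return [MARKER_INFO[m] for m in detected_marker_ids if m in MARKER_INFO]

print(info_for_frame(["MK-002", "MK-999"]))  # unknown IDs are ignored
```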
  • the multisensorial device generates a virtual command system for controlling the automatic machine and/or the work units and/or the automatic machine portions.
  • an operator support method for managing an automatic machine for producing consumer goods, in particular of the tobacco industry; the method being characterized by comprising the steps of:
  • the information relating to the identified automatic machine portion, and/or to the production and/or packing stages performed by the automatic machine portion comprises maintenance and/or help data relating to the automatic machine or to a portion of the automatic machine.
  • the maintenance data comprises data relating to the efficiency of the automatic machine and/or automatic machine portions. Even more preferably, the maintenance data comprises malfunction repair time data. Even more preferably, the help data comprises operator-assist data.
  • the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
  • Figure 1 shows an example view in perspective of an automatic machine portion, in particular a packing material feed assembly
  • Figure 2 shows an example view in perspective of packing material flow and operating data produced on the Figure 1 packing material feed assembly
  • Figure 3 shows an example view in perspective of phased overlaying of Figures 2 and 1;
  • Figure 4 shows a block diagram of the system according to the present invention.
  • automatic machines comprise a number of work units connected operatively to perform stages in the production and/or packing of consumer goods.
  • Feed assembly 10 for supplying packing material 11, 12, 13.
  • Feed assembly 10 is one portion of an automatic packing machine (not shown), in particular of the tobacco industry.
  • feed assembly 10 comprises a number of work units of the packing machine, and is equipped with a system in accordance with the present invention.
  • Feed assembly 10 comprises a control unit 110 for controlling the feed assembly, and for two-way data exchange with the machine's main control unit (not shown).
  • the machine's main control unit may even control feed assembly 10 on its own, with no additional control means required.
  • Feed assembly control unit 110 comprises a programmable logic controller (PLC) and a human-machine interface (HMI).
  • human-machine interface means allow the operator to monitor and, if necessary, control the packing stages performed on feed assembly 10.
  • the PLC and HMI are connected by wired or wireless network connections.
  • the HMI means comprise a portable processing unit with a wireless data connection to the HMI to which it is connected; and a portable wireless multisensorial device 210, which is designed to identify portions of the automatic machine to which it is connected, on the basis of its position and orientation with respect to the automatic machine, and to supply the operator with information relating to the identified automatic machine portion, and to the packing stages performed by the automatic machine or by a portion of it.
  • the term 'multisensorial device' refers to a device capable of providing the operator with information in at least two transmission modes, e.g. visual video transmission, and audio transmission by means of loudspeakers or headsets.
  • 'Machine portion' refers to a portion of the machine comprising one or more units; to individual parts of the units; and even to individual elements or portions of the unit parts.
  • multisensorial device 210 and the portable processing unit are combined in one heads-up device, but may equally be physically separate but connected operatively to each other.
  • the heads-up device has at least one screen for displaying information relating to the identified automatic machine portion, i.e. feed assembly 10 forming part of the automatic machine.
  • multisensorial device 210 comprising the portable processing unit is in the form of glasses with transparent lenses equipped with a display screen.
  • the glasses forming multisensorial device 210 also comprise real-image acquisition means for identifying automatic machine portions on the basis of the position and/or orientation of multisensorial device 210 with respect to the automatic machine.
  • the acquisition means may comprise known devices, such as television cameras or proximity sensors.
  • Multisensorial device 210 may comprise additional systems for improving operator-machine interaction with the HMI. More specifically, an accelerometer-gyroscope or equivalent device may be provided for spatial inertial tracking as multisensorial device 210 moves. The information about the identified machine portion may therefore vary according to the angle (view) of the multisensorial device. More specifically, the multisensorial device may provide different information about the viewed machine portion, even though the viewed portion remains stationary, by altering its position even with only one degree of freedom (e.g. by rotating about an axis of symmetry).
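The idea that the displayed information can change with the device's orientation alone, even while the viewed portion stays the same, can be sketched by quantizing an inertial heading into view sectors, each bound to different data. The sector boundaries and contents below are invented for illustration.

```python
# Sketch: map the device's heading (from accelerometer-gyroscope
# inertial tracking) to different information about the same machine
# portion. Sector ranges and texts are hypothetical.

VIEW_SECTORS = [
    (0, 120, "front view: throughput and reject counters"),
    (120, 240, "side view: drive train temperatures"),
    (240, 360, "rear view: wiring and pneumatic diagram"),
]

def info_for_heading(heading_deg):
    """Select the information set for the current heading in degrees."""
    h = heading_deg % 360
    for lo, hi, info in VIEW_SECTORS:
        if lo <= h < hi:
            return info
    return None

print(info_for_heading(200))  # side view: drive train temperatures
```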
  • To complete multisensorial device 210, provision may be made for a vibration alert, a thermal camera (or thermographic television camera), eye tracking means, one or more illuminators, and one or more ultrasound sensors.
  • Multisensorial device 210 may comprise one or more pushbuttons connected to the portable processing unit it comprises. And provision may also be made for radio, e.g. RFID or NFC, communication antennas connected to the portable processing unit.
  • the system according to the present invention also comprises at least a database containing information relating to individual portions of the automatic machine.
  • the database is integrated in the memory of multisensorial device 210, and in particular in the portable processing unit.
  • Multisensorial device 210 is designed to generate on its display screen an interactive virtual image, as shown by way of example in Figure 3.
  • the interactive virtual image is obtained by superimposing on the acquired real image the information relating to the selected identified automatic machine portion.
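Superimposing the information on the acquired real image to form the interactive virtual image can be sketched as anchoring text labels at the screen coordinates where each identified portion appears, so the overlay stays phased spatially with the real image. The rendering layer is reduced to a list of (position, text) draw calls; all names are illustrative assumptions.

```python
# Sketch: build an overlay by attaching information labels to the
# screen positions of identified machine portions in the frame.
# Portion IDs, coordinates, and texts are hypothetical.

def build_overlay(portion_positions, portion_info):
    """portion_positions: portion ID -> (x, y) pixel anchor in the frame.
    portion_info: portion ID -> text to display.
    Returns draw calls for every portion that has both."""
    return [
        (xy, portion_info[pid])
        for pid, xy in portion_positions.items()
        if pid in portion_info
    ]

overlay = build_overlay(
    {"reel_11": (120, 80), "splicer": (300, 200)},
    {"reel_11": "Packing material 11: 40 m remaining"},
)
print(overlay)  # [((120, 80), 'Packing material 11: 40 m remaining')]
```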
  • Figure 2 shows packing material flow and the relative real-time checks made on assembly 10.
  • the portable processing unit contains an interactive virtual model of various two- and three- dimensional geometries, e.g. the packing material flow in Figure 2.
  • the portable processing unit communicates two-way with the PLC of main control unit 110, from which it receives data about the operation of assembly 10, and to which it transmits any commands.
  • multisensorial device 210 is connected operatively to the portable processing unit to transmit and display the processed information.
  • the information supplied by multisensorial device 210 may also comprise maintenance and/or help data. More specifically, the maintenance data may comprise overall automatic machine efficiency data; work unit efficiency and parameter data; and data relating to real-time or machine-off checks made of even only portions of the machine or units.
  • the maintenance data may also comprise malfunction repair time data, such as MTTF (Mean Time To Fail) and/or MTTR (Mean Time To Repair) data, as well as consumable part expiry and estimated residual working life data.
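MTTF and MTTR are simple averages over the machine's event history. A minimal computation from a chronological failure/repair log might look like the following; the log format is an assumption for illustration, not something the patent specifies.

```python
# Sketch: compute Mean Time To Fail and Mean Time To Repair from a
# chronological log of (fail_time, repair_time) pairs, in hours.

def mttf_mttr(events):
    """events: list of (fail_time, repair_time) pairs in hours,
    sorted chronologically. MTTF averages the running intervals
    between a repair and the next failure (plus the initial run);
    MTTR averages the repair durations."""
    repairs = [repair - fail for fail, repair in events]
    uptimes = [events[0][0]] + [
        events[i][0] - events[i - 1][1] for i in range(1, len(events))
    ]
    return sum(uptimes) / len(uptimes), sum(repairs) / len(repairs)

# Two failures: first at t=100 h (repaired in 2 h), next at t=202 h
# (repaired in 4 h) -> MTTF = (100 + 100) / 2, MTTR = (2 + 4) / 2.
mttf, mttr = mttf_mttr([(100, 102), (202, 206)])
print(mttf, mttr)  # 100.0 3.0
```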
  • Multisensorial device 210 may also be used for maintenance when the machine is off.
  • the help data supplied by device 210 comprises data to assist maintenance personnel.
  • the operator 50 is equipped with the multisensorial device 210 described above. In the event of a malfunction on assembly 10, an error is generated at the required operator- intervention point.
  • a number of contextual menus are provided, corresponding to the viewing positions of multisensorial device 210. These are simplified menus, and each relate solely to a respective machine portion viewed from the operator's observation point.
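The contextual, per-position simplified menus can be sketched as a mapping from the identified viewing position to a short menu relating solely to the portion in view, in contrast to the deep multilevel menus of a conventional HMI. The menu entries and portion names below are invented for illustration.

```python
# Sketch: each viewing position gets its own simplified menu,
# relating solely to the machine portion the operator is observing.
# Portion names and entries are hypothetical.

CONTEXT_MENUS = {
    "feed_assembly_10": ["Show material flow", "Splice reel", "Jog feed"],
    "wrapping_unit": ["Heat-sealer status", "Reject last packet"],
}

def menu_for(viewed_portion, fallback=("Open full HMI menu",)):
    """Simplified menu for the portion in view; a conventional
    full-HMI entry is the fallback for unmapped positions."""
    return CONTEXT_MENUS.get(viewed_portion, list(fallback))

print(menu_for("feed_assembly_10"))
```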
  • the operator 50 may request cloning of the conventional HMI menu, or a portion of it, on the display screen of multisensorial device 210.
  • the operator 50 may read, on the display screen of multisensorial device 210, the digital documents loaded beforehand into the system's database.
  • multisensorial device 210 is also able to identify the machine or the line viewed by operator 50. In which case, multisensorial device 210 must be able to interact with the plant's PLCs.
  • glasses 210 may superimpose on the display screen images and videos to assist maintenance of the machine or assembly 10. These overlays must be phased spatially with the real image of assembly 10.
  • the display may also show individual part numbers, management data, and spare part stock.
  • the displays on multisensorial device 210 may also comprise animations or pre-recorded audiovisual sequences to assist maintenance personnel.
  • Remote interactive assist sessions by on-line experts may also be displayed by means of a network connection.
  • Other functions may include recording interactive sessions; assigning recordings to one or more machine portions; or consulting recordings offline.
  • the heads-up multisensorial device 210 may be in the form of a helmet with a transparent visor equipped with a display screen.
  • the multisensorial device is in the form of a tablet.
  • the device must have at least a display screen on which to display information relating to the identified automatic machine portion.
  • glasses 210 may also display a safety barrier grid, such as active optical grids, which, if accidentally interrupted, stop the machine to prevent injury to the operator.
  • a safety barrier grid such as active optical grids
  • the system provides for interacting with the automatic machine by means of proximity sensors or identification markers, a useful technology when employing RFID.
  • the automatic machine is equipped with one or more identification markers at the work units and/or automatic machine portions for identification.
  • the markers may even identify individual replaceable (or consumable) parts.
  • the human-machine interface means are therefore able to identify and/or receive data from the placed identification markers.
  • the identification data is useful in displaying packing stage information; may also identify the work units and indicate the efficiency and production progress of the identified machine portion; and, finally, may permit spare parts management in terms of stocks and/or authenticity.
  • the database may be located differently. It may be accessible remotely by the multisensorial device and located on the automatic machine to which the multisensorial device is connected. Similarly, the database may be accessed by the multisensorial device over a wireless connection or a wired or wireless network connection.
  • the automatic machine may therefore have no database, but be connected operatively to it by the human-machine interface.
  • the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
  • These virtual commands constitute auxiliary pushbutton panels displayable three-dimensionally on the multisensorial device display screen.
  • a virtual space may be overlaid on the real image of the machine to substitute for the actual pushbutton panels the automatic machines are normally equipped with.
  • the virtual pushbutton panels are phased spatially with the real image of the machine, so the commands to be carried out match the machine portion for control.
  • the virtual pushbutton panels are preferably operated on a touch screen of the multisensorial device, when this is in tablet form.
  • the virtual pushbutton panels may be activated by eye tracking technology, using a virtual pointer system, or by voice control.
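Dispatching a virtual pushbutton press, whether it arrives from a touch screen, an eye-tracking pointer, or voice-driven selection, can be sketched as hit-testing an activation point against the spatially phased button regions. The button regions and command names are assumptions for illustration.

```python
# Sketch: virtual pushbutton panel phased with the real image.
# A press from touch, eye tracking, or a voice-selected pointer is
# resolved to the command of the button region it falls inside.
# Regions and command names are hypothetical.

BUTTONS = [
    # (x0, y0, x1, y1) screen region -> command for the viewed portion
    ((10, 10, 110, 50), "STOP_FEED_ASSEMBLY"),
    ((10, 60, 110, 100), "RESUME_FEED_ASSEMBLY"),
]

def resolve_press(x, y, buttons=BUTTONS):
    """Return the command whose button region contains (x, y)."""
    for (x0, y0, x1, y1), command in buttons:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None

print(resolve_press(50, 75))  # RESUME_FEED_ASSEMBLY
```

The same resolver serves all three activation modes: only the source of the (x, y) point differs.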
  • the system, machine, and method according to the present invention make it possible to obtain information and impart commands to the machine, to machine parts, and to the work units dynamically, by means of a multisensorial device which generates the necessary information and/or virtual commands on the basis of its position and/or orientation with respect to the machine and/or machine parts.


Abstract

An operator support system, for managing an automatic machine for producing consumer goods, in particular of the tobacco industry, has at least a portable wireless multisensorial device, with which the automatic machine operator is equipped, and at least a database containing information relating to individual portions of the automatic machine; the multisensorial device being programmed to identify the automatic machine portions on the basis of the position and/or orientation of the multisensorial device with respect to the automatic machine, and to provide the operator with information relating to the identified automatic machine portion.

Description

AUTOMATIC MACHINE MANAGEMENT OPERATOR SUPPORT SYSTEM AND CORRESPONDING METHOD AND AUTOMATIC MACHINE
TECHNICAL FIELD
The present invention relates to an automatic machine management operator support system, and to a corresponding support method and automatic machine for producing and/or packing consumer goods.
More specifically, the present invention relates to a support system, and corresponding method, by which to identify portions of an automatic machine on the basis of the position and/or orientation of a multisensorial device with respect to the automatic machine, to provide an operator with information relating to the identified automatic machine portion.
The present invention also relates to an automatic machine for producing and/or packing consumer goods, wherein the human-machine interface means comprise a multisensorial device for identifying portions of the automatic machine on the basis of the position and/or orientation of the multisensorial device with respect to the machine, to provide an operator with information relating to the identified automatic machine portion.
BACKGROUND ART
Consumer goods are normally produced and packaged on specially designed automatic machines. For example, producing a packet of cigarettes calls for at least two in-line automatic machines, one for producing the individual cigarettes, and a follow-up machine for producing the packet.
In both the tobacco and food industry, automatic machines provide for maximizing production while maintaining a high quality standard, and in particular for extremely high output rates with very little labour.
The main job of automatic machine operators is to supervise the machine to monitor correct operation and the actual quality of the end product, by monitoring wear and any malfunctioning of the component parts of the machine.
To do this, the operator must, on the one hand, interact with machine systems and devices, and, on the other, access pertinent machine operation and maintenance data and documentation.
As is known, the operator interacts with the automatic machine under his supervision by means of special display screens, which form the I/O peripheral units of a so-called human-machine interface (HMI). These screens are an integral part of the automatic machine, and enable the operator, by means of a user-friendly interface, to monitor the efficiency of the machine, determine any anomalies, and intervene when necessary. The very size of the machine, which may range from a few metres to tens of metres in length, poses a serious operator-machine interaction problem. Since the longitudinal travel of the product substantially corresponds to the sequence of operations performed on it, the operator may have to move repeatedly along the whole length of the machine in the course of processing the product, to access the various component units or portions of the machine.
Moreover, the operator must also access machine documentation, such as handbooks, history and maintenance data, and repair reports. Though in electronic form, these are not normally accessible from the machine interface, and must be acquired beforehand by the operator. Even when such electronic data is accessible from the HMI, searching for it is a painstaking, time-consuming job owing to the complex nature of the access menus.
To eliminate these drawbacks, automatic machines are often equipped with a number of display screens connected to the HMI and located at various points along the length of the machine to minimize operator walking distance between the portion of the machine for which further information is required, and the user interface.
The display screens may also be connected to the machine by movable supporting structures, such as an articulated arm, that can be moved or rotated within a given range.
Even with the interface accessible from various points along the machine, however, accessing information is still a complicated, time-consuming job that increases downtime cost. Movable display screens, while improving access, still do not allow optimum access to information at a given machine portion.
The machine may be provided with a number of touch display screens with simplified, multilevel menus surfable by the operator to access information and machine interaction commands.
This would improve information access, but the location of the interface with respect to the machine portion for which information is needed would still remain a problem in terms of operator movement.
The same problems also apply to maintenance. Maintenance workers need to consult illustrated handbooks and spare parts catalogues, as well as to locate operating instructions and spare part numbers pertinent to the machine portion being worked on.
On most automatic machines, the operator is also provided with one or more additional devices - such as one or more keyboards, indicator lights, and even additional alphanumeric display screens - located at different portions of the machine, to speed up more frequent routine jobs, and to enable the operator to work on the machine without having to look away and, more generally speaking, without having to shut down the machine.
Additional devices, such as keyboards, indicator lights, additional screens and the like, however, have a serious economic impact on the machine. Expenditure on component parts and wiring increases the end cost of the machine, while the additional devices themselves increase the size of the machine and often call for customized design.
An automatic machine management operator support system is therefore needed, and in particular one allowing simple access to information at any portion of the machine .
Said system should allow management of any portion of the automatic machine. More specifically, it should make it possible to receive information and data relating to any portion of the machine; and to input commands to the machine without the operator having to move from the machine portion he is interacting with.
An automatic machine is needed, in which the user interface allows the operator to work at any portion of the machine.
Finally, an automatic machine operator support method is needed, and in particular one allowing control of the automatic machine from the machine portion the operator is interacting with.
DESCRIPTION OF THE INVENTION
It is an object of the present invention to provide an operator support system for managing an automatic machine for producing consumer goods.
A further object of the present invention is to provide an automatic machine comprising human-machine interface means designed to minimize the aforementioned drawbacks .
Yet a further object of the present invention is to provide an operator support method for managing an automatic machine for producing consumer goods.
According to the present invention, there is provided an operator support system for managing an automatic machine for producing consumer goods, in particular of the tobacco industry; the operator support system comprising:
at least a portable wireless multisensorial device, with which the operator of the automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device; and
at least a database containing information relating to individual portions of the automatic machine; and being characterized in that the multisensorial device is programmed to:
- identify the automatic machine portions, on the basis of the position and orientation of the multisensorial device with respect to the automatic machine, by means of the real-image acquisition device and/or by spatial inertial tracking as the multisensorial device moves; and
- provide the operator, on the display screen, with information relating to the identified automatic machine portion, by generating an interactive virtual image by superimposing the information on the acquired real image.
Preferably, the database is integrated in the memory of the multisensorial device, or is accessible remotely by the multisensorial device. Even more preferably, the database is located on the automatic machine and accessible by the multisensorial device via a wireless connection. Alternatively, the database is located outside the multisensorial device and the automatic machine, and is accessible by the multisensorial device over a network connection.
Preferably, the multisensorial device is a tablet or heads-up device. More specifically, the heads-up multisensorial device is a helmet with a transparent visor equipped with the display screen. Alternatively, the heads-up multisensorial device is a pair of glasses with transparent lenses equipped with the display screen.
Preferably, the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
According to the present invention, there is also provided an automatic machine for producing and/or packing consumer goods, in particular of the tobacco industry; the automatic machine comprising human-machine interface means for monitoring and/or controlling production and/or packing stages and/or portions of the automatic machine; the human-machine interface means comprising at least a portable wireless multisensorial device, with which the operator of the automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device;
the human-machine interface means being connected operatively to at least a database containing information relating to individual portions of the automatic machine;
and the automatic machine being characterized in that the multisensorial device is programmed to:
- identify the automatic machine portions, on the basis of the position and orientation of the multisensorial device with respect to the automatic machine, by means of the real-image acquisition device and/or by spatial inertial tracking as the multisensorial device moves; and
- provide the operator, on the display screen, with information relating to the identified automatic machine portion, and/or to the production and/or packing stages performed by the automatic machine portion, by generating an interactive virtual image by superimposing the information on the acquired real image.
Preferably, the automatic machine also comprises a number of work units connected operatively to perform stages in the production and/or packing of said goods, and has one or more identification markers at the work units and/or automatic machine portions. The human-machine interface means are designed to identify the identification markers, and/or to receive data from the identification markers to display user information relating to the production and/or packing stages, and/or to the work units, and/or to the automatic machine portions.
Preferably, the user information comprises maintenance and/or help data. Even more preferably, the maintenance data comprises data relating to the efficiency of the automatic machine and/or the work units and/or the automatic machine portions. Even more preferably, the maintenance data comprises malfunction repair time data. Even more preferably, the help data comprises operator-assist data.
Preferably, the multisensorial device is of the tablet type.
Alternatively, the multisensorial device is a heads-up device. Even more preferably, the heads-up multisensorial device is a helmet with a transparent visor equipped with the display screen. Alternatively, the heads-up multisensorial device is a pair of glasses with transparent lenses equipped with the display screen.
Preferably, the multisensorial device is designed to identify the identification markers, and/or to receive data from the identification markers to display user information, when the acquired real image of the automatic machine and/or the work unit and/or the automatic machine portion comprises at least one of the identification markers.
Preferably, the multisensorial device generates a virtual command system for controlling the automatic machine and/or the work units and/or the automatic machine portions.
According to the present invention, there is also provided an operator support method for managing an automatic machine for producing consumer goods, in particular of the tobacco industry; the method being characterized by comprising the steps of:
- identifying a portion of the automatic machine, by means of a portable wireless multisensorial device, on the basis of the position and/or orientation of the multisensorial device with respect to the automatic machine, and by acquiring real images and/or by spatial inertial tracking as the multisensorial device moves; and
- providing an operator, on the display screen of the multisensorial device, with information relating to the identified automatic machine portion, and/or to the production and/or packing stages performed by the automatic machine portion, by generating an interactive virtual image by superimposing the information on the acquired real image .
Preferably, the information relating to the identified automatic machine portion, and/or to the production and/or packing stages performed by the automatic machine portion, comprises maintenance and/or help data relating to the automatic machine or to a portion of the automatic machine.
Preferably, the maintenance data comprises data relating to the efficiency of the automatic machine and/or automatic machine portions. Even more preferably, the maintenance data comprises malfunction repair time data. Even more preferably, the help data comprises operator-assist data.
Preferably, the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
BRIEF DESCRIPTION OF THE DRAWINGS
A non-limiting embodiment of the present invention will be described by way of example with reference to the attached drawings, in which:
Figure 1 shows an example view in perspective of an automatic machine portion, in particular a packing material feed assembly;
Figure 2 shows an example view in perspective of packing material flow and operating data produced on the Figure 1 packing material feed assembly;
Figure 3 shows an example view in perspective of phased overlaying of Figures 2 and 1;
Figure 4 shows a block diagram of the system according to the present invention.
PREFERRED EMBODIMENTS OF THE INVENTION
In the attached drawings, the system according to the present invention is shown relative to an interactive operator interface.
The same considerations also apply, mutatis mutandis, to an operator interface used when the machine is off. As is known, automatic machines comprise a number of work units connected operatively to perform stages in the production and/or packing of consumer goods.
Figure 1 shows a feed assembly 10 for supplying packing material 11, 12, 13. Feed assembly 10 is one portion of an automatic packing machine (not shown), in particular of the tobacco industry. In the embodiment shown, feed assembly 10 comprises a number of work units of the packing machine, and is equipped with a system in accordance with the present invention.
Feed assembly 10 comprises a control unit 110 for controlling the feed assembly, and for two-way data exchange with the machine's main control unit (not shown). The machine's main control unit may even control feed assembly 10 on its own, with no additional control means required. Feed assembly control unit 110 comprises a programmable industrial controller (PLC) and a human-machine interface (HMI). As is known, human-machine interface means allow the operator to monitor and, if necessary, control the packing stages performed on feed assembly 10. The PLC and HMI are connected by wired or wireless network connections.
In the present invention, the HMI means comprise a portable processing unit with a wireless data connection to the HMI to which it is connected; and a portable wireless multisensorial device 210, which is designed to identify portions of the automatic machine to which it is connected, on the basis of its position and orientation with respect to the automatic machine, and to supply the operator with information relating to the identified automatic machine portion, and to the packing stages performed by the automatic machine or by a portion of it.
In the present invention, the term 'multisensorial device' refers to a device capable of providing the operator with information in at least two transmission modes, e.g. visual video transmission, and audio transmission by means of loudspeakers or headsets.
In the present invention, the term 'machine portion' refers to a portion of the machine comprising one or more units; to individual parts of the units; and even to individual elements or portions of the unit parts.
In the embodiment described, multisensorial device 210 and the portable processing unit are combined in one heads-up device, but may equally be physically separate but connected operatively to each other. The heads-up device has at least one screen for displaying information relating to the identified automatic machine portion, i.e. feed assembly 10 forming part of the automatic machine. As shown in Figure 1, in the embodiment described, multisensorial device 210 comprising the portable processing unit is in the form of glasses with transparent lenses equipped with a display screen.
The glasses forming multisensorial device 210 also comprise real-image acquisition means for identifying automatic machine portions on the basis of the position and/or orientation of multisensorial device 210 with respect to the automatic machine. The acquisition means may comprise known devices, such as television cameras or proximity sensors.
Multisensorial device 210 may comprise additional systems for improving operator-machine interaction with the HMI. More specifically, an accelerometer-gyroscope or equivalent device may be provided for spatial inertial tracking as multisensorial device 210 moves. The information about the identified machine portion may therefore vary according to the angle (view) of the multisensorial device. More specifically, the multisensorial device may provide different information about the viewed machine portion, even though the viewed portion remains stationary, by altering its position even with only one degree of freedom (e.g. by rotating about an axis of symmetry).
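By way of non-limiting illustration only, the view-dependent selection just described can be sketched as follows; the portion name, the 90-degree sector width, and the pane labels are invented for the example and form no part of the disclosure.

```python
# Illustrative sketch: the same stationary machine portion yields different
# information depending on the yaw angle of the multisensorial device
# (one degree of freedom). Sector width and pane labels are assumptions.

def select_info_pane(portion_id, yaw_deg):
    """Return the info pane for a portion as a function of viewing angle."""
    yaw = yaw_deg % 360.0                 # normalise into [0, 360)
    sector = int(yaw // 90)               # one pane per 90-degree sector
    panes = {
        0: "front: packing-material flow",
        1: "right: drive and gearing status",
        2: "rear: electrical cabinet data",
        3: "left: consumable-part levels",
    }
    return f"{portion_id} / {panes[sector]}"

# Rotating the device changes the pane even though the portion is unchanged
print(select_info_pane("feed assembly 10", 275.0))
```

A real system would obtain the yaw angle from the accelerometer-gyroscope fusion rather than as a function argument.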
Provision may be made for a sound acquisition system comprising one or more microphones, and a sound transmission system comprising one or more loudspeakers or headsets. To complete multisensorial device 210, provision may be made for a vibration alert, thermal camera (or thermographic television camera), eye tracking means, one or more illuminators, and one or more ultrasound sensors.
Multisensorial device 210 may comprise one or more pushbuttons connected to the portable processing unit it comprises. And provision may also be made for radio, e.g. RFID or NFC, communication antennas connected to the portable processing unit.
The system according to the present invention also comprises at least a database containing information relating to individual portions of the automatic machine. In the embodiment described, the database is integrated in the memory of multisensorial device 210, and in particular in the portable processing unit.
Multisensorial device 210 is designed to generate on its display screen an interactive virtual image, as shown by way of example in Figure 3. The interactive virtual image is obtained by superimposing on the acquired real image the information relating to the selected identified automatic machine portion.
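A minimal sketch of the overlay step follows: the interactive virtual image is the acquired real frame plus information anchored at the screen coordinates of the identified portion. The frame is modelled as a plain dictionary for illustration; an actual device would draw into a video buffer, and all field names are assumptions.

```python
# Sketch of generating the interactive virtual image of Figure 3:
# superimpose portion-related information on the acquired real image.

def build_virtual_image(real_frame, portion, info):
    overlay = {
        "anchor": portion["screen_xy"],  # where the portion appears in frame
        "text": info,                    # information to superimpose
    }
    return {"frame": real_frame, "overlays": [overlay]}

virtual = build_virtual_image(
    real_frame={"width": 1280, "height": 720},
    portion={"id": "feed assembly 10", "screen_xy": (412, 305)},
    info="reel 11: 34% remaining",
)
print(virtual["overlays"][0]["anchor"])
```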
One example of information relating to assembly 10 in Figure 1 is shown in Figure 2, which shows packing material flow and the relative real-time checks made on assembly 10. The portable processing unit contains an interactive virtual model of various two- and three-dimensional geometries, e.g. the packing material flow in Figure 2. The portable processing unit communicates two-way with the PLC of control unit 110, from which it receives data about the operation of assembly 10, and to which it transmits any commands. And multisensorial device 210 is connected operatively to the portable processing unit to transmit and display the processed information.
The information supplied by multisensorial device 210 may also comprise maintenance and/or help data. More specifically, the maintenance data may comprise overall automatic machine efficiency data; work unit efficiency and parameter data; and data relating to real-time or machine-off checks made of even only portions of the machine or units. The maintenance data may also comprise malfunction repair time data, which may comprise MTTF (Mean Time To Fail) and/or MTTR (Mean Time To Repair) data, as well as consumable part expiry and estimated residual working life data.
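The MTTF/MTTR figures mentioned above can be computed from a malfunction log, as in the following non-limiting sketch; the log format (pairs of fail and repair timestamps in minutes) is an assumption made for the example.

```python
# Illustrative computation of the repair-time data (MTTR, MTTF) from a
# hypothetical malfunction log of (fail_time, repair_done_time) pairs.

def mttr(events):
    """Mean Time To Repair: average repair duration across malfunctions."""
    return sum(done - fail for fail, done in events) / len(events)

def mttf(events, observation_end):
    """Mean Time To Fail: average uptime between repairs."""
    uptimes, last_up = [], 0.0
    for fail, done in events:
        uptimes.append(fail - last_up)   # uptime before this failure
        last_up = done
    uptimes.append(observation_end - last_up)  # uptime after last repair
    return sum(uptimes) / len(uptimes)

log = [(120.0, 135.0), (480.0, 490.0)]       # two malfunctions, in minutes
print(mttr(log))                              # (15 + 10) / 2 = 12.5
print(mttf(log, observation_end=960.0))       # (120 + 345 + 470) / 3
```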
Multisensorial device 210 may also be used for maintenance when the machine is off. In which case, the help data supplied by device 210 comprises data to assist maintenance personnel.
Operation of the system according to the present invention is shown by way of example in Figure 4.
The operator 50 is equipped with the multisensorial device 210 described above. In the event of a malfunction on assembly 10, an error is generated at the required operator-intervention point. Instead of a single main machine menu, which branches off into various machine functions, a number of contextual menus are provided, corresponding to the viewing positions of multisensorial device 210. These are simplified menus, each relating solely to a respective machine portion viewed from the operator's observation point.
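The contextual-menu idea can be sketched as a mapping from the viewed portion to a short, portion-specific menu, falling back to the main menu when no portion is identified. The portion names and menu entries below are invented for illustration.

```python
# Sketch of contextual menus: each viewing position maps to a simplified
# menu for the viewed machine portion only, instead of one branching
# main-machine menu. All names are illustrative assumptions.

CONTEXT_MENUS = {
    "feed assembly 10": ["clear jam", "reel splice status", "feed speed"],
    "wrapping unit":    ["heater temperature", "film tension"],
}

def menu_for_view(viewed_portion):
    # Fall back to the conventional main menu when nothing is identified
    return CONTEXT_MENUS.get(viewed_portion, ["open main menu"])

print(menu_for_view("feed assembly 10"))
print(menu_for_view("unknown area"))
```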
The operator 50 may request cloning of the conventional HMI menu, or a portion of it, on the display screen of multisensorial device 210.
Similarly, the operator 50 may read, on the display screen of multisensorial device 210, the digital documents loaded beforehand into the system's database.
By means of the eye tracking function, multisensorial device 210 is also able to identify the machine or the line viewed by operator 50. In which case, multisensorial device 210 must be able to interact with the plant's PLCs.
When carrying out maintenance work when the machine is off, glasses 210 may superimpose on the display screen images and videos to assist maintenance of the machine or assembly 10. These overlays must be phased spatially with the real image of assembly 10.
The display may also show individual part numbers, management data, and spare part stock.
The displays on multisensorial device 210 may also comprise animations or pre-recorded audiovisual sequences to assist maintenance personnel.
Remote interactive assist sessions by on-line experts may also be displayed by means of a network connection. Other functions may include recording interactive sessions; assigning recordings to one or more machine portions; or consulting recordings offline.
In a further embodiment, the heads-up multisensorial device 210 may be in the form of a helmet with a transparent visor equipped with a display screen.
In a further embodiment, the multisensorial device is in the form of a tablet. In this case, too, the device must have at least a display screen on which to display information relating to the identified automatic machine portion.
In a further embodiment, glasses 210 may also display a safety barrier grid, such as active optical grids, which, if accidentally interrupted, stop the machine to prevent injury to the operator.
In a further embodiment, the system provides for interacting with the automatic machine by means of proximity sensors or identification markers - useful technology when employing RFID.
More specifically, the automatic machine is equipped with one or more identification markers at the work units and/or automatic machine portions for identification. The markers may even identify individual replaceable (or consumable) parts. The human-machine interface means are therefore able to identify and/or receive data from the placed identification markers. The identification data is useful in displaying packing stage information; may also identify the work units and indicate the efficiency and production progress of the identified machine portion; and, finally, may permit spare parts management in terms of stocks and/or authenticity.
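By way of non-limiting illustration, marker-based identification can be sketched as a database lookup keyed by the marker read from the image or over RFID/NFC; the marker IDs, part numbers, and record fields are invented for the example.

```python
# Sketch: an identification marker seen in the acquired image (or read via
# RFID/NFC) is looked up to retrieve portion and spare-part data, including
# stock and authenticity. IDs and records are illustrative assumptions.

MARKER_DB = {
    "RFID-00042": {
        "portion": "feed assembly 10",
        "part_no": "GD-1142-A",
        "stock": 3,
        "authentic": True,
    },
}

def resolve_marker(marker_id):
    record = MARKER_DB.get(marker_id)
    if record is None:
        return "unknown marker - nothing to display"
    return (f"{record['portion']}: part {record['part_no']}, "
            f"{record['stock']} in stock, authentic={record['authentic']}")

print(resolve_marker("RFID-00042"))
```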
In other embodiments, the database may be located differently. It may be accessible remotely by the multisensorial device and located on the automatic machine to which the multisensorial device is connected. Similarly, the database may be accessed by the multisensorial device over a wireless connection or a wired or wireless network connection. The automatic machine may therefore have no database, but be connected operatively to it by the human-machine interface.
In another embodiment of the operator support system according to the present invention, the multisensorial device generates a virtual command system for controlling the automatic machine and/or automatic machine portions.
These virtual commands constitute auxiliary pushbutton panels displayable three-dimensionally on the multisensorial device display screen. In other words, a virtual space may be overlaid on the real image of the machine to substitute for the actual pushbutton panels the automatic machines are normally equipped with.
The virtual pushbutton panels are phased spatially with the real image of the machine, so the commands to be carried out match the machine portion for control.
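The spatial phasing of the virtual panels can be sketched as routing a button press to the controller of the portion currently aligned with the real image, so that the command matches the machine portion for control. Function and queue names below are assumptions made for the example.

```python
# Sketch: a virtual-panel press is only valid for the machine portion in
# view, and is queued for that portion's controller (PLC). Names are
# illustrative assumptions, not part of the disclosure.

def press_virtual_button(button, portion_in_view, controllers):
    """Dispatch a virtual-panel command to the portion it is phased with."""
    queue = controllers.get(portion_in_view)
    if queue is None:
        return "no portion in view - command ignored"
    queue.append(button)                 # queue the command for the PLC
    return f"'{button}' sent to {portion_in_view}"

plc_queues = {"feed assembly 10": []}
print(press_virtual_button("stop feed", "feed assembly 10", plc_queues))
print(plc_queues["feed assembly 10"])
```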
The virtual pushbutton panels are preferably operated on a touch screen of the multisensorial device, when this is in tablet form.
The virtual pushbutton panels may be activated by eye tracking technology, using a virtual pointer system, or by voice control.
The system, machine, and method according to the present invention make it possible to obtain information and impart commands to the machine, to machine parts, and to the work units dynamically, by means of a multisensorial device which generates the necessary information and/or virtual commands on the basis of its position and/or orientation with respect to the machine and/or machine parts.

Claims

1) An operator support system for managing an automatic machine for producing consumer goods, in particular of the tobacco industry; the operator support system comprising:
at least a portable wireless multisensorial device (210), with which the operator (50) of said automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device; and
at least a database containing information relating to individual portions of said automatic machine;
and being characterized in that said multisensorial device (210) is programmed to:
- identify said automatic machine portions (10), on the basis of the position and orientation of said multisensorial device (210) with respect to said automatic machine, by means of said real-image acquisition device and/or by spatial inertial tracking as said multisensorial device (210) moves; and
- provide said operator (50), on said display screen, with information relating to the identified said automatic machine portion (10), by generating an interactive virtual image by superimposing said information on the acquired real image.
2) An operator support system as claimed in Claim 1, characterized in that said database is integrated in the memory of said multisensorial device (210), or is accessible remotely by said multisensorial device (210).
3) An operator support system as claimed in Claim 2, characterized in that said database is located on said automatic machine or outside said multisensorial device (210) and said automatic machine, and is accessible by said multisensorial device (210) via a wireless connection.
4) An operator support system as claimed in one or more of Claims 1 to 3, characterized in that said multisensorial device (210) is a tablet or heads-up device.
5) An operator support system as claimed in Claim 4, characterized in that said heads-up multisensorial device (210) is a helmet with a transparent visor equipped with said display screen, or a pair of glasses with transparent lenses equipped with said display screen.
6) An operator support system as claimed in one or more of Claims 1 to 5, characterized in that said multisensorial device (210) generates a virtual command system for controlling said automatic machine and/or said automatic machine portions (10).
7) An automatic machine for producing and/or packing consumer goods, in particular of the tobacco industry; the automatic machine comprising human-machine interface means for monitoring and/or controlling production and/or packing stages and/or portions (10) of said automatic machine; said human-machine interface means comprising at least a portable wireless multisensorial device (210), with which the operator (50) of said automatic machine is equipped, and which comprises a display screen and a real-image acquisition device and/or spatial inertial tracking device;
said human-machine interface means being connected operatively to at least a database containing information relating to individual portions of said automatic machine;
and said automatic machine being characterized in that said multisensorial device (210) is programmed to:
- identify said automatic machine portions (10), on the basis of the position and orientation of said multisensorial device (210) with respect to said automatic machine, by means of said real-image acquisition device and/or by spatial inertial tracking as said multisensorial device (210) moves; and
- provide the operator (50), on said display screen, with information relating to the identified said automatic machine portion (10), and/or to the production and/or packing stages performed by said automatic machine portion, by generating an interactive virtual image by superimposing said information on the acquired real image.
8) An automatic machine as claimed in Claim 7, and also comprising a number of work units connected operatively to perform stages in the production and/or packing of said goods; said automatic machine being characterized by having one or more identification markers at said work units and/or said automatic machine portions (10); said human-machine interface means being designed to identify said identification markers, and/or to receive data from said identification markers to display user information relating to said production and/or packing stages, and/or to said work units, and/or to said automatic machine portions (10).
9) An automatic machine as claimed in Claim 7 or 8, characterized in that said user information comprises maintenance and/or help data.
10) An automatic machine as claimed in Claim 9, characterized in that said maintenance data comprises data relating to the efficiency of said automatic machine and/or said work units and/or said automatic machine portions (10), and/or malfunction repair time data.
11) An automatic machine as claimed in Claim 9 or 10, characterized in that said help data comprises data for assisting said operator (50) .
12) An automatic machine as claimed in one or more of Claims 7 to 11, characterized in that said multisensorial device (210) is a tablet or heads-up device.
13) An automatic machine as claimed in Claim 12, characterized in that said heads-up multisensorial device (210) is a helmet with a transparent visor equipped with said display screen, or a pair of glasses with transparent lenses equipped with said display screen.
14) An automatic machine as claimed in Claim 8, characterized in that said multisensorial device (210) is designed to identify said identification markers, and/or to receive data from said identification markers to display user information, when the acquired real image of said automatic machine and/or said work unit and/or said automatic machine portion (10) comprises at least one of said identification markers.
15) An automatic machine as claimed in one or more of Claims 7 to 14, characterized in that said multisensorial device (210) generates a virtual command system for controlling said automatic machine and/or said work units and/or said automatic machine portions (10).
16) An operator support method for managing an automatic machine for producing consumer goods, in particular of the tobacco industry; the method being characterized by comprising the steps of:
- identifying a portion (10) of said automatic machine, by means of a portable wireless multisensorial device (210), on the basis of the position and orientation of said multisensorial device (210) with respect to said automatic machine, and by acquiring real images and/or by spatial inertial tracking as said multisensorial device (210) moves; and
- providing an operator (50), on the display screen of said multisensorial device (210), with information relating to the identified said automatic machine portion (10), and/or to the production and/or packing stages performed by said automatic machine portion, by generating an interactive virtual image by superimposing said information on the acquired real image.
17) An operator support method as claimed in Claim 16, characterized in that said information relating to the identified said automatic machine portion (10), and/or to the production and/or packing stages performed by said automatic machine portion (10), comprises maintenance and/or help data relating to said automatic machine or to a portion (10) of said automatic machine.
18) An operator support method as claimed in Claim 17, characterized in that said maintenance data comprises data relating to the efficiency of said automatic machine and/or said automatic machine portions (10), and/or malfunction repair time data.
19) An operator support method as claimed in Claim 17 or 18, characterized in that said help data comprises data for assisting an operator (50).
20) An operator support method as claimed in one or more of Claims 17 to 19, characterized in that said multisensorial device (210) generates a virtual command system for controlling said automatic machine and/or said automatic machine portions (10).
PCT/IB2014/059690 2013-03-12 2014-03-12 Automatic machine management operator support system and corresponding method and automatic machine WO2014141100A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP14722730.0A EP2972613A1 (en) 2013-03-12 2014-03-12 Automatic machine management operator support system and corresponding method and automatic machine

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT000107A ITBO20130107A1 (en) 2013-03-12 2013-03-12 OPERATOR SUPPORT SYSTEM IN THE MANAGEMENT OF AN AUTOMATIC MACHINE AND CORRESPONDING METHOD AND AUTOMATIC MACHINE
ITBO2013A000107 2013-03-12

Publications (1)

Publication Number Publication Date
WO2014141100A1 (en) 2014-09-18


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
WO2000052541A1 (en) * 1999-03-02 2000-09-08 Siemens Aktiengesellschaft System and method for situation-related interaction support with the aid of augmented reality technologies
GB2422234A (en) * 2004-12-10 2006-07-19 Fisher Rosemount Systems Inc Wireless handheld communicator in a process control environment
FR2949587A1 (en) * 2009-09-03 2011-03-04 Nicolas Berron Making an assembly of parts using a computer, comprises establishing digital model in three dimensions of each part, forming a virtual template of each part, and setting digital model with respect to reference of the workstation
US20110115816A1 (en) * 2009-11-16 2011-05-19 Alliance For Sustainable Energy, Llc. Augmented reality building operations tool

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAUDE R ET AL: "HEAD-MOUNTED DISPLAY ALS FACHARBEITERORIENTIERTE UNTERSTUETZUNGSKOMPONENTE AN CNC-WERKZEUGMASCHINEN", WERKSTATTSTECHNIK, SPRINGER VERLAG. BERLIN, DE, vol. 86, no. 5, 1 May 1996 (1996-05-01), pages 248 - 252, XP000585192, ISSN: 0340-4544 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITUB20154627A1 (en) * 2015-10-13 2017-04-13 Sematic S P A PROCEDURE FOR MAINTENANCE OF AN ELECTROMECHANICAL DEVICE
WO2017064637A3 (en) * 2015-10-13 2017-05-26 Wittur Holding Gmbh Method of maintenance of an electromechanical device
CN108431709A (en) * 2015-10-13 2018-08-21 威特控股有限公司 The maintaining method of electromechanical equipment
US20180299876A1 (en) * 2015-10-13 2018-10-18 Wittur Holding Gmbh Method of maintenance of an electromechanical device
US10620617B2 (en) 2015-10-13 2020-04-14 Wittur Holding Gmbh Method of maintenance of an electromechanical device
EP4145237A1 (en) * 2015-10-13 2023-03-08 Wittur Holding GmbH Method of maintenance of an electromechanical device

Also Published As

Publication number Publication date
EP2972613A1 (en) 2016-01-20
ITBO20130107A1 (en) 2014-09-13

Similar Documents

Publication Publication Date Title
Liu et al. Augmented reality-assisted intelligent window for cyber-physical machine tools
EP3076253B1 (en) Systems and methods for presenting an augmented reality
EP3482887B1 (en) Augmented reality safety automation zone system and method
US11850755B2 (en) Visualization and modification of operational bounding zones using augmented reality
US10185309B2 (en) Systems and methods for recommending components for an industrial system
EP3140709B1 (en) Automation operating and management system
US6941248B2 (en) System for operating and observing making use of mobile equipment
US11733667B2 (en) Remote support via visualizations of instructional procedures
JP6917900B2 (en) How to operate the work unit of the textile machine and the operation device
US11263570B2 (en) Generating visualizations for instructional procedures
JP5404450B2 (en) Processing status monitoring device
US20050154489A1 (en) Numerical controller
US11472035B2 (en) Augmented reality visualization for robotic picking system
JP2012111029A (en) System, method and apparatus to display three-dimensional robotic workcell data
JP6396392B2 (en) Setting device and setting system for setting a plurality of devices
CN103302552A (en) Method for operating a processing machine, projection device and processing machine with such a projection device
CN110505947A (en) Robot system and its operation method
JP2022126853A (en) Workpiece positioner and welding sequencer
KR20170116773A (en) Augmented Reality Machine management system using smart glass
EP2972613A1 (en) Automatic machine management operator support system and corresponding method and automatic machine
US20170139381A1 (en) Method for displaying the machining in a machine tool
JP2011158981A (en) Working situation monitoring device
JP2019087256A (en) Cable processing machine control system, cable processing machine system, and method for monitoring and controlling cable processing machine
KR101695504B1 (en) Wireless work condition monitoring system using video camera
CN118131695A (en) AR auxiliary control system and method for coal mine tunneling

Legal Events

Date Code Title Description

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 14722730
    Country of ref document: EP
    Kind code of ref document: A1

WWE WIPO information: entry into national phase
    Ref document number: 2014722730
    Country of ref document: EP

NENP Non-entry into the national phase
    Ref country code: DE