WO2006030529A1 - Simulated biological device provided with pseudo-emotion forming means - Google Patents
Simulated biological device provided with pseudo-emotion forming means
- Publication number
- WO2006030529A1 (PCT/JP2004/014033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- emotion
- parameter
- event
- simulated
- action
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
Definitions
- The present invention relates to a simulated biological apparatus provided with simulated emotion forming means for expressing various actions under computer control, and more particularly to the optimization of its sensitivity and response to externally received stimuli.
- The conventional simulated biological device described above aims to provide a simulated organism that does not bore the user, by having the simulated organism itself create and add new operation patterns. Because the newly created and added movement patterns are restricted to combinations of action elements stored in memory in advance, however, it is difficult to express the mood and emotion of the moment with such combinations of action elements alone.
- The present invention has been proposed in view of the above circumstances. Its purpose is to provide a simulated biological device in which, even when the same reaction behavior is expressed, each behavioral element can be given a lifelike inflection according to the simulated emotion formed from the detection history of stimuli.
- To this end, the present invention relates to a simulated biological apparatus that generates a plurality of operations under computer control, comprising: perception means that detects a stimulus received by the simulated biological apparatus as an external parameter and generates an event; simulated emotion forming means that expresses a simulated emotion as an internal parameter according to the detection situation; and operation execution means that determines a reaction action for the combination of the external parameter and the internal parameter and, upon receiving the event, embodies the reaction action in the motion of a predetermined part. The simulated emotion forming means comprises an emotion calculation unit that, in response to the event, forms an emotion parameter composed of a combination of two kinds of parameters expressing opposing mental states, and an action propagation amount calculation unit that forms motion propagation parameters governing the motion elements according to the emotion parameter and sends them to the operation execution means.
- The emotion parameter is expressed by a combination of two types of parameters that express opposing mental states in response to the event, for example a parameter expressing pleasure/discomfort and a parameter expressing excitement/calmness.
- A configuration may also be provided with an emotion judgment unit that outputs a simulated emotion region indicating in which of the four quadrants, partitioned two-dimensionally at the boundaries of the opposing mental states, the emotion parameter lies.
- The simulated emotion forming means may be configured to include an emotion calculation unit that performs an emotion convergence step for bringing the emotion parameter closer to the emotion convergence point as time passes.
- A configuration may also be provided with an emotion calculation unit that performs a character formation step of moving the emotion convergence point according to the sign of the cumulative increase/decrease of the two types of parameters expressing the opposing mental states, each time an event is received.
- According to the present invention, not only is an action pattern selected according to the type of stimulus and the detection history via the external parameter and the internal parameter, but the behavioral elements of the reaction action selected by the same event are also increased or decreased according to the feelings and moods expressed by the internal parameters. As a result, the reaction action of the simulated biological device acquires an inflection similar to that of a living thing.
- FIG. 1 is a block diagram showing an example of perception means of a simulated biological apparatus according to the present invention.
- FIG. 2 is an overall functional block diagram showing an example of the simulated biological apparatus according to the present invention.
- FIG. 3 is a block diagram showing an example of simulated emotion forming means of the simulated biological apparatus according to the present invention.
- FIG. 4 is a block diagram showing an example of the operation means of the simulated biological apparatus according to the present invention.
- FIG. 5 is a flowchart showing an example of external parameter detection processing in the simulated biological apparatus according to the present invention.
- FIG. 6 is a flowchart showing an example of external parameter analysis processing in the simulated biological apparatus according to the present invention.
- FIG. 7(a) is a flowchart showing an example of event detection processing and priority event determination processing in the simulated biological apparatus according to the present invention.
- FIG. 7(b) is a flowchart showing an example of the priority event determination processing in the flowchart of FIG. 7(a).
- FIG. 8 is a chart showing an example of an outline of emotion parameters in the simulated biological apparatus according to the present invention.
- The illustrated embodiment is a simulated biological apparatus, a so-called robot system, that generates a plurality of operations under the control of a computer system. As shown in FIG. 2, it comprises a plurality of sensors 7 and a computer having a switch, a memory, and a CPU that controls the sensors 7, housed in a casing having a predetermined outer shape and a movable structure.
- The apparatus comprises: perception means 1 that detects a stimulus received by the simulated biological device as an external parameter and generates an event (reaction event) for producing a reaction action; action determination means 4 that spontaneously generates an event (autonomous event) for producing an autonomous action even when no external parameter is detected, and selects which of the reaction event and the autonomous event is actually expressed; simulated emotion forming means 2 that derives, in memory, internal parameters for expressing simulated emotions according to the detection status of the external parameters included in the reaction event; a database 3 storing assignments of a plurality of operation patterns to combinations of external parameters and internal parameters (including those included in autonomous events; the same applies hereinafter); operation execution means 5 that, upon receiving a reaction event or an autonomous event (hereinafter simply an event), refers to the database 3 to determine the action pattern for the combination of external and internal parameters or for the autonomous action, and embodies it in the movement of a predetermined part; and a timer 18 that outputs time information (see FIG. 2).
- The action determination means 4 comprises an autonomous action generation unit 11 that generates the autonomous event, including internal parameters (autonomous parameters) relating to autonomous actions expressed spontaneously even when no external stimulus is detected and no external parameter exists, and an acceptance/rejection decision unit 12 that receives and analyzes the reaction events output from the perception means 1 together with the autonomous events, and outputs either a reaction event or an autonomous event selected on the basis of the priority ranking carried in the event priority information described below (see FIG. 2).
- The event information includes event classification information for distinguishing the autonomous event from the reaction event, and event priority information indicating the priority order of the various events. In the case of a reaction event, it also includes, as the external parameter, information identifying the sensor or sensors (acceleration sensor, tactile sensor, etc.) that detected the stimulus from the outside world, and event intensity information representing the amount of stimulus detected by the sensor or sensors. For a tactile sensor, for example, the stimulus category such as "stroke", "hit", or "press strongly" can be specified from the amount of stimulus and included in the event classification information. In the case of an autonomous event, the autonomous parameter is included as the event classification information.
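As a rough sketch of this event information, the Python fragment below models the fields described above. The field names and types are assumptions; the patent specifies the content of the event information, not any particular encoding.

```python
# Hypothetical encoding of the event information described in the text.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    kind: str                       # event classification: "reaction" or "autonomous"
    priority: int                   # event priority information (higher wins)
    sensor: Optional[str] = None    # sensor that detected the stimulus, e.g. "tactile"
    intensity: float = 0.0          # event intensity: amount of stimulus detected
    category: Optional[str] = None  # stimulus category, e.g. "stroke", "hit"

# A reaction event from a tactile sensor:
ev = Event(kind="reaction", priority=2, sensor="tactile",
           intensity=0.4, category="stroke")
```

An autonomous event would instead carry the autonomous parameter in its classification field, with `sensor` left unset.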
- The action execution means 5 comprises: various actuators 19 including motors, speakers, and the like; an action selection unit 13 that, upon receiving an event, refers to the database 3 and selects the operation pattern according to the external parameters and autonomous parameters included in the event; a control amount calculation unit 14 that derives the control parameters of the motion elements related to the operation pattern selected by the action selection unit 13, based on the motion propagation parameters given from the simulated emotion forming means 2; and an actuator controller 15 that adjusts control signals such as drive energy to the various actuators 19 according to the control parameters (see FIG. 4).
- The perception means 1 comprises sensors 7, built into the simulated biological device, that detect sound, light, infrared, heat, acceleration, or pressure, and a sensor processing unit 8 that performs arithmetic processing on the outputs of the sensors 7 and outputs the results as reaction events including the external parameters.
- The sensor processing unit 8 of the perception means 1 comprises: input interface units 27 that detect the external parameters from the outputs of the sensors 7; an event generation unit 25 that generates the reaction event assigned to an external parameter, or to a combination of multiple external parameters, obtained from the sensors 7 through the input interface units 27; and stimulus priority determination means 26 that detects competition among reaction events output from the event generation unit 25 (that is, reaction events output in succession with a relatively short time difference within a certain time zone) and determines which of the old and new reaction events is to be adopted by the operation execution means 5.
- Each input interface unit 27 includes a level detection unit 20 that detects the amount of stimulation from the sensor 7 as an external parameter, and a level determination unit that classifies the level of the external parameter based on one or more threshold values.
- The event generation unit 25 refers to an event table held in memory, for example with the contents shown in the left part of Table 1, and generates various reaction events depending on the type of sensor 7 and the level determination result (for the tactile sensors illustrated in the table, levels corresponding to stroking, tapping, or pressing hard) according to the detection status of the sensors attached to each part of the housing (see FIG. 1).
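The level determination and event-table lookup described above can be sketched as follows. The threshold values and table entries are illustrative stand-ins for the contents of Table 1, which the source does not reproduce.

```python
# Illustrative thresholds and (sensor, level) -> event table; not from the patent.
TACTILE_THRESHOLDS = [0.3, 0.7]           # boundaries between stimulus levels

EVENT_TABLE = {                           # cf. the left part of Table 1
    ("tactile", 0): "stroke",
    ("tactile", 1): "press_hard",
    ("tactile", 2): "hit",
}

def determine_level(amount, thresholds):
    """Level determination unit: count how many thresholds the amount exceeds."""
    return sum(amount > t for t in thresholds)

def generate_event(sensor, amount):
    """Event generation unit 25: look up the reaction event for the level."""
    level = determine_level(amount, TACTILE_THRESHOLDS)
    return EVENT_TABLE[(sensor, level)]

print(generate_event("tactile", 0.2))   # a weak touch classifies as "stroke"
print(generate_event("tactile", 0.9))   # a strong stimulus classifies as "hit"
```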
- In the processing at the event generation unit 25, the presence or absence of sensor information (external parameters) based on inputs from the multiple sensors 7 is periodically and sequentially detected by the series of processing (external parameter detection processing) shown in FIG. 5. The series of processing (external parameter analysis processing) shown in FIG. 6 then handles the cases where no external parameter is detected, only one is detected, or multiple are detected, processing the differing reaction events so as to limit the number of newly generated reaction events to one. When no external parameter is detected, the external parameter detection processing is simply repeated to detect a new external parameter (non-detection processing).
- When multiple external parameters are detected simultaneously, for example detection by the acceleration sensor of the external parameter corresponding to a "lifting" stimulus together with detection of the external parameter corresponding to a "stroking" stimulus, it is checked whether the combination of external parameters satisfies the condition for generating a specific composite event, such as "pleasure". If such a combination exists, the reaction event corresponding to the combination is adopted as the new event (simultaneous composite processing).
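A minimal sketch of this simultaneous composite processing, assuming a lookup keyed by the set of simultaneously detected parameters. The combination shown is the "lifting"+"stroking" example from the text; the fallback rule for unrelated simultaneous parameters is a placeholder assumption.

```python
# Composite-event lookup keyed by the set of simultaneously detected parameters.
COMPOSITE_EVENTS = {
    frozenset({"lifting", "stroking"}): "pleasure",  # example from the text
}

def analyze(params):
    """External parameter analysis: return the single event to adopt this cycle."""
    if not params:
        return None                        # non-detection processing: keep polling
    combo = COMPOSITE_EVENTS.get(frozenset(params))
    if combo is not None:
        return combo                       # simultaneous composite processing
    return sorted(params)[0]               # otherwise pick one (placeholder rule)
```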
- In this case, the current operation is interrupted (although the interrupt judgment criteria differ) and the apparatus moves to the operation for the new event.
- The stimulus priority determination means 26 periodically verifies the presence or absence of a new reaction event sent from the event generation unit 25, to which the interface units 27 of the sensors are connected, and the presence or absence of a previous reaction event currently being executed (event detection processing).
- The reaction event to be adopted by the action execution means 5 when determining the reaction action is determined, as shown for example in Table 2 and Table 3, through at least one of an inter-category priority determination step, based on the priority given in advance to the category of each reaction event, and an intra-category priority determination step, based on the priority within the category.
- If no previous reaction event exists, the new reaction event detected by the event detection processing is sent to the action determination means 4, and the operation execution means 5 starts executing the operation pattern for the new event. If no new event is detected, the remainder of the ongoing operation pattern continues to be expressed.
- If the new event and the previous event belong to the same category, the intra-category priority determination step is performed; if they belong to different categories, the inter-category priority determination step is performed.
- In either the intra-category or the inter-category priority determination step, if the priority of the previous event wins, the new event that occurred later is discarded and the ongoing operation pattern is continued. If the priority of the new event is higher, execution of the ongoing operation pattern is stopped and the new event is sent to the action determination means 4, whereupon the action execution means 5 starts executing the operation pattern for the new event (see FIG. 6). If the priorities are equal, the latest reaction event is sent to the action determination means 4.
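The acceptance flow between the previous event and the new event can be sketched as follows, with the rank dictionaries standing in for Tables 2 and 3. The category names and rank values are illustrative assumptions.

```python
# Arbitration between an ongoing (previous) event and a newly detected one.
def adopt(prev, new, category_rank, intra_rank):
    """Return the event the action execution means should follow.

    prev/new are (category, name) tuples or None; the rank dicts play the
    role of Tables 2 and 3 (a higher rank wins)."""
    if prev is None:
        return new
    if new is None:
        return prev
    if prev[0] == new[0]:                          # intra-category step
        p, n = intra_rank[prev[1]], intra_rank[new[1]]
    else:                                          # inter-category step
        p, n = category_rank[prev[0]], category_rank[new[0]]
    return prev if p > n else new                  # a tie keeps the newest event

intra = {"stroke": 0, "press_hard": 1, "hit": 2}
cats = {"normal": 0, "system_abnormality": 1}
# A hit interrupts an ongoing stroke reaction:
winner = adopt(("normal", "stroke"), ("normal", "hit"), cats, intra)
```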
- In this example, the simulated biological device is a seal-type robot.
- Since setting detailed priorities for the many possible events based only on the type of sensor 7 was judged inappropriate, events based on detection from sensors 7 associated with system movement, and from sensors 7 (heat sensor, current sensor, etc.) that detect conditions associated with system abnormalities such as battery exhaustion and abnormal overheating, are given higher priority than events based on detection from other types of sensors 7; all other types of sensors 7 share the same priority (see Table 2 and Table 3).
- The contents of the judgment criterion table are basically set such that an event caused by a strong stimulus has a higher priority than an event caused by a weak stimulus, regardless of whether the sensors 7 are of the same type. Examples of these criteria, covering the relationship between the previous event and a new event that occurred later, are shown in Tables 2 and 3 above and stored in memory.
- For priorities among sensors of the same type: in the case of sound sensor input, (priority of event due to detection of relatively weak sound) < (priority of event due to detection of relatively strong sound); in the case of optical sensor input, (priority of event due to detection of an everyday light quantity) < (priority of event due to detection of an extraordinary light quantity, e.g. a camera flash); in the case of tactile (pressure) sensor input, (priority of event due to detection of stroking) < (priority of event due to detection of strong pressing) < (priority of event due to detection of striking); and in the case of acceleration sensor input, (priority of event due to detection of relatively weak vibration) < (priority of event due to detection of relatively strong vibration).
- For priorities among different types of input: (priority of event based on voice direction detection) < (priority of event based on speech recognition input using a microphone and language recognition means such as a speech recognition chip); and (priority of events based on voice direction input, speech recognition input, optical sensor input, tactile sensor input, and relatively weak acceleration sensor input) < (priority of event based on relatively strong acceleration sensor input), reflecting the instinct to sense danger when a strong vibration such as an earthquake is detected.
- Likewise handled without conflict is an event generated in the event generation unit 25 of the sensor processing unit by detecting a time difference among plural tactile sensor input positions from the back toward the front (for example, a stroking event).
- The simulated emotion forming means 2 comprises an emotion calculation unit 9 that derives the emotion parameter, one of the internal parameters, using the external parameters obtained from the perception means 1, and a motion propagation amount calculation unit 10 that, based on the calculated emotion parameter, derives the motion propagation parameters: internal parameters for changing the holding time, number of repetitions, movement amount (amplitude), steady position, and movement speed (hereinafter collectively, motion elements) of a reaction action or an autonomous action (see FIG. 2 and FIG. 3).
- The emotion calculation unit 9 forms an emotion parameter composed of a combination of two kinds of parameters expressing opposing mental states in response to the event. It is provided with an emotion determination unit 21 that outputs an emotion parameter (emotion information) indicating in which quadrant of the four-quadrant simulated emotion region the current emotion lies, the region being divided two-dimensionally at the boundary of pleasant/unpleasant and the boundary of excitement/calm by combining the parameter expressing pleasure/discomfort and the parameter expressing excitement/calm.
- The emotion parameter is a combination of numerical data (Kkh, Kkt) consisting of the pleasant/unpleasant parameter Kkh and the excitement/calm parameter Kkt. When the simulated emotion forming means 2 detects the external parameters of various reaction events, the numerical values of Kkh and Kkt are increased or decreased as appropriate (for example, see Table 4).
- As a result, the simulated emotion of the simulated biological device transitions to one of four simulated emotion zones, "joy", "anger", "sorrow", and "pleasure", as shown in Table 5. In regions where the numerical data composed of the pleasant/unpleasant parameter Kkh and the excitement/calm parameter Kkt exceed a certain threshold, the corresponding emotion becomes extremely prominent among the four simulated emotion zones.
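A sketch of the emotion determination by quadrant follows. Which named zone corresponds to which quadrant is an assumption: the text lists the four zone names but does not assign them to quadrants.

```python
# Hypothetical quadrant-to-zone mapping; the signs of (Kkh, Kkt) select a zone.
def emotion_zone(kkh, kkt):
    """Emotion determination unit 21: map (Kkh, Kkt) to a simulated emotion zone."""
    if kkh >= 0 and kkt >= 0:
        return "joy"       # pleasant and excited (assumed mapping)
    if kkh < 0 and kkt >= 0:
        return "anger"     # unpleasant and excited (assumed mapping)
    if kkh < 0:
        return "sorrow"    # unpleasant and calm (assumed mapping)
    return "pleasure"      # pleasant and calm (assumed mapping)
```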
- Each time the emotion calculation unit 9 detects a reaction event, the motion propagation amount calculation unit 10 outputs the motion propagation parameters (repetition number increase/decrease parameter, steady position parameter, operation holding time increase/decrease parameter, speed parameter, amplitude increase/decrease parameter, etc.) adjusted according to the emotion parameter (emotional state) output by the emotion calculation unit 9, and the operation execution means 5 reflects those motion propagation parameters in the expressed reaction action.
- For example, the steady position of the eyelids in each behavioral segment is lowered according to the simulated emotion formed from the detection history of stimuli. Since the simulated emotion changes under certain conditions each time a stimulus is received, the number of possible facial expressions becomes practically uncountable even considering only changes in steady position and movement speed (see Table 6).
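How the motion propagation parameters might modulate the motion elements can be sketched as below. The scaling rules and coefficients are illustrative assumptions, not values from the patent.

```python
# Illustrative mapping from the emotion parameter to motion-element adjustments.
def motion_propagation(kkh, kkt):
    """Motion propagation amount calculation: (Kkh, Kkt) -> element adjustments."""
    return {
        "speed_factor": 1.0 + 0.1 * kkt,       # excitement speeds movement up
        "amplitude_factor": 1.0 + 0.05 * kkh,  # pleasure enlarges movements
        "repeat_delta": 1 if kkt > 0 else 0,   # excited: one extra repetition
        "eyelid_droop": 0.1 * max(-kkh, 0),    # unpleasant: lids settle lower
    }
```

Under this sketch the same selected action pattern is executed faster and larger when the device is excited and pleased, and slower with drooping eyelids when it is calm and displeased, which is the inflection the text describes.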
- The simulated emotion forming means 2 in this example includes an emotion calculation unit 9 that performs an emotion convergence step for bringing the emotion parameter closer to the emotion convergence point as time passes.
- The emotion convergence point is the emotion parameter of the emotional state to be restored when a long time has passed without receiving any stimulus from the outside world, and it can also express the individuality of the simulated biological device.
- The emotion convergence point is a combination (Kkh_base, Kkt_base) of the pleasant/unpleasant parameter convergence point Kkh_base and the excitement/calm parameter convergence point Kkt_base, and every time a certain time elapses, the emotion parameter (Kkh, Kkt) is moved closer to it.
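A minimal sketch of the emotion convergence step, assuming a fixed step of one unit per timer tick (the step size is an assumption; the text only says the parameter approaches the convergence point as time elapses).

```python
# Nudge the emotion parameter toward the convergence point each timer tick.
def converge(value, base, step=1):
    """Move one component of the emotion parameter toward its base value."""
    if value > base:
        return max(value - step, base)
    return min(value + step, base)

def convergence_tick(kkh, kkt, kkh_base=0, kkt_base=0):
    """Emotion convergence step for the pair (Kkh, Kkt)."""
    return converge(kkh, kkh_base), converge(kkt, kkt_base)
```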
- The emotion calculation unit 9 of the simulated emotion forming means 2 also performs a character formation step of moving the emotion convergence point according to the sign of the cumulative increase/decrease of the two kinds of parameters expressing the opposing mental states, each time an event is received.
- Whereas the emotion convergence step is performed unconditionally as time passes, the character formation step differs in that it is performed on condition that an event is received.
- The character formation step also acts over a longer time scale than the emotion convergence step: the behavioral tendency cultivated through communication gradually shifts the convergence point toward a relatively stable point, so that the emotion convergence step comes to form the character into which the device should settle.
- The character formation step is performed using the increase/decrease amount ΔKkh of the pleasant/unpleasant parameter Kkh and the increase/decrease amount ΔKkt of the excitement/calm parameter Kkt shown in Table 1. Each time the emotion calculation unit 9 detects an event, the amounts ΔKkh and ΔKkt are accumulated as Σ(ΔKkh) and Σ(ΔKkt), respectively.
- The movement of the convergence point is determined according to the judgment formulas shown in Table 7: when the cumulative increase/decrease Σ(ΔKkh) or Σ(ΔKkt) is positive, 2 is added to the pleasant/unpleasant parameter convergence point Kkh_base or the excitement/calm parameter convergence point Kkt_base, so that the pleasant/unpleasant parameter Kkh and the excitement/calm parameter Kkt tend toward a pleasant or excited tendency. As shown in Table 8, when the pleasant/unpleasant parameter convergence point Kkh_base or the excitement/calm parameter convergence point Kkt_base is positive, 1 is subtracted from it.
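The character formation step can be sketched as follows. The +2 shift on a positive cumulative sum follows the text; the subtraction of 1 from a positive convergence point follows the reading of Table 8 adopted here and should be treated as an assumption.

```python
# Accumulate per-event increments and periodically shift the convergence point.
class Character:
    def __init__(self):
        self.kkh_base = 0   # pleasant/unpleasant convergence point Kkh_base
        self.kkt_base = 0   # excitement/calm convergence point Kkt_base
        self.sum_dkkh = 0   # cumulative Σ(ΔKkh)
        self.sum_dkkt = 0   # cumulative Σ(ΔKkt)

    def on_event(self, dkkh, dkkt):
        """Accumulate each event's increments (cf. Table 1)."""
        self.sum_dkkh += dkkh
        self.sum_dkkt += dkkt

    def form_character(self):
        """Move the convergence point per the Table 7/Table 8 reading above."""
        if self.sum_dkkh > 0:
            self.kkh_base += 2          # drift toward a pleasant disposition
        elif self.kkh_base > 0:
            self.kkh_base -= 1          # assumed decay from Table 8
        if self.sum_dkkt > 0:
            self.kkt_base += 2          # drift toward an excitable disposition
        elif self.kkt_base > 0:
            self.kkt_base -= 1
        self.sum_dkkh = self.sum_dkkt = 0
```

A device that is mostly stroked thus gradually acquires a pleasant resting disposition, which the emotion convergence step then restores between stimuli.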
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2004/014033 WO2006030529A1 (ja) | 2004-09-17 | 2004-09-17 | Simulated biological device provided with pseudo-emotion forming means |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2004/014033 WO2006030529A1 (ja) | 2004-09-17 | 2004-09-17 | Simulated biological device provided with pseudo-emotion forming means |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006030529A1 true WO2006030529A1 (ja) | 2006-03-23 |
Family
ID=36059796
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/014033 WO2006030529A1 (ja) | 2004-09-17 | 2004-09-17 | Simulated biological device provided with pseudo-emotion forming means |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2006030529A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1256931A1 (en) * | 2001-05-11 | 2002-11-13 | Sony France S.A. | Method and apparatus for voice synthesis and robot apparatus |
WO2002099545A1 (en) * | 2001-06-01 | 2002-12-12 | Sony International (Europe) Gmbh | Man-machine interface unit control method, robot apparatus, and its action control method |
JP2003208161A (ja) * | 2001-11-12 | 2003-07-25 | Sony Corp | Robot apparatus and control method thereof |
JP2003285285A (ja) * | 2002-03-27 | 2003-10-07 | Nec Corp | Robot device having a software agent, and control method and program therefor |
JP2004283958A (ja) * | 2003-03-20 | 2004-10-14 | Sony Corp | Robot device, and behavior control method and program therefor |
- 2004-09-17: PCT/JP2004/014033 filed as WO2006030529A1 (active, Application Filing)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6517762B2 (ja) | Robot system in which a robot learns operations for work performed in cooperation between a human and the robot | |
JP2019164352A (ja) | Method for performing a multimodal conversation between a humanoid robot and a user, computer program implementing said method, and humanoid robot | |
JP5167368B2 (ja) | Mobile body control device and mobile body control method | |
EP3456487A2 (en) | Robot, method of controlling the same, and program | |
KR20170027704A (ko) | Humanoid robot having autonomous living capability | |
JP2005199402A (ja) | Behavior control system and behavior control method for robot apparatus | |
EP3790500B1 (en) | Methods and systems for adjusting behavior of oral care device based on status of oral cavity | |
KR20160072621A (ko) | Robot service system capable of learning and inference | |
JP2007143886A (ja) | Electric wheelchair system | |
JP7120060B2 (ja) | Voice dialogue device, and control device and control program for a voice dialogue device | |
JP2003200370A (ja) | Object growth control system and object growth control method | |
JP7370531B2 (ja) | Response device and response method | |
WO2006030529A1 (ja) | Simulated biological device provided with pseudo-emotion forming means | |
Takanishi et al. | An anthropomorphic head-eye robot expressing emotions based on equations of emotion | |
CN111671621B (zh) | Power-assist device control system | |
JP2019155546A (ja) | Control device, control method, and control program | |
CN113661036A (zh) | Information processing device, information processing method, and program | |
WO2006030530A1 (ja) | Simulated biological device provided with stimulus priority determination means | |
Murai et al. | Voice activated wheelchair with collision avoidance using sensor information | |
JP2020124392A (ja) | Information processing device and information processing system | |
JP2003266353A (ja) | Robot device and control method thereof | |
JP7414735B2 (ja) | Method for controlling a plurality of robot effectors | |
Lee et al. | Development of therapeutic expression for a cat robot in the treatment of autism spectrum disorders | |
JP2017077595A (ja) | Emotional motion control device and emotional motion control method for an action body | |
JP4635486B2 (ja) | Concept acquisition device and method, and robot device and behavior control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 04773415 Country of ref document: EP Kind code of ref document: A1 |