NL2012639B1 - Method and Arrangement for Controlling a Vehicle. - Google Patents
Method and Arrangement for Controlling a Vehicle. Download PDFInfo
- Publication number
- NL2012639B1
- Authority
- NL
- Netherlands
- Prior art keywords
- user interface
- user input
- vehicle
- control unit
- state
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01F—PROCESSING OF HARVESTED PRODUCE; HAY OR STRAW PRESSES; DEVICES FOR STORING AGRICULTURAL OR HORTICULTURAL PRODUCE
- A01F15/00—Baling presses for straw, hay or the like
- A01F15/07—Rotobalers, i.e. machines for forming cylindrical bales by winding and pressing
-
- B60K35/10—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/082—Selecting or switching between different modes of propelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- B60K2360/1438—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/215—Selection or confirmation of options
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2200/00—Type of vehicle
- B60Y2200/20—Off-Road Vehicles
- B60Y2200/22—Agricultural vehicles
Abstract
A vehicle is transferred from a first state into a second state. This state transition is triggered by a control unit as a reaction to a user input. The control unit executes a control program. The user input is captured by a user interface unit executing a user interface specification. A program generator generates the control program from a computer-executable state transition model. A user interface generator generates the user interface specification from the same state transition model. This state transition model specifies all possible state transitions and the corresponding user input capturing objects.
Description
Method and Arrangement for Controlling a Vehicle Field of the Invention
The invention relates to a method and an arrangement for controlling a vehicle wherein a control unit triggers the vehicle to perform at least one state transition. This state transition transfers the vehicle from one state into a further state, e.g. moves a pivotal part of the vehicle from a parking position into an operating position or switches a device on board of the vehicle on or off.
The invention can in particular be used in a bale forming apparatus and a bale forming method for forming a cylindrical or cuboid bale from loose material, in particular from agricultural crop material.
Background of the Invention
During its operation a vehicle performs several operations. Examples for these operations are: A mechanical part of the vehicle is pivoted or shifted or otherwise moved. A part is locked such that it cannot be moved any more. This part is released again. An electric or hydraulic drive for a part of the vehicle is switched on or off. In the case of an apparatus which creates bales from loose agricultural material, two possible operations are to open a tailgate of the bale forming chamber and to eject a completed bale out of the bale forming chamber. A control unit triggers all these operations, e.g. by triggering mechanical or electrical or hydraulic actuators. Every such operation can be denoted as a state transition of the vehicle. In many cases such a state transition is initiated by the input of a human operator, e.g. a driver of the vehicle. The operator makes an input into a user interface. This user input triggers a state transition.
Summary of the Invention
An object of the invention is to provide a method and an arrangement for controlling a vehicle by means of a control arrangement comprising a control unit and a user interface unit, wherein the vehicle can be operated in different states, wherein a user input on a user input device of the user interface unit initiates a state transition from one state to a further state, wherein a control program running on the control unit triggers the initiated state transition and wherein the operations of the control program should be consistent with the operations of the user interface unit.
The problem is solved by a vehicle controlling method with the features of claim 1 and by a vehicle controlling arrangement with the features of claim 7. Preferred embodiments are specified in the dependent claims.
The control method and the control arrangement according to the invention control a vehicle. This vehicle can be operated in several different states. The method is performed by using the control arrangement. This control arrangement comprises a control unit and a user interface unit with a display device and with a user input device. The control unit of the arrangement can trigger at least one state transition of the vehicle. Before this state transition the vehicle has been operated in a first state and as a result of this state transition the vehicle is operated in a second state.
The user interface unit can display at least one user input capturing object on the display device. The user interface unit can further capture a user input referring to this user input capturing object. For doing so the user interface unit comprises the user input device.
During the operation of the vehicle the following state transition sequence is automatically performed at least one time: • The control unit automatically triggers the user interface unit to display one user input capturing object. • The user interface unit automatically displays the user input capturing object on the display device. This displaying step is a reaction to being triggered by the control unit. The user interface unit displays the user input capturing object according to an executable user interface specification. • The user interface unit automatically captures a user input which has been performed on the user input device and which refers to the displayed user input capturing object. This user input stems from a human operator of the vehicle. • The control unit automatically processes this user input referring to the user input capturing object. • As a reaction to the captured and processed user input referring to the user input capturing object, the control unit automatically triggers the state transition from the first state to the second state.
Before this state transition sequence was performed, the vehicle was operated in the first state. Afterwards the vehicle is operated in the second state.
The control program executed by the control unit as well as the user interface specification for the user interface unit are automatically generated by processing the same given state transition model for operating the vehicle. This state transition model is provided and is stored in a model data storing means. This state transition model can automatically be executed by a computer. The model specifies • every possible state transition between different possible states of the vehicle provided that this state transition can be triggered by the control unit as a reaction to a user input, • every possible user input capturing object referring to such a possible state transition, and • for every possible user input capturing object an assignment to that possible state transition to which this user input capturing object refers. A program generator automatically generates the control program by processing the state transition model. This generated control program is afterwards executed by the control unit of the arrangement. The control unit executing the control program • can trigger every possible state transition according to the provided state transition model and • can trigger the user interface unit to display every possible user input capturing object according to the state transition model. A user interface generator automatically generates the executable user interface specification by processing the same state transition model. By processing the generated user interface specification the user interface unit can display every possible user input capturing object according to the user interface specification. This user input capturing object is displayed after the control unit executing the control program triggers the user interface unit to display it.
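By way of illustration only - the following sketch is not part of the original specification and all names in it are hypothetical - a minimal C++ example of this generation idea: one state transition model is the single source from which a program generator derives control logic and a user interface generator derives display data.

```cpp
// Sketch under assumptions: a drastically simplified state transition model and the two
// generators that process it. Real generators would parse a UML state chart instead.
#include <iostream>
#include <string>
#include <vector>

struct Transition {                // one state transition that a user input may trigger
    std::string fromState;
    std::string toState;
    std::string inputObjectId;     // the user input capturing object assigned to it
};

struct StateTransitionModel {
    std::vector<Transition> transitions;
};

// Program generator: emits control logic only, no graphical information.
void generateControlProgram(const StateTransitionModel& m, std::ostream& out) {
    for (const auto& t : m.transitions)
        out << "if (state == \"" << t.fromState << "\" && input == \"" << t.inputObjectId
            << "\") triggerTransition(\"" << t.toState << "\");\n";
}

// User interface generator: emits display data only, no control logic.
void generateUiSpecification(const StateTransitionModel& m, std::ostream& out) {
    for (const auto& t : m.transitions)
        out << "<softkey id=\"" << t.inputObjectId << "\" label=\"" << t.fromState
            << " -> " << t.toState << "\"/>\n";
}

int main() {
    StateTransitionModel model{{{"tailgate_closed", "tailgate_open", "UI_OPEN_TAILGATE"}}};
    generateControlProgram(model, std::cout);   // both outputs stem from the same model,
    generateUiSpecification(model, std::cout);  // so they cannot become inconsistent
}
```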
Advantages
The control program running on the control unit triggers the state transitions of the vehicle. The control unit and the executable user interface specification cause the user interface unit to display several user input capturing objects. According to the invention the control program and the user interface specification are automatically generated from the same state transition model. This state transition model specifies all possible state transitions which the user can demand by a user input and all user input capturing objects for demanding a state transition. As the same state transition model specifies all possible state transitions and all possible user input capturing objects and as the same state transition model is used for generating the control program as well as the user interface specification, all operations of the control unit are consistent with the operations of the user interface unit. For every possible state transition the corresponding user input capturing object is specified and will be displayed according to the user interface specification.
Thanks to the invention it is not necessary to adapt, amend, or change the control program or the user interface specification manually. An amendment, e.g. an adaptation to an additional state transition which is desired but not yet considered in the state transition model, is performed solely by adapting the state transition model and by triggering again the automatic generation of the control program and the user interface specification, this time by processing the amended state transition model. In other words: only one model has to be changed, not a program and additionally a specification.
The invention eliminates the risk that a change or an amendment makes the user interface unit operate inconsistently with the control unit running the control program. Such an inconsistent operation could yield an undesired event during operation of the vehicle. This risk would occur if a human had to change the control program or the user interface specification manually.
The invention removes the need for a human expert to implement the control program or the user interface specification manually. Instead, the human expert creates the state transition model. This human expert can be a mechanical or electrical engineer knowing the required operations of the vehicle, e.g. an engineer for vehicle design and operation. This human expert utilizes vehicle design abilities and model building abilities but does not need to apply programming knowledge. According to the invention this state transition model is automatically processed. Programming knowledge is required for implementing the control method and arrangement according to the invention, but neither for generating the control program nor for generating the user interface specification.
No human interaction is required for generating the control program and the user interface specification from the same state transition model. In particular no human needs to inspect the automatically generated source code. Therefore it is easy to repeat this generation step for a sequence of different versions or a set of alternative variations of the state transition model and to evaluate and compare the results on the vehicle.
Thanks to the invention the control program is clearly separated from the user interface specification. The generated control program does not comprise statements for influencing the graphical user interface. In turn the generated user interface specification does not comprise any control logic. The user interface specification specifies the data which the user interface unit needs for displaying the user input capturing object. The control program does not need to comprise or to specify graphical information for displaying the user input capturing object. On the other hand the user interface specification does not need to comprise any data about states and state transitions.
Preferred Embodiments
Preferably the vehicle can be operated in at least three different states. A first state transition transfers the vehicle from a first state into a second state. A second state transition transfers the vehicle from the second state into a third state. Two different user input capturing objects are assigned to these two state transitions. A first sequence is performed for executing the first state transition. A second sequence is performed for executing the second state transition. Every sequence comprises the steps that the user interface unit • is triggered, • displays a user input capturing object, and • captures a corresponding user input and that the control unit triggers the corresponding state transition of the vehicle.
The provided state transition model specifies both state transitions and both user input capturing objects.
It is also possible that performing a first state transition transfers the vehicle from the first state into the second state and performing a second state transition transfers the vehicle from the second state back into the first state. One example: A part is first moved from a parking position into an operating position and afterwards moved back into the parking position. Two different user input capturing objects are assigned to these two state transitions. The provided state transition model specifies both state transitions and both user input capturing objects.
In one embodiment the control unit does not only trigger state transitions but also generates alerts to a human operator. Such an alert is automatically generated depending on the evaluation of at least one signal from a sensor. This sensor monitors the vehicle during operation. The alert is generated if the sensor detects that the vehicle is in a specific state. In one embodiment the sensor measures the current value of a time-varying variable and checks whether the measured current value is in a given domain. If this is the case, the alert is generated.
In this embodiment the state transition model additionally specifies the possible alerts to be generated and for every possible alert the state referring to this alert. The state transition model defines every possible state for which an alert has to be displayed, e.g. a state variable and a value or a domain of values which means that this state has occurred. The user interface generator generates an alert displaying unit for every defined alert. Preferably this alert displaying unit is part of the user interface specification. This embodiment ensures that suitable alerts are generated according to states of the vehicle and that the generation of alerts is consistent with the state transitions triggered by the control unit.
Preferably the control unit of the vehicle has access to a program data storing means being part of the vehicle. The user interface unit has access to a specification data storing means also being part of the vehicle. The control program generated by the program generator is stored in the program data storing means. The user interface specification generated by the user interface generator is stored in the specification data storing means.
In this embodiment the following parts can be arranged outside of the vehicle, e.g. in a remote stationary data processing unit or on a further vehicle: • the model data storing means with the state transition model, • the program generator, and • the user interface generator.
This embodiment allows generating identical control programs and user interface specifications for different vehicles. The same state transition model is used for these vehicles.
Before operating the vehicle, the control program and the user interface specification are transmitted to the or every vehicle. During the operation of the vehicle no access to the control arrangement parts outside of the vehicle is required. This embodiment saves the need to establish a data connection between the vehicle and a central data processing unit outside the vehicle while the vehicle is operated. Such a data connection can be interrupted during operation of the vehicle. Therefore it is an advantage not to rely on such a data connection during operation. In addition, less computational power and less storage capacity are required on board of the vehicle.
It is also possible that the model data storing means, an editor for amending the state transition model, the program generator, and the user interface generator are arranged on a mobile computer. This mobile computer can temporarily or permanently be connected with the control unit and with the user interface unit, e.g. by means of a data network of the vehicle. This embodiment enables an engineer to perform a testing operation on board of the vehicle wherein at least once the following sequence is performed: • The state transition model is amended, e.g. as a reaction on a detected undesired behavior of the vehicle while testing the vehicle. • The program generator generates the control program by processing the amended state transition model. • The user interface generator generates the user interface specification by processing the same amended state transition model. • The vehicle is operated according to this control program and to this user interface specification.
Preferably the program generator generates source code for the control program. In one embodiment the control unit interprets this source code. In a further embodiment a compiler generates an executable control program from the generated source code. The compiler is a part of the program generator. The control unit executes this compiled control program. Only the compiled and executable control program needs to be stored on a data storing means to which the control unit has access. During runtime the control unit on board of the vehicle needs access to the executable control program but neither to the source code nor to the compiler.
In one embodiment the user interface specification comprises a unique identifier for every user input capturing object. This unique identifier distinguishes this user input capturing object from every other user input capturing object specified in the user interface specification. A passage of the control program triggers the state transition which can be caused by a user input referring to this user input capturing object. This passage comprises the unique identifier for this user input capturing object. A message from the user interface unit to the control unit about the user input also comprises this unique identifier. The user interface specification also comprises these unique identifiers.
This embodiment enables a simple data communication between the control unit and the user interface unit. In order to send a state transition message from the control unit to the user interface unit or vice versa, the user input capturing object is specified by the unique identifier and not by a long alphanumeric string. Therefore the message requires only little storage capacity and therefore only little transmission time, even over a data connection with a small bandwidth. Preferably these unique identifiers are automatically generated by the program generator or by the user interface generator.
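Purely as an illustration of this identifier-based messaging - the field layout and the helper names below are assumptions made for this description, not taken from the specification - a short C++ sketch:

```cpp
// Sketch with hypothetical names: a compact message that references a user input
// capturing object by its numeric object ID instead of by a long string.
#include <cstdint>

struct UiToControlMessage {
    std::uint16_t objectId;    // unique identifier shared by control program and UI spec
    std::uint8_t  eventCode;   // e.g. 1 = "the referenced object was activated by the user"
};

// Control-unit side: the received object ID selects the state transition it stands for.
void onUserInput(const UiToControlMessage& msg) {
    if (msg.eventCode != 1) return;            // only react to activation events
    switch (msg.objectId) {
        case 0x0101: /* trigger transition "open tailgate"  */ break;
        case 0x0102: /* trigger transition "close tailgate" */ break;
        default:     /* unknown object ID: ignore or log    */ break;
    }
}
```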
The user input capturing object can specify the kind of the user interaction, e.g. that the user presses a specific button of a device or touches a specific area of a touch screen belonging to the display device. The user input capturing object can also specify that the corresponding user input comprises a sequence of alphanumeric symbols, e.g. a password or a value which is measured or observed by a human operator. The user input capturing object can also specify a selection menu to be presented on the display device and for every option of this menu an assigned state transition or further action which the control unit will trigger if the user has selected this option.
In one embodiment the control unit and the user interface unit operate according to the standard ISObus (ISO 11783). The control program and the user interface specification are generated such that this goal, i.e. operating according to ISObus, is achieved. Thanks to the invention the constraints of the ISObus standard do not need to be considered while creating the state transition model.
According to the invention the user interface unit displays on the display device a user input capturing object. The feature of displaying the object on the display device comprises presenting text and/or symbols on a screen as well as optionally a speech output by means of a loudspeaker or a headset. Capturing a user input comprises the step that a user input into a touch screen or a keyboard or a selection with a mouse and a cursor is captured or a speech input into a microphone is captured.
In one embodiment an executable configuration specification for the vehicle is provided. This configuration specification specifies for at least one optional part whether the vehicle comprises or does not comprise this optional part. The program generator and the user interface generator additionally evaluate this configuration specification and generate a passage in the control program and a user input capturing object referring to this optional part only if the vehicle actually comprises this optional part. This embodiment reduces the risk that senseless user interactions are offered. A user interaction referring to a nonexisting vehicle part is senseless. The same state transition model can be used for different vehicles. Only the configuration specification for every vehicle must additionally be provided.
In one embodiment at least one automatic state transition of the vehicle is triggered automatically by the control unit, i.e. not as a reaction on a user input. This state transition is triggered if it is detected automatically that a given event has occurred. In one implementation the state transition model comprises this or every such automatic state transition and the corresponding event, i.e. the event which causes the automatic state transition to be triggered. The program generator generates the control program such that the control unit executing this control program triggers the automatic state transition as soon as the occurrence of the event is detected. This specification of the event can specify a measurable variable and a domain of values of this variable. The event is detected if a sensor automatically detects that the current value of this variable is in the given domain.
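The following short C++ sketch is illustrative only (the variable, domain, and helper names are assumptions): an automatic transition fires as soon as a measured variable enters the value domain which the model specifies for the event.

```cpp
// Hypothetical example: the model specifies an event as "measured variable lies in a
// given value domain"; the generated control program evaluates the sensor value and
// triggers the automatic state transition as soon as the event is detected.
struct ValueDomain {
    double lower;
    double upper;
};

bool eventOccurred(double measuredValue, const ValueDomain& d) {
    return measuredValue >= d.lower && measuredValue <= d.upper;
}

void checkBaleSize(double measuredBaleDiameterMetres) {
    const ValueDomain baleComplete{1.50, 10.0};   // assumed domain, e.g. target diameter reached
    if (eventOccurred(measuredBaleDiameterMetres, baleComplete)) {
        // no user input involved: the control unit triggers the transition itself,
        // e.g. triggerTransition("start_net_wrapping");   (hypothetical helper)
    }
}
```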
In one embodiment the user interface unit with the display device is mounted on board of the vehicle and a human operator on board of the vehicle performs the user inputs. In a further embodiment the human operator steers the vehicle in a remote manner. The display device is arranged outside of the vehicle and is connected with the control unit on board of the vehicle by means of a wireless or wired data connection.
In an implementation of this further embodiment the vehicle is remotely controlled. The user interface unit is a part of a stationary device or of a hand-held device to be carried by a human operator. Via a wireless data connection transmission signals are exchanged between the control unit mounted on board of the vehicle and the remote user interface unit.
In a third embodiment the vehicle to be controlled is pulled by a further vehicle. This further vehicle comprises a motor and is operated by a driver.
The user interface device is mounted on board of the further vehicle, e.g. in a cabin for the driver. The control unit is mounted on board of the vehicle to be controlled. A mechanical connection as well as a data connection is established between the vehicle to be controlled and the further vehicle.
The vehicle can be an agricultural implement and the further vehicle can be a tractor, a field chopper, or a combine harvester, for example. When being pulled, the implement picks up loose crop material from the ground and processes the picked-up material. The implement can be a loader wagon, a bale forming apparatus, a trailer for transporting bales, a fertilizing or seeding implement, or a mowing device, for example. The vehicle can also be a robot which is remotely controlled by a human operator, e.g. a milking robot for a barn or a robot used in a manufacturing site.
These and other aspects of the invention and of the preferred embodiment will be even more apparent from the detailed embodiment as described below and will be elucidated in detail there.
Detailed Description of Embodiment
In the embodiment the invention is used for controlling a bale forming apparatus (baler) serving as the vehicle to be controlled. This bale forming apparatus creates cylindrical or cuboid bales from agricultural loose material. A pick-up unit of the baler picks up the loose material from the ground. Every bale is formed in a pressing chamber under pressure. The bale is wrapped into a wrap (a net, a twine, or a foil). The wrapped bale is ejected out of the pressing chamber. For ejecting the wrapped bale, a tailgate of the baler is opened by pivoting the tailgate with respect to the baler’s frame.
In one embodiment a bale transfer unit transfers the ejected bale onto a wrapping table. The wrapping table rotates the bale. The rotated bale is wrapped into several webs of plastic sheet.
The baler is pulled by a motorized vehicle, e.g. by a tractor, a combine harvester, or by a field chopper. A human operator in a driver’s cabin of the pulling motorized vehicle steers the combination of the pulling vehicle and the baler serving as an implement or trailer.
The baler is mechanically coupled with the pulling vehicle by means of a towing unit belonging to the baler and a corresponding hook at the pulling vehicle. A PTO shaft of the pulling vehicle rotates the main drive shaft of the baler. Signals are exchanged between a control unit for the baler and a user interface unit on board of the pulling vehicle. This data communication is described below. The operator steers the baler from the driver’s cabin and also controls the baler.
During operation the baler is several times operated in a current elementary state and is afterwards transferred from this elementary state into a further elementary state. The current overall state of the baler is described by a combination of several current elementary states. An elementary state is a combination of a state variable and a value which this state variable currently has.
Examples for elementary states are: • tailgate closed / tailgate opened, • no crop material in the pressing chamber / bale in pressing chamber is increasing / bale has reached the required size / bale in pressing chamber is wrapped into a net, • feeding channel bottom is in normal position/feeding channel bottom has been lowered downwards, • cutting arrangement is in working position / in parking position, • drum of pick-up unit is rotated / is stopped due to high load, • overload clutch in the main drive shaft engaged/disengaged, • pick-up unit is raised up (transport position) / rests on the ground (operating position), • bale transfer unit is in a bale receiving position/ in a bale holding position/ in a bale transferring position, • bale on wrapping table is dropped as soon as possible / is to be dropped if a further bale has to be transferred on the wrapping table.
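To make the notion of an elementary state concrete, here is a small illustrative C++ sketch (the enumerators are examples chosen for this description, not an exhaustive or authoritative list): an elementary state is the pair of a state variable and its current value, and the overall state is the combination of several such pairs.

```cpp
// Illustrative only: a few state variables of the baler modelled as enumerations; the
// current overall state is the combination of the current values of all state variables.
enum class TailgateState { Closed, Opened };
enum class PickupState   { Raised, OnGround };
enum class ChamberState  { Empty, BaleGrowing, BaleComplete, BaleWrapped };

struct BalerOverallState {
    TailgateState tailgate = TailgateState::Closed;
    PickupState   pickup   = PickupState::Raised;
    ChamberState  chamber  = ChamberState::Empty;
};
```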
On board of the baler several sensors are mounted. Every sensor can measure an assigned elementary state, e.g. the state of the pick-up unit, the state of the feeding channel bottom, or the state of the tailgate or of the wrapping table. Or a sensor measures the diameter or weight of a bale in the baling chamber or checks if the bale is properly wrapped or measures the current inclination of the baler with respect to a horizontal plane or measures a torque consumption of a shaft.
Several actuators can change an elementary state, e.g. • by pivoting the feeding channel bottom, • by opening or closing the tailgate, • by starting the process of wrapping the bale in the pressing chamber into a net or • by engaging/disengaging an overload clutch.
Such an actuator can comprise a hydraulic or pneumatic cylinder or an electrical motor.
The baler comprises a programmable control unit arranged on board of the baler. In the embodiment the baler control unit comprises • a central processing unit (CPU), • a program data storing means, • a control program (a software program) stored in the program data storing means, and • a gateway and physical interfaces to a data connection network.
This central processing unit can execute a control program (a software program). Preferably this executable control program is generated by compiling and linking a source code program. During run time the result of the compilation and linking is executed, not the source code program. Only the executable program is stored on the storing means but not necessarily the source program. It is also possible that the control unit interprets the control program during run time.
In the embodiment the executed control program is generated in an object-oriented language, e.g. in C++ or Java. While the control unit executes the compiled object-oriented control program or interprets it, i.e. during run-time, several instances (objects) of classes in the sense of the object-oriented paradigm are generated. These instances (objects) generate communication messages to other objects by means of so-called methods.
While the control unit executes the control program, the control unit processes signals from sensors and generates control inputs for actuators. In particular by sending signals to baler actuators the control unit can trigger a state transition. By such a state transition the baler is transferred from one elementary state into a further elementary state. Every state transition changes the current overall state of the baler.
The control unit is connected with sensors and actuators by means of a data communication network, e.g. via a CAN bus, see below.
The baler further comprises a user interface unit. In the embodiment this user interface unit is used by a driver (operator) of the tractor which pulls the baler.
In one implementation the user interface unit comprises a user terminal with a screen and a keyboard. The user terminal can temporarily be mounted in the driver’s cabin. A flexible electric cable connects this user terminal with the baler control unit.
In a further embodiment a wired or wireless data connection between the baler’s control unit and the control unit of the pulling tractor is established. The tractor’s control unit controls a display device which is permanently mounted in the driver’s cabin. The two control units exchange messages via this data connection. In yet a further implementation the operator uses a mobile data processing device, e.g. a smartphone or a portable computer, which is connected with the baler’s control unit.
On the one hand this user interface unit is adapted for displaying alerts and further messages to the human operator (user) of the baler. On the other hand this user interface unit displays user input capturing objects and captures user inputs from the operator. A user input capturing object enables the user to perform a specific user input and to demand a specific state transition of the baler. Therefore every user input capturing object refers to a state transition. Examples for user input capturing objects are different buttons, joysticks and further tangible actuating means, soft-keys for a touch screen or several input fields in an electronic form which is displayed on a display device. Different soft keys serve as different user input capturing objects. In contrast an alert is a message which is just given to the attention of the operator.
The control unit executing the control program triggers the operations of the user interface unit. In addition the control unit processes user inputs captured by the user interface unit.
The user interface unit comprises: • a display device, • a user input capturing device, • a data storing means in which the user interaction specification is stored, and • a user interface control device.
The display device is adapted for displaying visual and/or acoustical messages to the human operator. The user input capturing device is adapted for capturing inputs from the operator. The user interface control device is adapted for controlling the display device and the user input capturing device and to capture and to communicate signals to other control units, in particular to the baler’s control unit.
In one embodiment a touch screen can display alerts as well as so-called soft keys for user inputs. The operator can touch the touch screen at an area of this screen showing a soft key. This touch screen serves as a display device and as a user input capturing device. The touch screen detects automatically the event that the user has touched a specific area of the touch screen. This area is a user input capturing object referring to a state transition.
The operator can further enter a string of alphanumeric characters via a keyboard, e.g. for entering a password or entering a numerical value (a desired bale diameter or a desired pivoting angle of the wrapping table, e.g.). This numerical value is, e.g., a desired value of a parameter. The user can select an option out of a displayed list or can also press buttons of the terminal to initiate a state transition. A button may refer to a user input capturing object displayed on the screen.
In one embodiment the user input capturing device comprises a microphone and a speech processing unit. On the display device a word is displayed. Or a loudspeaker outputs a word. If the user speaks this word into the microphone, the corresponding state transition is triggered. This embodiment enables the operator to trigger a state transition and to use his/her hands at the same time for a further operation, e.g. for steering the vehicle.
In one embodiment several rectangles or other icons are displayed on the touch screen of the display device. The control unit has triggered the user interface unit to display these icons. Every icon displayed on the touch screen shows an alert or a further message to the operator (user) or enables the operator to trigger a state transition.
The operator (user) touches an icon displayed on the touch screen. Preferably the touched icon is displayed in an alternative way after being touched to show the user that the user interface unit has registered the user input and that a state transition is initiated as a reaction to touching the icon. In one example the icon is displayed in an alternative color or with an alternative size or brightness.
Preferably every icon comprises a text or symbol for describing the alert/state transition. Every displayed icon is part of a user input capturing object or of an alert displaying object. Touching an icon for a state transition triggers the step that the user interface unit generates a corresponding message to the control unit and the control unit triggers the corresponding state transition.
In the embodiment the user interface control device and the baler’s control unit are connected via a data communication network, e.g. via a CAN bus, and exchange messages on the network, e.g. as CAN messages. Several sensors and actuators of the baler are also connected with this CAN bus and deliver messages to or receive messages from this CAN bus. Examples for messages are: • The tractor’s or the baler’s control unit sends wake-up signals via the CAN bus to the connected sensors and actuators. • The baler’s control unit triggers the user interface control device to display a soft key or an alert to the operator on the display device. • The user interface control device sends a message comprising a user input to the baler’s control unit. • The control unit sends a request to a sensor to measure a parameter of the baler and to deliver the measured parameter. • A sensor delivers a measured value to the baler’s control unit. • The baler’s control unit triggers a cylinder arrangement to open the tailgate.
In the embodiment the baler control unit and the user interface control device operate according to the standard ISObus, also called ISO 11783 (http://de.wikipedia.org/wiki/Isobus). The display device serves as a so-called virtual terminal (VT, http://de.wikipedia.org/wiki/ISOBUS#Virtual_Terminal) according to the ISObus terminology. A so-called “object pool” specifies every possible user input capturing object which the user interface unit can display. Every user input capturing object can also be denoted as a GUI object (GUI = Graphical User Interface). A specific button which the operator can actuate, a soft key connected with a state transition (a change of the current baler’s state), and an alert to the user are examples for user input capturing objects. The object pool specifies for every possible user input capturing object its own object ID (unique identifier) which distinguishes this user interaction object from all other user interaction objects specified in the object pool. Optionally an acoustic signal is also associated with the user input capturing object.
In the embodiment the object pool further specifies graphical information for a user input capturing object, e.g. the size, position, and color of this object on a screen of the display device and a human-readable text or symbol describing the state transition which the user can trigger with a user input referring to this user input capturing object.
In one embodiment the default way how the user interface unit displays a user input capturing object is to show a rectangle on the display device. The object pool specifies the position, size, color, and explaining text or symbol for this rectangle. It is possible that in addition a set of other icons for the user input capturing objects is provided, e.g. as bitmaps. Every provided icon refers to a user input capturing object of the object pool. If this user input capturing object is to be displayed, the corresponding icon is shown on the display device. Preferably every such icon is assigned to an object ID. This set of icons is not part of the object pool and therefore remains unchanged if a new object pool is automatically generated (see below).
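As a purely illustrative sketch (the field names and layout are assumptions and do not reproduce the ISObus object pool format), an entry of such an object pool could carry the following data, expressed here in C++ terms:

```cpp
// Hypothetical structure of one object pool entry: a unique identifier plus purely
// graphical information; no control logic and no state transition data are stored here.
#include <cstdint>
#include <string>

struct ObjectPoolEntry {
    std::uint16_t objectId;   // unique identifier referenced by control unit messages
    int x = 0, y = 0;         // position of the soft key / rectangle on the screen
    int width = 0, height = 0;
    std::uint8_t colour = 0;  // colour index
    std::string  label;       // human-readable text describing the state transition
};
```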
This object pool is automatically generated and serves as a part of the executable user interface specification according to the claims. The object pool is stored in the specification data storing means on board of the vehicle, preferably as a binary file.
As just mentioned the object pool specifies for every user input capturing object a unique object ID and several items of displaying information, e.g. the location and/or the size of the user input capturing object on the display device. Every message transmitted from the control unit to the user interface unit or vice versa comprises the object ID of a user input capturing object. Therefore the control program does not need to process or provide any information about how to display a user input capturing object. The object pool does not comprise any data about states and state transitions. A complete separation of the control logic from the user interaction is realized.
In order to trigger the user interface unit to display a user input capturing object, the control unit transmits a message with an object ID to the user interface control device. The user interface control device retrieves from the object pool the information required for displaying the user input capturing object.
In the embodiment the object pool is generated as an XML file or a set of XML files. XML (http://en.wikipedia.org/wiki/XML) means “Extensible Markup Language” and is a standardized markup language for storing hierarchically structured information according to a standardized syntax. Different programs are available for evaluating XML files. In the embodiment the user interface control device executes a binary object pool which is automatically generated from an XML file specifying the object pool. The XML file is also automatically generated.
In the embodiment a program generating phase and a subsequent runtime phase are performed.
In the program generating phase the state transition model (see below) is created and the control program and the user interface specification are automatically generated. A generating computer which is not necessarily part of the vehicle (baler) automatically generates the control program and the user interface specification. The generating computer can be a stationary central computer or a mobile computer which is used on the baler during a testing phase.
Preferably the generating computer automatically generates the unique identifiers (object IDs) for the user input capturing objects, e.g. by using a counter which is incremented by 1 for every new object. The generated control program and the generated user interface specification are transmitted to a data storing means on board of the baler such that the control unit has access to the stored control program and the user interface control device has access to the stored user interface specification.
In one implementation the generating computer also compiles the generated source code for the control program and the generated XML file for the object pool.
In the runtime phase the generating computer is not used. The control unit executes the automatically generated control program and triggers state transitions according to user inputs. The user interface control device evaluates the user interaction and controls the display device.
One basic idea of the invention is that the user interface specification (the object pool) and the executable control program for the control unit are generated from the same state transition model. This state transition model is created by a human vehicle engineer - or a team of vehicle engineers - on a computer. A program generator automatically evaluates the state transition model and generates the control program. A user interface generator automatically evaluates the same state transition model and generates the user interface specification. Both generators are implemented as software programs running on the generating computer. In one embodiment the same program serves as the program generator as well as the user interface generator.
The state transition model represents every possible state transition of the baler which can be triggered by a user input. The state transition model further represents every user input capturing object for a possible user input triggering a state transition. In the embodiment the state transition model further represents every alert displaying object.
In the embodiment the state transition model has the form of at least one directed graph with nodes and directed edges (arrows) between these nodes. Every node represents either an elementary state, a user input capturing object, or an alert displaying object (three kinds of nodes). A directed edge starts in a node of one kind and ends in a node of the same kind or of another kind.
An example for a part of this directed graph is S1 → UI1 → S2 (three nodes, two edges). The nodes S1 and S2 represent two elementary states, UI1 represents a user input capturing object connected with a possible user input and therefore a possible user interaction. If the baler is in the state S1 and the user performs the user input assigned to UI1, the control unit triggers a state transition from S1 to S2. A further example is UI1 → UI2 wherein UI1, UI2 are two user input capturing objects. A user input assigned to UI1 enables the user to make the additional input assigned to UI2. A further example is S1 → A1 wherein A1 is an alert displaying object. If the state S1 is detected by a sensor, e.g., the alert displaying object A1 is displayed.
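Illustratively - the data structures and names below are assumptions made for this description, not the representation used by any particular UML tool - such a graph can be held in memory as follows, shown for the example S1 → UI1 → S2 from the text:

```cpp
// Sketch under assumptions: the state transition model as a directed graph whose nodes
// are elementary states, user input capturing objects, or alert displaying objects.
#include <string>
#include <vector>

enum class NodeKind { ElementaryState, UserInputObject, AlertObject };

struct Node {
    std::string id;        // e.g. "S1", "UI1", "A1"
    NodeKind    kind;
};

struct Edge {              // directed edge: from -> to
    std::string from;
    std::string to;
};

struct StateChart {
    std::vector<Node> nodes;
    std::vector<Edge> edges;
};

// The example S1 -> UI1 -> S2, built explicitly:
StateChart example() {
    return StateChart{
        { {"S1",  NodeKind::ElementaryState},
          {"UI1", NodeKind::UserInputObject},
          {"S2",  NodeKind::ElementaryState} },
        { {"S1", "UI1"}, {"UI1", "S2"} }
    };
}
```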
In the embodiment some properties can be assigned to the nodes and to the arrows. A node for an elementary state can comprise an identifier for a state variable. This state variable is measured by a sensor or set by an actuator. The identifier for the state variable will occur in the source code of the generated control program. The node further specifies a possible value or a value domain of this state variable. If the current value of the variable is equal to the specified value or falls into the specified domain, this state occurs.
The node can further specify an initial value - that is a value which this state variable takes when the baler is started or a default value if the current value is not yet measured.
In an alternative embodiment several nodes of the state transition model are marked as initial state nodes. As soon as the operation of the baler starts, the elementary states referring to these nodes marked as initial state nodes occur simultaneously. The control program triggers the user interface unit to display at least one user input capturing object for a user input which causes a transition of the baler from an initial elementary state into a further state. A node for a user input capturing object can specify possible user inputs, e.g. pressing a button or a touch screen, and displaying information, e.g. the shape and color on the screen and an explaining string.
An arrow UI1 → S1 can specify which possible user input assigned to UI1 makes the control unit set the baler into the elementary state S1.
An arrow S1 → A1 or the node A1 can specify how the alert displaying object A1 is displayed, e.g. an explaining message and a specific color.
The display device is in general too small for displaying all user input capturing objects simultaneously. It is therefore possible that the state transition model comprises several state transition partial models. In one embodiment every partial model comprises those user input capturing objects which are displayed simultaneously at one time on the display device and comprises the corresponding states and state transitions. In the embodiment the partial model further comprises at least one model change node assigned to a further partial model and specifying a user input. The user can make a corresponding user input and demand that a further partial model is activated and further user input capturing objects are displayed. As a reaction the user input capturing objects of the further partial model are displayed.
In the embodiment the state transition model is created as a state chart diagram using the Unified Modeling Language (UML, http://www.omg.org/spec/UML/). A user builds up this state chart step by step and adds further nodes and edges if required. The nodes and edges are displayed as blocks and arrows with annotations on a display device of the generating computer. Several tools for generating a UML state chart are available, among them the software tool Dia (https://projects.gnome.org/dia/).
In the embodiment a software tool automatically parses the UML state chart. The parsing tool checks the state chart for syntactical correctness and formal completeness. According to one implementation the software tool Dia2Code (http://Dia2Code.sourceforge.net/) parses the UML state chart and uses the transformation language XSLT (http://en.wikipedia.org/wiki/XSLT). In this embodiment the parsing tool belongs to the program generator as well as to the user interface generator.
On the one hand the parsing tool generates the user interface specification in the form of an XML file which specifies the object pool and in particular the user input capturing objects. An ISObus design tool evaluates this XML file and further data to generate the binary object pool. This binary object pool is executed by the user interface control device during run time.
On the other hand the parsing tool generates the source code for the control program. This source code is preferably compiled and the compilation result is evaluated by the control unit. Or the source code is interpreted.
Reference signs used in the claims do not limit the scope of the claimed invention. The term “comprises” does not exclude other elements or steps. The articles “a”, “an”, and “one” do not exclude a plurality of elements.
Features specified in several dependent claims may be combined in an advantageous manner.
Claims (11)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2012639A NL2012639B1 (en) | 2014-04-16 | 2014-04-16 | Method and Arrangement for Controlling a Vehicle. |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2012639A NL2012639B1 (en) | 2014-04-16 | 2014-04-16 | Method and Arrangement for Controlling a Vehicle. |
Publications (2)
Publication Number | Publication Date |
---|---|
NL2012639A (en) | 2016-02-03 |
NL2012639B1 (en) | 2016-06-27 |
Family
ID=51398793
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2012639A NL2012639B1 (en) | 2014-04-16 | 2014-04-16 | Method and Arrangement for Controlling a Vehicle. |
Country Status (1)
Country | Link |
---|---|
NL (1) | NL2012639B1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19624027A1 (en) * | 1996-06-17 | 1997-12-18 | Claas Ohg | Mobile on-board computer system with control units for work machines |
GB0507929D0 (en) * | 2005-04-20 | 2005-06-01 | Cnh Belgium Nv | Agricultural vehicle with reconfigurable control |
BE1018941A3 (en) * | 2009-09-30 | 2011-11-08 | Cnh Belgium Nv | A RECTANGULAR BALL PRESS WITH A CONTROL UNIT. |
US8677724B2 (en) * | 2010-10-25 | 2014-03-25 | Deere & Company | Round baler for baling crop residue |
CN103389860A (en) * | 2012-05-07 | 2013-11-13 | 观致汽车有限公司 | Interactive system and interactive method thereof |
- 2014-04-16: NL application NL2012639A filed, granted as NL2012639B1 (en); status: not active (IP right cessation)
Also Published As
Publication number | Publication date |
---|---|
NL2012639A (en) | 2016-02-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2017-07-12 | HC | Change of name(s) of proprietor(s) | Owner name: FORAGE COMPANY B.V.; NL. Details assignment: change of owner(s), change of owner(s) name; former owner name: LELY FORAGE INNOVATIONS B.V. |
2017-07-12 | PD | Change of ownership | Owner name: LELY FORAGE INNOVATIONS B.V.; NL. Details assignment: change of owner(s), merge, demerger; former owner name: FORAGE INNOVATIONS B.V. |
2022-05-01 | MM | Lapsed because of non-payment of the annual fee | |