NL2012639B1 - Method and Arrangement for Controlling a Vehicle

Info

Publication number
NL2012639B1
Authority
NL
Netherlands
Prior art keywords
user interface
user input
vehicle
control unit
state
Application number
NL2012639A
Other languages
Dutch (nl)
Other versions
NL2012639A (en)
Inventor
Fromm Kay
Original Assignee
Forage Innovations Bv
Application filed by Forage Innovations Bv filed Critical Forage Innovations Bv
Priority to NL2012639A priority Critical patent/NL2012639B1/en
Publication of NL2012639A publication Critical patent/NL2012639A/en
Application granted granted Critical
Publication of NL2012639B1 publication Critical patent/NL2012639B1/en

Classifications

    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention
    • A01F15/07: Rotobalers, i.e. machines for forming cylindrical bales by winding and pressing
    • B60K35/10
    • B60W50/082: Selecting or switching between different modes of propelling
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • B60K2360/1438
    • B60W2050/143: Alarm means
    • B60W2050/146: Display means
    • B60W2540/215: Selection or confirmation of options
    • B60Y2200/22: Agricultural vehicles

Abstract

A vehicle is transferred from a first state into a second state. This state transition is triggered by a control unit as a reaction to a user input. The control unit executes a control program. The user input is captured by a user interface unit executing a user interface specification. A program generator generates the control program from a computer-executable state transition model. A user interface generator generates the user interface specification from the same state transition model. This state transition model specifies all possible state transitions and the corresponding user input capturing objects.

Description

Method and Arrangement for Controlling a Vehicle Field of the Invention
The invention relates to a method and an arrangement for controlling a vehicle wherein a control unit triggers the vehicle to perform at least one state transition. This state transition transfers the vehicle from one state to a further state, e.g. it moves a pivotal part of the vehicle from a parking position into an operating position or switches a device on board of the vehicle on or off.
The invention can in particular be used in a bale forming apparatus and a bale forming method for forming a cylindrical or cuboid bale from loose material, in particular from agricultural crop material.
Background of the Invention
During its operation a vehicle performs several operations. Examples of such operations are: a mechanical part of the vehicle is pivoted, shifted, or otherwise moved; a part is locked such that it cannot be moved any more and is later released again; an electric or hydraulic drive for a part of the vehicle is switched on or off. In the case of an apparatus which creates bales from loose agricultural material, two possible operations are opening a tailgate of the bale forming chamber and ejecting a completed bale out of the bale forming chamber. A control unit triggers all these operations, e.g. by triggering mechanical, electrical, or hydraulic actuators. Every such operation can be denoted as a state transition of the vehicle. In many cases such a state transition is initiated by the input of a human operator, e.g. the driver of the vehicle. The operator makes an input into a user interface, and this user input triggers a state transition.
Summary of the Invention
An object of the invention is to provide a method and an arrangement for controlling a vehicle by means of a control arrangement comprising a control unit and a user interface unit, wherein the vehicle can be operated in different states, wherein a user input on a user input device of the user interface unit initiates a state transition from one state to a further state, wherein a control program running on the control unit triggers the initiated state transition and wherein the operations of the control program should be consistent with the operations of the user interface unit.
The problem is solved by a vehicle controlling method with the features of claim 1 and by a vehicle controlling arrangement with the features of claim 7. Preferred embodiments are specified in the dependent claims.
The control method and the control arrangement according to the invention control a vehicle. This vehicle can be operated in several different states. The method is performed by using the control arrangement. This control arrangement comprises a control unit and a user interface unit with a display device and with a user input device. The control unit of the arrangement can trigger at least one state transition of the vehicle. Before this state transition the vehicle has been operated in a first state and as a result of this state transition the vehicle is operated in a second state.
The user interface unit can display at least one user input capturing object on the display device. The user interface unit can further capture a user input referring to this user input capturing object. For doing so the user interface unit comprises the user input device.
During the operation of the vehicle the following state transition sequence is automatically performed at least once:
• The control unit automatically triggers the user interface unit to display one user input capturing object.
• The user interface unit automatically displays the user input capturing object on the display device. This displaying step is a reaction to being triggered by the control unit. The user interface unit displays the user input capturing object according to an executable user interface specification.
• The user interface unit automatically captures a user input which has been made on the user input device and which refers to the displayed user input capturing object. This user input stems from a human operator of the vehicle.
• The control unit automatically processes this user input referring to the user input capturing object.
• As a reaction to the captured and processed user input referring to the user input capturing object, the control unit automatically triggers the state transition from the first state to the second state.
Before this state transition sequence was performed, the vehicle was operated in the first state. Afterwards the vehicle is operated in the second state.
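The following minimal Java sketch illustrates this sequence. All class and method names (ControlUnit, UserInterfaceUnit, display, captureUserInput) are illustrative assumptions; the patent does not prescribe any particular API.

```java
// Hypothetical sketch of the claimed runtime sequence; all names are illustrative.
import java.util.Map;

public class TransitionSequenceDemo {

    // A user input capturing object is referenced only by its unique identifier.
    record CapturingObject(int objectId, String label) {}

    interface UserInterfaceUnit {
        void display(CapturingObject o);          // show the object on the display device
        boolean captureUserInput(int objectId);   // true if the operator actuated it
    }

    static class ControlUnit {
        private String currentState = "FIRST_STATE";
        private final Map<Integer, String> transitionByObjectId =
                Map.of(1, "SECOND_STATE");        // assignment taken from the state transition model

        void runSequence(UserInterfaceUnit ui) {
            CapturingObject o = new CapturingObject(1, "Open tailgate");
            ui.display(o);                                     // control unit triggers the UI unit
            if (ui.captureUserInput(o.objectId())) {           // UI unit captures the user input
                currentState = transitionByObjectId.get(o.objectId()); // trigger the state transition
            }
            System.out.println("Vehicle is now in state: " + currentState);
        }
    }

    public static void main(String[] args) {
        UserInterfaceUnit consoleUi = new UserInterfaceUnit() {
            public void display(CapturingObject o) {
                System.out.println("Displaying soft key: " + o.label());
            }
            public boolean captureUserInput(int objectId) {
                return true; // pretend the operator touched the soft key
            }
        };
        new ControlUnit().runSequence(consoleUi);
    }
}
```

Running the sketch prints the displayed soft key and the resulting second state, mirroring the steps listed above.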
The control program executed by the control unit as well as the user interface specification for the user interface unit are automatically generated by processing the same given state transition model for operating the vehicle. This state transition model is provided and is stored in a model data storing means. This state transition model can automatically be executed by a computer. The model specifies
• every possible state transition between different possible states of the vehicle, provided that this state transition can be triggered by the control unit as a reaction to a user input,
• every possible user input capturing object referring to such a possible state transition, and
• for every possible user input capturing object an assignment to that possible state transition to which this user input capturing object refers.
A program generator automatically generates the control program by processing the state transition model. This generated control program is afterwards executed by the control unit of the arrangement. The control unit executing the control program
• can trigger every possible state transition according to the provided state transition model and
• can trigger the user interface unit to display every possible user input capturing object according to the state transition model.
A user interface generator automatically generates the executable user interface specification by processing the same state transition model. By processing the generated user interface specification the user interface unit can display every possible user input capturing object according to the user interface specification. This user input capturing object is displayed after the control unit executing the control program has triggered the user interface unit to display it.
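The core idea, one model feeding both generators, can be sketched as follows. The Transition record and the pseudo output emitted by generateControlProgram and generateUiSpecification are assumptions made only for illustration; the actual generators of the embodiment produce C++/Java source code and an ISObus object pool, as described further below.

```java
// Sketch: one state transition model drives both generators. All names are illustrative.
import java.util.List;

public class ModelDrivenGeneration {

    // The model specifies every user-triggerable transition and the capturing object assigned to it.
    record Transition(String fromState, String toState, int capturingObjectId, String objectLabel) {}

    record StateTransitionModel(List<Transition> transitions) {}

    // Program generator: emits (pseudo) control-program source from the model.
    static String generateControlProgram(StateTransitionModel model) {
        StringBuilder src = new StringBuilder("// generated control program\n");
        for (Transition t : model.transitions()) {
            src.append("onUserInput(").append(t.capturingObjectId()).append(") -> transition(")
               .append(t.fromState()).append(", ").append(t.toState()).append(");\n");
        }
        return src.toString();
    }

    // User interface generator: emits a (pseudo) UI specification from the same model.
    static String generateUiSpecification(StateTransitionModel model) {
        StringBuilder spec = new StringBuilder("# generated user interface specification\n");
        for (Transition t : model.transitions()) {
            spec.append("object ").append(t.capturingObjectId())
                .append(" label=\"").append(t.objectLabel()).append("\"\n");
        }
        return spec.toString();
    }

    public static void main(String[] args) {
        StateTransitionModel model = new StateTransitionModel(List.of(
                new Transition("TAILGATE_CLOSED", "TAILGATE_OPEN", 1, "Open tailgate"),
                new Transition("TAILGATE_OPEN", "TAILGATE_CLOSED", 2, "Close tailgate")));
        System.out.println(generateControlProgram(model));
        System.out.println(generateUiSpecification(model));
    }
}
```

Because both outputs are derived from the same list of transitions, every capturing object in the UI specification has a matching passage in the control program, which is exactly the consistency argument made in the next section.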
Advantages
The control program running on the control unit triggers the state transitions of the vehicle. The control unit and the executable user interface specification cause the user interface unit to display several user input capturing objects. According to the invention the control program and the user interface specification are automatically generated from the same state transition model. This state transition model specifies all possible state transitions which the user can demand by a user input and all user input capturing objects for demanding a state transition. As the same state transition model specifies all possible state transitions and all possible user input capturing objects, and as the same model is used for generating the control program as well as the user interface specification, all operations of the control unit are consistent with the operations of the user interface unit. For every possible state transition the corresponding user input capturing object is specified and will be displayed according to the user interface specification.
Thanks to the invention it is not necessary to manually adapt, amend, or change the control program or the user interface specification. An amendment, e.g. an adaptation to an additional state transition which is desired but not yet considered in the state transition model, is performed by adapting only the state transition model and triggering again the automatic generation of the control program and the user interface specification, this time by processing the amended state transition model. In other words: only one model has to be changed, not a program plus a specification.
The invention eliminates the risk that a change or an amendment makes the user interface unit operate inconsistently with the control unit running the control program. Such an inconsistent operation could cause an undesired event while operating the vehicle. This risk would arise if a human had to manually change the control program or the user interface specification.
The invention eliminates the need for a human expert to manually implement the control program or the user interface specification. Instead, the human expert creates the state transition model. This human expert can be a mechanical or electrical engineer knowing the required operations of the vehicle, e.g. an engineer for vehicle design and operation. This human expert utilizes vehicle design abilities and model building abilities but does not need to apply programming knowledge. According to the invention this state transition model is automatically processed. Programming knowledge is required for implementing the control method and arrangement according to the invention, but neither for generating the control program nor for generating the user interface specification.
No human interaction is required for generating the control program and the user interface specification from the same state transition model. In particular no human needs to inspect the automatically generated source code. Therefore it is easy to repeat this generation step for a sequence of different versions or a set of alternative variations of the state transition model and to evaluate and compare the results on the vehicle.
Thanks to the invention the control program is clearly separated from the user interface specification. The generated control program does not comprise statements for influencing the graphical user interface. In turn the generated user interface specification does not comprise any control logic. The user interface specification specifies the data which the user interface unit needs for displaying the user input capturing object. The control program does not need to comprise or to specify graphical information for displaying the user input capturing object. On the other hand the user interface specification does not need to comprise any data about states and state transitions.
Preferred Embodiments
Preferably the vehicle can be operated in at least three different states. A first state transition transfers the vehicle from a first state into a second state. A second state transition transfers the vehicle from the second state into a third state. Two different user input capturing objects are assigned to these two state transitions. A first sequence is performed for executing the first state transition. A second sequence is performed for executing the second state transition. Every sequence comprises the steps that the user interface unit
• is triggered,
• displays a user input capturing object, and
• captures a corresponding user input
and that the control unit triggers the corresponding state transition of the vehicle.
The provided state transition model specifies both state transitions and both user input capturing objects.
It is also possible that performing a first state transition transfers the vehicle from the first state into the second state and performing a second state transition transfers the vehicle from the second state back into the first state. One example: A part is first moved from a parking position into an operating position and afterwards moved back into the parking position. Two different user input capturing objects are assigned to these two state transitions. The provided state transition model specifies both state transitions and both user input capturing objects.
In one embodiment the control unit does not only trigger state transitions but also generates alerts to a human operator. Such an alert is automatically generated depending on the evaluation of at least one signal from a sensor. This sensor monitors the vehicle during operation. The alert is generated if the sensor detects that the vehicle is in a specific state. In one embodiment the sensor measures the current value of a time-varying variable and checks whether the measured current value is in a given domain. If this is the case, the alert is generated.
In this embodiment the state transition model additionally specifies the possible alerts to be generated and, for every possible alert, the state referring to this alert. The state transition model defines every possible state for which an alert has to be displayed, e.g. a state variable and a value or a domain of values which indicates that this state has occurred. The user interface generator generates an alert displaying unit for every defined alert. Preferably this alert displaying unit is part of the user interface specification. This embodiment ensures that suitable alerts are generated according to the states of the vehicle and that the generation of alerts is consistent with the state transitions triggered by the control unit.
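A hedged sketch of such an alert rule, assuming an illustrative variable name and threshold (neither is taken from the patent), could look like this:

```java
// Sketch of the alert rule described above: an alert is raised when a measured,
// time-varying value falls into a given domain. Names and thresholds are illustrative.
public class AlertRuleDemo {

    // The model would assign one such rule to each alert displaying object.
    record AlertRule(String stateVariable, double lowerBound, double upperBound, String alertText) {

        boolean matches(double measuredValue) {
            return measuredValue >= lowerBound && measuredValue <= upperBound;
        }
    }

    public static void main(String[] args) {
        AlertRule rule = new AlertRule("baleDiameterMeters", 1.5, Double.MAX_VALUE,
                "Bale has reached the required size");
        double sensorReading = 1.52; // pretend value delivered by a diameter sensor
        if (rule.matches(sensorReading)) {
            System.out.println("ALERT: " + rule.alertText());
        }
    }
}
```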
Preferably the control unit of the vehicle has access to a program data storing means being part of the vehicle. The user interface unit has access to a specification data storing means also being part of the vehicle. The control program generated by the program generator is stored in the program data storing means. The user interface specification generated by the user interface generator is stored in the specification data storing means.
In this embodiment the following parts can be arranged outside of the vehicle, e.g. in a remote stationary data processing unit or on a further vehicle:
• the model data storing means with the state transition model,
• the program generator, and
• the user interface generator.
This embodiment allows generating identical control programs and user interface specifications for different vehicles. The same state transition model is used for these vehicles.
Before operating the vehicle, the control program and the user interface specification are transmitted to the or every vehicle. During the operation of the vehicle no access to the control arrangement parts outside of the vehicle is required. This embodiment eliminates the need to establish a data connection between the vehicle and a central data processing unit outside the vehicle while the vehicle is operated. Such a data connection can be interrupted while operating the vehicle. Therefore it is advantageous not to rely on such a data connection during operation. In addition, less computational power and less storage capacity are required on board of the vehicle.
It is also possible that the model data storing means, an editor for amending the state transition model, the program generator, and the user interface generator are arranged on a mobile computer. This mobile computer can temporarily or permanently be connected with the control unit and with the user interface unit, e.g. by means of a data network of the vehicle. This embodiment enables an engineer to perform a testing operation on board of the vehicle wherein the following sequence is performed at least once:
• The state transition model is amended, e.g. as a reaction to an undesired behavior of the vehicle detected while testing the vehicle.
• The program generator generates the control program by processing the amended state transition model.
• The user interface generator generates the user interface specification by processing the same amended state transition model.
• The vehicle is operated according to this control program and this user interface specification.
Preferably the program generator generates source code for the control program. In one embodiment the control unit interprets this source code. In a further embodiment a compiler generates an executable control program from the generated source code. The compiler is a part of the program generator. The control unit executes this compiled control program. Only the compiled and executable control program needs to be stored on a data storing means to which the control unit has access. During runtime the control unit on board of the vehicle needs access to the executable control program but neither to the source code nor to the compiler.
In one embodiment the user interface specification comprises a unique identifier for every user input capturing object. This unique identifier distinguishes this user input capturing object from every other user input capturing object specified in the user interface specification. A passage of the control program triggers the state transition which can be caused by a user input referring to this user input capturing object. This passage comprises the unique identifier for this user input capturing object. A message from the user interface unit to the control unit about the user input also comprises this unique identifier. The user interface specification also comprises these unique identifiers.
This embodiment enables a simple data communication between the control unit and the user interface unit. In order to send a state transition message from the control unit to the user interface unit or vice versa, the user input capturing object is specified by the unique identifier and not by a long alphanumeric string. Therefore the message requires only little storage capacity and can be transmitted quickly, even over a data connection with a small bandwidth. Preferably these unique identifiers are automatically generated by the program generator or by the user interface generator.
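The following sketch shows how compact such identifier-based messages can be. The three-byte layout and the message type constants are assumptions made for illustration only, not a protocol defined by the patent or by ISO 11783.

```java
// Sketch of identifier-based messaging: messages between the control unit and the
// user interface unit carry only a short object ID, not the object's display data.
import java.nio.ByteBuffer;

public class ObjectIdMessageDemo {

    static final byte MSG_DISPLAY_OBJECT = 0x01;  // control unit -> UI unit
    static final byte MSG_USER_INPUT     = 0x02;  // UI unit -> control unit

    // Encode a message as a few bytes: one type byte plus a 16-bit object ID.
    static byte[] encode(byte messageType, int objectId) {
        return ByteBuffer.allocate(3).put(messageType).putShort((short) objectId).array();
    }

    static void decode(byte[] message) {
        ByteBuffer buf = ByteBuffer.wrap(message);
        byte type = buf.get();
        int objectId = Short.toUnsignedInt(buf.getShort());
        System.out.println("type=" + type + " objectId=" + objectId);
    }

    public static void main(String[] args) {
        byte[] request = encode(MSG_DISPLAY_OBJECT, 1001); // "display capturing object 1001"
        byte[] reply   = encode(MSG_USER_INPUT, 1001);     // "user actuated object 1001"
        decode(request);
        decode(reply);
    }
}
```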
The user input capturing object can specify the kind of the user interaction, e.g. that the user presses a specific button of a device or touches a specific area of a touch screen belonging to the display device. The user input capturing object can also specify that the corresponding user input comprises a sequence of alphanumeric symbols, e.g. a password or a value which is measured or observed by a human operator. The user input capturing object can also specify a selection menu to be presented on the display device and for every option of this menu an assigned state transition or further action which the control unit will trigger if the user has selected this option.
In one embodiment the control unit and the user interface unit operate according to the standard ISObus (ISO 11783). The control program and the user interface specification are generated such that this goal, i.e. operating according to ISObus, is achieved. Thanks to the invention the constraints of the ISObus standard do not need to be considered while creating the state transition model.
According to the invention the user interface unit displays a user input capturing object on the display device. The feature of displaying the object on the display device comprises presenting text and/or symbols on a screen as well as, optionally, a speech output by means of a loudspeaker or a headset. Capturing a user input comprises capturing an input made on a touch screen or a keyboard, a selection made with a mouse and cursor, or a speech input into a microphone.
In one embodiment an executable configuration specification for the vehicle is provided. This configuration specification specifies for at least one optional part whether the vehicle comprises this optional part or not. The program generator and the user interface generator additionally evaluate this configuration specification and generate a passage in the control program and a user input capturing object referring to this optional part only if the vehicle actually comprises this optional part. This embodiment reduces the risk that senseless user interactions are offered; a user interaction referring to a non-existing vehicle part is senseless. The same state transition model can be used for different vehicles. Only the configuration specification for every vehicle must additionally be provided.
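A minimal sketch of this configuration-dependent generation, assuming a hypothetical optional "wrappingTable" part, might look as follows:

```java
// Sketch of configuration-aware generation: a capturing object referring to an optional
// part is only generated if the configuration specification lists that part as present.
// All names are illustrative.
import java.util.List;
import java.util.Set;

public class ConfigurationFilterDemo {

    record ModelEntry(String requiredOptionalPart, String objectLabel) {}

    static List<String> generateObjects(List<ModelEntry> model, Set<String> installedParts) {
        return model.stream()
                .filter(e -> e.requiredOptionalPart() == null
                          || installedParts.contains(e.requiredOptionalPart()))
                .map(ModelEntry::objectLabel)
                .toList();
    }

    public static void main(String[] args) {
        List<ModelEntry> model = List.of(
                new ModelEntry(null, "Open tailgate"),                  // always present
                new ModelEntry("wrappingTable", "Drop bale from table") // only with wrapping table
        );
        // A vehicle without a wrapping table gets only one object.
        System.out.println(generateObjects(model, Set.of()));
        // A vehicle with a wrapping table gets both objects.
        System.out.println(generateObjects(model, Set.of("wrappingTable")));
    }
}
```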
In one embodiment at least one automatic state transition of the vehicle is triggered automatically by the control unit, i.e. not as a reaction to a user input. This state transition is triggered if it is automatically detected that a given event has occurred. In one implementation the state transition model comprises this or every such automatic state transition and the corresponding event, i.e. the event which causes the automatic state transition to be triggered. The program generator generates the control program such that the control unit executing this control program triggers the automatic state transition as soon as the occurrence of the event is detected. The specification of the event can specify a measurable variable and a domain of values of this variable. The event is detected if a sensor automatically detects that the current value of this variable is in the given domain.
In one embodiment the user interface unit with the display device is mounted on board of the vehicle and a human operator on board of the vehicle performs the user inputs. In a further embodiment the human operator steers the vehicle in a remote manner. The display device is arranged outside of the vehicle and is connected with the control unit on board of the vehicle by means of a wireless or wired data connection.
In an implementation of this further embodiment the vehicle is remotely controlled. The user interface unit is part of a stationary device or of a hand-held device to be carried by a human operator. Signals are exchanged via a wireless data connection between the control unit mounted on board of the vehicle and the remote user interface unit.
In a third embodiment the vehicle to be controlled is pulled by a further vehicle. This further vehicle comprises a motor and is operated by a driver.
The user interface device is mounted on board of the further vehicle, e.g. in a cabin for the driver. The control unit is mounted on board of the vehicle to be controlled. A mechanical connection as well as a data connection is established between the vehicle to be controlled and the further vehicle.
The vehicle can be an agricultural implement and the further vehicle can be, e.g., a tractor, a field chopper, or a combine harvester. When being pulled, the implement picks up loose crop material from the ground and processes the picked-up material. The implement can be, e.g., a loader wagon, a bale forming apparatus, a trailer for transporting bales, a fertilizing or seeding implement, or a mowing device. The vehicle can also be a robot which is remotely controlled by a human operator, e.g. a milking robot for a barn or a robot used in a manufacturing site.
These and other aspects of the invention and of the preferred embodiment will be even more apparent from the detailed embodiment as described below and will be elucidated in detail there.
Detailed Description of Embodiment
In the embodiment the invention is used for controlling a bale forming apparatus (baler) serving as the vehicle to be controlled. This bale forming apparatus creates cylindrical or cuboid bales from loose agricultural material. A pick-up unit of the baler picks up the loose material from the ground. Every bale is formed in a pressing chamber under pressure. The bale is wrapped into a wrap (a net, a twine, or a foil). The wrapped bale is ejected out of the pressing chamber. For ejecting the wrapped bale, a tailgate of the baler is opened by pivoting the tailgate with respect to the baler’s frame.
In one embodiment a bale transfer unit transfers the ejected bale onto a wrapping table. The wrapping table rotates the bale. The rotated bale is wrapped into several webs of plastic sheet.
The baler is pulled by a motorized vehicle, e.g. by a tractor, a combine harvester, or by a field chopper. A human operator in a driver’s cabin of the pulling motorized vehicle steers the combination of the pulling vehicle and the baler serving as an implement or trailer.
The baler is mechanically coupled with the pulling vehicle by means of a towing unit belonging to the baler and a corresponding hook at the pulling vehicle. A PTO shaft of the pulling vehicle rotates the main drive shaft of the baler. Signals are exchanged between a control unit for the baler and a user interface unit on board of the pulling vehicle. This data communication is described below. The operator steers the baler from the driver’s cabin and also controls the baler.
During operation the baler is several times operated in a current elementary state and is afterwards transferred from this elementary state into a further elementary state. The current overall state of the baler is described by a combination of several current elementary states. An elementary state is a combination of a state variable and a value which this state variable currently has.
Examples of elementary states are:
• tailgate closed / tailgate opened,
• no crop material in the pressing chamber / bale in the pressing chamber is increasing / bale has reached the required size / bale in the pressing chamber is wrapped into a net,
• feeding channel bottom is in normal position / feeding channel bottom has been lowered downwards,
• cutting arrangement is in working position / in parking position,
• drum of the pick-up unit is rotated / is stopped due to high load,
• overload clutch in the main drive shaft engaged / disengaged,
• pick-up unit is raised up (transport position) / rests on the ground (operating position),
• bale transfer unit is in a bale receiving position / in a bale holding position / in a bale transferring position,
• bale on the wrapping table is dropped as soon as possible / is to be dropped if a further bale has to be transferred onto the wrapping table.
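A simple sketch of this representation, with illustrative variable names, is an overall state held as a map from state variables to their current values:

```java
// Sketch of the state representation described above: an elementary state is a state
// variable together with its current value, and the overall state is the combination
// of all elementary states. Variable names and values are illustrative.
import java.util.LinkedHashMap;
import java.util.Map;

public class BalerStateDemo {

    public static void main(String[] args) {
        // Each entry is one elementary state: variable -> current value.
        Map<String, String> overallState = new LinkedHashMap<>();
        overallState.put("tailgate", "CLOSED");
        overallState.put("pressingChamber", "BALE_INCREASING");
        overallState.put("pickUpUnit", "OPERATING_POSITION");
        overallState.put("overloadClutch", "ENGAGED");

        // A state transition changes one elementary state and thereby the overall state.
        overallState.put("tailgate", "OPENED");

        overallState.forEach((variable, value) ->
                System.out.println(variable + " = " + value));
    }
}
```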
On board of the baler several sensors are mounted. Every sensor can measure an assigned elementary state, e.g. the state of the pick-up unit, the state of the feeding channel bottom, or the state of the tailgate or of the wrapping table. Or a sensor measures the diameter or weight of a bale in the baling chamber or checks if the bale is properly wrapped or measures the current inclination of the baler with respect to a horizontal plane or measures a torque consumption of a shaft.
Several actuators can change an elementary state, e.g.
• by pivoting the feeding channel bottom,
• by opening or closing the tailgate,
• by starting the process of wrapping the bale in the pressing chamber into a net, or
• by engaging/disengaging an overload clutch.
Such an actuator can comprise a hydraulic or pneumatic cylinder or an electrical motor.
The baler comprises a programmable control unit arranged on board of the baler. In the embodiment the baler control unit comprises
• a central processing unit (CPU),
• a program data storing means,
• a control program (a software program) stored in the program data storing means, and
• a gateway and physical interfaces to a data connection network.
This central processing unit can execute a control program (a software program). Preferably this executable control program is generated by compiling and linking a source code program. During run time the result of the compilation and linking is executed, not the source code program. Only the executable program is stored on the storing means but not necessarily the source program. It is also possible that the control unit interprets the control program during run time.
In the embodiment the executed control program is generated in an object-oriented language, e.g. in C++ or Java. While the control unit executes the compiled object-oriented control program or interprets it, i.e. during run time, several instances (objects) of classes in the sense of the object-oriented paradigm are generated. These instances (objects) generate communication messages to other objects by means of so-called methods.
While the control unit executes the control program, the control unit processes signals from sensors and generates control inputs for actuators. In particular by sending signals to baler actuators the control unit can trigger a state transition. By such a state transition the baler is transferred from one elementary state into a further elementary state. Every state transition changes the current overall state of the baler.
The control unit is connected with sensors and actuators by means of a data communication network, e.g. via a CAN bus, see below.
The baler further comprises a user interface unit. In the embodiment this user interface unit is used by a driver (operator) of the tractor which pulls the baler.
In one implementation the user interface unit comprises a user terminal with a screen and a keyboard. The user terminal can temporarily be mounted in the driver’s cabin. A flexible electric cable connects this user terminal with the baler control unit.
In a further embodiment a wired or wireless data connection between the baler’s control unit and the control unit of the pulling tractor is established. The tractor’s control unit controls a display device which is permanently mounted in the driver’s cabin. The two control units exchange messages via this data connection. In yet a further implementation the operator uses a mobile data processing device, e.g. a smartphone or a portable computer, which is connected with the baler’s control unit.
On the one hand this user interface unit is adapted for displaying alerts and further messages to the human operator (user) of the baler. On the other hand this user interface unit displays user input capturing objects and captures user inputs from the operator. A user input capturing object enables the user to perform a specific user input and to demand a specific state transition of the baler. Therefore every user input capturing object refers to a state transition. Examples for user input capturing objects are different buttons, joysticks and further tangible actuating means, soft-keys for a touch screen or several input fields in an electronic form which is displayed on a display device. Different soft keys serve as different user input capturing objects. In contrast an alert is a message which is just given to the attention of the operator.
The control unit executing the control program triggers the operations of the user interface unit. In addition the control unit processes user inputs captured by the user interface unit.
The user interface unit comprises:
• a display device,
• a user input capturing device,
• a data storing means in which the user interface specification is stored, and
• a user interface control device.
The display device is adapted for displaying visual and/or acoustical messages to the human operator. The user input capturing device is adapted for capturing inputs from the operator. The user interface control device is adapted for controlling the display device and the user input capturing device and for capturing and communicating signals to other control units, in particular to the baler’s control unit.
In one embodiment a touch screen can display alerts as well as so-called soft keys for user inputs. The operator can touch the touch screen at an area of this screen showing a soft key. This touch screen serves as a display device and as a user input capturing device. The touch screen detects automatically the event that the user has touched a specific area of the touch screen. This area is a user input capturing object referring to a state transition.
The operator can further enter a string with alphanumeric signs via a keyboard, e.g. for entering a password or a numerical value (e.g. a desired bale diameter or a desired pivoting angle of the wrapping table). This numerical value is, e.g., a desired value of a parameter. The user can select an option out of a displayed list or can also press buttons of the terminal to initiate a state transition. A button may refer to a user input capturing object displayed on the screen.
In one embodiment the user input capturing device comprises a microphone and a speech processing unit. On the display device a word is displayed. Or a loudspeaker outputs a word. If the user speaks this word into the microphone, the corresponding state transition is triggered. This embodiment enables the operator to trigger a state transition and to use his/her hands at the same time for a further operation, e.g. for steering the vehicle.
In one embodiment several rectangles or other icons are displayed on the touch screen of the display device. The control unit has triggered the user interface unit to display these icons. Every icon displayed on the touch screen shows an alert or a further message to the operator (user) or enables the operator to trigger a state transition.
The operator (user) touches an icon displayed on the touch screen. Preferably the touched icon is displayed in an alternative way after being touched, to show the user that the user interface unit has registered the user input and that a state transition is initiated as a reaction to touching the icon. In one example the icon is displayed in an alternative color or with an alternative size or brightness.
Preferably every icon comprises a text or symbol for describing the alert/state transition. Every displayed icon is part of a user input capturing object or of an alert displaying object. Touching an icon for a state transition triggers the step that the user interface unit generates a corresponding message to the control unit and the control unit triggers the corresponding state transition.
In the embodiment the user interface control device and the baler’s control unit are connected via a data communication network, e.g. via a CAN bus, and exchange messages on the network, e.g. as CAN messages. Several sensors and actuators of the baler are also connected with this CAN bus and deliver messages to or receive messages from this CAN bus. Examples of messages are:
• The tractor’s or the baler’s control unit sends wake-up signals via the CAN bus to the connected sensors and actuators.
• The baler’s control unit triggers the user interface control device to display a soft key or an alert to the operator on the display device.
• The user interface control device sends a message comprising a user input to the baler’s control unit.
• The control unit sends a request to a sensor to measure a parameter of the baler and to deliver the measured parameter.
• A sensor delivers a measured value to the baler’s control unit.
• The baler’s control unit triggers a cylinder arrangement to open the tailgate.
In the embodiment the baler control unit and the user interface control device operate according to the standard ISObus, also called ISO 11783 (http://de.wikipedia.org/wiki/Isobus). The display device serves as a so-called virtual terminal (VT, http://de.wikipedia.org/wiki/ISOBUS#Virtual_Terminal) according to the ISObus terminology. A so-called “object pool” specifies every possible user input capturing object which the user interface unit can display. Every user input capturing object can also be denoted as a GUI object (GUI = Graphical User Interface). A specific button which the operator can actuate, a soft key connected with a state transition (a change of the baler’s current state), and an alert to the user are examples of user input capturing objects. The object pool specifies for every possible user input capturing object its own object ID (unique identifier) which distinguishes this user interaction object from all other user interaction objects specified in the object pool. Optionally an acoustic signal is also associated with the user input capturing object.
In the embodiment the object pool further specifies graphical information for a user input capturing object, e.g. the size, position, and color of this object on a screen of the display device and a human-readable text or symbol describing the state transition which the user can trigger with a user input referring to this user input capturing object.
In one embodiment the default way in which the user interface unit displays a user input capturing object is to show a rectangle on the display device. The object pool specifies the position, size, color, and explaining text or symbol for this rectangle. It is possible that in addition a set of other icons for the user input capturing objects is provided, e.g. as bitmaps. Every provided icon refers to a user input capturing object of the object pool. If this user input capturing object is to be displayed, the corresponding icon is shown on the display device. Preferably every such icon is assigned to an object ID. This set of icons is not part of the object pool and therefore remains unchanged if a new object pool is automatically generated (see below).
This object pool is automatically generated and serves as a part of the executable user interface specification according to the claims. The object pool is stored in the specification data storing means on board of the vehicle, preferably as a binary file.
As just mentioned, the object pool specifies for every user input capturing object a unique object ID and display information, e.g. the location and/or the size of the user input capturing object on the display device. Every message transmitted from the control unit to the user interface unit or vice versa comprises the object ID of a user input capturing object. Therefore the control program does not need to process or provide any information on how to display a user input capturing object. The object pool does not comprise any data about states and state transitions. A complete separation of the control logic from the user interaction is realized.
In order to trigger the user interface unit to display a user input capturing object, the control unit transmits a message with an object ID to the user interface control device. The user interface control device retrieves from the object pool the information required for displaying the user input capturing object.
In the embodiment the object pool is generated as an XML file or a set of XML files. XML (http://en.wikipedia.org/wiki/XML) means “Extensible Markup Language” and is a standardized language for storing hierarchically structured information according to a standardized syntax. Different programs are available for evaluating XML files. In the embodiment the user interface control device executes a binary object pool which is automatically generated from an XML file specifying the object pool. The XML file is also automatically generated.
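As a rough illustration, the user interface generator could emit such an XML object pool as sketched below. The element and attribute names are invented for this sketch and do not follow the actual ISO 11783 object pool schema; the incrementing counter mirrors the identifier generation described further below.

```java
// Sketch only: emitting a simplified XML object-pool file from the model, with object IDs
// taken from an incrementing counter. The XML layout is illustrative, not the ISObus format.
import java.util.List;

public class ObjectPoolXmlDemo {

    record CapturingObject(String label, int x, int y, int width, int height) {}

    static String toXml(List<CapturingObject> objects) {
        StringBuilder xml = new StringBuilder("<objectPool>\n");
        int nextObjectId = 1; // unique identifiers generated by a simple counter
        for (CapturingObject o : objects) {
            xml.append("  <softKey id=\"").append(nextObjectId++)
               .append("\" label=\"").append(o.label())
               .append("\" x=\"").append(o.x()).append("\" y=\"").append(o.y())
               .append("\" width=\"").append(o.width())
               .append("\" height=\"").append(o.height()).append("\"/>\n");
        }
        return xml.append("</objectPool>\n").toString();
    }

    public static void main(String[] args) {
        System.out.println(toXml(List.of(
                new CapturingObject("Open tailgate", 10, 10, 80, 40),
                new CapturingObject("Start wrapping", 10, 60, 80, 40))));
    }
}
```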
In the embodiment a program generating phase and a subsequent runtime phase are performed.
In the program generating phase the state transition model (see below) is created and the control program and the user interface specification are automatically generated. A generating computer which is not necessarily part of the vehicle (baler) automatically generates the control program and the user interface specification. The generating computer can be a stationary central computer or a mobile computer which is used on the baler during a testing phase.
Preferably the generating computer automatically generates the unique identifiers (object IDs) for the user input capturing objects, e.g. by using a counter which is incremented by 1 for every new object. The generated control program and the generated user interface specification are transmitted to a data storing means on board of the baler such that the control unit has access to the stored control program and the user interface control device has access to the stored user interface specification.
In one implementation the generating computer also compiles the generated source code for the control program and the generated XML file for the object pool.
In the runtime phase the generating computer is not used. The control unit executes the automatically generated control program and triggers state transitions according to user inputs. The user interface control device evaluates the user interaction and controls the display device.
One basic idea of the invention is that the user interface specification (the object pool) and the executable control program for the control unit are generated from the same state transition model. This state transition model is created by a human vehicle engineer - or a team of vehicle engineers - on a computer. A program generator automatically evaluates the state transition model and generates the control program. A user interface generator automatically evaluates the same state transition model and generates the user interface specification. Both generators are implemented as software programs running on the generating computer. In one embodiment the same program serves as the program generator as well as the user interface generator.
The state transition model represents every possible state transition of the baler which can be triggered by a user input. The state transition model further represents every user input capturing object for a possible user input triggering a state transition. In the embodiment the state transition model further represents every alert displaying object.
In the embodiment the state transition model has the form of at least one directed graph with nodes and directed edges (arrows) between these nodes. Every node represents either an elementary state, a user input capturing object, or an alert displaying object (three kinds of nodes). A directed edge starts in a node of a first kind and ends in a node of the first kind or of a second kind.
An example for a part of this directed graph is S1 → UI1 → S2 (three nodes, two edges). The nodes S1 and S2 represent two elementary states, UI1 represents a user input capturing object connected with a possible user input and therefore a possible user interaction. If the baler is in the state S1 and the user performs the user input assigned to UI1, the control unit triggers a state transition from S1 to S2. A further example is UI1 → UI2 wherein UI1 and UI2 are two user input capturing objects. A user input assigned to UI1 enables the user to make the additional input assigned to UI2. A further example is S1 → A1 wherein A1 is an alert displaying object. If the state S1 is detected, e.g. by a sensor, the alert displaying object A1 is displayed.
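This tiny graph can be sketched in code as three node kinds connected by directed edges. The example below mirrors the S1 → UI1 → S2 and S1 → A1 edges from the text; all identifiers are illustrative.

```java
// Sketch of the directed-graph form of the state transition model: three node kinds
// (elementary state, user input capturing object, alert displaying object) connected
// by directed edges.
import java.util.List;

public class StateGraphDemo {

    enum NodeKind { STATE, USER_INPUT_OBJECT, ALERT_OBJECT }

    record Node(String name, NodeKind kind) {}
    record Edge(Node from, Node to) {}

    public static void main(String[] args) {
        Node s1  = new Node("S1",  NodeKind.STATE);
        Node s2  = new Node("S2",  NodeKind.STATE);
        Node ui1 = new Node("UI1", NodeKind.USER_INPUT_OBJECT);
        Node a1  = new Node("A1",  NodeKind.ALERT_OBJECT);

        List<Edge> edges = List.of(
                new Edge(s1, ui1),   // in state S1 the object UI1 is offered
                new Edge(ui1, s2),   // the user input assigned to UI1 leads to state S2
                new Edge(s1, a1));   // detecting state S1 displays the alert A1

        for (Edge e : edges) {
            System.out.println(e.from().name() + " -> " + e.to().name());
        }
    }
}
```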
In the embodiment some properties can be assigned to the nodes and to the arrows. A node for an elementary state can comprise an identifier for a state variable. This state variable is measured by a sensor or set by an actuator. The identifier for the state variable will occur in the source code of the generated control program. The node further specifies a possible value or a value domain of this state variable. If the current value of the variable is equal to the specified value or falls into the specified domain, this state occurs.
The node can further specify an initial value - that is a value which this state variable takes when the baler is started or a default value if the current value is not yet measured.
In an alternative embodiment several nodes of the state transition model are marked as initial state nodes. As soon as the operation of the baler starts, the elementary states referring to these nodes marked as initial state nodes occur simultaneously. The control program triggers the user interface unit to display at least one user input capturing object which allows a transition of the baler from an initial elementary state into a further state. A node for a user input capturing object can specify possible user inputs, e.g. pressing a button or touching a touch screen, and display information, e.g. the shape and color on the screen and an explaining string.
An arrow UI1 → S1 can specify which user input referring to UI1 will make the control unit set the baler into the elementary state S1.
An arrow S1 → A1 or the node A1 can specify how the alert displaying object A1 is displayed, e.g. an explaining message and a specific color.
The display device is in general too small for displaying all user input capturing objects simultaneously. It is therefore possible that the state transition model comprises several partial state transition models. In one embodiment every partial model comprises those user input capturing objects which are displayed simultaneously at one time on the display device and comprises the corresponding states and state transitions. In the embodiment the partial model further comprises at least one model change node which is assigned to a further partial model and which specifies a user input. The user can make a corresponding user input and demand that a further partial model is activated and further user input capturing objects are displayed. As a reaction the user input capturing objects of the further partial model are displayed.
In the embodiment the state transition model is created as a state chart diagram using the Unified Modeling Language (UML, http://www.omg.org/spec/UML/). A user builds up this state chart stepwise and adds further nodes and edges if required. The nodes and edges are displayed as blocks and arrows with annotations on a display device of the generating computer. Several tools for generating a UML state chart are available, among them the software tool Dia (https://projects.gnome.org/dia/).
In the embodiment a software tool automatically parses the UML state chart. The parsing tool checks the state chart for syntactical correctness and formal completeness. According to one implementation the software tool Dia2Code (http://Dia2Code.sourceforge.net/) parses the UML state chart and uses the transformation language XSLT (http://en.wikipedia.org/wiki/XSLT). In this embodiment the parsing tool belongs to the program generator as well as to the user interface generator.
On the one hand the parsing tool generates the user interface specification in the form of an XML file which specifies the object pool and in particular the user input capturing objects. An ISObus design tool evaluates this XML file and further data to generate the binary object pool. This binary object pool is executed by the user interface control device during run time.
On the other hand the parsing tool generates the source code for the control program. This source code is preferably compiled and the compilation result is evaluated by the control unit. Or the source code is interpreted.
Reference signs used in the claims do not limit the scope of the claimed invention. The term “comprises” does not exclude other elements or steps. The articles “a”, “an”, and “one” do not exclude a plurality of elements.
Features specified in several dependent claims may be combined in an advantageous manner.

Claims (11)

1. Werkwijze voor het besturen van een voertuig door middel van een besturingssamenstel, waarbij het voertuig - kan worden bediend in diverse verschillende toestanden en - ten minste één toestandsverandering kan uitvoeren, waarbij het besturingssamenstel voor het uitvoeren van de werkwijze omvat - een besturingseenheid en - een gebruikersinterface-eenheid met een weergave-inrichting en een gebruikersinvoer-inrichting, waarbij de werkwijze de stappen omvat dat de besturingseenheid automatisch een besturingsprogramma uitvoert, het uitvoeren van het programma tot gevolg heeft dat ten minste eenmaal een toestandsveranderingsreeks automatisch wordt uitgevoerd waarin deze toestandsveranderingsreeks de stappen omvat dat - het voertuig wordt bediend in een eerste toestand, - de besturingseenheid de gebruikersinterface-eenheid activeert om een object voor het registreren van een gebruikersinvoer weer te geven op de weergave-inrichting, - als een reactie op het geactiveerd worden de gebruikersinterface-eenheid op de weergave-inrichting dit object voor het registreren van een gebruikersinvoer weergeeft volgens een uitvoerbare specificatie van de gebruikersinterface, - de gebruikersinterface-eenheid een gebruikersinvoer in de gebruikersinvoer-inrichting vastlegt waarbij de gebruikersinvoer verwijst naar het weergegeven object voor het registreren van een gebruikersinvoer, - de besturingseenheid de gebruikersinvoer verwerkt verwijzend naar het weergegeven object voor het registreren van een gebruikersinvoer, - als een reactie op het verwerken van de gebruikersinvoer de besturingseenheid een toestandsverandering van de eerste toestand in een tweede toestand activeert, en - als een reactie op het geactiveerd worden wordt het voertuig bediend in de tweede toestand, waarbij de werkwijze de verdere stappen omvat dat een door een computer uitvoerbaar toestandsveranderingsmodel is verschaft, het verschafte toestandsveranderingsmodel specificeert - iedere mogelijke toestandsverandering tussen verschillende mogelijke toestanden van het voertuig die kunnen worden geactiveerd door de besturingseenheid als een reactie op een gebruikersinvoer, - ieder mogelijk object voor het registreren van een gebruikersinvoer dat verwijst naar een dergelijke mogelijke toestandsverandering, en - voor ieder mogelijk object voor het registreren van een gebruikersinvoer een toewijzing aan een mogelijke toestandsverandering waarnaar dit object voor het registreren van een gebruikersinvoer verwijst, een programma-generator genereert automatisch het besturingsprogramma dat nadien wordt uitgevoerd door de besturingseenheid, de program ma-generator genereert het besturingsprogramma door het verwerken van het verschafte toestandsveranderingsmodel zodanig dat de besturingseenheid die het gegenereerde besturingsprogramma uitvoert - iedere mogelijke toestandsverandering kan activeren volgens het verschafte toestandsveranderingsmodel en - de gebruikersinterface-eenheid kan activeren om elk mogelijk object voor het registreren van een gebruikersinvoer weer te geven volgens het verschafte toestandsveranderingsmodel, een gebruikersinterface-generator automatisch de uitvoerbare specificatie van de gebruikersinterface genereert, de gebruikersinterface-generator de specificatie van het gebruikersinterface genereert door het verwerken van hetzelfde toestandsveranderingsmodel, de gegenereerde uitvoerbare specificatie van de gebruikersinterface het de gebruikersinterface-eenheid mogelijk maakt om elk mogelijk object voor het registreren van een 
gebruikersinvoer volgens de gebruikersinterface-specificatie weer te geven na te zijn geactiveerd door de besturingseenheid.Method for controlling a vehicle by means of a control assembly, wherein the vehicle - can be operated in various different states and - can make at least one state change, wherein the control assembly for performing the method comprises - a control unit and - a user interface unit with a display device and a user input device, the method comprising the steps of the control unit automatically executing a control program, the execution of the program causing at least once a state change sequence to be automatically executed in which this state change sequence steps comprising that - the vehicle is operated in a first state, - the control unit activates the user interface unit to display an object for registering a user input on the display device, - as a response to the activation, the users are activated interface unit on the display device displays this object for registering a user input according to an executable specification of the user interface, - the user interface unit records a user input in the user input device with the user input referring to the displayed object for registering a user input, - the control unit processes the user input referring to the displayed object for registering a user input, - if a response to the processing of the user input, the control unit activates a state change from the first state to a second state, and - if a response upon activation, the vehicle is operated in the second state, the method comprising the further steps of providing a computer executable state change model, specifying the state change model provided - every possible state change in g between different possible states of the vehicle that can be activated by the control unit as a response to a user input, - every possible object for registering a user input that refers to such a possible change of state, and - for every possible object for registering a user input an assignment to a possible state change to which this object for registering a user input refers, a program generator automatically generates the control program that is subsequently executed by the control unit, the program generator generates the control program by processing the provided state change model such that the control unit executing the generated control program can - activate any possible state change according to the state change model provided and - activate the user interface unit to enable any possible k display object for registering a user input according to the provided state change model, a user interface generator automatically generates the user interface executable specification, the user interface generator generates the user interface specification by processing the same state change model, the generated executable specification of the user interface allows the user interface unit to display any object for registering a user input according to the user interface specification after being activated by the control unit. 2. 
2. Method for controlling a vehicle according to claim 1, characterized in that
executing the computer program causes an alarm display sequence to be carried out at least once, the alarm display sequence comprising the steps that
- a sensor monitors whether the vehicle is operated in, or exposed to, a predetermined state,
- as soon as this predetermined state is detected, the control unit triggers the user interface unit to display on the display device an alarm signal referring to this state, and
- after having been triggered, the user interface unit displays an alarm display object for displaying the alarm signal,
wherein the provided state change model specifies
- every possible alarm display object that refers to a state detectable by a vehicle sensor and
- the state of the vehicle to which this alarm signal refers,
the method comprising the further steps that
- the program generator specifies the control program such that the control unit executing the computer program can trigger the step of displaying every possible alarm signal as specified in the state change model and
- the user interface generator automatically generates an executable specification of the user interface such that the generated specification of the user interface enables the user interface unit to display every possible alarm display object.
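As an illustration of the alarm display sequence of claim 2, a minimal sketch of how alarm display objects could be tied to sensor-detectable states; the names (AlarmObject, alarms_to_display) and the example alarms are assumptions, not part of the patent.

from dataclasses import dataclass

@dataclass
class AlarmObject:
    alarm_id: str           # unique identifier of the alarm display object
    monitored_state: str    # predetermined state a vehicle sensor can detect
    message: str            # alarm signal shown on the display device

def alarms_to_display(detected_state: str, alarm_objects: list[AlarmObject]) -> list[str]:
    # Control unit side: once a sensor reports the predetermined state, return the
    # identifiers of the alarm display objects the user interface unit must show.
    return [a.alarm_id for a in alarm_objects if a.monitored_state == detected_state]

alarms = [AlarmObject("alarm_net_empty", "net_roll_empty", "Net supply empty"),
          AlarmObject("alarm_overload", "pickup_overload", "Pick-up overload")]
print(alarms_to_display("pickup_overload", alarms))   # -> ['alarm_overload']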
3. Method for controlling a vehicle according to any one of the preceding claims, characterized in that
every object for registering a user input comprises a unique identifier that distinguishes this object for registering a user input from all other objects for registering a user input as specified in the provided state change model,
the user interface specification and the control program are generated such that they comprise the respective unique identifier of every object for registering a user input,
the step of the control unit triggering the user interface unit to display an object for registering a user input comprises the step of transmitting the unique identifier of this object for registering a user input to the user interface unit, and
the step of the control unit processing a user input that refers to this object for registering a user input comprises the step of transmitting the unique identifier of this object for registering a user input to the control unit.

4. Method for controlling a vehicle according to any one of the preceding claims, characterized in that
the vehicle comprises at least one optional component,
a computer-executable configuration specification is provided,
the configuration specification specifies for the or each optional component whether or not the vehicle comprises this optional component, and
the program generator evaluates the configuration specification when the control program is generated such that the control unit executing the generated control program only triggers a state change referring to the optional component if the vehicle actually comprises the optional component.
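To illustrate claims 3 and 4, a minimal sketch of the identifier-only exchange between the two units and of configuration-based filtering by the program generator; all names (trigger_display, report_user_input, ConfigurableInputObject, filter_by_configuration, "cutting_unit") are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, Optional

# Claim 3: only the unique identifier of the object travels between the two units.
def trigger_display(object_id: str, send_to_ui: Callable[[dict], None]) -> None:
    send_to_ui({"display": object_id})            # control unit -> user interface unit

def report_user_input(object_id: str, send_to_control: Callable[[dict], None]) -> None:
    send_to_control({"pressed": object_id})       # user interface unit -> control unit

# Claim 4: the program generator drops everything that refers to an optional
# component the concrete vehicle does not have.
@dataclass
class ConfigurableInputObject:
    object_id: str
    requires_component: Optional[str]             # name of the optional component, or None

def filter_by_configuration(objects: list[ConfigurableInputObject],
                            configuration: dict[str, bool]) -> list[ConfigurableInputObject]:
    return [o for o in objects
            if o.requires_component is None or configuration.get(o.requires_component, False)]

trigger_display("btn_close", print)               # -> {'display': 'btn_close'}
objects = [ConfigurableInputObject("btn_close", None),
           ConfigurableInputObject("btn_knife", "cutting_unit")]
print([o.object_id for o in filter_by_configuration(objects, {"cutting_unit": False})])
# -> ['btn_close']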
5. Method for controlling a vehicle according to any one of the preceding claims, characterized in that the vehicle
- is an implement and
- is pulled by a motorized vehicle,
the control unit is arranged on board the implement, and
the user interface unit is arranged on board the motorized vehicle.

6. Method for controlling a vehicle according to claim 5, characterized in that the implement
- picks up loose material from the ground and
- forms a bale from the picked-up loose material under pressure, or
- stores the picked-up material in a receptacle on board the vehicle,
while the motorized vehicle pulls the implement.

7. Control arrangement for controlling a vehicle, wherein the vehicle
- can be operated in several different states and
- can perform at least one state change,
the control arrangement comprising
- a control unit adapted to automatically execute a control program,
- a user interface unit with a display device and a user input device,
- a storage means for storing model data in which a computer-executable state change model is stored,
- a program generator, and
- a user interface generator,
wherein the control arrangement executing the control program is adapted to automatically carry out a state change sequence comprising the steps that
- the vehicle is operated in a first state,
- the control unit triggers the user interface unit to display an object for registering a user input on the display device,
- in response to being triggered, the user interface unit displays this object for registering a user input on the display device in accordance with an executable specification of the user interface,
- the user interface unit registers a user input in the user input device, the user input referring to the displayed object for registering a user input,
- the control unit processes the user input referring to the displayed object for registering a user input,
- in response to processing the user input, the control unit triggers a state change from the first state into a second state, and
- in response to being triggered, the vehicle is operated in the second state,
wherein the state change model specifies
- every possible state change between different possible states of the vehicle that can be triggered by the control unit in response to a user input,
- every possible object for registering a user input that refers to such a possible state change, and
- for every possible object for registering a user input, an assignment to a possible state change to which this object for registering a user input refers,
wherein the program generator is adapted to automatically generate the control program that can be executed by the control unit,
wherein the program generator is adapted to generate the control program by processing the stored state change model such that the control unit executing the generated control program
- can trigger every possible state change in accordance with the provided state change model and
- can trigger the user interface unit to display every possible object for registering a user input in accordance with the state change model,
wherein the user interface generator is adapted to generate the executable specification of the user interface by processing the same stored state change model, and
wherein the user interface generator is adapted to generate the specification of the user interface such that the user interface unit displays every possible object for registering a user input in accordance with the user interface specification after the user interface unit has been triggered by the control unit.

8. Control arrangement according to claim 7, characterized in that
the control unit and the user interface unit are arranged on the vehicle,
the vehicle comprising
- a storage means for program data and
- a storage means for specification data,
wherein the control arrangement is adapted
- to store the generated control program in the storage means for program data and
- to store the generated user interface specification in the storage means for specification data,
wherein the control arrangement is provided such that
- the control unit has access to the storage means for program data and
- the user interface unit has access to the storage means for specification data.

9. Control arrangement according to claim 7, characterized in that the vehicle
- comprises the control unit and
- is adapted to be mechanically coupled to a further vehicle,
wherein the vehicle comprises a storage means for program data,
wherein the further vehicle comprises the user interface unit and a storage means for specification data,
wherein the control arrangement is adapted
- to store the generated control program in the storage means for program data and
- to store the generated user interface specification in the storage means for specification data,
wherein the control arrangement comprises a data connection between the control unit on the vehicle and the user interface unit on the further vehicle.

10. Control arrangement according to claim 9, characterized in that the vehicle is an implement and the further vehicle is a motorized vehicle adapted to pull the implement.

11. Control arrangement according to claim 10, characterized in that the implement is a bale forming apparatus adapted
- to pick up loose material from the ground and
- to form a bale of loose material under pressure.
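As a final illustration, a minimal sketch of how the generated artifacts of claims 8 and 9 could be stored in the two separate storage means; the function name, file format and paths are assumptions, not part of the patent, and the inputs are assumed to be plain JSON-serializable data.

import json

def deploy(control_program: dict, ui_specification: list,
           program_data_path: str, specification_data_path: str) -> None:
    # Store the generated control program in the storage means for program data (on the
    # implement in the arrangement of claim 9) and the generated user interface
    # specification in the storage means for specification data (on the towing vehicle's
    # terminal); at runtime only object identifiers need to cross the data connection.
    with open(program_data_path, "w") as handle:
        json.dump(control_program, handle)
    with open(specification_data_path, "w") as handle:
        json.dump(ui_specification, handle)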
NL2012639A 2014-04-16 2014-04-16 Method and Arrangement for Controlling a Vehicle. NL2012639B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NL2012639A NL2012639B1 (en) 2014-04-16 2014-04-16 Method and Arrangement for Controlling a Vehicle.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL2012639A NL2012639B1 (en) 2014-04-16 2014-04-16 Method and Arrangement for Controlling a Vehicle.

Publications (2)

Publication Number Publication Date
NL2012639A NL2012639A (en) 2016-02-03
NL2012639B1 true NL2012639B1 (en) 2016-06-27

Family

ID=51398793

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2012639A NL2012639B1 (en) 2014-04-16 2014-04-16 Method and Arrangement for Controlling a Vehicle.

Country Status (1)

Country Link
NL (1) NL2012639B1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19624027A1 (en) * 1996-06-17 1997-12-18 Claas Ohg Mobile on-board computer system with control units for work machines
GB0507929D0 (en) * 2005-04-20 2005-06-01 Cnh Belgium Nv Agricultural vehicle with reconfigurable control
BE1018941A3 (en) * 2009-09-30 2011-11-08 Cnh Belgium Nv A RECTANGULAR BALL PRESS WITH A CONTROL UNIT.
US8677724B2 (en) * 2010-10-25 2014-03-25 Deere & Company Round baler for baling crop residue
CN103389860A (en) * 2012-05-07 2013-11-13 观致汽车有限公司 Interactive system and interactive method thereof

Also Published As

Publication number Publication date
NL2012639A (en) 2016-02-03

Similar Documents

Publication Publication Date Title
TWI624783B (en) System and method establishing application program with dynamic-link function module for mobile device
US11097415B2 (en) Generation of robotic user interface responsive to connection of peripherals to robot
EP2033089B1 (en) Method and system for rapidly developing and deploying sensor-enabled software applications
US8792879B2 (en) System and method of performing remote diagnostics on a computing device
CN108228810B (en) Form linkage configuration method and system and form updating method and system
CN111475151A (en) Modular programming method and related device
US20190306275A1 (en) Methods for Software Development, Installation, and Management in Computer Systems and Methods for Controlling Input and Output Data Transfer in Computer Systems
JP2002007299A (en) Developing means and executing means for device control program
JP2008515259A5 (en)
US11287811B2 (en) Gateway interface for a work machine
KR20180029023A (en) Tool setting management system and method
NL2012639B1 (en) Method and Arrangement for Controlling a Vehicle.
CN105930168A (en) System upgrading processing method, apparatus and device
CN105653316A (en) Method and device for monitoring unloaded situation of software
JP2013518733A (en) Robot system control method and apparatus
CN106201511A (en) Create the method and device of modal dialog box
JP2008191711A5 (en)
JP2007265394A (en) Device, method and system for managing event information
CN114115866A (en) Cross-domain-based vehicle-mounted scene self-defining method, device, equipment and storage medium
US20150022382A1 (en) Input decoder
CN110546583B (en) Programming assistance device and programming assistance method
TWI428828B (en) Script application framework
TWI379178B (en) Programmable remote control system and method thereof
JP2016167214A (en) Information processing system and method
CN109996100B (en) Control method of intelligent remote controller, storage medium and remote controller

Legal Events

Date Code Title Description
HC Change of name(s) of proprietor(s)

Owner name: FORAGE COMPANY B.V.; NL

Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), CHANGE OF OWNER(S) NAME; FORMER OWNER NAME: LELY FORAGE INNOVATIONS B.V.

Effective date: 20170712

PD Change of ownership

Owner name: LELY FORAGE INNOVATIONS B.V.; NL

Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), MERGE, DEMERGER; FORMER OWNER NAME: FORAGE INNOVATIONS B.V.

Effective date: 20170712

MM Lapsed because of non-payment of the annual fee

Effective date: 20220501