WO2015199600A1 - Method and mobile device for steering a vehicle - Google Patents

Method and mobile device for steering a vehicle

Info

Publication number
WO2015199600A1
WO2015199600A1 (PCT/SE2015/050663)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
electronic device
signals
type
special
Prior art date
Application number
PCT/SE2015/050663
Other languages
French (fr)
Inventor
Tom NYSTRÖM
Christoffer NORÉN
Bashar MENGANA
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to DE112015002330.5T priority Critical patent/DE112015002330T5/en
Publication of WO2015199600A1 publication Critical patent/WO2015199600A1/en

Classifications

    • G05D1/2247
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
    • G05D1/0016Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement characterised by the operator's input device
    • G05D1/223
    • G05D1/2235
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/30User interface
    • G08C2201/32Remote control based on movements, attitude of remote control device
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/40Remote control systems using repeaters, converters, gateways
    • G08C2201/42Transmitting or receiving remote control signals via a network
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/90Additional features
    • G08C2201/93Remote control using other portable devices, e.g. mobile phone, PDA, laptop

Definitions

  • the present invention relates to technology in connection with remote control of a vehicle, and specifically to a method and a mobile electronic device to control a vehicle and its movement.
  • the invention also relates to a system, a computer program, a software product and a user interface.
  • Remote control of vehicles currently often forms part of many people's everyday life.
  • Most modern vehicles are, for example, equipped with the function of remote control of the vehicle's lock, via the vehicle's electronic key, which simplifies the management of the vehicle.
  • as the mobile phone has become part of the average person's everyday life, with the possibility of easily downloading software to the mobile phone, it is now also possible to carry out certain functions, such as remote control of the vehicle's lock, from the mobile phone.
  • through, for example, US2011/0137490 and US2008/0057929 it is prior art to control a number of functions in a vehicle via a program in a mobile phone. Examples of such functions include readings of tachographs, turning ABS or air conditioning on or off, locking or opening the vehicle, etc.
  • the deflection, and the degree of deflection, of the positioning element specify both the speed and the steering angle of the vehicle. It is specified that the lifting function of the fork truck may be controlled via function keys in the software, but this is not explained in detail.
  • the objective is achieved at least partly by way of a method, which comprises, at a mobile electronic device which is arranged to communicate with and to remotely control a vehicle in several driving modes:
  • - setting the electronic device into a special steering configuration which means that the electronic device is configured to remotely control one or several functions in the vehicle, connected to the special driving mode, whereby continued in-signals to the electronic device may control the function or functions in real time;
  • Wireless remote control of a vehicle facilitates smoother handling, when the driver wishes to control the vehicle from outside the vehicle. Since control of a function is connected to the current driving mode of the vehicle, in-signals in the form of one or several in-signals of the second type, detected after the first type of in-signal, will only be able to impact the vehicle in a predetermined manner.
  • the device may e.g. be used as a hybrid joystick, whose movements impact, for example, the vehicle's speed. For example, fine manoeuvring of the vehicle when connecting a trailer or reversing is possible. A more efficient reversing process may then be obtained, since the driver does not need to get into and out of the driver's cabin to check the distance to the trailer or, for example, a terminal.
  • Truck functions may be controlled at road works or in the mining industry.
  • at road works such as clearing of garbage, installation of guide posts or crash barriers, laying new asphalt, or sanding or salting the road, which must be carried out outside the driver's cabin, remote control makes the driver's job easier and more efficient, since the driver does not need to get into and out of the truck to adjust the truck's position, or when tipping the cargo.
  • when manoeuvring vehicles in hazardous areas, such as near precipices and in hazardous mining areas, the driver may steer the vehicle to safer zones from a safe distance. Since wireless signals may be transferred via the Internet, vehicles may be controlled over large distances.
  • the special driving mode is one of: driving forward, parking and reversing.
  • a vehicle's function is one of the vehicle's speed, acceleration, deceleration, steering wheel angle, or a bodywork feature such as tipping angle.
  • the electronic device has a touch-sensitive screen and the method comprises:
  • the method comprises setting the electronic device into a special input state, wherein the electronic device is arranged to receive in-signals of the second type in the form of voice commands, input via a touch sensitive screen on the electronic device, input via movement of the electronic device or input via one or several push-buttons connected to the electronic device, wherein the electronic device comprises means to receive the respective in-signals of the second type.
  • the method comprises setting the electronic device into the special input state at the same time as the electronic device is set into a special steering configuration.
  • the special input state comprises receiving in-signals of the second type, in the form of movement of the electronic device, wherein the method comprises:
  • the predetermined one-, two- or three-dimensional coordinate system for the electronic device is arranged in accordance with the shape of the electronic device.
  • the method comprises turning the electronic device around a predefined rotational axis, which controls a function of the vehicle connected to the driving mode.
  • the special input state comprises receiving in-signals of the second type via the touch sensitive screen, wherein the method comprises, in the special input state:
  • the method comprises low-pass filtering of the detected second type of in-signals.
  • the low-pass filtering may be adaptive and change depending on the environment or situation of the device.
  • a mobile electronic device comprising: a transmitter of wireless signals; a computer device; a computer-readable memory, comprising a computer program P with computer instructions, and an input device arranged to receive a first type of in-signal, which specifies a special driving mode for the vehicle and sends it to the computer device.
  • the computer device is arranged, in response to the first type of in-signal, specifying a special driving mode for the vehicle, to set the electronic device into a special steering configuration which means that the electronic device is arranged to remotely control one or several functions of the vehicle, connected to the vehicle's special driving mode, wherein continued in-signals to the electronic device may control the function or functions in real time.
  • the computer device is also arranged to generate signals for one or several functions of the vehicle connected to the driving mode, based on the received second type of in-signals to the electronic device, which specify the control of the function or functions of the vehicle when the device is in the special control configuration, wherein the transmitter is arranged to send the signals to a receiver device in the vehicle.
  • the objective is at least partly achieved through a system comprising a mobile electronic device and a receiver device, which is arranged to be placed in the vehicle, and which is arranged to receive one or several wireless signals from the electronic device and to send them to a suitable control device in the vehicle.
  • the objective is achieved at least partly through a computer program P, wherein said computer program P comprises program code to cause a computer unit and/or control device to carry out the method.
  • the objective is achieved at least partly through a computer program product, comprising a program code stored on a computer- readable medium, in order to carry out the method when said program code is executed in a computer unit.
  • the medium is a nonvolatile medium.
  • the objective is at least partly achieved through a graphic user interface on an electronic mobile device with a touch sensitive screen, when the graphic user interface comprises:
  • a second generation user interface element is shown on the touch sensitive screen, in the form of an axis or a coordinate system with several axes, wherein each axis represents control of a function in the vehicle; wherein, in response to a second type of in-signal, in the form of a movement on or movement near the touch sensitive screen, corresponding to a movement related to one of the axes:
  • Fig. 1 shows a system according to one embodiment.
  • Fig. 2 shows a block diagram of the electronic mobile device according to one embodiment.
  • Fig. 3 shows a first graphic interface on the electronic mobile device according to one embodiment.
  • Fig. 4 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to one embodiment.
  • Fig. 5 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment.
  • Fig. 6 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment.
  • Fig. 7 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment.
  • Fig. 8 shows a second graphic interface on the electronic mobile device according to one embodiment.
  • Fig. 9 shows a flow chart for the method according to one embodiment. Detailed description of preferred embodiments of the invention
  • Fig. 1 shows a system 15 to remotely control a vehicle 2.
  • the vehicle 2 is here displayed in the form of a truck, but may be another commercial vehicle, such as a boom truck or a passenger car.
  • the system 15 comprises a mobile electronic device 1, hereafter referred to as the device 1, which is portable and which may e.g. be a mobile phone, a programmable PDA (Personal Digital Assistant), a tablet or similar.
  • the term "mobile" means that the electronic device 1 is portable by a human.
  • the system 15 also comprises a receiver device 4A, which is arranged to be placed in the vehicle 2 to be remotely controlled.
  • the receiver device 4A is arranged to receive one or several wireless signals from the device 1, and to send it or them to a suitable control device 5 in the vehicle 2.
  • the control device 5 may e.g. be an ECU (Electronic Control Unit).
  • the system 15 may also comprise a transmitter unit 4B, which is arranged to be placed in the remotely controlled vehicle 2.
  • the transmitter unit 4B is arranged for wireless communication with the device 1.
  • the device 1 may be equipped with a touch screen 3 and one or several keys or push-buttons 7.
  • the device 1 is arranged for wireless communication 6 with the vehicle 2 via the receiver device 4.
  • Fig. 2 shows a block diagram to schematically illustrate a number of units in the device 1.
  • the device 1 comprises a transmitter 8A of wireless signals.
  • the transmitter 8A may, for example, be intended for radio communication, and the wireless signals are in this case radio signals.
  • the transmitter 8A may be intended to communicate via GSM (Global System for Mobile Communications), UMTS (Universal Mobile Telecommunications System), LTE Advanced (Long Term Evolution Advanced) or Internet, and the device 1 may in such case be equipped with a suitable 2G, 3G or 4G chip.
  • Other alternative transmitters 8A may be intended for WLAN (Wireless LAN) or Bluetooth.
  • the device 1 may also comprise a receiver 8B of wireless signals.
  • the receiver 8B may, for example, be intended for radio communication and receive wireless signals in the form of radio signals.
  • the receiver 8B may be intended to receive wireless signals via GSM, UMTS, LTE Advanced or Internet, and the device 1 may be equipped with a suitable 2G, 3G or 4G chip for this purpose.
  • Other alternative receivers 8B may be intended for WLAN (Wireless LAN) or Bluetooth.
  • the device 1 may also comprise several of the above mentioned transmitters and receivers.
  • the device 1 also comprises a computer device 9 with a processor device 10A and a memory unit 10B.
  • on the memory unit 10B a computer program P is stored, with instructions that may cause the computer unit 9 to carry out the steps according to the method described herein.
  • parts of the method are carried out on a control device 5 in the vehicle 2.
  • the memory unit 10B is a part of the processor device 10A.
  • the processor device 10A may comprise one or several CPUs (Central Processing Unit).
  • a part of the method, for example certain calculations, may be carried out by the user interface 18.
  • the memory device 10B comprises a memory, for example a non-volatile memory, such as a flash-memory or a RAM (Random Access Memory).
  • the device 1 comprises one or several input units 3, 7, 11, 17.
  • a user may supply in-signals to the device 1 via an input unit 3, 7, 11, 17.
  • the device 1 is, as previously described, equipped with a screen 3 that may be an input device in the form of a touch sensitive screen 3.
  • the touch sensitive screen 3 may, for example, be in the form of a display with a superimposed touch sensitive surface, or a display with an integrated touch sensitive surface. Graphic images, text etc. may be displayed on the display.
  • the touch sensitive screen 3 may be a conventional touch sensitive screen that may receive in-signals from one or several simultaneous touches, or near touches.
  • the term near touch means, in this context, that the touching object, such as the user's finger, does not touch the touch sensitive surface, but is very near the surface.
  • the touch sensitive screen may e.g. detect a touch, near touch and/or movement by acting resistively, capacitively or by using infra-red light.
  • a user interface 18 in the device 1, which may form part of the computer unit 9 or be connected to the computer unit 9, may communicate with the touch sensitive screen 3 and ensure that a detected touch, near touch and/or movement, for example a gesture, is processed.
  • the user interface 18 may comprise one or several GPUs (Graphics Processing Unit).
  • the user interface 18 is also arranged to display graphic interfaces on the screen 3.
  • the user interface 18 is also arranged to connect a touch etc. on the screen 3, at a certain position on the screen 3, with the placement of a graphic interface displayed in the screen 3, when the touch corresponding to the position is detected.
  • the device 1 may, as previously described, also be equipped with an input device 7 in the form of one or several keys or push-buttons 7, and a microphone 17 and/or a loudspeaker 18 in the device 1. Furthermore, the device 1 may be equipped with a vibrator 16, in order to provide haptic feedback to the user in the form of vibrations. Virtual vibration of a certain part of the graphic interface on the device 1, or vibration of a special interface element, may also be achieved, in order to e.g. catch the user's attention, to indicate that something may not be changed, to simulate pushing a button, to simulate change of a slider etc. The device 1 may also be equipped with an input device 11, in the form of a motion detector 11 that may detect the position of the device 1.
  • the motion detector 11 may, for example, comprise one or several accelerometers, one or several magnetometers and/or a gyro.
  • An accelerometer registers a movement of the device 1 along an axis, or a rotation, which means the device 1 is at a different angle in relation to gravity.
  • a magnetometer registers magnetic fields, and a change of the magnetic field when the device 1 changes position.
  • a gyro registers the orientation of the device 1. By inclining the device to different extents at different angles, the device 1 may be used to remotely control the vehicle 2 in an intuitive manner. The angle or angles of the device in relation to gravity are determined, and such angle or angles are transformed into control signals in the device 1 or the vehicle 2.
  • signals from two or more sources are merged into values in the Euclidean coordinate system in the motion detector 11.
  • signals from the sources accelerometer, magnetometer and gyro may be used.
  • the merge may, for example, be carried out with the assistance of a suitably selected filter, e.g. a Kalman filter.
  • the sampling frequency for detection of angles etc. may, according to one embodiment, be set in the device 1.
  • the device 1 may also be arranged to receive vibrations as in-signals to the device 1.
  • the wireless signals may be in the form of ready control signals, which directly control a function in the vehicle.
  • the wireless signals may instead be in the form of more or less processed out-signals from the input devices 3, 7, 11, 17, which signals have been received and converted into control signals for a function in the receiver unit 4A or in a control device 5.
  • Remote control means continuous sending of wireless signals from the device 1 to the vehicle 2, which signals give rise to control signals, in order to control one or several functions in the vehicle 2.
  • a control signal may comprise a speed request, an acceleration request or control commands. These signals are interpreted and executed by the vehicle 2 in real time to perform the user's commands.
  • the vehicle 2 may be arranged to send wireless signals to the device 1 via the transmitter unit 4B. These signals may comprise data about the vehicle's driving mode or status, such as for example speed, acceleration, turning radius, etc., data telling if the wireless signals from the device 1 to the receiver unit 4A have been received, if the device 1 and the vehicle 2 have established a secure data connection for the wireless signals, if the distance between the device 1 and the vehicle 2 is too great for good quality wireless transmission to occur, etc. In this manner, the vehicle 2 may provide feedback to the device 1, which may improve the remote control of the vehicle 2, since the user receives feedback relating to their commands. This information may e.g. be displayed in the form of text messages on the screen 3 or as voice feedback.
  • the vehicle 2 may also be equipped with one or several cameras (not displayed), and feedback from these in the form of one or several video streams may be displayed on the screen 3.
  • data from different cameras may be displayed. For example, data from a forward view camera in the vehicle 2 may be displayed when the vehicle 2 is in the driving mode "forward”, and data from a rear view camera in the vehicle 2 may be displayed when the vehicle is in the driving mode "reverse”.
  • the control device 5 in the vehicle 2 may be arranged to decelerate or stop the vehicle 2.
  • Fig. 9 illustrates a method for the device 1 to function in an intuitive manner for the user, for example the driver, and the method will now be explained with reference to the flow chart in this figure, and to the various examples displayed in Figures 3-8.
  • the input device 3, 7, 11 receives a first type of signal, which specifies a special driving mode for the vehicle 2 (A1).
  • a special driving mode may, according to one embodiment, be one of: driving forward, parking and reversing.
  • the special driving modes correspond to the different positions of the vehicle's gear lever.
  • This first type of in-signals may be detected by one of the devices that have been explained in connection with the device 1 .
  • the input device 3, 7, 11 then sends the detected first type of in-signal to the computer unit 9.
  • the computer unit 9 is arranged - in response to the first type of in-signal, specifying a special driving mode for the vehicle 2 - to set the electronic device 1 into a special steering configuration, which means that the electronic device 1 is arranged to remotely control one or several functions of the vehicle 2, connected to the special driving mode (A2).
  • subsequent second type in-signals to the electronic device 1 may control one, two or several predefined functions, connected to the special driving mode, in the vehicle 2 in real time.
  • a function of a vehicle 2 is one of the vehicle's speed, acceleration, deceleration, steering wheel angle, or a bodywork feature such as tipping angle. According to one embodiment, it is possible at any time to transition from one driving mode or function to another.
  • a first type of in-signal is thus hierarchically placed above a second type of in-signal.
  • a first type of in-signal may, for example, be a sweeping gesture on the screen 3, providing direct access to a special driving mode or bodywork function. This will be explained in further detail below.
  • the computer unit 9 is arranged to receive the second type of in-signals and/or data from the units in the device 1 , in order to control which steering configuration it is in and which driving mode has been selected, and in order to process the signals and/or data, depending on the steering configuration and the driving mode.
  • the computer unit 9 may, for example, choose to ignore certain signals or data, depending on the steering configuration.
  • Fig. 3 shows an example of a graphic user interface, which may be displayed on the touch sensitive screen 3 on the device 1 to receive the first type of in-signals.
  • the device 1 is here in the form of a mobile phone with a rectangular design.
  • the graphic user interface here comprises three different user interfaces 12A, 12B, 12C, each of which specifies a special driving mode for the vehicle 2.
  • the user interface 12A at the extreme left is in the form of a text "FORWARD", which corresponds to "DRIVE" on the gear lever of a vehicle.
  • the user interface 12B in the middle is in the form of a text "PARKING", which corresponds to "PARK” on the gear lever of a vehicle.
  • the user interface 12C to the right is in the form of a text "REVERSE”, which corresponds to "REVERSE” on the gear lever of a vehicle. These texts are mere examples, and may be different.
  • when a first type of in-signal in the form of a touch or near touch on the touch sensitive screen 3, by e.g. the user's finger, and corresponding to a position for one of the user interface elements 12A, 12B, 12C is received, the device 1 is set into the special steering configuration, which is connected to the user interface element 12A, 12B, 12C corresponding to the position. If the user, for example, pushes "FORWARD", the user may now steer the vehicle 2 in its forward movement.
  • the user may, for example, impact the vehicle's speed, reduce the speed to zero, turn the vehicle when it drives forward, etc. If the user now instead pushes "REVERSE", the user may now steer the vehicle 2 while moving backwards. The user may, for example, impact the vehicle's speed backwards, reduce the speed to zero, turn the vehicle when it reverses, etc. If the user now instead pushes "PARKING", the user may now steer the vehicle 2 when it is parked. For example, the user may control bodywork functions in the vehicle 2, such as tipping of the tipper, etc.
  • the user may go directly to a function position to control a bodywork function without passing via "PARKING".
  • a suitable user interface element may in this case be available specifically for this bodywork function.
  • the device 1 may be set into a special input state, wherein the device 1 is arranged to receive the second type of in-signals in the form of voice commands, input via the touch sensitive screen 3 on the electronic device 1, input via movement of the electronic device 1, or input via one or several push-buttons 7 connected to the electronic device 1.
  • Voice commands may e.g. be received via the microphone 17 in the device 1 .
  • the microphone 17 may be an in-built microphone in the device 1 .
  • the device 1 may be set into the special input state at the same time as the electronic device 1 is set into a special steering configuration.
  • the user may alternate between the different driving modes displayed in Fig. 3.
  • the user may "switch" the vehicle 2 to a different driving mode.
  • the user may first go into the driving mode "PARKING" in order to control a bodywork function.
  • alternatively, "TIPPING" may be available as a function that may be reached without passing via "PARKING".
  • the user selects "TIPPING” by pointing at the element, and tips the tipper 30°. Subsequently, the user goes into the driving mode "FORWARD” by making a sweeping gesture on the right side, and drives the vehicle 10 metres. Subsequently, the user returns to "TIPPING” and tips the tipper another 10°. Subsequently, the user returns the tipper to its original position and completes the tipping.
  • Figures 4, 5, 6, and 7 illustrate a case, where the special input state comprises receiving the second type of in-signals in the form of movement of the device 1 .
  • when the device 1 is in a special steering configuration, receiving movement as the second type of in-signals, the device 1 may be turned or inclined and give rise to the second type of in-signals, controlling the vehicle 2 in one of its selected driving modes.
  • the movement may be detected with the motion detector 11 (Fig. 2), which may detect the position of the device 1.
  • the motion detector 11 or the computer unit 9 define a one-, two- or three-dimensional coordinate system for the electronic device 1.
  • the computer unit 9 receives the detected second type of in-signals and generates one or several wireless signals to a control device 5 in the vehicle 2, which signals specify the detected movement.
  • the vehicle's one or several functions connected to the driving mode are then based on the detected movement.
  • the predetermined one-, two- or three-dimensional coordinate system for the electronic device 1 is, according to one embodiment, arranged in accordance with the shape of the device. In this manner, the user may, in an intuitive manner, control the vehicle 2 with the help of the device 1. For example, turning the electronic device 1 around a predefined rotational axis running along one of the device's sides may control a function of the vehicle 2 connected to the driving mode, which will be explained in further detail below.
  • Fig. 4 illustrates an imagined rotational axis 19A in an x-axis direction for the device 1 , around which the device 1 may be turned.
  • the device 1 has a rectangular design, whose longest side 20A runs along the rotational axis 19A.
  • the short sides 21A, 21B of the device run in the direction of a z-axis when the device 1 is in a vertical position.
  • when the device 1 is in a horizontal position, the device's short sides 21A, 21B run in the direction of a y-axis.
  • This turning angle may be translated into a desired speed for the vehicle.
  • the vehicle's speed may then be adjusted within this interval, where 0° means 0 km/h and 90° means the vehicle's maximum speed, for example 80 km/h.
  • the interval may be adjusted according to the vehicle's speed, so that the interval has adaptive end positions and facilitates more accurate driving. Either or both of the bottom and the top speed in the interval may be adjusted.
  • for example, the interval 0° to 90° may first entail a speed increase of 0-30 km/h, and when 30 km/h has been reached, the interval 0° to 90° is adjusted to 0-50 km/h, and when 50 km/h has been reached, the interval 0° to 90° is adjusted to 0-80 km/h.
  • the same turning angle thus gradually gives rise to an increased speed change by adjusting the top speed in the interval.
  • the bottom speed may, as an example, be adjusted to e.g. 50-80 km/h in the interval 0° to 90° when 50 km/h has been reached etc.
  • the user is able, according to one embodiment, to control the vehicle 2 in a corresponding manner when it is reversing.
  • the vehicle's reversing speed may be controlled in this interval, where 0° means 0 km/h and 90° means, for example, the vehicle's maximum speed when reversing, e.g. 30 km/h.
  • This interval may also be adjusted according to the vehicle's speed, so that the interval has adaptive end positions and facilitates more accurate driving.
  • Fig. 5 illustrates an imagined rotational axis 19B in a y-axis direction for the device 1 , around which the device 1 may be turned.
  • the rotational axis 19B is here placed in the centre of the device 1 , but might be placed in some other place.
  • the device 1 has a rectangular design here as well, whose longest sides 20A, 20B run along the direction of an x-axis.
  • the device's short sides 21A, 21B run in the direction of a z-axis when the device 1 is in a vertical position.
  • the motion detector 11 provides a reading, and a turning angle α, specifying how much the device 1 has been turned around the rotational axis 19B, may be detected.
  • This turning angle may be translated into a desired steering or turning of the wheels of the vehicle 2.
  • Driving forward may correspond to no turning of the device 1.
  • the direction of the z-axis thus specifies the 0-position.
  • the turning around the rotational axis 19B in a positive direction thus produces a turn to the right, and the turn around the rotational axis 19B in a negative direction produces a turn to the left.
  • the desired steering or turning of the wheels may be speed-dependent. If the vehicle 2 has a high speed, the curve radius becomes greater, compared to if the vehicle 2 had a lower speed. The sensitivity of the device may thus be adjusted according to the situation.
  • the device 1 may have knowledge about the vehicle's speed via the signals sent to the vehicle 2 from the device 1 regarding speed. Alternatively, the vehicle's actual speed may be fed back to the device 1 .
  • the bodywork functions may be controlled.
  • a second generation user interface element (not displayed) may be displayed on the screen 3, facilitating control of various bodywork functions.
  • a function may be selected by, for example, pointing at one of the user interface elements.
  • Such a selection may be for control of the bodywork function of tipping of the vehicle's tipper or tub, and is illustrated in Figures 6-7.
  • Fig. 6 illustrates an imagined rotational axis 19C in a y-axis direction for the device 1 , around which the device 1 may be turned.
  • the rotational axis 19C is here placed along the extension of one of the short sides.
  • the device 1 has a rectangular design here as well, whose longest sides 20A, 20B run along the direction of an x-axis.
  • the device's short sides 21A, 21B run in the direction of a y-axis when the device 1 is in a horizontal position.
  • the motion detector 11 provides a reading, and a turning angle β, specifying how much the device 1 has been turned around the rotational axis 19C, may be detected.
  • This turning angle may be translated into a desired tipping angle of a tipper on the vehicle 2.
  • the translation may be direct, or alternatively adapted with adaptive end positions on the turning angle interval.
  • Tipping angle means the angle of the vehicle's tipper or tub in relation to the vehicle's frame when it is tipped. If no tipping occurs, the tipping angle is usually around zero degrees.
  • Fig. 7 illustrates an imagined rotational axis 19D in a y-axis direction for the device 1 , around which the device 1 may be turned.
  • the rotational axis 19D is here placed along the extension of the y-axis, in one of the device's corners.
  • the device 1 here also has a rectangular shape, whose longest sides 20A, 20B run along the direction of an x-axis, when the device 1 is placed so that a turning angle γ around the rotational axis 19D is zero.
  • the short sides of the device 21A, 21B run in the direction of a z-axis when the turning angle γ around the rotational axis 19D is zero.
  • the touch sensitive screen 3 shows a second generation user interface element 13A, 13B in the form of a coordinate system with two axes 13A, 13B in the special input state.
  • the second generation user interface element 13A, 13B may, however, instead be only one axis 13A or 13B.
  • Each axis 13A, 13B provides a possibility of controlling a function in the vehicle 2.
  • signals are generated, which may then be sent to the vehicle 2 and control the function or functions.
  • the horizontal axis 13A may, for example, correspond to control of the vehicle's steering wheel.
  • where the horizontal axis 13A crosses the vertical axis 13B, the vehicle's direction is forward.
  • the vertical axis 13B may, for example, correspond to the vehicle's speed. Above the horizontal axis 13A the vehicle's speed is increased, and below the horizontal axis 13A the vehicle's speed is reduced.
  • movement on or movement near the touch sensitive screen 3, corresponding to a movement related to one of the axes, may be detected by the device 1 via the user interface module 18.
  • one or several signals are generated by the computer unit 9, specifying the detected motion, which are sent by the transmitter 8A to a control device 5 in the vehicle 2.
  • the vehicle 2 is then controlled based on the detected movement.
  • the finger's position may, for example, be displayed with a circle 14, point or similar. In this manner the detected motion may be displayed.
  • the user may also have the option of letting the circle 14 remain on the screen 3 when the user has released contact with the screen 3.
  • the user is able to view the speed or steering of the vehicle at the current time.
  • the vehicle may continue to be controlled according to the specification provided by the circle 14, and the user may thus release the screen 3 in order to let the vehicle continue to be controlled according to the requests which the user made via the circle 14.
  • the user may draw up a route in a suitable coordinate system (not displayed), which is displayed on the screen 3.
  • the route is received as the second type of in-signals and is converted into suitable control signals for the vehicle's speed, steering, etc.
  • the control signals may comprise driving forward 20 metres, and then turning with a radius of 50 metres.
  • the user may be assisted by e.g. a user interface element displaying a ruler with measurements, indicating how long a distance the user must draw in order to make the vehicle move a certain distance. The assistance thus entails a kind of mapping between a drawn in-signal and reality.
  • the method comprises low pass filtering of the detected second type in-signals.
  • the device 1 may then comprise a low pass filter that filters away undesired signals such as jolts. In this manner, a more robust control may be achieved.
  • the device 1 may also receive in-signals in the form of voice commands, which are interpreted in the device 1 and forwarded as control signals to the vehicle 2.
  • the user may request the vehicle 2 to "maintain 5 km per hour” or to "brake” via the device 1 .
  • the device 1 may be arranged to confirm the received in-signals by e.g. a voice, which via the device's loudspeakers says “maintaining 5 km/h" or "I am applying the brakes". The confirmation may be directly fed back to the in-signals received, or by way of confirmation that the vehicle 2 carries out the commands provided.
  • the user may set how aggressive the vehicle 2 should be on a scale in the device 1.
  • “Aggressive” as used in this context means how much the vehicle 2 should react to the second type of in-signals provided to the device 1 .
  • the scale may e.g. be implemented in software in the device 1 and be displayed as a graphic user interface element (not displayed) on the screen 3, which the user may adjust.
  • the device 1 has an implemented "dead-man control". This is to enhance safety, so that there is no risk of an accident in case the user, for example, drops the device 1 (an illustrative sketch of such a safeguard follows after this list).
  • the computer unit 9 may demand that the user have a finger or similar on the screen 3 to be able to remotely control the vehicle 2.
  • the device 1 may also be arranged to detect if it is dropped, for example by registering quick movements, quick angle changes, etc. with one or several of the inbuilt detectors, e.g. one or several accelerometers. The device 1 may then send control signals to the vehicle 2, so that it decelerates and/or stops, is set in a parking state or similar. Detected quick movements may also lock the speed and/or steering angle into the current position. In this manner, noise in the control signals is prevented.
  • a special gesture on the screen 3, in the form of, for example, the dragging of a finger from the edge of the screen to the middle, may also lock the speed and/or the steering angle in the current position.
  • the user may then control the vehicle 2 in the locked position, e.g. control the speed on a straight road section or at a constant curve.
  • the user may provide the second type of in-signals to the device 1 by way of a graphic user interface element in the form of a slider.
  • the slider may, for example, represent control of speed or steering angle. A change of the slider is interpreted and sent as control signals to the vehicle 2.
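
As a rough, non-authoritative illustration of the dead-man control and drop detection described in the list above, the following Python sketch checks whether the user still has a finger on the screen and whether a jolt typical of a dropped device has been registered. The acceleration threshold, the message format and the function names are assumptions for illustration only, not the patented implementation.

    DROP_ACCEL_THRESHOLD_G = 3.0    # hypothetical threshold for a "quick movement"

    def watchdog_step(finger_on_screen: bool, accel_magnitude_g: float, send_to_vehicle) -> str:
        """One iteration of a hypothetical dead-man/drop watchdog on the device.

        Illustrative sketch: if the user releases the screen or the device registers a jolt
        typical of being dropped, the device asks the vehicle to decelerate and stop.
        """
        if not finger_on_screen:
            send_to_vehicle({"request": "stop"})             # dead-man control released
            return "stopped: dead-man control released"
        if accel_magnitude_g > DROP_ACCEL_THRESHOLD_G:
            send_to_vehicle({"request": "stop"})             # probable drop of the device
            return "stopped: probable drop detected"
        return "ok"

    # Example: the device registers a strong jolt while the finger briefly stays on the screen.
    print(watchdog_step(finger_on_screen=True, accel_magnitude_g=4.2, send_to_vehicle=print))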

Abstract

A method comprising, at a mobile electronic device, which is arranged to communicate with and remotely control a vehicle in several driving modes: to receive a first type of in-signal, specifying a special driving mode for the vehicle, and in response to such a first type of in-signal: to set the electronic device into a special steering configuration, which means that the electronic device is configured to remotely control one or several functions in the vehicle, connected to the special driving mode, wherein subsequent second type in-signals to the electronic device may control the function or functions in real time; and to receive one or several second type in-signals specifying control of the function or functions of the vehicle, when the electronic device is in the special steering configuration, and to control the function or functions in the vehicle, connected to the special driving mode for the vehicle, in real time, based on the one or several in-signals of the second type. Also a mobile electronic device, a system, a computer program, a computer program product and a graphic user interface.

Description

Method and mobile device for steering a vehicle
Field of the invention
The present invention relates to technology in connection with remote control of a vehicle, and specifically to a method and a mobile electronic device to control a vehicle and its movement. The invention also relates to a system, a computer program, a software product and a user interface.
Background of the invention
Remote control of vehicles currently often forms part of many people's everyday life. Most modern vehicles are, for example, equipped with the function of remote control of the vehicle's lock, via the vehicle's electronic key, which simplifies the management of the vehicle. As the mobile phone has become part of the average person's everyday life, with the possibility of easily downloading software to the mobile phone, it is now also possible to carry out certain functions, such as remote control of the vehicle's lock, from the mobile phone. Through, for example, US2011/0137490 and US2008/0057929, it is prior art to control a number of functions in a vehicle via a program in a mobile phone. Examples of such functions include readings of tachographs, turning ABS or air conditioning on or off, locking or opening the vehicle, etc. Another example is shown by SE1350333-9, in which an autonomous vehicle is given high level commands, which it then processes and performs independently. Thus, the latter case does not involve direct remote control of the autonomous vehicle. More functions than ever are electronically controlled in a vehicle today. In the driver's cabin, the driver provides mechanical input signals by turning the steering wheel, actuating the accelerator pedal and brake pedal, changing the position of the gear lever etc. These mechanical input signals may be converted into electric signals, which are processed in the vehicle's control system to provide suitable control signals to the respective element in the vehicle, ensuring that the vehicle changes speed, direction, etc. The electronic processing in the vehicle also results in increased possibilities for remote control. GB2460326 illustrates an example of wireless remote control of a fork truck. Via a touch screen on a mobile phone or a similar device, the fork truck may be controlled and its lifting function may be adjusted. The control may be
implemented as a circular positioning element in an X-/Y-coordinate system on the touch screen. The deflection, and the degree of deflection, of the positioning element specify both the speed and the steering angle of the vehicle. It is specified that the lifting function of the fork truck may be controlled via function keys in the software, but this is not explained in detail.
When the movements of a vehicle are controlled remotely, it is very important that steering may be carried out in an intuitive manner, which facilitates a safe and accurate steering. It is therefore one objective of the invention to achieve a user-friendly interface for the user, resembling real driving of the vehicle to be remotely controlled, so that the remote control may be performed safely and intuitively.
Summary of the invention
According to a first aspect, the objective is achieved at least partly by way of a method, which comprises, at a mobile electronic device which is arranged to communicate with and to remotely control a vehicle in several driving modes:
- receiving a first type of in-signal, which specifies a specific driving mode for the vehicle, and in response to such a first type of in-signal:
- setting the electronic device into a special steering configuration, which means that the electronic device is configured to remotely control one or several functions in the vehicle, connected to the special driving mode, whereby continued in-signals to the electronic device may control the function or functions in real time; and
- receiving one or several in-signals of a second type, specifying the control of the function or functions of the vehicle, when the electronic device is in the special steering configuration, and - controlling the function or functions of the vehicle connected to the special driving mode for the vehicle in real time, based on the one or several in-signals of the second type. Wireless remote control of a vehicle facilitates smoother handling, when the driver wishes to control the vehicle from outside the vehicle. Since control of a function is connected to the current driving mode of the vehicle, in-signals in the form of one or several in-signals of the second type, detected after the first type of in-signal, will only be able to impact the vehicle in a predetermined manner. Since continued in-signals of the second type to the device only control one or perhaps two functions, there is greater freedom in handling the device. The device may e.g. be used as a hybrid joystick, whose movements impact, for example, the vehicle's speed. For example, fine manoeuvring of the vehicle when connecting a trailer or reversing is possible. A more efficient reversing process may then be obtained, since the driver does not need to get into and out of the driver's cabin to check the distance to the trailer or, for example, a terminal.
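
As a non-authoritative illustration of the method above, the following Python sketch shows how a device-side controller might map a first type of in-signal (choice of driving mode) to a steering configuration, and then route subsequent second type in-signals only to the functions connected to that mode. All names (DrivingMode, RemoteController, send_to_vehicle) and the mapping of functions to modes are assumptions for illustration, not the patented implementation.

    # Illustrative sketch only; names and structure are assumptions.
    from enum import Enum, auto

    class DrivingMode(Enum):
        FORWARD = auto()
        PARKING = auto()
        REVERSE = auto()

    # Hypothetical mapping: each driving mode exposes only the vehicle functions connected to it.
    FUNCTIONS_BY_MODE = {
        DrivingMode.FORWARD: {"speed", "steering_angle"},
        DrivingMode.REVERSE: {"speed", "steering_angle"},
        DrivingMode.PARKING: {"tipping_angle"},          # bodywork function example
    }

    class RemoteController:
        def __init__(self, send_to_vehicle):
            self.send_to_vehicle = send_to_vehicle       # callable that transmits wireless signals
            self.mode = None                             # special steering configuration not yet set

        def on_first_type_in_signal(self, mode: DrivingMode):
            # Set the special steering configuration for the selected driving mode.
            self.mode = mode

        def on_second_type_in_signal(self, function: str, value: float):
            # Only functions connected to the current driving mode may be controlled in real time.
            if self.mode is None or function not in FUNCTIONS_BY_MODE[self.mode]:
                return
            self.send_to_vehicle({"mode": self.mode.name, "function": function, "value": value})

    # Example usage (hypothetical): push "FORWARD", then request 5 km/h.
    controller = RemoteController(send_to_vehicle=print)
    controller.on_first_type_in_signal(DrivingMode.FORWARD)
    controller.on_second_type_in_signal("speed", 5.0)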
Truck functions may be controlled at road works or in the mining industry. At road works such as clearing of garbage, at installation of guide posts or crash barriers, laying new asphalt; sanding or salting the road, which must be carried out outside the driver's cabin, remote control makes the driver's job easier and more efficient, since the driver does not need to get into and out of the truck to adjust the truck's position, or when tipping the cargo. When manoeuvring vehicles in hazardous areas, such as near precipices and in hazardous mining areas, the driver may steer the vehicle to safer zones from a safe distance. Since wireless signals may be transferred via the Internet, vehicles may be controlled over large distances.
According to one embodiment, the special driving mode is one of: driving forward, parking and reversing. According to one embodiment, a vehicle's function is one of the vehicle's speed, acceleration, deceleration, steering wheel angle, or a bodywork feature such as tipping angle. According to one embodiment, the electronic device has a touch-sensitive screen and the method comprises:
- showing several user interface elements, each of which specifies a special driving mode for the vehicle;
- detecting a first type of in-signal in the form of touch or near touch of the touch sensitive screen, corresponding to a position for one of the user interface elements, and in response to such a detected touch or near touch:
- setting the electronic device into the special steering configuration, which is connected to the user interface element corresponding to the position.
According to one embodiment, the method comprises setting the electronic device into a special input state, wherein the electronic device is arranged to receive in-signals of the second type in the form of voice commands, input via a touch sensitive screen on the electronic device, input via movement of the electronic device or input via one or several push-buttons connected to the electronic device, wherein the electronic device comprises means to receive the respective in-signals of the second type. According to one embodiment, the method comprises setting the electronic device into the special input state at the same time as the electronic device is set into a special steering configuration.
According to one embodiment, the special input state comprises receiving in-signals of the second type, in the form of movement of the electronic device, wherein the method comprises:
- detecting with a movement sensor in the electronic device a second type of in-signal, in the form of movement of the electronic device in a predetermined one-, two- or three-dimensional coordinate system for the electronic device;
- generating and sending one or several signals specifying the detected movement to a control device in the vehicle; - controlling the vehicle's one or several functions connected to the driving mode, based on the detected movement.
According to one embodiment, the predetermined one-, two- or three-dimensional coordinate system for the electronic device is arranged in accordance with the shape of the electronic device.
According to one embodiment, the method comprises turning the electronic device around a predefined rotational axis, which controls a function of the vehicle connected to the driving mode.
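
Turning the device around such a rotational axis could, for example, be mapped to a speed request. The Python sketch below is a minimal, hypothetical mapping that also uses the adaptive speed interval given as an example later in this document (0° to 90° first corresponding to 0-30 km/h, then 0-50 km/h, then 0-80 km/h); the thresholds come from that example, while the function name and scaling are assumptions.

    def tilt_angle_to_speed_request(angle_deg: float, current_speed_kmh: float) -> float:
        """Map a turning angle (0-90 degrees) to a requested speed, with an adaptive top speed.

        Illustrative sketch: the interval end positions follow the example in the description
        (0-30, then 0-50, then 0-80 km/h); a real implementation would be calibrated.
        """
        angle_deg = max(0.0, min(90.0, angle_deg))        # clamp to the 0-90 degree interval
        if current_speed_kmh < 30.0:
            top_speed = 30.0
        elif current_speed_kmh < 50.0:
            top_speed = 50.0
        else:
            top_speed = 80.0
        return (angle_deg / 90.0) * top_speed             # 0 degrees -> 0 km/h, 90 degrees -> top speed

    # Example: device tilted 45 degrees while the vehicle is doing 40 km/h -> request 25 km/h.
    print(tilt_angle_to_speed_request(45.0, 40.0))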
According to one embodiment, the special input state comprises receiving in-signals of the second type via the touch sensitive screen, wherein the method comprises, in the special input state:
- showing the user interface element on the screen in the form of an axis or a coordinate system with several axes, wherein each axis provides for a possibility of controlling a function in the vehicle;
- detecting a second type of in-signal in the form of a movement on or movement near the touch sensitive screen, which corresponds to a movement related to one of the axes, and in response to such a detected movement:
- generating and sending one or several signals specifying the detected movement to a control device in the vehicle;
- controlling the vehicle based on the detected movement. According to one embodiment, the method comprises low-pass filtering of the detected second type of in-signals. The low-pass filtering may be adaptive and change depending on the environment or situation of the device.
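
A first-order exponential low-pass filter is one common way to realise such filtering; the sketch below, including the idea of adapting the smoothing factor to the situation, is an illustration only and not the specific filter of the invention. The class name and the default smoothing factor are assumptions.

    class AdaptiveLowPassFilter:
        """Exponential smoothing of second type in-signals (e.g. tilt angles).

        Illustrative sketch: alpha close to 1 follows the raw signal quickly, alpha close to 0
        suppresses jolts more strongly. An application could lower alpha in bumpy environments.
        """
        def __init__(self, alpha: float = 0.2):
            self.alpha = alpha
            self.state = None

        def set_alpha(self, alpha: float):
            # Adapt the filtering to the environment or situation of the device.
            self.alpha = min(1.0, max(0.0, alpha))

        def update(self, sample: float) -> float:
            if self.state is None:
                self.state = sample
            else:
                self.state = self.alpha * sample + (1.0 - self.alpha) * self.state
            return self.state

    # Example: a sudden jolt in the measured angle is strongly damped.
    f = AdaptiveLowPassFilter(alpha=0.2)
    for raw_angle in (10.0, 10.0, 60.0, 10.0):
        print(round(f.update(raw_angle), 1))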
According to a second aspect, the objectives are achieved at least partly via a mobile electronic device comprising: a transmitter of wireless signals; a computer device; a computer-readable memory, comprising a computer program P with computer instructions, and an input device arranged to receive a first type of in-signal, which specifies a special driving mode for the vehicle and sends it to the computer device. The computer device is arranged, in response to the first type of in-signal, specifying a special driving mode for the vehicle, to set the electronic device into a special steering configuration which means that the electronic device is arranged to remotely control one or several functions of the vehicle, connected to the vehicle's special driving mode, wherein continued in-signals to the electronic device may control the function or functions in real time. The computer device is also arranged to generate signals for one or several functions of the vehicle connected to the driving mode, based on the received second type of in-signals to the electronic device, which specify the control of the function or functions of the vehicle when the device is in the special control configuration, wherein the transmitter is arranged to send the signals to a receiver device in the vehicle. According to a third aspect, the objective is at least partly achieved through a system comprising a mobile electronic device and a receiver device, which is arranged to be placed in the vehicle, and which is arranged to receive one or several wireless signals from the electronic device and to send them to a suitable control device in the vehicle.
According to a fourth aspect, the objective is achieved at least partly through a computer program P, wherein said computer program P comprises program code to cause a computer unit and/or control device to carry out the method. According to a fifth aspect, the objective is achieved at least partly through a computer program product, comprising a program code stored on a computer- readable medium, in order to carry out the method when said program code is executed in a computer unit. According to one embodiment, the medium is a nonvolatile medium. According to a sixth aspect, the objective is at least partly achieved through a graphic user interface on an electronic mobile device with a touch sensitive screen, when the graphic user interface comprises:
- one or several user interface elements, each of which specifies a special driving mode for the vehicle; wherein, in response to a first type of in-signal in the form of a touch or near touch of the touch sensitive screen, corresponding to a position for one of the user interface elements:
- a second generation user interface element is shown on the touch sensitive screen, in the form of an axis or a coordinate system with several axes, wherein each axis represents control of a function in the vehicle; wherein, in response to a second type of in-signal, in the form of a movement on or movement near the touch sensitive screen, corresponding to a movement related to one of the axes:
- the detected movement is shown. Preferred embodiments are described in the dependent claims and in the detailed description.
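
As a rough, hypothetical illustration of such a graphic user interface element, the Python sketch below converts a touch position in a two-axis coordinate system (as in Fig. 8) into a steering request along the horizontal axis and a speed request along the vertical axis. The pixel half-range, the maximum steering angle and the maximum speed are assumptions, not values from the patent.

    def touch_to_requests(x_px: float, y_px: float,
                          centre_x_px: float, centre_y_px: float,
                          max_steer_deg: float = 30.0, max_speed_kmh: float = 30.0):
        """Map a touch on the screen to (steering request, speed request).

        Illustrative sketch: the crossing point of the axes corresponds to driving straight
        forward at zero requested speed; right of the vertical axis steers right, above the
        horizontal axis increases speed. Pixel ranges and maxima are hypothetical.
        """
        half_range_px = 400.0                                            # assumed usable half-range
        nx = max(-1.0, min(1.0, (x_px - centre_x_px) / half_range_px))   # normalise to -1..1
        ny = max(-1.0, min(1.0, (centre_y_px - y_px) / half_range_px))   # screen y grows downwards
        return nx * max_steer_deg, ny * max_speed_kmh

    # Example: a touch 200 px to the right of and 100 px above the crossing point.
    steer, speed = touch_to_requests(800.0, 500.0, centre_x_px=600.0, centre_y_px=600.0)
    print(steer, speed)   # 15.0 degrees to the right, 7.5 km/h requested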
Brief description of the enclosed figures
The invention is described below with reference to the enclosed figures, of which: Fig. 1 shows a system according to one embodiment.
Fig. 2 shows a block diagram of the electronic mobile device according to one embodiment.
Fig. 3 shows a first graphic interface on the electronic mobile device according to one embodiment.
Fig. 4 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to one embodiment.
Fig. 5 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment.
Fig. 6 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment.
Fig. 7 illustrates movement of the electronic mobile device, in order to control a function in the vehicle according to another embodiment. Fig. 8 shows a second graphic interface on the electronic mobile device according to one embodiment.
Fig. 9 shows a flow chart for the method according to one embodiment. Detailed description of preferred embodiments of the invention
Fig. 1 shows a system 15 to remotely control a vehicle 2. The vehicle 2 is here displayed in the form of a truck, but may be another commercial vehicle, such as a boom truck or a passenger car. The system 15 comprises a mobile electronic device 1, hereafter referred to as the device 1, which is portable and which may e.g. be a mobile phone, a programmable PDA (Personal Digital Assistant), a tablet or similar. The term "mobile" means that the electronic device 1 is portable by a human. The system 15 also comprises a receiver device 4A, which is arranged to be placed in the vehicle 2 to be remotely controlled. The receiver device 4A is arranged to receive one or several wireless signals from the device 1, and to send it or them to a suitable control device 5 in the vehicle 2. The control device 5 may e.g. be an ECU (Electronic Control Unit). The system 15 may also comprise a transmitter unit 4B, which is arranged to be placed in the remotely controlled vehicle 2. The transmitter unit 4B is arranged for wireless
communication with the device 1. The device 1 may be equipped with a touch screen 3 and one or several keys or push-buttons 7. The device 1 is arranged for wireless communication 6 with the vehicle 2 via the receiver device 4.
Fig. 2 shows a block diagram to schematically illustrate a number of units in the device 1. The device 1 comprises a transmitter 8A of wireless signals. The transmitter 8A may, for example, be intended for radio communication, and the wireless signals are in this case radio signals. The transmitter 8A may be intended to communicate via GSM (Global System for Mobile Communications), UMTS (Universal Mobile Telecommunications System), LTE Advanced (Long Term Evolution Advanced) or Internet, and the device 1 may in such case be equipped with a suitable 2G, 3G or 4G chip. Other alternative transmitters 8A may be intended for WLAN (Wireless LAN) or Bluetooth. The device 1 may also comprise a receiver 8B of wireless signals. The receiver 8B may, for example, be intended for radio communication and receive wireless signals in the form of radio signals. Alternatively, the receiver 8B may be intended to receive wireless signals via GSM, UMTS, LTE Advanced or Internet, and the device 1 may be equipped with a suitable 2G, 3G or 4G chip for this purpose. Other alternative receivers 8B may be intended for WLAN (Wireless LAN) or Bluetooth. The device 1 may also comprise several of the above mentioned transmitters and receivers.
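
As a minimal sketch of how the device 1 might send wireless signals to the receiver device 4A over an IP-based link such as WLAN or the Internet (two of the alternatives listed above), the following uses UDP and a JSON message. The address, port and message format are assumptions for illustration only and are not specified by the patent.

    import json
    import socket

    RECEIVER_ADDRESS = ("192.168.0.42", 5005)   # hypothetical address of the receiver device 4A

    def send_control_signal(function: str, value: float) -> None:
        """Send one control message from the device 1 to the vehicle over UDP (illustration only)."""
        message = json.dumps({"function": function, "value": value}).encode("utf-8")
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(message, RECEIVER_ADDRESS)

    # Example: request a speed of 5 km/h. Remote control means such messages are sent continuously.
    send_control_signal("speed", 5.0)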
The device 1 also comprises a computer unit 9 with a processor device 10A and a memory unit 10B. On the memory unit 10B a computer program P is stored, with instructions that may cause the computer unit 9 to carry out the steps according to the method described herein. According to one embodiment, parts of the method are carried out on a control device 5 in the vehicle 2. According to one embodiment, the memory unit 10B is a part of the processor device 10A. The processor device 10A may comprise one or several CPUs (Central Processing Unit). A part of the method, for example certain calculations, may be carried out by the user interface 18. The memory unit 10B comprises a memory, for example a non-volatile memory, such as a flash memory, or a volatile memory, such as a RAM (Random Access Memory). The device 1 comprises one or several input units 3, 7, 11, 17. A user may supply in-signals to the device 1 via an input unit 3, 7, 11, 17. The device 1 is, as previously described, equipped with a screen 3 that may be an input device in the form of a touch sensitive screen 3. The touch sensitive screen 3 may, for example, be in the form of a display with a superimposed touch sensitive surface, or a display with an integrated touch sensitive surface. Graphic images, text etc. may be displayed on the display. The touch sensitive screen 3 may be a conventional touch sensitive screen that may receive in-signals from one or several simultaneous touches, or near touches. The term near touch means, in this context, that the touching object, such as the user's finger, does not touch the touch sensitive surface, but is very near the surface. The touch sensitive screen may e.g. detect a touch, near touch and/or movement by acting resistively, capacitively or by using infra-red light. A user interface 18 in the device 1, which may form part of the computer unit 9 or be connected to the computer unit 9, may communicate with the touch sensitive screen 3 and ensure that a detected touch, near touch and/or movement, for example a gesture, is processed. The user interface 18 may comprise one or several GPUs (Graphics Processing Unit). The user interface 18 is also arranged to display graphic interfaces on the screen 3. The user interface 18 is also arranged to associate a touch etc. at a certain position on the screen 3 with the graphic interface element displayed at that position, when a touch corresponding to the position is detected.
The device 1 may, as previously described, also be equipped with an input device 7 in the form of one or several keys or push-buttons 7, as well as a microphone 17 and/or a loudspeaker in the device 1. Furthermore, the device 1 may be equipped with a vibrator 16, in order to provide haptic feedback to the user in the form of vibrations. Virtual vibration of a certain part of the graphic interface on the device 1, or vibration of a special interface element, may also be achieved, in order to e.g. catch the user's attention, to indicate that something may not be changed, to simulate pushing a button, to simulate the change of a slider, etc. The device 1 may also be equipped with an input device 11, in the form of a motion detector 11 that may detect the position of the device 1. The motion detector 11 may, for example, comprise one or several accelerometers, one or several magnetometers and/or a gyro. An accelerometer registers a movement of the device 1 along an axis, or a rotation that places the device 1 at a different angle in relation to gravity. A magnetometer registers magnetic fields, and a change of the magnetic field when the device 1 changes position. A gyro registers the orientation of the device 1. By inclining the device to different extents at different angles, the device 1 may be used to remotely control the vehicle 2 in an intuitive manner. The angle or angles of the device in relation to gravity are determined, and such angle or angles are transformed into control signals in the device 1 or the vehicle 2. According to one embodiment, signals from two or more sources are merged into values in the Euclidean coordinate system in the motion detector 11. For example, signals from the sources accelerometer, magnetometer and gyro may be used. The merge may, for example, be carried out with the assistance of a suitably selected filter, e.g. a Kalman filter. The sampling frequency for detection of angles etc. may, according to one embodiment, be set in the device 1. The device 1 may also be arranged to receive vibrations as in-signals to the device 1.
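As a minimal illustration of how such sensor fusion might look in software, the sketch below blends gyro and accelerometer samples into a single tilt angle. It is only an assumption-laden stand-in for the Kalman-filter merge mentioned above: the axis convention, the filter coefficient and the function names are not taken from the document.

```python
import math

def tilt_from_accel(ay: float, az: float) -> float:
    """Tilt angle (radians) around the x-axis, estimated from gravity alone."""
    return math.atan2(ay, az)

def fuse_tilt(prev_angle: float, gyro_rate_x: float, ay: float, az: float,
              dt: float, alpha: float = 0.98) -> float:
    """Blend the integrated gyro rate with the accelerometer tilt estimate.

    A complementary filter is used here as a simple stand-in for the
    Kalman-filter fusion described in the text: the gyro term tracks fast
    motion, the accelerometer term removes long-term drift.
    """
    gyro_angle = prev_angle + gyro_rate_x * dt   # integrate angular rate
    accel_angle = tilt_from_accel(ay, az)        # gravity-referenced angle
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle
```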
As mentioned previously, the vehicle 2 is arranged to receive wireless signals from the device 1 via the receiver unit 4A in the vehicle 2. The receiver unit 4A is arranged to receive the wireless signals transmitted by the transmitter 8A in the device 1. The receiver unit 4A is also arranged to forward the wireless signals to one or several control devices 5 in the vehicle. The receiver unit 4A and the control devices 5 are arranged to communicate via an internal network in the vehicle 2, for example a CAN bus (Controller Area Network), Ethernet, TCP/IP or another wired or wireless connection. A control device 5 may, for example, be intended for control of the vehicle's speed, control of the vehicle's direction, control of the gear lever, control of a bodywork on the vehicle, etc. The wireless signals may be in the form of ready control signals, which directly control a function in the vehicle. Alternatively, the wireless signals may be in the form of more or less processed out-signals from the input devices 3, 7, 11, 17, which are received and converted into control signals for a function by the receiver unit 4A or by a control device 5.
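Purely as an illustration of the forwarding step, the sketch below repacks a speed request received over the wireless link into a CAN frame using the python-can library. The arbitration ID, the signal scaling and the bus channel are invented for the example and are not details given in the document.

```python
import can  # python-can; the IDs and channel below are illustrative assumptions

SPEED_REQUEST_ID = 0x123  # hypothetical arbitration ID for a speed request

def forward_speed_request(bus: can.BusABC, speed_kmh: float) -> None:
    """Repack a speed request received over the wireless link as a CAN frame."""
    raw = max(0, min(0xFFFF, int(speed_kmh * 100)))  # assumed 0.01 km/h resolution
    msg = can.Message(arbitration_id=SPEED_REQUEST_ID,
                      data=raw.to_bytes(2, "big"),
                      is_extended_id=False)
    bus.send(msg)

# Example (commented out; requires a configured CAN interface):
# bus = can.interface.Bus(channel="can0", bustype="socketcan")
# forward_speed_request(bus, 12.5)
```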
Remote control entails continuous sending of wireless signals from the device 1 to the vehicle 2; these signals give rise to control signals that control one or several functions in the vehicle 2. A control signal may comprise a speed request, an acceleration request or control commands. These signals are interpreted and executed by the vehicle 2 in real time, so that the user's commands are carried out.
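The continuous sending could, for example, be organised as a fixed-rate loop on the device. The sketch below is only illustrative: the 20 Hz rate, the UDP transport, the address and the JSON message layout are all assumptions, not details from the document.

```python
import json
import socket
import time

SEND_PERIOD_S = 0.05  # 20 Hz; an assumed rate, not specified in the text

def control_loop(read_inputs, vehicle_addr=("192.168.0.10", 9000)) -> None:
    """Continuously turn device inputs into wireless control signals.

    `read_inputs` is a caller-supplied function returning the current
    (speed_request_kmh, steering_angle_deg) pair; address and message
    format are illustrative assumptions.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        speed, steering = read_inputs()
        payload = json.dumps({"speed_kmh": speed, "steering_deg": steering})
        sock.sendto(payload.encode("utf-8"), vehicle_addr)
        time.sleep(SEND_PERIOD_S)
```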
The vehicle 2 may be arranged to send wireless signals to the device 1 via the transmitter unit 4B. These signals may comprise data about the vehicle's driving mode or status, such as for example speed, acceleration, turning radius, etc., data indicating whether the wireless signals from the device 1 to the receiver unit 4A have been received, whether the device 1 and the vehicle 2 have established a secure data connection for the wireless signals, whether the distance between the device 1 and the vehicle 2 is too great for good quality wireless transmission, etc. In this manner, the vehicle 2 may provide feedback to the device 1, which may improve the remote control of the vehicle 2, since the user receives feedback on their commands. This information may e.g. be displayed in the form of text messages on the screen 3 or given as voice feedback. The vehicle 2 may also be equipped with one or several cameras (not shown), and feedback from these in the form of one or several video streams may be displayed on the screen 3.
Depending on the vehicle's 2 driving mode, data from different cameras may be displayed. For example, data from a forward view camera in the vehicle 2 may be displayed when the vehicle 2 is in the driving mode "forward", and data from a rear view camera in the vehicle 2 may be displayed when the vehicle is in the driving mode "reverse".
If the device 1 loses an established connection with the vehicle 2, e.g. because of an error in the components, because the wireless signals are not received, etc., the control device 5 in the vehicle 2 may be arranged to decelerate or stop the vehicle 2.
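One common way to realise this on the vehicle side is a watchdog that stops the vehicle when commands stop arriving. The sketch below is a minimal illustration under assumed values; the 0.5 s timeout and the callback name are not specified in the document.

```python
import time

TIMEOUT_S = 0.5  # assumed limit; the document does not specify a value

class RemoteControlWatchdog:
    """Decelerate and stop the vehicle if wireless commands stop arriving."""

    def __init__(self, stop_vehicle, timeout_s: float = TIMEOUT_S):
        self._stop_vehicle = stop_vehicle   # callback into the control device
        self._timeout_s = timeout_s
        self._last_rx = time.monotonic()

    def on_command_received(self) -> None:
        """Call whenever a wireless control signal arrives."""
        self._last_rx = time.monotonic()

    def poll(self) -> None:
        """Call periodically from the control device's main loop."""
        if time.monotonic() - self._last_rx > self._timeout_s:
            self._stop_vehicle()
```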
The device 1 thus functions like a hybrid joystick for the vehicle 2. Fig. 9 illustrates a method that lets the device 1 function in an intuitive manner for the user, for example the driver. The method will now be explained with reference to the flow chart in this figure, and to the various examples shown in Figures 3-8.
According to the method, the input device 3, 7, 11 receives a first type of in-signal, which specifies a special driving mode for the vehicle 2 (A1). A special driving mode may, according to one embodiment, be one of: driving forward, parking and reversing. The special driving modes correspond to the different positions of the vehicle's gear lever. This first type of in-signal may be detected by any of the input devices explained above in connection with the device 1. The input device 3, 7, 11 then sends the detected first type of in-signal to the computer unit 9. In response to the first type of in-signal, specifying a special driving mode for the vehicle 2, the computer unit 9 is arranged to set the electronic device 1 into a special steering configuration, which means that the electronic device 1 is arranged to remotely control one or several functions of the vehicle 2 connected to the special driving mode (A2). By setting the electronic device 1 into a special steering configuration, subsequent in-signals of the second type to the electronic device 1 may control one, two or several predefined functions, connected to the special driving mode, in the vehicle 2 in real time. One or several in-signals of the second type are thus received via an input unit 3, 7, 11, specifying the control of the function or functions of the vehicle 2 when the electronic device 1 is in the special steering configuration (A3), and the function or functions of the vehicle 2 are controlled in real time in connection with the special driving mode for the vehicle 2, based on the one or several in-signals of the second type (A4). According to one embodiment, a function of the vehicle 2 is one of the vehicle's speed, acceleration, deceleration, steering wheel angle, or a bodywork feature such as tipping angle. According to one embodiment, it is possible at any time to transition from one driving mode or function to another. This is illustrated by way of arrows from the various steps A2-A4 to the step A1 in the flow chart in Fig. 9. A first type of in-signal is thus hierarchically placed above a second type of in-signal. A first type of in-signal may, for example, be a sweeping gesture on the screen 3, providing direct access to a special driving mode or bodywork function. This will be explained in further detail below.
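A minimal sketch of this hierarchy is given below: a first-type in-signal always (re)selects the steering configuration, while a second-type in-signal is only acted on if it controls a function belonging to the selected driving mode. The grouping of functions per mode follows the examples in the text, but the names and data structures are assumptions.

```python
from enum import Enum, auto

class DrivingMode(Enum):
    FORWARD = auto()
    PARKING = auto()
    REVERSE = auto()

# Which vehicle functions a second-type in-signal may control per mode;
# the grouping is an assumption based on the examples in the text.
FUNCTIONS_PER_MODE = {
    DrivingMode.FORWARD: {"speed", "steering"},
    DrivingMode.REVERSE: {"speed", "steering"},
    DrivingMode.PARKING: {"tipping"},
}

class SteeringConfiguration:
    def __init__(self):
        self.mode = None

    def handle_first_type(self, mode: DrivingMode) -> None:
        """A first-type in-signal always wins: it (re)selects the mode."""
        self.mode = mode

    def handle_second_type(self, function: str, value: float):
        """Second-type in-signals are ignored unless they match the mode."""
        if self.mode is None or function not in FUNCTIONS_PER_MODE[self.mode]:
            return None
        return (function, value)  # would be turned into a wireless signal
```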
The computer unit 9 is arranged to receive the second type of in-signals and/or data from the units in the device 1, to keep track of which steering configuration it is in and which driving mode has been selected, and to process the signals and/or data depending on the steering configuration and the driving mode. The computer unit 9 may, for example, choose to ignore certain signals or data, depending on the steering configuration.
Fig. 3 shows an example of a graphic user interface, which may be displayed on the touch sensitive screen 3 on the device 1 to receive the first type of in-signals. The device 1 is here in the form of a mobile phone with a rectangular design. The graphic user interface here comprises three different user interface elements 12A, 12B, 12C, each of which specifies a special driving mode for the vehicle 2. The user interface element 12A at the far left is in the form of the text "FORWARD", which corresponds to "DRIVE" on the gear lever of a vehicle. The user interface element 12B in the middle is in the form of the text "PARKING", which corresponds to "PARK" on the gear lever of a vehicle. The user interface element 12C to the right is in the form of the text "REVERSE", which corresponds to "REVERSE" on the gear lever of a vehicle. These texts are mere examples and may be different. When a first type of in-signal in the form of a touch or near touch on the touch sensitive screen 3, by e.g. the user's finger, corresponding to a position for one of the user interface elements 12A, 12B, 12C, is received, the device 1 is set into the special steering configuration which is connected to the user interface element 12A, 12B, 12C corresponding to the position. If the user, for example, pushes "FORWARD", the user may now steer the vehicle 2 in its forward movement. The user may, for example, influence the vehicle's speed, reduce the speed to zero, turn the vehicle when it drives forward, etc. If the user instead pushes "REVERSE", the user may steer the vehicle 2 while it moves backwards. The user may, for example, influence the vehicle's reversing speed, reduce the speed to zero, turn the vehicle when it reverses, etc. If the user instead pushes "PARKING", the user may control the vehicle 2 when it is parked. For example, the user may control bodywork functions in the vehicle 2, such as tipping of the tipper, etc.
Alternatively, the user may go directly to a function position to control a bodywork function without passing via "PARKING". A suitable user interface element may in this case be available specifically for this bodywork function. When a driving mode has been selected, subsequent in-signals of the second type to the device 1 will thus make it possible to control certain functions, connected to the driving mode, in the vehicle 2. This makes it possible to use the device 1 in order to control the vehicle 2 in as intuitive a manner as possible.
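As an illustration only, selecting the steering configuration from a touch could be done by hit-testing the touch position against the screen rectangles of the three elements; the rectangle coordinates below are placeholders, not values from the document.

```python
from typing import Optional, Tuple

# (x0, y0, x1, y1) screen rectangles for the three elements; the layout
# values are placeholders, not taken from the document.
MODE_ELEMENTS = {
    "FORWARD": (0, 0, 360, 200),
    "PARKING": (360, 0, 720, 200),
    "REVERSE": (720, 0, 1080, 200),
}

def mode_at(touch: Tuple[int, int]) -> Optional[str]:
    """Return the driving mode whose user interface element was touched."""
    x, y = touch
    for mode, (x0, y0, x1, y1) in MODE_ELEMENTS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return mode
    return None
```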
The device 1 may be set into a special input state, wherein the device 1 is arranged to receive the second type of in-signals in the form of voice commands, input via the touch sensitive screen 3 on the electronic device 1 , input via movement of the electronic device 1 , or input via one or several push-buttons 7 connected to the electronic device 1 . Voice commands may e.g. be received via the microphone 17 in the device 1 . The microphone 17 may be an in-built microphone in the device 1 . For example, the device 1 may be set into the special input state at the same time as the electronic device 1 is set into a special steering configuration.
The user may alternate between the different driving modes shown in Fig. 3. By, for example, performing a sweeping gesture with the finger on the screen 3, in either direction on the screen 3, the user may "switch" the vehicle 2 to a different driving mode. For example, the user may first go into the driving mode "FORWARD" and drive 10 metres. Subsequently, the user enters the driving mode "PARKING" by making a sweeping gesture on the left side of the screen, whereupon the user interface element "PARKING" appears, and then pushes "PARKING", whereupon a user interface element appears on the screen 3, specifying the bodywork function "TIPPING", i.e. the function of tipping the tipper.
Alternatively, "TIPPING" may be available as a function that may be reached without passing via "PARKING". The user selects "TIPPING" by pointing at the element, and tips the tipper 30°. Subsequently, the user goes into the driving mode "FORWARD" by making a sweeping gesture on the right side, and drives the vehicle 10 metres. Subsequently, the user returns to "TIPPING" and tips the tipper another 10°. Subsequently, the user returns the tipper to its original position and completes the tipping. Figures 4, 5, 6, and 7 illustrate a case, where the special input state comprises receiving the second type of in-signals in the form of movement of the device 1 . When the device 1 is in a special steering configuration, receiving movement as the second type of in-signals, the device 1 may be turned or inclined and give rise to the second type of in-signals, controlling the vehicle 2 in one of its selected driving modes. The movement may be detected with the motion detector 1 1 (Fig. 2), which may detect the position of the device 1 . The motion detector 1 1 or the computer unit 9 define a one-, two- or three-dimensional coordinate system for the electronic device 1 . The computer unit 9 receives the detected second type of in- signals and generates one or several wireless signals to a control device 5 in the vehicle 2, which signals specify the detected movement. The vehicle's one or several functions connected to the driving mode are then based on the detected movement. The predetermined one-, two- or three-dimensional coordinate system for the electronic device 1 is, according to one embodiment, arranged in accordance with the shape of the device. In this manner, the user may, in an intuitive manner, control the vehicle 2 with the help of the device 2. For example, turning the electronic device 1 around a predefined rotational axis running along one of the device's sides may control a function of the vehicle 2 connected to the driving mode, which will be explained in further detail below.
Fig. 4 illustrates an imagined rotational axis 19A in an x-axis direction for the device 1, around which the device 1 may be turned. The device 1 has a rectangular design, whose longest side 20A runs along the rotational axis 19A. The short sides 21A, 21B of the device run in the direction of a z-axis when the device 1 is in a vertical position. When the device 1 is in a horizontal position, the device's short sides 21A, 21B run in the direction of a y-axis. By turning the device 1 around the rotational axis 19A, the motion detector 11 detects the movement, and a turning angle specifying how much the device 1 has been turned around the rotational axis 19A may be determined. This turning angle may be translated into a desired speed for the vehicle. For example, an interval for the device 1 may be set, from a turning angle φ = 0° when the device 1 is in a vertical position, to φ = +90° when the device 1 is in a horizontal position on the positive y-axis. The vehicle's speed may then be adjusted within this interval, where 0° means 0 km/h and 90° means the vehicle's maximum speed, for example 80 km/h. According to one embodiment, the interval may be adjusted according to the vehicle's speed, so that the interval has adaptive end positions and facilitates more accurate driving. Either or both of the bottom and the top speed in the interval may be adjusted. For example, 0° to 90° may first entail a speed increase of 0-30 km/h; when 30 km/h has been reached, the interval 0° to 90° is adjusted to 0-50 km/h, and when 50 km/h has been reached, the interval 0° to 90° is adjusted to 0-80 km/h. The same turning angle thus gradually gives rise to an increased speed change, by adjusting the top speed in the interval. The bottom speed may, as an example, be adjusted so that the interval 0° to 90° corresponds to 50-80 km/h once 50 km/h has been reached, etc. The user is able, according to one embodiment, to set the angle interval, speed interval, speed limits, etc. in the device 1. For example, one or several graphic user interface elements, corresponding to controls via which the user may change one or several end positions in one of the intervals, may be displayed on the screen 3. In the same manner, an interval for the device 1 may be set, from a turning angle φ = 0° when the device 1 is in a vertical position, to φ = -90° when the device 1 is in a horizontal position on the negative y-axis. For example, the vehicle's reversing speed may be controlled in this interval, where 0° means 0 km/h and -90° means, for example, the vehicle's maximum speed when reversing, e.g. 30 km/h. This interval may also be adjusted according to the vehicle's speed, so that the interval has adaptive end positions and facilitates more accurate driving.
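As a minimal sketch of this mapping, assuming the example figures above (30, 50 and 80 km/h) and a linear relation between turning angle and speed within the current interval, the requested speed and the adaptive top of the interval could be computed as follows.

```python
def speed_request(turn_angle_deg: float, current_top_kmh: float) -> float:
    """Map the turning angle (0-90 deg) linearly onto 0..current_top_kmh."""
    angle = max(0.0, min(90.0, turn_angle_deg))
    return angle / 90.0 * current_top_kmh

def adapt_top_speed(vehicle_speed_kmh: float) -> float:
    """Adaptive end position of the interval, using the example figures above."""
    if vehicle_speed_kmh < 30.0:
        return 30.0
    if vehicle_speed_kmh < 50.0:
        return 50.0
    return 80.0
```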
In the various driving modes, it is possible to request braking by, for example, inclining the device in the negative angle interval. The more pronounced the inclination, the more powerful the braking requested. The angle interval may, according to one embodiment, be set by the user according to their needs. Fig. 5 illustrates an imagined rotational axis 19B in a y-axis direction for the device 1, around which the device 1 may be turned. The rotational axis 19B is here placed in the centre of the device 1, but might be placed elsewhere. The device 1 here also has a rectangular design, whose longest sides 20A, 20B run along the direction of an x-axis. The device's short sides 21A, 21B run in the direction of a z-axis when the device 1 is in a vertical position. By turning the device 1 around the rotational axis 19B, the motion detector 11 detects the movement, and a turning angle α, specifying how much the device 1 has been turned around the rotational axis 19B, may be determined. This turning angle may be translated into a desired steering or turning of the wheels of the vehicle 2. Driving straight forward may correspond to no turning of the device 1; the direction of the z-axis thus specifies the zero position. Turning around the rotational axis 19B in a positive direction thus produces a turn to the right, and turning around the rotational axis 19B in a negative direction produces a turn to the left.
The desired steering or turning of the wheels may be speed-dependent. If the vehicle 2 has a high speed, the curve radius becomes greater than if the vehicle 2 had a lower speed. The sensitivity of the device may thus be adjusted according to the situation. The device 1 may know the vehicle's speed via the speed-related signals sent from the device 1 to the vehicle 2. Alternatively, the vehicle's actual speed may be fed back to the device 1.
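One possible way to realise such speed-dependent sensitivity is to scale the device's turning angle by a gain that falls with speed, as in the sketch below. The gain law, the 35° wheel-angle limit and the function names are assumptions made only for illustration.

```python
MAX_WHEEL_ANGLE_DEG = 35.0  # assumed physical limit of the steered wheels

def steering_request(device_angle_deg: float, vehicle_speed_kmh: float) -> float:
    """Translate the device's turning angle alpha into a wheel angle request.

    Sensitivity drops with speed so that the same device angle gives a wider
    curve radius at high speed; the 1/(1 + v/30) law is an assumption.
    """
    gain = 1.0 / (1.0 + vehicle_speed_kmh / 30.0)
    wheel_angle = device_angle_deg * gain
    return max(-MAX_WHEEL_ANGLE_DEG, min(MAX_WHEEL_ANGLE_DEG, wheel_angle))
```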
When a driving mode has been selected, and the selection entails that the vehicle is in a driving mode corresponding to it being parked, i.e. that it is at a standstill, the bodywork functions may be controlled. For example, a second generation user interface element (not shown) may be displayed on the screen 3, facilitating control of various bodywork functions. A function may be selected by, for example, pointing at one of the user interface elements. Such a selection may be for control of the bodywork function of tipping the vehicle's tipper or tub, and is illustrated in Figures 6-7. Fig. 6 illustrates an imagined rotational axis 19C in a y-axis direction for the device 1, around which the device 1 may be turned. The rotational axis 19C is here placed along the extension of one of the short sides. The device 1 here also has a rectangular design, whose longest sides 20A, 20B run along the direction of an x-axis. The device's short sides 21A, 21B run in the direction of a y-axis when the device 1 is in a horizontal position. By turning the device 1 around the rotational axis 19C, the motion detector 11 detects the movement, and a turning angle β, specifying how much the device 1 has been turned around the rotational axis 19C, may be determined. This turning angle may be translated into a desired tipping angle of a tipper on the vehicle 2. The translation may be direct, or alternatively adapted with adaptive end positions on the turning angle interval. Tipping angle means the angle of the vehicle's tipper or tub in relation to the vehicle's frame when it is tipped. If no tipping occurs, the tipping angle is usually around zero degrees.
Fig. 7 illustrates an imagined rotational axis 19D in a y-axis direction for the device 1, around which the device 1 may be turned. The rotational axis 19D is here placed along the extension of the y-axis, in one of the device's corners. The device 1 here also has a rectangular shape, whose longest sides 20A, 20B run along the direction of an x-axis when the device 1 is placed so that a turning angle μ around the rotational axis 19D is zero. The short sides 21A, 21B of the device run in the direction of a z-axis when the turning angle μ around the rotational axis 19D is zero. By turning the device 1 around the rotational axis 19D, the motion detector 11 detects the movement, and a turning angle μ, specifying how much the device 1 has been turned around the rotational axis 19D, may be determined. This turning angle may be translated into a desired tipping angle of a tipper or tub on the vehicle 2. The translation may be direct, or alternatively adapted with adaptive end positions on the turning angle interval. Fig. 7 also shows a graphic slider 23, with which the user may tune the tipping angle to which the loading tub should be tipped. The angle may be displayed visually by way of a graphic representation of the input angle on the screen, here 15°. Fig. 8 illustrates the special input state comprising receiving the second type of in-signals via the touch sensitive screen 3 on the device 1, according to one embodiment. In the special input state, the touch sensitive screen 3 shows a second generation user interface element 13A, 13B in the form of a coordinate system with two axes 13A, 13B. The second generation user interface element 13A, 13B may, however, instead be only one axis 13A or 13B. Each axis 13A, 13B provides a possibility of controlling a function in the vehicle 2. In response to a second type of in-signal in the form of a movement on, or near, the touch sensitive screen 3, corresponding to a movement related to one of the axes, signals are generated, which may then be sent to the vehicle 2 and control the function or functions. The horizontal axis 13A may, for example, correspond to control of the vehicle's steering wheel. Where the horizontal axis 13A crosses the vertical axis 13B, the vehicle's direction is straight forward. To the right of the vertical axis 13B the steering wheel is turned to the right, and to the left of the vertical axis 13B the steering wheel is turned to the left. The vertical axis 13B may, for example, correspond to the vehicle's speed. Above the horizontal axis 13A the vehicle's speed is increased, and below the horizontal axis 13A the vehicle's speed is reduced. One or several in-signals of the second type, in the form of a movement on or near the touch sensitive screen 3, corresponding to a movement related to one of the axes, may be detected by the device 1 via the user interface module 18. In response to such a detected movement, one or several signals specifying the detected movement are generated by the computer unit 9 and sent by the transmitter 8A to a control device 5 in the vehicle 2. The vehicle 2 is then controlled based on the detected movement. By pointing and dragging with a finger or similar on the screen 3, it is thus possible to control the vehicle's direction and speed. The finger's position may, for example, be displayed with a circle 14, a point or similar. In this manner the detected movement may be displayed. The user may also have the option of letting the circle 14 remain on the screen 3 when the user has released contact with the screen 3. In this manner, the user is able to view the speed or steering of the vehicle at the current time. The vehicle may continue to be controlled according to the specification provided by the circle 14; the user may thus release the screen 3 and let the vehicle continue to be controlled according to the requests the user made via the circle 14.
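A minimal sketch of this touch mapping is given below, assuming the axes cross at a known screen origin and that the offset from the origin scales linearly to steering and speed requests; the scale factor and units are assumptions.

```python
from typing import Tuple

def touch_to_commands(x: float, y: float,
                      origin: Tuple[float, float],
                      scale: float = 0.2) -> Tuple[float, float]:
    """Convert a touch position into steering and speed requests.

    `origin` is the screen position where the two axes cross; to the right
    of it the steering request is positive (right turn), above it the speed
    request is positive. The scale factor is an illustrative assumption.
    """
    ox, oy = origin
    steering = (x - ox) * scale   # e.g. degrees of steering wheel angle
    speed = (oy - y) * scale      # screen y grows downwards
    return steering, speed
```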
According to one embodiment, the user may draw up a route in a suitable coordinate system (not displayed), which is displayed on the screen 3. The route is received as the second type of in-signals and is converted into suitable control signals for the vehicle's speed, steering, etc. For example, the control signals may comprise driving forward 20 metres, and then turning with a radius of 50 metres. The user may be assisted by e.g. a user interface element displaying a ruler with measurements, indicating how long a distance the user must draw in order to make the vehicle move a certain distance. The assistance thus entails a kind of mapping between a drawn in-signal and reality.
According to one embodiment, the method comprises low pass filtering of the detected second type in-signals. The device 1 may then comprise a low pass filter that filters away undesired signals such as jolts. In this manner, a more robust control may be achieved.
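As an illustration, a first-order exponential filter is one simple way to realise such low pass filtering of the in-signals; the smoothing coefficient below is an assumed value.

```python
class LowPass:
    """First-order low-pass filter for the second type of in-signals."""

    def __init__(self, alpha: float = 0.2):
        self._alpha = alpha   # smaller alpha = stronger smoothing
        self._state = None

    def __call__(self, sample: float) -> float:
        if self._state is None:
            self._state = sample           # initialise on the first sample
        else:
            self._state += self._alpha * (sample - self._state)
        return self._state
```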
According to one embodiment, the device 1 may also receive in-signals in the form of voice commands, which are interpreted in the device 1 and forwarded as control signals to the vehicle 2. For example, the user may request the vehicle 2 to "maintain 5 km per hour" or to "brake" via the device 1. The device 1 may be arranged to confirm the received in-signals by e.g. a voice, which via the device's loudspeakers says "maintaining 5 km/h" or "I am applying the brakes". The confirmation may relate directly to the in-signals received, or confirm that the vehicle 2 is carrying out the commands provided.
According to one embodiment, the user may set how aggressive the vehicle 2 should be on a scale in the device 1. "Aggressive" as used in this context means how much the vehicle 2 should react to the second type of in-signals provided to the device 1. For example, it is possible to select whether the vehicle 2 should or should not react to the second type of in-signals in the form of small movements. The scale may e.g. be implemented in software in the device 1 and be displayed as a graphic user interface element (not shown) on the screen 3, which the user may adjust. According to one embodiment, the device 1 has an implemented "dead-man control". This is to enhance safety, so that there is no risk of an accident if the user, for example, drops the device 1. For example, the computer unit 9 may demand that the user keep a finger or similar on the screen 3 to be able to remotely control the vehicle 2. The device 1 may also be arranged to detect if it is dropped, for example by registering quick movements, quick angle changes, etc. with one or several of the inbuilt detectors, e.g. one or several accelerometers. The device 1 may then send control signals to the vehicle 2, so that it decelerates and/or stops, is set in a parking state or similar. Detected quick movements may also lock the speed and/or steering angle at the current position. In this manner, noise in the control signals is prevented. A special gesture on the screen 3, in the form of, for example, dragging a finger from the edge of the screen to the middle, may also lock the speed and/or the steering angle at the current position. The user may then control the vehicle 2 in the locked position, e.g. control the speed on a straight road section or in a constant curve.
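A drop or jolt could, for instance, be detected from the magnitude of the acceleration vector, as in the sketch below; the thresholds and the resulting actions are assumptions chosen only to illustrate the dead-man logic described above.

```python
import math

FREE_FALL_G = 0.3  # below ~0.3 g the device is roughly in free fall (assumption)
JOLT_G = 2.5       # above ~2.5 g counts as a jolt or impact (assumption)

def classify_motion(ax: float, ay: float, az: float) -> str:
    """Classify one accelerometer sample (in g) for the dead-man logic."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude < FREE_FALL_G:
        return "falling"   # e.g. send a stop / park request to the vehicle
    if magnitude > JOLT_G:
        return "jolt"      # e.g. lock speed and steering at current values
    return "normal"
```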
According to one embodiment, the user may provide the second type of in-signals to the device 1 by way of a graphic user interface element in the form of a slider. The slider may, for example, represent control of speed or steering angle. A change of the slider is interpreted and sent as control signals to the vehicle 2.
The present invention is not limited to the embodiments described above. Various alternatives, modifications and equivalents may be used. The embodiments above therefore do not limit the scope of the invention, which is defined by the enclosed claims.

Claims
1. Method comprising, at a mobile electronic device (1), which is arranged to communicate with and remotely control a vehicle (2) in several driving modes:
- receiving a first type of in-signal, which specifies a special driving mode for the vehicle (2), and in response to such a first type of in-signal:
- setting the electronic device (1 ) into a special steering configuration, which means that the electronic device (1 ) is configured to remotely control one or several functions in the vehicle (2) connected to the special driving mode, wherein continued in-signals to the electronic device (1 ) may control the function or functions in real time; and
- receiving one or several in-signals of a second type, specifying control of the function or functions in the vehicle (2) when the electronic device (1 ) is in the special steering configuration, and
- controlling the function or functions of the vehicle (2), connected to the special driving mode for the vehicle (2), in real time, based on the one or several in-signals of the second type.
2. Method according to claim 1 , wherein the special driving mode is one of: driving forward, parking and reversing.
3. Method according to claim 1 or 2, wherein a function of the vehicle (2) is one of the vehicle's speed, acceleration, deceleration, steering wheel angle, or a bodywork feature, such as e.g. tipping angle.
4. Method according to any of the previous claims, wherein the electronic device (1 ) has a touch sensitive screen (3) and the method comprises:
- showing several user interface elements (12A, 12B, 12C), each of which specifies a special driving mode for the vehicle (2);
- detecting a first type of in-signal in the form of touch or near touch of the touch sensitive screen (3), corresponding to a position for one of the user interface elements (12A, 12B, 12C), and in response to such a detected touch or near touch:
- setting the electronic device (1 ) into the special steering configuration, which is connected to the user interface element (12A, 12B, 12C) corresponding to the position.
5. Method according to any of the previous claims, comprising setting the electronic device (1 ) into a special input state, wherein the electronic device (1 ) is arranged to receive in-signals of the second type in the form of voice commands, input via a touch sensitive screen (3) on the electronic device (1 ), input via movement of the electronic device (1 ) or input via one or several pushbuttons (7) connected to the electronic device (1 ), wherein the electronic device (1 ) comprises means to receive the respective in-signals of the second type.
6. Method according to claim 5, comprising setting the electronic device
(1 ) into the special input state at the same time as the electronic device (1 ) is set into a special steering configuration.
7. Method according to claim 5 or 6, wherein the special input state comprises receiving in-signals of the second type in the form of movement of the electronic device (1 ), wherein the method comprises:
- detecting, with a motion sensor (1 1 ) in the electronic device (1 ), a second type of in-signal in the form of movement of the electronic device in a predetermined one-, two- or three-dimensional coordinate system for the electronic device (1 );
- generating and sending one or several signals specifying the detected movement to a control device (5) in the vehicle (2);
- controlling the vehicle's one or several functions connected to the driving mode, based on the detected movement.
8. Method according to claim 7, wherein the predetermined one-, two- or three-dimensional coordinate system for the electronic device (1 ) is arranged in accordance with the shape of the electronic device.
9. Method according to claim 7 or 8, wherein turning the electronic device (1 ) around a predefined rotational axis controls a function of the vehicle
(2) , connected to the driving mode.
10. Method according to claim 5 or 6, wherein the special input state comprises receiving in-signals of the second type via the touch sensitive screen
(3) , wherein the method comprises, in the special input state:
- showing the user interface element (13A, 13B) on the screen, in the form of an axis or a coordinate system with several axes, wherein each axis provides for a possibility of controlling a function in the vehicle (2);
- detecting a second type of in-signal in the form of a movement on or movement near the touch sensitive screen (3), corresponding to a movement related to one of the axes, and in response to such a detected movement:
- generating and sending one or several signals specifying the detected movement to a control device (5) in the vehicle (2);
- controlling the vehicle (2) based on the detected movement.
1 1 . Method according to any of claims 7 to 10, comprising low pass filtering of the detected in-signals of the second type.
12. A mobile electronic device (1 ) comprising:
- a transmitter (8A) of wireless signals;
- a computer unit (9);
- a computer readable memory (10B) , comprising a computer program P with computer instructions;
- an input device (3, 7, 1 1 , 17);
characterised in that:
- the input device (3, 7, 11, 17) is arranged to receive a first type of in-signal, which specifies a special driving mode for the vehicle (2), and send it to the computer unit (9); and in that
- the computer unit (9), in response to the first type of in-signal, specifying a special driving mode for the vehicle (2), is arranged to set the electronic device into a special steering configuration, which means that the electronic device (1) is arranged to remotely control one or several functions in the vehicle (2), connected to the vehicle's special driving mode, wherein subsequent in-signals of the second type to the electronic device (1) may control the function or functions in real time; and in that the computer unit (9) is arranged to generate signals for one or several functions in the vehicle (2), connected to the driving mode, based on received in-signals of the second type to the electronic device (1), specifying control of the function or functions of the vehicle (2) when the device (1) is in the special steering configuration, wherein the transmitter (8A) is arranged to send the signals to a receiver unit (4A) in the vehicle (2).
13. System comprising a mobile electronic device (1 ) according to claim 12 and a receiver device (4A), which is arranged to be placed in the vehicle (2) and which is arranged to receive one or several wireless signals from the electronic device (1 ), and to send them to a suitable control device (5) in the vehicle (2).
14. Computer program, P, wherein said computer program P comprises program code to cause a computer unit (9) and/or a control device (5) to carry out the steps according to any of claims 1-11.
15. Computer program product comprising a program code stored on a computer-readable medium, in order to execute the method steps according to any of claims 1-11, when said program code is executed in a computer unit (9).
16. Graphic user interface on an electronic mobile device (1 ) with a touch sensitive screen (3), characterised in that the graphic user interface comprises:
- one or several user interface elements (12A, 12B, 12C), each of which specifies a special driving mode for the vehicle (2); wherein, in response to a first type of in-signal in the form of a touch or near touch of the touch sensitive screen (3), which corresponds to a position for one of the user interface elements (12A, 12B, 12C):
- a second generation user interface element (13A, 13B) is shown on the touch sensitive screen (3), in the form of an axis or a coordinate system with several axes, wherein each axis represents control of a function in the vehicle (2); wherein, in response to a second type of in-signal in the form of a movement on or movement near the touch sensitive screen (3), corresponding to a movement related to the axis or one of the axes:
- the detected movement is shown.
PCT/SE2015/050663 2014-06-25 2015-06-09 Method and mobile device for steering a vehicle WO2015199600A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112015002330.5T DE112015002330T5 (en) 2014-06-25 2015-06-09 Method and mobile device for steering a vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1450785A SE1450785A1 (en) 2014-06-25 2014-06-25 Method and a mobile electronic device for controlling a vehicle
SE1450785-9 2014-06-25

Publications (1)

Publication Number Publication Date
WO2015199600A1 true WO2015199600A1 (en) 2015-12-30

Family

ID=54938537

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2015/050663 WO2015199600A1 (en) 2014-06-25 2015-06-09 Method and mobile device for steering a vehicle

Country Status (3)

Country Link
DE (1) DE112015002330T5 (en)
SE (1) SE1450785A1 (en)
WO (1) WO2015199600A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019211676A1 (en) * 2019-08-02 2021-02-04 Robert Bosch Gmbh Method for controlling a mobile work machine
DE102021213915A1 (en) 2021-12-07 2023-06-07 Psa Automobiles Sa Remote control of a vehicle function with sensor fusion of touchscreen and IMU


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008051982A1 (en) * 2008-10-16 2009-06-10 Daimler Ag Vehicle e.g. hybrid vehicle, maneuvering method, involves releasing parking brake, transferring forward- or backward driving position in automatic transmission, and regulating speed of vehicle by parking brake
WO2011041884A1 (en) * 2009-10-06 2011-04-14 Leonard Rudy Dueckman A method and an apparatus for controlling a machine using motion based signals and inputs
US20120215380A1 (en) * 2011-02-23 2012-08-23 Microsoft Corporation Semi-autonomous robot that supports multiple modes of navigation
US20130109272A1 (en) * 2011-10-31 2013-05-02 Stephen M. RINDLISBACHER Method of Controlling a Vehicle or Toy via a Motion-Sensing Device and/or Touch Screen
US20140172197A1 (en) * 2012-12-13 2014-06-19 Brian L. Ganz Method and system for controlling a vehicle with a smartphone

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FERNANDES C ET AL.: "Development of a convenient wireless control of an autonomous vehicle using apple iOS SDK", TENCON 2011 - 2011 IEEE REGION 10 CONFERENCE, 21 November 2011 (2011-11-21), pages 1025 - 1029, XP032092645, ISBN: 978-1-4577-0256-3 *
REUSCHENBACH A ET AL.: "iDriver - Human Machine Interface for Autonomous Cars", INFORMATION TECHNOLOGY: NEW GENERATIONS (ITNG), 2011 EIGHTH INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY: NEW GENERATIONS, 11 April 2011 (2011-04-11), pages 435 - 440, XP032003821, ISBN: 978-1-61284-427-5 *
WEI LIANG KENNY CHUA ET AL.: "Interactive methods of tele- operating a single unmanned ground vehicle on a small screen interface", HUMAN-ROBOT INTERACTION (HRI), 2011 6TH ACM/ IEEE INTERNATIONAL CONFERENCE, 8 March 2011 (2011-03-08), pages 121 - 122, XP058002226, ISBN: 978-1-4673-4393-0 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2550656B (en) * 2015-03-27 2019-07-03 Jaguar Land Rover Ltd External vehicle control system
US10081387B2 (en) 2017-02-07 2018-09-25 Ford Global Technologies, Llc Non-autonomous steering modes
GB2561065A (en) * 2017-02-07 2018-10-03 Ford Global Tech Llc Non-autonomous steering modes
US11067982B2 (en) 2017-07-27 2021-07-20 Daimler Ag Method for the remote control of a function of a vehicle
US11740622B2 (en) 2019-06-12 2023-08-29 Ford Global Technologies, Llc Remote trailer maneuver-assist
US11733690B2 (en) * 2020-07-06 2023-08-22 Ford Global Technologies, Llc Remote control system for a vehicle and trailer

Also Published As

Publication number Publication date
SE1450785A1 (en) 2015-12-26
DE112015002330T5 (en) 2017-03-23

Similar Documents

Publication Publication Date Title
WO2015199600A1 (en) Method and mobile device for steering a vehicle
US10747218B2 (en) Mobile device tethering for remote parking assist
US10181266B2 (en) System and method to provide driving assistance
JP6555599B2 (en) Display system, display method, and program
KR102311551B1 (en) Method for using a communication terminal in a motor vehicle while autopilot is activated and motor vehicle
EP3456577B1 (en) User interface apparatus for vehicle
JP5945999B1 (en) Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle
CN107851394B (en) Driving assistance device, driving assistance system, driving assistance method, and autonomous vehicle
EP3301530A1 (en) Control method of autonomous vehicle and server
CN109760604A (en) The method of the vehicle control apparatus and control vehicle that are installed on vehicle
JP2022184896A (en) System and method for autonomous vehicle notification
WO2021082483A1 (en) Method and apparatus for controlling vehicle
JP2005041433A (en) Vehicle guiding device and route judging program
US20200132489A1 (en) Methods and apparatus to facilitate navigation using a windshield display
JP2019119231A (en) Parking control method and parking control apparatus
JP6769860B2 (en) Terminals and terminal control methods
US20190258245A1 (en) Vehicle remote operation device, vehicle remote operation system and vehicle remote operation method
US11584364B2 (en) Vehicle control device, vehicle, operation method for vehicle control device and storage medium
KR102005443B1 (en) Apparatus for user-interface
US11809187B2 (en) Mobile object, control method of mobile object, control device and program
JP7083762B2 (en) Vehicle control device, vehicle and vehicle control method
US11292484B2 (en) Vehicle control device, vehicle, and vehicle control method
US20200255016A1 (en) Vehicle control device, vehicle, and vehicle control method
CN114684112A (en) Vehicle indicating progress of automatic parking process and operating method
US20220036598A1 (en) Vehicle user interface device and operating method of vehicle user interface device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15811895

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 112015002330

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15811895

Country of ref document: EP

Kind code of ref document: A1