SE1650520A1 - Method and control unit for loading a vehicle - Google Patents

Method and control unit for loading a vehicle

Info

Publication number
SE1650520A1
Authority
SE
Sweden
Prior art keywords
package
vehicle
packages
loading position
identified
Prior art date
Application number
SE1650520A
Other languages
Swedish (sv)
Inventor
Claesson André
Andersson Jon
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1650520A priority Critical patent/SE1650520A1/en
Priority to DE102017003273.4A priority patent/DE102017003273A1/en
Publication of SE1650520A1 publication Critical patent/SE1650520A1/en


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G - TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G 67/00 - Loading or unloading vehicles
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/08 - Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/083 - Shipping

Abstract

Method (400) and control unit (300) for visually illustrating a loading position (150) of a package (110) to be loaded into a vehicle (100). The method (400) comprises identifying (401) the package (110) to be loaded into the vehicle (100); extracting (402) information concerning a destination (220) of the identified (401) package (110); determining (403) a current geographical position (210) of the vehicle (100); determining (404) a set (105) of packages (110) to be distributed by the vehicle (100) to a respective destination (220); determining (405) a route of the vehicle (100), based on the current geographical position (210) of the vehicle (100) and the respective destination (220) of each package (110) in the set (105) of packages (110) to be distributed; determining (406) a loading position (150) for each package (110) in the set (105) of packages (110) to be distributed, based on the unloading order of the set (105) of packages (110) according to the determined (405) route; and visually illustrating (407) the loading position (150) of the identified (401) package (110). (Publ. Fig. 1B)

Description

METHOD AND CONTROL UNIT FOR LOADING A VEHICLE

TECHNICAL FIELD
This document discloses a method and a control unit. More particularly, a method and a control unit are described for visually illustrating a loading position of a package to be loaded into a vehicle.
BACKGROUND
Cargo is often damaged during transportation in a vehicle. This may have several reasons, such as e.g. the cargo being poorly secured; the driver not being aware of the fragility of a certain package and/or how to handle such a fragile package; some packages being very heavy and thereby inappropriate to place on top of other packages; some packages comprising dangerous goods/hazardous materials (without the driver's knowledge) that are e.g. corrosive, oxidising, flammable, explosive, radioactive, asphyxiating, biohazardous, toxic, pathogenic and/or allergenic, and may thus damage other packages and/or the driver and/or the vehicle, or other persons or property close to the vehicle. Further, some packages may be very valuable and thereby appealing for theft, i.e. inappropriate to place in plain view close to a door or an opening.
Another problem is that the driver may have to drive to several places for loading/unloading. It may thereby be difficult or even impossible for the driver to know where and how to place all packages in order to simplify loading/unloading. The driver may not be aware of other packages (and/or the size/weight of such packages) that are to be picked up on the distribution route and loaded into the vehicle.
Rearranging the load in order to be able to unload a certain package at its destination is time consuming and may cause an unnecessary delay of the distribution.

It may also be difficult to keep in memory all the respective weights of the packages and how they influence the vehicle's centre of gravity and/or axle load. For roads restricted by a maximum axle load limitation of the vehicle, it is necessary for the driver to know the axle load and how the cargo of the vehicle is distributed. A displacement of the vehicle's centre of gravity will influence and deteriorate the driving properties of the vehicle, which may cause an accident in case the driver is not aware and cautious.
Document DE102013004537 describes a system for monitoring the loading of cargo into a vehicle. A mobile detection unit detects the size and/or weight of a package to be loaded. The system also comprises an output unit for presenting information concerning where the package is to be placed.
However, the purpose seems to be to utilise, i.e. fill up, the cargo compartment as efficiently as possible in order to enable as much load as possible. No consideration is made concerning the order of distribution or avoiding rearrangement of packages during unload procedures. Further, no consideration is made concerning fragile packages, valuable packages or dangerous packages. In addition, no consideration is made concerning other packages to be picked up, e.g. at other stops on the distribution route.
Document US 2003110102 discloses a method of goods arrangement for placing goods into a storage space, for improving the space utilisation rate and decreasing the transportation cost. This is made by solving the knapsack problem, i.e. how to pack goods of different sizes and weights into a package so as to maximise the package utilisation rate and minimise the space wastage. A particular algorithm, "the wall building theorem", is hereby utilised, wherein big and heavy packages are placed at the bottom and small, light and fragile packages are placed on top. Information concerning how to best pack the packages is then presented on a personal computer or similar.
No consideration is made concerning the order of distribution or avoiding rearrangement of packages during unload procedures. In addition, no consideration is made concerning other packages to be picked up, e.g. at other stops on the distribution route. Further, the requirement of a personal computer, and the presentation thereupon of how to pack, is rather inconvenient for the driver. It may also be a problem for the driver to recognise a three-dimensional position for placement on a two-dimensional screen.
Document US 2004115802 concerns a graphical user interface for indicating the location of cargo items in a storage space. A three-dimensional representation of the storage space and of the cargo items to be packed is made. An operator can then arrange the cargo items on the screen, based on the dimensions and weight of the respective package, until he/she is satisfied with the package structure.
This process requires manual operation, which is time consuming, expensive and rather unfeasible. No consideration is made concerning the order of distribution or avoiding rearrangement of packages during unload procedures. Further, no consideration is made concerning fragile packages, valuable packages or dangerous packages. In addition, no consideration is made concerning other packages to be picked up, e.g. at other stops on the distribution route.
Document CN1936937 concerns a loading strategy of a vehicle, based on the dimensions of the packages to be loaded.
No consideration is made concerning the order of distribution or avoiding rearrangement of packages during unload procedures. In addition, no consideration is made concerning other packages to be picked up, e.g. at other stops on the distribution route. Further, it is not described how the loading strategy is to be presented to the driver.
Document DE 102014018198 describes a method for assisting the driver in loading a vehicle. The dimensions of the storage space and of the packages to be loaded into the vehicle, respectively, are determined. The placing of the packages is then planned, based on the respective dimensions.
The method seems primarily intended for small vehicles having a limited number of packages to be loaded. The purpose seems to be to utilise, i.e. fill up, the cargo compartment as efficiently as possible in order to enable as much load as possible. No consideration is made concerning the order of distribution or avoiding rearrangement of packages during unload procedures. Further, no consideration is made concerning fragile packages, valuable packages or dangerous packages. In addition, no consideration is made concerning other packages to be picked up, e.g. at other stops on the distribution route.

It would thus be desirable to assist the driver of the vehicle in planning the loading of the cargo zone of the vehicle, simplifying unloading of the packages and minimising or at least reducing rearrangement of packages during the unload.
SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and improve load planning of a vehicle.
According to a first aspect of the invention, this objective is achieved by a method for visually illustrating a loading position of a package to be loaded into a vehicle. The method comprises identifying the package to be loaded into the vehicle. Further, the method comprises extracting information concerning a destination of the identified package. The method also comprises determining a current geographical position of the vehicle. Additionally, the method comprises determining a set of packages to be distributed by the vehicle to a respective destination. The method further comprises determining a route of the vehicle, based on the current geographical position of the vehicle and the respective destination of each package in the set of packages to be distributed. Also, the method comprises determining a loading position for each package in the set of packages to be distributed, based on the unloading order of the set of packages according to the determined route. Finally, the method comprises visually illustrating the loading position of the identified package.
According to a second aspect of the invention, this objective is achieved by a control unit for visually illustrating a loading position of a package to be loaded into a vehicle. The control unit is configured to identify the package to be loaded into the vehicle when receiving a code representing the identity of the package. Also, the control unit is configured to extract information concerning a destination of the identified package. In addition, the control unit is configured to determine a current geographical position of the vehicle. The control unit is further configured to determine a set of packages to be distributed by the vehicle to a respective destination. Further, the control unit is configured to determine a route of the vehicle, based on the current geographical position of the vehicle and the respective destination of each package in the set of packages to be distributed. Also, the control unit is configured to determine a loading position for each package in the set of packages to be distributed, based on the unloading order of the set of packages according to the determined route. The control unit is furthermore configured to generate command signals for visually illustrating the loading position of the identified package.
Hereby, thanks to the disclosed aspects, by illustrating the loading positions of identified packages, loading planning is simplified for the driver, or for any other entity such as a robot transporting cargo, also in a complex loading situation where packages are to be delivered to different destinations, where packages are disparate in form, size and/or weight, and/or where packages are to be picked up at different positions along a distribution route. Thereby, the driver, or corresponding entity, is saved from the complicated task of planning the loading and keeping in mind where all packages are to be placed for minimising, or at least reducing, rearrangements of packages during unloading, while avoiding placing packages such that an accident may occur, or such that the centre of gravity of the vehicle is moved in a way that makes the vehicle unstable. Thereby, loading of the vehicle is enabled.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1A illustrates a side view of a vehicle according to an embodiment;
Figure 1B illustrates a back view of a vehicle, packages to be loaded and a visualisation device according to an embodiment;
Figure 1C illustrates a back view of a vehicle, packages to be loaded and a visualisation device according to an embodiment;
Figure 1D illustrates a back view of a vehicle, packages to be loaded and a visualisation device according to an embodiment;
Figure 2 illustrates a perspective overview from above of a scenario where a vehicle is loaded and starts driving on a route to a destination;
Figure 3A illustrates an example of a vehicle interior according to an embodiment;
Figure 3B illustrates an example of a vehicle interior according to an embodiment;
Figure 4 is a flow chart illustrating an embodiment of a method;
Figure 5 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as a method and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates a scenario with a vehicle 100 which is to be loaded.
The vehicle 100 may comprise a means of transportation in a broad sense, such as e.g. a truck, a car, a trailer, a container, a bus, a train, a tram, an aircraft such as the cargo bay of a plane, a watercraft such as a cargo ship, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or another similar manned or unmanned means of conveyance running e.g. on wheels, rails, air, water or similar media.
The vehicle 100 may be driver-controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.
The vehicle 100 is to be loaded with packages by e.g. the driver, in case the vehicle 100 has a driver, by any other human, or by another entity having loading ability, such as a loading robot, a heavy loader, a trained animal, an automated loading system, etc. The vehicle 100 may drive along a distribution route, where different packages are to be unloaded at various destinations along the route, and possibly some packages are to be picked up.

It is no simple task for a human, or corresponding entity, to keep all the destinations of all packages to be loaded and delivered in mind, in case he/she/it wants to minimise or reduce rearrangement of packages when unloading. The situation is further complicated when the packages have different sizes, weights and/or levels of fragility, and even more complicated when packages are to be picked up along the route. The level of fragility may be set to a number on a scale from 1 to 5, 1 to 10, etc., in some embodiments.
According to some embodiments, this may be solved by firstly detecting an identification of each package to be loaded into the vehicle 100. This recognised identification may be associated with characteristics such as the size, weight, level of fragility and destination of each package, which then may be extracted.
Further, the available size of the cargo compartment of the vehicle 100 may be determined, or may be previously known. The placement in the cargo compartment of each package is then determined according to an algorithm, in order to minimise or at least reduce relocation of the loaded packages when unloading packages during distribution. Thus the package having the last destination on the route may be placed at the very front of the vehicle, or rather at the very front of the cargo compartment of the vehicle 100, while packages to be delivered at the first destination of the route may be situated further back in the vehicle 100, close to the loading hatch.
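The front-to-back ordering described above can be sketched as a small routine. The slot model, the function name and the data shapes are illustrative assumptions only; the patent does not prescribe an implementation:

```python
def assign_slots(packages, unload_order):
    """Assign each package a cargo slot so that nothing has to be moved
    at unload time. Slot 0 is at the very front of the cargo compartment;
    the highest slot is next to the loading hatch.

    `packages` maps package id -> destination; `unload_order` lists the
    destinations in the order the route visits them.
    """
    rank = {dest: i for i, dest in enumerate(unload_order)}
    # Packages delivered last go to the front; packages delivered first
    # end up by the hatch, so they come out without rearrangement.
    by_unload = sorted(packages, key=lambda pid: rank[packages[pid]],
                       reverse=True)
    return {pid: slot for slot, pid in enumerate(by_unload)}
```

With destinations A then B on the route, the package bound for B (unloaded last) receives slot 0 at the very front, while the packages for the first stop A end up nearest the hatch.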
Also, placement of the packages may be planned in order to avoid placing heavy packages high up, or above a fragile package, for example. Certain packages may be subject to a certified secure transportation service. Such packages may be particularly valuable, or sensitive, and thus require a particular placing within the cargo compartment, e.g. in a separate locked compartment or similar. Further, the placing of the packages may be planned so that the load distribution of the vehicle 100 is kept stable from a centre-of-gravity perspective.
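These placement rules can be expressed as a validity check on a proposed placement. The field names, the (column, level) slot model and the weight threshold below are assumptions made for illustration, not details from the patent:

```python
def placement_ok(package, slot, load_plan):
    """Check one proposed placement against the rules sketched above.

    `package` and the values of `load_plan` are dicts with assumed keys
    'weight_kg', 'fragility' (1..5), 'valuable' and 'certified'.
    `slot` is (column, level); level 0 is the floor. `load_plan` maps
    already occupied slots to their packages.
    """
    col, level = slot
    HEAVY_KG = 25.0  # assumed threshold, not taken from the patent
    # Keep the centre of gravity low: heavy packages stay on the floor.
    if package["weight_kg"] > HEAVY_KG and level > 0:
        return False
    # Place nothing on top of a very fragile package
    # (cf. the box of flowers that must have nothing on top).
    below = load_plan.get((col, level - 1))
    if below is not None and below["fragility"] >= 4:
        return False
    # Valuable or certified packages go into a locked compartment,
    # away from doors and openings.
    if (package["valuable"] or package["certified"]) and col != "locked":
        return False
    return True
```

A full planner would call such a check for every candidate slot and keep the first (or best) placement that passes.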
Further, the calculated respective positions in the vehicle 100 of the packages to be loaded may be indicated to the person loading the vehicle 100 by a visualisation device, e.g. by laser, spotlights, LED lights or another source of illumination, in some embodiments, which is illustrated in Figure 1B.
Further, such a visualisation device may comprise a pair of glasses with an at least partially transparent display arranged for augmented reality, in some embodiments, as illustrated in Figure 1C.
Such a visualisation device may alternatively comprise a device with a display arranged for augmented reality, as illustrated in Figure 1D.
Thereby the person loading the packages is provided with an easily comprehensible indication. The identification of each package may be made e.g. by an RFID tag.
RFID, or Radio-Frequency Identification, uses electromagnetic fields to automatically identify and track tags attached to objects such as the packages. The RFID tags may comprise electronically stored information such as the destination, size, weight and/or level of fragility, or other special demands of the package, such as e.g. that a box with flowers has to be placed with nothing on top. In other embodiments, the RFID tags may comprise a reference number, which may be used for extracting information stored in a database (destination, size, weight, etc.).
Such an RFID tag may be active or passive. Passive tags collect energy from a nearby RFID reader's interrogating radio waves. Active tags have a local power source such as a battery and may operate at hundreds of metres from the RFID reader. Unlike a barcode, the tag need not be within the line of sight of the reader, so it may be embedded in the tracked object.
However, the identification of the respective packages may also be made by a barcode, such as a linear barcode like the European Article Number (EAN), or a matrix code such as e.g. a Quick Response (QR) code, in combination with a barcode reader.
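As a concrete aside on linear barcodes, an EAN-13 code carries its own check digit, so a scanned identification can be validated before any database lookup. The following is the standard EAN-13 algorithm, not something specific to this patent:

```python
def ean13_valid(code: str) -> bool:
    """Verify the check digit of an EAN-13 code: the twelve leading
    digits are weighted alternately 1 and 3 (left to right), and the
    13th digit makes the weighted sum a multiple of 10."""
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    weighted = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
    check = (10 - weighted % 10) % 10
    return check == digits[12]
```

A misread digit (or a wrong check digit) is rejected at the scanner, before the code is used as a database key.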
The identification of the respective packages may alternatively be made with a transponder placed in each package. A transponder is a device that emits an identifying signal in response to an interrogating received signal.

In yet an alternative embodiment, each package may have a visible reference code, which may be recognised by a camera in combination with an image recognition program.
Image recognition/computer vision is a technical field comprising methods for acquiring, processing, analysing and understanding images and, in general, high-dimensional data from the real world, in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
Once the package has been identified based on the reference code, package-specific data may be extracted from a database, such as volume, dimensions, weight, fragility, whether it comprises dangerous goods/hazardous materials, value, and/or whether it has specific requirements, such as having to be placed in a particular direction, etc.
The expression "package" as herein utilised is to be understood in a broad sense and may comprise e.g. parcels, collis, pallets and bags, but also plants, vegetables, animals and humanoids.
Passengers on a public transportation vehicle such as a bus or an elevator may be advised of a position according to the described method in some embodiments, based on their respective destinations.
Figure 1B illustrates a back view of a vehicle 100, a set 105 of packages 110 to be loaded and a visualisation device 140 according to an embodiment.
Each package 110 within the set 105 of packages to be loaded comprises a code 120 identifying the package 110. The code 120 may for example be based on RFID, a barcode, a matrix code or similar, which may be detected by a device 130 for recognising the code 120. Such a device 130 may comprise an RFID reader, a barcode reader, a matrix code reader, a camera with an associated program for image recognition, etc. The device 130 for recognising the code 120 may be handheld in some embodiments, or mounted at an opening of the vehicle 100 as illustrated in Figure 1B.
The recognised code 120 may be used for extracting information concerning the package 110, such as dimensions, volume, weight, etc., e.g. from a database. The extracted information may alternatively be comprised in the code 120.
A computation may then be made in a control unit. Various other parameters may also be determined, such as e.g.: the current geographical position of the vehicle 100; the total set 105 of packages to be distributed by the vehicle 100 to a respective destination along the current route; a route of the vehicle 100, based on the current geographical position of the vehicle 100 and the respective destination of each package 110 in the set 105 of packages to be distributed; and a loading position 150 for each package 110 in the set 105 of packages 110 to be distributed, based on the unloading order of the set 105 of packages 110 according to the determined route.
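The route determination step can be illustrated with a simple nearest-neighbour heuristic; the patent leaves the routing algorithm open, and the coordinate model below is an assumption made for the sketch:

```python
import math

def determine_route(current_position, destinations):
    """Order the destinations by repeatedly driving to the nearest
    remaining one (a plain nearest-neighbour heuristic; any routing
    algorithm could be substituted). Positions are (x, y) coordinates."""
    remaining = list(destinations)
    route, pos = [], current_position
    while remaining:
        nxt = min(remaining, key=lambda d: math.dist(pos, d))
        route.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return route
```

The resulting visiting order directly gives the unloading order used when computing loading positions 150.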
The visualisation device 140 may then illuminate the calculated loading position 150 for the package 110 currently being loaded, in order to avoid rearrangement of the set 105 of packages when unloading the package 110. The computation may also be made taking additional factors into account, such as not placing heavy packages 110 high up, as the centre of gravity of the vehicle 100 may be changed; the fragility of the package 110; the stackability of the package 110; whether the package 110 may comprise dangerous goods/hazardous materials; whether the package 110 is valuable, i.e. has a considered value exceeding a threshold value; and/or whether the package 110 is subject to a transport certification service or not.
Thus fragile packages 110 should be placed so that they cannot fall and so that no other package 110 may fall onto them; packages 110 that are not stackable should not be stacked; packages 110 comprising dangerous goods/hazardous materials should be handled with care, possibly in a separate compartment; valuable packages should be placed such that a passer-by cannot easily pick them up, i.e. not close to any door or opening, possibly in a locked compartment; etc.
The visualisation of the loading position 150 may be made by the visualisation device 140, based on projection of visible light or laser, or alternatively by LED lights mounted in the cargo space of the vehicle 100, in the floor, walls and/or roof of the cargo space, in different embodiments.
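One way such floor-mounted illumination could be driven is by mapping a computed slot index to the physical floor coordinates at which a spot is projected or an LED is lit. The grid layout and dimensions below are purely hypothetical:

```python
def slot_to_floor_position(slot, slots_per_row=3,
                           slot_depth_m=1.0, slot_width_m=1.0):
    """Map a slot index (0 = front of the cargo compartment) to (x, y)
    floor coordinates, in metres from the front-left corner, at which
    the visualisation device should indicate the loading position.
    Slots fill row by row, front to back, towards the hatch."""
    row, col = divmod(slot, slots_per_row)
    x = col * slot_width_m + slot_width_m / 2   # across the compartment
    y = row * slot_depth_m + slot_depth_m / 2   # towards the hatch
    return (x, y)
```

A projector-based device 140 would instead transform these cargo-space coordinates into its own projection frame, but the slot-to-position mapping is the same.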
Figure 1C illustrates a back view of a vehicle 100 according to an embodiment, wherein the loading position 150 of the package 110 to be loaded is illustrated for the loading entity in the visualisation device 140, comprising an at least partially transparent display, which in this embodiment is portable by the loading entity.
The visualisation device 140 may comprise a pair of intelligent glasses, i.e. an optical head-mounted display designed in the shape of a pair of eyeglasses, or a set of portable head-up displays.
The visualisation of the loading position 150 of the package 110 may be made based on Augmented Reality (AR).
Augmented reality is a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one's current perception of reality.
With the help of advanced AR technology (e.g. adding computer vision and object recognition), the information about the surrounding real world of the user becomes visible and possibly interactive. Information about the environment and its objects is overlaid on the real world. This information can be virtual or real, e.g. seeing other real sensed or measured information, such as electromagnetic radio waves, overlaid in exact alignment with where they actually are in space.
An advantage of this embodiment may be that the illustration of the loading position 150 of the package 110 may be made without the requirement of modifying the vehicle 100 by installing a light emitting device in the vehicle 100.
Figure 1D illustrates a back view of a vehicle 100 according to an embodiment, wherein the loading position 150 of the package 110 to be loaded is illustrated for the loading entity in the visualisation device 140, comprising a display, which in this embodiment is portable by the loading entity.
This illustrated embodiment functions to a large extent similarly to the device 140 illustrated in Figure 1C, but wherein the display is not transparent. Instead, an image of the vehicle 100 is captured by a camera on the front of the device 140 and presented on the display of the visualisation device 140, upon which the calculated loading position 150 is illustrated.
An advantage of this embodiment may be that the loading entity may use an existing device, such as a mobile telephone, and just download an app in order to be able to perform the herein described method for visualisation.
Figure 2 depicts an example of a scenario wherein the vehicle 100 is situated at a starting point 210, where a first set 105-1 of packages 110 is to be loaded into the vehicle 100.
Different packages 110 may be distributed to different destinations 220-1, 220-2, 220-3. Further, a second set 105-2 of packages 110 may be picked up at the first destination 220-1, while some other packages 110 are to be unloaded from the vehicle 100. The vehicle 100 may then continue to the subsequent destination 220-2, where yet some packages 110 may be unloaded, and then continue to the third destination 220-3.
This is merely an arbitrary example of a distribution route to be made by the vehicle 100.
When loading the vehicle 100 with the first set 105-1 of packages 110, the loading and unloading will be simplified if the driver has knowledge not only about the respective destinations 220-1, 220-2, 220-3 of the packages 110 in the first set 105-1, but also about the existence of the second set 105-2 of packages 110 to be picked up at the first destination 220-1, and the respective destinations 220-2, 220-3 of those packages 110.
Figure 3A illustrates an example of how the previous scenario in Figure 2 may be perceived by the driver of the vehicle 100, if any, when situated at an arbitrary position along the route towards the first destination 220-1, according to an embodiment.
A control unit 300 may be configured for planning the route of the vehicle 100 from the current position 210 of the vehicle 100 to the first destination 220-1, in some embodiments, and for planning the loading order of packages 110 to be loaded and distributed.
The control unit 300 may thus assist a loading entity in loading a package 110 into the vehicle 100. The control unit 300 is configured to identify the package 110 to be loaded into the vehicle 100 when receiving a code representing the identity of the package 110. Further, the control unit 300 is configured to extract information concerning a destination 220 of the identified package 110. The control unit 300 is also configured to determine a current geographical position of the vehicle 100. In addition, the control unit 300 is configured to determine a set 105 of packages to be distributed by the vehicle 100 to a respective destination 220. Furthermore, the control unit 300 is configured to determine a route of the vehicle 100, based on the current geographical position of the vehicle 100 and the respective destination 220 of each package 110 in the set 105 of packages to be distributed. Additionally, the control unit 300 is configured to determine a loading position 150 for each package 110 in the set 105 of packages 110 to be distributed, based on the unloading order of the set 105 of packages 110 according to the determined route. The control unit 300 is also configured to generate command signals for visually illustrating the loading position 150 of the identified package 110.
The control unit 300 may comprise, or be connected to, a database 310, which database 310 may comprise information concerning the packages 110 to be distributed, such as e.g. the destination, sensibility, weight, dimensions, volume, stackability, value and fragility of the package 110, and/or whether the package 110 is subject to a transport certification service or not. This information may be stored in the database 310, associated with an identity reference of each respective package 110, such as a reference number or other unique identification.

In the illustrated embodiment, the control unit 300 and the database 310 are comprised within a vehicle-external structure 315. However, in other embodiments, another placing of the control unit 300 and/or the database 310 may be made, as discussed in Figure 3B.
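A database record could, for instance, carry one field per attribute listed above, keyed by the package's unique identity reference. All names and units below are illustrative choices, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class PackageRecord:
    """One database entry per package 110; the fields mirror the
    attributes listed in the text, with assumed names and units."""
    reference: str             # unique identity, e.g. a reference number
    destination: str
    weight_kg: float
    dimensions_m: tuple        # (length, width, height)
    volume_m3: float
    stackable: bool
    value: float
    fragility: int             # e.g. 1 (robust) to 5 (very fragile)
    certified_transport: bool  # subject to a transport certification service

# The database 310 can then be modelled as a mapping from identity
# reference to record.
database: dict[str, PackageRecord] = {}

def store(record: PackageRecord) -> None:
    """Index the record by its unique identity reference."""
    database[record.reference] = record
```

A recognised code 120 then resolves to one record, from which the destination and handling constraints are read.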
The communication between the control unit 300 and the database 310 of the vehicle external structure 315 and the vehicle 100 may be made by a communication device 320. The communication device 320 may be configured for wireless communication over a wireless communication interface, such as e.g. Vehicle-to-Vehicle (V2V) communication, or Vehicle-to-Structure (V2X) communication.

In some embodiments, the communication between the vehicle 100 and the vehicle external structure 315 may be performed via V2V communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC operates in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the IEEE 802.11 Wireless LAN Medium Access Control (MAC) layer and physical layer (PHY) specification.
Such wireless communication interface may comprise, or at least be inspired by, wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), or optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
The communication may alternatively be made over a wireless interface comprising, or at least being inspired by, radio access technologies such as e.g. 3GPP LTE, LTE-Advanced, E-UTRAN, UMTS, GSM, GSM/ EDGE, WCDMA, Time Division Multiple Access (TDMA) networks, Frequency Division Multiple Access (FDMA) networks, Orthogonal FDMA (OFDMA) networks, Single-Carrier FDMA (SC-FDMA) networks, Worldwide Interoperability for Microwave Access (WiMax), Ultra Mobile Broadband (UMB), High Speed Packet Access (HSPA), Evolved Universal Terrestrial Radio Access (E-UTRA), Universal Terrestrial Radio Access (UTRA), GSM EDGE Radio Access Network (GERAN), 3GPP2 CDMA technologies, e.g. CDMA2000 1x RTT and High Rate Packet Data (HRPD), or similar, just to mention some few options, via a wireless communication network.
The geographical position of the vehicle 100 may be determined by a positioning device 330 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The geographical position of the positioning device 330 (and thereby also of the vehicle 100) may be determined continuously, or at certain predetermined or configurable time intervals, according to various embodiments.
Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 340-1, 340-2, 340-3, 340-4. In this example, four satellites 340-1, 340-2, 340-3, 340-4 are depicted, but this is merely an example. More than four satellites 340-1, 340-2, 340-3, 340-4 may be used for enhancing the precision, or for creating redundancy. The satellites 340-1, 340-2, 340-3, 340-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 340-1, 340-2, 340-3, 340-4 that broadcasts), status, and where the satellite 340-1, 340-2, 340-3, 340-4 is situated at any given time. The GPS satellites 340-1, 340-2, 340-3, 340-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 340-1, 340-2, 340-3, 340-4 to be distinguished from the others' information, based on a unique code for each respective satellite 340-1, 340-2, 340-3, 340-4. This information can then be transmitted to be received by the appropriately adapted positioning device 330 comprised in the vehicle 100.
Distance measurement can according to some embodiments comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 340-1, 340-2, 340-3, 340-4 to reach the positioning device 330. As the radio signals travel at the speed of light, the distance to the respective satellite 340-1, 340-2, 340-3, 340-4 may be computed by measuring the signal propagation time.
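The propagation-time relation can be sketched as follows (an illustrative simplification; the function and constant names are our own, and a real receiver must additionally solve for its own clock bias as a fourth unknown):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # radio signals travel at the speed of light

def satellite_distance_m(t_transmit_s: float, t_receive_s: float) -> float:
    """Distance to one satellite from the signal propagation time.

    Simplifying assumption: the satellite and receiver clocks are
    synchronised. Real GPS receivers treat the receiver clock error as
    an extra unknown, which is why a fourth satellite is needed.
    """
    return SPEED_OF_LIGHT_M_S * (t_receive_s - t_transmit_s)
```

A signal received about 70 ms after transmission thus corresponds to a distance of roughly 21 000 km, consistent with GPS orbital altitudes.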
The positions of the satellites 340-1, 340-2, 340-3, 340-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 340-1, 340-2, 340-3, 340-4 through triangulation. For determination of altitude, signals from four satellites 340-1, 340-2, 340-3, 340-4 may be used according to some embodiments.
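As a planar illustration of position determination from known distances, the following sketch solves for a point from its distances to three known reference points by subtracting the circle equations, which leaves a linear two-by-two system (the 2D simplification and all names are ours; satellite positioning works in three dimensions and also solves for the receiver clock error):

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) given distances r1, r2, r3 to known points
    p1, p2, p3. Subtracting the circle equations pairwise cancels the
    quadratic terms, giving two linear equations a*x + b*y = c, which
    are then solved with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1  # zero if the reference points are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

For example, a receiver at (3, 4) is at distance 5 from the origin; given that distance plus the distances to (10, 0) and (0, 10), the sketch recovers (3, 4).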
The geographical position of the vehicle 100 may alternatively be determined, e.g.: by having transponders positioned at known positions around the route and a dedicated sensor in the vehicle 100 for recognising the transponders and thereby determining the position; by detecting and recognising Wi-Fi networks (Wi-Fi networks along the route may be mapped with certain respective geographical positions in a database); by receiving a Bluetooth beaconing signal associated with a geographical position; or by other signal signatures of wireless signals, such as e.g. triangulation of signals emitted by a plurality of fixed base stations with known geographical positions. The position of the vehicle 100 may alternatively be entered by the driver, if any, or any human, robot or other loading entity loading the vehicle 100.
Having determined the geographical position by the positioning device 330 (or in another way), it may be presented on a map, a screen or a display device where the position of the vehicle 100 may be marked.

In some embodiments, the geographical position of the vehicle 100, information concerning the route, the packages 110 to be delivered, the destinations 220-1, 220-2, 220-3 and other possible information related to the transportation of the packages 110 or route planning may be displayed on an interface unit 350. The interface unit 350 may comprise e.g. a mobile telephone, a computer, a computer tablet, a display, a loudspeaker, a projector, a head-up display, a display integrated in the windshield of the vehicle 100, a display integrated in the dashboard of the vehicle 100, a tactile device, a portable device of the vehicle driver, intelligent glasses of the vehicle driver, etc.; or a combination thereof, in some embodiments. The optional interface unit 350 may be identical to the visualisation device 140, in some embodiments.
Further, the vehicle 100 comprises a calculation unit 360, which may be configured for compiling information from sensors in the vehicle 100 and possibly calculating maximum gross weight of the vehicle 100, and/ or a maximum axle load of the axles of the vehicle 100. The calculation unit 360 may also obtain information from the code recognising device 130, concerning detected packages 110 to be loaded.
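A check of the kind the calculation unit 360 could perform might look like the following sketch (the function name, parameters and interface are illustrative assumptions, not the actual implementation):

```python
def compile_load_check(vehicle_tare_kg, package_weights_kg, max_gross_weight_kg):
    """Compile the weights of the detected packages and verify that the
    resulting gross weight stays within the vehicle's maximum.

    Returns the computed gross weight and whether it is admissible; an
    analogous per-axle check could use the planned loading positions.
    """
    gross_kg = vehicle_tare_kg + sum(package_weights_kg)
    return gross_kg, gross_kg <= max_gross_weight_kg
```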
The various units 320, 330, 350, 360 in the vehicle 100 may interactively communicate between themselves via e.g. a wired or wireless communication bus. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the communication may alternatively be made over a wireless connection comprising, or at least being inspired by, any of the previously discussed wireless communication technologies.
Figure 3B illustrates another example of how the previous scenario in Figure 2 may be perceived by the driver of the vehicle 100 (if any) according to another embodiment, alternative to the embodiments illustrated in Figure 3A.
According to this embodiment, the control unit 300 is situated within the vehicle 100. Also, the optional database 310 may be comprised in the vehicle 100 in this embodiment.
The control unit 300 may be configured for similar or even identical functionalities for assisting a loading entity in loading packages 110 into the vehicle 100 as the control unit 300 described in Figure 3A.
Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400, for visually illustrating a loading position 150 of a package 110 to be loaded into a vehicle 100.
The vehicle 100 may be any arbitrary kind of means for conveyance, such as a truck, a bus, a car, or similar. The vehicle 100 may be driven by a driver, or be autonomous, in different embodiments.

In order to correctly be able to illustrate a loading position 150 of the package 110 to be loaded into the vehicle 100, the method 400 may comprise a number of steps 401-407. However, some of these steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests; step 404 may be performed before step 403, for example, in some embodiments. The method 400 may comprise the subsequent steps:

Step 401 comprises identifying the package 110 to be loaded into the vehicle 100.
The identification may be made via a tag on the package 110, via image recognition, by a colour mark, or by another similar method uniquely identifying the package 110.
Step 402 comprises extracting information concerning a destination 220 of the identified 401 package 110.
The extracted information of the identified 401 package 110 may further comprise an indication of: sensibility of the package 110, weight of the package 110, dimensions of the package 110, stackability of the package 110, value of the package 110, and/ or whether the package 110 is subject of a transport certification service or not.
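Steps 401-402 can be illustrated as a lookup against a database such as the database 310 (the dictionary contents, field names and code format below are hypothetical stand-ins for whatever the deployment actually stores):

```python
# Stand-in for database 310: keys are identity references (e.g. barcode
# payloads) and values mirror the indications listed above.
PACKAGE_DB = {
    "PKG-0001": {"destination": "Stop A", "weight_kg": 12.5,
                 "stackable": True, "certified_transport": False},
}

def identify_and_extract(code: str) -> dict:
    """Resolve a scanned code to the stored package record and return
    its destination together with the other stored indications."""
    record = PACKAGE_DB[code]  # raises KeyError for an unknown code
    return {"code": code, **record}
```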
Step 403 comprises determining a current geographical position 210 of the vehicle 100.
The current vehicle position 210 may be determined by a geographical positioning unit 330, such as e.g. a GPS. However, the current position 210 of the vehicle 100 may alternatively be detected and registered by the driver of the vehicle 100.
Step 404 comprises determining a set 105 of packages 110 to be distributed by the vehicle 100 to a respective destination 220.
Step 405 comprises determining a route of the vehicle 100, based on the current geographical position 210 of the vehicle 100 and the respective destination 220 of each package 110 in the set 105 of packages 110 to be distributed.
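Step 405 could, for illustration, be approximated with a greedy nearest-neighbour ordering of the destinations (a sketch only; a production planner would use road-network distances and a proper route optimiser):

```python
import math

def plan_route(vehicle_position, destinations):
    """Order the stops by repeatedly driving to the closest remaining
    destination, starting from the current vehicle position. Positions
    are plain (x, y) coordinate pairs in this simplified sketch."""
    route, remaining, here = [], list(destinations), vehicle_position
    while remaining:
        nearest = min(remaining, key=lambda d: math.dist(here, d))
        remaining.remove(nearest)
        route.append(nearest)
        here = nearest
    return route
```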
Step 406 comprises determining a loading position 150 for each package 110 in the set 105 of packages 110 to be distributed, based on the unloading order of the set 105 of packages 110 according to the determined 405 route.

The loading position 150 for each package 110 in the set 105 of packages to be distributed may be determined in order to avoid removal of other packages in the set 105 when unloading the identified 401 package 110.
The loading position 150 for each package 110 in the set 105 may be determined, further based on the further comprised indication of each respective identified 401 package 110.
The loading position 150 for each package 110 may be determined further based on load distribution from a centre of gravity perspective, available storage space in the vehicle 100, whether there is a particular compartment for valuable packages 110, and/ or packages 110 subject of the transport certification service.
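The core of the determination in step 406 can be sketched as a last-in, first-out assignment: packages for the final stop are loaded first, into the deepest slot, so that unloading at each stop never requires moving another package (the data shapes and names below are illustrative assumptions, ignoring the additional weight and compartment constraints just mentioned):

```python
def assign_loading_positions(packages, route):
    """Assign each (code, destination) pair a slot, where slot 0 is the
    deepest position in the load space. Sorting by the stop's position
    in the route, in reverse, loads the last stop's packages first."""
    stop_index = {dest: i for i, dest in enumerate(route)}
    by_unloading = sorted(packages, key=lambda p: stop_index[p[1]],
                          reverse=True)
    return {code: slot for slot, (code, dest) in enumerate(by_unloading)}
```

With a two-stop route, the package for the second stop ends up in slot 0 (innermost) and the first stop's package in slot 1, nearest the doors.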
Step 407 comprises visually illustrating the loading position 150 of the identified 401 package 110.
The loading position 150 of the identified 401 package 110 may be visually illustrated by generating and transmitting command signals for directing and igniting a light source 140, for illuminating the loading position 150 of the identified 401 package 110.
The loading position 150 of the identified 401 package 110 may be visually illustrated by generating and transmitting command signals to an at least partially transparent display 140 configured for augmented reality, wherein an indication of the loading position 150 of the identified 401 package 110 is provided when regarding the vehicle load space through the at least partially transparent display 140.
The loading position 150 of the identified 401 package 110 may be visually illustrated by generating and transmitting command signals to a display 140 presenting an image of the vehicle load space and an indication corresponding to the loading position 150 of the identified 401 package 110.
Figure 5 illustrates an embodiment of a system 500 for visually illustrating a loading position 150 of a package 110 to be loaded into a vehicle 100.
The system 500 comprises a code 120 identifying the package 110 to be loaded into the vehicle 100. Further, the system 500 comprises a device 130 for recognising the code 120. In addition, the system 500 also comprises a visualisation device 140, configured to visually illustrate a loading position 150 of the identified package 110. The system also comprises a control unit 300. The control unit 300 may perform at least some of the previously described steps 401-407 according to the method 400 described above and illustrated in Figure 4, for visually illustrating the loading position 150 of the package 110 to be loaded into a vehicle 100.
The control unit 300 is configured to identify the package 110 to be loaded into the vehicle 100 when receiving a code representing the identity of the package 110. The control unit 300 is further configured to extract information concerning a destination 220 of the identified package 110. Also, the control unit 300 is configured to determine a current geographical position of the vehicle 100. In addition, the control unit 300 is configured to determine a set 105 of packages to be distributed by the vehicle 100 to a respective destination 220. The control unit 300 is furthermore configured to determine a route of the vehicle 100, based on the current geographical position of the vehicle 100 and the respective destination 220 of each package 110 in the set 105 of packages to be distributed. Additionally, the control unit 300 is configured to determine a loading position 150 for each package 110 in the set 105 of packages 110 to be distributed, based on the unloading order of the set 105 of packages 110 according to the determined route. The control unit 300 is further configured to generate command signals for visually illustrating the loading position 150 of the identified package 110.

In some embodiments, the control unit 300 may be further configured to determine the loading position 150 for each package 110 to be delivered in order to avoid removal of other packages in the set 105 to be delivered when unloading the identified package 110.

In some embodiments, the control unit 300 may be comprised in the vehicle 100. However, in some other embodiments, the control unit 300 may be comprised in a vehicle external structure 315.
The control unit 300 may in some embodiments be further configured to extract information of the identified package 110 which further comprises an indication of: sensibility of the package 110, weight of the package 110, dimensions of the package 110, stackability of the package 110, value of the package 110, and/ or whether the package 110 is subject of a transport certification service or not. In addition, the control unit 300 may in some embodiments also be further configured to determine the loading position 150 for each package 110 to be distributed, further based on the further comprised indication of each respective identified package 110.

Furthermore, the control unit 300 may also be configured to determine the loading position 150 for each package 110 further based on load distribution from a centre of gravity perspective, available storage space in the vehicle 100, whether there is a particular compartment for valuable packages 110 and/ or packages 110 subject of the transport certification service.
The control unit 300 may also be further configured to generate and transmit command signals for directing and igniting a light source 140, for illuminating the loading position 150 of the identified package 110.
Also, the control unit 300 may be additionally configured to generate and transmit command signals to an at least partially transparent display 140 configured for augmented reality, wherein the loading entity is provided an indication of the loading position 150 of the identified package 110 when regarding the vehicle load space through the at least partially transparent display 140.
The control unit 300 may be configured to generate and transmit command signals to a display 140 presenting an image of the vehicle load space and an indication corresponding to the loading position 150 of the identified package 110.
The control unit 300 may comprise a processor 520 configured for performing at least some of the previously described steps 401-407 according to the method 400, in some embodiments.
Such processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as e.g. any, some or all of the ones enumerated above.
The control unit 300 may further comprise a receiving circuit 510 configured for receiving a signal from the device 130 for recognising the code 120 identifying the package 110, or from the database 310, in different embodiments.
Furthermore, the control unit 300 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e. sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control unit 300 may comprise a signal transmitter 530. The signal transmitter 530 may be configured for transmitting signals to be received by the visualisation device 140, configured to visually illustrate a loading position 150 of the identified package 110, in some embodiments.
The previously described steps 401-407 to be performed in the control unit 300 may be implemented through the one or more processors 520 within the control unit 300, together with a computer program product for performing at least some of the functions of the steps 401-407. Thus a computer program product, comprising instructions for performing the steps 401-407 in the control unit 300, may perform the method 400 comprising at least some of the steps 401-407 for visually illustrating the loading position 150 of the package 110 to be loaded into the vehicle 100, when the computer program is loaded into the one or more processors 520 of the control unit 300.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-407 according to some embodiments when being loaded into the one or more processors 520 of the control unit 300. The data carrier may be e.g. a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine-readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 300 remotely, e.g. over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400, the control unit 300, the computer program, the vehicle 100 and/ or the vehicle external structure 315. Various changes, substitutions and/ or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e. as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/ or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/ or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/ or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or another wired or wireless communication system.

Claims (18)

1. A method (400) for visually illustrating a loading position (150) of a package (110) to be loaded into a vehicle (100), wherein the method (400) comprises:
identifying (401) the package (110) to be loaded into the vehicle (100);
extracting (402) information concerning a destination (220) of the identified (401) package (110);
determining (403) a current geographical position (210) of the vehicle (100);
determining (404) a set (105) of packages (110) to be distributed by the vehicle (100) to a respective destination (220);
determining (405) a route of the vehicle (100), based on the current geographical position (210) of the vehicle (100) and the respective destination (220) of each package (110) in the set (105) of packages (110) to be distributed;
determining (406) a loading position (150) for each package (110) in the set (105) of packages (110) to be distributed, based on the unloading order of the set (105) of packages (110) according to the determined (405) route; and
visually illustrating (407) the loading position (150) of the identified (401) package (110).
2. The method (400) according to claim 1, wherein the loading position (150) for each package (110) in the set (105) of packages to be distributed is determined (406) in order to avoid removal of other packages in the set (105) when unloading the identified (401) package (110).
3. The method (400) according to any of claim 1 or claim 2, wherein the extracted (402) information of the identified (401) package (110) further comprises an indication of: sensibility of the package (110), weight of the package (110), dimensions of the package (110), stackability of the package (110), value of the package (110), and/ or whether the package (110) is subject of a transport certification service or not; and
wherein the loading position (150) for each package (110) in the set (105) is determined (406), further based on the further comprised indication of each respective identified (401) package (110).
4. The method (400) according to claim 3, wherein the loading position (150) for each package (110) is determined (406) further based on load distribution from a centre of gravity perspective, available storage space in the vehicle (100), whether there is a particular compartment for valuable packages (110), and/ or packages (110) subject of the transport certification service.
5. The method (400) according to any of claims 1-4, wherein the loading position (150) of the identified (401) package (110) is visually illustrated (407) by generating and transmitting command signals for directing and igniting a light source (140), for illuminating the loading position (150) of the identified (401) package (110).
6. The method (400) according to any of claims 1-4, wherein the loading position (150) of the identified (401) package (110) is visually illustrated (407) by generating and transmitting command signals to an at least partially transparent display (140) configured for augmented reality, wherein an indication of the loading position (150) of the identified (401) package (110) is provided when regarding the vehicle load space through the at least partially transparent display (140).
7. The method (400) according to any of claims 1-4, wherein the loading position (150) of the identified (401) package (110) is visually illustrated (407) by generating and transmitting command signals to a display (140) presenting an image of the vehicle load space and an indication corresponding to the loading position (150) of the identified (401) package (110).
8. A control unit (300) for visually illustrating a loading position (150) of a package (110) to be loaded into a vehicle (100), wherein the control unit (300) is configured to:
identify the package (110) to be loaded into the vehicle (100) when receiving a code representing the identity of the package (110);
extract information concerning a destination (220) of the identified package (110);
determine a current geographical position of the vehicle (100);
determine a set (105) of packages to be distributed by the vehicle (100) to a respective destination (220);
determine a route of the vehicle (100), based on the current geographical position of the vehicle (100) and the respective destination (220) of each package (110) in the set (105) of packages to be distributed;
determine a loading position (150) for each package (110) in the set (105) of packages (110) to be distributed, based on the unloading order of the set (105) of packages (110) according to the determined route; and
generate command signals for visually illustrating the loading position (150) of the identified package (110).
9. The control unit (300) according to claim 8, further configured to determine the loading position (150) for each package (110) to be delivered in order to avoid removal of other packages in the set (105) to be delivered when unloading the identified package (110).
10. The control unit (300) according to any of claim 8 or claim 9, further configured to extract information of the identified package (110) which further comprises an indication of: sensibility of the package (110), weight of the package (110), dimensions of the package (110), stackability of the package (110), value of the package (110), and/ or whether the package (110) is subject of a transport certification service or not; and
in addition configured to determine the loading position (150) for each package (110) to be distributed, further based on the further comprised indication of each respective identified package (110).
11. The control unit (300) according to claim 10, further configured to determine the loading position (150) for each package (110) further based on load distribution from a centre of gravity perspective, available storage space in the vehicle (100), whether there is a particular compartment for valuable packages (110) and/ or packages (110) subject of the transport certification service.
12. The control unit (300) according to any of claims 8-11, further configured to generate and transmit command signals for directing and igniting a light source (140), for illuminating the loading position (150) of the identified package (110).
13. The control unit (300) according to any of claims 8-12, further configured to generate and transmit command signals to an at least partially transparent display (140) configured for augmented reality, wherein an indication of the loading position (150) of the identified package (110) is provided when regarding the vehicle load space through the at least partially transparent display (140).
14. The control unit (300) according to any of claims 8-13, further configured to generate and transmit command signals to a display (140) presenting an image of the vehicle load space and an indication corresponding to the loading position (150) of the identified package (110).
15. A computer program comprising program code for performing a method (400) according to any of claims 1-7 when the computer program is executed in a control unit (300) according to any of claims 8-14.
16. A vehicle (100) comprising a control unit (300) according to any of claims 8-14.
17. A structure (315) comprising a control unit (300) according to any of claims 8-14, external to a vehicle (100).
18. A system (500) for visually illustrating a loading position (150) of a package (110) to be loaded into a vehicle (100), wherein the system (500) comprises:
a code (120) identifying the package (110) to be loaded into the vehicle (100);
a device (130) for recognising the code (120);
a control unit (300) according to any of claims 8-14; and
a visualisation device (140), configured to visually illustrate a loading position (150) of the identified package (110).
SE1650520A 2016-04-19 2016-04-19 Method and control unit for loading a vehicle SE1650520A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1650520A SE1650520A1 (en) 2016-04-19 2016-04-19 Method and control unit for loading a vehicle
DE102017003273.4A DE102017003273A1 (en) 2016-04-19 2017-04-04 METHOD AND CONTROL UNIT FOR LOADING A VEHICLE

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1650520A SE1650520A1 (en) 2016-04-19 2016-04-19 Method and control unit for loading a vehicle

Publications (1)

Publication Number Publication Date
SE1650520A1 true SE1650520A1 (en) 2017-10-20

Family

ID=59980703

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1650520A SE1650520A1 (en) 2016-04-19 2016-04-19 Method and control unit for loading a vehicle

Country Status (2)

Country Link
DE (1) DE102017003273A1 (en)
SE (1) SE1650520A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107934705A (en) * 2017-12-06 2018-04-20 广州广日电梯工业有限公司 A kind of elevator and its operation method suitable for automated workshop
CN107892236B (en) * 2017-12-11 2023-08-04 中国地质大学(武汉) Container loading and unloading operation auxiliary equipment
CN112288316A (en) * 2020-11-16 2021-01-29 珠海格力智能装备有限公司 Control method, device and system for dispatching vehicle

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NZ510133A (en) 2001-02-23 2004-01-30 Internat Stevedoring Operation 3-D modeling of cargo packing on a ship
US20030110102A1 (en) 2001-05-29 2003-06-12 Chen-Fu Chien Method for goods arrangement and its system
CN1936937A (en) 2005-09-20 2007-03-28 中国海洋大学 Heuristic car-distribution method under multiple constraint conditions
DE102013004537A1 (en) 2013-03-15 2013-09-19 Daimler Ag Device for monitoring parcel of box wagon, has detecting unit detecting size or weight of load before loading of load, and output unit outputting weight or size-dependant optimal position of load within loading space of vehicle
DE102014018198A1 (en) 2014-12-09 2015-06-18 Daimler Ag A method of assisting in loading a vehicle and apparatus adapted to carry out such a method

Also Published As

Publication number Publication date
DE102017003273A1 (en) 2017-10-19

Similar Documents

Publication Publication Date Title
JP6800217B2 (en) Method for determining a flat surface for landing an unmanned aerial vehicle
EP3698270B1 (en) Systems and methods for tracking goods carriers
ES2806559T3 (en) Object tracking method and system
JP6390068B2 (en) Method for delivering cargo by unmanned transport equipment
US10723554B2 (en) Systems and methods for intake and transport of physical objects in a facility
US11430148B2 (en) Apparatus and method for pallet volume dimensioning through 3D vision capable unmanned aerial vehicles (UAV)
US20200130833A1 (en) Systems, methods, and devices for package delivery using unmanned aerial vehicles
US20140354809A1 (en) Unmanned aerial vehicle inventory system
CN115027676A (en) Unmanned aerial vehicle pickup and delivery system
US20190303861A1 (en) System and method for item recovery by robotic vehicle
EP3262855A1 (en) A monitoring device and systems and methods related thereto
US10586202B2 (en) Systems and methods for validating products to be delivered by unmanned aerial vehicles
US20200254619A1 (en) Teleoperation in a smart container yard
US11905012B2 (en) Determining method of arrangement place, transport system, and information processing device
WO2017068871A1 (en) Information processing device, information processing method, and transportation system
KR101727516B1 (en) Method of Delivering Products Using Unmanned Delivery Equipment
CN109690437A (en) Vehicle with UAV Landing function
US20190080283A1 (en) System and Method for Pallet Optimization
SE1650520A1 (en) Method and control unit for loading a vehicle
CN111587444A (en) System and method for mobile parcel size calculation and predictive situational analysis
CN113226959A (en) Logistics system, unmanned aerial vehicle and cargo management method
EP3689743A1 (en) Augmented weight sensing for aircraft cargo handling systems
US10726379B1 (en) Last mile delivery systems and methods using a combination of autonomous launch and delivery vehicles
US11790313B1 (en) Unmanned aerial vehicle item delivery
JP2020179991A (en) Delivery containment

Legal Events

Date Code Title Description
NAV Patent application has lapsed