WO2020111999A1 - Method and control arrangement for visualisation of obstructed view - Google Patents

Method and control arrangement for visualisation of obstructed view Download PDF

Info

Publication number
WO2020111999A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
receiving unit
representation
control arrangement
transmitting unit
Prior art date
Application number
PCT/SE2019/051145
Other languages
French (fr)
Inventor
Pedro Lima
Marcello CIRILLO
Attila MÁRKUS
Henrik Pettersson
Lars-Gunnar Hedström
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Publication of WO2020111999A1

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/161 - Decentralised systems, e.g. inter-vehicle communication
    • G08G1/162 - Decentralised systems, e.g. inter-vehicle communication event-triggered
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28 - Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/85 - Arrangements for transferring vehicle- or driver-related data
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60 - Intended control result
    • G05D1/69 - Coordinated control of the position or course of two or more vehicles
    • G05D1/695 - Coordinated control of the position or course of two or more vehicles for maintaining a fixed relative position of the vehicles, e.g. for convoy travelling or formation flight
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/94 - Hardware or software architectures specially adapted for image or video understanding
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/96 - Management of image or video recognition tasks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/161 - Decentralised systems, e.g. inter-vehicle communication
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/22 - Platooning, i.e. convoy of communicating vehicles
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/22 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
    • G09G5/24 - Generation of individual character patterns
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 - Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16 - Type of output information
    • B60K2360/176 - Camera images
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 - Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/589 - Wireless data transfers
    • B60K2360/5915 - Inter vehicle communication
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 - Display means
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 - Input parameters relating to data
    • B60W2556/45 - External transmission of data to or from the vehicle
    • B60W2556/65 - Data transmitted between vehicles
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00 - Aspects of the architecture of display systems
    • G09G2360/14 - Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144 - Detecting light within display terminals, e.g. using a single or a plurality of photosensors, the light being ambient light
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 - Aspects of data communication
    • G09G2370/16 - Use of wireless transmission of display information
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 - Specific applications
    • G09G2380/10 - Automotive applications

Definitions

  • This document discloses a control arrangement of an information transmitting unit and a method therein, and a control arrangement of an information receiving unit and a method therein. More particularly, methods and control arrangements are described for providing information detected by the information transmitting unit to the information receiving unit.
  • Grouping vehicles into platoons is an emerging technology, leading to reduced fuel consumption and increased capacity of the roads.
  • A number of vehicles, e.g. 2-25 or more, may be organised in a platoon or vehicle convoy, wherein the vehicles drive in coordination after each other with only a small distance between the vehicles, such as some decimetres or some meters, e.g. 20 meters, or at a distance that is dependent on the speed of the platoon (e.g., the vehicles may be about 2 or 3 seconds apart during transportation).
  • Air resistance is reduced, which is important for reducing energy consumption, in particular for trucks, buses and goods vehicles or other vehicles having a large frontal area. In principle, the shorter the distance between the vehicles, the lower the air resistance becomes, which reduces energy consumption for the vehicle platoon.
  • The distance between the vehicles in the platoon may be reduced as the vehicles are enabled to communicate wirelessly with each other and thereby coordinate their velocity by e.g. accelerating or braking simultaneously. Thereby the reaction distance needed to accommodate human reaction time during normal driving is eliminated.
  • Platooning brings a multitude of advantages, such as improved fuel economy due to reduced air resistance, and also reduced traffic congestion leading to increased capacity of the roads and enhanced traffic flow. Also, platoons can readily exploit advancements in automation, for example by letting only the lead vehicle be human-driven, while the others follow autonomously. This would enable a reduction in the number of drivers (that is, one or two per platoon), or prolonged continuous driving, as the drivers in all but the first truck can rest.
  • Another emerging technology is remotely controlled vehicles, for example in mining, construction, forestry, military applications, rescuing operations, extraterrestrial explorations, etc.
  • the driver could then be safely situated in a control room, protected from a possibly hostile environment of the vehicle.
  • a video image or other sensor data captured by a sensor of the vehicle may be provided to the driver in the control room.
  • Vehicles are sometimes autonomous. Thereby the driver is omitted, superseded by an onboard control logic enabling the vehicle to drive and manage various appearing traffic situations, based on sensor data captured by sensors on the vehicle.
  • Various undefined, non-predicted situations may occur, which cannot be handled by the onboard control logic alone.
  • A human operator in a remote monitoring room may then be alerted and sensor data documenting the situation that has appeared may be transmitted to the operator.
  • Streaming video data may cause an unfortunate time delay. The driving commands of the operator may then be reactions to an obsolete situation, and / or arrive too late at the vehicle, which may compromise vehicle safety and / or cause an accident.
  • Document US2017158133 discloses a vehicle vision system with compressed video transfer.
  • The vehicle utilises one or more cameras to capture image data and transmits compressed video images to another vehicle.
  • This system can be used in a platooning group of vehicles.
  • the compressed images captured by a forward viewing camera of the lead vehicle are communicated to following vehicles.
  • The compressed video images may be processed by a machine vision processor.
  • This solution reduces the transfer time of the video stream by compressing the information to be transferred.
  • the process of compression/ decompression takes time.
  • a very small delay may be hazardous and cause an accident.
  • Document CN102821282 discloses video communication in a vehicular network.
  • the video captured by each vehicle is shared among vehicles of the whole vehicle fleet.
  • the captured video is coded and compressed. This solution shares the same or similar problems as the previously described solution.
  • Document US2015043782 discloses a method for detecting and displaying obstacles and data associated with the obstacles.
  • A digital device displays both the captured image and the related information. For example, the distance of each person is overlaid on their images.
  • this objective is achieved by a method in a control arrangement of an information transmitting unit.
  • the method aims at providing information to an information receiving unit.
  • The method comprises the steps of: collecting environmental data with at least one sensor. Further, the method comprises identifying an object in the environment of the information transmitting unit, which is considered relevant for the information receiving unit.
  • The method also comprises extracting data related to the identified object from the collected environmental data. The method furthermore comprises converting the extracted data into information.
  • The method in addition comprises determining the position of the object based on the collected environmental data.
  • The method comprises providing the converted information and the determined position of the object to the information receiving unit via a wireless transmitter, thereby enabling output of a representation of the object on an output device of the information receiving unit.
  • This objective is achieved by a control arrangement of an information transmitting unit.
  • The control arrangement aims at outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit.
  • The method comprises receiving information concerning the object and position of the object from the information transmitting unit via a wireless receiver.
  • the method comprises converting the received information concerning the object into a representation of the object.
  • the method additionally comprises outputting the representation of the object on an output device of the information receiving unit.
  • this objective is achieved by a method in a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit.
  • the method comprises receiving information concerning the object and position of the object from the information transmitting unit via a wireless receiver. Also, the method further comprises converting the received information concerning the object into a representation of the object. The method in addition comprises outputting the representation of the object on an output device of the information receiving unit.
  • This objective is achieved by a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit.
  • The control arrangement is configured to receive information concerning the object and position of the object from the information transmitting unit via a wireless receiver.
  • the control arrangement is also configured to convert the received information concerning the object into a representation of the object. Further, the control arrangement is configured to output the representation of the object on an output device of the information receiving unit.
  • information may be provided with low time latency to the information receiving unit.
  • the information receiving unit may obtain information in real time or with a very low time delay, enabling output of a representation of the object on an output device of the information receiving unit.
  • The driver of another vehicle/ the information receiving unit may be informed about the object detected by the first vehicle/ information providing unit without any substantial time delay, enabling the driver to prepare for an appropriate action due to the detected object. It is thereby avoided that the driver of the other vehicle/ the information receiving unit is surprised by a suddenly occurring action such as a hard brake, speed bump, etc., which in a worst-case scenario may cause an accident.
  • traffic safety is enhanced.
  • Figure 1A illustrates an embodiment of a group of vehicles.
  • Figure 1 B illustrates vehicles transmitting information between each other.
  • Figure 1C illustrates a vehicle transmitting information to a control tower.
  • Figure 2A illustrates a vehicle interior according to an embodiment of the invention.
  • Figure 2B illustrates a vehicle interior according to an embodiment of the invention.
  • Figure 2C illustrates a vehicle interior according to an embodiment of the invention.
  • Figure 3A illustrates vehicles transmitting information between each other.
  • Figure 3B illustrates vehicles transmitting information between each other.
  • Figure 3C illustrates vehicles transmitting information between each other.
  • Figure 4 is a flow chart illustrating an embodiment of a first method.
  • Figure 5 is an illustration depicting a control arrangement of an information transmitting unit according to an embodiment.
  • Figure 6 is a flow chart illustrating an embodiment of a second method.
  • Figure 7 is an illustration depicting a control arrangement of an information receiving unit according to an embodiment.
  • Embodiments of the invention described herein are defined as control arrangements and methods in control arrangements, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
  • Figure 1A illustrates a scenario wherein a number of vehicles 100a, 100b, 100c are driving in a driving direction 105, with inter-vehicular distances d1, d2.
  • the vehicles 100a, 100b, 100c may be coordinated and organised in a group 110 of vehicles, which may be referred to as a platoon.
  • the vehicles 100a, 100b, 100c may be non-coordinated, for example standing sequentially after each other in a traffic congestion, or just driving/ standing in a vicinity of each other.
  • the involved vehicles 100a, 100b, 100c may not necessarily be driving in the same direction 105, and / or the same file; or even be driving at all, i.e. one or more vehicles 100a, 100b, 100c may be stationary.
  • One or more of the vehicles 100a, 100b, 100c in the group 110 may be referred to as a structure rather than a vehicle, such as e.g. a building, a control tower, a lamp post, a traffic sign, etc.
  • At least one vehicle 100a, 100b, 100c in the group 110 is blocking at least a part of the view of at least one other vehicle 100a, 100b, 100c in the group 110.
  • In embodiments wherein the vehicle group 110 comprises a platoon, it may be described as a chain of coordinated, inter-communicating vehicles 100a, 100b, 100c travelling at given inter-vehicular distances d1, d2 and velocity.
  • The inter-vehicular distances d1, d2 may be fixed or variable in different embodiments.
  • The distances d1, d2 may be e.g. some centimetres, some decimetres, some meters or some tens of meters in some embodiments.
  • Each vehicle 100a, 100b, 100c in the group 110 may have a different distance d1, d2 to the following, or leading, vehicle 100a, 100b, 100c than all other vehicles 100a, 100b, 100c in the coordinated group 110.
  • The vehicles 100a, 100b, 100c in the group 110 may comprise vehicles of the same, or different, types in different embodiments, such as trucks, multi-passenger vehicles, trailers, cars, etc.; and / or structures such as buildings, road infrastructures, etc.
  • the vehicles 100a, 100b, 100c may be driver controlled or driverless autonomously con trolled vehicles in different embodiments. However, for enhanced clarity, the vehicles 100a, 100b, 100c are subsequently described as having a driver.
  • The vehicles 100a, 100b, 100c in the group 110 may be coordinated, or communicate, via wireless signals.
  • Such wireless signals may comprise, or at least be inspired by, wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
  • The communication between vehicles 100a, 100b, 100c in the group 110 may be performed via vehicle-to-vehicle (V2V) communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices.
  • The wireless communication may be made according to any IEEE standard for wireless vehicular communication, e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE).
  • IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access control (MAC) layer and physical layer (PHY) specification.
  • the communication may alternatively be made over a wireless interface comprising, or at least being inspired by radio access technologies such as e.g. third Generation Partnership Project (3GPP) 5G/ 4G, 3GPP Long Term Evolution (LTE), LTE-Advanced, Groupe Special Mobile (GSM), or similar, just to mention some few options, via a wireless communication network.
  • The driver of the first vehicle 100a drives the own vehicle 100a, and the other vehicles 100b, 100c in the group 110 merely follow the driving commands of the first vehicle 100a.
  • A non-leading vehicle 100b, 100c driving in the platoon 110 has a restricted view, as the leading vehicle 100a obstructs the field of view of the following vehicles 100b, 100c. This phenomenon may occur also in a vehicle queue, a traffic congestion, a parking lot, etc.
  • One vehicle 100a in the group 110, typically the first vehicle 100a of the group 110, comprises one or several sensors 130, of the same or different types.
  • The sensor 130 may be a forwardly directed sensor 130 in some embodiments.
  • The forwardly directed sensor 130 may be situated e.g. at the front of the first vehicle 100a of the group 110, behind the windscreen of the vehicle 100a.
  • the sensor 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
  • The sensor 130 may comprise e.g. a motion detector and / or be based on a Passive Infrared (PIR) sensor sensitive to a person's skin temperature through emitted black-body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or it may emit a continuous wave of microwave radiation and detect motion through the principle of Doppler radar; or emit an ultrasonic wave and detect and analyse the reflections; or use a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.
  • Figure 1B illustrates a scenario wherein the sensor 130 of one/ the first vehicle 100a of the group 110 is detecting an object 200.
  • The object 200 in the illustrated example is a human, but it may be any kind of object which may be considered relevant for another vehicle 100a, 100b, 100c in the group 110, such as an obstacle on the road, an animal at or in the vicinity of the road, another vehicle, a speed barrier, a structure at or close to the road such as a road sign, a traffic light, a crossing road and vehicles thereupon, etc.
  • The sensor 130 may comprise or be connected to a control arrangement configured for image recognition/ computer vision and object recognition.
  • Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information.
  • A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action.
  • This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory.
  • Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
  • the sensor data of the sensor/-s 130 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner, data of a lidar, radar, etc; or a combination thereof.
  • Information 210 comprising a simplified representation of the object 200 may be provided to the other vehicles/ information receiving units 100b, 100c.
  • Information 210 concerning the perceived environment may be transmitted as a simplified cartoon with standard images saved a priori, e.g. traffic light, pedestrian, car, bicycle, bus, etc. This data can then be used by the following vehicles to create a "cartooned" image of the obstructed field of view.
  • The transmission of this information is much simpler and faster, allowing for real-time communication of what is in front of the vehicle/ information transmitting unit 100a having detected the object 200.
  • The drivers of the following vehicles/ information receiving units 100b, 100c are informed about what is happening in front of the leading vehicle/ information transmitting unit 100a and can react accordingly, e.g. by preparing for a brake, for slowing down, for a speed bump, etc.
  • the drivers of the other vehicles/ information receiving units 100b, 100c could become aware of the environmental traffic situation and prepare accordingly.
  • Certain functions in the vehicles/ information receiving units 100b, 100c could be activated in some embodiments, triggered by the detected obstacle 200, such as tightening the safety belt when a hard brake could be expected, etc., or the vehicles 100a, 100b, 100c could activate some automatic functions to avoid the impact, e.g. slowing down or changing trajectory. Thereby traffic safety is increased, and / or the impact of any traffic accident is eliminated or at least decreased.
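  • As an illustration of how compact such information 210 can be compared with a video stream, the following minimal sketch serialises a detected object as a handful of fields; the field names, the reference numbering and the JSON encoding are illustrative assumptions, not part of this disclosure:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical payload for the information 210: instead of streaming video,
# only a type reference and a few geometric attributes are transmitted.
@dataclass
class ObjectInfo:
    ref: int            # reference into the shared table of prestored representations
    distance_m: float   # distance D between the sensor 130 and the object 200
    lateral_m: float    # lateral displacement L relative to the vehicle's longitudinal axis
    speed_mps: float    # estimated speed of the object 200
    heading_deg: float  # estimated heading of the object 200

info = ObjectInfo(ref=1, distance_m=23.5, lateral_m=-1.2, speed_mps=1.4, heading_deg=80.0)
payload = json.dumps(asdict(info)).encode()
print(len(payload), "bytes")  # tens of bytes, versus megabits per second for compressed video
```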
  • Figure 1C illustrates an embodiment wherein the group 110 comprises a vehicle 100a driving on a road 120, and a control room 100b.
  • the vehicle 100a is an information transmitting unit while the control room 100b is an information receiving unit.
  • The vehicle/ information transmitting unit 100a may be an unmanned vehicle, remotely controlled by a human driver in the control room/ information receiving unit 100b.
  • the vehicle/ information transmitting unit 100a may be an autonomous vehicle while the human driver in the control room / information receiving unit 100b is only alerted when the autonomous vehicle is experiencing an unknown/ undefined problem.
  • The situation may in other embodiments be the opposite, i.e. the control room 100b may be the information transmitting unit while the vehicle 100a may be the information receiving unit.
  • the vehicle 100a may be a manned vehicle and a sensor 130 in the control room 100b may detect information which may be provided to the driver of the vehicle 100a, such as for example map / directional information; working instructions for mining/ agriculture/ forestry, etc.
  • the sensor 130 of the information transmitting unit 100a may detect an object 200.
  • the control arrangement of the information transmitting unit 100a may then determine that the detected object 200 is relevant, and information 210 comprising a simplified representation of the object 200 may be provided to the information receiving unit 100b.
  • the information 210 may be received via the wireless receiver 140b and outputted on an output device such as a display or similar.
  • A human monitoring the vehicle 100a, or a plurality of vehicles comprised in the group 110, may become aware of the object 200 detected by the sensor 130 of the vehicle/ information transmitting unit 100a in real time, or almost real time, without risking the time delay that would have resulted if all the sensor data of the object 200 had been transferred.
  • The human may thereby react and determine an appropriate action of the vehicle/ information transmitting unit 100a, e.g. by sending a command or instructions on how to handle the situation due to the detected object 200.
  • Figure 2A illustrates an example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c.
  • The vehicle/ information receiving unit 100b comprises a control arrangement 230 for outputting a representation of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
  • The information transmitting unit 100a comprises a transmitter 140a, transmitting wireless data to be received by a receiver 140b in the information receiving unit 100b.
  • The information receiving unit 100b comprises an output device 240 in the form of a display, loudspeaker and / or a tactile device.
  • The output device 240 may comprise a pair of intelligent glasses, i.e. an optical head-mounted display that is designed in the shape of a pair of eyeglasses; or a set of portable head-up displays; or a device for illustrating an Augmented Reality (AR).
  • By receiving the simplified information 210 of the object 200 via the wireless communication from the transmitter 140a of the information transmitting unit 100a, and outputting a representation of the object 200 on the output device 240 in the form of a cartooned object, the driver of the vehicle/ information receiving unit 100b becomes aware of the object 200 in real time, or almost real time.
  • Figure 2B illustrates yet another example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenario illustrated in Figure 2A.
  • the information concerning the detected object 200 triggers the output of a prestored image of an animal on the output device 240.
  • the image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving unit 100b, 100c.
  • the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.
  • Figure 2C illustrates yet another example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenario illustrated in Figures 2A-2B.
  • the information concerning the detected object 200 triggers the output of a highly stylized image, which may be prestored, of a detected obstacle on the output device 240.
  • The image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving units 100b, 100c.
  • the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.
  • Figure 3A illustrates an example of information transfer, in some embodiments, and the vehicles 100a, 100b as regarded from above.
  • The sensor/-s 130 of the first vehicle/ information transmitting unit 100a may determine the distance D and / or the lateral displacement L of the object 200 in relation to the sensor 130 and / or the vehicle 100a (or some other reference point).
  • the distance/ lateral displacement may for example be determined by radar, lidar, etc., by triangulation of sensor signals captured by sensors 130 situated at different locations on the vehicle 100a; or by capturing an image and performing an image analysis.
  • The determined information concerning the relative position of the detected object 200, such as e.g. D and L, may then be provided to the information receiving unit 100b in some embodiments, together with information representing the object 200.
  • An absolute position of the detected object 200 may be calculated, based on the determined relative position of the object 200 and an absolute geographical position of the vehicle 100a.
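  • As a minimal sketch of that geometry, assuming a local flat-earth frame in metres and a known vehicle heading (the function and parameter names are illustrative only, not taken from this disclosure), the absolute position of the object 200 could be derived from D and L as follows:

```python
import math

def object_absolute_position(veh_east, veh_north, veh_heading_deg, distance_d, lateral_l):
    """Convert the relative position of the object 200 (distance D ahead, L to the left)
    into absolute coordinates, given the vehicle's position and heading.
    Local east/north metres are assumed; a real system would work in a proper
    geodetic frame (e.g. WGS84 converted to a local ENU frame)."""
    heading = math.radians(veh_heading_deg)           # 0 deg = north, clockwise positive
    forward = (math.sin(heading), math.cos(heading))  # unit vector in the driving direction
    left = (-forward[1], forward[0])                  # unit vector to the left of the vehicle
    east = veh_east + distance_d * forward[0] + lateral_l * left[0]
    north = veh_north + distance_d * forward[1] + lateral_l * left[1]
    return east, north

# Example: vehicle at the origin heading due north, object 20 m ahead and 2 m to the left.
print(object_absolute_position(0.0, 0.0, 0.0, 20.0, 2.0))  # -> (-2.0, 20.0)
```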
  • Figure 3B illustrates an example of information transfer, in some embodiments.
  • The information transmitting unit 100a and the information receiving unit 100b comprise a common table 320a, 320b wherein some different examples of object representation are stored, each associated with a reference, such as e.g. a number.
  • the table 320a of the information transmitting unit 100a may be stored in a memory 300 of the information transmitting unit 100a while the table 320b of the information receiving unit 100b may be stored in a memory 310 of the information receiving unit 100b.
  • the control arrangement 220 of the information transmitting unit 100a may then determine that the object 200 is relevant for the information receiving unit 100b and that the object 200 is categorised as a pedestrian/ human.
  • A reference number, in this case "1", referring to the representation of the object 200 in the table 320a, may be transmitted to the information receiving unit 100b via the transmitter 140a of the information transmitting unit 100a.
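  • A minimal sketch of such a shared table and reference transfer is given below; the table contents and helper names are invented for illustration, the point being that both units associate the same reference with the same kind of representation 330:

```python
# Table 320a / 320b: identical references on both sides, possibly different artwork.
PRESTORED_REPRESENTATIONS = {
    1: "pedestrian.png",
    2: "car.png",
    3: "traffic_light.png",
    4: "speed_bump.png",
}

def encode_reference(object_category: str) -> int:
    """Transmitting side: map a classified object category to the shared reference."""
    lookup = {"pedestrian": 1, "car": 2, "traffic_light": 3, "speed_bump": 4}
    return lookup[object_category]

def decode_reference(ref: int) -> str:
    """Receiving side: resolve the received reference to a prestored representation 330."""
    return PRESTORED_REPRESENTATIONS[ref]

ref = encode_reference("pedestrian")   # only this small integer is transmitted
print(decode_reference(ref))           # -> "pedestrian.png"
```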
  • Figure 3C illustrates yet another example of information transfer, in some embodiments, rather similar to the embodiment disclosed in Figure 3B.
  • The difference between the embodiment of Figure 3C and the previously discussed embodiment of Figure 3B is that the prestored representations 330 stored in the respective tables 320a, 320b may be user-selected.
  • the output representation 330 is personalised according to personal preferences of the users/ drivers. The output representation 330 may thereby become easier to identify by the user.
  • Figure 4 illustrates an example of a method 400 in a control arrangement 220 of an information transmitting unit 100a, according to an embodiment.
  • the flow chart in Figure 4 shows the method 400 for providing information 210 to an information receiving unit 100b, 100c.
  • The information transmitting unit 100a may comprise a vehicle in a group 110 of vehicles, comprising also an information receiving unit 100b, 100c.
  • The method 400 may comprise a number of steps 401-407. However, some of the described method steps 401-407, such as e.g. step 404, may be performed only in some embodiments. The described steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests.
  • The method 400 may comprise the subsequent steps:
  • Step 401 comprises collecting environmental data with at least one sensor 130.
  • the sensor 130 or plurality of sensors (of the same or different types) as may be the case, may be comprised onboard the information transmitting unit 100a, i.e. on board the vehicle.
  • The sensor/-s 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
  • Step 402 comprises identifying an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c and / or the information transmitting unit 100a.
  • the object 200 may be detected based on sensor data obtained from the at least one sensor 130, e.g. sensor data fused from a plurality of sensors 130 in some embodiments.
  • The object 200 may be considered relevant when comprised in a list of entities predetermined to be relevant, comprising e.g. any arbitrary object situated on the road 120 in front of the vehicle/ information transmitting unit 100a within a predetermined distance; a traffic sign associated with the road 120, within a predetermined distance; a traffic light associated with the road 120, within a predetermined distance (the information including the concurrent colour of the traffic light); road structure such as bends, curves, crossings; marks on the road 120 indicating a pedestrian crossing, a speed bump, a hole or other irregularity in the road surface; a building or other structure in the vicinity of the road 120, etc.
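  • As an illustrative sketch of such a relevance criterion (the category names and distance thresholds below are invented placeholders, not values from this disclosure), the check could be as simple as:

```python
RELEVANT_CATEGORIES = {"obstacle", "pedestrian", "traffic_sign", "traffic_light", "speed_bump"}
MAX_RELEVANT_DISTANCE_M = {"obstacle": 150.0, "pedestrian": 100.0,
                           "traffic_sign": 200.0, "traffic_light": 200.0,
                           "speed_bump": 80.0}

def is_relevant(category: str, distance_m: float, on_own_lane: bool) -> bool:
    """Step 402 (sketch): decide whether a detected object 200 is relevant for the
    information receiving unit, based on a predetermined list of entities and
    per-category distance thresholds."""
    if category not in RELEVANT_CATEGORIES:
        return False
    if category == "obstacle" and not on_own_lane:
        return False
    return distance_m <= MAX_RELEVANT_DISTANCE_M[category]

print(is_relevant("pedestrian", 42.0, on_own_lane=False))  # -> True
```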
  • Step 403 comprises extracting data related to the identified 402 object 200 from the collected 401 environmental data.
  • the extracted environmental data may comprise e.g. type of object 200, relative/ absolute position of the object 200, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc.
  • Step 404, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising the prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c.
  • This method step may only be performed in embodiments wherein the prestored representations 330 are stored in tables 320a, 320b in respective memories 300, 310 of the information transmitting unit 100a and the information receiving unit 100b.
  • the prestored representations 330 in the respective memories 300, 310 may be identical in some embodiments, or only representing the same kind of object in other embodiments. However, the prestored representations 330 are associated with the same references. Thereby, only the reference has to be communicated from the information transmitting unit 100a to the information receiving unit 100b, leading to a further reduced time delay during the information transfer.
  • Step 405 comprises converting the extracted 403 data into information 210.
  • The conversion 405 of the extracted 403 data into information 210 may comprise selecting a prestored representation 330 of the object 200 in a memory 300 of the information transmitting unit 100a.
  • The conversion 405 of the extracted 403 data into information 210 may in some embodiments comprise selecting a prestored representation 330 of the identified 402 object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c. Further, the conversion 405 of the extracted 403 data into information 210 may also comprise determining a reference to the selected prestored representation 330 in the table 320a, 320b.
  • Step 406 comprises determining the position of the object 200 based on the collected 401 environmental data.
  • the position of the object 200 may be related to the sensor 130/ vehicle 100a, comprising e.g. distance D to the object 200, between the sensor 130/ vehicle 100a and the object 200; lateral displacement L in relation to the sensor 130/ vehicle 100a; position in height of the object 200 (above the road surface), etc.
  • The position of the object 200 may also be an absolute position in some embodiments, determined based on the absolute geographical position of the vehicle 100a, as determined by a positioning unit of the vehicle 100a, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
  • The determination of the geographical position of the positioning unit (and thereby also of the vehicle 100a) may be made continuously, or at certain predetermined or configurable time intervals, according to various embodiments.
  • The absolute position of the object 200 may be determined based on the geographical position of the vehicle 100a, in addition to the relative position of the object 200 in relation to the vehicle 100a.
  • Step 407 comprises providing the converted 405 information 210 and the determined 406 position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
  • In embodiments wherein the conversion 405 of the extracted 403 data into information 210 comprises selecting a prestored representation 330 of the object 200, the provided 407 information 210 may comprise the selected prestored representation 330.
  • The provided 407 information 210 may comprise the determined reference to the selected prestored representation 330 in the table 320a, 320b. Furthermore, the provided 407 information 210 may comprise various data defining the object 200, such as e.g. type, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc., in various embodiments. The provided 407 information 210 may in some embodiments comprise data in object form.
  • Figure 5 illustrates an embodiment of a control arrangement 220 of an information transmitting unit 100a.
  • The control arrangement 220 aims at performing at least some of the method steps 401-407 according to the above described method 400 for providing information 210 to an information receiving unit 100b, 100c.
  • The control arrangement 220 is configured to collect environmental data with at least one sensor 130. Further, the control arrangement 220 is configured to identify an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c. Also, the control arrangement 220 is further configured to extract data related to the identified object 200 from the collected environmental data. The control arrangement 220 is in addition also configured to convert the extracted data into information 210. Furthermore, the control arrangement 220 is configured to determine the position of the object 200 based on the collected environmental data.
  • The control arrangement 220 is configured to provide the converted information 210 and the determined position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
  • The control arrangement 220 may in some embodiments be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the object 200. Further, the control arrangement 220 may be configured to provide, via the wireless transmitter 140a, information comprising the selected prestored representation 330.
  • The control arrangement 220 may be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the identified object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c.
  • The control arrangement 220 may be configured to determine a reference to the selected prestored representation 330 in the table 320a, 320b, wherein the provided information 210 comprises the determined reference.
  • the control arrangement 220 may also in some embodiments be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the converted information 210 is provided.
  • control arrangement 220 may be configured to provide information 210 comprising data in object form, in some embodiments.
  • The control arrangement 220 comprises a receiving circuit 510 configured for collecting information from a sensor 130.
  • The control arrangement 220 further comprises a processing circuitry 520 configured for providing information 210 to an information receiving unit 100b, 100c by performing the described method 400 according to at least some of the steps 401-407.
  • processing circuitry 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions.
  • The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
  • The control arrangement 220 may comprise a memory 525 in some embodiments.
  • the optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis.
  • The memory 525 may comprise integrated circuits comprising silicon-based transistors.
  • The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.
  • control arrangement 220 may comprise a signal transmitting circuit 530.
  • the signal transmitting circuit 530 may be configured for transmitting information 210 to at least some information receiving unit 100b, 100c.
  • The previously described method steps 401-407 to be performed in the control arrangement 220 may be implemented through the one or more processing circuits 520 within the control arrangement 220, together with a computer program product for performing at least some of the functions of the steps 401-407.
  • A computer program product comprising instructions for performing the steps 401-407 in the control arrangement 220 may perform the method 400 comprising at least some of the steps 401-407 for providing information 210 to the information receiving unit 100b, 100c, when the computer program is loaded into the one or more processing circuits 520 of the control arrangement 220.
  • The described steps 401-407 may thus be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 520 in the control arrangement 220.
  • The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-407 according to some embodiments when being loaded into the one or more processing circuits 520 of the control arrangement 220.
  • the data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner.
  • The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 220 remotely, e.g. over an Internet or an intranet connection.
  • Figure 6 illustrates an example of a method 600 in a control arrangement 230 of an information receiving unit 100b, 100c, according to an embodiment.
  • the flow chart in Figure 6 shows the method 600 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
  • The information receiving unit 100b, 100c may comprise a vehicle in a group 110 of vehicles, comprising also the information transmitting unit 100a.
  • The method 600 may comprise a number of steps 601-604. However, some of the described method steps 601-604 may be performed in a somewhat different chronological order than the numbering suggests.
  • the method 600 may comprise the subsequent steps:
  • Step 601 comprises receiving information 210 concerning the object 200 and position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b.
  • Step 602, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received 601.
  • Step 603 comprises converting the received 601 information 210 concerning the object 200 into a representation 330 of the object 200.
  • the conversion 603 of the received 601 information 210 into the representation 330 of the object 200 may comprise selecting the representation 330 of the object 200 based on the received 601 information 210, in some embodiments.
  • The conversion 603 may optionally comprise extracting a reference to a prestored representation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the received 601 information 210. Further, the conversion 603 may comprise selecting the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.
  • the representation 330 of the object 200 may be a simplified or cartooned version of the object 200 in some embodiments.
  • the representation 330 may for example comprise only a contour of the object 200, a geometric figure, a stylised illustration, a colour, a sound, a text, a tactile signal, etc., or possibly a combination thereof.
  • the representation 330 of the object 200 in the table 320b may be configurable by a user of the information receiving unit 100b, 100c, in some embodiments.
  • Step 604 comprises outputting the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
  • The output device 240 may comprise a visual output device such as a screen; a head-up display; a projector projecting the image on either the road 120 or the back of the vehicle ahead; a set of close-eye displays/ intelligent glasses/ lenses, i.e. an optical head-mounted display; a loudspeaker; a tactile device; and / or a combination thereof.
  • the output device 240 may in some embodiments be configured for Augmented Reality (AR), and / or Virtual Reality (VR).
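  • On the receiving side, a correspondingly simplified, hypothetical sketch of the flow 601-604 could look as follows; the radio link and the output device 240 are stubbed out, and the decode_reference helper sketched earlier is reused:

```python
def method_600(receiver, output_device):
    """Hypothetical sketch of steps 601, 603 and 604 on the information receiving unit 100b, 100c."""
    message = receiver.receive()                            # 601: receive information 210 and position
    representation = decode_reference(message["ref"])       # 603: convert into a representation 330
    output_device.show(representation, at=message["pos"])   # 604: output on the output device 240
```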
  • Figure 7 illustrates an embodiment of a control arrangement 230 of an information receiving unit 100b, 100c, for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
  • The control arrangement 230 is configured to perform at least some of the above described method steps 601-604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
  • The control arrangement 230 is configured to receive information 210 concerning the object 200 and position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b. Also, the control arrangement 230 is configured to convert the received information 210 concerning the object 200 into a representation 330 of the object 200. The control arrangement 230 is furthermore configured to output the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
  • control arrangement 230 may be configured to convert the re ceived information 210 into the representation 330 of the object 200 by selecting the repre sentation 330 of the object 200 based on the received information 210.
  • the control arrangement 230 may be further configured to convert the received information 210 into the representation 330 of the object 200 by extracting a reference to a prestored representation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the received information 210.
  • the control arrangement 230 may be configured to select the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.
  • control arrangement 230 may also be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received.
  • the control arrangement 230 may in some embodiments be further configured to enable a user of the information receiving unit 100b, 100c to configure the representation 330 of the object 200 in the table 320b.
  • control arrangement 230 may be configured to receive information 210 comprising data in object form, in some embodiments.
  • the control arrangement 230 comprises a receiving circuit 710 configured for collecting information 210 from a wireless transmitter 140a of an information transmitting unit 100a.
  • the control arrangement 230 further comprises a processing circuitry 720 configured for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, by performing the described method 600 according to at least some of the steps 601-604.
  • processing circuitry 720 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions.
  • the herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
  • the control arrangement 230 may comprise a memory 725 in some embodiments.
  • the optional memory 725 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis.
  • the memory 725 may comprise integrated circuits comprising silicon- based transistors.
  • the memory 725 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
  • control arrangement 230 may comprise a signal transmitting circuit 730.
  • the signal transmitting circuit 730 may be configured for providing the representation 330 to the output device 240 at the information receiving unit 100b, 100c.
  • a computer program product comprising instructions for performing the steps 601-604 in the control arrangement 230 may perform the method 600 comprising at least some of the steps 601-604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, when the computer program is loaded into the one or more processing circuits 720 of the control arrangement 230.
  • the described steps 601-604 thus may be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 720 in the control arrangement 230.
  • the computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 601-604 according to some embodiments when being loaded into the one or more processing circuits 720 of the control arrangement 230.
  • the data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner.
  • the computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 230 remotely, e.g., over an Internet or an intranet connection.
  • the solution may further comprise a vehicle 100a, 100b, 100c, comprising a control arrangement 220 as illustrated in Figure 5 and / or a control arrangement 230 as illustrated in Figure 7.
  • a vehicle 100a, 100b, 100c comprising a control arrangement 220 as illustrated in Figure 5 and / or a control arrangement 230 as illustrated in Figure 7.
  • the terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described methods 400, 600, control arrangements 220, 230, computer program and / or vehicle 100a, 100b, 100c.
  • Various changes, substitutions and / or alterations may be made, without departing from invention embodiments as defined by the appended claims.
  • the term "and / or" comprises any and all combinations of one or more of the associated listed items.
  • the term "or" as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise.
  • the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise.
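As a closing illustration of the receiving-side steps 601-604 listed above, the whole flow can be condensed into a small sketch. The message layout, the placeholder table contents and the print-based stand-in for the output device 240 are assumptions made for the example only; the disclosure does not prescribe any particular encoding or rendering.

```python
import struct

TABLE_320B = {1: "pedestrian icon", 2: "car icon", 3: "animal icon"}  # placeholder entries

def receive(msg: bytes):
    """Step 601: receive the information 210 (reference and relative position)."""
    reference, distance_m, lateral_m = struct.unpack("<Bff", msg)
    return reference, distance_m, lateral_m

def convert(reference: int) -> str:
    """Step 603: convert the received information into a representation 330."""
    return TABLE_320B[reference]

def output(representation: str, distance_m: float, lateral_m: float) -> None:
    """Step 604: output the representation on the output device 240."""
    print(f"show '{representation}' at D={distance_m:.1f} m, L={lateral_m:+.1f} m")

if __name__ == "__main__":
    wireless_msg = struct.pack("<Bff", 1, 42.0, -1.5)   # as if sent by the transmitter 140a
    reference, d, l = receive(wireless_msg)
    output(convert(reference), d, l)
```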

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c). The method (400) comprises collecting (401) environmental data with at least one sensor (130); identifying (402) an object (200), which is considered relevant; extracting (403) data related to the object (200) from the environmental data; converting (405) the data into information (210); determining (406) position of the object (200) based on the collected (401) environmental data; and providing (407) the information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c). Also, a method (600) in a control arrangement (230) of an information receiving unit (100b, 100c) is provided.

Description

METHOD AND CONTROL ARRANGEMENT FOR VISUALISATION OF OBSTRUCTED VIEW
TECHNICAL FIELD
This document discloses a control arrangement of an information transmitting unit and a method therein, and a control arrangement of an information receiving unit and a method therein. More particularly, methods and control arrangements are described, for providing information detected by the information transmitting unit, to the information receiving unit.
BACKGROUND
Grouping vehicles into platoons is an emerging technology, leading to reduced fuel consumption and increased capacity of the roads. A number of vehicles, e.g. 2-25 or more, may be organised in a platoon or vehicle convoy, wherein the vehicles are driving in coordination after each other with only a small distance between the vehicles, such as some decimetres or some meters, such as e.g. 20 meters, or at a distance that is dependent on the speed of the platoon (e.g., the vehicles may be about 2 or 3 seconds apart during transportation). Thereby air resistance is reduced, which is important for reducing energy consumption, in particular for trucks, buses and goods vehicles or other vehicles having a large frontal area. In principle it may be said that the shorter the distance is between the vehicles, the lower the air resistance becomes, which reduces energy consumption for the vehicle platoon.
The distance between the vehicles in the platoon may be reduced as the vehicles are enabled to communicate wirelessly with each other and thereby coordinate their velocity by e.g. accelerating or braking simultaneously. Thereby the reaction distance needed to accommodate human reaction time during normal driving is eliminated.
Platooning brings a multitude of advantages, such as improved fuel economy due to reduced air resistance, and also reduced traffic congestion leading to increased capacity of the roads and enhanced traffic flow. Also, platoons can readily exploit advancements in automation, for example by letting only the lead vehicle be human-driven, while the others follow autonomously. This would enable a reduction in the number of drivers (that is, one or two per platoon), or prolonged continuous driving, as the drivers in all but the first truck can rest.
However, the short distance between the vehicles in the platoon obstructs the field of view of the following vehicles. This problem also emerges for any vehicle in a queue, standing or driving behind another vehicle, in particular a large vehicle such as a truck, a bus, etc. Only the driver of the leading vehicle has an open field of view and can see what is ahead of the vehicle platoon/ vehicle queue. The following vehicles drive very close to each other and, therefore, drivers have their field of view obstructed. It has been proposed that the leading vehicle broadcasts to the following vehicles a video stream containing what is in front of the platoon leader. However, this can be quite complicated to do in real time, and a delayed video is useless, or even dangerous, as it may provide a false sense of security to the recipient of the video stream.
Another emerging technology is remotely controlled vehicles, for example in mining, construction, forestry, military applications, rescue operations, extraterrestrial exploration, etc. The driver could then be safely situated in a control room, protected from the possibly hostile environment of the vehicle. For the driver to become aware of the current situation of the vehicle, a video image or other sensor data captured by a sensor of the vehicle may be provided to the driver in the control room.
However, providing environmental data e.g. by streaming video data leads to a substantial time delay, which may be critical to the traffic/ operational safety of the vehicle. In case the remotely situated driver receives the images with a time delay, the driving commands of the driver may be reactions to an obsolete situation, and / or arrive too late to the vehicle, which may compromise vehicle safety and / or cause an accident.
Vehicles are sometimes autonomous. Thereby the driver is omitted, superseded by an onboard control logic enabling the vehicle to drive and manage the various traffic situations that arise, based on sensor data captured by sensors on the vehicle. However, various undefined, unpredicted situations may occur, which cannot be handled by the onboard control logic alone. A human operator in a remote monitoring room may then be alerted and sensor data documenting the situation that has arisen may be transmitted to the operator. However, as already mentioned, streaming video data may cause an unfortunate time delay. Again, the driving commands of the operator may be reactions to an obsolete situation, and / or arrive too late to the vehicle, which may compromise vehicle safety and / or cause an accident.
Document US2017158133 discloses a vehicle vision system with compressed video transfer. The vehicle utilises one or more cameras to capture image data and transmits compressed video images to another vehicle. This system can be used in a platooning group of vehicles. The compressed images captured by a forward viewing camera of the lead vehicle are communicated to following vehicles. The compressed video images may be processed by a machine vision processor. This solution reduces the transfer time of the video stream by compressing the information to be transferred. However, the process of compression/ decompression takes time. Also, a very small delay may be hazardous and cause an accident.
Document CN102821282 discloses video communication in a vehicular network. The video captured by each vehicle is shared among vehicles of the whole vehicle fleet. The captured video is coded and compressed. This solution shares the same or similar problems as the previously described solution.
Document US2015043782 discloses a method for detecting and displaying obstacles and data associated with the obstacles. A digital device displays both the captured image and the related information. For example, the distance of each person is overlaid on their images. The document mentions that the invention can solve the view problem of vehicle fleets.
As these described scenarios, and similar variants of them, may lead to general inconvenience and discomfort for the passengers and / or the driver in a platoon or dense traffic environment, it would be desirable to find a solution.
SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and improve visibility of a vehicle.
According to a first aspect of the invention, this objective is achieved by a method in a control arrangement of an information transmitting unit. The method aims at providing information to an information receiving unit. The method comprises collecting environmental data with at least one sensor. Further, the method comprises identifying an object in the environment of the information transmitting unit, which is considered relevant for the information receiving unit. The method also comprises extracting data related to the identified object from the collected environmental data. Also, the method furthermore comprises converting the extracted data into information. The method in addition comprises determining position of the object based on the collected environmental data. Furthermore, the method comprises providing the converted information and the determined position of the object to the information receiving unit via a wireless transmitter, thereby enabling output of a representation of the object on an output device of the information receiving unit.
According to a second aspect of the invention, this objective is achieved by a control arrangement of an information transmitting unit, for providing information to an information receiving unit. The control arrangement is configured to collect environmental data with at least one sensor, and to identify an object in the environment of the information transmitting unit which is considered relevant for the information receiving unit. The control arrangement is also configured to extract data related to the identified object from the collected environmental data, to convert the extracted data into information, and to determine position of the object based on the collected environmental data. Furthermore, the control arrangement is configured to provide the converted information and the determined position of the object to the information receiving unit via a wireless transmitter, thereby enabling output of a representation of the object on an output device of the information receiving unit.
According to a third aspect of the invention, this objective is achieved by a method in a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit. The method comprises receiving information concerning the object and position of the object from the information transmitting unit via a wireless receiver. Also, the method further comprises converting the received information concerning the object into a representation of the object. The method in addition comprises outputting the representation of the object on an output device of the information receiving unit.
According to a fourth aspect of the invention, this objective is achieved by a control arrangement of an information receiving unit, for outputting a representation of an object detected by at least one sensor of an information transmitting unit, based on information obtained from the information transmitting unit. The control arrangement is configured to receive information concerning the object and position of the object from the information transmitting unit via a wireless receiver. The control arrangement is also configured to convert the received information concerning the object into a representation of the object. Further, the control arrangement is configured to output the representation of the object on an output device of the information receiving unit.
Thanks to the described aspects, by converting the extracted data into information of an object which is considered relevant for the information receiving unit, information may be provided with low time latency to the information receiving unit. Thereby, the information receiving unit may obtain information in real time or with a very low time delay, enabling output of a representation of the object on an output device of the information receiving unit. Thereby, the driver of another vehicle/ the information receiving unit may be informed about the object detected by the first vehicle/ information providing unit without any substantial time delay, enabling the driver to prepare for an appropriate action due to the detected object. It is thereby avoided that the driver of the other vehicle/ the information receiving unit is surprised by a suddenly occurring action such as a hard brake, speed bump, etc., which in a worst-case scenario may cause an accident. Thus, traffic safety is enhanced.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:
Figure 1A illustrates an embodiment of a group of vehicles.
Figure 1B illustrates vehicles transmitting information between each other.
Figure 1C illustrates a vehicle transmitting information to a control tower.
Figure 2A illustrates a vehicle interior according to an embodiment of the invention.
Figure 2B illustrates a vehicle interior according to an embodiment of the invention.
Figure 2C illustrates a vehicle interior according to an embodiment of the invention.
Figure 3A illustrates vehicles transmitting information between each other.
Figure 3B illustrates vehicles transmitting information between each other.
Figure 3C illustrates vehicles transmitting information between each other.
Figure 4 is a flow chart illustrating an embodiment of a first method.
Figure 5 is an illustration depicting a control arrangement of an information transmitting unit according to an embodiment.
Figure 6 is a flow chart illustrating an embodiment of a second method.
Figure 7 is an illustration depicting a control arrangement of an information receiving unit according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as control arrangements and methods in control arrangements, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates a scenario wherein a number of vehicles 100a, 100b, 100c are driving in a driving direction 105, with inter-vehicular distances d1, d2. The vehicles 100a, 100b, 100c may be coordinated and organised in a group 110 of vehicles, which may be referred to as a platoon. However, the vehicles 100a, 100b, 100c may be non-coordinated, for example standing sequentially after each other in a traffic congestion, or just driving/ standing in a vicinity of each other. The involved vehicles 100a, 100b, 100c may not necessarily be driving in the same direction 105, and / or the same file; or even be driving at all, i.e. one or more vehicles 100a, 100b, 100c may be stationary. In fact, one or more of the vehicles 100a, 100b, 100c in the group 110 may be referred to as a structure rather than a vehicle, such as e.g. a building, a control tower, a lamp post, a traffic sign, etc. However, at least one vehicle 100a, 100b, 100c in the group 110 is blocking at least a part of the view of at least one other vehicle 100a, 100b, 100c in the group 110.
In embodiments wherein the vehicle group 110 comprises a platoon, it may be described as a chain of coordinated, inter-communicating vehicles 100a, 100b, 100c travelling at given inter-vehicular distances d1, d2 and velocity.
The inter-vehicular distances d1, d2 may be fixed or variable in different embodiments. Thus, the distances d1, d2 may be e.g. some centimetres, some decimetres, some meters or some tens of meters in some embodiments. Alternatively, each vehicle 100a, 100b, 100c in the group 110 may have a different distance d1, d2 to the following, or leading, vehicle 100a, 100b, 100c, than all other vehicles 100a, 100b, 100c in the coordinated group 110. The vehicles 100a, 100b, 100c in the group 110 may comprise vehicles of the same, or different, types in different embodiments, such as trucks, multi-passenger vehicles, trailers, cars, etc.; and / or structures such as buildings, road infrastructures, etc. The vehicles 100a, 100b, 100c may be driver controlled or driverless autonomously controlled vehicles in different embodiments. However, for enhanced clarity, the vehicles 100a, 100b, 100c are subsequently described as having a driver.
The vehicles 100a, 100b, 100c in the group 110 may be coordinated, or communicate, via wireless signals. Such wireless signals may comprise, or at least be inspired by, wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.
In some embodiments, the communication between vehicles 100a, 100b, 100c in the group 110 may be performed via vehicle-to-vehicle (V2V) communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC works in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the IEEE 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
The communication may alternatively be made over a wireless interface comprising, or at least being inspired by, radio access technologies such as e.g. Third Generation Partnership Project (3GPP) 5G/ 4G, 3GPP Long Term Evolution (LTE), LTE-Advanced, Groupe Special Mobile (GSM), or similar, just to mention a few options, via a wireless communication network.
In some embodiments, when the vehicles 100a, 100b, 100c in the group 110 are coordinated and are communicating, the driver of the first vehicle 100a drives that vehicle while the other vehicles 100b, 100c in the group 110 merely follow the driving commands of the first vehicle 100a. A non-leading vehicle 100b, 100c driving in the platoon 110 has a restricted view, as the leading vehicle 100a obstructs the field of view of the following vehicles 100b, 100c. This phenomenon may occur also in a vehicle queue, a traffic congestion, a parking lot, etc.
According to embodiments disclosed herein, one vehicle 100a in the group 110, typically the first vehicle 100a of the group 110, comprises one or several sensors 130, of the same or different types. The sensor 130 may be a forwardly directed sensor 130 in some embodiments. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 130 may be situated e.g. at the front of the first vehicle 100a of the group 110, behind the windscreen of the vehicle 100a.
The sensor 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
In some embodiments, the sensor 130 may comprise e.g. a motion detector and / or be based on a Passive Infrared (PIR) sensor sensitive to a person's skin temperature through emitted black body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or by emitting a continuous wave of microwave radiation and detecting motion through the principle of Doppler radar; or by emitting an ultrasonic wave and detecting and analysing the reflections; or by a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.
Figure 1B illustrates a scenario wherein the sensor 130 of one/ the first vehicle 100a of the group 110 is detecting an object 200. The object 200 in the illustrated example is a human, but it may be any kind of object which may be considered relevant for another vehicle 100a, 100b, 100c in the group 110, such as an obstacle on the road, an animal at or in the vicinity of the road, another vehicle, a speed barrier, a structure at or close to the road such as a road sign, a traffic light, a crossing road and vehicles thereupon, etc.
The sensor 130 may comprise or be connected to a control arrangement configured for image recognition/ computer vision and object recognition. Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
The sensor data of the sensor/-s 130 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner, data of a lidar, radar, etc; or a combination thereof.
When an object 200 is detected by the sensor 130 of the first vehicle/ information transmitting unit 100a, which is considered relevant for the other vehicles/ information receiving units 100b, 100c in the group 110 by the control arrangement, information 210 comprising a simplified representation of the object 200 may be provided to the other vehicles/ information receiving units 100b, 100c.
Thus, instead of transmitting the whole data set of the sensor 130, information 210 concerning the perceived environment may be transmitted as a simplified cartoon with standard images saved a priori, e.g., traffic light, pedestrian, car, bicycle, bus, etc. This data can then be used by the following vehicles to create a "cartooned" image of the obstructed field of view.
The transmission of this information is much simpler and faster, allowing for real time communication of what is in front of the vehicle/ information transmitting unit 100a having detected the object 200. The drivers of the following vehicles/ information receiving units 100b, 100c are informed about what is happening in front of the leading vehicle/ information transmitting unit 100a and can react accordingly, e.g. by preparing for a brake, for slowing down, for a speed bump, etc. By being able to receive real time information without time lag, or with only a negligible time lag, the drivers of the other vehicles/ information receiving units 100b, 100c could become aware of the environmental traffic situation and prepare accordingly. Also, certain functions in the vehicles/ information receiving units 100b, 100c could be activated in some embodiments, triggered by the detected obstacle 200, such as tightening the safety belt when a hard brake could be expected, etc., or the vehicles 100a, 100b, 100c could activate some automatic functions to avoid the impact, e.g., slowing down or changing trajectory. Thereby traffic safety is increased, and / or the impact of any traffic accident is eliminated or at least decreased.
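How small such an information message 210 can be, compared with a streamed video frame, can be illustrated with a minimal sketch. The type codes, field layout and byte format below are assumptions made for the example only; the disclosure does not prescribe any particular encoding.

```python
import struct

# Illustrative type codes for objects saved a priori; the actual references used
# in the tables 320a, 320b are not fixed by the disclosure.
OBJECT_TYPES = {"traffic_light": 1, "pedestrian": 2, "car": 3, "bicycle": 4, "bus": 5}

def pack_information(obj_type: str, distance_m: float, lateral_m: float) -> bytes:
    """Pack an object type reference and its relative position into a fixed-size message."""
    return struct.pack("<Bff", OBJECT_TYPES[obj_type], distance_m, lateral_m)

def unpack_information(msg: bytes):
    """Recover the reference and the relative position on the receiving side."""
    return struct.unpack("<Bff", msg)

if __name__ == "__main__":
    msg = pack_information("pedestrian", distance_m=42.0, lateral_m=-1.5)
    print(len(msg), "bytes, instead of a compressed video frame")   # 9 bytes
    print(unpack_information(msg))
```

A message of this size can be broadcast over a V2V link in a single short frame, which is what makes the real time behaviour described above plausible.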
Figure 1C illustrates an embodiment wherein the group 110 comprises a vehicle 100a driving on a road 120, and a control room 100b. In the illustrated embodiment, the vehicle 100a is an information transmitting unit while the control room 100b is an information receiving unit.
The vehicle/ information transmitting unit 100a may be an unmanned vehicle, remotely controlled by a human driver in the control room / information receiving unit 100b. In other embodiments, the vehicle/ information transmitting unit 100a may be an autonomous vehicle while the human driver in the control room / information receiving unit 100b is only alerted when the autonomous vehicle is experiencing an unknown/ undefined problem. However, the situation may in other embodiments be the opposite, i.e. the control room 100b may be the information transmitting unit while the vehicle 100a may be the information receiving unit. In this embodiment, the vehicle 100a may be a manned vehicle and a sensor 130 in the control room 100b may detect information which may be provided to the driver of the vehicle 100a, such as for example map / directional information; working instructions for mining/ agriculture/ forestry, etc.
The sensor 130 of the information transmitting unit 100a may detect an object 200. The control arrangement of the information transmitting unit 100a may then determine that the detected object 200 is relevant, and information 210 comprising a simplified representation of the object 200 may be provided to the information receiving unit 100b. At the information receiving unit 100b, the information 210 may be received via the wireless receiver 140b and outputted on an output device such as a display or similar.
Thereby, a human monitoring the vehicle 100a, or a plurality of vehicles comprised in the group 110, may become aware of the object 200 detected by the sensor 130 of the vehicle/ information transmitting unit 100a in real time, or almost real time, without suffering the time delay that would have resulted if all the sensor data of the object 200 had been transferred. The human may thereby react and determine an appropriate action of the vehicle/ information transmitting unit 100a, e.g. by sending a command or instructions on how to handle the situation due to the detected object 200.
Figure 2A illustrates an example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c. The vehicle/ information receiving unit 100b comprises a control arrangement 230 for outputting a representation of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a. The information transmitting unit 100a comprises a transmitter 140a, transmitting wireless data to be received by a receiver 140b in the information receiving unit 100b.
Further, the information receiving unit 100b comprises an output device 240 in the form of a display, a loudspeaker and / or a tactile device. Alternatively, the output device 240 may comprise a pair of intelligent glasses, i.e. an optical head-mounted display that is designed in the shape of a pair of eyeglasses; or a set of portable head-up displays; or a device for illustrating an Augmented Reality (AR).
By receiving the simplified information 210 of the object 200 via the wireless communication from the transmitter 140a of the information transmitting unit 100a and outputting a representation of the object 200 on the output device 240 in the form of a cartooned object, the driver of the vehicle/ information receiving unit 100b becomes aware of the object 200 in real time, or almost real time.
Figure 2B illustrates yet another example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenario illustrated in Figure 2A.
In this case, the information concerning the detected object 200, here an animal, triggers the output of a prestored image of an animal on the output device 240. The image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving unit 100b, 100c. Alternatively, in some embodiments, the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.
The position of the detected object 200 relative to the sensor 130, or vehicle 100a, may also be determined and provided to the information receiving unit 100b.
Figure 2C illustrates yet another example of a scenario as it may be perceived by the driver of the second vehicle 100b in a group 110 of vehicles 100a, 100b, 100c, similar to the scenarios illustrated in Figures 2A-2B.
In this case, the information concerning the detected object 200, again an animal, triggers the output of a highly stylised image, which may be prestored, of a detected obstacle on the output device 240. The image may be prestored at a memory device of the first vehicle/ information transmitting unit 100a and transmitted to the other vehicles/ information receiving unit 100b, 100c. Alternatively, in some embodiments, the image may be prestored at a memory device of the other vehicles/ information receiving unit 100b, 100c and only a reference to the image is transferred from the first vehicle/ information transmitting unit 100a to the other vehicles/ information receiving unit 100b, 100c.
Figure 3A illustrates an example of information transfer, in some embodiments, with the vehicles 100a, 100b as regarded from above.
The sensor/-s 130 of the first vehicle/ information transmitting unit 100a may determine the distance D and / or the lateral displacement L of the object 200 in relation to the sensor 130 and / or the vehicle 100a (or some other reference point). The distance/ lateral displacement may for example be determined by radar, lidar, etc., by triangulation of sensor signals captured by sensors 130 situated at different locations on the vehicle 100a, or by capturing an image and performing an image analysis. The determined information concerning the relative position of the detected object 200, such as e.g. D and L, may then be provided to the information receiving unit 100b in some embodiments, together with information representing the object 200. In other embodiments, an absolute position of the detected object 200 may be calculated, based on the determined relative position of the object 200 and an absolute geographical position of the vehicle 100a.
By determining the position of the object 200 in relation to the sensor 130 and / or the vehicle 100a, it becomes possible to recreate, at the receiver side, the position of the object 200 in relation to the vehicle 100a, and also in relation to other detected objects 200, if any. At the information receiving unit 100b, calculations may be made for outputting a simplified representation 330 of the object 200 at a relative position on the output device 240, thereby providing a view of the environmental situation of the vehicle 100a, comprising the objects 200 considered important/ relevant. The human driver/ operator at the information receiving unit 100b may thereby obtain a view of the situation which gives him/ her an intuitive understanding of the scenario detected by the sensors 130 of the first vehicle/ information transmitting unit 100a.
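A minimal sketch of this receiver-side recreation is given below. The bird's-eye layout, the screen size and the range limits are assumptions chosen only to make the example concrete; the disclosure leaves the actual rendering on the output device 240 open.

```python
def to_display_coordinates(distance_m, lateral_m,
                           screen_w=800, screen_h=480,
                           max_distance_m=100.0, max_lateral_m=10.0):
    """Map an object position relative to the ego vehicle (distance ahead D,
    lateral displacement L, positive to the right) onto a bird's-eye style
    display: the ego vehicle sits at the bottom centre, far objects near the top."""
    x = screen_w / 2 + (lateral_m / max_lateral_m) * (screen_w / 2)
    y = screen_h - (distance_m / max_distance_m) * screen_h
    # Clamp so objects outside the assumed ranges still land on the screen edge.
    x = min(max(x, 0), screen_w - 1)
    y = min(max(y, 0), screen_h - 1)
    return int(x), int(y)

if __name__ == "__main__":
    # A pedestrian 40 m ahead and 1.5 m to the left of the sensor.
    print(to_display_coordinates(distance_m=40.0, lateral_m=-1.5))
```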
Thereby, information concerning the detected object 200 is provided without (relevant) time delay, in a way which the human driver/ operator can immediately interpret and understand. Thus, traffic safety is enhanced.
Figure 3B illustrates an example of information transfer, in some embodiments.
In the illustrated embodiment, the information transmitting unit 100a and the information receiving unit 100b comprise a common table 320a, 320b wherein some different examples of object representations are stored, each associated with a reference, such as e.g. a number. The table 320a of the information transmitting unit 100a may be stored in a memory 300 of the information transmitting unit 100a while the table 320b of the information receiving unit 100b may be stored in a memory 310 of the information receiving unit 100b.
The sensor 130 of the information transmitting unit 100a may thereby detect the object 200, in this case a pedestrian in front of the information transmitting unit 100a. The control arrangement 220 of the information transmitting unit 100a may then determine that the object 200 is relevant for the information receiving unit 100b and that the object 200 is categorised as a pedestrian/ human. A reference number, in this case "1", referring to the representation of the object 200 in the table 320a may be transmitted to the information receiving unit 100b, via the transmitter 140a of the information transmitting unit 100a.
At the information receiving unit 100b, upon receiving the information, in this case the reference "1", by the receiver 140b, a search may be made in the table 320b of the memory 310 to determine which representation corresponds to the received reference. In this case, the reference "1" corresponds to a representation 330 of a human, which then may be output on the output device 240 of the information receiving unit 100b.
Figure 3C illustrates yet another example of information transfer, in some embodiments rather similar to the embodiment disclosed in Figure 3B. The difference between the embodiment of Figure 3C and the previously discussed embodiment of Figure 3B is that the prestored representations 330 stored in the respective tables 320a, 320b may be user-selected. Thereby, the output representation 330 is personalised according to the personal preferences of the users/ drivers. The output representation 330 may thereby become easier for the user to identify.
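The table-and-reference mechanism of Figures 3B and 3C can be sketched in a few lines. The table contents, the integer references and the user-override mechanism below are illustrative assumptions; only the principle of transmitting a reference and resolving it against a coordinated table 320b follows the description above.

```python
# Coordinated table of prestored representations (tables 320a/320b). The entries
# here are placeholder file names; they could equally be contours, sounds or icons.
SHARED_TABLE = {
    1: "representations/pedestrian.png",
    2: "representations/car.png",
    3: "representations/animal.png",
    4: "representations/traffic_light.png",
}

def transmitting_side(category_reference: int) -> int:
    """Information transmitting unit 100a: only the reference is sent, not the image."""
    if category_reference not in SHARED_TABLE:
        raise ValueError("reference not coordinated between the units")
    return category_reference

def receiving_side(reference: int, user_overrides=None) -> str:
    """Information receiving unit 100b/100c: resolve the reference to a prestored
    representation 330, optionally personalised by the user as in Figure 3C."""
    table = dict(SHARED_TABLE)
    table.update(user_overrides or {})
    return table[reference]

if __name__ == "__main__":
    ref = transmitting_side(1)                                  # pedestrian detected
    print(receiving_side(ref))                                  # default representation
    print(receiving_side(ref, {1: "my_icons/walker.svg"}))      # user-configured variant
```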
Figure 4 illustrates an example of a method 400 in a control arrangement 220 of an information transmitting unit 100a, according to an embodiment. The flow chart in Figure 4 shows the method 400 for providing information 210 to an information receiving unit 100b, 100c.
The information transmitting unit 100a may comprise a vehicle in a group 110 of vehicles, comprising also an information receiving unit 100b, 100c.
In order to correctly be able to provide information 210 to the information receiving unit 100b, 100c, the method 400 may comprise a number of steps 401-407. However, some of the described method steps 401-407, such as e.g. step 404, may be performed only in some embodiments. The described steps 401-407 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:
Step 401 comprises collecting environmental data with at least one sensor 130. The sensor 130, or plurality of sensors (of the same or different types) as may be the case, may be comprised onboard the information transmitting unit 100a, i.e. on board the vehicle.
The sensor/-s 130 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
Step 402 comprises identifying an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c and / or the information transmitting unit 100a. The object 200 may be detected based on sensor data obtained from the at least one sensor 130, e.g. sensor data fused from a plurality of sensors 130 in some embodiments.
The object 200 may be considered relevant when comprised in a list of entities predetermined to be relevant, comprising e.g. any arbitrary object situated on the road 120 in front of the vehicle/ information transmitting unit 100a within a predetermined distance; a traffic sign associated with the road 120, within a predetermined distance; a traffic light associated with the road 120, within a predetermined distance (the information including the current colour of the traffic light); road structure such as bends, curves, crossings; marks on the road 120 indicating a pedestrian crossing, a speed bump, a hole or other irregularity in the road surface; a building or other structure in the vicinity of the road 120, etc.
Thereby, a filtering of information is made, neglecting irrelevant information around the vehicle/ information transmitting unit 100a. Thus, only relevant information is communicated to the information receiving unit 100b, 100c, leading to a reduced communication delay. It also becomes easier for the human driver/ operator at the information receiving unit 100b, 100c to immediately detect the representation 330 of the relevant object 200, as disturbing non-relevant objects have been filtered out.
Step 403 comprises extracting data related to the identified 402 object 200 from the collected 401 environmental data.
The extracted environmental data may comprise e.g. type of object 200, relative/ absolute position of the object 200, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc.
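A minimal sketch of this relevance filtering and data extraction (steps 402 and 403) is shown below. The categories, distance thresholds and extracted fields are illustrative assumptions; the actual relevance criteria are given by the predetermined list described above.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    category: str        # e.g. "pedestrian", "traffic_light", "animal"
    distance_m: float    # distance D between the sensor 130 and the object
    lateral_m: float     # lateral displacement L
    speed_mps: float = 0.0

# Illustrative relevance rules: category and the distance within which the
# object is considered relevant for the information receiving unit.
RELEVANCE_RULES = {
    "pedestrian": 150.0,
    "animal": 150.0,
    "traffic_light": 200.0,
    "traffic_sign": 200.0,
    "obstacle": 100.0,
}

def is_relevant(obj: DetectedObject) -> bool:
    limit = RELEVANCE_RULES.get(obj.category)
    return limit is not None and obj.distance_m <= limit

def extract_data(detected_objects):
    """Keep only relevant objects and only the fields needed for the information 210."""
    return [
        {"category": o.category, "distance_m": o.distance_m,
         "lateral_m": o.lateral_m, "speed_mps": o.speed_mps}
        for o in detected_objects if is_relevant(o)
    ]

if __name__ == "__main__":
    scene = [DetectedObject("pedestrian", 42.0, -1.5),
             DetectedObject("building", 60.0, 8.0),        # not in the relevance list
             DetectedObject("traffic_light", 300.0, 2.0)]  # too far away
    print(extract_data(scene))
```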
Step 404, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising the prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c.
This method step may only be performed in embodiments wherein the prestored representations 330 are stored in tables 320a, 320b in the respective memories 300, 310 of the information transmitting unit 100a and the information receiving unit 100b. The prestored representations 330 in the respective memories 300, 310 may be identical in some embodiments, or may only represent the same kind of object in other embodiments. However, the prestored representations 330 are associated with the same references. Thereby, only the reference has to be communicated from the information transmitting unit 100a to the information receiving unit 100b, leading to a further reduced time delay during the information transfer.
Step 405 comprises converting the extracted 403 data into information 210.
The conversion 405 of the extracted 403 data into information 210 may comprise selecting a prestored representation 330 of the object 200 in a memory 300 of the information transmitting unit 100a.
The conversion 405 of the extracted 403 data into information 210 may in some embodiments comprise selecting a prestored representation 330 of the identified 402 object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c. Further, the conversion 405 of the extracted 403 data into information 210 may also comprise determining a reference to the selected prestored representation 330 in the table 320a, 320b.
Step 406 comprises determining position of the object 200 based on the collected 401 environmental data.
The position of the object 200 may be related to the sensor 130/ vehicle 100a, comprising e.g. the distance D between the sensor 130/ vehicle 100a and the object 200; the lateral displacement L in relation to the sensor 130/ vehicle 100a; the position in height of the object 200 (above the road surface), etc.
However, the position of the object 200 may also be an absolute position in some embodiments, determined based on the absolute geographical position of the vehicle 100a, as determined by a positioning unit of the vehicle 100a, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The geographical position of the positioning unit (and thereby also of the vehicle 100a) may be determined continuously, at certain predetermined or configurable time intervals, according to various embodiments.
The absolute position of the object 200 may be determined based on the geographical position of the vehicle 100a, in addition to the relative position of the object 200 in relation to the vehicle 100a.
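The computation behind this step can be illustrated with a minimal sketch. The flat-earth approximation, the heading convention and the numeric example are assumptions made for the illustration; the disclosure does not prescribe how the coordinate transformation is carried out.

```python
import math

def absolute_position(veh_lat, veh_lon, heading_deg, distance_m, lateral_m):
    """Turn a position relative to the vehicle 100a (distance ahead D, lateral
    displacement L, positive to the right) into an absolute latitude/longitude,
    using a local flat-earth approximation that is adequate at sensor range."""
    heading = math.radians(heading_deg)            # 0 degrees = north, clockwise
    # Offset of the object in a local north/east frame.
    north = distance_m * math.cos(heading) - lateral_m * math.sin(heading)
    east = distance_m * math.sin(heading) + lateral_m * math.cos(heading)
    # Metres per degree of latitude/longitude near the vehicle position.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(veh_lat))
    return veh_lat + north / m_per_deg_lat, veh_lon + east / m_per_deg_lon

if __name__ == "__main__":
    # Vehicle heading east; object 40 m ahead and 1.5 m to the left.
    print(absolute_position(59.33, 18.07, heading_deg=90.0,
                            distance_m=40.0, lateral_m=-1.5))
```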
Step 407 comprises providing the converted 405 information 210 and the determined 406 position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
In embodiments wherein the conversion 405 of the extracted 403 data into information 210 comprises selecting a prestored representation 330 of the object 200, the provided 407 information 210 may comprise the selected prestored representation 330.
The provided 407 information 210 may comprise the determined reference to the selected prestored representation 330 in the table 320a, 320b. Furthermore, the provided 407 information 210 may comprise various data defining the object 200, such as e.g. type, direction of motion, speed, size of the object 200, distance D between the sensor 130 and the object 200, colour of the object 200, etc., in various embodiments. The provided 407 information 210 may in some embodiments comprise data in object form.
Figure 5 illustrates an embodiment of a control arrangement 220 of an information transmitting unit 100a. The control arrangement 220 aims at performing at least some of the method steps 401-407 according to the above described method 400 for providing information 210 to an information receiving unit 100b, 100c.
The control arrangement 220 is configured to collect environmental data with at least one sensor 130. Further, the control arrangement 220 is configured to identify an object 200 in the environment of the information transmitting unit 100a, which is considered relevant for the information receiving unit 100b, 100c. Also, the control arrangement 220 is further configured to extract data related to the identified object 200 from the collected environmental data. The control arrangement 220 is in addition also configured to convert the extracted data into information 210. Furthermore, the control arrangement 220 is configured to determine position of the object 200 based on the collected environmental data. The control arrangement 220 is configured to provide the converted information 210 and the determined position of the object 200 to the information receiving unit 100b, 100c via a wireless transmitter 140a, thereby enabling output of a representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
The control arrangement 220 may in some embodiments be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the object 200. Further, the control arrangement 220 may be configured to provide, via the wireless transmitter 140a, information comprising the selected prestored representation 330.
In some embodiments, the control arrangement 220 may be configured to convert the extracted data into information 210 by selecting a prestored representation 330 of the identified object 200 in a table 320a, 320b stored in both a memory 300 of the information transmitting unit 100a and a memory 310 of the information receiving unit 100b, 100c. The control arrangement 220 may be configured to determine a reference to the selected prestored representation 330 in the table 320a, 320b, wherein the provided information 210 comprises the determined reference.
The control arrangement 220 may also in some embodiments be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the converted information 210 is provided.
Also, the control arrangement 220 may be configured to provide information 210 comprising data in object form, in some embodiments.
The control arrangement 220 comprises a receiving circuit 510 configured for collecting information from a sensor 130.
The control arrangement 220 further comprises a processing circuitry 520 configured for providing information 210 to an information receiving unit 100b, 100c by performing the described method 400 according to at least some of the steps 401-407.
Such processing circuitry 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control arrangement 220 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 220 may comprise a signal transmitting circuit 530. The signal transmitting circuit 530 may be configured for transmitting information 210 to at least one information receiving unit 100b, 100c.
The previously described method steps 401-407 to be performed in the control arrangement 220 may be implemented through the one or more processing circuits 520 within the control arrangement 220, together with a computer program product for performing at least some of the functions of the steps 401-407. Thus, a computer program product, comprising instructions for performing the steps 401-407 in the control arrangement 220, may perform the method 400 comprising at least some of the steps 401-407 for providing information 210 to the information receiving unit 100b, 100c, when the computer program is loaded into the one or more processing circuits 520 of the control arrangement 220. The described steps 401-407 thus may be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 520 in the control arrangement 220.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-407 according to some embodiments when being loaded into the one or more processing circuits 520 of the control arrangement 220. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 220 remotely, e.g., over an Internet or an intranet connection.
Figure 6 illustrates an example of a method 600 in a control arrangement 230 of an information receiving unit 100b, 100c, according to an embodiment. The flow chart in Figure 6 shows the method 600 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
The information receiving unit 100b, 100c may comprise a vehicle in a group 110 of vehicles, comprising also the information transmitting unit 100a.
In order to be able to correctly output the object representation 330, the method 600 may comprise a number of steps 601-604. However, some of the described method steps 601-604 may be performed in a somewhat different chronological order than the numbering suggests. The method 600 may comprise the subsequent steps:
Step 601 comprises receiving information 210 concerning the object 200 and position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b.
Step 602, which may be performed only in some embodiments, comprises coordinating the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received 601.
Step 603 comprises converting the received 601 information 210 concerning the object 200 into a representation 330 of the object 200. The conversion 603 of the received 601 information 210 into the representation 330 of the object 200 may comprise selecting the representation 330 of the object 200 based on the received 601 information 210, in some embodiments.
The conversion 603 may optionally comprise extracting a reference to a prestored represen- tation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the re ceived 601 information 210. Further, the conversion 603 may comprise selecting the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.
The representation 330 of the object 200 may be a simplified or cartooned version of the object 200 in some embodiments. The representation 330 may for example comprise only a contour of the object 200, a geometric figure, a stylised illustration, a colour, a sound, a text, a tactile signal, etc., or possibly a combination thereof. By only transferring and outputting a reduced amount of information, in comparison with transferring and outputting a streamed video or full image, any transfer delay may be omitted, minimised or at least reduced. Thereby, the driver is informed about the object 200 in real time, or almost real time. The driver thus has time to react to the upcoming situation, and an accident could be avoided and / or the impact of the accident may be reduced.
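To give a rough feeling for the magnitude of the reduction (the figures are assumptions for illustration, not values from the description): with an 18-byte object message such as the one sketched above and an assumed vehicle-to-vehicle link rate of 6 Mbit/s, the serialisation delay of the message is a small fraction of a millisecond, whereas even a single compressed video frame of about 100 kB would occupy the same link for over a hundred milliseconds.

```python
LINK_RATE_BPS = 6_000_000  # assumed vehicle-to-vehicle link rate, bits per second

def transfer_time_ms(payload_bytes: int) -> float:
    # Ideal serialisation delay of a payload on the assumed link.
    return payload_bytes * 8 / LINK_RATE_BPS * 1000.0

print(transfer_time_ms(18))       # compact object message: ~0.024 ms
print(transfer_time_ms(100_000))  # single 100 kB video frame: ~133 ms
```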
The representation 330 of the object 200 in the table 320b may be configurable by a user of the information receiving unit 100b, 100c, in some embodiments.
Step 604 comprises outputting the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
The output device 240 may comprise a visual output device such as a screen; a head-up display; a projector projecting the image onto either the road 120 or the back of the vehicle ahead; a set of near-eye displays / intelligent glasses / lenses, i.e. an optical head-mounted display; a loudspeaker; a tactile device; and / or a combination thereof. The output device 240 may in some embodiments be configured for Augmented Reality (AR) and / or Virtual Reality (VR).
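The description does not prescribe how the representation 330 is positioned on a visual output device 240. One conceivable approach, sketched below with hypothetical names, is to project the received position of the object 200 relative to the own vehicle into display coordinates and draw the selected icon there, in the style of an Augmented Reality overlay; a real head-up display would instead use its full calibration.

```python
from typing import Tuple

def world_to_screen(obj_pos: Tuple[float, float],
                    ego_pos: Tuple[float, float],
                    metres_per_pixel: float,
                    screen_size: Tuple[int, int]) -> Tuple[int, int]:
    # Simple top-down projection of the object position, relative to the own
    # vehicle, onto display pixels (assumed planar mapping).
    dx = obj_pos[0] - ego_pos[0]
    dy = obj_pos[1] - ego_pos[1]
    cx, cy = screen_size[0] // 2, screen_size[1] // 2
    return int(cx + dx / metres_per_pixel), int(cy - dy / metres_per_pixel)

def output_representation(display, representation: dict,
                          screen_xy: Tuple[int, int]) -> None:
    # Draw the selected representation 330 at the projected position.
    # 'display' and its draw_icon method are hypothetical, not a disclosed API.
    display.draw_icon(representation["icon"], screen_xy,
                      colour=representation["colour"])
```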
Figure 7 illustrates an embodiment of a control arrangement 230 of an information receiving unit 100b, 100c, for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a. The control arrangement 230 is configured to perform at least some of the above-described method steps 601-604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a.
The control arrangement 230 is configured to receive information 210 concerning the object 200 and position of the object 200 from the information transmitting unit 100a via a wireless receiver 140b. Also, the control arrangement 230 is configured to convert the received information 210 concerning the object 200 into a representation 330 of the object 200. The control arrangement 230 is furthermore configured to output the representation 330 of the object 200 on an output device 240 of the information receiving unit 100b, 100c.
In some embodiments, the control arrangement 230 may be configured to convert the received information 210 into the representation 330 of the object 200 by selecting the representation 330 of the object 200 based on the received information 210. The control arrangement 230 may be further configured to convert the received information 210 into the representation 330 of the object 200 by extracting a reference to a prestored representation 330 in a table 320a, 320b stored in both a memory 310 of the information receiving unit 100b, 100c and a memory 300 of the information transmitting unit 100a, from the received information 210. Also, the control arrangement 230 may be configured to select the prestored representation 330 of the object 200 in the table 320a, 320b stored in the memory 310 of the information receiving unit 100b, 100c, based on the extracted reference.
In some embodiments, the control arrangement 230 may also be configured to coordinate the tables 320a, 320b comprising prestored representations 330 between the information transmitting unit 100a and the information receiving unit 100b, 100c before the information 210 concerning the object 200 is received.
The control arrangement 230 may in some embodiments be further configured to enable a user of the information receiving unit 100b, 100c to configure the representation 330 of the object 200 in the table 320b.
Also, the control arrangement 230 may be configured to receive information 210 comprising data in object form, in some embodiments.

The control arrangement 230 comprises a receiving circuit 710 configured for collecting information 210 from a wireless transmitter 140a of an information transmitting unit 100a.
The control arrangement 230 further comprises a processing circuitry 720 configured for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, by performing the described method 600 according to at least some of the steps 601-604. Such processing circuitry 720 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processing circuitry" may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control arrangement 230 may comprise a memory 725 in some embodiments. The optional memory 725 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 725 may comprise integrated circuits comprising silicon-based transistors. The memory 725 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 230 may comprise a signal transmitting circuit 730. The signal transmitting circuit 730 may be configured for providing the representation 330 to the output device 240 at the information receiving unit 100b, 100c.
The previously described method steps 601-604 to be performed in the control arrangement 230 may be implemented through the one or more processing circuits 720 within the control arrangement 230, together with a computer program product for performing at least some of the functions of the steps 601-604. Thus, a computer program product, comprising instructions for performing the steps 601-604 in the control arrangement 230, may perform the method 600 comprising at least some of the steps 601-604 for outputting a representation 330 of an object 200 detected by at least one sensor 130 of an information transmitting unit 100a, based on information 210 obtained from the information transmitting unit 100a, when the computer program is loaded into the one or more processing circuits 720 of the control arrangement 230. The described steps 601-604 may thus be performed by a computer algorithm, a machine executable code, a non-transitory computer-readable medium, or software instructions programmed into a suitable programmable logic such as the processing circuits 720 in the control arrangement 230.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 601-604 according to some embodiments when being loaded into the one or more processing circuits 720 of the control arrangement 230. The data carrier may be, e.g., a hard disk, a CD-ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine-readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 230 remotely, e.g., over an Internet or an intranet connection.
The solution may further comprise a vehicle 100a, 100b, 100c, comprising a control arrangement 220 as illustrated in Figure 5 and / or a control arrangement 230 as illustrated in Figure 7.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described methods 400, 600, control arrangements 220, 230, computer program and / or vehicle 100a, 100b, 100c. Various changes, substitutions and / or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and / or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and / or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and / or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and / or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via Internet or other wired or wireless communication system.

Claims

PATENT CLAIMS
1. A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the method (400) comprises the steps of:
collecting (401) environmental data with at least one sensor (130);
identifying (402) an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c);
extracting (403) data related to the identified (402) object (200) from the collected (401) environmental data;
converting (405) the extracted (403) data into information (210);
determining (406) position of the object (200) based on the collected (401) environmental data; and
providing (407) the converted (405) information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
2. The method (400) according to claim 1, wherein the conversion (405) of the extracted (403) data into information (210) comprises selecting a prestored representation (330) of the object (200); and wherein
the provided (407) information (210) comprises the selected prestored representation (330).
3. The method (400) according to any one of claim 1 or claim 2, wherein the conversion (405) of the extracted (403) data into information (210) comprises:
selecting a prestored representation (330) of the identified (402) object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and
determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided (407) information (210) comprises the determined reference.
4. The method (400) according to claim 3, further comprising the step of:
coordinating (404) the tables (320a, 320b) comprising the prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted (405) information (210) is provided (407).

5. The method (400) according to any one of claims 1-4, wherein the provided (407) information (210) comprises data in object form.
6. A control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the control arrangement (220) is configured to:
collect environmental data with at least one sensor (130);
identify an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c);
extract data related to the identified object (200) from the collected environmental data;
convert the extracted data into information (210); and
determine position of the object (200) based on the collected environmental data; and
provide the converted information (210) and the determined position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).

7. The control arrangement (220) according to claim 6, further configured to:
convert the extracted data into information (210) by selecting a prestored representation (330) of the object (200); and to
provide, via the wireless transmitter (140a), information comprising the selected prestored representation (330).
8. The control arrangement (220) according to any one of claim 6 or claim 7, further configured to convert the extracted data into information (210) by
selecting a prestored representation (330) of the identified object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and
determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided information (210) comprises the determined reference.

9. The control arrangement (220) according to claim 8, further configured to:
coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted information (210) is provided.
10. The control arrangement (220) according to any one of claims 6-9, further configured to provide information (210) comprising data in object form.
11. A computer program comprising program code for performing a method (400) according to any of claims 1-5, when the computer program is executed in a control arrangement (220) according to any one of claims 6-10.
12. A method (600) in a control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the method (600) comprises the steps of:
receiving (601) information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b);
converting (603) the received (601) information (210) concerning the object (200) into a representation (330) of the object (200); and
outputting (604) the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
13. The method (600) according to claim 12, wherein the conversion (603) of the received (601) information (210) into the representation (330) of the object (200) comprises selecting the representation (330) of the object (200) based on the received (601) information (210).
14. The method (600) according to any one of claim 12 or claim 13, wherein the conversion (603) comprises:
extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received (601) information (210);
selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.
15. The method (600) according to claim 14, further comprising the step of:
coordinating (602) the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received (601).
16. The method (600) according to any one of claims 12-15, wherein the representation (330) of the object (200) in the table (320b) is configurable by a user of the information receiving unit (100b, 100c).

17. A control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the control arrangement (230) is configured to:
receive information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b);
convert the received information (210) concerning the object (200) into a representation (330) of the object (200); and
output the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
18. The control arrangement (230) according to claim 17, further configured to convert the received information (210) into the representation (330) of the object (200) by selecting the representation (330) of the object (200) based on the received information (210).

19. The control arrangement (230) according to any one of claim 17 or claim 18, further configured to convert the received information (210) into the representation (330) of the object (200) by
extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received information (210); and
selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.

20. The control arrangement (230) according to claim 19, further configured to:
coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received.

21. The control arrangement (230) according to any one of claims 17-20, further configured to enable a user of the information receiving unit (100b, 100c) to configure the representation (330) of the object (200) in the table (320b).
22. A computer program comprising program code for performing a method (600) according to any of claims 12-16, when the computer program is executed in a control arrangement (230) according to any one of claims 17-21.
23. A vehicle (100a, 100b, 100c) comprising a control arrangement (220) according to any one of claims 6-10, or a control arrangement (230) according to any one of claims 17-21.
PCT/SE2019/051145 2018-11-27 2019-11-12 Method and control arrangement for visualisation of obstructed view WO2020111999A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1851463A SE1851463A1 (en) 2018-11-27 2018-11-27 Method and control arrangement for visualisation of obstructed view
SE1851463-8 2018-11-27

Publications (1)

Publication Number Publication Date
WO2020111999A1 (en) 2020-06-04

Family

ID=70852143

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2019/051145 WO2020111999A1 (en) 2018-11-27 2019-11-12 Method and control arrangement for visualisation of obstructed view

Country Status (2)

Country Link
SE (1) SE1851463A1 (en)
WO (1) WO2020111999A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013220312A1 (en) * 2013-10-08 2015-04-09 Bayerische Motoren Werke Aktiengesellschaft Means of transport and method for exchanging information with a means of transportation
GB2524385A (en) * 2014-02-14 2015-09-23 Ford Global Tech Llc Autonomous vehicle handling and performance adjustment
DE102015105784A1 (en) * 2015-04-15 2016-10-20 Denso Corporation Distributed system for detecting and protecting vulnerable road users
GB2545571A (en) * 2015-12-16 2017-06-21 Ford Global Tech Llc Convoy vehicle look-ahead
US20180105176A1 (en) * 2002-05-03 2018-04-19 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US20180253899A1 (en) * 2014-03-25 2018-09-06 Conti Temic microelectronic GmbH Method and device for displaying objects on a vehicle display
GB2562018A (en) * 2016-09-15 2018-11-07 Vivacity Labs Ltd A method and system for analyzing the movement of bodies in a traffic system

Also Published As

Publication number Publication date
SE1851463A1 (en) 2020-05-28

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19891148

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19891148

Country of ref document: EP

Kind code of ref document: A1