SE542020C2 - Control unit and method thereof - Google Patents
- Publication number
- SE542020C2 (application SE1850709A)
- Authority
- SE
- Sweden
- Prior art keywords
- vehicle
- control unit
- additional
- comprised
- identity confirmation
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/06—Authentication
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Computer Security & Cryptography (AREA)
- Optical Communication System (AREA)
- Traffic Control Systems (AREA)
Abstract
The disclosure relates to a method performed by a control unit (111) adapted to be comprised in a vehicle (110) and configured to communicate with an additional control unit (121) comprised in an additional vehicle (120), the method comprising transmitting (510) an identity confirmation request over a first wireless communication channel (210), receiving (520) an identity confirmation response over an optical communication channel (220) from the additional vehicle (120), verifying (530) that a light emission sequence comprised in the identity confirmation response corresponds to a reference sequence.
Description
CONTROL UNIT AND METHOD THEREOF Technical Field The present invention relates to a control unit adapted to be comprised in a vehicle and configured to communicate with an additional control unit comprised in an additional vehicle. The invention further relates to a corresponding method, computer program, computer program product, carrier and a vehicle comprising the control unit.
Background Modern vehicles are often equipped with functionality for autonomous or semi-autonomous driving capability, e.g. so-called ACC (Adaptive/Autonomous Cruise Control). The vehicle is then typically further provided with a communications interface for Vehicle to Vehicle, V2V, communication, e.g. a transceiver for a wireless communication system, and on-board sensors capable of monitoring the environment of the vehicle, e.g. to determine a current position of the vehicle and/or a position of other vehicles in the vehicle's environment.
The wireless communication system is typically used to exchange V2V messages between adjacent or nearby vehicles, through which e.g. the current position, velocity and direction of travel may be conveyed. The current position may typically be obtained using a position sensor, such as a Global Navigation Satellite System (GNSS) receiver. Typical GNSS systems available in vehicles today have relatively low accuracy. Furthermore, the GNSS systems of different vehicles may be utilizing different sets of satellites with different accuracies. As a result, the position information indicates a spatial location with varying accuracy and precision.
In a situation where a vehicle is taking part in, or intends to take part in, a vehicle group formation or platoon, the accuracy of the GNSS based position information in a received V2V message may not be sufficient to maintain a position or assume a new position within the vehicle group formation or platoon.
Some conventional systems may attempt to obtain position information with improved accuracy and precision, e.g. by using high accuracy GNSS systems. However, these systems are too complex and costly for viable use in series production of vehicles.
Some conventional systems may attempt to obtain position information with improved accuracy and precision, e.g. by using one or more ranging sensors such as radar or lidar. However, a further problem is to associate the improved position information to a particular vehicle when there are multiple vehicles travelling in the vehicle group formation. Thus, there is a need for an improved control unit, a corresponding method and a vehicle.
Objects of the invention An objective of embodiments of the present invention is to provide a solution which mitigates or solves the drawbacks and problems of conventional solutions described above.
Summary of the invention The above and further objectives are achieved by the subject matter of the independent claims. Further advantageous implementation forms of the invention are defined by the dependent claims.
According to a first aspect of the invention, this objective is achieved by a method performed by a control unit adapted to be comprised in a vehicle and configured to communicate with an additional control unit comprised in an additional vehicle, the method comprising transmitting an identity confirmation request over a first wireless communication channel, receiving an identity confirmation response over an optical communication channel from the additional vehicle, verifying that a light emission sequence comprised in the identity confirmation response corresponds to a reference sequence.
At least one advantage of the disclosure according to the first aspect is that complexity of the system and vehicle is reduced by utilizing sensors and lighting systems that are currently available in existing or planned series vehicles. A further advantage is that the solution is simple and not limited to a specific transmitter or receiver technology or wireless technology. The proposed solution alleviates the need for accurate GNSS measurements for V2V message information by relying on available onboard sensors with higher accuracy.
According to a second aspect of the invention, this objective is achieved by a control unit configured to perform the method according to the first aspect.
According to a third aspect of the invention, this objective is achieved by a vehicle comprising the control unit according to the second aspect.
The advantages of the second and third aspects of the invention are the same as for the first aspect.
Further applications and advantages of embodiments of the invention will be apparent from the following detailed description.
Brief description of the drawings Fig. 1 illustrates a scenario where vehicles are travelling in a vehicle group formation according to one or more embodiments of the present disclosure.
Fig. 2 illustrates a method of identifying a vehicle according to one or more embodiments of the disclosure.
Fig. 3 illustrates a further scenario where a vehicle joins other vehicles in a vehicle group formation according to one or more embodiments of the present disclosure.
Fig. 4 shows a vehicle comprising a control unit according to an embodiment of the present disclosure.
Fig. 5 shows a control unit according to an embodiment of the present disclosure.
Fig. 6 shows a block diagram of a method according to one or more embodiments of the present disclosure.
A more complete understanding of embodiments of the invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
Detailed description An "or" in this description and the corresponding claims is to be understood as a mathematical OR which covers "and" and "or", and is not to be understood as an XOR (exclusive OR). The indefinite article "a" in this disclosure and claims is not limited to "one" and can also be understood as "one or more", i.e., plural.
Fig. 1 illustrates a scenario where vehicles are travelling in a vehicle group formation according to one or more embodiments of the present invention. Fig. 1 shows a road, on which vehicles are travelling, from a bird's-eye view. A vehicle 110, travelling on the road, has identified a vehicle group formation also travelling on the road in front of the vehicle 110, e.g. by monitoring Vehicle to Vehicle, V2V, messages. A first vehicle 102, a second vehicle 103 and an additional vehicle 120 are travelling in the vehicle group formation, e.g. with substantially the same velocity, the same direction of travel and fixed relative distances to each other. The vehicle 110 intends to or is instructed to assume a new position, e.g. immediately behind the additional vehicle 120. In other words, it is to assume a target position at a relative distance from a position, e.g. a center position 101, of the additional vehicle 120. The distance may e.g. be approximately half the length of the additional vehicle 120 with an added safety distance, or the safety distance from the rear end of the additional vehicle 120. The position or center position 101 of the additional vehicle 120 is obtained by receiving a V2V message sent from the additional vehicle 120. As can be seen from Fig. 1, the accuracy or uncertainty of the position or center position 101, indicated by the circle 104, is not sufficient to distinguish the origin of the V2V message and thus which of the vehicles 102, 103, 120 in the vehicle group formation a relative distance should be determined to, e.g. by employing a radar or lidar ranging sensor. The vehicle 110 may e.g. be a truck, a bus or a car, or any similar vehicle or other means of conveyance suitable for travelling on a road. The vehicle 110 may be a driver-controlled and/or a driverless, autonomously controlled vehicle in different embodiments.
Fig. 2 illustrates a method of identifying a vehicle 110 according to one or more embodiments of the disclosure. The vehicle 110 and the additional vehicle 120 are provided with a control unit 111 and an additional control unit 121 respectively. The vehicle 110 and the additional vehicle 120 are further provided with a transceiver 214 and an additional transceiver 215 respectively, each communicatively coupled to the respective control unit and configured to exchange wireless signals, e.g. comprising V2V messages. The vehicle 110 is further provided with a light detector 212 configured to measure or detect intensity of light and send a control signal to the control unit 111. The light detector 212 may e.g. be an onboard camera for visual and/or non-visual light. The additional vehicle 120 is further provided with one or more light emitters 222 configured to emit visual and/or non-visual light in response to a control signal from the additional control unit 121.
In one example scenario, the vehicle 110 intends to or is instructed to assume a new position or target spatial location, e.g. immediately behind the additional vehicle 120. The position of the additional vehicle 120 is obtained by receiving a V2V message sent from the additional vehicle 120. As further described in relation to Fig. 1 , the received V2V message could potentially be associated to any of the vehicles 102, 103, 120 in the vehicle group formation due to the accuracy or uncertainty of the position based on a GNSS position sensor.
The vehicle 110 then transmits an identity confirmation request over a first wireless communications channel 210, e.g. by transmitting a V2V message over the wireless communications network. The additional vehicle 120 receives and processes the identity confirmation request and transmits an identity confirmation response over an optical communication channel 220, using the one or more light emitters 222. The vehicle 110 then receives the identity confirmation response over the optical communication channel 220.
In one example, the identity confirmation request comprises a reference sequence, and the identity confirmation response is sent by using the reference sequence to generate a light emission sequence, e.g. by blinking or activating/deactivating the brake lights according to the reference sequence. The reference sequence may be a binary sequence representing an identity. The identity confirmation response may be received by the vehicle 110 by the use of a camera sensor with a field of view including the road in front of the vehicle 110. As Light Emitting Diode, LED, lights are often used, the light emission sequence may either control the lighting system in a dedicated manner or be overlaid onto the normal operation of the lighting system. In other words, the light emission sequence may be modulated using sufficiently high frequencies, e.g. above 50 Hz, so that the human eye does not notice it.
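As an illustration of this step, the following Python sketch maps a binary reference sequence onto a timed on/off schedule for a lighting system. It is a minimal, hypothetical sketch: the slot rate, the preamble pattern and the function names are assumptions, as the disclosure does not specify a modulation format.

```python
def encode_light_sequence(reference_bits, slot_hz=100):
    """Map a binary reference sequence to timed on/off light states.

    A slot rate above 50 Hz keeps the blinking imperceptible to the
    human eye while remaining detectable by a camera sensor.
    """
    slot_s = 1.0 / slot_hz
    preamble = [1, 0, 1, 0]  # assumed sync pattern so the detector can lock on
    states = preamble + list(reference_bits)
    # (start_time_s, light_on) pairs for the lighting controller
    return [(i * slot_s, bool(b)) for i, b in enumerate(states)]

# Encode an 8-bit identity into a 12-slot emission schedule.
schedule = encode_light_sequence([1, 0, 1, 1, 0, 0, 1, 0])
```

At 100 Hz the whole 12-slot schedule completes in 120 ms, which is consistent with overlaying the sequence onto normal brake-light operation.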
The control unit 111 of the vehicle 110 then verifies that the light emission sequence comprised in the identity confirmation response corresponds to the reference sequence.
This has the effect that sensor data from different sensors, e.g. the light detector 212 and a ranging sensor 213, can be associated by using a transfer function between a field of view of the light detector 212 and a field of view of the ranging sensor 213. The transfer function can be derived from calibration when the vehicle 110 is produced, as would be understood by a person skilled in the art. In other words, coordinate systems relative to which sensor data is generated for each sensor can be determined or calibrated at the time of production of the vehicle, and transfer functions between those coordinate systems can be generated and stored, e.g. in the memory of the control unit 111.
In one example, the light detector 212 is a camera and the ranging sensor 213 is a radar sensor. The source direction or location of the light emission sequence can be identified in the sensor data from the camera using its coordinate system and used to filter out sensor data from the radar sensor originating from the corresponding source direction or location. This allows e.g. an improved position of the vehicle 110 relative to the additional vehicle 120, or vice versa, to be obtained.
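The association between camera and radar data described above can be sketched as follows. This is a hypothetical illustration in which a simple linear pixel-to-bearing mapping stands in for the calibrated transfer function; the image width, field of view and tolerance values are assumptions.

```python
def camera_to_radar_bearing(pixel_x, image_width=1280, camera_fov_deg=60.0):
    """Map a pixel column to a bearing in the radar's coordinate frame.

    Assumes the camera and radar boresights were aligned at factory
    calibration, so a linear mapping can serve as the transfer function.
    """
    return (pixel_x / image_width - 0.5) * camera_fov_deg

def filter_radar_returns(returns, bearing_deg, tolerance_deg=2.0):
    """Keep only radar returns near the bearing of the verified light source.

    `returns` is a list of (bearing_deg, range_m) tuples.
    """
    return [r for r in returns if abs(r[0] - bearing_deg) <= tolerance_deg]

# The light emission sequence was located at pixel column 704 (assumed value):
bearing = camera_to_radar_bearing(704)
matched = filter_radar_returns([(-10.0, 18.5), (3.0, 12.2), (15.0, 25.0)], bearing)
```

The surviving return then gives a relative distance that is firmly associated with the verified additional vehicle, rather than with any other vehicle in the formation.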
In one embodiment, if the light emission sequence corresponds to the reference sequence, the control unit 111 is further configured to obtain sensor information indicative of a spatial location of the additional vehicle 120. In one example, this may include determining a relative distance between the vehicle 110 and the additional vehicle 120. The target position or new position may then be calculated or determined as a point located at a relative distance from the position of the additional vehicle 120 and/or at relative distances to the vehicles 102, 103, 120 in the vehicle group formation.
The control unit 111 is further configured to generate maneuvering data indicative of a target spatial location of the vehicle 110. In one example this involves generating maneuvering data in the form of a target relative distance to one or more of the vehicles in the vehicle group formation, a target path or a target geographical position. The control unit 111 is further configured to maneuver the vehicle 110 to the target spatial location using the maneuvering data.
At least one advantage of the embodiments of the disclosure is that complexity of the system and vehicle is reduced by utilizing sensors and systems that are currently available in existing or planned series vehicles. A further advantage is that the solution is simple and not limited to a specific transmitter or receiver technology or wireless technology. The proposed solution alleviates the need for accurate GNSS measurements for V2V message information by relying on available onboard sensors with higher accuracy.
Fig. 3 illustrates a further scenario where a vehicle 110 joins other vehicles 102, 103, 120 in a vehicle group formation according to one or more embodiments of the present invention. In one example scenario, the vehicle 110 is located at a current location 301 and intends to or is instructed to assume a new position or target spatial location 302, e.g. immediately behind the additional vehicle 120. It is appreciated that the spatial location may be an absolute location or a relative location. Referring back to Fig. 2, after receiving the identity confirmation response the control unit 111 of the vehicle 110 verifies that the light emission sequence comprised in the identity confirmation response corresponds to the reference sequence. If the light emission sequence corresponds to the reference sequence, the control unit 111 is further configured to obtain sensor information indicative of a spatial location of the additional vehicle 120, typically by determining a relative distance between the vehicle 110 and the additional vehicle 120 using a radar or lidar sensor. The control unit 111 is further configured to generate maneuvering data indicative of a target spatial location of the vehicle 110. In one example, this involves generating maneuvering data in the form of a target relative distance to the additional vehicle 120, e.g. 3 meters from the back of the additional vehicle 120. The control unit 111 is further configured to maneuver the vehicle 110 to the target spatial location 302 using the maneuvering data, e.g. by increasing the speed of the vehicle 110 to catch up with the additional vehicle 120 and thereby achieve the target relative distance of 3 meters. The vehicle's 110 position before the maneuver is illustrated by dashed lines, and the vehicle 110 in the assumed target position is shown by solid lines in Fig. 3. For illustrative purposes, the scenario in Fig. 3 shows the other vehicles 102, 103, 120 in the vehicle group formation as stationary relative to the road surface. It is understood that the disclosed solution is equally applicable to scenarios where the vehicles are moving relative to the road surface.
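The maneuvering step above, closing the gap to a target relative distance of 3 meters, can be illustrated with a simple proportional speed controller. This is a sketch under stated assumptions: the gain, speed limit and function names are illustrative and not taken from the disclosure.

```python
def speed_command(own_speed_mps, lead_speed_mps, measured_gap_m,
                  target_gap_m=3.0, gain=0.5, max_speed_mps=25.0):
    """Proportional controller: drive the measured gap toward target_gap_m
    by commanding a speed relative to the lead (additional) vehicle."""
    gap_error = measured_gap_m - target_gap_m  # positive => too far behind
    cmd = lead_speed_mps + gain * gap_error
    # Clamp to physically reasonable bounds.
    return min(max(cmd, 0.0), max_speed_mps)

# 20 m behind a lead travelling at 22 m/s: command a higher speed to close in.
cmd = speed_command(own_speed_mps=22.0, lead_speed_mps=22.0, measured_gap_m=20.0)
```

As the gap approaches 3 m the commanded speed converges to the lead vehicle's speed, which is the steady state of travelling in formation.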
Fig. 4 shows a vehicle 110 comprising a control unit 111 according to an embodiment of the present disclosure. The vehicle may comprise a control unit 111 according to embodiments described herein. The vehicle may further comprise one or more primary braking control units PB1-PB4 configured to actuate one or more primary braking means of the vehicle 110. The primary braking means may be configured to apply a braking force to any or all of the wheels W1-W4 of the vehicle. The one or more primary braking control units PB1-PB4 may comprise control logic and/or a processor, an optional memory and an actuator configured to actuate the primary braking means. The one or more primary braking control units PB1-PB4 may be configured to receive control signals from the control unit 111 and, based on the control signal, control an actuator acting on the primary braking means. The one or more primary braking control units PB1-PB4 may further be configured to send status signals to the control unit 111 indicative of the status of the one or more primary braking control units PB1-PB4, the actuator or the primary braking means, e.g. indicating a failure of the one or more primary braking control units PB1-PB4. The primary braking means may be any means or arrangements suitable for applying a braking force to any or all of the wheels W1-W4 of the vehicle, e.g. hydraulic, electric or pneumatic brakes, disc brakes or drum brakes.
The vehicle may further comprise one or more sensors 212, 213 communicatively coupled to the control unit 111. The one or more sensors may be configured to detect and/or register and/or capture first sensor data indicative of the environment of the vehicle, e.g. to capture sensor data indicative of the environment and send sensor data to the control unit 111. The one or more sensors 212, 213 may further be configured to send the sensor data as a signal to the control unit 111. Examples of sensors may be any selection of radar sensor, lidar sensor, video camera, infrared camera, GPS with map, traffic information receiver or any other suitable sensor. In an example, the sensors 212, 213 may include a radar detecting obstacles in front of the vehicle, such as pedestrians and/or other vehicles. In a further example, the sensors 212, 213 may include a camera detecting light emitted from the additional vehicle 120.
The vehicle may further comprise a transceiver 214 communicatively coupled to the control unit 111. The transceiver 214 may further comprise at least one optional antenna (not shown in the figure). The antenna may be coupled to the transceiver and is configured to transmit and/or emit and/or receive wired signals in a wired communications system and/or wireless signals in a wireless communication system. The signals may comprise V2V messages.
The control unit 111 may further be communicatively coupled to the one or more primary braking control units PB1-PB4, the one or more secondary braking control units SB1-SB4 and the one or more sensors 212, 213, e.g. via wired or wireless communication, such as a Controller Area Network, CAN, bus, Bluetooth, WiFi etc. The one or more environment sensors 121-123 may be configured to send the first sensor data directly to the control unit 111 or via a wired and/or wireless communications network 130. The wired or wireless communication may be performed using any of a CAN bus, Bluetooth, WiFi, GSM, UMTS, LTE or LTE advanced communications network or any other wired or wireless communication network known in the art.
The vehicle 110 may further optionally comprise a steering control unit SC configured to actuate steering means of the vehicle 110, e.g. to control an angle of a pair of the wheels W1-W4, e.g. the front wheels. In one example, the steering means are controlled such that the vehicle 110 follows the target driving path. The steering control unit SC may comprise control logic and/or a processor, an optional memory and an actuator configured to actuate the steering means. The steering control unit SC may be configured to receive control signals from the control unit 111 and, based on the control signal, control an actuator actuating the steering means. The steering control unit SC may further be configured to send status signals to the control unit 111 indicative of the status of the steering control unit SC, the actuator or the steering means, e.g. indicating a failure of the steering control unit SC. The steering means may be any means or arrangement suitable for steering the vehicle, e.g. hydraulic, electric or pneumatic means acting on the wheels W1-W4 of the vehicle.
The vehicle may further optionally comprise a powertrain control unit PC configured to control the powertrain means delivering driving power to one or more of the wheels W1-W4 of the vehicle. In one example, the powertrain means are controlled in a manner such that the vehicle 110 follows a target driving path and/or a target speed. The powertrain control unit PC may comprise control logic and/or a processor, an optional memory and an actuator configured to actuate the powertrain means. The powertrain control unit PC may be configured to receive control signals from the control unit 111 and, based on the control signal, control an actuator acting on the powertrain means. The powertrain control unit PC may further be configured to send status signals to the control unit 111 indicative of the status of the powertrain control unit PC, the actuator or the powertrain means, e.g. indicating a failure of the powertrain control unit PC. The powertrain means may be any means or arrangement suitable for delivering driving power to one or more of the wheels W1-W4 of the vehicle, e.g. the engine and/or driving means, the transmission, the drive shafts, the differentials, and the final drive, e.g. acting on the wheels W1-W4 of the vehicle. The control unit 111 may further be communicatively coupled to the steering control unit SC and/or the powertrain control unit PC. The vehicle may further comprise one or more additional sensors configured to receive and/or obtain and/or measure physical properties pertaining to the vehicle 110 and send one or more sensor signals comprising second sensor data indicative of the physical properties to the processing means 512, e.g. sensor data indicative of wheel speeds of the vehicle.
Fig. 5 shows a control unit 111 according to an embodiment of the present disclosure. The control unit 111 may be in the form of a selection of any of one or more Electronic Control Units, a server, an on-board computer, a digital information display, a stationary computing device, a laptop computer, a tablet computer, a handheld computer, a wrist-worn computer, a smart watch, a PDA, a Smartphone, a smart TV, a telephone, a media player, a game console, a vehicle mounted computer system or a navigation device. The control unit 111 may comprise a processor 512 communicatively coupled to a transceiver 504 for wired or wireless communication. The transceiver may be an internal transceiver 504 or an external transceiver 214. Further, the control unit 111 may comprise at least one optional antenna (not shown in the figure). The antenna may be coupled to the transceiver 504 and is configured to transmit and/or emit and/or receive wireless signals in a wireless communication system, e.g. send/receive control signals and/or status data to/from the powertrain control unit PC, the one or more sensors 212, 213 or any other control unit or sensor. In one example, the processor 512 may be any of a selection of processing circuitry and/or a central processing unit and/or processor modules and/or multiple processors configured to cooperate with each other. Further, the control unit 111 may comprise a memory 515. The memory 515 may contain instructions executable by the processor to perform the methods described herein. The processor 512 may be communicatively coupled to a selection of any of the transceiver 504, the one or more sensors 212, 213 and the memory 515. The control unit 111 may be configured to receive the sensor data directly from the one or more sensors 212, 213 or via a wired and/or wireless communications network 140.
The control unit 111 may further comprise a communications interface, e.g. the wireless transceiver 504 and/or a wired/wireless communications network adapter, which is configured to send and/or receive data values or parameters as a signal between the processing means 512 and other external nodes, e.g. a control information server. In an embodiment, the communications interface communicates directly between communication network nodes or via the communications network.
In one or more embodiments the control unit 111 may further comprise an input device 517, configured to receive input or indications from a user and send a user-input signal indicative of the user input or indications to the processing means 512. In one or more embodiments the control unit 111 may further comprise a display 518 configured to receive a display signal indicative of rendered objects, such as text or graphical user input objects, from the processing means 512 and to display the received signal as objects, such as text or graphical user input objects. In one embodiment the display 518 is integrated with the user input device 517 and is configured to receive a display signal indicative of rendered objects, such as text or graphical user input objects, from the processing means 512 and to display the received signal as objects, such as text or graphical user input objects, and/or configured to receive input or indications from a user and send a user-input signal indicative of the user input or indications to the processing means 512. In embodiments, the processing means 512 is communicatively coupled to the memory 515 and/or the communications interface and/or the input device 517 and/or the display 518 and/or the one or more sensors 212, 213.
In embodiments, the communications interface and/or transceiver communicates using wired and/or wireless communication techniques. In embodiments, the one or more memory 515 may comprise a selection of a hard disk drive, RAM, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive. In a further embodiment, the control unit 111 may further comprise and/or be coupled to one or more additional sensors (not shown in the figure) configured to receive and/or obtain and/or measure physical properties pertaining to the vehicle 110 and send one or more sensor signals indicative of the physical properties to the processing means 512, e.g. sensor data indicative of relative wheel speeds of the vehicle.
Fig. 6 shows a block diagram of a method 600 according to one or more embodiments of the present disclosure. The method 600 may be performed by a control unit 111 adapted to be comprised in a vehicle 110 and configured to communicate with an additional control unit 121 comprised in an additional vehicle 120, the method comprising: STEP 610: transmitting an identity confirmation request over a first wireless communication channel 210. Transmitting the identity confirmation request may be performed by controlling the transceiver 214 to transmit over the first wireless communication channel 210. In one embodiment, the identity confirmation request is comprised in a signal transmitted in a wireless communications network, such as WiFi/IEEE 802.X, 2G, 3G, 4G or 5G wireless communications networks. In a further embodiment, the first wireless communication channel 210 is formed between a transceiver 214 of the vehicle and an additional transceiver 215 of the additional vehicle 120. In one embodiment, the identity confirmation request comprises a request identity identifying the purpose of the request and/or a reference sequence and/or indicia identifying a predetermined reference sequence. In a further embodiment, the identity confirmation request is comprised in a Vehicle-to-vehicle, V2V, message transmitted to the additional vehicle 120.
In one example, the vehicle 110 transmits a V2V message over a wireless communication channel of a 5G wireless communications network to the additional vehicle 120. In one further example, the V2V message comprises a reference sequence indicating an expected activation sequence of rear facing lights of the additional vehicle's lighting system. The lights may typically be selected such that they are observable by a camera in the vehicle 110. This may e.g. involve activating brake lights to signal a vehicle identity of the additional vehicle 120 in Morse code or any other suitable protocol.
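A possible shape for such an identity confirmation request is sketched below. The JSON payload and all field names are assumptions for illustration only; the disclosure does not define a concrete message encoding.

```python
import json

def build_identity_confirmation_request(requester_id, reference_bits):
    """Serialize a hypothetical identity confirmation request for a
    V2V message. Field names are illustrative, not from the patent."""
    return json.dumps({
        "msg_type": "identity_confirmation_request",
        "requester_id": requester_id,
        # expected activation sequence for the rear-facing lights
        "reference_sequence": reference_bits,
    })

request = build_identity_confirmation_request("vehicle-110", [1, 0, 1, 1, 0, 0, 1, 0])
```

On reception, the additional control unit would parse the message, extract the reference sequence, and drive its lighting system accordingly.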
STEP 620: receiving an identity confirmation response over an optical communication channel 220 from the additional vehicle 120. Receiving the identity confirmation response may be performed in a light detector 212 configured to receive the identity confirmation response and send it to the control unit 111.
In one embodiment, the identity confirmation response comprises a light emission sequence. In one embodiment, the optical communication channel 220 is formed between a light emitter 122 of the additional vehicle 120 and a light detector 212 of the vehicle 110. The identity confirmation response may be received by a sensor or light detector 212 and forwarded as a control signal to the control unit 111. In one embodiment, the light emitter 122 comprises a vehicle lighting system of the additional vehicle 120 configured to emit the light emission sequence and the light detector 212 comprises a camera of the vehicle 110. In one embodiment, a light emission sequence is emitted by activating a selection of lights of the vehicle lighting system using the reference sequence.
STEP 630: verifying that the light emission sequence comprised in the identity confirmation response corresponds to a reference sequence and, if the light emission sequence corresponds to the reference sequence, performing further method steps. The light emission sequence may be verified by decoding it to a sequence and comparing it to the reference sequence or other predetermined sequence, e.g. a sequence stored in memory of the control unit 111.
In one example, the light emission sequence is decoded to a first binary sequence and the reference sequence is retrieved by the control unit 111 as a second binary sequence from memory. The first sequence is then compared to the second sequence.
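The comparison in STEP 630 can be sketched as below. The per-frame intensity sampling and the 0.5 threshold are assumptions for illustration; any decoding that recovers the binary sequence from the camera frames would serve.

```python
def decode_light_samples(intensities, threshold=0.5):
    """Decode per-frame light intensities (0.0-1.0) to a binary sequence."""
    return [1 if v >= threshold else 0 for v in intensities]

def verify_identity(intensities, reference_sequence, threshold=0.5):
    """Return True if the decoded light emission sequence matches the
    reference sequence held in the control unit's memory."""
    return decode_light_samples(intensities, threshold) == list(reference_sequence)
```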
In one embodiment, the method further comprises checking if the light emission sequence corresponds to the reference sequence and, if so, performing the further steps: STEP 640: obtaining sensor information indicative of a spatial location of the additional vehicle 120. In one embodiment, the sensor information is obtained from a ranging sensor 113 comprised in the vehicle 110, where the sensor information comprises at least a distance or relative distance between the vehicle 110 and the additional vehicle 120 and/or other vehicles 102, 103 in the vehicle group formation.
In one example and after verifying that the light emission sequence comprised in the identity confirmation response corresponds to a reference sequence, a distance recorded or measured by the ranging sensor may be associated to the additional vehicle 120 using a transfer function between a coordinate system of the field of view of the light detector 212 and a coordinate system of the field of view of the ranging sensor 113. The transfer function can be derived from calibration when the vehicle 110 is produced, as would be understood by a person skilled in the art.
In other words, the direction or position from which the light emission sequence is received and recorded by a camera of the vehicle 110 can be used to derive the direction or position of a radar reading of the relative distance between the vehicle 110 and the additional vehicle 120.
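The association step can be sketched as below. Reducing the calibrated transfer function to a bearing scale and offset is a deliberate simplification for illustration; a real factory calibration would map full fields of view, and the tolerance value is an assumption.

```python
def camera_bearing_to_radar_bearing(camera_bearing_deg, scale=1.0, offset_deg=0.0):
    """Simplified transfer function between the camera's and the ranging
    sensor's coordinate systems, derived from factory calibration."""
    return scale * camera_bearing_deg + offset_deg

def associate_distance(camera_bearing_deg, radar_readings, tolerance_deg=2.0):
    """Pick the ranging-sensor reading whose bearing best matches the camera
    detection of the light emission sequence.
    radar_readings: list of (bearing_deg, distance_m) tuples."""
    expected = camera_bearing_to_radar_bearing(camera_bearing_deg)
    best = min(radar_readings, key=lambda r: abs(r[0] - expected))
    return best[1] if abs(best[0] - expected) <= tolerance_deg else None
```

The returned distance can then be attributed to the additional vehicle 120 rather than to some other object in the radar's field of view.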
STEP 650: generating maneuvering data indicative of a target spatial location of the vehicle 110.
In one example, this involves generating maneuvering data in the form of a target relative distance to one or more of the vehicles in the vehicle group formation, a target path or a target geographical position.
In one embodiment, the additional vehicle 120 is taking part in a vehicle group formation or platoon and the target spatial location corresponds to an assigned position within the vehicle group formation or platoon. In one embodiment, the target spatial location indicates the assigned position within the vehicle group formation or platoon as relative distances to one or more vehicles within the vehicle group formation or platoon.
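Expressing an assigned platoon position as relative distances can be sketched as follows. The uniform inter-vehicle gap is an assumption for illustration; actual target gaps would come from the platoon coordination logic.

```python
def target_relative_distances(assigned_index: int, n_vehicles: int, gap_m: float = 20.0) -> dict:
    """Return target longitudinal distances (m) from the assigned slot to each
    other slot in the platoon, indexed from the front; a positive value means
    the other vehicle is ahead of the assigned position."""
    return {i: (assigned_index - i) * gap_m
            for i in range(n_vehicles) if i != assigned_index}
```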
STEP 660: maneuvering the vehicle 110 to the target spatial location using the maneuvering data.
In one example, further illustrated in Fig. 3, maneuvering the vehicle 110 to the target spatial location comprises temporarily increasing the speed of the vehicle 110 until target distances to vehicles 102, 103, 120 of the vehicle group formation have been achieved.
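The speed adjustment in the example above can be sketched as a simple proportional rule. The gain and the speed limits are illustrative assumptions; a production controller would also handle comfort and safety constraints.

```python
def speed_command(current_speed, measured_gap, target_gap,
                  gain=0.5, max_speed=25.0, min_speed=0.0):
    """Return a new speed command (m/s): a positive gap error (vehicle too
    far behind its target position) temporarily raises speed, a negative
    error lowers it, clamped to the allowed speed range."""
    error = measured_gap - target_gap          # metres
    command = current_speed + gain * error     # m/s
    return max(min_speed, min(max_speed, command))
```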
In one embodiment, a control unit 111 is provided and adapted to be comprised in a vehicle 110, the control unit 111 comprising: a processor 512, and a memory 515, said memory containing instructions executable by said processor, wherein said control unit 111 is configured to perform any of the method steps described herein.
In one embodiment, a computer program is provided comprising computer-executable instructions for causing the control unit 111, when the computer-executable instructions are executed on a processing unit comprised in the control unit 111, to perform any of the methods described herein. Furthermore, any methods according to embodiments of the invention may be implemented in a computer program, having code means, which when run by processing means causes the processing means to execute the steps of the method. The computer program is included in a computer readable medium of a computer program product.
In one embodiment, a computer program product is provided comprising a computer-readable storage medium, the computer-readable storage medium having the computer program above embodied therein.
In one embodiment, a carrier containing the computer program above is provided, wherein the carrier is one of an electronic signal, an optical signal, a radio signal, or a computer-readable storage medium.
In an embodiment, a computer program product is provided comprising a memory and/or a computer-readable storage medium, the computer-readable storage medium having the computer program described above embodied therein. The memory and/or computer-readable storage medium referred to herein may comprise essentially any memory, such as a ROM (Read Only Memory), a PROM (Programmable Read-Only Memory), an EPROM (Erasable PROM), a Flash memory, an EEPROM (Electrically Erasable PROM), or a hard disk drive.
In embodiments, the communications network communicates using wired or wireless communication techniques that may include at least one of a Local Area Network (LAN), Metropolitan Area Network (MAN), Global System for Mobile Network (GSM), Enhanced Data GSM Environment (EDGE), Universal Mobile Telecommunications System (UMTS), Long Term Evolution, High Speed Downlink Packet Access (HSDPA), Wideband Code Division Multiple Access (W-CDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Bluetooth®, Zigbee®, Wi-Fi, Voice over Internet Protocol (VoIP), LTE Advanced, IEEE802.16m, WirelessMAN-Advanced, Evolved High-Speed Packet Access (HSPA+), 3GPP Long Term Evolution (LTE), Mobile WiMAX (IEEE 802.16e), Ultra Mobile Broadband (UMB) (formerly Evolution-Data Optimized (EV-DO) Rev. C), Fast Low-latency Access with Seamless Handoff Orthogonal Frequency Division Multiplexing (Flash-OFDM), High Capacity Spatial Division Multiple Access (iBurst®) and Mobile Broadband Wireless Access (MBWA) (IEEE 802.20) systems, High Performance Radio Metropolitan Area Network (HIPERMAN), Beam-Division Multiple Access (BDMA), World Interoperability for Microwave Access (Wi-MAX) and ultrasonic communication, etc., but is not limited thereto.
Moreover, it is realized by the skilled person that the control unit 111 may comprise the necessary communication capabilities in the form of e.g., functions, means, units, elements, etc., for performing the present solution. Examples of other such means, units, elements and functions are: processors, memory, buffers, control logic, encoders, decoders, rate matchers, de-rate matchers, mapping units, multipliers, decision units, selecting units, switches, interleavers, de-interleavers, modulators, demodulators, inputs, outputs, antennas, amplifiers, receiver units, transmitter units, DSPs, MSDs, TCM encoder, TCM decoder, power supply units, power feeders, communication interfaces, communication protocols, etc. which are suitably arranged together for performing the present solution.
Especially, the processor and/or processing means of the present disclosure may comprise one or more instances of processing circuitry, processor modules and multiple processors configured to cooperate with each other, a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, a Field-Programmable Gate Array (FPGA) or other processing logic that may interpret and execute instructions. The expression "processor" and/or "processing means" may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones mentioned above. The processing means may further perform data processing functions for inputting, outputting, and processing of data comprising data buffering and device control functions, such as call processing control, user interface control, or the like.
Finally, it should be understood that the invention is not limited to the embodiments described above, but also relates to and incorporates all embodiments within the scope of the appended independent claims.
Claims (11)
1. A method performed by a control unit (111) adapted to be comprised in a vehicle (110) and configured to communicate with an additional control unit (121) comprised in an additional vehicle (120), the method comprising: transmitting (610) an identity confirmation request over a first wireless communication channel (210), receiving (620) an identity confirmation response over an optical communication channel (220) from the additional vehicle (120), verifying (630) that a light emission sequence comprised in the identity confirmation response corresponds to a reference sequence.
2. The method according to claim 1, wherein if the light emission sequence corresponds to the reference sequence the method further comprises the steps of: obtaining (640) sensor information indicative of a relative distance between the vehicle (110) and the additional vehicle (120), generating (650) maneuvering data indicative of a target relative distance between the vehicle (110) and the additional vehicle (120), maneuvering (660) the vehicle (110) to the target relative distance using the maneuvering data.
3. The method according to any of claims 1 or 2, wherein the optical communication channel (220) is formed between a light emitter (122) of the additional vehicle (120) and a light detector (212) of the vehicle (110).
4. The method according to claim 3, wherein the light emitter (122) comprises a vehicle lighting system of the additional vehicle (120) and the light detector (112) comprises a camera of the vehicle (110).
5. The method according to claim 4, wherein the light emission sequence is emitted by activating a selection of lights of the vehicle lighting system using the reference sequence.
6. The method according to claim 5, wherein the reference sequence is comprised in a Vehicle-to-vehicle, V2V message.
7. The method according to claims 2-6, wherein the sensor information is obtained from a ranging sensor (113) comprised in the vehicle (110) and where the sensor information comprises at least a distance between the vehicle (110) and the additional vehicle (120).
8. The method according to claims 2-7, wherein the additional vehicle (120) is taking part in a vehicle group formation and wherein the target relative distance between the vehicle (110) and the additional vehicle (120) corresponds to an assigned position within the vehicle group formation.
9. The method according to claim 8, wherein the target relative distance between the vehicle (110) and the additional vehicle (120) indicates the assigned position within the vehicle group formation as relative distances to one or more vehicles within the vehicle group formation.
10. A control unit (111) adapted to be comprised in a vehicle (110) and configured to communicate with an additional control unit (121) comprised in an additional vehicle (120), wherein the control unit (111) is configured to perform the method according to any of claims 1-9.
11. A vehicle comprising the control unit (111) according to claim 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1850709A SE542020C2 (en) | 2018-06-12 | 2018-06-12 | Control unit and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
SE1850709A1 SE1850709A1 (en) | 2019-12-13 |
SE542020C2 true SE542020C2 (en) | 2020-02-11 |
Family
ID=69139182