SE1851463A1 - Method and control arrangement for visualisation of obstructed view - Google Patents
Method and control arrangement for visualisation of obstructed view
- Publication number
- SE1851463A1
- Authority
- SE
- Sweden
- Prior art keywords
- information
- representation
- receiving unit
- control arrangement
- prestored
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/162—Decentralised systems, e.g. inter-vehicle communication event-triggered
-
- B60K35/28—
-
- B60K35/85—
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G05D1/695—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/96—Management of image or video recognition tasks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/22—Platooning, i.e. convoy of communicating vehicles
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/22—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
- G09G5/24—Generation of individual character patterns
-
- B60K2360/176—
-
- B60K2360/5915—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/65—Data transmitted between vehicles
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/16—Use of wireless transmission of display information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Abstract
A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c). The method (400) comprises collecting (401) environmental data with at least one sensor (130); identifying (402) an object (200), which is considered relevant; extracting (403) data related to the object (200) from the environmental data; converting (405) the data into information (210); determining (406) position of the object (200) based on the collected (401) environmental data; and providing (407) the information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c). Also, a method (600) in a control arrangement (230) of an information receiving unit (100b, 100c), is provided.
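The transmitter-side method (400) summarised above can be sketched as a small pipeline. This is a minimal illustration only: the `ObjectInfo` wire format and all callables (`is_relevant`, `classify`, `locate`, `transmit`) are hypothetical interfaces, not names taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Hypothetical wire format: a compact reference plus a position."""
    representation_ref: int   # reference into a shared representation table
    position: tuple           # position derived from the environmental data

def provide_information(sensor_readings, is_relevant, classify, locate, transmit):
    """Sketch of method 400: collect, identify, extract/convert, determine, provide.

    All five arguments are assumed interfaces:
    - sensor_readings: collected environmental data (step 401)
    - is_relevant(obj): relevance check for the receiving unit (step 402)
    - classify(obj): extraction and conversion to a table reference (steps 403, 405)
    - locate(obj): position from the environmental data (step 406)
    - transmit(payload): the wireless transmitter 140a (step 407)
    """
    for obj in sensor_readings:
        if not is_relevant(obj):
            continue
        ref = classify(obj)
        pos = locate(obj)
        transmit(ObjectInfo(ref, pos))
```

With these stand-ins, the receiving unit only ever sees a reference and a position, which is what makes the later table-lookup claims possible.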
Claims (23)
1. A method (400) in a control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the method (400) comprises the steps of: collecting (401) environmental data with at least one sensor (130); identifying (402) an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c); extracting (403) data related to the identified (402) object (200) from the collected (401) environmental data; converting (405) the extracted (403) data into information (210); determining (406) position of the object (200) based on the collected (401) environmental data; and providing (407) the converted (405) information (210) and the determined (406) position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
2. The method (400) according to claim 1, wherein the conversion (405) of the extracted (403) data into information (210) comprises selecting a prestored representation (330) of the object (200); and wherein the provided (407) information (210) comprises the selected prestored representation (330).
3. The method (400) according to any one of claim 1 or claim 2, wherein the conversion (405) of the extracted (403) data into information (210) comprises: selecting a prestored representation (330) of the identified (402) object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided (407) information (210) comprises the determined reference.
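The table mechanism of claim 3 can be illustrated with a toy example: both units hold an identical prestored-representation table, and only a small reference is transmitted instead of the representation itself. The table contents and function names below are illustrative assumptions, not part of the claims.

```python
# Illustrative copy of the table (320a, 320b) held by both units.
# Real entries would be graphical representations; strings stand in here.
SHARED_TABLE = ["pedestrian_icon", "cyclist_icon", "car_icon", "truck_icon"]

def to_reference(representation):
    """Transmitter side (conversion step 405): select a prestored
    representation and determine its reference in the table."""
    return SHARED_TABLE.index(representation)

def from_reference(ref):
    """Receiver side (cf. claim 14): resolve the received reference
    in the local copy of the table."""
    return SHARED_TABLE[ref]
```

Because both copies are coordinated beforehand (claim 4), a single integer suffices on the wireless link, keeping the payload far smaller than transmitting image data.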
4. The method (400) according to claim 3, further comprising the step of: coordinating (404) the tables (320a, 320b) comprising the prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted (405) information (210) is provided (407).
5. The method (400) according to any one of claims 1-4, wherein the provided (407) information (210) comprises data in object form.
6. A control arrangement (220) of an information transmitting unit (100a), for providing information (210) to an information receiving unit (100b, 100c), wherein the control arrangement (220) is configured to: collect environmental data with at least one sensor (130); identify an object (200) in the environment of the information transmitting unit (100a), which is considered relevant for the information receiving unit (100b, 100c); extract data related to the identified object (200) from the collected environmental data; convert the extracted data into information (210); and determine position of the object (200) based on the collected environmental data; and provide the converted information (210) and the determined position of the object (200) to the information receiving unit (100b, 100c) via a wireless transmitter (140a), thereby enabling output of a representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
7. The control arrangement (220) according to claim 6, further configured to: convert the extracted data into information (210) by selecting a prestored representation (330) of the object (200); and to provide, via the wireless transmitter (140a), information comprising the selected prestored representation (330).
8. The control arrangement (220) according to any one of claim 6 or claim 7, further configured to convert the extracted data into information (210) by selecting a prestored representation (330) of the identified object (200) in a table (320a, 320b) stored in both a memory (300) of the information transmitting unit (100a) and a memory (310) of the information receiving unit (100b, 100c); and determining a reference to the selected prestored representation (330) in the table (320a, 320b); and wherein the provided information (210) comprises the determined reference.
9. The control arrangement (220) according to claim 8, further configured to: coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the converted information (210) is provided.
10. The control arrangement (220) according to any one of claims 6-9, further configured to provide information (210) comprising data in object form.
11. A computer program comprising program code for performing a method (400) according to any of claims 1-5, when the computer program is executed in a control arrangement (220) according to any one of claims 6-10.
12. A method (600) in a control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the method (600) comprises the steps of: receiving (601) information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b); converting (603) the received (601) information (210) concerning the object (200) into a representation (330) of the object (200); and outputting (604) the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
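The receiver-side method (600) mirrors the transmitter: receive, convert, output. The sketch below uses the same hypothetical reference-and-position payload as before; `receive`, `resolve`, and `display` are assumed interfaces standing in for the wireless receiver (140b), the conversion step (603), and the output device (240).

```python
def output_representation(receive, resolve, display):
    """Sketch of method 600: receive, convert, output.

    - receive(): yields (reference, position) pairs from the wireless
      receiver 140b (step 601)
    - resolve(ref): converts the received information into a
      representation, e.g. a table lookup (step 603)
    - display(rep, pos): renders the representation at the object's
      position on the output device 240 (step 604)
    """
    for ref, pos in receive():
        rep = resolve(ref)
        display(rep, pos)
```

Note that the receiving unit never needs the raw sensor data: it reconstructs the scene entirely from references and positions.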
13. The method (600) according to claim 12, wherein the conversion (603) of the received (601) information (210) into the representation (330) of the object (200) comprises selecting the representation (330) of the object (200) based on the received (601) information (210).
14. The method (600) according to any one of claim 12 or claim 13, wherein the conversion (603) comprises: extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received (601) information (210); selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.
15. The method (600) according to claim 14, further comprising the step of: coordinating (602) the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received (601).
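Claims 4 and 15 require only that the tables are coordinated before object information is exchanged; they do not prescribe a mechanism. One plausible scheme, sketched here purely as an assumption, is to compare digests of the two copies and transfer the full table only on mismatch.

```python
import hashlib
import json

def table_digest(table):
    """Digest of a representation table; equal digests mean the
    two copies agree without transferring the table itself."""
    payload = json.dumps(table, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def coordinate(local_table, remote_digest, request_full_table):
    """Hypothetical coordination step (404/602): if the peer's digest
    differs from ours, fetch its full table and adopt it; otherwise
    keep the local copy. request_full_table() is an assumed callback
    that retrieves the peer's table over the link."""
    if table_digest(local_table) != remote_digest:
        return request_full_table()
    return local_table
```

This keeps the common case (tables already in sync) down to a single digest exchange, which matches the claims' intent of coordinating before, not during, object transmission.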
16. The method (600) according to any one of claims 12-15, wherein the representation (330) of the object (200) in the table (320b) is configurable by a user of the information receiving unit (100b, 100c).
17. A control arrangement (230) of an information receiving unit (100b, 100c), for outputting a representation (330) of an object (200) detected by at least one sensor (130) of an information transmitting unit (100a), based on information (210) obtained from the information transmitting unit (100a), wherein the control arrangement (230) is configured to: receive information (210) concerning the object (200) and position of the object (200) from the information transmitting unit (100a) via a wireless receiver (140b); convert the received information (210) concerning the object (200) into a representation (330) of the object (200); and output the representation (330) of the object (200) on an output device (240) of the information receiving unit (100b, 100c).
18. The control arrangement (230) according to claim 17, further configured to convert the received information (210) into the representation (330) of the object (200) by selecting the representation (330) of the object (200) based on the received information (210).
19. The control arrangement (230) according to any one of claim 17 or claim 18, further configured to convert the received information (210) into the representation (330) of the object (200) by extracting a reference to a prestored representation (330) in a table (320a, 320b) stored in both a memory (310) of the information receiving unit (100b, 100c) and a memory (300) of the information transmitting unit (100a), from the received information (210); and selecting the prestored representation (330) of the object (200) in the table (320a, 320b) stored in the memory (310) of the information receiving unit (100b, 100c), based on the extracted reference.
20. The control arrangement (230) according to claim 19, further configured to: coordinate the tables (320a, 320b) comprising prestored representations (330) between the information transmitting unit (100a) and the information receiving unit (100b, 100c) before the information (210) concerning the object (200) is received.
21. The control arrangement (230) according to any one of claims 17-20, further configured to enable a user of the information receiving unit (100b, 100c) to configure the representation (330) of the object (200) in the table (320b).
22. A computer program comprising program code for performing a method (600) according to any of claims 12-16, when the computer program is executed in a control arrangement (230) according to any one of claims 17-21.
23. A vehicle (100a, 100b, 100c) comprising a control arrangement (220) according to any one of claims 6-10, or a control arrangement (230) according to any one of claims 17-21.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1851463A SE1851463A1 (en) | 2018-11-27 | 2018-11-27 | Method and control arrangement for visualisation of obstructed view |
PCT/SE2019/051145 WO2020111999A1 (en) | 2018-11-27 | 2019-11-12 | Method and control arrangement for visualisation of obstructed view |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1851463A SE1851463A1 (en) | 2018-11-27 | 2018-11-27 | Method and control arrangement for visualisation of obstructed view |
Publications (1)
Publication Number | Publication Date |
---|---|
SE1851463A1 true SE1851463A1 (en) | 2020-05-28 |
Family
ID=70852143
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE1851463A SE1851463A1 (en) | 2018-11-27 | 2018-11-27 | Method and control arrangement for visualisation of obstructed view |
Country Status (2)
Country | Link |
---|---|
SE (1) | SE1851463A1 (en) |
WO (1) | WO2020111999A1 (en) |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2391556T3 (en) * | 2002-05-03 | 2012-11-27 | Donnelly Corporation | Object detection system for vehicles |
DE102013220312A1 (en) * | 2013-10-08 | 2015-04-09 | Bayerische Motoren Werke Aktiengesellschaft | Means of transport and method for exchanging information with a means of transportation |
US9666069B2 (en) * | 2014-02-14 | 2017-05-30 | Ford Global Technologies, Llc | Autonomous vehicle handling and performance adjustment |
DE102014205511A1 (en) * | 2014-03-25 | 2015-10-01 | Conti Temic Microelectronic Gmbh | METHOD AND DEVICE FOR DISPLAYING OBJECTS ON A VEHICLE INDICATOR |
DE102015105784A1 (en) * | 2015-04-15 | 2016-10-20 | Denso Corporation | Distributed system for detecting and protecting vulnerable road users |
US10062290B2 (en) * | 2015-12-16 | 2018-08-28 | Ford Global Technologies, Llc | Convoy vehicle look-ahead |
GB2562018A (en) * | 2016-09-15 | 2018-11-07 | Vivacity Labs Ltd | A method and system for analyzing the movement of bodies in a traffic system |
-
2018
- 2018-11-27 SE SE1851463A patent/SE1851463A1/en not_active Application Discontinuation
-
2019
- 2019-11-12 WO PCT/SE2019/051145 patent/WO2020111999A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2020111999A1 (en) | 2020-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20170287472A1 (en) | Speech recognition apparatus and speech recognition method | |
RU2010140434A (en) | INFORMATION PROCESSING DEVICE, PROGRAM, INFORMATION PROCESSING METHOD AND INFORMATION PROCESSING SYSTEM | |
CN101465004A (en) | Method, medium, and apparatus representing adaptive information of 3D depth image | |
US9509914B2 (en) | Image processing apparatus, location information adding method, and program | |
CN106228796A (en) | The learning method of infrared remote control and device | |
US20090217755A1 (en) | Smart sensor | |
JP2007071816A (en) | Position estimating system | |
JP2012103223A (en) | Method and device for discriminating position information of mobile terminal | |
US20170337098A1 (en) | Cloud device, terminal device, and method for handling abnormalities therein | |
CN104237471A (en) | Gas detection method and device | |
KR101526168B1 (en) | Effective method for removing noise in capacitive touch sensor and the touch screen device thereof | |
SE1851463A1 (en) | Method and control arrangement for visualisation of obstructed view | |
CN109625038B (en) | Track circuit state identification system and method | |
CN104272224A (en) | Method and apparatus for recognizing key input from virtual keyboard | |
CN103593956A (en) | Remote control equipment, computer equipment and control method | |
CN104951292A (en) | Data processing system and data processing method | |
JP2016099647A (en) | Information processing device | |
JP2022522912A (en) | Systems and methods for pairing devices using visual recognition | |
US7075482B2 (en) | Direction finding method and system using transmission signature differentiation | |
CN110687543A (en) | Ranging power determination method, device, equipment and storage medium | |
CN103165139B (en) | Method and device for detecting figure infra-acoustic frequency signals | |
KR101835967B1 (en) | An apparatus for indoor positioning using visible light communication and apparatus thereof | |
KR101531841B1 (en) | Effective method and module for measuring noise level in capacitive touch sensor | |
CN105892798A (en) | Information translation method and apparatus | |
JP2015172555A (en) | pulse analyzer and pulse analysis method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NAV | Patent application has lapsed |