EP3599596A1 - Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver - Google Patents

Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver

Info

Publication number
EP3599596A1
EP3599596A1 (application EP18185621.2A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
image data
control module
request
application server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP18185621.2A
Other languages
German (de)
English (en)
Inventor
Thorsten Hehn
Joakim Cerwall
Ernst Zielinski
Roman Alieiev
Teodor BUBURUZAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MAN Truck and Bus SE
Volkswagen AG
Scania CV AB
Original Assignee
MAN Truck and Bus SE
Volkswagen AG
Scania CV AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MAN Truck and Bus SE, Volkswagen AG, Scania CV AB
Priority to EP18185621.2A
Publication of EP3599596A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • G08G1/162Decentralised systems, e.g. inter-vehicle communication event-triggered
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096791Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle

Definitions

  • the present invention relates to a vehicle, an apparatus, a method, and a computer program for monitoring a vehicle, a mobile transceiver, an apparatus, a method, and a computer program for a mobile transceiver, an application server, an apparatus, a method, and a computer program for an application server, more particularly, but not exclusively, to a concept for obtaining image data of an outside of a vehicle if a trigger event occurs.
  • Vehicles may be equipped with a variety of sensors. Different sensor data of a vehicle, its components, a driving situation and other road users may be available. In some situations, information on a status of a parked vehicle may be desirable. For example, applications of mobile cell phones may be used to monitor certain conditions of a parked vehicle. Examples are tire pressure, temperature inside and outside, fuel level, maintenance interval, etc.
  • Document EP 1 367 408 A2 discloses a concept for synthesizing an image showing a situation around a car, which is produced from images taken with cameras capturing the surroundings of the vehicle.
  • Document EP 2 905 704 A1 describes a concept for a vehicle, which collects sensor data for provision to an owner in case an event is triggered. Such event can be triggered if the vehicle is parked and shock sensors register that the vehicle was hit or indicate a collision.
  • a user may activate internal cameras of a vehicle from remote (via Internet), e.g. to check whether a wallet was left inside the car.
  • Document EP 2 750 116 A1 describes a concept for automated charging for parking vehicles.
  • Document EP 1 464 540 A1 discloses a concept for a movable bumper camera using a reflecting mirror to determine an image including a break line showing the outermost part of the vehicle. These monitoring options for a parked vehicle are limited.
  • Embodiments are based on the finding that in some situations it is desirable to monitor the status of the parked vehicle also from the outside.
  • a request for the vehicle status can have several reasons. Examples are curiosity of the owner or damage of the vehicle. Such request may be initiated on a regular basis or based on an event.
  • a wireless communication system offers the possibility to reach or to communicate with other vehicles remotely.
  • Other vehicles may be able to read/recognize the status data of the vehicle.
  • Status data in this context may refer to the appearance of the vehicle or the relation between the (target) vehicle and its vicinity. For instance, an owner of a vehicle may want to see the vehicle to check the condition of the vehicle body or its surroundings after an unpleasant natural event (e.g. a storm or hail).
  • Embodiments are further based on the finding that this kind of status information may be mostly or in some cases even only measured or be seen by external observers. Embodiments may enable a service to get these observations (photos or videos) of a target vehicle by other vehicles, which are equipped with cameras and a communication unit. Embodiments are further based on the finding that communication of information on a trigger event, a request for image data, and image data itself can be implemented in various communication architectures, e.g. via an application server, directly between vehicles, or based on communication with an owner device.
  • Embodiments provide an apparatus for monitoring a vehicle.
  • Another embodiment is a vehicle with an embodiment of the apparatus.
  • the apparatus comprises one or more interfaces, which are configured to communicate in a mobile communication system.
  • the apparatus further comprises a control module, which is configured to control the one or more interfaces.
  • the control module is further configured to determine a parking situation of the vehicle and to detect a trigger event.
  • the control module is further configured to transmit a message comprising information on the trigger event to a predefined device in case the trigger event is detected.
  • Embodiments provide a concept for monitoring a vehicle by transmitting an according message based on the trigger event.
  • the predefined device is an application server or user equipment, e.g. such a message may be transmitted to another vehicle, an application server, or an owner's device.
  • the event may be triggered frequently or periodically.
  • embodiments enable communication with other devices, e.g. other vehicles or network components in a parking situation of the vehicle.
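  • As a non-binding illustration of the monitoring flow described above, the following Python sketch shows how such a control module might determine a parking situation, detect a trigger event, and transmit a message with information on the trigger event to a predefined device; the class names, the PrintInterface stub, the address app-server.example and the 1.5 g shock threshold are assumptions made for illustration only, not part of the disclosure.

```python
# Illustrative sketch only; all names and the shock threshold are assumptions.
from dataclasses import dataclass, asdict
from typing import Optional
import json
import time


@dataclass
class TriggerEvent:
    kind: str          # e.g. "shock", "owner_request", "server_request"
    timestamp: float
    detail: str = ""


class PrintInterface:
    def send(self, destination: str, payload: str) -> None:
        print(f"-> {destination}: {payload}")


class VehicleMonitorControl:
    """Control module (14): determines a parking situation, detects a trigger event and
    transmits a message to a predefined device (application server or user equipment)."""

    def __init__(self, interface, predefined_device: str):
        self.interface = interface              # one or more interfaces (12)
        self.predefined_device = predefined_device

    def is_parked(self, speed_kmh: float, ignition_on: bool) -> bool:
        # A standing vehicle with the ignition off is treated as parked here.
        return speed_kmh == 0.0 and not ignition_on

    def detect_trigger(self, shock_g: float, threshold_g: float = 1.5) -> Optional[TriggerEvent]:
        # Example trigger: an acceleration spike registered while parked.
        if shock_g >= threshold_g:
            return TriggerEvent("shock", time.time(), f"{shock_g:.1f} g measured")
        return None

    def report(self, event: TriggerEvent) -> None:
        # Message comprising information on the trigger event.
        self.interface.send(self.predefined_device,
                            json.dumps({"type": "trigger_report", "event": asdict(event)}))


if __name__ == "__main__":
    ctrl = VehicleMonitorControl(PrintInterface(), "app-server.example")
    if ctrl.is_parked(speed_kmh=0.0, ignition_on=False):
        event = ctrl.detect_trigger(shock_g=2.3)
        if event:
            ctrl.report(event)
```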
  • Embodiments also provide an apparatus for an application server, which is configured to communicate through a mobile communication network.
  • Another embodiment is an application server comprising an embodiment of the application server apparatus.
  • the apparatus comprises one or more interfaces, which are configured to communicate in the mobile communication system.
  • the apparatus further comprises a control module, which is configured to control the one or more interfaces.
  • the control module is further configured to obtain a request for obtaining image data of a first vehicle, and to determine a second vehicle, which is capable of determining such image data.
  • the control module is further configured to instruct the second vehicle to obtain and provide the image data.
  • Embodiments may enable image data provision from the outside of one vehicle by another vehicle.
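  • A minimal sketch of this application-server flow is given below, assuming an injected lookup function for finding a capable vehicle in the vicinity; the names AppServerControl and ImageRequest and the message strings are illustrative assumptions, not the claimed implementation.

```python
# Illustrative sketch; names and message strings are assumptions.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class ImageRequest:
    target_vehicle: str   # first vehicle (100) to be imaged from the outside
    reply_to: str         # requester, e.g. the owner's mobile transceiver (300)


class PrintInterface:
    def send(self, destination: str, payload: str) -> None:
        print(f"-> {destination}: {payload}")


class AppServerControl:
    """Control module (24): obtains a request, determines a capable second vehicle,
    and instructs it to obtain and provide the image data."""

    def __init__(self, interface, find_nearby: Callable[[str], Optional[str]]):
        self.interface = interface      # one or more interfaces (22)
        self.find_nearby = find_nearby  # e.g. a lookup in a vehicle/location database

    def handle(self, request: ImageRequest) -> None:
        second_vehicle = self.find_nearby(request.target_vehicle)
        if second_vehicle is None:
            self.interface.send(request.reply_to, "no capable vehicle in the vicinity")
            return
        instruction = (f"capture image data of {request.target_vehicle} "
                       f"and provide it to {request.reply_to}")
        self.interface.send(second_vehicle, instruction)


if __name__ == "__main__":
    server = AppServerControl(PrintInterface(), find_nearby=lambda vid: "vehicle-102")
    server.handle(ImageRequest(target_vehicle="vehicle-100", reply_to="owner-ue-300"))
```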
  • Embodiments also provide an apparatus for a mobile transceiver of a mobile communication system.
  • Another embodiment is a mobile transceiver comprising an embodiment of the apparatus.
  • the apparatus comprises one or more interfaces, which are configured to communicate in the mobile communication system.
  • the apparatus further comprises a control module, which is configured to control the one or more interfaces.
  • the control module is further configured to generate a request for obtaining image data of a first vehicle and to forward the request to a second vehicle, which is capable of determining such image data.
  • the control module is further configured to instruct the second vehicle to obtain and provide the image data.
  • Embodiments may hence enable a mechanism to obtain image data of a vehicle from other vehicles.
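  • The sketch below illustrates, under the same caveat, how a control module of the owner's mobile transceiver might generate such a request and forward it along one of the communication paths mentioned above; the request fields, the routing table and the destination addresses are assumptions.

```python
# Illustrative sketch; request fields and routing options are assumptions derived
# from the description above, not a specified protocol.
import json
import uuid


class PrintInterface:
    def send(self, destination: str, payload: str) -> None:
        print(f"-> {destination}: {payload}")


class OwnerDeviceControl:
    """Control module (34) of the owner's mobile transceiver (300): generates a request
    for image data of the first vehicle and forwards it towards a capable second vehicle,
    directly, via the application server, or via the first vehicle."""

    def __init__(self, interface, owner_id: str):
        self.interface = interface
        self.owner_id = owner_id

    def request_outside_image(self, first_vehicle: str,
                              route: str = "application_server") -> str:
        request = {
            "request_id": str(uuid.uuid4()),
            "type": "image_request",
            "target_vehicle": first_vehicle,
            "reply_to": self.owner_id,
        }
        destination = {
            "application_server": "app-server.example",   # hypothetical address
            "first_vehicle": first_vehicle,
            "direct": "broadcast://vicinity",              # hypothetical broadcast route
        }[route]
        self.interface.send(destination, json.dumps(request))
        return request["request_id"]


if __name__ == "__main__":
    ue = OwnerDeviceControl(PrintInterface(), owner_id="owner-ue-300")
    ue.request_outside_image("vehicle-100")
```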
  • The control module of the apparatus for monitoring the vehicle may be further configured to request a vehicle in the vicinity of the vehicle to provide image data on the vehicle.
  • Embodiments may enable an efficient mechanism for a first vehicle to find or identify a second vehicle in the vicinity, which is capable of providing image data of the first vehicle.
  • The control module of the apparatus for monitoring the vehicle may be further configured to detect the trigger event based on one or more elements of the group of information received from a vehicle owner or a vehicle owner's mobile transceiver, information received from an application server, information received from another vehicle, and sensor data from the vehicle.
  • Embodiments may hence enable different services, triggered by different network components. Such a request may be issued upon detection of the trigger event and/or upon request from another entity.
  • Embodiments may enable an efficient image data detection mechanism.
  • the control module of the apparatus for monitoring the vehicle may be further configured to determine the vehicle in the vicinity by communicating with the application server through the mobile communication system.
  • Embodiments may enable efficient communication to obtain the image data.
  • The control module of the apparatus for monitoring the vehicle is further configured to determine the vehicle in the vicinity by broadcasting a request using direct communication through the mobile communication system.
  • Embodiments may enable car-to-car (C2C) or vehicle-to-vehicle (V2V) communication, e.g. 3rd Generation Partnership Project (3GPP) V2V, to determine capable vehicles in the vicinity.
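  • The following sketch illustrates one possible shape of such a broadcast exchange: the parked vehicle broadcasts a capability query over a direct link, and a receiving vehicle answers only if it can fulfil the request; the on-air message format is an assumption, and 3GPP sidelink signalling details are deliberately omitted.

```python
# Illustrative broadcast-based discovery over a direct (V2V/D2D/C2C) link;
# the message format is an assumption.
import json
from typing import Optional


def build_discovery_request(first_vehicle: str, location: tuple) -> str:
    """Request broadcast by the parked vehicle (100) to vehicles in its vicinity."""
    return json.dumps({
        "type": "image_capability_query",
        "target_vehicle": first_vehicle,
        "target_location": location,
    })


def answer_discovery_request(raw: str, own_id: str, has_camera: bool) -> Optional[str]:
    """Runs on a receiving vehicle (102); it replies only if it can fulfil the request."""
    msg = json.loads(raw)
    if msg.get("type") != "image_capability_query" or not has_camera:
        return None
    return json.dumps({
        "type": "image_capability_response",
        "responder": own_id,
        "target_vehicle": msg["target_vehicle"],
    })


if __name__ == "__main__":
    query = build_discovery_request("vehicle-100", (52.4227, 10.7865))
    print(answer_discovery_request(query, own_id="vehicle-102", has_camera=True))
```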
  • the control module of the apparatus for monitoring the vehicle may be configured to communicate a request to provide the image data of the vehicle, to communicate with the application server, and/or to communicate with an owner of the vehicle or the owner's mobile transceiver, using the one or more interfaces.
  • Embodiments may enable different mechanisms to communicate a request for image data and the image data itself to an owner of the vehicle.
  • The control module may be further configured to receive the image data from the second vehicle and to provide the image data to an owner of the first vehicle or the owner's mobile transceiver.
  • the control module may be configured to receive the image data using the communication module.
  • image data may be received from the application server, from the (first) vehicle itself, or from another (second) vehicle in the vicinity of the (first) vehicle directly.
  • Embodiments may enable different communication paths to communicate trigger information, request information and/or the image data.
  • the control module for the application server may be further configured to instruct the second vehicle to provide the image data to the owner of the first vehicle.
  • the control module for the mobile transceiver may be configured to instruct the second vehicle (directly, via the application server, or via the first vehicle) to provide the image data to the owner of the first vehicle.
  • the application server apparatus may trigger different communication paths for the image data.
  • the application server apparatus further comprises a data base, which is configured to store information on one or more vehicles and their locations.
  • the control module of the application server apparatus may be further configured to determine the second vehicle in the vicinity of the first vehicle using the data base. Embodiments may enable a quick determination of a second vehicle through use of the data base.
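  • One way such a data base lookup could be realized is sketched below with an in-memory SQLite table and a haversine distance filter; the schema, the 150 m search radius and the SQLite backend are assumptions, as the description only states that vehicle locations are stored.

```python
# Illustrative proximity lookup against a vehicle/location data base (26);
# schema, radius and backend are assumptions.
import math
import sqlite3


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def second_vehicles_near(db: sqlite3.Connection, first_vehicle: str,
                         radius_m: float = 150.0) -> list:
    # Location of the first (target) vehicle.
    (lat, lon), = db.execute(
        "SELECT lat, lon FROM vehicles WHERE vehicle_id = ?", (first_vehicle,))
    # Camera-equipped candidates within the search radius.
    rows = db.execute(
        "SELECT vehicle_id, lat, lon FROM vehicles WHERE has_camera = 1 AND vehicle_id != ?",
        (first_vehicle,))
    return [vid for vid, vlat, vlon in rows if haversine_m(lat, lon, vlat, vlon) <= radius_m]


if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE vehicles (vehicle_id TEXT, lat REAL, lon REAL, has_camera INTEGER)")
    db.executemany("INSERT INTO vehicles VALUES (?, ?, ?, ?)", [
        ("vehicle-100", 52.4227, 10.7865, 1),   # first (target) vehicle
        ("vehicle-102", 52.4230, 10.7870, 1),   # nearby and camera-equipped
        ("vehicle-103", 52.5200, 13.4050, 1),   # far away
    ])
    print(second_vehicles_near(db, "vehicle-100"))   # -> ['vehicle-102']
```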
  • the control module at the application server apparatus may be further configured to receive information on a trigger event from the vehicle. Embodiments may enable triggering by the vehicle itself to be monitored.
  • the control module may be further configured to automatically obtain or generate the request upon reception of the information on the trigger event.
  • information on the trigger event may be forwarded to the mobile transceiver apparatus of the vehicle's owner (directly or via the application server apparatus), where the request may be generated.
  • a further embodiment is a method for monitoring a vehicle.
  • the method comprises determining a parking situation of the vehicle and detecting a trigger event.
  • the method further comprises transmitting a message comprising information on the trigger event to a predefined device in case the trigger event is detected.
  • Another embodiment is a method for an application server, which is configured to communicate through a mobile communication network.
  • the method comprises obtaining a request for obtaining image data of a first vehicle, and determining a second vehicle, which is capable of determining such image data.
  • the method further comprises instructing the second vehicle to obtain and provide the image data.
  • Yet another embodiment is a method for a mobile transceiver of a mobile communication system.
  • the method comprises generating a request for obtaining image data of a first vehicle and forwarding the request to a second vehicle, which is capable of determining such image data.
  • the method further comprises instructing the second vehicle to obtain and provide the image data.
  • Embodiments further provide a computer program having a program code for performing one or more of the above described methods, when the computer program is executed on a computer, processor, or programmable hardware component.
  • a further embodiment is a computer readable storage medium storing instructions which, when executed by a computer, processor, or programmable hardware component, cause the computer to implement one of the methods described herein.
  • the term "or" refers to a non-exclusive or, unless otherwise indicated (e.g., "or else" or "or in the alternative").
  • words used to describe a relationship between elements should be broadly construed to include a direct relationship or the presence of intervening elements unless otherwise indicated. For example, when an element is referred to as being “connected” or “coupled” to another element, the element may be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Similarly, words such as “between”, “adjacent”, and the like should be interpreted in a like fashion.
  • Fig. 1 illustrates an embodiment of an apparatus 10 for monitoring a vehicle 100.
  • the apparatus 10 is configured to, adapted to or suitable to be used in the vehicle 100.
  • the vehicle 100 is shown in Fig. 1 as optional (broken line).
  • another embodiment is a vehicle 100 comprising an embodiment of the apparatus 10.
  • the apparatus 10 comprises one or more interfaces 12, which are configured to communicate in a mobile communication system 400.
  • the apparatus 10 further comprises a control module 14, which is coupled to the one or more interfaces 12, and which is configured to control one or more interfaces 12.
  • the control module 14 is further configured to determine a parking situation of the vehicle 100.
  • the control module 14 is further configured to detect a trigger event, and to transmit a message comprising information on the trigger event to a predefined device in case the trigger event is detected.
  • Fig. 1 further illustrates an embodiment of an apparatus 20 for an application server 200 being configured to communicate through a mobile communication network 400.
  • the apparatus 20 is configured to, adapted to or suitable to be used in the application server 200.
  • the application server 200 is shown in Fig. 1 as optional (broken line).
  • another embodiment is an application server 200 comprising an embodiment of the apparatus 20.
  • the apparatus 20 comprises one or more interfaces 22 configured to communicate in the mobile communication system 400.
  • the apparatus 20 further comprises a control module 24, which is coupled to the one or more interfaces 22.
  • the control module 24 is configured to control the one or more interfaces 22.
  • the control module 24 is further configured to obtain a request for obtaining image data of a first vehicle 100, and to determine a second vehicle 102, which is capable of determining such image data.
  • the control module 24 is further configured to instruct the second vehicle 102 to obtain and provide the image data.
  • Fig. 1 also illustrates an apparatus 30 for a mobile transceiver 300 of a mobile communication system 400.
  • Another embodiment is a mobile transceiver 300 comprising an embodiment of the apparatus 30.
  • the apparatus 30 comprises one or more interfaces 32 configured to communicate in the mobile communication system 400.
  • the apparatus 30 further comprises a control module 34, which is coupled to the one or more interfaces 32.
  • the control module 34 is configured to control the one or more interfaces 32.
  • the control module 34 is further configured to generate a request for obtaining image data of a first vehicle 100, and to forward the request to a second vehicle 102, which is capable of determining such image data.
  • the control module 34 is further configured to instruct the second vehicle 102 to obtain and provide the image data.
  • FIG. 1 also illustrates an embodiment of a system 400 comprising embodiments of the apparatuses 10, 20 and 30.
  • an owner using a mobile transceiver 300 with an embodiment of the apparatus 30 may request to see an outside image of his vehicle 100.
  • the control module 34 of the apparatus 30 may further run an accordingly adapted application, which lets the owner input an according trigger for retrieving the outside image of his vehicle 100.
  • the mobile device or transceiver 300 may now determine and instruct another vehicle 102 to record or acquire such image data to be communicated and displayed at the mobile transceiver 300.
  • the mobile transceiver 300 may communicate with the first vehicle 100, which may be capable of identifying the second vehicle 102 itself, e.g. by means of using local communication, sensor data, etc.
  • the mobile transceiver 300 may determine the second vehicle 102 without the help of the first vehicle 100 in some embodiments.
  • the mobile transceiver may start a query based on the location of the first vehicle 100, which can be predetermined, provided by the first vehicle 100, determined by the mobile communication system 400, etc. Such a query may be run through an application server 200, which will be detailed subsequently.
  • the image data from the second vehicle 102 may be communicated via the first vehicle 100, via the application server 200, via other vehicles or mobile transceivers, to be finally displayed at the mobile transceiver 300.
  • the trigger may be generated by the parked vehicle 100, e.g. based on a detected shock, sensor data, an observation of another vehicle, etc. Again the second vehicle 102 can be determined by different mechanisms as laid out above.
  • the trigger event may be determined by the application server 200, e.g. upon request from the owner (mobile transceiver 300 or any other owner's device), as will be presented in more detail subsequently.
  • the one or more interfaces 12, 22, 32 may correspond to any means for obtaining, receiving, transmitting or providing analog or digital signals or information, e.g. any connector, contact, pin, register, input port, output port, conductor, lane, etc. which allows providing or obtaining a signal or information.
  • An interface may be wireless or wireline and it may be configured to communicate, i.e. transmit or receive signals, information with further internal or external components.
  • the one or more interfaces 12, 22, 32 may comprise further components to enable according communication in the mobile communication system 400. Such components may include transceiver (transmitter and/or receiver) components, such as one or more Low-Noise Amplifiers (LNAs), one or more Power Amplifiers (PAs), one or more duplexers, one or more diplexers, one or more filters or filter circuitry, one or more converters, one or more mixers, accordingly adapted radio frequency components, etc.
  • the one or more interfaces 12, 22, 32 may be coupled to one or more antennas, which may correspond to any transmit and/or receive antennas, such as horn antennas, dipole antennas, patch antennas, sector antennas etc.
  • the antennas may be arranged in a defined geometrical setting, such as a uniform array, a linear array, a circular array, a triangular array, a uniform field antenna, a field array, combinations thereof, etc.
  • the one or more interfaces 12, 22, 32 may serve the purpose of transmitting or receiving or both, transmitting and receiving, information, such as information related to capabilities, application requirements, trigger indications, requests, message interface configurations, feedback, information related to control commands etc.
  • control modules 14, 24, 34 may be implemented using one or more processing units, one or more processing devices, any means for processing, such as a processor, a computer or a programmable hardware component being operable with accordingly adapted software.
  • the described functions of the control modules 14, 24, 34 may as well be implemented in software, which is then executed on one or more programmable hardware components.
  • Such hardware components may comprise a general purpose processor, a Digital Signal Processor (DSP), a micro-controller, etc.
  • DSP Digital Signal Processor
  • Fig. 1 also shows an embodiment of a system 400 comprising embodiments of the vehicles 100, 102 (mobile or relay transceivers), a mobile transceiver 300, and the application server 200, which may correspond to a network controller/server or base station, respectively.
  • communication, i.e. transmission, reception or both, may take place among mobile transceivers/vehicles 100, 102 directly and/or between mobile transceivers/vehicles 100, 102 and a network component (infrastructure or mobile transceiver) or application server 200 (e.g. a base station, a network server, a backend server, etc.).
  • Such communication may make use of a mobile communication system 400.
  • Such communication may be carried out directly, e.g. by means of Device-to-Device (D2D) communication, which may also comprise Vehicle-to-Vehicle (V2V) or car-to-car communication in the case of vehicles 100, 102.
  • Such communication may be carried out using the specifications of a mobile communication system 400.
  • the mobile communication system 400 may, for example, correspond to one of the Third Generation Partnership Project (3GPP)-standardized mobile communication networks, where the term mobile communication system is used synonymously to mobile communication network.
  • the mobile or wireless communication system 400 may correspond to a mobile communication system of the 5th Generation (5G) and may use mm-Wave technology.
  • the mobile communication system may correspond to or comprise, for example, a Long-Term Evolution (LTE), an LTE-Advanced (LTE-A), High Speed Packet Access (HSPA), a Universal Mobile Telecommunication System (UMTS) or a UMTS Terrestrial Radio Access Network (UTRAN), an evolved-UTRAN (e-UTRAN), a Global System for Mobile communication (GSM) or Enhanced Data rates for GSM Evolution (EDGE) network, a GSM/EDGE Radio Access Network (GERAN), or mobile communication networks with different standards, for example, a Worldwide Inter-operability for Microwave Access (WIMAX) network IEEE 802.16 or Wireless Local Area Network (WLAN) IEEE 802.11, generally an Orthogonal Frequency Division Multiple Access (OFDMA) network, a Time Division Multiple Access (TDMA) network, a Code Division Multiple Access (CDMA) network, a Wideband-CDMA (WCDMA) network, a Frequency Division Multiple Access (FDMA) network, a Spatial Division Multiple Access (SDMA) network, etc
  • a base station transceiver can be operable or configured to communicate with one or more active mobile transceivers/vehicles 100, 102, 300 and a base station transceiver can be located in or adjacent to a coverage area of another base station transceiver, e.g. a macro cell base station transceiver or small cell base station transceiver.
  • a mobile communication system 400 comprising two or more mobile transceivers/vehicles 100, 102, 300 and one or more base station transceivers, wherein the base station transceivers may establish macro cells or small cells, as e.g. pico-, metro-, or femto cells.
  • a mobile transceiver may correspond to a smartphone, a cell phone, user equipment, a laptop, a notebook, a personal computer, a Personal Digital Assistant (PDA), a Universal Serial Bus (USB) stick, a car, a vehicle etc.
  • a mobile transceiver may also be referred to as User Equipment (UE) or mobile in line with the 3GPP terminology.
  • a vehicle 100, 102 may correspond to any conceivable means for transportation, e.g. a car, a bike, a motorbike, a van, a truck, a bus, a ship, a boat, a plane, a train, a tram, etc.
  • a base station transceiver can be located in the fixed or stationary part of the network or system.
  • a base station transceiver may correspond to a remote radio head, a transmission point, an access point, a macro cell, a small cell, a micro cell, a femto cell, a metro cell etc.
  • a base station transceiver can be a wireless interface of a wired network, which enables transmission of radio signals to a UE or mobile transceiver.
  • Such a radio signal may comply with radio signals as, for example, standardized by 3GPP or, generally, in line with one or more of the above listed systems.
  • a base station transceiver may correspond to a NodeB, an eNodeB, a Base Transceiver Station (BTS), an access point, a remote radio head, a relay station, a transmission point etc., which may be further subdivided in a remote unit and a central unit.
  • a mobile transceiver 100, 102, 300 can be associated with a base station transceiver or cell.
  • the term cell refers to a coverage area of radio services provided by a base station transceiver, e.g. a NodeB (NB), an eNodeB (eNB), a remote radio head, a transmission point, etc.
  • a base station transceiver may operate one or more cells on one or more frequency layers; in some embodiments a cell may correspond to a sector. For example, sectors can be achieved using sector antennas, which provide a characteristic for covering an angular section around a remote unit or base station transceiver.
  • a base station transceiver may, for example, operate three or six cells covering sectors of 120° (in case of three cells), 60° (in case of six cells) respectively.
  • a base station transceiver may operate multiple sectorized antennas.
  • a cell may represent an according base station transceiver generating the cell or, likewise, a base station transceiver may represent a cell the base station transceiver generates.
  • Mobile transceivers 100, 102, 300 may communicate directly with each other, i.e. without involving any base station transceiver, which is also referred to as Device-to-Device (D2D) communication.
  • An example of D2D is direct communication between vehicles, also referred to as Vehicle-to-Vehicle (V2V) or car-to-car communication, e.g. using IEEE 802.11p.
  • the one or more interfaces 12, 22, 32 can be configured to use this kind of communication.
  • radio resources are used, e.g. frequency, time, code, and/or spatial resources, which may as well be used for wireless communication with a base station transceiver.
  • the assignment of the radio resources may be controlled by a base station transceiver, i.e. the determination which resources are used for D2D and which are not.
  • radio resources of the respective components may correspond to any radio resources conceivable on radio carriers and they may use the same or different granularities on the respective carriers.
  • the radio resources may correspond to a Resource Block (RB as in LTE/LTE-A/LTE-unlicensed (LTE-U)), one or more carriers, sub-carriers, one or more radio frames, radio sub-frames, radio slots, one or more code sequences potentially with a respective spreading factor, one or more spatial resources, such as spatial sub-channels, spatial precoding vectors, any combination thereof, etc.
  • transmission according to 3GPP Release 14 onward can be managed by infrastructure (so-called mode 3) or run in a User Equipment (UE) Autonomous mode (UEA, so-called mode 4).
  • the two or more mobile transceivers 100, 102, 300 as indicated by Fig. 1 may be registered in the same mobile communication system 400.
  • one or more of the mobile transceivers 100, 102, 300 may be registered in different mobile communication systems 400.
  • the different mobile communication systems 400 may use the same access technology but different operators or they may use different access technologies as outlined above.
  • a status of a parked vehicle 100 is monitored.
  • the request for the vehicle status can have several reasons (curiosity of the owner or damage of the vehicle, for instance) and triggers, which could be initiated either on a regular basis or based on an event (storm, hail, accident, etc.).
  • the wireless communication system 400 offers the possibility to reach the vehicle 100 remotely and read/recognize the status data of the vehicle 100.
  • Status data may refer to the appearance of the vehicle 100 or the relation between the (target) vehicle 100 and its vicinity. For instance, an owner of the vehicle 100 may want to see the vehicle to check the condition of the vehicle body or its surroundings after an unpleasant natural event (e.g. a storm or hail). This kind of status information may be measured or be seen by external observers or vehicles 102.
  • Embodiments may enable a service to get these observations (photos or videos) of a target vehicle 100 by other vehicles 102, which are equipped with cameras and a communication unit.
  • a vehicle status may be interesting in many scenarios.
  • in one scenario the owner or a monitoring/application server 200 initiates a status request (regular or event dependent), and in another scenario the vehicle 100 itself may initiate the status monitoring (regular or event dependent).
  • another vehicle may generate the event, e.g. a vehicle that just had contact with vehicle 100 or which witnessed contact between any object and vehicle 100.
  • the status report, or message comprising information on the trigger event from the vehicle 100, might be simple information such as "I am ok", a photo (of the area around the vehicle 100 or of the vehicle 100 itself), or a detailed report based on predefined parameters.
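  • The following sketch shows one possible, purely illustrative encoding of these three kinds of status report; the field names are assumptions and no particular message format is prescribed by the description.

```python
# Illustrative report payloads for the kinds of status report mentioned above;
# field names are assumptions.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class StatusReport:
    vehicle_id: str
    kind: str                                       # "ok", "photo" or "detailed"
    text: str = ""                                  # e.g. "I am ok"
    photo: Optional[bytes] = None                   # image of the vehicle or its surroundings
    parameters: dict = field(default_factory=dict)  # predefined report parameters


simple = StatusReport(vehicle_id="vehicle-100", kind="ok", text="I am ok")
detailed = StatusReport(vehicle_id="vehicle-100", kind="detailed",
                        parameters={"body_damage": False, "tire_pressure_bar": 2.4})
print(simple, detailed, sep="\n")
```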
  • Fig. 2 illustrates a communication scenario in an embodiment.
  • Fig. 2 shows an embodiment of a vehicle 100 comprising an embodiment of the above apparatus 10.
  • the vehicle 100 may have multiple access possibilities or technologies, e.g. interface 12 may use a 3GPP RAN or Wi-Fi to access a mobile communication system 400.
  • the mobile communication system 400 comprises one or more operator core networks and connects, via the Internet in the embodiment shown in Fig. 2, to an embodiment of an application server 200.
  • User equipment 300 may be connected to the mobile communication system 400 via another operator core network and another RAN.
  • the predefined device, to which the message from the vehicle 100 is transmitted, may be the application server 200 or the user equipment 300.
  • a simple embodiment is illustrated.
  • the target vehicle 100 is damaged and requests a status report process to inform the owner 300 about this situation.
  • this request is sent to either the application server 200 or the vehicle owner 300, which then initiates a status monitoring process.
  • the control module 14 at the apparatus 10 may be further configured to detect the trigger event based on one or more elements of the group of information received from a vehicle owner, information received from an application server 200, information received from another vehicle 102, and sensor data from the vehicle 100.
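  • A compact, illustrative way to evaluate these trigger sources is sketched below; the message fields and the shock threshold are assumptions made for illustration.

```python
# Illustrative evaluation of the trigger sources listed above; fields and threshold
# are assumptions.
from typing import Optional


def detect_trigger(owner_msg: Optional[dict] = None,
                   server_msg: Optional[dict] = None,
                   other_vehicle_msg: Optional[dict] = None,
                   shock_sensor_g: float = 0.0,
                   shock_threshold_g: float = 1.5) -> Optional[str]:
    """Returns the source of the trigger event, or None if no trigger is detected."""
    if owner_msg and owner_msg.get("type") == "status_request":
        return "vehicle_owner"
    if server_msg and server_msg.get("type") == "status_request":
        return "application_server"
    if other_vehicle_msg and other_vehicle_msg.get("type") == "collision_witnessed":
        return "other_vehicle"
    if shock_sensor_g >= shock_threshold_g:        # sensor data from the vehicle itself
        return "vehicle_sensor"
    return None


print(detect_trigger(other_vehicle_msg={"type": "collision_witnessed"}))  # other_vehicle
print(detect_trigger(shock_sensor_g=2.3))                                 # vehicle_sensor
```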
  • another vehicle 102 may monitor an accident and inform the vehicle 100 thereby triggering the event.
  • the vehicle 100 is reachable via mobile device 300 (or via server 200).
  • the vehicle 100 may detect an unusual situation and may want to know the status of its vehicle body.
  • the control module 14 of the monitoring apparatus 10 is further configured to request a vehicle 102 in the vicinity of the vehicle 100 to provide image data on the vehicle 100.
  • the control module 14 may be further configured to request the image data upon detection of the trigger event (e.g. an accident or shaking of the vehicle) and/or upon request (e.g. from an owner of the vehicle 100, another vehicle, or an application server 200).
  • the control module 14 is configured to determine the vehicle 102 in the vicinity by communicating with an application server 200 through the mobile communication system 400.
  • the application server 200 may keep track of vehicles and may hence have information on potential vehicles in the vicinity of vehicle 100 available.
  • the application server 200 may be configured to determine the vehicle 102 in the vicinity of vehicle 100 on demand or upon request.
  • the control module 34 can be configured to trigger the event, e.g. based on weather conditions, and start a query for the vehicle 102.
  • vehicles 102 or other mobile transceivers may let the mobile transceiver 300 know whether they are capable of fulfilling the request and provide the requested data to the mobile transceiver 300, e.g. via the vehicle 100 and/or the application server 200.
  • the control module 14 may be further configured to determine the vehicle 102 in the vicinity by broadcasting a request using direct communication through the mobile communication system 400. Hence, in some embodiments the vehicle 100 may broadcast such a request to vehicles in its vicinity. Generally in embodiments, the control module 14 may be configured to communicate a request to provide the image data of the vehicle 100, to communicate with the application server 200, and/or to communicate with an owner of the vehicle 100 using the one or more interfaces 12.
  • the control module 24 at the application server apparatus 20 may be configured to receive the image data from the second vehicle 102 and to provide the image data to an owner (e.g. the mobile transceiver 300 of the owner) of the first vehicle 100. The control module 24 may further be configured to instruct the second vehicle 102 to provide the image data to the owner 300 of the first vehicle 100.
  • control module 34 of the mobile transceiver apparatus 30 may be configured to determine and instruct the second vehicle 102 as explained above.
  • the control module 24, additionally or alternatively the control module 34, may be further configured to receive information on a trigger event from the vehicle 100.
  • the control module 24, 34 may be further configured to obtain the request upon reception of the information on the trigger event.
  • the application server 200, which may potentially be operated by a vehicle manufacturer or a service provider, may possess a data base of one or more vehicles and their positions/locations.
  • the application server apparatus 20 may further comprise a data base 26, which is configured to store information on one or more vehicles and their locations.
  • the control module 24 may be further configured to determine the second vehicle 102 in the vicinity of the first vehicle 100 using the data base 26.
  • Fig. 3 illustrates a communication sequence in an embodiment.
  • Fig. 3 illustrates vehicle 100, which, after being hit by another vehicle, requests image data on its body. Similar to what was already described with respect to Fig. 2, the vehicle 100 is connected to a mobile communication system 400 via one or more RANs, potentially being interconnected by the Internet.
  • An application server 200 and user equipment 300 may as well be connected to the network 400.
  • the request for image data may hence be communicated from the vehicle 100 to the application server 200.
  • the application server 200 may request to take a photo or the status of vehicle 100 from vehicle 102, which is located in the vicinity of vehicle 100. Vehicle 102 may then take a photo of vehicle 100 and send this photo to the requester (application server 200, vehicle 100, or owner UE 300).
  • the target vehicle 100 may request the application server 200 (or owner 300), which is operated by a vehicle manufacturer or a service provider and which may possess a data base of vehicles and their positions/locations, to contact the vehicles in the proximity of the target vehicle 100 via a wireless connection and to request the corresponding measurements.
  • alternatively, the target vehicle 100 may request the vehicles in the proximity of the target vehicle 100 via a wireless connection to perform the corresponding measurements and may request the vehicles to send this measurement (either to the target vehicle 100, to an application server 200, or to an owner UE 300, for instance).
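  • The short walk-through below replays the Fig. 3 sequence with an in-memory message bus; the Bus class and the message strings merely show the order of the exchanged messages and are assumptions, not a real protocol stack.

```python
# Illustrative end-to-end walk-through of the Fig. 3 sequence; all names are assumptions.
class Bus:
    """In-memory stand-in for the mobile communication system (400)."""
    def send(self, src: str, dst: str, msg: str) -> None:
        print(f"{src} -> {dst}: {msg}")


def fig3_sequence(bus: Bus) -> None:
    # 1. The parked target vehicle 100 detects a trigger (e.g. it was hit) and reports it.
    bus.send("vehicle-100", "app-server", "trigger: shock detected while parked")
    # 2. The application server determines vehicle 102 in the vicinity (e.g. via its
    #    data base) and instructs it to take a photo of vehicle 100.
    bus.send("app-server", "vehicle-102", "please photograph vehicle-100")
    # 3. Vehicle 102 takes the photo and provides the image data to the requester
    #    (here the owner's UE 300; it could also go to the server or to vehicle 100).
    bus.send("vehicle-102", "owner-ue-300", "image data of vehicle-100")


if __name__ == "__main__":
    fig3_sequence(Bus())
```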
  • Fig. 4 shows another communication sequence in an embodiment.
  • Fig. 4 shows similar components as already described with the help of Figs. 2 and 3.
  • vehicle 100 requests vehicle 102, e.g. by means of a broadcast message, to take a photo/detect its status and to provide the image data to the requester.
  • Fig. 5 shows a block diagram of a flow chart of an embodiment of a method 40 for monitoring a vehicle 100.
  • the method 40 for monitoring the vehicle 100 comprises determining 42 a parking situation of the vehicle 100, detecting 44 a trigger event, and transmitting 46 a message comprising information on the trigger event to a predefined device in case the trigger event is detected.
  • Fig. 6 shows a block diagram of a flow chart of an embodiment of a method 50 for an application server 200.
  • the method 50 for the application server 200 is configured to communicate through a mobile communication network 400.
  • the method 50 comprises obtaining 52 a request for obtaining image data of a first vehicle 100, determining 54 a second vehicle 102, which is capable of determining such image data, and instructing 56 the second vehicle 102 to obtain and provide the image data.
  • Fig. 7 shows a block diagram of a flow chart of an embodiment of a method 60 for a mobile transceiver 300 of a mobile communication system 400.
  • the method 60 comprises generating 62 a request for obtaining image data of a first vehicle 100.
  • the method 60 further comprises forwarding 64 the request to a second vehicle 102, which is capable of determining such image data, and instructing 66 the second vehicle 102 to obtain and provide the image data.
  • the respective methods may be implemented as computer programs or codes, which can be executed on a respective hardware.
  • another embodiment is a computer program having a program code for performing at least one of the above methods, when the computer program is executed on a computer, a processor, or a programmable hardware component.
  • a further embodiment is a (non-transitory) computer readable storage medium storing instructions which, when executed by a computer, processor, or programmable hardware component, cause the computer to implement one of the methods described herein.
  • Some embodiments relate to program storage devices, e.g. digital data storage media, which are machine or computer readable and encode machine-executable or computer-executable programs of instructions, where said instructions perform some or all of the steps of the methods described herein.
  • the program storage devices may be, e.g., digital memories, magnetic storage media such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.
  • the embodiments are also intended to cover computers programmed to perform said steps of methods described herein or (field) programmable logic arrays ((F)PLAs) or (field) programmable gate arrays ((F)PGAs), programmed to perform said steps of the above-described methods.
  • When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared.
  • explicit use of the term "processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, Digital Signal Processor (DSP) hardware, network processor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional or custom, may also be included. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
  • any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the invention.
  • any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
  • each claim may stand on its own as a separate embodiment. While each claim may stand on its own as a separate embodiment, it is to be noted that - although a dependent claim may refer in the claims to a specific combination with one or more other claims - other embodiments may also include a combination of the dependent claim with the subject matter of each other dependent claim. Such combinations are proposed herein unless it is stated that a specific combination is not intended. Furthermore, it is intended to include also features of a claim to any other independent claim even if this claim is not directly made dependent to the independent claim.
EP18185621.2A 2018-07-25 2018-07-25 Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver Pending EP3599596A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP18185621.2A 2018-07-25 2018-07-25 Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP18185621.2A 2018-07-25 2018-07-25 Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver

Publications (1)

Publication Number Publication Date
EP3599596A1 true EP3599596A1 (fr) 2020-01-29

Family

ID=63047277

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18185621.2A 2018-07-25 2018-07-25 Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver

Country Status (1)

Country Link
EP (1) EP3599596A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1288887A2 * 2001-08-07 2003-03-05 Mazda Motor Corporation Vehicle control gain transmission system and method
EP1367408A2 2002-05-31 2003-12-03 Matsushita Electric Industrial Co., Ltd. Vehicle surroundings monitoring apparatus and image production method
EP1464540A1 2003-03-31 2004-10-06 Mazda Motor Corporation Vehicle monitoring system
US20090231429A1 * 2008-03-15 2009-09-17 International Business Machines Corporation Informing a driver or an owner of a vehicle of visible problems detected by outside video sources
JP2009288915A * 2008-05-28 2009-12-10 Kayaba Ind Co Ltd Drive recorder
EP2750116A1 2012-07-05 2014-07-02 Javier De La Plaza Ortega Automatically controlled parking system and associated method
US20140324247A1 * 2013-04-29 2014-10-30 Intellectual Discovery Co., Ltd. Vehicular image processing apparatus and method of sharing data using the same
EP2905704A1 2014-02-05 2015-08-12 Harman International Industries, Incorporated Self-monitoring and alert system for intelligent vehicle
EP3109834A1 * 2015-06-23 2016-12-28 LG Electronics Inc. Mobile terminal for sharing an image with a nearby blackbox device and method for controlling the same

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115396543A * 2022-07-21 2022-11-25 摩拜(北京)信息技术有限公司 Vehicle control method, mobile terminal and server
CN115396543B * 2022-07-21 2024-03-29 摩拜(北京)信息技术有限公司 Vehicle control method, mobile terminal and server

Similar Documents

Publication Publication Date Title
JP7219237B2 (ja) Proxy coordinated wireless communication operation for vehicular environments
CN111497839B (zh) System, vehicle, network component, apparatus, method and computer program for a vehicle and a network component
JP6716833B2 (ja) Discovery and establishment of communication groups for wireless vehicle communication
US11636840B2 (en) Vehicle, apparatus, method and computer program for sharing sound data
EP2830356B1 (fr) Apparatus, data server, vehicle, method and computer program for configuring neighbour cell measurements of a mobile relay node
EP3598413A1 (fr) Apparatus, method and computer program for a mobile transceiver
US11153768B2 (en) Vehicle, network component and apparatus for a mobile transceiver, methods and computer programs for multi-client sampling
US11272422B2 (en) Vehicle, system, apparatuses, methods, and computer programs for user equipment of a mobile communication system
EP3614356A1 (fr) Apparatus, platooning vehicle, vehicle platoon, method and computer program for a platooning vehicle
US11702109B2 (en) Method, computer program, apparatus, vehicle, and traffic entity for updating an environmental model of a vehicle
EP3599596A1 (fr) Vehicle, apparatus, method and computer program for monitoring a vehicle, application server, apparatus, method and computer program for an application server, mobile transceiver, apparatus, method and computer program for a mobile transceiver
US10841763B2 (en) Configurable message interface
EP3696786B1 (fr) System, vehicle, network component, apparatuses, methods and computer programs for a vehicle and a network component
EP3614357A1 (fr) Vehicles, network component, apparatuses, methods and computer programs for a vehicle, for a platooning vehicle and for a network component
US11383724B2 (en) Method, apparatus and computer program for transferring an execution of a function for a vehicle between a backend entity and the vehicle, method, apparatus and computer program for a vehicle and method, apparatus and computer program for a backend entity
US20220007187A1 (en) Proxy coordinated wireless communication operation for vehicular environments
EP3567345A1 (fr) Vehicle, apparatus, method and computer program for adapting a current route, network component, method and computer program for providing data on a traffic-disturbing vehicle
EP3964909A1 (fr) Methods, computer programs, apparatuses for a vehicle and a control center for resolving a traffic deadlock situation of an automated vehicle
EP3965443A1 (fr) Vehicles, methods, computer programs and apparatuses for resolving a traffic deadlock situation of an automated vehicle
EP3611981A1 (fr) Apparatuses, methods and computer programs for a mobile transceiver and for a network entity, mobile transceiver, network entity, mobile communication system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200729

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: VOLKSWAGEN AG

Owner name: MAN TRUCK & BUS SE

Owner name: SCANIA CV AB

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20211110