WO2020222319A1 - Electronic device for vehicle and method for operating the electronic device for vehicle - Google Patents


Info

Publication number
WO2020222319A1
WO2020222319A1 (PCT/KR2019/005140)
Authority
WO
WIPO (PCT)
Prior art keywords
processor
user
vehicle
electronic device
image data
Prior art date
Application number
PCT/KR2019/005140
Other languages
English (en)
Korean (ko)
Inventor
조성일
김자연
장유준
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to PCT/KR2019/005140
Publication of WO2020222319A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/02 - Reservations, e.g. for tickets, services or events
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 - Status alarms
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/14 - Central alarm receiver or annunciator arrangements
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06 - Signalling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources

Definitions

  • the present invention relates to an electronic device for a vehicle and a method of operating the electronic device for a vehicle.
  • a vehicle is a device that moves in a direction desired by a boarding user.
  • a typical example is a car.
  • Autonomous vehicle refers to a vehicle that can be driven automatically without human driving operation.
  • shared autonomous vehicles are being introduced to the industry. Since such a shared autonomous vehicle can be used by an unspecified number of people, a safety mechanism against crime is required when sharing a vehicle with others.
  • an object of the present invention is to provide an electronic device that prepares for crime in a shared vehicle.
  • an object of the present invention is to provide a method of operating an electronic device that prepares for crime in a shared vehicle.
  • in order to achieve the above objects, an electronic device according to an embodiment of the present invention is provided in a shared vehicle and includes at least one processor configured to: receive indoor image data from a camera; set, based on the indoor image data, a first occupied area of a first user and a second occupied area of a second user; set a virtual barrier between the first occupied area and the second occupied area; and determine whether the second user invades the virtual barrier.
  • a method of operating an electronic device provided in a shared vehicle according to an embodiment of the present invention includes: receiving, by at least one processor, indoor image data from a camera; setting, by the at least one processor, a first occupied area of a first user and a second occupied area of a second user based on the indoor image data; setting, by the at least one processor, a virtual barrier between the first occupied area and the second occupied area; and determining, by the at least one processor, whether the second user invades the virtual barrier.
  • FIG. 1 is a view showing the exterior of a vehicle according to an embodiment of the present invention.
  • FIG. 2 is a control block diagram of a vehicle according to an embodiment of the present invention.
  • FIG 3 is a view showing the interior of a vehicle according to an embodiment of the present invention.
  • FIG. 4 is a control block diagram of an electronic device according to an embodiment of the present invention.
  • FIG. 5 is a flow chart of an electronic device according to an embodiment of the present invention.
  • FIGS. 6A and 6B are views referenced to explain a virtual barrier according to an embodiment of the present invention.
  • FIGS. 7A to 7C are views referenced to explain a crime preparation operation according to an embodiment of the present invention.
  • FIG. 1 is a view showing a vehicle according to an embodiment of the present invention.
  • a vehicle 10 is defined as a means of transport running on a road or track.
  • the vehicle 10 is a concept including a car, a train, and a motorcycle.
  • the vehicle 10 may be a concept including both an internal combustion engine vehicle including an engine as a power source, a hybrid vehicle including an engine and an electric motor as a power source, and an electric vehicle including an electric motor as a power source.
  • the vehicle 10 may be a shared vehicle.
  • the vehicle 10 may be an autonomous vehicle.
  • the electronic device 100 may be included in the shared vehicle 10.
  • the electronic device 100 may determine an abnormal behavior of the user and perform a crime preparation operation according to the determination result.
  • FIG. 2 is a control block diagram of a vehicle according to an embodiment of the present invention.
  • the vehicle 10 includes an electronic device 100 for a vehicle, a user interface device 200, an object detection device 210, a communication device 220, a driving operation device 230, a main ECU 240, a vehicle driving device 250, a driving system (ADAS) 260, a sensing unit 270, and a location data generating device 280.
  • the vehicle electronic device 100 may sense a user's behavior.
  • the vehicle electronic device 100 may determine an abnormal behavior of a user.
  • the vehicle electronic device 100 may perform a crime preparation operation according to the determination result.
  • the user interface device 200 is a device for communicating with the vehicle 10 and a user.
  • the user interface device 200 may receive a user input and provide information generated in the vehicle 10 to the user.
  • the vehicle 10 may implement a user interface (UI) or a user experience (UX) through the user interface device 200.
  • the object detection device 210 may detect an object outside the vehicle 10.
  • the object detection device 210 may include at least one of a camera, a radar, a lidar, an ultrasonic sensor, and an infrared sensor.
  • the object detection device 210 may provide data on an object generated based on a sensing signal generated by a sensor to at least one electronic device included in the vehicle.
  • the communication device 220 may exchange signals with devices located outside the vehicle 10.
  • the communication device 220 may exchange signals with at least one of an infrastructure (eg, a server, a broadcasting station) and another vehicle.
  • the communication device 220 may include at least one of a transmission antenna, a reception antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • the driving operation device 230 is a device that receives a user input for driving. In the case of the manual mode, the vehicle 10 may be driven based on a signal provided by the driving operation device 230.
  • the driving operation device 230 may include a steering input device (eg, a steering wheel), an acceleration input device (eg, an accelerator pedal), and a brake input device (eg, a brake pedal).
  • the main ECU 240 may control the overall operation of at least one electronic device provided in the vehicle 10.
  • the vehicle drive device 250 is a device that electrically controls driving of various devices in the vehicle 10.
  • the vehicle driving apparatus 250 may include a power train driving unit, a chassis driving unit, a door/window driving unit, a safety device driving unit, a lamp driving unit, and an air conditioning driving unit.
  • the power train driving unit may include a power source driving unit and a transmission driving unit.
  • the chassis driving unit may include a steering driving unit, a brake driving unit, and a suspension driving unit.
  • the safety device driving unit may include a safety belt driving unit for controlling the safety belt.
  • the ADAS 260 may control a movement of the vehicle 10 or generate a signal for outputting information to a user based on data on an object received by the object detection apparatus 210.
  • the ADAS 260 may provide the generated signal to at least one of the user interface device 200, the main ECU 240, and the vehicle driving device 250.
  • the ADAS 260 may implement at least one of: an adaptive cruise control (ACC) system, an autonomous emergency braking (AEB) system, a forward collision warning (FCW) system, a lane keeping assist (LKA) system, a lane change assist (LCA) system, a target following assist (TFA) system, a blind spot detection (BSD) system, a high beam assist (HBA) system, an auto parking system (APS), a pedestrian (PD) collision warning system, a traffic sign recognition (TSR) system, a traffic sign assist (TSA) system, a night vision (NV) system, a driver status monitoring (DSM) system, and a traffic jam assist (TJA) system.
  • the sensing unit 270 may sense the state of the vehicle.
  • the sensing unit 270 may include at least one of an inertial measurement unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight detection sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an illuminance sensor, an accelerator pedal position sensor, and a brake pedal position sensor.
  • the inertial measurement unit (IMU) sensor may include one or more of an acceleration sensor, a gyro sensor, and a magnetic sensor.
  • the sensing unit 270 may generate state data of the vehicle based on a signal generated by at least one sensor.
  • the sensing unit 270 may generate vehicle attitude information, vehicle motion information, vehicle yaw information, vehicle roll information, vehicle pitch information, vehicle collision information, vehicle direction information, vehicle angle information, vehicle speed information, and the like.
  • the sensing unit 270 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
  • the sensing unit 270 may generate vehicle state information based on the sensing data.
  • the vehicle status information may be information generated based on data sensed by various sensors provided inside the vehicle.
  • the vehicle status information includes vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, It may include vehicle steering information, vehicle interior temperature information, vehicle interior humidity information, pedal position information, vehicle engine temperature information, and the like.
  • the sensing unit 270 may include a tension sensor.
  • the tension sensor may generate a sensing signal based on a tension state of the seat belt.
  • the location data generating device 280 may generate location data of the vehicle 10.
  • the location data generating apparatus 280 may include at least one of a Global Positioning System (GPS) and a Differential Global Positioning System (DGPS).
  • the location data generating apparatus 280 may generate location data of the vehicle 10 based on a signal generated by at least one of GPS and DGPS.
  • the location data generating apparatus 280 may correct the location data based on at least one of an IMU (Inertial Measurement Unit) of the sensing unit 270 and a camera of the object detection apparatus 210.
  • Vehicle 10 may include an internal communication system 50.
  • a plurality of electronic devices included in the vehicle 10 may exchange signals through the internal communication system 50.
  • the signal may contain data.
  • the internal communication system 50 may use at least one communication protocol (eg, CAN, LIN, FlexRay, MOST, Ethernet).
  • FIG 3 is a view showing the interior of a vehicle according to an embodiment of the present invention.
  • FIG. 4 is a control block diagram of an electronic device according to an embodiment of the present invention.
  • the electronic device 100 may include a memory 140, a processor 170, an interface unit 180, and a power supply unit 190.
  • the electronic device 100 may further include a microphone 101, an emergency communication device 102, a camera 130, and a speaker 103 individually or in combination.
  • the microphone 101 can convert an audio signal into an electrical signal.
  • the microphone 101 may receive the voice of a person riding in the shared vehicle 10 and convert it into an electrical signal.
  • the emergency communication device 102 can establish an emergency communication channel with a device external to the vehicle.
  • the emergency communication device 102 may include at least one of a transmission antenna, a reception antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element.
  • the external device may be a communication-capable electronic device located at a police station, a government office, a fire station, a hospital, or the like.
  • the camera 130 may be installed at at least one point in the interior of the shared vehicle 10.
  • the camera 130 may photograph the interior of the shared vehicle 10.
  • the camera 130 may generate indoor image data.
  • the camera 130 may provide the generated image data to the processor 170.
  • the camera 130 may be provided in plural.
  • the camera 130 may include a first camera and a second camera disposed at a different angle from the first camera.
  • the first camera may generate indoor image data of a first view point.
  • the second camera may generate indoor image data of a second viewpoint.
  • the camera 130 may include more cameras.
  • the speaker 103 can convert an electrical signal into an audio signal.
  • the speaker 103 can convert an electrical signal based on a signal from an external device into an audio signal.
  • when an emergency communication channel is established with a device external to the vehicle, an occupant of the shared vehicle 10 may communicate with a person at the external device through the microphone 101 and the speaker 103.
  • the memory 140 is electrically connected to the processor 170.
  • the memory 140 may store basic data for a unit, control data for controlling the operation of the unit, and input/output data.
  • the memory 140 may store data processed by the processor 170.
  • the memory 140 may be configured with at least one of ROM, RAM, EPROM, flash drive, and hard drive.
  • the memory 140 may store various data for overall operation of the electronic device 100, such as a program for processing or controlling the processor 170.
  • the memory 140 may be implemented integrally with the processor 170. Depending on the embodiment, the memory 140 may be classified as a sub-element of the processor 170.
  • the memory 140 may store image data generated by the camera 130. When it is determined by the processor 170 that the second user is invading the virtual barrier, the memory 140 may store image data that is the basis of the determination.
  • the interface unit 180 may exchange signals with at least one electronic device provided in the vehicle 10 by wire or wirelessly.
  • the interface unit 180 may exchange signals, by wire or wirelessly, with at least one of the object detection device 210, the communication device 220, the driving operation device 230, the main ECU 240, the vehicle driving device 250, the ADAS 260, the sensing unit 270, and the location data generating device 280.
  • the interface unit 180 may be configured with at least one of a communication module, a terminal, a pin, a cable, a port, a circuit, an element, and a device.
  • the interface unit 180 may receive location data of the vehicle 10 from the location data generating device 280.
  • the interface unit 180 may receive driving speed data from the sensing unit 270.
  • the interface unit 180 may receive object data around the vehicle from the object detection device 210.
  • the power supply unit 190 may supply power to the electronic device 100.
  • the power supply unit 190 may receive power from a power source (eg, a battery) included in the vehicle 10 and supply power to each unit of the electronic device 100.
  • the power supply unit 190 may be operated according to a control signal provided from the main ECU 240.
  • the power supply unit 190 may be implemented as a switched-mode power supply (SMPS).
  • the processor 170 may be electrically connected to the memory 140, the interface unit 180, and the power supply unit 190 to exchange signals.
  • the processor 170 may be electrically connected to the microphone 101, the emergency communication device 102, the camera 130, and the speaker 103 to exchange signals.
  • the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • the processor 170 may be driven by power provided from the power supply unit 190.
  • the processor 170 may receive data, process data, generate a signal, and provide a signal while power is supplied by the power supply unit 190.
  • the processor 170 may receive indoor image data from the camera 130.
  • the processor 170 may receive a plurality of indoor image data from the plurality of cameras 130.
  • the processor 170 may receive indoor image data through the interface unit 180.
  • the processor 170 may set the occupied area of the user based on the indoor image data.
  • the occupied area may be defined as a three-dimensional space in which user actions are allowed. The user's body crossing the occupied area and invading another area may be defined as an abnormal behavior.
  • the processor 170 may set the occupied area based on the user's boarding seat.
  • the processor 170 may set the occupied area based on the space occupied by the user's boarding seat and the space the user is estimated to occupy when seated on the boarding seat.
  • the processor 170 may set the occupied area by adding the space of the boarding seat and a space for the user's belongings to the user's seating space.
  • the processor 170 may set the first occupied area of the first user based on the indoor image data.
  • the processor 170 may set the second occupied area of the second user based on the indoor image data.
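As a rough illustration of the occupied-area computation described above, the area can be modeled as the smallest box enclosing the boarding-seat space, the estimated seated-user space, and the belongings space. This is a minimal sketch; the `Box3D` class, the coordinate conventions, and the dimensions are hypothetical and not taken from the patent text.

```python
from dataclasses import dataclass

@dataclass
class Box3D:
    """Axis-aligned box: min_pt = (x, y, z) lower corner, max_pt = upper corner, in meters."""
    min_pt: tuple
    max_pt: tuple

    def union(self, other: "Box3D") -> "Box3D":
        # Smallest axis-aligned box enclosing both boxes.
        return Box3D(
            tuple(min(a, b) for a, b in zip(self.min_pt, other.min_pt)),
            tuple(max(a, b) for a, b in zip(self.max_pt, other.max_pt)),
        )

    def contains(self, p: tuple) -> bool:
        return all(lo <= v <= hi for lo, v, hi in zip(self.min_pt, p, self.max_pt))

def occupied_area(seat: Box3D, seated_user: Box3D, belongings: Box3D) -> Box3D:
    """Occupied area = boarding-seat space + estimated seated-user space + belongings space."""
    return seat.union(seated_user).union(belongings)
```

For example, a seat box, a seated-user box above it, and a belongings box beside it would yield one enclosing occupied area; points inside it are actions allowed to that user, per the definition above.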
  • the processor 170 may set the virtual barrier based on the occupied area.
  • the processor 170 may set a virtual barrier between the first occupied area and the second occupied area.
  • the virtual barrier may be defined as a virtual surface that separates a plurality of occupied areas.
  • the processor 170 may set the virtual barrier based on the space occupied by the user's boarding seat and the space the user is estimated to occupy when seated on the boarding seat.
  • the processor 170 may determine whether a user or a passenger invades the virtual barrier.
  • the processor 170 may determine, based on the indoor image data, whether a user's or passenger's body or belongings (e.g., a weapon such as a gun or a knife) cross the virtual barrier.
  • the processor 170 may determine whether the second user invades the virtual barrier. For example, when it is sensed that a part of the second user's body or belongings crosses the virtual barrier and reaches the occupied area of the first user, the processor 170 may determine that the second user is invading the virtual barrier.
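The invasion determination above can be sketched as a plane-crossing test: model the virtual barrier as a vertical plane and check whether any tracked point of the second user's body or belongings (e.g., pose-estimation keypoints derived from the indoor image data) lies on the first user's side. A minimal sketch under the assumption that the barrier is perpendicular to the cabin's x-axis; the names are hypothetical:

```python
def invades_barrier(keypoints, barrier_x, own_side):
    """
    keypoints: (x, y, z) positions of the second user's body/belonging points,
               as estimated from the indoor image data.
    barrier_x: x-coordinate of the vertical virtual-barrier plane.
    own_side:  'left' if the second user's occupied area lies at x < barrier_x,
               'right' otherwise.
    Returns True if any tracked point has crossed the barrier into the other area.
    """
    if own_side == "left":
        return any(x > barrier_x for x, _, _ in keypoints)
    return any(x < barrier_x for x, _, _ in keypoints)
```

A fuller implementation would also require the point to reach the first user's occupied area, per the passage above, rather than merely cross the plane.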
  • the processor 170 may receive reservation information of the user's shared vehicle 10 through the interface unit 180.
  • the reservation information may include passenger information.
  • the processor 170 may determine a passenger of the user based on the reservation information.
  • the processor 170 may determine the user's passengers further based on whether the user and the passengers board together at a predetermined place at a predetermined time.
  • the processor 170 may receive reservation information of the first user through the interface unit 180 and determine a passenger of the first user based on the reservation information.
  • the processor 170 may set the occupied space of the user and the passenger together.
  • the processor 170 may set the first occupied area by combining the boarding area of the first user and the boarding area of a passenger.
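The passenger determination described above (reservation information combined with boarding together at a predetermined place and time) can be sketched as follows. The data shapes, the 30-second window, and all names are assumptions for illustration, not details from the patent:

```python
def determine_passengers(reservation, boarding_events, time_window_s=30.0):
    """
    reservation:     dict with 'user_id' and 'party_size' from the reservation info.
    boarding_events: list of (person_id, timestamp_s, place) tuples sensed at boarding.
    A person is treated as the user's passenger if they board at the same place
    within time_window_s of the user, up to the reserved party size.
    """
    user_events = [e for e in boarding_events if e[0] == reservation["user_id"]]
    if not user_events:
        return []  # the reserving user has not boarded yet
    _, t0, place0 = user_events[0]
    passengers = [
        pid for pid, t, place in boarding_events
        if pid != reservation["user_id"]
        and place == place0
        and abs(t - t0) <= time_window_s
    ]
    # Cap at the reserved party size minus the user themselves.
    return passengers[: reservation["party_size"] - 1]
```

The first occupied area could then be set by taking the union of the boarding areas of the user and each determined passenger, as the passage above describes.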
  • the user may be described as a person who uses the shared vehicle 10 by making a direct reservation.
  • a passenger may be described as a person who has not made a reservation directly but uses the shared vehicle 10 together with the user under the user's reservation.
  • the processor 170 may perform a crime preparation operation. For example, when it is determined that the second user is invading the virtual barrier, the processor 170 may store image data that is the basis of the determination. In this case, the processor 170 may transmit the image data to a device external to the vehicle. For example, if it is determined, based on the indoor image data of the first viewpoint generated by the first camera, that the second user is invading the virtual barrier, the processor 170 may switch the second camera, which generates the indoor image data of the second viewpoint, from an inactive state to an active state.
  • the second viewpoint may be a viewpoint different from the first viewpoint.
  • the processor 170 may provide a control signal to the camera 130 so that the indoor image data is obtained at a second resolution higher than the first resolution.
  • the processor 170 may activate at least one of the microphone 101 and the speaker 103.
  • the processor 170 may establish an emergency communication channel with an external device of the vehicle.
  • the emergency communication channel may be understood as a communication channel directly connected to a special external device (eg, a police station, a government office, a fire station, a hospital) in a special situation, not a communication using the communication device 220.
  • a special situation may be described as a situation in which a crime is anticipated.
  • the special situation may be a situation in which the second user has invaded the virtual barrier.
  • the electronic device 100 may include at least one printed circuit board (PCB).
  • the microphone 101, the emergency communication device 102, the camera 130, the memory 140, the speaker 103, the interface unit 180, the power supply unit 190, and the processor 170 may be electrically connected to the printed circuit board.
  • FIG. 5 is a flow chart of an electronic device according to an embodiment of the present invention. FIG. 5 illustrates each step of the operating method S500 of the electronic device 100 provided in the shared vehicle 10.
  • the processor 170 may receive indoor image data from the camera 130 (S510).
  • the processor 170 may set the occupied area of the user based on the indoor image data (S520). When the first user and the second user board the shared vehicle 10, the processor 170 may set the first occupied area of the first user and the second occupied area of the second user based on the indoor image data.
  • the operating method may further include receiving, by the at least one processor 170, reservation information of the first user, and determining, by the at least one processor 170, a passenger of the first user based on the reservation information. The step of setting the first occupied area and the second occupied area (S520) may further include setting, by the at least one processor 170, the first occupied area by combining the boarding area of the first user and the boarding area of the passenger.
  • the processor 170 may set the virtual barrier based on the occupied area (S530). When the first occupied area and the second occupied area are set, the processor 170 may set a virtual barrier between the first occupied area and the second occupied area.
  • the processor 170 may determine whether the user invades the virtual barrier (S540).
  • the virtual barrier may be defined as a virtual surface that separates a plurality of occupied areas.
  • the processor 170 may determine whether the user has invaded the virtual barrier, based on whether the user's body or belongings cross the virtual barrier.
  • the processor 170 may determine whether the second user invades the virtual barrier.
  • if it is determined in step S540 that the second user is invading the virtual barrier, the processor 170 may store image data that is the basis of the determination (S550). The processor 170 may transmit the image data to a device external to the shared vehicle 10 (S555).
  • if it is determined in step S540, based on the indoor image data of the first viewpoint generated by the first camera, that the second user is invading the virtual barrier, the processor 170 may switch the second camera, which generates the indoor image data of the second viewpoint, from an inactive state to an active state (S561).
  • the second viewpoint may be a viewpoint different from the first viewpoint.
  • if it is determined in step S540, based on the indoor image data of a first resolution, that the second user is invading the virtual barrier, the processor 170 may provide a control signal to the camera 130 so that the indoor image data is obtained at a second resolution higher than the first resolution (S562).
  • if it is determined in step S540 that the second user is invading the virtual barrier, the processor 170 may activate at least one of the microphone 101 and the speaker 103 (S563).
  • if it is determined in step S540 that the second user is invading the virtual barrier, the processor 170 may establish an emergency communication channel with a device external to the shared vehicle 10 (S564).
  • when an emergency communication channel is established, real-time calls and data sharing between an occupant of the shared vehicle 10 and a user of the external device are possible.
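Steps S550 to S564 above form a fixed response sequence triggered by an invasion determination. A minimal sketch that dispatches the sequence through caller-supplied callbacks; the action names are hypothetical, not from the patent text:

```python
def on_barrier_invasion(actions):
    """
    Run the crime-preparation sequence (cf. steps S550-S564) in order.
    `actions` maps each step name to a callable supplied by the surrounding
    system (memory, cameras, audio, emergency communication device).
    Returns the order in which the steps were dispatched.
    """
    sequence = [
        "store_evidence",         # S550: store the image data behind the determination
        "transmit_evidence",      # S555: transmit it to a device outside the vehicle
        "activate_sub_camera",    # S561: second camera, different viewpoint
        "raise_resolution",       # S562: capture at a higher second resolution
        "activate_audio",         # S563: microphone and speaker on
        "open_emergency_channel", # S564: direct channel to e.g. a police station
    ]
    for name in sequence:
        actions[name]()
    return sequence
```

In a real device these callbacks would wrap the memory 140, the cameras 130, the microphone 101, the speaker 103, and the emergency communication device 102.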
  • FIGS. 6A and 6B are views referenced to explain a virtual barrier according to an embodiment of the present invention.
  • the processor 170 may acquire reservation information of the first user 610.
  • the processor 170 may set the first occupied area 631 around the boarding seat of the first user 610.
  • the reservation information of the first user 610 may include information on the passenger 615.
  • the first occupied area 631 may be set around the boarding seat of the first user 610 and the boarding seat of the passenger 615.
  • the processor 170 may obtain reservation information of the second user 620.
  • the processor 170 may set the second occupied area 632 around the boarding seat of the second user 620.
  • the processor 170 may set a virtual barrier 601 that separates the first occupied area 631 and the second occupied area 632.
  • the processor 170 may set a plurality of virtual barriers.
  • the processor 170 may set a first occupied area 641 of the first user 610, a second occupied area 642 of the second user 620, and a third occupied area 643 of the third user 630, respectively.
  • the processor 170 may set a first virtual barrier 601 that separates the first occupied area 641 and the second occupied area 642, and a second virtual barrier 602 that separates the first occupied area 641 and the third occupied area 643, respectively.
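Placing barriers between multiple occupied areas, as with barriers 601 and 602 above, can be sketched by projecting each occupied area onto one cabin axis and placing a barrier midway in each gap between adjacent areas. This is a 1D simplification for illustration; actual barriers would be surfaces in 3D, and the function name is hypothetical:

```python
def set_virtual_barriers(occupied_intervals):
    """
    occupied_intervals: list of (lo, hi) extents of each occupied area along
    one cabin axis (e.g., the seat row direction), in meters.
    Returns one barrier coordinate per gap between adjacent occupied areas.
    """
    ordered = sorted(occupied_intervals)
    barriers = []
    for (_, hi1), (lo2, _) in zip(ordered, ordered[1:]):
        barriers.append((hi1 + lo2) / 2.0)  # midway in the gap
    return barriers
```

With three occupied areas this yields two barriers, matching the first and second virtual barriers 601 and 602 of the passage above.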
  • FIGS. 7A to 7C are views referenced to explain a crime preparation operation according to an embodiment of the present invention.
  • the electronic device 100 provides a function to prepare in advance for crime situations that may occur in the shared vehicle 10 and to quickly respond when sensing a crime situation.
  • the electronic device 100 may recognize a dangerous object, an action direction, and a motion even in a dark situation.
  • the electronic device 100 may detect area invasion and behavior by recognizing areas and objects of passengers (users or passengers).
  • the electronic device 100 may store data related to the illegal activity and report it through an emergency communication channel.
  • the processor 170 may set an occupied area for each occupant based on a boarding seat when the user boards.
  • the processor 170 may set the occupied area for each occupant by reflecting the space where the user's belongings are placed.
  • the processor 170 may determine a passenger of the user based on the information on the number of passengers reserved for boarding and the information on the number of people who boarded at the same time.
  • the processor 170 may store data (eg, image data) when illegal or abnormal behavior between passengers is sensed.
  • the processor 170 may report to a police station or the like through an emergency communication channel.
  • the processor 170 may determine whether a hand crosses the virtual barrier by tracking mainly the passengers' hands.
  • the processor 170 may determine an abnormal behavior (e.g., theft) based on this determination. In this case, the processor 170 may treat a companion of the user as an exception.
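The hand-tracking and exception logic above can be sketched as follows, assuming a one-dimensional barrier position and a tracked hand x-coordinate. Function names, labels, and the side convention are illustrative assumptions.

```python
def hand_invades_barrier(hand_x, barrier_x, own_side):
    """True if a tracked hand has crossed from its owner's side of the
    virtual barrier (own_side is 'left' or 'right' of the barrier)."""
    return hand_x > barrier_x if own_side == "left" else hand_x < barrier_x

def classify_behavior(hand_x, barrier_x, own_side, is_companion):
    """Classify a barrier crossing, exempting companions as in the text."""
    if is_companion:
        return "exception"       # companions may share space freely
    if hand_invades_barrier(hand_x, barrier_x, own_side):
        return "possible_theft"  # candidate abnormal behavior
    return "normal"
```

On a "possible_theft" result the processor could then store image data and report through the emergency channel, per the surrounding bullets.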
  • the processor 170 may sense violent or threatening behavior in the vehicle.
  • the processor 170 may sense a violent behavior caused by a passenger's hand or foot based on the internal image data.
  • the processor 170 may sense sounds such as shouting or profanity based on the voice data.
  • the processor 170 may sense a dangerous object such as a gun or a knife, and determine a threatening behavior directed at other passengers using the dangerous object.
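The violence- and threat-sensing bullets above fuse image cues (violent motions, dangerous objects) with audio cues (shouting, profanity). A minimal fusion rule might look like the sketch below; the label sets and escalation thresholds are assumptions, not the patent's method.

```python
def assess_threat(detected_objects, motion_labels, audio_labels):
    """Fuse image and audio cues into an escalation decision."""
    dangerous = {"gun", "knife"} & set(detected_objects)
    violent = {"hit", "kick"} & set(motion_labels)
    hostile_audio = {"shouting", "profanity"} & set(audio_labels)
    if dangerous or (violent and hostile_audio):
        return "report"    # store data and report via emergency channel
    if violent or hostile_audio:
        return "monitor"   # single weak cue: keep recording, watch closely
    return "normal"
```

A dangerous object alone triggers a report, while a single weaker cue only raises monitoring, reflecting that the bullets treat weapons as immediately threatening.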
  • the processor 170 may record an image of the inside of the vehicle through the camera 130 and automatically file a report with a police station or the like.
  • the processor 170 may extract distinctive features, such as the appearance of a person involved in a violent or threatening behavior, based on the image of the inside of the vehicle. If the involved party gets out of the vehicle and flees, the processor 170 may track and report the party in cooperation with external cameras, store cameras, and the like around the shared vehicle 10.
  • the processor 170 may sense a trash-dumping behavior in the shared vehicle 10. When a trash-dumping behavior is sensed, the processor 170 may generate billing data for charging a cost to the party responsible for the behavior.
  • the processor 170 may sense a food-consumption behavior in the shared vehicle 10. When the act of consuming food is sensed, the processor 170 may generate billing data for charging a cleaning fee to the party responsible for the act. The processor 170 may create a comfortable environment by operating the outside-air circulation function of the air conditioner.
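Billing-data generation for the two sensed behaviors above can be sketched as a lookup from behavior to fee. The fee amounts, field names, and behavior labels are hypothetical; the patent only says billing data is generated for the responsible party.

```python
def generate_billing_data(user_id, behavior, fee_table=None):
    """Build a billing record for a sensed in-vehicle behavior.
    Fee values here are placeholders, not amounts from the patent."""
    fee_table = fee_table or {
        "trash_dumping": 10.0,     # cost charged to the responsible party
        "food_consumption": 20.0,  # cleaning fee
    }
    if behavior not in fee_table:
        return None  # behavior is not billable
    return {"user": user_id, "behavior": behavior, "fee": fee_table[behavior]}
```

A fleet operator would presumably supply its own `fee_table` and attach the record to the user's reservation for settlement.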
  • the processor 170 may monitor the interior of the shared vehicle 10 by activating the first camera (main camera) 131.
  • the processor 170 may activate the second camera (sub camera) 132.
  • the processor 170 may deactivate the second camera 132.
  • the second camera 132 may be activated to sense a corresponding area.
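The main/sub camera bullets above describe keeping the main camera 131 always active and toggling the sub camera 132 only when an area it covers needs sensing. A minimal sketch of that selection, with assumed condition names:

```python
def select_active_cameras(main_covers_all, blind_spot_occupied):
    """Decide which interior cameras to run: the main camera always
    monitors; the sub camera wakes only for occupied blind spots."""
    cameras = {"main": True, "sub": False}
    if not main_covers_all and blind_spot_occupied:
        cameras["sub"] = True  # activate sub camera to sense that area
    return cameras
```

Keeping the sub camera off by default would save power and bandwidth while the main camera's view suffices.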
  • the processor 170 may activate the microphones 101a, 101b, 101c, and 101d and the speakers 103a, 103b, 103c, and 103d when a dangerous person or a dangerous situation is sensed.
  • the processor 170 may increase the resolution of the camera 130.
  • the processor 170 may form an emergency communication channel 720 using the emergency communication device 102.
  • the processor 170 may perform real-time streaming and recording operations.
  • the processor 170 may detect an odor.
  • the shared vehicle 10 may further include an odor sensor.
  • the present invention described above can be implemented as computer-readable code on a medium on which a program is recorded.
  • the computer-readable medium includes all types of recording devices that store data readable by a computer system. Examples of the computer-readable medium include an HDD (Hard Disk Drive), an SSD (Solid State Disk), an SDD (Silicon Disk Drive), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and also include implementation in the form of a carrier wave (e.g., transmission over the Internet).
  • the computer may include a processor or a control unit. Therefore, the detailed description above should not be construed as restrictive in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention fall within the scope of the present invention.

Landscapes

  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Emergency Management (AREA)
  • Tourism & Hospitality (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to an electronic device disposed in a shareable vehicle, the electronic device comprising a processor that: receives interior image data from a camera; sets a first occupied area of a first user and a second occupied area of a second user on the basis of the interior image data; sets a virtual barrier between the first occupied area and the second occupied area; and determines whether the second user intrudes on the virtual barrier.
PCT/KR2019/005140 2019-04-29 2019-04-29 Dispositif électronique pour véhicule et procédé pour faire fonctionner le dispositif électronique pour véhicule WO2020222319A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2019/005140 WO2020222319A1 (fr) 2019-04-29 2019-04-29 Dispositif électronique pour véhicule et procédé pour faire fonctionner le dispositif électronique pour véhicule


Publications (1)

Publication Number Publication Date
WO2020222319A1 (fr)

Family

ID=73029724


Country Status (1)

Country Link
WO (1) WO2020222319A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003104132A (ja) * 2001-09-27 2003-04-09 Denso Corp 同乗者モニタ装置
JP2004276784A (ja) * 2003-03-17 2004-10-07 Aisin Seiki Co Ltd 車両監視装置
US20040263323A1 (en) * 2003-04-14 2004-12-30 Fujitsu Ten Limited Antitheft device, monitoring device and antitheft system
KR101663096B1 (ko) * 2015-09-02 2016-10-06 주식회사 서연전자 차량의 도난 방지 장치
KR101826715B1 (ko) * 2016-10-21 2018-02-07 한국오므론전장 주식회사 실내 카메라를 이용한 차량 침입 검출 시스템 및 방법



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 19927514; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 19927514; Country of ref document: EP; Kind code of ref document: A1