WO2015099500A1 - Vehicle accident recording apparatus and method thereof - Google Patents

Vehicle accident recording apparatus and method thereof

Info

Publication number
WO2015099500A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
recording
image
driver
wearable device
Prior art date
Application number
PCT/KR2014/012922
Other languages
English (en)
Korean (ko)
Inventor
김우식
홍성욱
박민호
강신재
이승용
김승혜
박형곤
권민혜
조효일
Original Assignee
엘지전자 주식회사
이화여자대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 and 이화여자대학교 산학협력단
Publication of WO2015099500A1

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00Registering or indicating the working of vehicles
    • G07C5/08Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841Registering performance data
    • G07C5/085Registering performance data using electronic data carriers
    • G07C5/0866Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/087Interaction between the driver and the control system where the control system corrects or modifies a request from the driver
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers

Definitions

  • the present invention relates to a vehicle accident recording apparatus capable of operating in conjunction with a wearable device worn by a driver and a method thereof.
  • Terminals may be divided into mobile/portable terminals and stationary terminals according to their mobility. Mobile terminals may again be divided into handheld terminals and vehicle-mounted terminals according to whether a user can directly carry them.
  • The functions of such terminals are diversifying. For example, a terminal may be implemented in the form of a multimedia player having complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. Further, in order to support and extend the functions of the terminal, improvement of the structural parts and the software parts of the terminal may be considered.
  • Another object of the present invention is to provide a vehicle accident recording apparatus, and a method thereof, in which the object of recording can be changed according to the states of the vehicle and the driver when a dangerous situation or an accident occurs in the vehicle.
  • To this end, a vehicle accident recording method according to an embodiment of the present invention includes: starting recording by driving a camera provided on the outside of the vehicle; receiving biometric information of the driver, detected through the wearable device, when a sensor value detected through a vehicle sensor provided in the vehicle exceeds a reference value; determining a driving time point of a camera provided inside the vehicle based on the driver's degree of impact corresponding to the received biometric information; and changing the object of the recording based on the determined driving time point.
  • The method may further include determining whether an accident of the vehicle has occurred based on the detected sensor value and the driver's degree of impact and, in the event of an accident, driving the cameras provided at the outside and the inside of the vehicle to perform recording.
  • The performing of the recording may include recording the inside of the vehicle by driving the camera provided in the wearable device when the camera provided in the vehicle cannot be driven.
  • The recording may include recording a first image obtained by driving the camera provided outside the vehicle and a second image obtained by driving the camera provided inside the vehicle such that the two images are displayed split on one screen.
  • The method may further include controlling an image corresponding to the driver's gaze to be recorded using the camera included in the wearable device when the driver's degree of impact corresponding to the received biometric information exceeds a reference value.
  • The method may further include extending the storage time of the image recorded before or after the corresponding time point, or controlling the image recorded before or after the corresponding time point to be transmitted to a predetermined server.
  • The method may further include collecting driving information of the vehicle using the camera provided outside the vehicle, and storing state information of the driver corresponding to the received biometric information together with the recorded image when the proximity of an object around the vehicle is detected based on the collected driving information.
  • The changing of the object of the recording may include converting a first image recorded by the camera provided on the outside of the vehicle into a second image recorded by the camera provided on the inside of the vehicle based on the determined driving time point, or recording and synthesizing the first and second images, respectively.
  • The method may further include controlling the second image to be converted back to the first image when the level corresponding to the driver's degree of impact decreases.
  • the method may further include recording an image based on the vehicle sensor outputting the sensor value exceeding the reference value.
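Taken together, the claimed steps form a simple control loop: record externally, test the vehicle-sensor value against a reference, query the wearable for biometric information, decide the internal camera's driving time point, and change the object of recording. The following Python sketch only illustrates that loop; the sensor, wearable, camera, and recorder interfaces and both threshold values are hypothetical stand-ins, not anything defined by this publication.

```python
# Minimal sketch of the claimed flow; every interface here is an assumption.
SENSOR_REFERENCE = 0.6   # hypothetical normalized vehicle-sensor reference value
IMPACT_REFERENCE = 0.7   # hypothetical reference for the driver's degree of impact

def estimate_impact_level(bio):
    # Placeholder: reduce PPG/ECG/GSR readings to one normalized level.
    return max(bio.values()) if bio else 0.0

def is_accident(sensor_value, impact_level):
    # Placeholder: both the vehicle and the driver indicate an accident.
    return sensor_value > SENSOR_REFERENCE and impact_level > IMPACT_REFERENCE

def recording_loop(sensors, wearable, internal_cam, recorder):
    recorder.start_external()                 # start recording with the external camera
    while True:
        value = sensors.read_normalized()     # impact / proximity / speed sensors
        if value <= SENSOR_REFERENCE:         # compare with the reference value
            continue
        bio = wearable.read_biometrics()      # receive PPG/ECG/GSR from the wearable
        impact = estimate_impact_level(bio)
        if impact > IMPACT_REFERENCE:         # decide the internal-camera driving time
            internal_cam.start()
            recorder.set_source("both")       # change the object of the recording
        if is_accident(value, impact):
            recorder.mark_protected()         # keep footage past the normal retention
```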
  • A vehicle accident recording apparatus according to an embodiment of the present invention includes: cameras provided on the inside and outside of the vehicle; a recording unit for recording an image of the surroundings when a camera is driven; and a control unit which transmits a driving signal to the camera provided outside the vehicle and receives biometric information of the driver from a connected wearable device when a sensor value detected through a vehicle sensor provided in the vehicle exceeds a reference value.
  • The controller is configured to determine a driving time point of the camera provided inside the vehicle based on the driver's degree of impact corresponding to the received biometric information, and to provide a signal to the camera for changing the object of recording based on the determined driving time point.
  • The controller determines whether an accident has occurred in the vehicle based on the detected sensor value and the driver's degree of impact and, when an accident occurs, controls the cameras provided at the outside and the inside of the vehicle to be driven simultaneously to perform recording.
  • According to the present invention, the occurrence of a vehicle accident can be accurately judged and recorded based on the vehicle sensors provided in the vehicle and the biometric information of the driver received from the wearable device worn by the driver.
  • In addition, the vehicle accident recording apparatus and method according to an embodiment of the present invention can record and save an accident image more efficiently by appropriately changing the object of recording according to the states of the vehicle and the driver, or by simultaneously acquiring images of the inside and outside of the vehicle.
  • FIG. 1 is an exemplary block diagram showing the configuration of a vehicle accident recording apparatus according to an embodiment of the present invention.
  • FIG. 2A is a block diagram illustrating a configuration of a wearable device interworking with a vehicle accident recording apparatus according to an exemplary embodiment of the present invention.
  • FIG. 2B is a diagram illustrating a watch type terminal as an example of the wearable device of FIG. 2A.
  • FIG. 3 is an exemplary flowchart of a vehicle accident recording method according to an embodiment of the present invention.
  • FIG. 4A is an exemplary conceptual view illustrating a method of recognizing a lane and a vehicle ahead using a camera in order to detect a vehicle accident in advance, according to an embodiment of the present invention.
  • FIG. 4B is an exemplary conceptual diagram illustrating a method of tracking the area of a vehicle using a camera in order to detect a vehicle accident in advance, according to an embodiment of the present invention.
  • FIG. 4C is an exemplary conceptual diagram illustrating a method of detecting an approaching vehicle using a camera in order to detect a vehicle accident in advance, according to an embodiment of the present disclosure.
  • FIG. 1 is an exemplary block diagram showing the configuration of a vehicle accident recording apparatus according to an embodiment of the present invention.
  • the vehicle accident recording apparatus 100 may perform wireless communication in cooperation with a mobile terminal worn by a driver, particularly a wearable device 200.
  • The vehicle accident recording apparatus 100 includes a camera module 110, a sensing unit 120, a wireless communication unit 130, a control unit 150, an output unit 170, a recording unit 180, and a storage unit 190.
  • The vehicle accident recording apparatus 100 communicates with the wearable device 200 worn by the driver through the wireless communication unit 130. Accordingly, the vehicle accident recording apparatus 100 may determine the driver's state by receiving the driver's biometric information from the wearable device 200. In addition, the vehicle accident recording apparatus 100 may transmit a control signal for controlling the operation of the wearable device 200 based on the determined driver's state.
  • The vehicle accident recording apparatus 100 may transmit information related to the state of the vehicle to the wearable device 200 by wireless communication using Bluetooth, ZigBee, Wi-Fi, etc. via the wireless communication unit 130, or by wired communication using RS-232, RS-485, USB, CAN, etc.
  • the camera module 110 may include an internal camera 111 provided inside the vehicle and an external camera 112 provided outside the vehicle.
  • The internal camera 111 photographs the interior of the vehicle according to a driving signal.
  • the external camera 112 photographs the exterior of the vehicle and the front of the vehicle according to the driving signal.
  • a single camera may perform the functions of the internal camera 111 and the external camera 112.
  • the sensing unit 120 includes various sensors provided in the vehicle and collects various information related to the driving of the vehicle.
  • The sensing unit 120 may include an impact sensor for detecting an impact to the vehicle, an obstacle detection sensor for detecting a vehicle or an obstacle around the vehicle, a braking sensor for detecting a brake signal and a braking distance, and a speed sensor.
  • the controller 150 may determine whether an accident of the vehicle occurs based on the sensor value.
  • Alternatively, the sensing unit 120 may be limited to an impact sensor capable of sensing the impact applied to the recording unit 180, rather than including all of the vehicle sensors provided in the vehicle; in this case, whether an accident has occurred can be determined based on whether the impact applied to the recording unit 180 exceeds a threshold value.
  • the controller 150 controls the overall operation of the vehicle accident recording apparatus 100.
  • the controller 150 may determine whether an accident occurs based on the sensor value detected through the sensing unit 120 and the biometric information of the driver received from the wearable device 200.
  • The controller 150 may determine the object of recording and the start point of recording by comparing the sensor value detected through the sensing unit 120 with the driver's state corresponding to the biometric information of the driver received from the wearable device 200.
  • In addition, the controller 150 may generate a control signal for causing the recording unit 180 to perform recording automatically based on road condition information corresponding to the current location of the vehicle, obtained for example using GPS information.
  • the recording unit 180 may be implemented, for example, in the same configuration as the black box of the vehicle.
  • the recording unit 180 may be implemented to always record the surroundings of the vehicle regardless of the trigger signal. In this case, when the reference time elapses, previously recorded images may be automatically deleted in consideration of the storage space of the storage 190.
  • When the recording unit 180 is implemented to record the surroundings of the vehicle only when a trigger signal is received, the recorded images may be stored for a relatively long time. In this case, the start time of recording is very important in order to accurately identify the circumstances at the time of a vehicle accident.
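The two retention policies just described can be captured in a few lines: continuously recorded clips expire after a reference time, while trigger-started clips are protected. The following sketch assumes an in-memory clip store; the reference window of 600 seconds is an invented example value.

```python
import collections
import time

REFERENCE_SECONDS = 600  # assumed retention window for normal footage

class ClipStore:
    """Toy store illustrating normal vs. protected (accident) retention."""
    def __init__(self):
        self.clips = collections.deque()

    def add(self, clip, protected=False):
        self.clips.append({"clip": clip, "t": time.time(), "protected": protected})

    def expire_old(self):
        # Normal clips older than the reference time are deleted automatically;
        # protected clips (trigger-started recordings) are kept.
        now = time.time()
        self.clips = collections.deque(
            c for c in self.clips
            if c["protected"] or now - c["t"] < REFERENCE_SECONDS)
```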
  • the storage unit 190 stores the video recorded by the recording unit 180 in chronological order. At this time, according to the control signal of the controller 150, the biometric information or the state information of the driver corresponding to the recording time of the image may be stored together.
  • The output unit 170 may include a display unit (not shown) on which the recorded image can be checked, and an audio output unit (not shown) for outputting feedback indicating the start or end of image recording.
  • When the image recorded through the vehicle accident recording apparatus 100 and/or the biometric information of the driver detected by the wearable device 200 satisfies a preset condition, it may be transmitted to an external server (e.g., a database) 300 of a rescue organization (e.g., a hospital or a police station).
  • FIG. 2A is an exemplary block diagram illustrating a configuration of the wearable device 200 interoperating with the vehicle accident recording apparatus 100 described above.
  • The wearable device 200 includes a wireless communication unit 210, an input unit 220, a sensing unit 240, an output unit 250, an interface unit 260, a memory 270, a controller 280, a power supply unit 290, and the like.
  • the components shown in FIG. 2A are not essential to implementing a mobile terminal, so that the mobile terminal described herein may have more or fewer components than those listed above.
  • Among these components, the wireless communication unit 210 may include one or more modules that enable wireless communication between the wearable device 200 and a wireless communication system, between the wearable device 200 and another wearable device 200, or between the wearable device 200 and an external server.
  • the wireless communication unit 210 may include one or more modules for connecting the wearable device 200 to one or more networks.
  • The wireless communication unit 210 may include at least one of a broadcast receiving module 211, a mobile communication module 212, a wireless internet module 213, a short range communication module 214, and a location information module 215.
  • The input unit 220 may include a camera 221 or an image input unit for inputting an image signal, a microphone 222 or an audio input unit for inputting an audio signal, and a user input unit 223 (e.g., touch keys, mechanical keys, and the like) for receiving information from a user.
  • the voice data or the image data collected by the input unit 220 may be analyzed and processed as a control command of the user.
  • the sensing unit 240 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
  • The sensing unit 240 may include at least one of a proximity sensor 241, an illumination sensor 242, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, an optical sensor (e.g., the camera 221), a microphone (see 222), a battery gauge, and an environmental sensor.
  • the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
  • The output unit 250 is used to generate output related to sight, hearing, or touch, and may include at least one of a display unit 251, a sound output unit 252, a haptic module 253, and a light output unit 254.
  • the display unit 251 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • the touch screen may function as a user input unit 223 providing an input interface between the wearable device 200 and the user, and may also provide an output interface between the wearable device 200 and the user.
  • the interface unit 260 serves as a path to various types of external devices connected to the wearable device 200.
  • The interface unit 260 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • the wearable device 200 may perform appropriate control related to the connected external device in response to the external device being connected to the interface unit 260.
  • the memory 270 stores data supporting various functions of the wearable device 200.
  • the memory 270 may store a plurality of application programs or applications that are driven by the wearable device 200, data for operating the wearable device 200, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
  • Moreover, at least some of these application programs may exist on the wearable device 200 from the time of shipment for basic functions of the wearable device 200 (for example, incoming and outgoing calls, and receiving and sending messages).
  • the application program may be stored in the memory 270, installed on the wearable device 200, and driven by the controller 280 to perform an operation (or function) of the mobile terminal.
  • In addition to operations related to the application programs, the controller 280 typically controls the overall operation of the wearable device 200.
  • the controller 280 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or driving an application program stored in the memory 270.
  • controller 280 may control at least some of the components described with reference to FIG. 2A in order to drive an application program stored in the memory 270. In addition, the controller 280 may operate by combining at least two or more of the components included in the wearable device 200 to drive the application program.
  • Under the control of the controller 280, the power supply unit 290 receives external power or internal power and supplies power to each component included in the wearable device 200.
  • the power supply unit 290 includes a battery, which may be a built-in battery or a replaceable battery.
  • At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the mobile terminal according to various embodiments described below.
  • the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 270.
  • the broadcast receiving module 211 of the wireless communication unit 210 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast receiving modules may be provided in the wearable device 200 for simultaneous reception of at least two broadcast channels or for switching between broadcast channels.
  • The mobile communication module 212 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication, for example, Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like.
  • The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
  • the wireless internet module 213 refers to a module for wireless internet access and may be embedded or external to the wearable device 200.
  • the wireless internet module 213 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
  • Wireless Internet technologies include, for example, Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), and World Interoperability for Microwave Access (WiMAX); the wireless internet module 213 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
  • In that such wireless Internet access may be performed through a mobile communication network, the wireless internet module 213 performing wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 212.
  • The short range communication module 214 is for short range communication, and may support short range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • The short-range communication module 214 may support, through wireless area networks, wireless communication between the wearable device 200 and a wireless communication system, between the wearable device 200 and another wearable device 200, or between the wearable device 200 and a network in which another mobile terminal or an external server is located. The short range wireless communication networks may be wireless personal area networks.
  • Here, the other wearable device 200 may be a device capable of exchanging data with (or interworking with) the wearable device 200 according to the present invention, for example, a smartwatch, smart glasses, or a head mounted display (HMD).
  • The short range communication module 214 may sense (or recognize) a nearby device that can communicate with the wearable device 200. Further, when the sensed device is a device authenticated to communicate with the wearable device 200 according to the present invention, the controller 280 may transmit at least a portion of the data processed by the wearable device 200 to the sensed device through the short range communication module 214, so that a user of the sensed device may use the data processed by the wearable device 200.
  • For example, when a call is received by the wearable device 200, the user may perform a phone call through the interworking device, and when a message is received by the wearable device 200, the user may check the received message through the interworking device.
  • the location information module 215 is a module for obtaining the location (or current location) of the mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
  • the mobile terminal may acquire the location of the mobile terminal using a signal transmitted from a GPS satellite.
  • As another example, when utilizing the Wi-Fi module, the mobile terminal may acquire its location based on information of the wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module.
  • the location information module 215 may alternatively or additionally perform any function of other modules of the wireless communication unit 210 to obtain data regarding the location of the mobile terminal.
  • the location information module 215 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
  • The input unit 220 is for inputting image information (or signals), audio information (or signals), data, or information input from a user; for the input of image information, the wearable device 200 may include one or a plurality of cameras 221.
  • the camera 221 processes an image frame such as a still image or a video obtained by an image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display unit 251 or stored in the memory 270.
  • The plurality of cameras 221 included in the wearable device 200 may be arranged to form a matrix structure, and through the cameras 221 forming the matrix structure, a plurality of pieces of image information having various angles or focuses may be input to the wearable device 200.
  • the plurality of cameras 221 may be arranged in a stereo structure so as to obtain a left image and a right image for implementing a stereoscopic image.
  • the microphone 222 processes external sound signals into electrical voice data.
  • the processed voice data may be variously utilized according to a function (or an application program being executed) performed by the wearable device 200. Meanwhile, various noise reduction algorithms may be implemented in the microphone 222 to remove noise generated in the process of receiving an external sound signal.
  • The user input unit 223 is for receiving information from a user. When information is input through the user input unit 223, the controller 280 may control the operation of the wearable device 200 to correspond to the input information.
  • The user input unit 223 may include a mechanical input means (or mechanical keys, for example, buttons, dome switches, jog wheels, or jog switches located on the front, rear, or side of the wearable device 200) and a touch input means.
  • As an example, the touch input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed at a portion other than the touch screen.
  • The virtual key or the visual key may be displayed on the touch screen in various forms, for example, graphics, text, an icon, a video, or a combination thereof.
  • the sensing unit 240 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
  • the controller 280 may control driving or operation of the wearable device 200 or may perform data processing, function, or operation related to an application program installed in the wearable device 200 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 240 will be described in more detail.
  • the proximity sensor 241 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • the proximity sensor 241 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
  • Examples of the proximity sensor 241 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • the proximity sensor 241 may be configured to detect the proximity of the object with the change of the electric field according to the proximity of the conductive object.
  • the touch screen (or touch sensor) itself may be classified as a proximity sensor.
  • The proximity sensor 241 may detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, or a proximity touch movement state).
  • The controller 280 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern detected through the proximity sensor 241 as described above, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the controller 280 may control the wearable device 200 to process different operations or data (or information) according to whether a touch on the same point on the touch screen is a proximity touch or a contact touch.
  • The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 251) using at least one of various touch methods such as a resistive film method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
  • For example, the touch sensor may be configured to detect the position and area at which a touch object touches the touch screen, the pressure at the time of the touch, the capacitance at the time of the touch, and the like.
  • the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
  • The touch controller processes the signal(s) and then transmits the corresponding data to the controller 280.
  • the controller 280 may determine which area of the display unit 251 is touched.
  • the touch controller may be a separate component from the controller 280 or may be the controller 280 itself.
  • the controller 280 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of the touch object may be determined according to an operation state of the wearable device 200 or an application program being executed.
  • The touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
  • the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
  • the controller 280 may calculate the position of the wave generation source through the information detected by the optical sensor and the plurality of ultrasonic sensors.
  • The position of the wave generation source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source may be calculated from the arrival time difference of the ultrasonic wave relative to the light, with the light used as a reference signal.
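As a worked example of this idea: light arrival is treated as effectively instantaneous, so the source distance follows from the ultrasound delay alone. The 3 ms delay below is an invented value; the speed of sound is the standard figure for air.

```python
SPEED_OF_SOUND = 343.0   # m/s in air
delay_s = 0.003          # measured gap between light and ultrasound arrival (example)
distance_m = SPEED_OF_SOUND * delay_s
print(f"wave source is about {distance_m:.2f} m away")  # ~1.03 m
```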
  • The camera 221, described above as a component of the input unit 220, includes at least one of a camera sensor (e.g., CCD or CMOS), a photo sensor (or image sensor), and a laser sensor.
  • the camera 221 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
  • The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows/columns and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and position information of the sensing object can thereby be obtained.
  • the display unit 251 displays (outputs) information processed by the wearable device 200.
  • For example, the display unit 251 may display execution screen information of an application program driven by the wearable device 200, or User Interface (UI) and Graphic User Interface (GUI) information according to such execution screen information.
  • the display unit 251 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
  • The stereoscopic display unit may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses type), an auto-stereoscopic scheme (glasses-free type), or a projection scheme (holographic type).
  • the sound output unit 252 may output audio data received from the wireless communication unit 210 or stored in the memory 270 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output unit 252 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed by the wearable device 200.
  • the sound output unit 252 may include a receiver, a speaker, a buzzer, and the like.
  • the haptic module 253 generates various haptic effects that a user can feel.
  • a representative example of the tactile effect generated by the haptic module 253 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 253 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 253 may synthesize and output different vibrations or sequentially output them.
  • In addition to vibration, the haptic module 253 can generate various other tactile effects, including effects by stimulation such as a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet or suction opening, grazing of the skin surface, contact of an electrode, or electrostatic force, as well as effects of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.
  • the haptic module 253 may not only deliver a tactile effect through direct contact, but may also be implemented so that a user may feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 253 may be provided according to a configuration aspect of the wearable device 200.
  • the light output unit 254 outputs a signal for notifying occurrence of an event by using light of a light source of the wearable device 200.
  • Examples of events generated in the wearable device 200 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
  • The signal output from the light output unit 254 is implemented as the terminal emits light of a single color or a plurality of colors to the front or the rear. The signal output may be terminated when the terminal detects that the user has confirmed the event.
  • the interface unit 260 serves as a path with all external devices connected to the wearable device 200.
  • the interface unit 260 receives data from an external device, receives power, transfers the power to each component inside the wearable device 200, or transmits the data inside the wearable device 200 to an external device.
  • The interface unit 260 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.
  • The identification module is a chip that stores various information for authenticating usage rights of the wearable device 200, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM). A device equipped with an identification module (hereinafter, an 'identification device') may be manufactured in the form of a smart card; accordingly, the identification device may be connected to the wearable device 200 through the interface unit 260.
  • When the wearable device 200 is connected to an external cradle, the interface unit 260 may serve as a passage through which power from the cradle is supplied to the wearable device 200, or as a passage through which various command signals input from the cradle by the user are transmitted to the wearable device 200. The various command signals or the power input from the cradle may operate as signals for recognizing that the wearable device 200 has been correctly mounted on the cradle.
  • the memory 270 may store a program for the operation of the controller 280, and may temporarily store input / output data (eg, a phone book, a message, a still image, a video, etc.).
  • the memory 270 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
  • The memory 270 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk.
  • the wearable device 200 may be operated in association with a web storage that performs a storage function of the memory 270 on the Internet.
  • the controller 280 controls an operation related to an application program and an overall operation of the wearable device 200. For example, if the state of the mobile terminal satisfies a set condition, the controller 280 may execute or release a lock state that restricts input of a user's control command to applications.
  • The controller 280 may perform control and processing related to a voice call, data communication, a video call, and the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively.
  • controller 280 may control any one or a plurality of components described above in order to implement various embodiments described below on the wearable device 200 according to the present invention.
  • the power supply unit 290 receives an external power source and an internal power source under the control of the controller 280 to supply power for operation of each component.
  • the power supply unit 290 may include a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
  • As an example, the power supply unit 290 may be provided with a connection port, and the connection port may be configured as an example of the interface unit 260 to which an external charger supplying power for charging the battery is electrically connected.
  • the power supply unit 290 may be configured to charge the battery in a wireless manner without using the connection port.
  • In this case, the power supply unit 290 may receive power from an external wireless power transmitter using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
  • various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • FIG. 2B is a view illustrating a watch type terminal as an example of the wearable device of FIG. 2A.
  • the watch type mobile terminal 200 includes a main body 201 having a display unit 251 and a band 202 connected to the main body 201 and configured to be worn on a wrist.
  • The watch type mobile terminal 200 may include the features of the wearable device 200 of FIG. 2A, or features similar thereto.
  • the main body 201 includes a case forming an external appearance. As shown, the case may include a first case 201a and a second case 201b which provide an inner space for accommodating various electronic components. However, the present invention is not limited thereto, and one case may be configured to provide the internal space so that the watch type mobile terminal 200 of the unibody may be implemented.
  • the watch type mobile terminal 200 may be configured to enable wireless communication, and an antenna for the wireless communication may be installed in the main body 201.
  • The performance of the antenna may be extended using the case.
  • a case containing a conductive material may be configured to be electrically connected with the antenna to extend the ground area or the radiation area.
  • the display unit 251 may be disposed on the front surface of the main body 201 to output information, and the display unit 251 may be provided with a touch sensor to be implemented as a touch screen. As illustrated, the window 251a of the display unit 251 may be mounted on the first case 201a to form the front surface of the terminal body together with the first case 201a.
  • the main body 201 may include a sound output unit 252, a camera 221, a microphone 222, a user input unit 223, and the like.
  • When the display unit 251 is implemented as a touch screen, it may function as the user input unit 223, and thus a separate key may not be provided on the main body 201.
  • the band 202 may be worn on the wrist to surround the wrist, and may be formed of a flexible material to facilitate wearing.
  • the band 202 may be formed of leather, rubber, silicone, synthetic resin, or the like.
  • The band 202 may be configured to be detachable from the main body 201 so that the user can replace it with various types of bands according to taste.
  • the band 202 can be used to extend the performance of the antenna.
  • the band may include a ground extension (not shown) electrically connected to the antenna to extend the ground area.
  • the band 202 may be provided with a fastener 202a.
  • The fastener 202a may be implemented by a buckle, a snap-fit hook structure, a velcro (trade name), or the like, and may include an elastic section or material. In this figure, an example in which the fastener 202a is implemented in a buckle form is shown.
  • According to the present invention, while an image around the vehicle is recorded using only the camera provided on the outside of the vehicle, when a vehicle sensor detects an abnormal value, the camera provided inside the vehicle is driven based on the biometric information of the driver received from the wearable device, so that it is possible to determine more accurately whether an accident has occurred and, furthermore, to record an image from which the cause of the accident can be identified more clearly.
  • First, the vehicle accident recording apparatus 100 may perform recording at all times using the external camera of the vehicle, or it may provide an automatic recording mode and a manual recording mode: automatic recording may be performed when a predetermined condition is satisfied, or recording may be performed when a trigger signal for starting recording is generated by manipulating predetermined keys, buttons, or switches provided in the vehicle accident recording apparatus 100 (S310).
  • Next, the controller 150 determines whether a sensor value detected through the various vehicle sensors provided in the vehicle, for example, an impact sensor, an obstacle detection sensor, a braking sensor, and a speed sensor, exceeds a predetermined reference value (S320).
  • For example, when the degree of impact detected by the impact sensor of the vehicle exceeds a predetermined level, when the degree of approach of a moving object detected by the front or rear camera of the vehicle falls within a reference distance, or when the speed of the vehicle exceeds the speed limit of the road being driven or a predetermined reference value, the controller 150 may determine that the detected sensor value exceeds the predetermined reference value.
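A compact predicate in the spirit of this test might look as follows; the three thresholds are invented illustrative values, not figures from the publication.

```python
IMPACT_G_LIMIT = 2.5      # g, assumed impact-sensor reference
PROXIMITY_LIMIT_M = 1.5   # m, assumed "within a reference distance"
SPEED_MARGIN_KMH = 10.0   # km/h allowed over the road's speed limit (assumed)

def sensor_exceeds_reference(impact_g, nearest_object_m, speed_kmh, road_limit_kmh):
    # Any single condition exceeding its reference triggers the next step (S330).
    return (impact_g > IMPACT_G_LIMIT
            or nearest_object_m < PROXIMITY_LIMIT_M
            or speed_kmh > road_limit_kmh + SPEED_MARGIN_KMH)
```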
  • the reference value may be set in advance at the time of manufacture of the product of the vehicle accident recording apparatus 100, or may be set or changed according to a road situation or through a user input after installation in the vehicle.
  • the controller 150 may receive the biometric information of the driver detected from the wearable device through the wireless communication unit 130 (S330).
  • Here, the biometric information is data corresponding to electrical signals generated by the body of the driver wearing the wearable device 200, for example, a watch-type mobile terminal. For example, an ECG (ElectroCardioGram) signal, a PPG (Photoplethysmogram) signal, and a GSR (Galvanic Skin Response) signal may be used, but the biometric information is not limited thereto and may include all kinds of signals widely used in the art, for example, for the measurement of sleep stages.
  • the watch type terminal 200 may further include a body temperature sensor, a heart rate sensor, a pressure sensor, and the like, to further acquire the biometric information of the driver detected therefrom.
  • Specifically, an electrocardiogram (ECG) signal is an electrical signal, appearing on the skin surface, that reflects the electrical activity of the heart.
  • Electrocardiogram signals can be measured by inducing the active currents in the myocardium in two appropriate places on the body surface as the heart beats. By periodically observing the change characteristics of the cycle and waveform of the ECG, it is possible to distinguish the psychological state of the driver wearing the watch type terminal 200.
  • the Photoplethysmogram (PPG) signal is an electrical signal obtained by measuring the repeated increase and decrease of arterial blood volume in the fingertip vessel in synchronization with the heartbeat.
  • The electromyogram (EMG) signal is an electrical signal, appearing on the skin surface, that reflects muscle contraction, muscle activity, and fatigue.
  • The EMG may be used to detect, for example, the movement of the tendons according to the movement of the driver's fingers, sensed through the worn watch type terminal 200.
  • The galvanic skin reflex (GSR) signal is an electrical signal, appearing on the skin surface, that reflects changes in skin resistance due to sympathetic nerve activity.
  • The skin conduction signal is obtained by measuring the phenomenon in which the electrical resistance of the skin of a living body temporarily decreases, or an action potential occurs, due to an external stimulus or emotional excitement. For example, when the driver becomes nervous or aroused and the sympathetic nervous system is activated, the sweat glands of the skin surface are activated and conductivity increases, thereby increasing the GSR.
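The publication does not give a formula for turning these signals into the driver's "degree of impact"; one plausible reduction, shown only as an assumption, blends the relative rises of heart rate (from PPG) and skin conductance (GSR) over resting baselines.

```python
def estimate_impact_level(ppg_bpm, gsr_microsiemens,
                          ppg_baseline=70.0, gsr_baseline=2.0):
    # Relative rise of each signal over its assumed resting baseline.
    ppg_rise = max(0.0, ppg_bpm - ppg_baseline) / ppg_baseline
    gsr_rise = max(0.0, gsr_microsiemens - gsr_baseline) / gsr_baseline
    return 0.5 * ppg_rise + 0.5 * gsr_rise  # simple equal-weight blend

print(estimate_impact_level(110, 4.5))  # sharp PPG/GSR rise -> high impact level
```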
  • the vehicle accident recording apparatus 100 may be automatically connected to the wearable device when the driver's vehicle riding or wearing of the wearable device is detected.
  • On the other hand, if the detected sensor value does not exceed the reference value, the recording using the vehicle's external camera is continued and the vehicle accident recording process described below may not be performed. In that case, the recorded video is processed as a normal recorded video and is automatically deleted after the reference time has elapsed.
  • Meanwhile, the cause of a vehicle accident may also arise inside the driver's vehicle. In such a case, the internal camera of the vehicle may be driven first, or recording may be started simultaneously with the external camera. Since the recorded video is then likely to be used to determine the cause of the accident, it may be processed as a special recorded video so that it is not deleted even after the reference time elapses.
  • In one embodiment, the controller 150 always receives, from the connected wearable device 200, the biometric information detected by some of its sensors, and when the sensor value detected by the vehicle sensor exceeds the reference value, a control signal for activating all of the sensors included in the wearable device 200 may be provided so that more of the driver's biometric information can be obtained. For example, the controller 150 normally receives only minimal PPG information through the wireless communication unit 130, and when a risk arises, such as a sensor value measured by a vehicle sensor exceeding the reference value, the other sensors included in the wearable device, for example, the ECG sensor, the GSR sensor, the body temperature sensor, and the pressure sensor, can additionally be activated.
  • the controller 150 may determine a driving time point of the internal camera of the vehicle based on the driver's impact level corresponding to the received biometric information (S340).
  • To this end, cameras may be provided on the outside and the inside of the vehicle, or another means for photographing at least the inside of the vehicle together with the external camera of the vehicle (e.g., the camera provided in the wearable device) may be provided.
  • the determination of the driving time of the internal camera of the vehicle may mean determining the timing of simultaneously driving the internal camera of the vehicle without stopping the driving of the external camera of the vehicle.
  • That is, when an accident occurs, the internal camera of the vehicle is driven so that the driver's condition can be closely grasped through the camera; based on the time point at which the accident occurred, the controller 150 may determine a time point for driving the internal camera (e.g., 5 seconds after the occurrence of the accident) or a time interval for driving the internal camera (e.g., driving the internal camera for 30 seconds every 3 minutes), as in the sketch below.
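Both scheduling options can be expressed with standard-library timers; the camera object and the example delays mirror the figures above but are otherwise assumptions.

```python
import threading

def drive_once(internal_cam, delay_s=5.0):
    # One-shot driving time point, e.g. 5 seconds after the accident.
    threading.Timer(delay_s, internal_cam.start).start()

def drive_periodically(internal_cam, on_s=30.0, every_s=180.0):
    # Driving time interval, e.g. 30 seconds of interior footage every 3 minutes.
    def cycle():
        internal_cam.start()
        threading.Timer(on_s, internal_cam.stop).start()
        threading.Timer(every_s, cycle).start()
    cycle()
```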
  • the controller 150 may control to change the object of the recording based on the determined driving time (S350).
  • Here, changing the object of the recording includes the case in which the first image recorded through the camera provided on the outside of the vehicle is converted, based on the driving time point of the internal camera, into the second image recorded through the camera provided on the inside of the vehicle, and the case in which the first and second images are respectively recorded and synthesized.
  • In the latter case, the video recorded at a time when the probability of an accident is high, or when the accident occurs, may be recorded as a plurality of divided screens, as sketched below. In this way, since the external situation and the internal situation of the vehicle are recorded simultaneously, determining whether an accident has occurred and identifying its cause can be performed more accurately.
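A divided-screen frame can be composed per frame pair, for instance with OpenCV; the target size is arbitrary and the function is only a sketch of the idea.

```python
import cv2

def split_frame(external_img, internal_img, width=640, height=480):
    # Resize both views and place them side by side on one screen.
    a = cv2.resize(external_img, (width, height))
    b = cv2.resize(internal_img, (width, height))
    return cv2.hconcat([a, b])
```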
  • In addition, when the level corresponding to the driver's degree of impact decreases, the controller 150 may control the second image recorded through the internal camera to be converted back into the first image recorded through the external camera.
  • Meanwhile, the controller 150 may determine whether an accident of the vehicle has occurred based on the detected sensor value and the driver's degree of impact.
  • That is, whether the vehicle has had an accident may be determined primarily using the vehicle sensors provided in the vehicle, and secondarily using the biometric information of the driver.
  • In addition, the vehicle accident recording apparatus may be provided with a microphone (not shown), or the driver's voice input through the microphone 222 included in the wearable device may be recorded together with the recorded image; matching the recorded video with the recorded driver's voice can help identify the cause of the accident more accurately.
  • When it is determined that an accident has occurred, the controller 150 may simultaneously drive the cameras provided at the outside and the inside of the vehicle to perform recording.
  • the plurality of recorded images may be separately stored or may be synthesized and stored as described above.
  • In addition, when it is determined that an accident of the vehicle has occurred based on the detected sensor value and the corresponding degree of impact of the driver, the controller 150 may extend the storage time of the image recorded before or after the corresponding time point, or may immediately transmit the video recorded before and after the corresponding time point to a preset server or institution, as sketched below.
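A minimal sketch of that response, assuming a clip store like the one above and an invented server endpoint (the real destination and payload format are not specified in the publication):

```python
import requests

ACCIDENT_SERVER = "https://example.invalid/accident"  # placeholder URL

def on_accident(store, clip_id, driver_state):
    store.mark_protected(clip_id)          # extend retention: do not auto-delete
    with open(store.path(clip_id), "rb") as f:
        # Immediately transmit the footage and the driver's state information.
        requests.post(ACCIDENT_SERVER,
                      files={"video": f},
                      data={"driver_state": driver_state},
                      timeout=10)
```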
  • In one embodiment, when the camera provided in the vehicle cannot be driven, the vehicle recording apparatus 100 generates a control signal through the controller 150 to drive the camera provided in the wearable device and controls the recording to be performed by that camera.
  • In one embodiment, the vehicle recording apparatus 100 may perform recording such that a first image obtained by driving the camera provided outside the vehicle and a second image obtained by driving the camera provided inside the vehicle are displayed split on one screen.
  • the first image and the second image may be synthesized in various ways.
  • the images may be synthesized such that the second image, acquired using the internal camera of the vehicle, is displayed in a small region of the first image, acquired using the external camera of the vehicle.
  • the first image may serve as the main image with the second image synthesized as a PIP (picture-in-picture) image, or the images may be synthesized such that the second image intermittently overlaps at least a portion of the first image, as in the sketch below.
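  • one way to synthesize the two images as a PIP frame is sketched below with OpenCV; the inset scale and corner placement are illustrative choices, not requirements of the specification:

```python
# Hedged sketch: overlay the internal-camera image as a small inset in the
# top-right corner of the external-camera image. Assumes both frames are
# BGR arrays of the same dtype, as produced by cv2.VideoCapture.read().
import cv2
import numpy as np

def synthesize_pip(first: np.ndarray, second: np.ndarray,
                   scale: float = 0.25, margin: int = 10) -> np.ndarray:
    out = first.copy()
    h, w = first.shape[:2]
    inset = cv2.resize(second, (int(w * scale), int(h * scale)))
    ih, iw = inset.shape[:2]
    # Paste the inset with a small margin from the top-right corner.
    out[margin:margin + ih, w - iw - margin:w - margin] = inset
    return out
```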
  • the vehicle recording apparatus 100 may control recording of an image corresponding to the driver's gaze using the camera provided in the wearable device.
  • the vehicle recording apparatus 100 may be configured to drive a camera provided in the wearable device when the impact level of the driver exceeds a reference value, for example, when the driver is extremely excited or when the PPG signal rises sharply.
  • the control signal may be generated to continuously track the driver's gaze.
  • the vehicle recording apparatus 100 may alternately or simultaneously drive the internal camera and the external camera of the vehicle so that an image corresponding to the driver's gaze may be recorded.
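  • the sharp rise in the PPG signal mentioned above could be detected as in the following sketch; the window length and rise threshold are illustrative assumptions:

```python
# Hedged sketch: flag a sharp rise in the driver's PPG signal, which could
# then trigger driving the wearable device's camera for gaze recording.
from collections import deque

class PpgSpikeDetector:
    def __init__(self, window: int = 50, rise_threshold: float = 0.3):
        self.samples = deque(maxlen=window)   # recent PPG samples
        self.rise_threshold = rise_threshold  # hypothetical trigger level

    def update(self, sample: float) -> bool:
        """Feed one PPG sample; return True when the signal has risen
        sharply within the current window."""
        self.samples.append(sample)
        if len(self.samples) < self.samples.maxlen:
            return False
        return self.samples[-1] - min(self.samples) > self.rise_threshold
```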
  • the vehicle recording apparatus 100 may collect driving information of the vehicle using a camera provided outside the vehicle in relation to the above-described steps S310 and S320.
  • Driving information of a vehicle that can be collected using a camera provided outside the vehicle may include recognition of a lane and a vehicle area and detection of approach of an external vehicle, as illustrated in FIGS. 4A to 4C.
  • FIG. 4A is an exemplary conceptual diagram illustrating a method of recognizing a vehicle and a lane in front of a vehicle using a camera in order to detect an accident of a vehicle in advance.
  • the vehicle accident recording apparatus 100 may acquire an image including an object located in front of the vehicle by using a camera provided outside the vehicle.
  • the object may be a vehicle located in front of the vehicle, that is, a front vehicle.
  • the vehicle accident recording apparatus 100 may acquire a vanishing line using, for example, a homography matrix.
  • the vanishing line 310 extends in the horizontal direction of the image being photographed.
  • the vehicle accident recording apparatus 100 maps the actual width of the lane installed on the road on which the vehicle travels onto the lane in the acquired image, in the region between the vanishing line 310 and the lane boundary located at the bottom of the image.
  • the lane area set in this way may be corrected into a trapezoidal lane area by compensating for errors based on road state information and distance information between the road and the object.
  • the vehicle accident recording apparatus 100 may thereby recognize the trapezoidal lane areas 410 and 420 shown in FIG. 4A; a geometric sketch follows.
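  • the construction can be sketched as follows, assuming two lane boundaries have already been detected as line segments; this illustrates the idea of the trapezoidal area between the vanishing line and the bottom of the image, not the patented algorithm itself:

```python
# Hedged sketch: intersect two detected lane boundaries to get the vanishing
# point, take the horizontal line through it as the vanishing line, and form
# the trapezoidal lane area down to the bottom of the image. Lane segments
# are (x1, y1, x2, y2) tuples in pixel coordinates; the 20-px offset below
# the vanishing line is an arbitrary illustrative margin.
import numpy as np

def line_intersection(l1, l2):
    (x1, y1, x2, y2), (x3, y3, x4, y4) = l1, l2
    a1, b1, c1 = y2 - y1, x1 - x2, (y2 - y1) * x1 + (x1 - x2) * y1
    a2, b2, c2 = y4 - y3, x3 - x4, (y4 - y3) * x3 + (x3 - x4) * y3
    det = a1 * b2 - a2 * b1  # zero would mean parallel lane lines
    return (b2 * c1 - b1 * c2) / det, (a1 * c2 - a2 * c1) / det

def x_at(line, y):
    # x-coordinate of the (extended) lane line at a given image row y.
    x1, y1, x2, y2 = line
    return x1 + (x2 - x1) * (y - y1) / (y2 - y1)

def trapezoidal_lane_area(left_lane, right_lane, image_height):
    _vx, vy = line_intersection(left_lane, right_lane)
    top_y, bottom_y = vy + 20, image_height - 1
    return np.array([
        (x_at(left_lane, top_y), top_y), (x_at(right_lane, top_y), top_y),
        (x_at(right_lane, bottom_y), bottom_y), (x_at(left_lane, bottom_y), bottom_y),
    ])
```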
  • FIG. 4B is an exemplary conceptual diagram illustrating a method of tracking an area of a vehicle using a camera in order to detect an accident of a vehicle in advance.
  • the vehicle accident recording apparatus 100 may detect feature points of a vehicle from images captured at predetermined time intervals using the camera provided outside the vehicle. For example, it extracts the optical flow of vehicle feature points between a first image captured at a first time point and a second image captured at a second time point, and expresses the extracted optical flow in vector form. The vehicle accident recording apparatus 100 may then track the vehicle region based on the positions of the end points of the vectors.
  • optical flow refers to the velocity distribution of the apparent motion in an image that arises from gradual changes in the brightness pattern.
  • equivalently, the optical flow is a set of vectors describing the apparent motion in the image, obtained from two images captured by the camera at different times, as in the sketch below.
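  • extracting such vectors between two frames could be done with OpenCV's pyramidal Lucas-Kanade tracker, as in the sketch below; the detector parameters are common defaults, not values from the specification:

```python
# Hedged sketch: track corner-like feature points from one grayscale frame to
# the next and return (start, end) point pairs, i.e., the optical-flow
# vectors whose end points the apparatus could use to track the vehicle area.
import cv2

def feature_point_flow(first_gray, second_gray):
    # Detect feature points in the first frame (e.g., on vehicles).
    p0 = cv2.goodFeaturesToTrack(first_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)
    if p0 is None:
        return []
    # Track those points into the second frame.
    p1, status, _err = cv2.calcOpticalFlowPyrLK(first_gray, second_gray, p0, None)
    vectors = []
    for start, end, ok in zip(p0.reshape(-1, 2), p1.reshape(-1, 2),
                              status.reshape(-1)):
        if ok:
            vectors.append((tuple(start), tuple(end)))
    return vectors
```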
  • the vehicle accident recording apparatus 100 may determine a vehicle area and use it to determine whether an accident of a vehicle occurs.
  • FIG. 4C is an exemplary conceptual diagram illustrating a method of detecting a proximity vehicle using a camera to detect an accident of a vehicle in advance.
  • the vehicle accident recording apparatus 100 first selects the vehicle search region 451 and checks whether associated vehicle feature data exists in the selected search region 451.
  • feature data associated with a lower portion of the vehicle may not be considered.
  • the vehicle accident recording apparatus 100 may check, for example, first feature data 452 near the rear lamp of the vehicle, second feature data 453 near the glass at the rear of the vehicle, and third feature data near the bumper at the rear of the vehicle.
  • checking only some of this feature data may suffice to confirm detection of the vehicle. That is, the vehicle accident recording apparatus 100 may detect a vehicle using only a set of highly reliable feature data, and thus detect a vehicle approaching the driver's vehicle in advance, as in the sketch below.
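  • the idea of confirming a detection from only a subset of highly reliable feature data can be sketched as a weighted vote; the feature names, weights, and threshold below are illustrative assumptions:

```python
# Hedged sketch: each kind of feature data found in the search region
# contributes a reliability weight; detection is confirmed once the summed
# reliability passes a threshold, even if some features (e.g., the lower
# portion of the vehicle) were never checked.
FEATURE_RELIABILITY = {
    "rear_lamp": 0.5,    # first feature data, near the rear lamp
    "rear_glass": 0.3,   # second feature data, near the rear glass
    "rear_bumper": 0.2,  # third feature data, near the rear bumper
}
DETECTION_THRESHOLD = 0.6  # hypothetical confidence level

def vehicle_detected(found_features):
    score = sum(FEATURE_RELIABILITY.get(name, 0.0) for name in found_features)
    return score >= DETECTION_THRESHOLD
```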
  • the vehicle accident recording apparatus 100 may recognize a lane and a vehicle area by using a camera provided outside the vehicle, and detect an approaching vehicle in advance.
  • the occurrence of an accident may be more accurately determined and recorded by interworking with the wearable device 200.
  • the controller 150 detects whether an object in the vicinity of the vehicle is approaching, and accordingly the biometric information received from the wearable device 200, that is, the driver's status information, may be stored together with the recorded image.
  • the controller 150 may record an image based on the position of the vehicle sensor outputting the sensor value exceeding the reference value.
  • for example, when the sensor value exceeding the reference value is output from a sensor near the bumper, the controller may recognize the position of that sensor and adjust the camera angle so that the periphery of the bumper is at the center of the recorded image.
  • the camera may also be controlled to perform recording while executing a zoom-in/zoom-out function, as in the sketch below.
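  • steering the recording toward the reporting sensor could be sketched as follows; the sensor positions and the pan/zoom mapping are illustrative assumptions:

```python
# Hedged sketch: look up the mounting position of the sensor that reported
# the out-of-range value and derive a pan angle and zoom factor so that its
# surroundings (e.g., the bumper) sit at the center of the recorded image.
import math

SENSOR_POSITIONS = {  # hypothetical (x, y) offsets from the camera, in meters
    "front_bumper": (0.0, 2.0),
    "rear_bumper": (0.0, -2.5),
    "left_door": (-1.0, 0.0),
}

def camera_command_for(sensor_id, zoom_near_m=3.0):
    x, y = SENSOR_POSITIONS[sensor_id]
    pan_deg = math.degrees(math.atan2(x, y))           # aim at the sensor area
    distance = math.hypot(x, y)
    zoom = max(1.0, zoom_near_m / max(distance, 0.1))  # zoom in when close
    return {"pan_deg": pan_deg, "zoom": zoom}
```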
  • as described above, the occurrence of a vehicle accident can be judged accurately based on the sensor value from the vehicle sensor provided in the vehicle and the driver's biometric information received from the wearable device worn by the driver, and recording can be performed on that basis.
  • by changing the object of the recording, or by acquiring images of the inside and the outside of the vehicle at the same time, according to the state of the vehicle and the driver, the accident image can be recorded and stored more efficiently.
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. Implementations in the form of carrier waves (e.g., transmission over the Internet) are also included.
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as restrictive in any respect, but should be considered illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)

Abstract

The invention relates to a vehicle accident recording apparatus and a method therefor. A vehicle accident recording method according to one embodiment of the present invention comprises the steps of: starting recording by driving a camera provided outside a vehicle; receiving driver biometric information detected by a wearable device when a sensor value detected by a vehicle sensor provided in the vehicle exceeds a reference value; determining a driving time of a camera provided inside the vehicle on the basis of the degree of impact on the driver corresponding to the received biometric information; and changing the object of recording on the basis of the determined driving time.
PCT/KR2014/012922 2013-12-27 2014-12-26 Appareil d'enregistrement d'accident de véhicule et son procédé WO2015099500A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0165864 2013-12-27
KR20130165864 2013-12-27

Publications (1)

Publication Number Publication Date
WO2015099500A1 (fr) 2015-07-02

Family

ID=53479259

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/012922 WO2015099500A1 (fr) 2013-12-27 2014-12-26 Appareil d'enregistrement d'accident de véhicule et son procédé

Country Status (2)

Country Link
KR (1) KR101741074B1 (fr)
WO (1) WO2015099500A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017095034A3 (fr) * 2015-12-01 2018-02-22 Lg Electronics Inc. Terminal mobile de type montre et son procédé de commande

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101871272B1 (ko) * 2015-08-28 2018-06-27 엘지전자 주식회사 충전 제어 장치 및 그 제어 방법
KR101727181B1 (ko) * 2015-09-25 2017-04-14 주식회사 서연이화 차량 원격 제어 시스템 및 방법
DE112017007003T5 (de) * 2017-03-03 2020-01-02 Ford Global Technologies, Llc Fahrzeugereignisidentifizierung
KR102303023B1 (ko) * 2019-08-01 2021-09-16 정규홍 스마트 밴드를 활용한 택시 모니터링 시스템, 스마트 밴드를 활용한 택시 서비스와 연계된 게이트웨이 모듈 및 통합 제어 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100837686B1 (ko) * 2007-02-09 2008-06-13 이수익 네비게이션 장치에 적용가능한 차량 사고 기록용 동영상촬영 장치 및 방법
US20080243332A1 (en) * 2002-01-25 2008-10-02 Basir Otman A Vehicle visual and non-visual data recording system
KR20090042503A (ko) * 2007-10-26 2009-04-30 아주대학교산학협력단 사용자의 행동인식 및 위치추적 시스템
KR101181909B1 (ko) * 2012-05-14 2012-09-10 케이티텔레캅 주식회사 차량용 블랙박스 기능을 구비한 운전 관리 장치를 이용한 운전자 원격 진단 시스템
KR20130028443A (ko) * 2011-09-09 2013-03-19 유비벨록스(주) 블랙 박스와 감지 센서를 활용한 사고 진단 시스템 및 방법

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002293271A (ja) * 2001-04-02 2002-10-09 Niles Parts Co Ltd 車両用事故情報記憶システム
JP5114101B2 (ja) * 2007-06-07 2013-01-09 クラリオン株式会社 車載カメラシステム


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017095034A3 (fr) * 2015-12-01 2018-02-22 Lg Electronics Inc. Terminal mobile de type montre et son procédé de commande
US10229545B2 (en) 2015-12-01 2019-03-12 Lg Electronics Inc. Watch-type mobile terminal and controlling method thereof

Also Published As

Publication number Publication date
KR101741074B1 (ko) 2017-05-30
KR20150077360A (ko) 2015-07-07

Similar Documents

Publication Publication Date Title
US9974482B2 (en) Mobile terminal and method of controlling the same
EP2953106B1 (fr) Gestion des accidents de véhicule au moyen d'un terminal mobile
WO2016208802A1 (fr) Terminal mobile de type montre et son procédé d'utilisation
WO2015199304A1 (fr) Terminal mobile et son procédé de commande
WO2016024752A1 (fr) Dispositif à porter sur soi et procédé de fonctionnement de ce dispositif
KR20180136776A (ko) 이동 단말기 및 그 제어방법
WO2015099500A1 (fr) Appareil d'enregistrement d'accident de véhicule et son procédé
WO2015174612A1 (fr) Terminal mobile et son procédé de commande
KR101613957B1 (ko) 와치 타입 이동 단말기 및 그것의 제어방법
WO2016052788A1 (fr) Terminal mobile, et son procédé de commande
WO2016190477A1 (fr) Terminal mobile
WO2018143509A1 (fr) Robot mobile et son procédé de commande
WO2016190478A1 (fr) Terminal mobile de type montre et son procédé de fonctionnement
WO2015111805A1 (fr) Terminal vestimentaire et système le comprenant
KR20180028701A (ko) 휴대용 카메라 및 그 제어방법
WO2016163591A1 (fr) Terminal mobile de type montre
WO2018074615A1 (fr) Terminal mobile
WO2018110753A1 (fr) Terminal de type montre
WO2020050432A1 (fr) Terminal mobile
KR102130801B1 (ko) 손목 스탭 검출 장치 및 그 방법
KR20170083403A (ko) 스마트 워치 및 그의 근전도 신호를 이용한 제어방법
WO2018128199A1 (fr) Terminal de type montre
KR20150145550A (ko) 이동 단말기
KR20160024538A (ko) 사용자 맞춤형으로 추천 목적지를 제공하는 텔레매틱스 단말기 및 그 제어 방법
WO2018066735A1 (fr) Terminal mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14874662

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14874662

Country of ref document: EP

Kind code of ref document: A1