WO2014041646A1 - Mobile terminal device, in-vehicle device, and in-vehicle system - Google Patents
Mobile terminal device, in-vehicle device, and in-vehicle system
- Publication number
- WO2014041646A1 (PCT application PCT/JP2012/073375; JP2012073375W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- terminal device
- mobile terminal
- touch panel
- display
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
- G06F1/1698—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0485—Scrolling or panning
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
- B60K35/10—
- B60K35/28—
- B60K35/654—
- B60K35/80—
- B60K35/81—
- B60K2360/143—
- B60K2360/1438—
- B60K2360/146—
- B60K2360/1468—
- B60K2360/1472—
- B60K2360/166—
- B60K2360/573—
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72415—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
Definitions
- the present invention relates to a mobile terminal device, a vehicle-mounted device that cooperates with the mobile terminal device, and a vehicle-mounted system that links the mobile terminal device and the vehicle-mounted device.
- This in-vehicle system allows the mobile terminal device to function as a pointing device for the in-vehicle display in a state where the mobile terminal device and the in-vehicle device are connected via a short-range wireless communication line. Specifically, a display image of the in-vehicle display is captured by a camera attached to the mobile terminal device, it is determined to which part of the display image the captured image corresponds, and the determination result is used for specifying the input position. Thus, the mobile terminal device is caused to function as a pointing device.
- an object of the present invention is to provide a mobile terminal device that can more easily operate an operation target displayed on a vehicle-mounted display, a vehicle-mounted device that cooperates with the mobile terminal device, and a vehicle-mounted system that links the mobile terminal device and the vehicle-mounted device.
- a mobile terminal device according to an embodiment includes a touch panel and a control device that, when the mobile terminal device is placed at a predetermined position in a vehicle interior, causes the touch panel to function as a touch pad for operating an operation target displayed on an in-vehicle display.
- An in-vehicle device according to an embodiment is connected to an in-vehicle display, and receives an operation input to the touch panel of a mobile terminal device placed at a predetermined position in a vehicle interior as an operation input for an operation target displayed on the in-vehicle display.
- the vehicle-mounted system according to an embodiment of the present invention includes a mobile terminal device provided with a control device that functions as a touch pad for operating an operation target displayed on a vehicle-mounted display when the device is placed at a predetermined position in a vehicle interior, and a vehicle-mounted device that receives an operation input on the touch panel of the mobile terminal device placed at the predetermined position as an operation input for the operation target displayed on the vehicle-mounted display.
- the present invention can thus provide a mobile terminal device that can more easily operate an operation target displayed on the in-vehicle display, an in-vehicle device that cooperates with the mobile terminal device, and an in-vehicle system that links the mobile terminal device and the in-vehicle device.
- FIG. 1 is a functional block diagram showing a configuration example of the mobile terminal device according to the embodiment of the present invention. FIG. 2 is a front view of the mobile terminal device of FIG. 1. FIG. 3 is a diagram showing the state of the vehicle interior when the mobile terminal device of FIG. 1 is mounted.
- FIG. 1 is a functional block diagram showing a configuration example of an in-vehicle system 100 including a mobile terminal device 40 according to an embodiment of the present invention.
- FIG. 2 is a front view of the mobile terminal device 40
- FIG. 3 is a diagram showing the interior of the vehicle interior when the mobile terminal device 40 is placed on a cradle (dock) 30 on the dashboard.
- the in-vehicle system 100 is a system that links the mobile terminal device 40 and the in-vehicle device 50, and mainly includes the mobile terminal device 40 and the in-vehicle device 50.
- the portable terminal device 40 is a terminal device carried by a passenger, and includes, for example, a mobile phone, a smartphone, a PDA (Personal Digital Assistant), a portable game machine, a tablet computer, and the like.
- in the present embodiment, the mobile terminal device 40 is a smartphone, and mainly includes a control device 1, an information acquisition device 2, a touch panel 3, a communication device 4, a storage device 5, a display device 6, a voice input device 7, and a voice output device 8.
- the control device 1 is a device that controls the mobile terminal device 40.
- the control device 1 is a computer including a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
- the control device 1 reads a program corresponding to each functional element of a terminal state switching unit 10 and an operation input notification unit 11 described later from the ROM, expands the program in the RAM, and causes the CPU to execute processing corresponding to each functional element.
- the program corresponding to each functional element may be downloaded through a communication network or provided in a state recorded in a recording medium.
- the information acquisition device 2 is a device that acquires information from the outside.
- the information acquisition device 2 is a wireless communication device for using a mobile phone line, a public wireless LAN, or the like.
- the touch panel 3 is one of operation input devices mounted on the mobile terminal device 40.
- the touch panel 3 is a multi-touch touch panel disposed on the display device 6 and supporting a multi-touch gesture function.
- the communication device 4 is a device that controls communication with the in-vehicle device 50.
- the communication device 4 is connected to the communication device 4V of the in-vehicle device 50 via short-range wireless communication (hereinafter referred to as "NFC" (Near Field Communication)).
- wireless communication using Bluetooth (registered trademark), Wi-Fi (registered trademark), or the like, or wired communication using USB (Universal Serial Bus) or the like, may be used for communication between the communication device 4 and the communication device 4V.
- in the present embodiment, the communication device 4 periodically transmits a response request signal; when the communication device 4V receives the response request signal, it returns a response signal to the communication device 4, and upon receiving the response signal, the communication device 4 establishes wireless communication with the communication device 4V.
- alternatively, the communication device 4V may periodically transmit the response request signal, or each of the communication device 4 and the communication device 4V may periodically transmit one. In that case, the communication device 4 returns a response signal to the communication device 4V, and upon receiving the response signal, the communication device 4V establishes wireless communication with the communication device 4.
- the communication device 4 outputs to the control device 1 a control signal notifying that the wireless communication has been established.
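The connection sequence described above (a periodic response request, a response signal in return, then an established link that is reported to the control device) can be sketched as follows. This is an illustrative Python model, not the patent's implementation; the class and method names are hypothetical.

```python
class CommDevice:
    """Illustrative model of the handshake between communication
    device 4 (mobile terminal) and communication device 4V (in-vehicle)."""

    def __init__(self, name):
        self.name = name
        self.link_established = False

    def send_response_request(self, peer):
        # The requesting device periodically transmits a response request signal.
        return peer.receive_response_request(self)

    def receive_response_request(self, requester):
        # The peer returns a response signal to the requester.
        return {"type": "response", "from": self.name}

    def establish_link(self, peer, response):
        # On receiving the response signal, wireless communication is
        # established and the control device would be notified.
        if response["type"] == "response":
            self.link_established = True
            peer.link_established = True
            return "wireless communication established"
        return "no link"


device_4 = CommDevice("communication device 4")
device_4v = CommDevice("communication device 4V")
response = device_4.send_response_request(device_4v)
status = device_4.establish_link(device_4v, response)
print(status)  # wireless communication established
```

The same sketch covers the variant where the communication device 4V is the side that polls; only the roles of requester and responder are swapped.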
- FIG. 3 shows a state in which the mobile terminal device 40 is mounted on the dock 30 as an example of a state in which wireless communication is established between the mobile terminal device 40 and the in-vehicle device 50.
- the mobile terminal device 40 is held by the dock 30 with the touch panel 3 and the display device 6 facing the driver.
- the driver can, for example, extend a hand from the steering wheel 70 and perform an operation input on the touch panel 3.
- the driver can also view, as needed, the display device 6V that displays navigation information and the like during driving, a speedometer 80 that displays speed information, and a multi-information display 90 that displays the communication status and battery status of the mobile terminal device 40.
- the storage device 5 is a device that stores various types of information, and includes, for example, a nonvolatile semiconductor memory such as a flash memory.
- the storage device 5 stores application software (hereinafter referred to as “APP”) executed on the mobile terminal device 40, a widget, and the like.
- “Widget” is a small accessory APP that operates on the mobile terminal device 40.
- the widget is, for example, an APP that periodically acquires and displays new information, and specifically includes an APP that displays stock price information, weather forecast, altitude, coastal wave prediction, and the like.
- the widget also includes an APP that displays a calendar, time, etc., a slide show APP that sequentially displays images around the vehicle acquired from the web, an APP that displays an eco level based on driving operation information, and the like.
- the widget may be downloaded through a communication network or provided in a state of being recorded on a recording medium.
- the display device 6 is a device for displaying various information, and is, for example, a liquid crystal display.
- the voice input device 7 is a device for inputting voice, and is, for example, a microphone.
- the audio output device 8 is a device for outputting various information as audio, and is, for example, a speaker.
- the in-vehicle device 50 is, for example, an in-vehicle navigation device, and mainly includes a control device 1V, a storage device 5V, a display device 6V, an audio output device 8V, and a position detection device 9V.
- the control device 1V is a device that controls the in-vehicle device 50, and in the present embodiment, is a computer including a CPU, a RAM, a ROM, and the like.
- the control device 1V reads a program corresponding to a route guidance unit 12V, which will be described later, from the ROM, expands it in the RAM, and causes the CPU to execute processing corresponding to the route guidance unit 12V.
- the program corresponding to the route guidance unit 12V may be downloaded through a communication network or provided in a state of being recorded on a recording medium.
- the storage device 5V is a device for storing various information, and includes, for example, a nonvolatile semiconductor memory such as a flash memory.
- the storage device 5V stores a map database 51V.
- the map database 51V is a database that systematically stores map information such as the positions of nodes (e.g., intersections and interchanges), the distances of links, which are the elements connecting nodes, link costs representing the time or expense required to pass a link, facility positions (latitude, longitude, altitude), facility names, and the like.
- the display device 6V is a vehicle-mounted display for displaying various types of information, for example, a liquid crystal display.
- the audio output device 8V is a device for outputting various information as audio, and is, for example, a speaker.
- the position detection device 9V is a device for detecting the position of the in-vehicle device 50.
- the position detection device 9V is a GPS receiver that receives a GPS signal output from a GPS satellite via a GPS (Global Positioning System) antenna.
- the position detection device 9V detects the position (latitude, longitude, altitude) of the in-vehicle device 50 based on the received GPS signal, and outputs the detection result to the control device 1V.
- the terminal state switching unit 10 is a functional element that switches the operation state of the mobile terminal device 40. For example, when the mobile terminal device 40 is placed at a predetermined position in the vehicle interior, the terminal state switching unit 10 switches the operation state from a state in which the device functions as a normal mobile terminal device (hereinafter referred to as "normal mode") to a state in which the device functions as an operation input device of the in-vehicle device 50 (hereinafter referred to as "input mode").
- the "predetermined position in the vehicle interior" is a position within an area where communication between the mobile terminal device 40 and the in-vehicle device 50 is available, for example, a position in a predetermined area around the driver's seat.
- the terminal state switching unit 10 switches between the normal mode and the input mode based on the output of the communication device 4. Specifically, when it detects that wireless communication is established between the mobile terminal device 40 and the in-vehicle device 50, it sets the operation state of the mobile terminal device 40 to the input mode; when it detects that wireless communication is not established, it sets the operation state to the normal mode.
- the terminal state switching unit 10 automatically activates a predetermined APP when the mobile terminal device 40 is attached to the dock 30 and it is detected that wireless communication has been established, thereby setting the operation state of the mobile terminal device 40 to the input mode.
- likewise, the terminal state switching unit 10 automatically terminates the predetermined APP when the mobile terminal device 40 is removed from the dock 30 and it is detected that wireless communication has been interrupted, thereby returning the operation state of the mobile terminal device 40 to the normal mode.
- the terminal state switching unit 10 may merely enable or disable the predetermined APP at the time of mode switching instead of automatically starting or terminating it. In that case, the operation state of the mobile terminal device 40 is switched when the operator manually starts or terminates the predetermined APP.
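The switching behavior of the terminal state switching unit 10 can be summarized in a small sketch: the operation state simply follows whether wireless communication with the in-vehicle device 50 is established. This is a hypothetical illustration; the mode names follow the text above, and everything else is assumed.

```python
NORMAL_MODE = "normal mode"
INPUT_MODE = "input mode"

class TerminalStateSwitcher:
    """Illustrative sketch of terminal state switching unit 10:
    the operation state tracks the wireless-link state."""

    def __init__(self):
        self.mode = NORMAL_MODE
        self.touch_gesture_app_running = False

    def on_link_changed(self, link_established):
        if link_established:
            # Mounting on the dock establishes the link: start the
            # predetermined APP and enter input mode.
            self.touch_gesture_app_running = True
            self.mode = INPUT_MODE
        else:
            # Removing the device interrupts the link: end the APP
            # and return to normal mode.
            self.touch_gesture_app_running = False
            self.mode = NORMAL_MODE


switcher = TerminalStateSwitcher()
switcher.on_link_changed(True)
print(switcher.mode)   # input mode
switcher.on_link_changed(False)
print(switcher.mode)   # normal mode
```

In the variant where the APP is started manually by the operator, `on_link_changed` would only enable or disable the APP and the mode change would be driven by the APP's start and end events instead.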
- the “predetermined APP” is an APP that operates on the mobile terminal device 40, and includes, for example, an operation input APP related to an operation input, and particularly includes a touch gesture recognition APP that recognizes various touch gestures.
- the touch gesture is an action for performing an operation input using a finger or the like on the touch panel 3, and includes, for example, a tap, a double tap, a drag, a swipe, a flick, a pinch-in, a pinch-out, and the like.
- FIG. 3 shows a state where a touch pad image (an image representing a black touch pad surface), which is a screen of the touch gesture recognition APP, is displayed on the display device 6 of the mobile terminal device 40.
- after the touch gesture recognition APP is activated, that is, once operation input to the touch panel 3 by the operator can be accepted, the screen display of the display device 6 may be turned off.
- the "touch panel" means an operation input device that is arranged on a display device and cooperates with that display device (i.e., operates the operation target displayed on that display device), whereas the "touch pad" means an operation input device that operates an operation target displayed on a separate display device.
- the operation input notification unit 11 is a functional element for notifying the in-vehicle device 50 of the content of the operation input performed by the operator on the operation input device of the mobile terminal device 40.
- in the present embodiment, the operation input notification unit 11 is the touch gesture recognition APP, and notifies the in-vehicle device 50 of the content of the operator's touch gesture on the touch panel 3.
- the operation input notification unit 11 changes the operation target according to the number of fingers used for the touch gesture.
- the "operation target" is an image on the vehicle-mounted display that the operator operates using the operation input device. Specifically, when a touch gesture is performed using one finger, the operation input notification unit 11 transmits predetermined operation input information to the in-vehicle device 50 so that a cursor displayed on the display device 6V becomes the operation target. More specifically, the predetermined operation input information is transmitted to the in-vehicle device 50 so that the cursor is moved, a selection is made with the cursor, and so on.
- the operation input information is information representing the content of the operation input to the touch panel 3 by the operator, and includes, for example, an operation target identification number, an operation target movement amount, an operation target movement speed, an operation target movement direction, and the like.
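As a rough illustration, the operation input information enumerated above (identification number, movement amount, movement speed, movement direction) could be carried in a structure like the following; the field names and units are assumptions for the sketch, not part of the patent.

```python
from dataclasses import dataclass, asdict

@dataclass
class OperationInput:
    """Illustrative container for the operation input information sent
    from the operation input notification unit 11 to the in-vehicle
    device 50. Field names and units are hypothetical."""
    target_id: int         # operation target identification number
    move_amount: float     # operation target movement amount (e.g. pixels)
    move_speed: float      # operation target movement speed
    move_direction: float  # operation target movement direction (e.g. degrees)

# A one-finger drag that should move the cursor on display device 6V:
msg = OperationInput(target_id=1, move_amount=24.0,
                     move_speed=120.0, move_direction=90.0)
print(asdict(msg))
```

In practice such a record would be serialized and sent over the NFC, Bluetooth, Wi-Fi, or USB link established between the communication devices 4 and 4V.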
- the operation input notification unit 11 receives predetermined operation input information so that a specific APP image displayed on the display device 6V is an operation target. It transmits to the vehicle equipment 50. More specifically, the predetermined operation input information is transmitted to the in-vehicle device 50 so that the map image of the navigation APP is scrolled, enlarged and reduced, and the like.
- the operation input notification unit 11 sends predetermined operation input information to the in-vehicle device 50 so that the widget screen displayed on the display device 6V is an operation target. Send to. More specifically, predetermined operation input information is transmitted to the in-vehicle device so that the display / non-display of the widget screen displayed on the display device 6V is switched and the type of the widget screen displayed on the display device 6V is switched. 50.
- the widget screen is a screen that the widget displays in a partial area on the display device 6V.
- the operation target when a touch gesture is performed using one, two, or three fingers is set in association with the number of fingers.
- the operation target when the touch gesture is performed by using may be set in association with the number of fingers.
- the route guide unit 12V is a functional element that guides a route to a predetermined point, and executes navigation APP, for example.
- the route guidance unit 12V includes the current position detected by the position detection device 9V, the destination position input via the touch panel 3 of the mobile terminal device 40, and the map information database 51V stored in the storage device 5V. Based on the above, the optimum route from the current position to the destination is derived.
- the route guide unit 12V searches for the shortest route using, for example, the Dijkstra method as the shortest route search algorithm.
- the route guide unit 12V may search for the fastest route that can reach the destination earliest, the route that does not use the expressway, and the like.
- the route guidance unit 12V displays the recommended route searched for on the display device 6V so that the recommended route can be distinguished from other routes so that the operator can easily confirm the recommended route, and voice guidance is provided in the voice output device 8V. To support the driving along the recommended route by the operator.
- FIG. 4 is a flowchart showing the flow of the terminal state switching process, and the mobile terminal device 40 repeatedly executes this terminal state switching process at a predetermined cycle.
- the terminal state switching unit 10 in the control device 1 of the mobile terminal device 40 determines whether or not wireless communication is established between the mobile terminal device 40 and the in-vehicle device 50 (step S1).
- the terminal state switching unit 10 establishes NFC wireless communication between the communication device 4 mounted on the mobile terminal device 40 and the communication device 4V of the in-vehicle device 50 based on the output of the communication device 4. It is determined whether or not.
- the terminal state switching unit 10 determines whether or not a predetermined APP has not been activated (step S2). In the present embodiment, the terminal state switching unit 10 determines whether the touch gesture recognition APP has not been activated.
- step S3 the terminal state switching unit 10 activates the touch gesture recognition APP (step S3). Thereby, the terminal state switching unit 10 switches the operation state of the mobile terminal device 40 to the input mode. In addition, the terminal state switching part 10 maintains the operation state (input mode) of the portable terminal device 40 as it is, when it determines with the touch gesture recognition APP having been started (NO of step S2).
- the terminal state switching unit 10 determines whether or not the predetermined APP has been activated (step S4).
- the control device 1 determines whether or not the touch gesture recognition APP has been activated.
- the terminal state switching unit 10 ends the touch gesture recognition APP (step S5). Thereby, the terminal state switching unit 10 switches the operation state of the mobile terminal device 40 to the normal mode. In addition, the terminal state switching part 10 maintains the operation state (normal mode) of the portable terminal device 40 as it is, when it determines with touch gesture recognition APP not having started (NO of step S4).
- the mobile terminal device 40 can automatically switch its operation state depending on whether or not wireless communication is established between itself and the in-vehicle device 50.
- FIGS. 5 to 8 a process for changing the operation target according to the content of the operation input by the operator for the portable terminal device 40 operating in the input mode (hereinafter referred to as “operation target change process”). .).
- FIG. 5 is a flowchart showing the flow of the operation target change process, and the mobile terminal device 40 executes the operation target change process every time an operation input is performed.
- 6 to 8 are diagrams showing the relationship between the content of the touch gesture performed on the touch panel 3 of the mobile terminal device 40 and the change in the display image of the display device 6V.
- the operation input notification unit 11 in the control device 1 of the mobile terminal device 40 detects the number of operation points of the touch gesture (the number of fingers used for the touch gesture) (step S11).
- the operation input notification unit 11 selects the cursor 60V as an operation target (step S12).
- FIG. 6 is a diagram illustrating the relationship between the content of a touch gesture performed with one finger and the change in the display image.
- the left diagram illustrates the content of the touch gesture
- the right diagram illustrates the content of the display image of the display device 6V.
- the display image of the display device 6V includes a cursor 60V, a vehicle position icon 61V, and widget screens 62V and 63V.
- the widget screens 62V and 63V are displayed in a superimposed manner on the map image, and the cursor 60V is displayed so as to cross over all the images in the display image. 6 shows a state in which the image “A” relating to the first widget is displayed on the widget screen 62V and the image “B” relating to the second widget is displayed on the widget screen 63V.
- the cursor 60V moves in accordance with the drag operation, as shown in the right diagram of FIG.
- the positions of the map image and the widget screens 62V and 63V do not change. This is because the operation target is the cursor 60V.
- the operation input notification unit 11 selects an image as an operation target (step S13).
- the operation input notification unit 11 selects a map image as an operation target.
- FIG. 7 is a diagram showing the relationship between the content of touch gestures performed with two fingers and the change in the display image.
- the upper and lower left diagrams show the content of touch gestures, and the upper and lower diagrams on the right.
- the content of the display image of the display device 6V is shown.
- the map image is enlarged and displayed as shown in the upper right diagram of FIG.
- the position of the cursor 60V and the positions of the widget screens 62V and 63V do not change. This is because the operation target is a map image. The same applies to a case where a map image is reduced and displayed by a pinch-in operation with two fingers.
- the operation input notification unit 11 selects the widget screen as an operation target (step S14).
- FIG. 8 is a diagram showing the relationship between the content of touch gestures performed with three fingers and the change in the display image.
- the upper and lower left diagrams show the contents of the touch gesture, and the upper and lower diagrams on the right.
- the content of the display image of the display device 6V is shown.
- the contents of the widget screen are switched as shown in the upper right diagram of FIG. Specifically, the image “B” relating to the second widget is displayed on the widget screen 62V that has displayed the image “A” relating to the first widget.
- the image “C” relating to the third widget is newly displayed on the widget screen 63 ⁇ / b> V displaying the image “B” relating to the second widget.
- the position of the cursor 60V and the map image do not change even when the contents of the widget screen are switched by a swipe operation or a flip operation to the left with three fingers. This is because the operation target is a widget screen. The same applies to the case where the contents of the widget screen are switched by a right swipe operation or flip operation with three fingers.
- the widget screen when a downward swipe or flip operation is performed with three fingers, the widget screen is displayed / hidden as shown in the lower right diagram of FIG. Is switched. Specifically, the widget screen 62V displaying the image “A” related to the first widget and the widget screen 63V displaying the image “B” related to the second widget are hidden and can be visually recognized. The image area is enlarged.
- the widget screens 62V and 63V that are not displayed are indicated by broken lines for convenience of explanation, but these broken lines are not actually displayed.
- the widget screens 62V and 63V that are not displayed return to the display state when the downward swipe operation or flip operation with three fingers is performed again.
- the position of the cursor 60V and the map image do not change even when the widget screen is switched between display and non-display by a swipe operation or flip operation with three fingers downward. This is because the operation target is a widget screen.
- the mobile terminal device 40 can cause its own touch panel 3 to function as a touch pad of the in-vehicle device 50 without forcing the operator to perform complicated operations. Therefore, the operator can more easily operate the operation target displayed on the in-vehicle display.
- the in-vehicle device 50 can omit an operation input device such as a touch panel. However, omission of the operation input device provided such as a touch panel is not essential.
- the operator can select a desired operation target from a plurality of operation targets displayed on the display device 6V by changing the number of fingers performing the touch gesture. Therefore, the operator can perform an operation input on a desired operation object without gazing at the display device 6V. This is because when the operation target cannot be selected by changing the number of fingers, the operator needs to gaze at the display image in order to correctly specify the operation target on the display image.
- the in-vehicle system 100 causes the route guide unit 12V in the control device 1V of the in-vehicle device 50 to perform route guidance, but the route guide unit (not shown) in the control device 1 of the mobile terminal device 40. ) May perform route guidance.
- the route guidance unit in the mobile terminal device 40 may use either a map database (not shown) stored in the storage device 5 or a map database 51V stored in the storage device 5V of the in-vehicle device 50.
- any of the output of the position detection device (not shown) mounted on itself and the output of the position detection device 9V mounted on the in-vehicle device 50 may be used.
- the mobile terminal device 40 establishes wireless communication between the mobile terminal device 40 and the in-vehicle device 50 when the mobile terminal device 40 is attached to the dock 30.
- the present invention is not limited to this.
- the mobile terminal device 40 may establish wireless communication between the mobile terminal device 40 and the in-vehicle device 50 when entering a predetermined area around the driver's seat.
Abstract
Description
Claims (6)
- タッチパネルを備える携帯端末装置であって、
車室内の所定位置に置かれた場合に、前記タッチパネルを、車載ディスプレイに表示される操作対象を操作するタッチパッドとして機能させる制御装置を有する携帯端末装置。 - 前記タッチパネルは、前記車載ディスプレイに関して、マルチタッチ式のタッチパッドとして機能し、
前記制御装置は、前記タッチパネルに対して行われる操作入力の操作点の数に応じて操作対象を変える、
請求項1に記載の携帯端末装置。 - 前記制御装置は、前記タッチパネルに対して行われるタッチジェスチャに用いた指の本数に応じて、カーソル、特定のAPPの画像、及びウィジェット画面のうちの1つを操作対象として選択する、
請求項2に記載の携帯端末装置。 - 前記車載ディスプレイに接続される車載装置との近距離無線通信が確立した場合に、前記タッチパネルを前記車載ディスプレイに関するタッチパッドとして機能させる、
請求項1に記載の携帯端末装置。 - 車載ディスプレイに接続される車載装置であって、
車室内の所定位置に置かれた携帯端末装置のタッチパネルに対する操作入力を、前記車載ディスプレイに表示される操作対象に対する操作入力として受ける、
車載装置。 - 車室内の所定位置に置かれた場合に、タッチパネルを、車載ディスプレイに表示される操作対象を操作するタッチパッドとして機能させる制御装置を備える携帯端末装置と、
車室内の所定位置に置かれた前記携帯端末装置の前記タッチパネルに対する操作入力を、前記車載ディスプレイに表示される操作対象に対する操作入力として受ける車載装置と、
を有する車載システム。
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/425,388 US20150227221A1 (en) | 2012-09-12 | 2012-09-12 | Mobile terminal device, on-vehicle device, and on-vehicle system |
KR1020157006157A KR20150041127A (ko) | 2012-09-12 | 2012-09-12 | 휴대 단말 장치, 차재 장치 및 차재 시스템 |
JP2014535292A JP6172153B2 (ja) | 2012-09-12 | 2012-09-12 | 携帯端末装置、車載装置、及び車載システム |
IN1719DEN2015 IN2015DN01719A (ja) | 2012-09-12 | 2012-09-12 | |
PCT/JP2012/073375 WO2014041646A1 (ja) | 2012-09-12 | 2012-09-12 | 携帯端末装置、車載装置、及び車載システム |
KR1020177002615A KR101838859B1 (ko) | 2012-09-12 | 2012-09-12 | 휴대 단말 장치, 차재 장치 및 차재 시스템 |
CN201280075615.XA CN104603577A (zh) | 2012-09-12 | 2012-09-12 | 便携终端装置、车载装置以及车载系统 |
DE112012006892.0T DE112012006892T5 (de) | 2012-09-12 | 2012-09-12 | Mobiles Endgerät, Fahrzeugvorrichtung und Fahrzeugsystem |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/073375 WO2014041646A1 (ja) | 2012-09-12 | 2012-09-12 | 携帯端末装置、車載装置、及び車載システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014041646A1 true WO2014041646A1 (ja) | 2014-03-20 |
Family
ID=50277800
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/073375 WO2014041646A1 (ja) | 2012-09-12 | 2012-09-12 | 携帯端末装置、車載装置、及び車載システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US20150227221A1 (ja) |
JP (1) | JP6172153B2 (ja) |
KR (2) | KR20150041127A (ja) |
CN (1) | CN104603577A (ja) |
DE (1) | DE112012006892T5 (ja) |
IN (1) | IN2015DN01719A (ja) |
WO (1) | WO2014041646A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015076045A (ja) * | 2013-10-11 | 2015-04-20 | Necパーソナルコンピュータ株式会社 | 情報処理装置、方法及びプログラム |
CN105117146A (zh) * | 2014-05-26 | 2015-12-02 | Lg电子株式会社 | 信息提供设备及其方法 |
WO2016009512A1 (ja) * | 2014-07-16 | 2016-01-21 | 三菱電機株式会社 | エンジニアリングツール |
US9420086B2 (en) | 2014-03-05 | 2016-08-16 | Honda Motor Co., Ltd. | Information terminal |
JP2017530446A (ja) * | 2014-09-25 | 2017-10-12 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | 情報検索 |
EP3146413A4 (en) * | 2014-05-22 | 2017-12-13 | Samsung Electronics Co., Ltd. | User terminal device, method for controlling user terminal device, and multimedia system thereof |
JP2019532283A (ja) * | 2016-09-13 | 2019-11-07 | サムスン エレクトロニクス カンパニー リミテッド | ナビゲーション装置のアップデート方法、それをサポートする方法、ナビゲーション装置、及び端末装置 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6525504B2 (ja) * | 2014-04-20 | 2019-06-05 | アルパイン株式会社 | 入力装置および操作入力方法 |
KR101535032B1 (ko) | 2014-07-17 | 2015-07-07 | 현대자동차주식회사 | 차량 인터페이스 확장 방법 |
DE102014219326A1 (de) * | 2014-09-24 | 2016-03-24 | Continental Teves Ag & Co. Ohg | Sensorfusion mit Smartphone im Fahrzeug |
US9874952B2 (en) * | 2015-06-11 | 2018-01-23 | Honda Motor Co., Ltd. | Vehicle user interface (UI) management |
TWI578021B (zh) * | 2015-08-19 | 2017-04-11 | 國立臺北科技大學 | 擴增實境互動系統及其動態資訊互動顯示方法 |
CN105260028A (zh) * | 2015-11-11 | 2016-01-20 | 武汉卡比特信息有限公司 | 通过手机摄像头体感控制车载电脑的方法 |
CN105302007A (zh) * | 2015-12-03 | 2016-02-03 | 深圳市凯立德科技股份有限公司 | 一种车联网操作控制系统 |
DE102016112833A1 (de) * | 2016-07-13 | 2018-01-18 | Visteon Global Technologies, Inc. | Verfahren zum Erkennen von Softwareanwendungen und Nutzereingaben |
DE102016217770A1 (de) * | 2016-09-16 | 2018-03-22 | Audi Ag | Verfahren zum Betrieb eines Kraftfahrzeugs |
KR102005443B1 (ko) * | 2017-09-13 | 2019-07-30 | 엘지전자 주식회사 | 사용자 인터페이스 장치 |
KR102480704B1 (ko) * | 2018-01-31 | 2022-12-22 | 엘지전자 주식회사 | 차량용 사용자 인터페이스 장치 |
EP3456577B1 (en) * | 2017-09-13 | 2022-01-26 | LG Electronics Inc. | User interface apparatus for vehicle |
DE102018100196A1 (de) * | 2018-01-05 | 2019-07-11 | Bcs Automotive Interface Solutions Gmbh | Verfahren zum Betreiben einer Mensch-Maschinen-Schnittstelle sowie Mensch-Maschinen-Schnittstelle |
CN111147731A (zh) * | 2018-11-06 | 2020-05-12 | 比亚迪股份有限公司 | 全景预览方法、系统、装置、存储介质及车辆 |
WO2020194349A1 (en) * | 2019-03-27 | 2020-10-01 | Tvs Motor Company Limited | Smart connect instrument cluster |
JP7310705B2 (ja) * | 2020-05-18 | 2023-07-19 | トヨタ自動車株式会社 | エージェント制御装置、エージェント制御方法、及びエージェント制御プログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009042796A (ja) * | 2005-11-25 | 2009-02-26 | Panasonic Corp | ジェスチャー入力装置および方法 |
JP2012008968A (ja) * | 2010-06-28 | 2012-01-12 | Honda Motor Co Ltd | 携帯機器と連携し、該携帯機器に対して可能な入力操作を実現する車載機器 |
WO2012039022A1 (ja) * | 2010-09-21 | 2012-03-29 | パイオニア株式会社 | 情報通信装置、情報通信方法、情報通信プログラム、及び情報通信システム |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070177804A1 (en) * | 2006-01-30 | 2007-08-02 | Apple Computer, Inc. | Multi-touch gesture dictionary |
US9292111B2 (en) * | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
JP2001142563A (ja) * | 1999-11-09 | 2001-05-25 | Internatl Business Mach Corp <Ibm> | 機能補完型携帯情報装置 |
KR100474724B1 (ko) * | 2001-08-04 | 2005-03-08 | 삼성전자주식회사 | 터치스크린을 가지는 장치 및 그 장치에 외부디스플레이기기를 연결하여 사용하는 방법 |
JP2004028909A (ja) * | 2002-06-27 | 2004-01-29 | Victor Co Of Japan Ltd | 車内無線通信システム |
JP2005284886A (ja) * | 2004-03-30 | 2005-10-13 | Toshiba Corp | 情報表示システム |
JP2008191868A (ja) | 2007-02-02 | 2008-08-21 | Fujitsu Ltd | 位置指定プログラムおよび携帯端末装置 |
JP4499127B2 (ja) * | 2007-03-15 | 2010-07-07 | 本田技研工業株式会社 | 携帯端末 |
CN101600009A (zh) * | 2008-06-04 | 2009-12-09 | 深圳富泰宏精密工业有限公司 | 无线控制装置及具有该控制装置的无线通信装置 |
US8683390B2 (en) * | 2008-10-01 | 2014-03-25 | Microsoft Corporation | Manipulation of objects on multi-touch user interface |
TW201117982A (en) * | 2009-11-23 | 2011-06-01 | Htc Corp | Electronic system applied to a transport and related control method |
JP5656046B2 (ja) * | 2010-01-20 | 2015-01-21 | 株式会社ユピテル | 車両用情報表示装置 |
US8971967B2 (en) * | 2010-04-19 | 2015-03-03 | Dap Realize Inc. | Mobile information processing apparatus equipped with touch panel device and program for mobile information processing apparatus |
US20130241720A1 (en) * | 2012-03-14 | 2013-09-19 | Christopher P. Ricci | Configurable vehicle console |
JP2012108719A (ja) * | 2010-11-17 | 2012-06-07 | Ntt Docomo Inc | 電子機器及び入出力方法 |
US8818275B2 (en) * | 2011-03-10 | 2014-08-26 | Continental Automotive Systems, Inc | Enhancing vehicle infotainment systems by adding remote sensors from a portable device |
JP5633460B2 (ja) * | 2011-04-01 | 2014-12-03 | 株式会社デンソー | 制御装置 |
US8805349B2 (en) * | 2011-05-04 | 2014-08-12 | General Motors Llc | Method for controlling mobile communications |
US8661151B2 (en) * | 2011-05-09 | 2014-02-25 | Google Inc. | Dynamic playlist for mobile computing device |
CN102594903A (zh) * | 2012-03-02 | 2012-07-18 | 许晓聪 | 一种智能化移动车载系统 |
WO2013133478A1 (en) * | 2012-03-04 | 2013-09-12 | Lg Electronics Inc. | Portable device and control method thereof |
-
2012
- 2012-09-12 DE DE112012006892.0T patent/DE112012006892T5/de not_active Withdrawn
- 2012-09-12 KR KR1020157006157A patent/KR20150041127A/ko active Search and Examination
- 2012-09-12 WO PCT/JP2012/073375 patent/WO2014041646A1/ja active Application Filing
- 2012-09-12 US US14/425,388 patent/US20150227221A1/en not_active Abandoned
- 2012-09-12 IN IN1719DEN2015 patent/IN2015DN01719A/en unknown
- 2012-09-12 JP JP2014535292A patent/JP6172153B2/ja not_active Expired - Fee Related
- 2012-09-12 KR KR1020177002615A patent/KR101838859B1/ko active IP Right Grant
- 2012-09-12 CN CN201280075615.XA patent/CN104603577A/zh active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009042796A (ja) * | 2005-11-25 | 2009-02-26 | Panasonic Corp | ジェスチャー入力装置および方法 |
JP2012008968A (ja) * | 2010-06-28 | 2012-01-12 | Honda Motor Co Ltd | 携帯機器と連携し、該携帯機器に対して可能な入力操作を実現する車載機器 |
WO2012039022A1 (ja) * | 2010-09-21 | 2012-03-29 | パイオニア株式会社 | 情報通信装置、情報通信方法、情報通信プログラム、及び情報通信システム |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015076045A (ja) * | 2013-10-11 | 2015-04-20 | Necパーソナルコンピュータ株式会社 | 情報処理装置、方法及びプログラム |
US9420086B2 (en) | 2014-03-05 | 2016-08-16 | Honda Motor Co., Ltd. | Information terminal |
US9800709B2 (en) | 2014-03-05 | 2017-10-24 | Honda Motor Co., Ltd. | Information terminal |
EP3146413A4 (en) * | 2014-05-22 | 2017-12-13 | Samsung Electronics Co., Ltd. | User terminal device, method for controlling user terminal device, and multimedia system thereof |
US9414201B2 (en) | 2014-05-26 | 2016-08-09 | Lg Electronics Inc. | Information providing apparatus and method thereof |
CN105117146B (zh) * | 2014-05-26 | 2017-05-17 | Lg电子株式会社 | 信息提供设备及其方法 |
CN105117146A (zh) * | 2014-05-26 | 2015-12-02 | Lg电子株式会社 | 信息提供设备及其方法 |
US10282156B2 (en) | 2014-05-26 | 2019-05-07 | Lg Electronics Inc. | Information providing apparatus and method thereof |
JP5968541B2 (ja) * | 2014-07-16 | 2016-08-10 | 三菱電機株式会社 | エンジニアリングツール |
WO2016009512A1 (ja) * | 2014-07-16 | 2016-01-21 | 三菱電機株式会社 | エンジニアリングツール |
JP2017530446A (ja) * | 2014-09-25 | 2017-10-12 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | 情報検索 |
JP2019532283A (ja) * | 2016-09-13 | 2019-11-07 | サムスン エレクトロニクス カンパニー リミテッド | ナビゲーション装置のアップデート方法、それをサポートする方法、ナビゲーション装置、及び端末装置 |
JP7165125B2 (ja) | 2016-09-13 | 2022-11-02 | サムスン エレクトロニクス カンパニー リミテッド | ナビゲーション装置のアップデート方法、それをサポートする方法、ナビゲーション装置、及び端末装置 |
Also Published As
Publication number | Publication date |
---|---|
KR20170015555A (ko) | 2017-02-08 |
JPWO2014041646A1 (ja) | 2016-08-12 |
DE112012006892T5 (de) | 2015-06-11 |
IN2015DN01719A (ja) | 2015-05-22 |
CN104603577A (zh) | 2015-05-06 |
US20150227221A1 (en) | 2015-08-13 |
KR101838859B1 (ko) | 2018-04-27 |
JP6172153B2 (ja) | 2017-08-02 |
KR20150041127A (ko) | 2015-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6172153B2 (ja) | 携帯端末装置、車載装置、及び車載システム | |
CN106062514B (zh) | 便携式装置与车辆头端单元之间的交互 | |
JP5920474B2 (ja) | 携帯端末装置、車載装置、及び車載システム | |
JP5673631B2 (ja) | 情報表示装置及び携帯端末装置 | |
KR20140136799A (ko) | 영상표시장치 및 영상표시장치의 동작방법 | |
EP2990764B1 (en) | Traffic information notification system, traffic information notification device, traffic information notification method, and computer program | |
WO2014030263A1 (ja) | 携帯端末装置、携帯端末装置の制御方法、携帯端末装置で実行されるプログラム、及び、携帯端末装置用クレードル | |
KR101542495B1 (ko) | 이동 단말기의 정보 표시 방법 및 그 장치 | |
JP2012122777A (ja) | 車載装置 | |
JP5219503B2 (ja) | 情報端末装置、コンピュータプログラム及び表示方法 | |
US20140181651A1 (en) | User specific help | |
JP5326678B2 (ja) | ナビゲーション装置 | |
JP6084021B2 (ja) | 表示システム、表示装置、表示方法、及び、プログラム | |
JP2012225751A (ja) | 車載情報端末 | |
JP7132144B2 (ja) | ナビゲーション装置、ナビゲーション方法及びプログラム | |
JP2018010584A (ja) | 操作支援装置及びコンピュータプログラム | |
JP2014191818A (ja) | 操作支援システム、操作支援方法及びコンピュータプログラム | |
JP6687978B2 (ja) | ナビゲーションシステム及びナビゲーション装置 | |
JP2016024112A (ja) | 誘導装置、誘導装置の制御方法およびプログラム | |
JP2023137446A (ja) | アプリケーションプログラム、情報処理システム、および情報処理方法 | |
JP2015099032A (ja) | 経路表示装置、経路表示方法及び経路表示プログラム | |
JP2016102734A (ja) | ナビゲーションシステム、ナビゲーション方法、及びナビゲーションプログラム | |
JP2013134074A (ja) | 操作システム、ナビゲーション装置、入力端末 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12884384 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014535292 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14425388 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20157006157 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1120120068920 Country of ref document: DE Ref document number: 112012006892 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12884384 Country of ref document: EP Kind code of ref document: A1 |