WO2014141954A1 - Display device - Google Patents
Display device
- Publication number
- WO2014141954A1 (PCT/JP2014/055485)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- touch
- operation information
- display device
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0231—Cordless keyboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0383—Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
Definitions
- the present invention relates to a display device.
- the display device according to the first aspect is connected to an information terminal and includes: an image receiving unit that receives, from the information terminal, image information of a screen including a cursor; a touch panel that displays the screen based on the image information received by the image receiving unit and detects a touch position designated by a touch operation; and a transmission unit that transmits operation information corresponding to the touch operation to the information terminal.
- the transmission unit transmits the operation information with a delay.
- in the display device according to the first aspect, when the touch position is designated by the touch operation, the transmission unit transmits operation information so as to move the cursor from a predetermined reference position to the touch position.
- the operation information includes movement amount information indicating a movement amount of the cursor. It is preferable that the transmission unit transmits, as first operation information, movement amount information corresponding to the distance from the reference position to the touch position, and transmits, as second operation information, predetermined movement amount information corresponding to the reference position.
- when the touch position is designated by the touch operation, the transmission unit preferably transmits the first operation information and then transmits third operation information indicating a predetermined button operation. When the designation of the touch position is canceled after the third operation information has been transmitted, the transmission unit preferably transmits fourth operation information indicating release of the button operation and then transmits the second operation information.
- preferably, the transmission unit sets the movement amount of the cursor to zero in the third operation information and the fourth operation information.
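As a concrete illustration of this tap sequence, the sketch below emits the four operation-information packets in the transmission order described above (first, third, fourth, second), separated by an interval time. All names and the packet shape are hypothetical; in practice each packet would be a mouse report, and the interval time would come from the policy file described later.

```python
import time

# Hypothetical values: the reference position and interval time are not
# fixed by the text; the interval time would come from the policy file.
REFERENCE = (0, 0)
INTERVAL_TIME = 0.05  # seconds between packets

def send(packet: dict, log: list) -> None:
    """Stand-in for the transmission unit (e.g. a Bluetooth mouse report)."""
    log.append(packet)

def transmit_tap(touch_x: int, touch_y: int, log: list) -> None:
    """Emit the four operation-information packets for a single tap,
    in the transmission order: first, third, fourth, second."""
    dx, dy = touch_x - REFERENCE[0], touch_y - REFERENCE[1]
    # first operation information: move the cursor from the reference
    # position to the touch position
    send({"buttons": 0, "dx": dx, "dy": dy}, log)
    time.sleep(INTERVAL_TIME)
    # third operation information: button press, cursor movement set to zero
    send({"buttons": 1, "dx": 0, "dy": 0}, log)
    time.sleep(INTERVAL_TIME)
    # fourth operation information: button release, again with zero movement
    send({"buttons": 0, "dx": 0, "dy": 0}, log)
    time.sleep(INTERVAL_TIME)
    # second operation information: predetermined movement returning the
    # cursor to the reference position
    send({"buttons": 0, "dx": -dx, "dy": -dy}, log)
```

A tap at (120, 80) thus produces a cursor move of (120, 80), a click with zero movement, and a move of (-120, -80) back to the reference position.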
- when the touch position is moved during the touch operation, the transmission unit may, after transmitting the first operation information, transmit operation information including button operation information indicating the button operation and movement amount information corresponding to the movement amount of the touch position.
- when the touch position is moved during the touch operation, the transmission unit may, after transmitting the first operation information and the third operation information, transmit operation information including button operation information indicating the button operation and movement amount information corresponding to the movement amount of the touch position.
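A drag (the touch position moving during the touch operation) can be sketched the same way, assuming the variant in which the first and third operation information are sent before the button-held movement packets. The packet shape and names are again illustrative:

```python
def transmit_drag(start: tuple, samples: list, log: list) -> None:
    """Emit operation information for a drag: position the cursor at the
    initial touch point, press the button, then send button-held packets
    whose movement amounts track the sampled touch positions."""
    sx, sy = start
    log.append({"buttons": 0, "dx": sx, "dy": sy})  # first: reference -> touch
    log.append({"buttons": 1, "dx": 0, "dy": 0})    # third: button press
    px, py = start
    for x, y in samples:  # sampled touch positions during the drag
        log.append({"buttons": 1, "dx": x - px, "dy": y - py})
        px, py = x, y
    log.append({"buttons": 0, "dx": 0, "dy": 0})    # release at end of drag
```

Each movement packet carries only the relative displacement since the previous sample, matching the relative-movement style of a mouse.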
- the display device may further include an interval time information acquisition unit that acquires interval time information related to the interval time from the information terminal.
- the transmission unit preferably determines whether or not to transmit the operation information with a delay based on the interval time information acquired by the interval time information acquisition unit.
- the interval time information is preferably transmitted from the information terminal based on a policy file distributed from the predetermined server device to the information terminal.
- the transmission unit may transmit the operation information using a mouse communication format of Bluetooth or USB.
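For reference, the mouse communication format shared by USB HID and Bluetooth HID can be illustrated with the 3-byte boot-protocol mouse report: one byte of button bits followed by signed 8-bit X and Y displacements. The helper below is a minimal sketch of that packing:

```python
import struct

def mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a boot-protocol mouse report: button bits, then signed dx, dy.
    Displacements are clamped to the signed 8-bit range of the report."""
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack("<Bbb", buttons & 0x07, clamp(dx), clamp(dy))

# A left-button press with no movement (the third operation information):
press = mouse_report(0x01, 0, 0)
```

Larger cursor movements, such as the jump from the reference position to a touch position, would have to be split across multiple reports because each report carries at most ±127 counts per axis.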
- according to another aspect, the display device connected to the information terminal includes: an image receiving unit that receives, from the information terminal, image information of a screen including a cursor; a touch panel that displays the screen based on the image information received by the image receiving unit and detects a touch position designated by a touch operation; and a transmission unit that transmits operation information corresponding to the touch operation to the information terminal.
- the transmission unit transmits operation information including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position.
- according to the present invention, when the display screen of an information terminal operated via a touch panel is displayed on the display device, the information terminal can be operated from the display device.
- FIG. 1 is a diagram showing the configuration of an in-vehicle information system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the configuration of the in-vehicle device and the mobile terminal. FIG. 3 is a diagram showing the schematic configuration of the software in the mobile terminal. The remaining figures are diagrams explaining operations.
- FIG. 1 is a diagram showing a configuration of an in-vehicle information system according to an embodiment of the present invention.
- the in-vehicle information system shown in FIG. 1 is mounted and used in a vehicle, and is realized by connecting the in-vehicle device 1 and the mobile terminal 2 to each other through short-range wireless communication and through wired communication via the video / audio cable 3.
- the in-vehicle device 1 is fixed in the vehicle, and is installed, for example, in an instrument panel of the vehicle.
- the mobile terminal 2 is a portable information terminal that can be carried by a user, and is, for example, a mobile phone or a smartphone.
- Bluetooth and HDMI are registered trademarks.
- the display unit 11 is provided in the in-vehicle device 1.
- the display unit 11 is a touch panel capable of displaying various images and videos, and is configured by combining, for example, a resistive film type touch panel switch and a liquid crystal display.
- the user can cause the mobile terminal 2 to execute a desired function by touching an arbitrary position on the display unit 11 with a finger or the like and specifying an icon, an operation button, or the like displayed at the position.
- the in-vehicle device 1 may further include operation switches corresponding to predetermined operations.
- the mobile terminal 2 is provided with a display unit 21.
- the display unit 21 is a touch panel that can display various images and videos, and is configured by combining, for example, a capacitive touch panel switch and a liquid crystal display.
- the user can cause the portable terminal 2 to execute a desired function by touching an arbitrary position on the display unit 21 with a finger or the like according to the content of the image or video displayed on the display unit 21.
- although the display unit 21 has been described here as a touch panel, it may instead be an ordinary display monitor that is not a touch panel. In that case, it is preferable to provide the mobile terminal 2 with various operation switches according to the contents of the processing executed by the mobile terminal 2.
- the display unit 21 may be a touch panel display monitor, and an operation switch corresponding to a predetermined operation may be provided in the mobile terminal 2.
- FIG. 2 is a block diagram showing the configuration of the in-vehicle device 1 and the portable terminal 2.
- the in-vehicle device 1 includes a control unit 10, a display unit 11, an operation unit 12, an audio output unit 13, a memory unit 14, a short-range wireless communication interface unit 15, and a video / audio signal input unit 16.
- the mobile terminal 2 includes a control unit 20, a display unit 21, an operation unit 22, an audio output unit 23, a memory unit 24, a short-range wireless communication interface unit 25, a video / audio signal output unit 26, a wireless communication unit 27, and a GPS (Global Positioning System) receiving unit 28.
- the control unit 10 includes a microprocessor, various peripheral circuits, a RAM, a ROM, and the like, and executes various processes based on a control program recorded in the memory unit 14. Through the processing performed by the control unit 10, various image display processes, audio output processes, and the like are executed.
- the control unit 10 acquires a vehicle speed signal and a parking signal output from the vehicle. Based on the vehicle speed signal and the parking signal, the control unit 10 determines whether the vehicle is traveling or stopped.
- the vehicle speed signal and the parking signal are output from the vehicle to the control unit 10, for example, by a vehicle speed sensor mounted on the vehicle outputting vehicle speed pulses via a CAN (Controller Area Network, not shown), which is a communication network provided in the vehicle.
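A minimal sketch of how vehicle speed and parking signals received over CAN might be decoded into the traveling/stopped decision described above. The frame layout here is purely hypothetical; actual CAN message IDs and signal encodings are manufacturer-specific and are not given in this document:

```python
def decode_vehicle_state(speed_frame: bytes, parking_frame: bytes):
    """Decode hypothetical CAN frame payloads into (speed_kmh, traveling).
    Assumed layout: speed in 0.01 km/h units, big-endian, in bytes 0-1 of
    the speed frame; parking-brake flag in bit 0 of the parking frame."""
    speed_kmh = int.from_bytes(speed_frame[0:2], "big") * 0.01
    parking = bool(parking_frame[0] & 0x01)
    traveling = speed_kmh > 0 and not parking
    return speed_kmh, traveling
```

The control unit 10 would apply a decision like this to choose between the traveling and stopped behavior (for example, the operation restrictions discussed later).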
- the display unit 11 is a display monitor configured by a liquid crystal display or the like as described above with reference to FIG.
- the operation unit 12 is a part for detecting a user's touch operation on the display unit 11 and corresponds to the touch panel switch described above.
- the display unit 11 and the operation unit 12 are shown separately, but actually they are integrated to form one touch panel.
- the operation switch is also included in the operation unit 12.
- the content of the user input operation performed on the operation unit 12 is output to the control unit 10 and reflected in the processing performed by the control unit 10.
- the audio output unit 13 includes an amplifier, a speaker, and the like, and can output various sounds under the control of the control unit 10. For example, music reproduced from music data read from the mobile terminal 2 or a recording medium (not shown), guidance voice for guiding the vehicle to the destination, and the like are output from the audio output unit 13.
- the memory unit 14 is a non-volatile data storage device, and is realized by, for example, an HDD (hard disk drive) or a flash memory.
- the memory unit 14 stores various data such as the above-described control program used in the control unit 10, for example. Reading and writing of data in the memory unit 14 is performed as necessary under the control of the control unit 10.
- the short-range wireless communication interface unit 15 performs the wireless interface processing required for short-range wireless communication with the mobile terminal 2 under the control of the control unit 10. For example, it converts information output from the control unit 10 into a predetermined radio signal format and transmits it to the mobile terminal 2, and receives information transmitted from the mobile terminal 2 in the predetermined radio signal format and outputs it to the control unit 10. The interface processing by the short-range wireless communication interface unit 15 is performed according to a predetermined communication standard such as Bluetooth.
- the video / audio signal input unit 16 receives the video signal and the audio signal input from the mobile terminal 2 via the video / audio cable 3, converts them into image (video) data for screen display and audio data for audio output, respectively, and outputs them to the control unit 10.
- the control unit 10 controls the display unit 11 to display an image based on the image data on the display unit 11.
- the audio output unit 13 is controlled to cause the audio output unit 13 to output audio based on the audio data.
- the control unit 20, like the control unit 10 of the in-vehicle device 1, includes a microprocessor, various peripheral circuits, a RAM, a ROM, and the like, and executes various processes based on a control program recorded in the memory unit 24.
- the display unit 21 is a touch panel display monitor as described above.
- the operation unit 22 is a part for detecting a user input operation.
- the display unit 21 and the operation unit 22 are shown as separate components, but in practice, like the display unit 11 described above, they are integrated to form a single touch panel.
- the operation switch is also included in the operation unit 22.
- the content of the user input operation performed on the operation unit 22 is output to the control unit 20 and reflected in the processing performed by the control unit 20.
- the audio output unit 23 includes an amplifier, a speaker, and the like, and can output various types of audio under the control of the control unit 20. For example, when a call is made using the mobile terminal 2, the voice of the other party is output from the voice output unit 23.
- the memory unit 24 is a non-volatile data storage device similar to the memory unit 14 of the in-vehicle device 1, and stores various data to be used in the processing of the control unit 20.
- the memory unit 24 stores various application programs (hereinafter simply referred to as applications) acquired by the user in advance. The user can implement various functions in the mobile terminal 2 by selecting any one of the various applications stored in the memory unit 24 and causing the control unit 20 to execute the selected application.
- the short-range wireless communication interface unit 25 performs a wireless interface process based on a predetermined communication standard, similarly to the short-range wireless communication interface unit 15 of the in-vehicle device 1. That is, information communication between the in-vehicle device 1 and the mobile terminal 2 is realized by the short-range wireless communication interface unit 15 and the short-range wireless communication interface unit 25 exchanging information with each other through wireless communication.
- the video / audio signal output unit 26 converts the image (video) and audio generated by the control unit 20 into a video signal and an audio signal, respectively, according to a predetermined communication standard such as HDMI, and outputs them to the in-vehicle device 1 via the video / audio cable 3.
- the video signal and the audio signal are input to the video / audio signal input unit 16 of the in-vehicle device 1.
- as a result, the same image (screen) displayed on the display unit 21 of the mobile terminal 2 is also displayed on the display unit 11 of the in-vehicle device 1.
- likewise, the same sound output from the audio output unit 23 of the mobile terminal 2 is also output from the audio output unit 13 of the in-vehicle device 1.
- Such a function is called video mirroring.
- the wireless communication unit 27 performs wireless communication for connecting the mobile terminal 2 to another mobile terminal or a server via a wireless communication network (not shown).
- the portable terminal 2 can make a call with another portable terminal or download an arbitrary application from the server by wireless communication performed by the wireless communication unit 27.
- as the wireless communication network for the wireless communication performed by the wireless communication unit 27, for example, a mobile phone network or the Internet accessed via a wireless LAN can be used.
- the GPS receiver 28 receives a GPS signal transmitted from a GPS satellite and outputs it to the controller 20.
- the GPS signal includes information on the position and transmission time of the GPS satellite that transmitted the GPS signal as information for obtaining the current position and the current time of the mobile terminal 2. Accordingly, by receiving GPS signals from a predetermined number or more of GPS satellites, the control unit 20 can calculate the current position and the current time based on these pieces of information.
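The position calculation can be illustrated in simplified form. Real GPS receivers solve for three position coordinates plus a receiver clock bias from four or more satellites; the 2D sketch below shows only the underlying trilateration idea, recovering a position from known anchor points and measured distances by subtracting circle equations to obtain a linear system:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three anchor points p_i and distances r_i.
    Subtracting the circle equations pairwise cancels the quadratic
    terms and leaves a 2x2 linear system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In actual GPS, the distances are pseudoranges derived from the satellite transmission times carried in the signal, which is why the receiver clock bias must be solved as a fourth unknown.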
- This in-vehicle information system has a cooperation function between the in-vehicle device 1 and the mobile terminal 2.
- while an application is being executed on the mobile terminal 2, the in-vehicle device 1 can perform image display and audio output according to the application.
- the user's operation performed on the in-vehicle device 1 can be reflected in the operation of the application executed on the mobile terminal 2.
- navigation processing for guiding the vehicle to the destination can be performed.
- in the navigation processing, a map screen depicting a map around the current position is created on the mobile terminal 2, and image information representing the map screen is transmitted as the video signal described above from the video / audio signal output unit 26 via the video / audio cable 3 to the video / audio signal input unit 16.
- a map screen is transmitted from the portable terminal 2 to the in-vehicle device 1 so that the map screen near the current position can be displayed on the display unit 11 of the in-vehicle device 1.
- when a destination is set, the mobile terminal 2 searches for a recommended route from the current position of the vehicle, as the departure point, to the destination.
- when the vehicle approaches a guidance point on the recommended route, a guidance voice corresponding to the traveling direction of the vehicle at the guidance point is transmitted from the mobile terminal 2 to the in-vehicle device 1. Thereby, the guidance voice can be output from the audio output unit 13 of the in-vehicle device 1. At this time, a predetermined signal may be output from the mobile terminal 2 to the in-vehicle device 1 according to the start and end timings of the guidance voice output.
- in this way, the in-vehicle device 1 can guide the user to the destination without the user getting lost.
- various data such as map data necessary for the mobile terminal 2 to execute the navigation application may be stored in advance in the memory unit 24 of the mobile terminal 2.
- alternatively, the wireless communication unit 27 may be used to connect to a predetermined server and acquire the necessary data each time.
- the mobile terminal 2 executes an application selected by the user among a plurality of applications including the navigation application as described above.
- the user can select an application to be executed by the mobile terminal 2 by operating the operation unit 22 and selecting a desired application on the menu screen displayed on the display unit 21 of the mobile terminal 2.
- On this menu screen, for example, icons of the applications that can use the cooperation function are displayed side by side.
- when the user selects one of these icons, the application corresponding to the icon is executed on the mobile terminal 2.
- the mobile terminal 2 transmits a menu screen to the in-vehicle device 1 by the video signal from the video / audio signal output unit 26.
- the in-vehicle device 1 displays a menu screen on the display unit 11 based on the video signal transmitted from the mobile terminal 2.
- operation information corresponding to the touch operation is transmitted from the in-vehicle device 1 to the mobile terminal 2 by the short-range wireless communication interface unit 15.
- the operation information transmitted from the in-vehicle device 1 is received by the short-range wireless communication interface unit 25 in the mobile terminal 2 and output to the control unit 20. Based on the operation information received in this way, the control unit 20 recognizes which application the user selected on the in-vehicle device 1 and executes it. The user can thus select a desired application on the in-vehicle device 1 and cause the mobile terminal 2 to execute it, in the same way as when the menu screen displayed on the display unit 21 of the mobile terminal 2 is used.
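Recognizing which application the user selected amounts to a hit test of the touch position against the icon layout of the menu screen. A minimal sketch, with hypothetical icon rectangles:

```python
def select_application(touch: tuple, icons: dict):
    """Return the name of the icon whose rectangle contains the touch
    position, or None. `icons` maps names to (x, y, width, height)."""
    tx, ty = touch
    for name, (x, y, w, h) in icons.items():
        if x <= tx < x + w and y <= ty < y + h:
            return name
    return None
```

Because the in-vehicle device mirrors the terminal's screen, the same layout can be hit-tested on either side once the touch coordinates are mapped to the terminal's resolution.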
- control unit 20 can execute each application in the foreground or the background.
- when executed in the foreground, an application is the target of image display and operation input on the in-vehicle device 1 and the mobile terminal 2.
- when executed in the background, the processing of the application is still performed by the control unit 20, but the application is not the target of image display or operation input on the in-vehicle device 1 and the mobile terminal 2.
- audio may be output from an application executing in the background.
- an application called an application manager is installed in the mobile terminal 2 in advance and stored in the memory unit 24. That is, the memory unit 24 stores a plurality of applications including an application manager.
- the application manager is read from the memory unit 24 and executed by the control unit 20.
- FIG. 3 is a diagram showing a schematic configuration of software in the mobile terminal 2.
- the application manager 201 includes a sub application Ma and a sub application Ms.
- the sub application Ma has a launcher function for starting applications other than the application manager 201, and a policy file acquisition function for acquiring a policy file in which various information necessary for the cooperative operation of the in-vehicle device 1 and the mobile terminal 2 is recorded.
- the control unit 20 can use these functions by executing the sub application Ma in the foreground. For example, it is possible to call another application using the launcher function, and cause the control unit 20 to execute the application in the foreground instead of the sub application Ma.
- the policy file acquisition function can be used to acquire a data file called a policy file from an external server device or the like. The policy file records, for each application, restriction information indicating the contents of operation restrictions while the vehicle is traveling, as well as resolution information of the display unit 21, interval time information described later, and so on. The acquired policy file is stored in the memory unit 24.
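The document does not specify the policy file's format, so the JSON below is purely a hypothetical illustration of the three kinds of information mentioned (per-application driving restrictions, display resolution, and interval time), together with a lookup helper:

```python
import json

# Hypothetical policy file contents; the real format is not given in the text.
POLICY_JSON = """
{
  "resolution": {"width": 800, "height": 480},
  "interval_time_ms": 50,
  "restrictions": {
    "navigation": {"while_driving": "allow"},
    "video_player": {"while_driving": "block"}
  }
}
"""

policy = json.loads(POLICY_JSON)

def restricted_while_driving(app_name: str) -> bool:
    """Look up whether an application is restricted while the vehicle moves.
    Unknown applications are treated conservatively as restricted."""
    entry = policy["restrictions"].get(app_name, {})
    return entry.get("while_driving", "block") == "block"
```

The `interval_time_ms` field stands in for the interval time information that, per the claims above, governs the delay between transmitted operation-information packets.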
- the sub application Ms has a communication function for connecting the mobile terminal 2 to the in-vehicle device 1 and an operation restriction function for restricting the operation while the vehicle is running.
- the control unit 20 can use these functions by executing the sub application Ms in the background.
- a communication function it is possible to refer to a policy file stored in the memory unit 24 and exchange various types of communication information necessary for cooperation between the mobile terminal 2 and the in-vehicle device 1.
- using the operation restriction function, which refers to the restriction information recorded in the policy file stored in the memory unit 24, it is possible to determine the content of the operation restriction to be applied, while the vehicle is running, to the application being executed in the foreground.
- This determination result is transmitted as communication information from the portable terminal 2 to the in-vehicle device 1 by the communication function described above, and is used when the in-vehicle device 1 performs operation restriction during traveling of the vehicle.
- the application manager 201 is thus divided into the sub-application Ma, executed in the foreground by the control unit 20, and the sub-application Ms, executed in the background. This division optimizes the sharing of functions within the application manager 201, assigning to each sub-application the functions suited to foreground or background execution, respectively.
- the application manager 201 calls one of the applications included in the applications 202 by the launcher function of the sub-application Ma. The control unit 20 then executes this application in the foreground in place of the sub-application Ma. In the following description of FIG. 3, it is assumed that application A is being executed.
- the OS (Operating System) 203 is software for managing the overall operation of the mobile terminal 2.
- the OS 203 mediates information input / output between the sub application Ms executed in the background in the control unit 20 and the SPP profile 204 and the HID profile 205.
- the SPP profile 204 and the HID profile 205 are drivers used for the short-range wireless communication performed between the in-vehicle device 1 and the mobile terminal 2, and are standardized as part of the Bluetooth standard.
- the SPP profile 204 performs transmission / reception processing of communication information input / output between the mobile terminal 2 and the in-vehicle device 1 by the communication function of the sub application Ms.
- the communication information transmitted from the mobile terminal 2 to the in-vehicle device 1 includes information indicating the determination result of the operation restriction content by the operation restriction function of the sub-application Ms, as well as the resolution information of the display unit 21 and the interval time information indicated in the acquired policy file.
- the communication information transmitted from the in-vehicle device 1 to the mobile terminal 2 includes travel information corresponding to the travel state of the vehicle.
- the HID profile 205 performs processing for receiving operation information output from the in-vehicle device 1 in accordance with the user's operation content in the in-vehicle device 1.
- the contents of each information received by the SPP profile 204 and the HID profile 205 are output to the sub application Ms via the OS 203 and transferred to the application being executed by the communication function of the sub application Ms. Note that these pieces of information are transmitted and received between the short-range wireless communication interface unit 15 of the in-vehicle device 1 and the short-range wireless communication interface unit 25 of the mobile terminal 2 by wireless communication.
- when the sub-application Ma is being executed in the foreground in the control unit 20, the sub-application Ma generates an image of a menu screen for allowing the user to select an application to be executed by the launcher function described above. When application A is being executed in the foreground in the control unit 20, application A generates predetermined images and sounds, using the traveling information and operation information passed from the sub-application Ms as necessary. These images and sounds are temporarily stored in the sound/image memory 206 and then output to the HDMI driver 207.
- the HDMI driver 207 performs a process of converting an image or sound generated by the sub application Ma or application A into a video signal or an audio signal according to a method defined by HDMI.
- the video signal and audio signal are output to the in-vehicle apparatus 1 via the video / audio cable 3 by the video / audio signal output unit 26.
- the mobile terminal 2 has a software configuration as described above.
- a software configuration can be realized by using, for example, Android (registered trademark).
- the sub-application Ma is executed by a thread called “Activity” and the sub-application Ms is executed by a thread called “Service”, so that the sub-application Ma runs in the foreground in the control unit 20 while the sub-application Ms runs in the background.
- the operation when the user performs a touch operation on the in-vehicle device 1 will be described in detail.
- operation information corresponding to the touch position specified by the touch operation is transmitted from the in-vehicle device 1 to the mobile terminal 2. At this time, the in-vehicle device 1 transmits the operation information to the mobile terminal 2 using a mouse communication format called an HID packet.
- an HID packet that is a communication format for a mouse in Bluetooth is generally used when a mouse is used as an input device of the mobile terminal 2.
- the HID packet can carry, in a predetermined data format, movement amount information corresponding to the movement of the mouse and button operation information corresponding to various button operations such as a click operation, and transmit them from the mouse to the mobile terminal 2 at predetermined intervals.
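- as an illustration, a Bluetooth HID mouse report in the common boot-protocol layout carries one button bitmask byte followed by signed 8-bit relative X and Y displacements; the exact layout is defined by the HID report descriptor exchanged between the devices. The following minimal Python sketch (the function name is hypothetical) shows how movement amount information and button operation information might be packed into such a report:

```python
import struct

def pack_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack a mouse-style HID report: a button bitmask byte followed by
    signed 8-bit relative X/Y displacements, clamped to the valid range.
    (Illustrative only; the real layout depends on the report descriptor.)"""
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack("Bbb", buttons & 0xFF, clamp(dx), clamp(dy))
```

- note that the signed 8-bit fields limit each report to displacements of ±127, so a larger movement amount would have to be split across several reports.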
- upon receiving these pieces of information transmitted from the mouse via HID packets, the mobile terminal 2 performs reception processing using the HID profile 205 shown in FIG. 3, and moves the cursor on the screen according to the movement of the mouse.
- the operation content performed on the screen by the user is determined from the screen content corresponding to the cursor position at that time and the content of the button operation.
- according to the mouse operation content decoded by the OS 203 in this way, the control unit 20 activates an application corresponding to the icon designated on the menu screen, or performs processing corresponding to the operation designated on the screen of the application being executed.
- the operation information corresponding to the touch operation on the in-vehicle device 1 is transmitted from the in-vehicle device 1 to the portable terminal 2 using the HID packet as described above.
- the touch operation content can be recognized by the OS 203 as in the case of using the mouse.
- FIG. 4 is a diagram for explaining operations of the in-vehicle device 1 and the mobile terminal 2 when a touch operation is performed in the in-vehicle device 1.
- FIG. 4A shows a state before the touch operation.
- an arrow-shaped cursor 40 is displayed at the lower right end of the outer peripheral portions of the display unit 11 and the display unit 21, with the upper left end of the cursor 40, corresponding to the tip of the arrow, serving as its indication point. That is, before the touch operation, the cursor 40 on the screen indicates the reference position 41 corresponding to the lower right end of the display unit 11 in the in-vehicle device 1, and the reference position 42 corresponding to the lower right end of the display unit 21 in the mobile terminal 2. At this time, most of the cursor 40 lies outside the screen display range of the display units 11 and 21 and is not actually displayed; therefore, in FIG. 4A, the display position of the cursor 40 is indicated by a broken line.
- the mobile terminal 2 displays a screen including the cursor 40 at the reference position 42 on the display unit 21, and outputs the image information of this screen to the in-vehicle device 1 via the video/audio cable 3 using the video signal described above.
- the in-vehicle device 1 displays the same screen as that displayed on the display unit 21 of the mobile terminal 2 on the display unit 11 based on the image information.
- a screen including the cursor 40 displayed at the reference position 41 is displayed on the display unit 11.
- here, the resolution (number of pixels) of the display unit 11 in the X direction (horizontal) is denoted Xv and that in the Y direction (vertical) Yv, and the resolution of the display unit 21 in the X direction is denoted Xs and that in the Y direction Ys.
- FIG. 4B shows a state when a touch operation is performed.
- the in-vehicle device 1 calculates, as the movement amount from the reference position 41 at the lower right end pointed to by the cursor 40 before the touch operation to the touch position 43, the movement amount Px in the X direction and the movement amount Py in the Y direction.
- the movement amounts Px and Py are calculated, for example, by determining the number of pixels between the reference position 41 and the touch position 43 in the X direction and the Y direction, respectively.
- with the upper left corner of the display unit 11 as the origin, the rightward direction is defined as the positive X direction and the downward direction as the positive Y direction; accordingly, for the movement from the reference position 41 to the touch position 43, the movement amounts Px and Py are both obtained as negative values.
- after calculating the movement amount Px in the X direction and the movement amount Py in the Y direction as described above, the in-vehicle device 1 converts the movement amounts Px and Py into the movement amounts Qx and Qy on the display unit 21 of the mobile terminal 2. This conversion is performed using the following equation (1), based on the resolutions Xv, Yv and Xs, Ys described above.
- Qx = Px × (Xs / Xv)
- Qy = Py × (Ys / Yv) … (1)
- Xs / Xv and Ys / Yv are conversion coefficients for converting the movement amounts Px and Py into the movement amounts Qx and Qy, respectively.
- the in-vehicle device 1 can obtain these conversion coefficients by acquiring the resolutions Xs and Ys of the display unit 21 from the mobile terminal 2 in advance.
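- equation (1) and its conversion coefficients can be sketched as follows (a minimal Python illustration; the function name and the example resolutions are assumptions for this sketch, not values from the embodiment):

```python
def convert_movement(px: float, py: float,
                     xv: int, yv: int, xs: int, ys: int) -> tuple:
    """Convert movement amounts (px, py) on the in-vehicle display
    (resolution xv x yv) into movement amounts (qx, qy) on the mobile
    terminal's display (resolution xs x ys), per equation (1):
    Qx = Px * (Xs / Xv), Qy = Py * (Ys / Yv)."""
    return px * (xs / xv), py * (ys / yv)

# e.g. a movement of (-400, -240) on an 800x480 display maps to
# (-640.0, -360.0) on a 1280x720 display
qx, qy = convert_movement(-400, -240, 800, 480, 1280, 720)
```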
- after calculating the converted movement amounts Qx and Qy, the in-vehicle device 1 transmits movement amount information indicating these movement amounts Qx and Qy to the portable terminal 2 in an HID packet, as the first operation information corresponding to the touch operation.
- this operation information is referred to as “Move”.
- the mobile terminal 2 decodes the movement amounts Qx and Qy with the OS 203 based on this information. Then, as shown in FIG. 4B, the cursor 40 is moved from the reference position 42 at the lower right corner by the movement amounts Qx and Qy and displayed on the display unit 21, and the image information of the screen including the moved cursor 40 is output to the in-vehicle device 1 via the video/audio cable 3 using the video signal described above.
- when receiving the image information of the screen including the moved cursor 40 from the portable terminal 2, the in-vehicle device 1 displays, on the display unit 11, the same screen as that displayed on the display unit 21 of the portable terminal 2 based on the image information. As a result, as shown in FIG. 4B, a screen in which the cursor 40 indicates the touch position 43 is displayed on the display unit 11.
- after transmitting the operation information “Move”, the in-vehicle device 1 transmits to the portable terminal 2, as the second operation information corresponding to the touch operation, button operation information indicating that a predetermined button operation, for example a left click, has been performed, in an HID packet.
- this operation information is referred to as “Tap”.
- movement amount information indicating that the movement amount of the cursor 40 is 0 is also transmitted together with the button operation information.
- the mobile terminal 2 decodes the content of the button operation with the OS 203 based on this information. Then, according to the screen content displayed at the position of the moved cursor 40 designated by the operation information “Move”, that is, at the touch position 43, and the button operation content designated by the operation information “Tap”, the mobile terminal 2 performs processing such as activating the application corresponding to the icon at the touch position 43.
- when the user performs a release operation of lifting the finger from the display unit 11 and thereby ends the touch operation, the in-vehicle device 1 transmits to the portable terminal 2, as the third operation information corresponding to the touch operation, button operation information indicating that the button operation specified by the operation information “Tap” has been released, in an HID packet.
- this operation information is referred to as “Untap”.
- movement amount information indicating that the movement amount of the cursor 40 is 0 is also transmitted together with the button operation information, similarly to the operation information “Tap” described above.
- after transmitting the operation information “Untap”, the in-vehicle device 1 transmits to the portable terminal 2, as the fourth operation information corresponding to the touch operation, operation information for returning the display position of the cursor 40 from the touch position 43 to the reference position 41.
- this operation information is referred to as “Move_Reset”.
- as the operation information “Move_Reset”, the in-vehicle device 1 transmits to the mobile terminal 2 movement amount information indicating a movement amount equal to or larger than the movement from the touch position 43 to the reference position 41. For example, movement amount information indicating the movement amounts −Qx and −Qy, obtained by inverting the signs of the movement amounts Qx and Qy transmitted as the operation information “Move”, may be transmitted. Alternatively, since the cursor cannot move beyond the screen edge, movement amount information indicating an arbitrary movement amount can be transmitted as the operation information “Move_Reset”, as long as that movement amount is at least the movement amount from the touch position 43 to the reference position 41.
- when the operation information “Move_Reset” is received, the mobile terminal 2 decodes the movement amount represented by the movement amount information with the OS 203. Then, the cursor 40 is moved to the reference position 42 at the lower right corner and displayed on the display unit 21, and the image information of the screen including the moved cursor 40 is output to the in-vehicle device 1 via the video/audio cable 3 using the video signal described above.
- when receiving the image information of the screen including the moved cursor 40 from the portable terminal 2, the in-vehicle device 1 displays, on the display unit 11, the same screen as that displayed on the display unit 21 of the portable terminal 2 based on the image information. Thereby, the screen shown in FIG. 4A is displayed again on the display unit 11.
- as described above, the operation information “Move”, “Tap”, “Untap”, and “Move_Reset” is sequentially transmitted from the in-vehicle device 1 to the portable terminal 2 in accordance with the touch operation. Thereby, the portable terminal 2 can recognize the user's touch operation and perform appropriate processing.
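- the four-step sequence above can be summarized in code. In this hypothetical Python sketch (the packet representation and function name are illustrative, not the actual HID packet format), one touch operation produces the four pieces of operation information in order:

```python
def touch_sequence(qx, qy, reset_dx, reset_dy):
    """Return the four operation-information packets sent for one touch
    operation, in order: Move (cursor to touch position), Tap (button
    press, zero movement), Untap (button release, zero movement), and
    Move_Reset (cursor back toward the reference position)."""
    return [
        ("Move",       {"dx": qx,       "dy": qy,       "button": 0}),
        ("Tap",        {"dx": 0,        "dy": 0,        "button": 1}),
        ("Untap",      {"dx": 0,        "dy": 0,        "button": 0}),
        ("Move_Reset", {"dx": reset_dx, "dy": reset_dy, "button": 0}),
    ]

packets = touch_sequence(-640, -360, 640, 360)
```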
- in the mobile terminal 2, an acceleration process is executed when operation information indicating a movement amount of a certain distance or more is continuously input within a predetermined time. This acceleration process moves the cursor by a movement amount larger than the actually input movement amount, so that the cursor can be moved to a desired position with a small mouse operation.
- the touch operation as described above is realized by transmitting operation information corresponding to the touch operation from the in-vehicle device 1 to the portable terminal 2 using the HID packet. Therefore, when the acceleration process is executed in the mobile terminal 2, the cursor moves to a place different from the actual touch position, and as a result, the touch operation cannot be correctly recognized in the mobile terminal 2.
- therefore, when the user performs touch operations continuously within a short time, the in-vehicle device 1 does not immediately transmit the operation information corresponding to the second and subsequent touch operations, but transmits it after a delay of a predetermined interval time. This prevents operation information from the in-vehicle device 1 from being input to the mobile terminal 2 multiple times in succession within the interval time, and thus prevents the mobile terminal 2 from executing the acceleration process.
- however, if touch operations continue, the operation information corresponding to the second and subsequent touch operations is delayed and accumulated one after another, so that the total delay time may become enormous. It is therefore preferable to limit the number of touch operations for which transmission of operation information is delayed to a predetermined number, for example five, and to discard touch operations exceeding this limit without transmitting their operation information.
- alternatively, an upper limit may be set on the delay time, and touch operations input beyond this upper limit may be discarded without their operation information being transmitted.
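- the delay-and-discard behavior described above can be sketched as a simple scheduler. In this hypothetical Python illustration (the function name and the queue accounting are assumptions for the sketch), each transmission is pushed back so that at least one interval time separates it from the previous one, and touches that would exceed a fixed pending limit are discarded:

```python
def schedule_touches(touch_times, interval, max_pending=5):
    """Given the times at which touch operations occur, return the times
    at which their operation information is transmitted. Each transmission
    is delayed so that at least `interval` elapses since the previous one;
    touches that arrive while `max_pending` transmissions are still queued
    are discarded (returned as None)."""
    send_times = []
    last_send = None
    for t in touch_times:
        earliest = t if last_send is None else max(t, last_send + interval)
        # transmissions already scheduled but not yet sent at time t
        pending = sum(1 for s in send_times if s is not None and s > t)
        if pending >= max_pending:
            send_times.append(None)  # discard: queue limit reached
        else:
            send_times.append(earliest)
            last_send = earliest
    return send_times

# touches every 20 ms with a 100 ms interval: each send is spaced 100 ms apart
sends = schedule_touches([0, 20, 40], 100)
```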
- the in-vehicle device 1 obtains interval time information from the mobile terminal 2 in advance and sets an optimal interval time based on this information. For example, for various OS types and versions, the corresponding interval times are recorded in advance in a predetermined server device.
- the server device distributes information on the interval time corresponding to the OS 203 to the mobile terminal 2.
- the portable terminal 2 notifies the in-vehicle device 1 of the interval time based on the information distributed from the server device. By doing so, the in-vehicle device 1 can acquire the optimum interval time for the OS 203 installed in the mobile terminal 2.
- FIG. 5 is a sequence diagram showing a flow of information among the in-vehicle device 1, the portable terminal 2, and the server device related to the touch operation described above.
- when the in-vehicle device 1 and the mobile terminal 2 are connected, the in-vehicle device 1 outputs a session start request, indicated by reference numeral 50, to the application manager 201.
- in response, the mobile terminal 2 connects to the server device via a wireless communication network such as a mobile phone network or the Internet, using the wireless communication unit 27 under the sub-application Ma of the application manager 201. Then, as indicated by reference numeral 51, a policy file distribution request is made to the server device.
- at this time, the sub-application Ma notifies the server device of information necessary for specifying the policy file, for example, information on each application stored in the memory unit 24, model information of the portable terminal 2, and the type and version of the OS 203 installed in the portable terminal 2.
- when receiving the policy file distribution request from the sub-application Ma of the portable terminal 2, the server device selects a policy file suitable for the portable terminal 2 from various policy files stored in advance and, as indicated by reference numeral 52, distributes it to the mobile terminal 2.
- the server device records in advance, as policy files for various types of information terminals, interval time information for each OS type and version, resolution information for each model, restriction information for various applications, and the like.
- the server device selects a policy file most suitable for the mobile terminal 2 from these, and distributes the policy file to the mobile terminal 2 via a wireless communication network such as a telephone network or an Internet network.
- FIG. 6 is a diagram illustrating an example of information recorded in a policy file distributed to the mobile terminal 2.
- the restriction information represents the content of the operation restriction imposed, while the vehicle is traveling, on each application stored in the mobile terminal 2.
- the resolution information represents the screen resolution of the display unit 21 in the horizontal direction (X direction) and the vertical direction (Y direction); in the example of FIG. 6, the resolution in the horizontal direction is 1280 pixels.
- the interval time information represents an interval time necessary for the OS 203, and the example in FIG. 6 indicates that the interval time is 100 ms.
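- for illustration, the policy file contents of FIG. 6 might be represented by a structure like the following (a hypothetical Python sketch; the actual file format, the field names, the restriction values, and the vertical resolution shown here are all assumptions, not part of the embodiment):

```python
# hypothetical in-memory representation of a policy file (illustrative only)
policy_file = {
    "restrictions": {            # per-application restriction while driving
        "application_A": "allowed",
        "application_B": "restricted",
    },
    "resolution": {"x": 1280, "y": 720},  # display unit 21; y value illustrative
    "interval_time_ms": 100,              # interval time required by the OS 203
}

# values the sub-application Ms would forward to the in-vehicle device 1
resolution = policy_file["resolution"]
interval_ms = policy_file["interval_time_ms"]
```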
- when receiving the policy file distributed from the server device, the mobile terminal 2 stores the policy file in the memory unit 24 by means of the sub-application Ma of the application manager 201. Then, the sub-application Ms reads the policy file from the memory unit 24 and, as indicated by reference numeral 53, notifies the in-vehicle device 1 of the resolution and the interval time using the short-range wireless communication interface unit 25. Specifically, the resolution information and interval time information indicated in the received policy file are transmitted to the in-vehicle device 1 to notify it of these contents.
- when receiving the notification of the resolution and the interval time from the mobile terminal 2, the in-vehicle device 1 calculates the conversion coefficients used in equation (1) described above, based on the notified resolution. Further, based on the notified interval time, it sets the interval time used to prevent execution of the acceleration process.
- when the user then performs a touch operation, the in-vehicle device 1 calculates the converted movement amount corresponding to the touch position using the above conversion coefficients, and transmits the first operation information “Move” to the OS 203 of the mobile terminal 2. Thereafter, as indicated by reference numeral 55, the second operation information “Tap” is transmitted.
- the in-vehicle device 1 transmits the third operation information “Untap” as indicated by reference numeral 56. Thereafter, in order to return the cursor moved to the touch position to the original position, the fourth operation information “Move_Reset” is transmitted as indicated by reference numeral 57.
- when the user performs the next touch operation, the in-vehicle device 1 stands by after calculating the converted movement amount corresponding to the touch position.
- the waiting time at this time is a time from when the operation information “Move_Reset” is transmitted according to the release operation with respect to the previous touch operation until the interval time elapses.
- the in-vehicle device 1 continuously transmits the operation information “Move” and “Tap” to the OS 203 of the mobile terminal 2 as indicated by reference numerals 58 and 59, respectively.
- the in-vehicle device 1 continuously transmits operation information “Untap” and “Move_Reset” to the OS 203 as indicated by reference numerals 60 and 61, respectively. Thereafter, the same operation is repeated.
- each operation information is continuously transmitted from the in-vehicle device 1 to the portable terminal 2 in a predetermined order.
- the operation information of subsequent touch operations is transmitted with a delay, so that a transmission interval of operation information longer than the interval time is ensured between touch operations.
- the mobile terminal 2 can recognize the content of the touch operation performed by the user on the in-vehicle device 1 by decoding each operation information from the in-vehicle device 1 in the OS 203.
- FIG. 7 is a flowchart of processing related to a touch operation executed in the in-vehicle device 1. The processing shown in this flowchart is executed by the control unit 10 when communication is established between the in-vehicle device 1 and the portable terminal 2.
- in step S10, the control unit 10 receives, using the short-range wireless communication interface unit 15, the resolution information and interval time information of the display unit 21 transmitted from the mobile terminal 2.
- the resolution Xs of the display unit 21 in the X direction and the resolution Ys in the Y direction are received from the portable terminal 2 as resolution information.
- in addition, information indicating the length of the interval time necessary to prevent the OS 203 from executing the acceleration process, for example 100 ms, is received from the portable terminal 2 as interval time information.
- in step S20, the control unit 10 calculates the conversion coefficients corresponding to the resolution ratio between the display unit 11 and the display unit 21, based on the resolution information of the display unit 21 received from the mobile terminal 2 in step S10 and the resolution of the display unit 11 stored in advance in the memory unit 14.
- specifically, the resolutions Xs and Ys of the display unit 21 represented by the resolution information received in step S10 are divided by the resolution Xv of the display unit 11 in the X direction and the resolution Yv in the Y direction, respectively, to calculate the conversion coefficient Xs/Xv for the X direction and the conversion coefficient Ys/Yv for the Y direction.
- in step S30, the control unit 10 determines whether or not a touch operation has been performed by the user on the operation unit 12, which is a touch panel switch configured integrally with the display unit 11. If a touch operation has been performed, the touch operation is detected and the process proceeds to step S40.
- in step S40, the control unit 10 calculates the movement amount from the cursor display position at the time the touch operation was detected in step S30 to the touch position specified by the touch operation.
- the movement amount Px in the X direction and the movement amount Py in the Y direction are calculated as the movement amount from the reference position 41 at the lower right end to the touch position 43.
- in step S50, the control unit 10 converts the movement amount calculated in step S40 into a movement amount on the display unit 21 of the mobile terminal 2.
- here, using the X-direction conversion coefficient Xs/Xv and the Y-direction conversion coefficient Ys/Yv calculated in step S20, the control unit 10 converts the movement amount Px in the X direction and the movement amount Py in the Y direction calculated for the display unit 11 into the movement amount Qx in the X direction and the movement amount Qy in the Y direction on the display unit 21, respectively, by equation (1) described above.
- in step S60, the control unit 10 determines whether or not the interval time has elapsed since the transmission of the previous operation information “Move_Reset”.
- here, the elapsed time since the operation information “Move_Reset” was transmitted in step S120 (described later), executed in response to the previous touch operation, is compared with the interval time set based on the interval time information received in step S10. If the elapsed time is less than the interval time, the process proceeds to step S70; if the elapsed time is equal to or greater than the interval time, the process proceeds to step S80.
- in step S70, the control unit 10 stands by until the elapsed time since the transmission of the previous operation information “Move_Reset” reaches the interval time. This delays the execution of the next step S80 and hence the transmission of the operation information “Move”. When the interval time has elapsed, the process proceeds to step S80.
- in step S80, the control unit 10 transmits the operation information “Move” to the mobile terminal 2.
- the control unit 10 uses the short-range wireless communication interface unit 15 to transmit movement amount information indicating the movement amount converted in step S50 to the mobile terminal 2 as operation information “Move”.
- the movement amount information according to the distance from the display position of the cursor at the time of the touch operation to the detected touch position is transmitted from the in-vehicle device 1 to the portable terminal 2.
- the movement amount information is transmitted using an HID packet that is a communication format for a mouse.
- the movement amount information thus transmitted from the in-vehicle device 1 is received by the short-range wireless communication interface unit 25 in the portable terminal 2 and is decoded by the OS 203.
- the cursor position on the screen is moved in the portable terminal 2, and image information including the moved cursor is transmitted from the portable terminal 2 to the in-vehicle device 1 and displayed on the display unit 11.
- in step S90, the control unit 10 transmits the operation information “Tap” to the mobile terminal 2.
- here, using the short-range wireless communication interface unit 15, the control unit 10 transmits to the mobile terminal 2, as operation information “Tap”, movement amount information indicating that the movement amount is 0 in both the X and Y directions, together with button operation information indicating that a predetermined button operation, for example a left click, has been performed.
- in step S100, the control unit 10 determines whether or not a release operation has been performed by the user on the operation unit 12, which is a touch panel switch configured integrally with the display unit 11.
- when the release operation is performed, that is, when the touch operation detected in step S30 is ended, the process proceeds to step S110.
- in step S110, the control unit 10 transmits the operation information “Untap” to the mobile terminal 2.
- here, using the short-range wireless communication interface unit 15, the control unit 10 transmits to the portable terminal 2, as operation information “Untap”, movement amount information indicating that the movement amount is 0 in both the X and Y directions, together with button operation information indicating that the button operation has been released.
- in step S120, the control unit 10 transmits the operation information “Move_Reset” to the mobile terminal 2.
- here, using the short-range wireless communication interface unit 15, the control unit 10 transmits to the portable terminal 2, as operation information “Move_Reset”, movement amount information set in advance in accordance with the movement amount to the reference position.
- after step S120, the control unit 10 returns to step S30 and repeats the above processing according to touch operations from the user.
- as described above, the in-vehicle device 1 receives image information of a screen including the cursor 40 from the mobile terminal 2 via the video/audio signal input unit 16, and displays the screen on the display unit 11, which is a touch panel, based on that image information. The touch position designated by the user's touch operation is detected by the operation unit 12, which constitutes the touch panel together with the display unit 11, and under the processing of the control unit 10 the short-range wireless communication interface unit 15 transmits operation information corresponding to the touch operation to the portable terminal 2 (steps S80 to S120). If touch operations are performed continuously, the operation information is transmitted with a delay by waiting until the interval time elapses (step S70). This prevents the acceleration process from being executed in the portable terminal 2. Therefore, when the display screen of the mobile terminal 2, which is operated via a touch panel, is displayed on the in-vehicle device 1, the mobile terminal 2 can be operated from the in-vehicle device 1.
- When the touch position 43 is designated by the user's touch operation, the control unit 10 transmits operation information "Move" for moving the cursor 40 from the reference position 41 to the touch position 43 in step S80. When the designation of the touch position 43 is subsequently released, operation information "Move_Reset" for returning the cursor 40 from the touch position 43 to the reference position 41 is transmitted in step S120.
- If the next touch operation is performed within the interval time, the control unit 10 waits in step S70 until the interval time elapses and then executes step S80, so that the next operation information "Move" is transmitted with a delay. This reliably prevents acceleration processing from being performed in the portable terminal 2 when touch operations are performed consecutively within the interval time.
- In step S80, the control unit 10 transmits movement amount information corresponding to the distance from the reference position 41 to the touch position 43 as the operation information "Move", and in step S120 transmits predetermined movement amount information corresponding to the reference position 41 as the operation information "Move_Reset". In this way, the display position of the cursor 40 can always be returned to the original reference position 41.
- When the touch position 43 is designated, the control unit 10 transmits the operation information "Move" in step S80 and then, in step S90, transmits operation information "Tap" indicating a predetermined button operation. If the designation of the touch position 43 is released after the operation information "Tap" has been transmitted, operation information "Untap" indicating release of the button operation is transmitted in step S110, followed by the operation information "Move_Reset" in step S120. This allows the portable terminal 2 to reliably recognize the content of the touch operation the user performed on the in-vehicle device 1.
- In the operation information "Tap" and "Untap", the control unit 10 sets the movement amount of the cursor 40 to 0. As a result, the portable terminal 2 can reliably select the icon or the like corresponding to the touch position 43 designated by the user's touch operation, and execute appropriate processing according to the selection result.
- Through the processing of the control unit 10, the in-vehicle device 1 acquires interval time information about the interval time from the mobile terminal 2 using the short-range wireless communication interface unit 15 (step S10). Based on this interval time information, the control unit 10 performs the determination in step S60 to decide whether to execute step S70, that is, whether the operation information "Move" is transmitted with a delay in the following step S80. In this way, an interval time appropriate for the OS 203 installed in the portable terminal 2 can be set, and it can be determined whether the operation information should be transmitted with a delay.
- In step S10, the interval time information is transmitted from the portable terminal 2 based on a policy file distributed to the portable terminal 2 from a predetermined server device. Therefore, the optimal interval time according to the type and version of the OS 203 installed in the portable terminal 2 can be notified from the portable terminal 2 to the in-vehicle device 1 and set in the in-vehicle device 1.
- The control unit 10 causes the short-range wireless communication interface unit 15 to transmit each piece of operation information using an HID packet, which is a communication format for a mouse in Bluetooth. In this way, the transmission of movement amount information can be implemented using an existing communication format.
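For illustration, the three-byte boot-protocol mouse input report that such a mouse-format HID packet carries can be packed as below. The helper name is invented for this sketch, and real Bluetooth HID transmission wraps additional transport framing around this payload.

```python
import struct

def hid_mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    """Pack one boot-protocol mouse input report:
    byte 0 = button bitmask (bit 0 = left button),
    bytes 1-2 = signed X and Y displacement, clamped to the
    report's representable range of -127..127."""
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack("<Bbb", buttons & 0x07, clamp(dx), clamp(dy))
```

A "Tap" with zero movement would then be `hid_mouse_report(0x01, 0, 0)` and the matching "Untap" `hid_mouse_report(0x00, 0, 0)`.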
- FIG. 8 is a diagram for explaining operations of the in-vehicle device 1 and the portable terminal 2 when a flick operation is performed in the in-vehicle device 1.
- a flick operation for moving the touch position by quickly moving the finger in the direction of the arrow 70 is performed.
- When the flick operation is started, the in-vehicle device 1 first transmits operation information "Move" and "Tap" according to the touch position 43 designated first, as in the first embodiment described above.
- Next, the in-vehicle device 1 detects the next touch position 44, designated a predetermined time later along the direction of the arrow 70 from the first touch position 43, and calculates the movement amount Px1 in the X direction and the movement amount Py1 in the Y direction as the movement from the touch position 43 to the touch position 44.
- Using equation (1) described above, the in-vehicle device 1 converts the movement amounts Px1 and Py1 into the movement amounts Qx1 and Qy1 on the display unit 21 of the mobile terminal 2, respectively.
- Then, button operation information identical to that of the operation information "Tap", together with movement amount information indicating the converted movement amounts Qx1 and Qy1, is transmitted to the portable terminal 2 in an HID packet. In the following description, this operation information is referred to as "Flick".
- the in-vehicle device 1 executes the same processing as described above every predetermined time.
- That is, movement amount information indicating the converted movement amounts Qx2 and Qy2 corresponding to the movement amounts Px2 and Py2 from the touch position 44 to the touch position 45, and movement amount information indicating the converted movement amounts Qx3 and Qy3 corresponding to the movement amounts Px3 and Py3 from the touch position 45 to the touch position 46, are each transmitted, together with the button operation information, from the in-vehicle device 1 to the portable terminal 2 every predetermined time as the operation information "Flick".
- As described above, the in-vehicle device 1 detects the touch positions 43 to 46 every predetermined time from the touch positions continuously designated along the direction of the arrow 70 during the flick operation, calculates the X-direction movement amounts Px1 to Px3 and the Y-direction movement amounts Py1 to Py3 between those touch positions, and calculates the corresponding converted X-direction movement amounts Qx1 to Qx3 and Y-direction movement amounts Qy1 to Qy3.
- the operation information “Flick” including the movement amount information and the button operation information is transmitted to the mobile terminal 2 every predetermined time.
- In this way, for the positions 73 to 76 on the display unit 21 corresponding to the touch positions 43 to 46 designated on the display unit 11 of the in-vehicle device 1 during the flick operation, the mobile terminal 2 can acquire the movement amounts Qx1 to Qx3 in the X direction and Qy1 to Qy3 in the Y direction between those positions.
- the detection result of the touch operation performed on the display unit 11 is output from the operation unit 12 constituting the touch panel together with the display unit 11 to the control unit 10 at every predetermined output cycle.
- The output cycle of touch operation detection results from the operation unit 12 is generally shorter than the transmission cycle of the HID packets used to transmit the operation information "Flick". Therefore, the control unit 10 preferably extracts, from the detection results input from the operation unit 12 at every output cycle, those corresponding to the transmission cycle of the HID packets, and generates and transmits the operation information "Flick" based on them.
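One illustrative way to perform that extraction is to thin the panel's samples down to roughly one per HID transmission cycle. The simple stride policy below is an assumption made for the sketch; the patent only states that detection results matching the HID cycle are extracted.

```python
def downsample_touch_samples(samples, output_period_ms, hid_period_ms):
    """Keep roughly one touch sample per HID transmission cycle: the
    panel reports positions every output_period_ms, but "Flick" can
    only be sent every hid_period_ms, so intermediate samples are
    dropped by taking every stride-th sample."""
    stride = max(1, hid_period_ms // output_period_ms)
    return samples[::stride]
```

For example, with a 5 ms panel output cycle and a 20 ms HID cycle, every fourth sample is kept.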
- a part of the function of the operation unit 12 may be realized by driver software executed by the control unit 10.
- Upon receiving the operation information "Flick" from the in-vehicle device 1, the mobile terminal 2 interprets the content of the flick operation with the OS 203 based on it. Then, according to the flick operation content specified by the operation information "Flick", it performs, for example, switching of the menu screen display or processing corresponding to the application being executed.
- When the flick operation ends, the in-vehicle device 1 transmits the operation information "Untap" and "Move_Reset" to the portable terminal 2 in HID packets, as in the first embodiment.
- As described above, the operation information "Move", "Tap", "Flick", "Untap", and "Move_Reset" corresponding to the flick operation is sequentially transmitted from the in-vehicle device 1 to the portable terminal 2. The portable terminal 2 can thereby recognize the user's flick operation and perform appropriate processing.
- FIG. 9 is a sequence diagram showing a flow of information among the in-vehicle device 1, the mobile terminal 2, and the server device regarding the flick operation.
- When the in-vehicle device 1 and the mobile terminal 2 are connected, the in-vehicle device 1, the mobile terminal 2, and the server device perform the operations indicated by reference numerals 50 to 53, respectively, as in the sequence diagram of FIG. 5 in the first embodiment.
- That is, when the in-vehicle device 1 issues a session start request, the portable terminal 2 requests the server device to distribute a policy file, and the server device selects a policy file suitable for the portable terminal 2 and distributes it to the portable terminal 2.
- Based on the policy file, the portable terminal 2 notifies the in-vehicle device 1 of the resolution and the interval time. Based on the notified resolution and interval time, the in-vehicle device 1 calculates the conversion coefficients in equation (1) described above and sets the interval time.
- When the user performs a flick operation on the in-vehicle device 1, the in-vehicle device 1 calculates the converted movement amount according to the touch position at the start of the flick operation, as in the case of the touch operation described in the first embodiment, and sequentially transmits operation information "Move" and "Tap" as indicated by reference numerals 54 and 55. Thereafter, as indicated by reference numeral 80, the in-vehicle device 1 transmits operation information "Flick" corresponding to the movement amount of the touch position in the flick operation to the portable terminal 2 at predetermined time intervals. Transmission of the operation information "Flick" continues while the flick operation is being input.
- When the flick operation ends, the in-vehicle device 1 sequentially transmits operation information "Untap" and "Move_Reset", as indicated by reference numerals 56 and 57, as in the first embodiment.
- After transmitting the operation information "Move_Reset", the in-vehicle device 1 waits until the interval time elapses, as in the first embodiment. When the interval time has elapsed, the in-vehicle device 1 repeats the same operation as described above.
- the portable terminal 2 can recognize the flick operation performed by the user on the in-vehicle device 1 by decoding the operation information “Flick” from the in-vehicle device 1 in the OS 203.
- FIG. 10 is a flowchart of processing related to a flick operation executed in the in-vehicle device 1.
- the processing shown in this flowchart is executed by the control unit 10 when communication is established between the in-vehicle device 1 and the portable terminal 2.
- In FIG. 10, the processing steps that are the same as those in the flowchart of FIG. 7 described in the first embodiment are assigned the same step numbers as in FIG. 7. In the following, description of the processing with the same step numbers as in FIG. 7 is omitted unless particularly necessary.
- In step S91, the control unit 10 determines whether or not a flick operation has been performed by the user on the operation unit 12, which is a touch panel switch configured integrally with the display unit 11.
- the presence or absence of the flick operation is determined by determining whether or not the touch position has changed from the position when the touch operation is detected in step S30.
- When a flick operation is performed, the touch position changes accordingly. Therefore, if a change in the touch position is detected, it is determined that a flick operation has been performed and the process proceeds to the next step S92; if no change is detected, it is determined that no flick operation has been performed and the process proceeds to step S100.
- In step S92, the control unit 10 calculates the movement amount from the previous touch position.
- Here, the touch position that changes continuously during the flick operation is detected every predetermined time, and the movement amounts in the X direction and the Y direction from the previously detected touch position to the current touch position are calculated.
- In step S93, the control unit 10 converts the movement amount calculated in step S92 into a movement amount on the display unit 21 of the mobile terminal 2.
- Here, using the X-direction conversion coefficient Xs/Xv and the Y-direction conversion coefficient Ys/Yv calculated in step S20, the movement amounts in the X direction and the Y direction are converted, respectively.
- In step S94, the control unit 10 transmits the operation information "Flick" to the mobile terminal 2.
- Specifically, the control unit 10 uses the short-range wireless communication interface unit 15 to transmit, to the portable terminal 2, movement amount information indicating the movement amount converted in step S93 and button operation information indicating that a predetermined button operation such as a left-click has been performed, as the operation information "Flick".
- After step S94, the control unit 10 proceeds to step S100 and determines whether or not a release operation has been performed by the user. If a release operation has been performed, that is, if the flick operation detected in step S91 has ended, the process proceeds to step S110; otherwise, the process returns to step S91 and the above processing is repeated.
- As described above, the processing of steps S92 to S94 is repeatedly executed at predetermined time intervals from when the flick operation is detected in step S91 until the release operation is detected in step S100. Accordingly, as described with reference to FIG. 8, the touch positions 43 to 46 in the flick operation are detected every predetermined time, and the X-direction movement amounts Px1 to Px3 and the Y-direction movement amounts Py1 to Py3 between the touch positions are calculated. The corresponding converted X-direction movement amounts Qx1 to Qx3 and Y-direction movement amounts Qy1 to Qy3 are then calculated, and movement amount information corresponding to the converted movement amounts is transmitted, together with the button operation information, from the in-vehicle device 1 to the portable terminal 2 every predetermined time as the operation information "Flick".
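The per-interval computation of steps S92 to S94 can be sketched as follows: for each successive pair of sampled touch positions, compute the movement since the previous sample, convert it with the coefficients of equation (1), and emit one "Flick" record. The function name, the tuple records, and the sample coordinates in the test are illustrative assumptions.

```python
def flick_operations(touch_positions, coeff):
    """For each successive pair of sampled touch positions, compute the
    X/Y movement on the in-vehicle panel since the previous sample,
    convert it with the (Xs/Xv, Ys/Yv) coefficients of equation (1),
    and emit one "Flick" record carrying the converted movement."""
    cx, cy = coeff
    ops = []
    for (x0, y0), (x1, y1) in zip(touch_positions, touch_positions[1:]):
        px, py = x1 - x0, y1 - y0  # movement on the in-vehicle panel
        ops.append(("Flick", px * cx, py * cy))
    return ops
```

In practice each record would be sent as it is computed, at the HID transmission interval, rather than collected into a list.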
- When a flick operation is performed by moving the touch position during the user's touch operation, the control unit 10 transmits the operation information "Move" in step S80 and the operation information "Tap" in step S90, and then, in step S94, transmits operation information "Flick" including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position. This allows the portable terminal 2 to reliably recognize the content of the flick operation the user performed on the in-vehicle device 1.
- a video signal and an audio signal are transmitted from the mobile terminal 2 to the in-vehicle device 1 by connecting the in-vehicle device 1 and the mobile terminal 2 to each other via the video / audio cable 3.
- the present invention can also be realized using other communication methods and signal transmission methods.
- a video signal or an audio signal from the mobile terminal 2 to the in-vehicle device 1 may be transmitted by wireless communication. Communication between the in-vehicle device 1 and the mobile terminal 2 can also be performed using wired communication such as USB.
- In this case, operation information may be transmitted from the in-vehicle device 1 to the portable terminal 2 using a USB communication format for a mouse instead of the Bluetooth HID packet described above.
- Any communication method may be adopted as long as necessary signals and information can be transmitted and received between the in-vehicle device 1 and the portable terminal 2.
- in-vehicle device 1 may acquire various vehicle information output from the vehicle, in addition to the vehicle speed signal and the parking signal.
- the vehicle information acquired at this time may be used in a process executed by the in-vehicle device 1 or may be output from the in-vehicle device 1 to the mobile terminal 2 and used in a process executed by the mobile terminal 2.
- For example, activation conditions corresponding to vehicle information may be set in advance for each application, and when vehicle information satisfying the activation conditions is output from the vehicle, the application can be automatically activated on the mobile terminal 2.
- In this case, information indicating the activation condition of each application may be transmitted from the mobile terminal 2 to the in-vehicle device 1, and the in-vehicle device 1 may determine whether the activation condition is satisfied based on the vehicle information.
- the vehicle information may be transmitted from the in-vehicle device 1 to the portable terminal 2 and the portable terminal 2 may determine whether or not the activation condition is satisfied based on the vehicle information.
- In the embodiments described above, the case has been described where the display unit 11 of the in-vehicle device 1 is a touch panel and an image received from the mobile terminal 2 is displayed on the display unit 11.
- the scope of application of the present invention is not limited to this.
- the present invention can be applied to any display device as long as it is connected to an information terminal and displays an image received from the information terminal on a touch panel.
- 1 vehicle-mounted device
- 2 portable terminal
- 3 video / audio cable
- 10 control unit
- 11 display unit
- 12 operation unit
- 13 audio output unit
- 14 memory unit
- 15 short-range wireless communication interface unit
- 16 video / audio signal input unit
- 20 control unit
- 21 display unit
- 22 operation unit
- 23 audio output unit
- 24 memory unit
- 25 short-range wireless communication interface unit
- 26 video / audio signal output unit
- 27 wireless communication unit
- 28 GPS receiving unit
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Navigation (AREA)
- Telephone Function (AREA)
Abstract
Description
According to a second aspect of the present invention, in the display device of the first aspect, it is preferable that when a touch position is designated by a touch operation, the transmission unit transmits first operation information for moving the cursor from a predetermined reference position to the touch position; when the designation of the touch position is released after the first operation information has been transmitted, the transmission unit transmits second operation information for returning the cursor from the touch position to the reference position; and when the next touch operation is performed within a predetermined interval time after the second operation information has been transmitted, the transmission unit transmits the next first operation information with a delay.
According to a third aspect of the present invention, in the display device of the second aspect, it is preferable that the operation information includes movement amount information indicating the movement amount of the cursor, and that the transmission unit transmits movement amount information corresponding to the distance from the reference position to the touch position as the first operation information, and transmits predetermined movement amount information corresponding to the reference position as the second operation information.
According to a fourth aspect of the present invention, in the display device of the second or third aspect, it is preferable that when the touch position is designated by the touch operation, the transmission unit transmits third operation information indicating a predetermined button operation after transmitting the first operation information; and when the designation of the touch position is released after the third operation information has been transmitted, the transmission unit transmits fourth operation information indicating release of the button operation and then transmits the second operation information.
According to a fifth aspect of the present invention, in the display device of the fourth aspect, it is preferable that the transmission unit sets the movement amount of the cursor to 0 in the third operation information and the fourth operation information.
According to a sixth aspect of the present invention, in the display device of the second or third aspect, when the touch position is moved during the touch operation, the transmission unit may transmit, after transmitting the first operation information, operation information including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position.
According to a seventh aspect of the present invention, in the display device of the fourth or fifth aspect, when the touch position is moved during the touch operation, the transmission unit may transmit, after transmitting the first operation information and the third operation information, operation information including button operation information indicating the button operation and movement amount information corresponding to the movement amount of the touch position.
According to an eighth aspect of the present invention, the display device of any one of the second to seventh aspects may further include an interval time information acquisition unit that acquires interval time information about the interval time from the information terminal. In this display device, it is preferable that the transmission unit determines, based on the interval time information acquired by the interval time information acquisition unit, whether to transmit the operation information with a delay.
According to a ninth aspect of the present invention, in the display device of the eighth aspect, it is preferable that the interval time information is transmitted from the information terminal based on a policy file distributed from a predetermined server device to the information terminal.
According to a tenth aspect of the present invention, in the display device of any one of the first to ninth aspects, the transmission unit may transmit the operation information using a communication format for a mouse in Bluetooth or USB.
According to an eleventh aspect of the present invention, a display device connected to an information terminal includes an image receiving unit that receives image information of a screen including a cursor from the information terminal; a touch panel that displays the screen based on the image information received by the image receiving unit and detects a touch position designated by a touch operation; and a transmission unit that transmits operation information corresponding to the touch operation to the information terminal. When the touch position is moved during the touch operation, the transmission unit transmits operation information including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position.
FIG. 1 is a diagram showing the configuration of an in-vehicle information system according to an embodiment of the present invention. The in-vehicle information system shown in FIG. 1 is mounted and used in a vehicle, and is realized by connecting the in-vehicle device 1 and the portable terminal 2 to each other through short-range wireless communication and through wired communication via the video/audio cable 3. The in-vehicle device 1 is fixed in the vehicle, for example installed in the instrument panel of the vehicle. The portable terminal 2 is a portable information terminal that the user can carry, such as a mobile phone or a smartphone. For the short-range wireless communication between the in-vehicle device 1 and the portable terminal 2, Bluetooth (registered trademark), for example, can be used. For the wired communication via the video/audio cable 3, HDMI (registered trademark), for example, can be used.
Qx = Px × (Xs/Xv), Qy = Py × (Ys/Yv) … (1)
Next, a second embodiment of the present invention will be described. In the first embodiment described above, an example was explained in which, when the user performs a touch operation designating an arbitrary position on the screen of the in-vehicle device 1, operation information corresponding to that touch operation is transmitted from the in-vehicle device 1 to the portable terminal 2 in HID packets, so that the portable terminal 2 is operated from the in-vehicle device 1. In contrast, the second embodiment further describes an example of a so-called flick operation, a touch operation in which the touch position is moved quickly during the touch operation, as if flicking it in a predetermined direction.
Japanese Patent Application No. 2013-50791 (filed March 13, 2013)
Claims (11)
- 1. A display device connected to an information terminal, comprising:
an image receiving unit that receives image information of a screen including a cursor from the information terminal;
a touch panel that displays the screen based on the image information received by the image receiving unit and detects a touch position designated by a touch operation; and
a transmission unit that transmits operation information corresponding to the touch operation to the information terminal,
wherein the transmission unit transmits the operation information with a delay when touch operations are performed consecutively.
- 2. The display device according to claim 1, wherein the transmission unit:
transmits, when the touch position is designated by the touch operation, first operation information for moving the cursor from a predetermined reference position to the touch position;
transmits, when the designation of the touch position is released after the first operation information has been transmitted, second operation information for returning the cursor from the touch position to the reference position; and
transmits the next first operation information with a delay when the next touch operation is performed within a predetermined interval time after the second operation information has been transmitted.
- 3. The display device according to claim 2, wherein the operation information includes movement amount information indicating a movement amount of the cursor, and
the transmission unit transmits movement amount information corresponding to the distance from the reference position to the touch position as the first operation information, and transmits predetermined movement amount information corresponding to the reference position as the second operation information.
- 4. The display device according to claim 2 or 3, wherein the transmission unit:
transmits, when the touch position is designated by the touch operation, third operation information indicating a predetermined button operation after transmitting the first operation information; and
transmits, when the designation of the touch position is released after the third operation information has been transmitted, fourth operation information indicating release of the button operation and then transmits the second operation information.
- 5. The display device according to claim 4, wherein the transmission unit sets the movement amount of the cursor to 0 in the third operation information and the fourth operation information.
- 6. The display device according to claim 2 or 3, wherein, when the touch position is moved during the touch operation, the transmission unit transmits, after transmitting the first operation information, operation information including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position.
- 7. The display device according to claim 4 or 5, wherein, when the touch position is moved during the touch operation, the transmission unit transmits, after transmitting the first operation information and the third operation information, operation information including button operation information indicating the button operation and movement amount information corresponding to the movement amount of the touch position.
- 8. The display device according to any one of claims 2 to 7, further comprising an interval time information acquisition unit that acquires interval time information about the interval time from the information terminal,
wherein the transmission unit determines, based on the interval time information acquired by the interval time information acquisition unit, whether to transmit the operation information with a delay.
- 9. The display device according to claim 8, wherein the interval time information is transmitted from the information terminal based on a policy file distributed from a predetermined server device to the information terminal.
- 10. The display device according to any one of claims 1 to 9, wherein the transmission unit transmits the operation information using a communication format for a mouse in Bluetooth or USB.
- 11. A display device connected to an information terminal, comprising:
an image receiving unit that receives image information of a screen including a cursor from the information terminal;
a touch panel that displays the screen based on the image information received by the image receiving unit and detects a touch position designated by a touch operation; and
a transmission unit that transmits operation information corresponding to the touch operation to the information terminal,
wherein the transmission unit transmits, when the touch position is moved during the touch operation, operation information including button operation information indicating a predetermined button operation and movement amount information corresponding to the movement amount of the touch position.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/774,210 US9846506B2 (en) | 2013-03-13 | 2014-03-04 | Display device |
CN201480014777.1A CN105190522B (zh) | 2013-03-13 | 2014-03-04 | 显示装置 |
EP14765255.6A EP2975829B1 (en) | 2013-03-13 | 2014-03-04 | Display apparatus |
JP2015505413A JP6317326B2 (ja) | 2013-03-13 | 2014-03-04 | 表示装置 |
US15/813,624 US20180136779A1 (en) | 2013-03-13 | 2017-11-15 | Display Device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-050791 | 2013-03-13 | ||
JP2013050791 | 2013-03-13 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/774,210 A-371-Of-International US9846506B2 (en) | 2013-03-13 | 2014-03-04 | Display device |
US15/813,624 Continuation US20180136779A1 (en) | 2013-03-13 | 2017-11-15 | Display Device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014141954A1 true WO2014141954A1 (ja) | 2014-09-18 |
Family
ID=51536620
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/055485 WO2014141954A1 (ja) | 2013-03-13 | 2014-03-04 | 表示装置 |
Country Status (5)
Country | Link |
---|---|
US (2) | US9846506B2 (ja) |
EP (1) | EP2975829B1 (ja) |
JP (2) | JP6317326B2 (ja) |
CN (2) | CN105190522B (ja) |
WO (1) | WO2014141954A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190007356A (ko) * | 2017-07-12 | 2019-01-22 | 미래나노텍(주) | Bidirectional touch screen device and information display method using the same |
KR20210042862A (ko) * | 2020-05-29 | 2021-04-20 | Beijing Baidu Netcom Science Technology Co., Ltd. | Control method and apparatus for a smart rearview mirror, electronic device, and storage medium |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104679469B (zh) * | 2014-12-29 | 2019-02-26 | 合肥杰发科技有限公司 | 车载终端及其获取手持终端的屏幕分辨率的方法 |
JP6724463B2 (ja) * | 2016-03-24 | 2020-07-15 | 株式会社Ihi | 電子機器、対象システムの操作方法、および操作プログラム |
US10476963B2 (en) * | 2016-05-09 | 2019-11-12 | Honda Motor Co., Ltd. | System and method for contacting vehicle for tandem parking |
US9817511B1 (en) * | 2016-09-16 | 2017-11-14 | International Business Machines Corporation | Reaching any touch screen portion with one hand |
WO2018058362A1 (zh) * | 2016-09-28 | 2018-04-05 | 北京小米移动软件有限公司 | 显示方法、装置及行车记录仪 |
CN107747949A (zh) * | 2017-09-28 | 2018-03-02 | 惠州Tcl移动通信有限公司 | 导航时车载终端画面投射的方法、移动终端及存储介质 |
IL259518B2 (en) | 2018-05-22 | 2023-04-01 | Lumus Ltd | Optical system and method for improving light field uniformity |
GB2577067B (en) * | 2018-09-12 | 2021-01-13 | Avecto Ltd | Controlling applications by an application control system in a computer device |
JP6778735B2 (ja) * | 2018-12-26 | 2020-11-04 | 本田技研工業株式会社 | 表示装置、表示方法、およびプログラム |
CN112463086A (zh) * | 2019-09-06 | 2021-03-09 | 华为技术有限公司 | 一种显示控制方法及电子设备 |
CN111045590B (zh) * | 2019-12-04 | 2021-06-04 | 广州小鹏汽车科技有限公司 | 一种车载模拟按键的方法、系统、存储介质及车辆 |
CN112799557B (zh) * | 2021-01-28 | 2022-03-22 | 青岛海信移动通信技术股份有限公司 | 一种水墨屏显示控制方法、终端及计算机可读存储介质 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003036141A (ja) * | 2001-07-25 | 2003-02-07 | Sony Corp | 入力装置 |
JP2003244343A (ja) | 2002-02-21 | 2003-08-29 | Toyota Motor Corp | 表示装置、携帯端末及び情報表示システム |
JP2010108212A (ja) * | 2008-10-30 | 2010-05-13 | Kyocera Corp | コンテンツ処理システム、端末装置及びコンテンツ処理方法 |
WO2012065020A1 (en) * | 2010-11-12 | 2012-05-18 | Apple Inc. | Device, method, and graphical user interface for navigating a list of identifiers |
US20120144323A1 (en) * | 2010-10-01 | 2012-06-07 | Imerj, Llc | Desktop Reveal By Moving a Logical Display Stack With Gestures |
JP2012257156A (ja) * | 2011-06-10 | 2012-12-27 | Kyocera Corp | 通信端末 |
WO2013183765A1 (ja) * | 2012-06-08 | 2013-12-12 | クラリオン株式会社 | 表示装置 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3841661B2 (ja) * | 2001-10-09 | 2006-11-01 | 富士通テン株式会社 | Navigation system, navigation device, and mobile phone |
JP4765729B2 (ja) * | 2006-03-31 | 2011-09-07 | カシオ計算機株式会社 | Client device of a computer system and control program therefor, and server device and control program therefor |
JP5623287B2 (ja) * | 2007-12-05 | 2014-11-12 | Johnson Controls Technology Company | Vehicle user interface system and method |
JP2010086216A (ja) * | 2008-09-30 | 2010-04-15 | Fujitsu Ten Ltd | Information processing device, method, program, and information distribution system |
JP2010127781A (ja) * | 2008-11-27 | 2010-06-10 | Fujitsu Ten Ltd | In-vehicle device and in-vehicle system having the same |
JP2010130553A (ja) * | 2008-11-28 | 2010-06-10 | Fujitsu Ten Ltd | In-vehicle device |
JP5247388B2 (ja) * | 2008-12-01 | 2013-07-24 | 富士通テン株式会社 | In-vehicle system and operation control method for in-vehicle system |
JP2010130669A (ja) * | 2008-12-01 | 2010-06-10 | Fujitsu Ten Ltd | In-vehicle device and wireless communication system |
US9258402B2 (en) * | 2009-04-14 | 2016-02-09 | Qualcomm Incorporated | System and method for controlling mobile devices |
JP5442326B2 (ja) * | 2009-06-15 | 2014-03-12 | アルパイン株式会社 | In-vehicle communication device |
JP2011033460A (ja) * | 2009-07-31 | 2011-02-17 | Fujitsu Ten Ltd | Navigation system, in-vehicle device, navigation method, and program |
US8860676B2 (en) * | 2010-01-26 | 2014-10-14 | Panasonic Intellectual Property Corporation Of America | Display control device, method, program, and integrated circuit |
US8265928B2 (en) * | 2010-04-14 | 2012-09-11 | Google Inc. | Geotagged environmental audio for enhanced speech recognition accuracy |
JP5494318B2 (ja) * | 2010-07-20 | 2014-05-14 | 株式会社デンソー | Portable terminal and communication system |
JP5628612B2 (ja) * | 2010-09-17 | 2014-11-19 | クラリオン株式会社 | In-vehicle information system, in-vehicle device, and information terminal |
JP5613005B2 (ja) * | 2010-10-18 | 2014-10-22 | オリンパスイメージング株式会社 | Camera |
JP5633341B2 (ja) * | 2010-11-30 | 2014-12-03 | カシオ計算機株式会社 | Client device, server device, and program for a server-based computing system |
JP5429198B2 (ja) * | 2011-01-12 | 2014-02-26 | コニカミノルタ株式会社 | Image processing device, image forming system, and control program |
JP5472256B2 (ja) * | 2011-05-06 | 2014-04-16 | 株式会社デンソー | Vehicle display device and information display system |
KR20130053185A (ko) * | 2011-11-15 | 2013-05-23 | 삼성전자주식회사 | Method and system for mutual control between electronic devices |
EP2808773A4 (en) * | 2012-01-26 | 2015-12-16 | Panasonic Corp | MOBILE TERMINAL, TELEPHONE RECEIVER AND DEVICE CONNECTING METHOD |
JP6032286B2 (ja) * | 2012-08-27 | 2016-11-24 | 日本電気株式会社 | In-vehicle device, portable terminal control method, and program |
CN104737104B (zh) * | 2012-10-19 | 2017-12-19 | 三菱电机株式会社 | Information processing device, information terminal, information processing system, and calibration method |
US9430067B2 (en) * | 2013-01-11 | 2016-08-30 | Sony Corporation | Device and method for touch detection on a display panel |
CN103631866B (zh) * | 2013-11-01 | 2017-01-18 | 北京奇虎科技有限公司 | Web page display method and browser |
-
2014
- 2014-03-04 WO PCT/JP2014/055485 patent/WO2014141954A1/ja active Application Filing
- 2014-03-04 US US14/774,210 patent/US9846506B2/en active Active
- 2014-03-04 CN CN201480014777.1A patent/CN105190522B/zh active Active
- 2014-03-04 CN CN201811110778.2A patent/CN109240633B/zh active Active
- 2014-03-04 JP JP2015505413A patent/JP6317326B2/ja active Active
- 2014-03-04 EP EP14765255.6A patent/EP2975829B1/en active Active
-
2017
- 2017-11-15 US US15/813,624 patent/US20180136779A1/en not_active Abandoned
-
2018
- 2018-03-29 JP JP2018063928A patent/JP6559825B2/ja active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190007356A (ko) * | 2017-07-12 | 2019-01-22 | 미래나노텍(주) | Bidirectional touch screen device and information display method using the same |
KR102372803B1 (ko) * | 2017-07-12 | 2022-03-10 | 미래나노텍(주) | Bidirectional touch screen device and information display method using the same |
KR20210042862A (ko) * | 2020-05-29 | 2021-04-20 | Beijing Baidu Netcom Science Technology Co., Ltd. | Control method and apparatus for a smart rearview mirror, electronic device, and storage medium |
JP2021108187A (ja) | 2020-05-29 | 2021-07-29 | Beijing Baidu Netcom Science Technology Co., Ltd. | Method, apparatus, electronic device, and storage medium for controlling a smart rearview mirror |
KR102501293B1 (ko) | 2020-05-29 | 2023-02-16 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Control method and apparatus for a smart rearview mirror, electronic device, and storage medium |
JP7230093B2 (ja) | 2020-05-29 | 2023-02-28 | 阿波▲羅▼智▲聯▼(北京)科技有限公司 | Method, apparatus, electronic device, and storage medium for controlling a smart rearview mirror |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014141954A1 (ja) | 2017-02-16 |
EP2975829B1 (en) | 2022-02-09 |
US20160018943A1 (en) | 2016-01-21 |
CN109240633B (zh) | 2021-10-22 |
US9846506B2 (en) | 2017-12-19 |
EP2975829A4 (en) | 2016-12-28 |
EP2975829A1 (en) | 2016-01-20 |
JP6317326B2 (ja) | 2018-04-25 |
US20180136779A1 (en) | 2018-05-17 |
CN105190522B (zh) | 2018-10-23 |
JP2018110039A (ja) | 2018-07-12 |
JP6559825B2 (ja) | 2019-08-14 |
CN105190522A (zh) | 2015-12-23 |
CN109240633A (zh) | 2019-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6559825B2 (ja) | Display device and information terminal operation method | |
JP6124169B2 (ja) | Display device | |
JP6103620B2 (ja) | In-vehicle information system, information terminal, application execution method, and program | |
JP6058654B2 (ja) | In-vehicle information system, information terminal, and application execution method | |
JP6074150B2 (ja) | In-vehicle information system, information terminal, application execution method, and program | |
KR20170099903A (ko) | Techniques for scaling a digital personal assistant agent across devices | |
US20130229377A1 (en) | Accessory protocol for touch screen device accessibility | |
CN109582893A (zh) | Page display position jump method, apparatus, terminal device, and storage medium | |
JP6557457B2 (ja) | Image display system, image display method, and display device | |
JP6362875B2 (ja) | In-vehicle device and in-vehicle information system | |
WO2022160612A1 (zh) | Method for interacting with a vehicle's in-vehicle system, storage medium, and mobile terminal | |
JP2015220533A (ja) | In-vehicle information system, in-vehicle device, information terminal, and application execution method | |
WO2013180279A1 (ja) | In-vehicle information system, information terminal, application execution method, and program | |
JP6397530B2 (ja) | Display device | |
KR20150044417A (ko) | Method for integrating user interfaces across multiple terminals, and terminal performing the same | |
JP2012063249A (ja) | In-vehicle information system, in-vehicle device, and information terminal | |
WO2016069286A1 (en) | Application level audio connection and streaming | |
JP7193904B2 (ja) | Electronic device and electronic system | |
KR20150059429A (ko) | Method for integrating user interfaces across multiple terminals using user gestures, and terminal performing the same | |
JP2015118483A (ja) | Display device, display cooperation system, display device control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | WIPO information: entry into national phase |
Ref document number: 201480014777.1 Country of ref document: CN |
|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 14765255 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015505413 Country of ref document: JP Kind code of ref document: A |
|
WWE | WIPO information: entry into national phase |
Ref document number: 14774210 Country of ref document: US |
|
WWE | WIPO information: entry into national phase |
Ref document number: 2014765255 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |