WO2015083267A1 - Display control device and display control method - Google Patents

Display control device and display control method

Info

Publication number
WO2015083267A1
Authority
WO
WIPO (PCT)
Prior art keywords
display control
icon
image
control device
prescribed
Prior art date
Application number
PCT/JP2013/082688
Other languages
English (en)
Japanese (ja)
Inventor
直樹 礒崎
下谷 光生
清水 直樹
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2013/082688 (WO2015083267A1)
Priority to DE112013007666.7T (DE112013007666T5)
Priority to JP2015551343A (JP6033465B2)
Publication of WO2015083267A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/65Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers

Definitions

  • the present invention relates to a display control device and a display control method for controlling a display unit.
  • split-view (also referred to as multi-view or dual-view (registered trademark)) type display devices are known as multi-image display devices capable of displaying different images on one screen depending on the viewing direction.
  • For example, it has been proposed to apply a split view display device, together with a touch panel disposed on its screen, to an in-vehicle navigation device. Such a navigation device can display images with different contents toward the driver's seat and toward the passenger's seat on one screen, and can receive operations on the icons displayed in those images through the touch panel.
  • However, the position of an icon in the image displayed toward the driver's seat and the position of an icon in the image displayed toward the passenger's seat may overlap on the screen of the split view display device. In such a case, even if an operation on an icon is received on the touch panel, it cannot be determined whether the operation was performed on the icon in the image displayed toward the driver's seat or on the icon in the image displayed toward the passenger's seat.
  • To address this, Patent Document 1 proposes a technique of arranging the icons in the image displayed toward the driver's seat and the icons in the image displayed toward the passenger's seat at mutually different positions so that they do not overlap.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a technique capable of preventing an application function from being executed without the user's knowledge.
  • The display control device according to the present invention controls a display unit capable of displaying, on one screen, a first image that is visible in a first direction but not in a second direction and a second image that is visible in the second direction but not in the first direction, and includes a control unit. The control unit operates based on an output signal from an input unit that uniformly receives a first operation on the first image for executing an application function and a second operation on the second image for executing an application function. When the control unit determines, based on the output signal from the input unit, that a predefined first prescribed operation or a gesture operation following the first prescribed operation has been performed, it determines the operation so determined as the first operation.
  • Similarly, when the control unit determines, based on the output signal from the input unit, that a predefined second prescribed operation different from the first prescribed operation, or a gesture operation following the second prescribed operation, has been performed, it determines the operation so determined as the second operation.
  • Thus, when it is determined that the first prescribed operation has been performed, the first prescribed operation is determined as the first operation, and when it is determined that the second prescribed operation has been performed, the second prescribed operation is determined as the second operation. Therefore, by performing the first prescribed operation, a user located in the first direction can execute his or her own application without the application of the user located in the second direction being executed without that user's knowledge. Similarly, by performing the second prescribed operation, a user located in the second direction can execute his or her own application without the application of the user located in the first direction being executed without that user's knowledge.
  • FIG. 1 is a block diagram illustrating an example of a configuration of a navigation device according to Embodiment 1.
  • FIG. 3 is a cross-sectional view illustrating an example of a configuration of a split view display unit according to Embodiment 1.
  • FIG. 6 is a diagram illustrating a display example of a split view display unit according to Embodiment 1.
  • FIG. 3 is a cross-sectional view illustrating an example of a configuration of a split view display unit according to Embodiment 1.
  • FIG. 6 is a diagram illustrating a display example of a split view display unit according to Embodiment 1.
  • FIG. 3 is a flowchart showing an operation of the navigation device according to the first embodiment.
  • FIG. 6 is a diagram illustrating a display example of a left image and a right image of the navigation device according to Embodiment 1.
  • FIG. 6 is a diagram for explaining the operation of the navigation device according to the first embodiment.
  • FIG. 6 is a diagram for explaining the operation of the navigation device according to the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the first modification of the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the first modification of the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second modification of the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second modification of the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second modification of the first embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second modification of the first embodiment.
  • FIG. 12 is a diagram for explaining an operation of the navigation device according to the third modification of the first embodiment.
  • FIG. 12 is a diagram for explaining an operation of the navigation device according to the third modification of the first embodiment.
  • FIG. 10 is a flowchart showing the operation of the navigation device according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the second embodiment.
  • FIG. 25 is a diagram for explaining the operation of the navigation device according to the first modification of the third embodiment.
  • FIG. 25 is a diagram for explaining the operation of the navigation device according to the first modification of the third embodiment.
  • FIG. 25 is a diagram for explaining the operation of the navigation device according to the first modification of the third embodiment.
  • FIG. 11 is a diagram for illustrating a configuration of a navigation device according to a first modification of the third embodiment.
  • FIG. 25 is a diagram for explaining the operation of the navigation device according to the first modification of the third embodiment.
  • FIG. 25 is a diagram for explaining the operation of the navigation device according to the first modification of the third embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the fourth embodiment.
  • FIG. 10 is a diagram for explaining an operation of the navigation device according to the fourth embodiment.
  • FIG. 1 is a block diagram showing an example of the configuration of the navigation device.
  • Hereinafter, the vehicle equipped with the navigation device 1 shown in FIG. 1 will be referred to as the "own vehicle".
  • The navigation device 1 includes a split view display unit 2, a touch panel 3, an operation input processing unit 9, an interface unit 10, a storage unit 11, a left image generation unit 12, a right image generation unit 13, and a control unit 14 that performs overall control of these.
  • the interface unit 10 is connected between the wireless communication unit 4, the speaker 5, the DVD (Digital Versatile Disk) player 6, the air conditioner 7, the in-vehicle LAN (Local Area Network) 8, and the control unit 14.
  • Various information and various signals are bidirectionally output between the wireless communication unit 4, the speaker 5, the DVD player 6, the air conditioner 7, the in-vehicle LAN 8, and the control unit 14 via the interface unit 10.
  • The control unit 14 can control the wireless communication unit 4, the speaker 5, the DVD player 6, the air conditioner 7, and the in-vehicle LAN 8 by outputting control information to them via the interface unit 10.
  • the split view display unit 2 is arranged, for example, on the dashboard of the own vehicle.
  • The split view display unit 2 can display, on one screen, a first image (hereinafter referred to as the "left image") that is visible from the direction of the left seat (first direction) but not from the direction of the right seat, and a second image (hereinafter referred to as the "right image") that is visible from the direction of the right seat (second direction) but not from the direction of the left seat.
  • the split view display unit 2 displays an icon in the left image (first icon) and an icon in the right image (second icon).
  • Hereinafter, the icon (first icon) in the left image is referred to as the "left icon", and the icon (second icon) in the right image is referred to as the "right icon".
  • In the following description, the left seat is the driver's seat and the right seat is the front passenger's seat as an example; when the left seat is the passenger's seat and the right seat is the driver's seat, the same description applies with left and right interchanged.
  • FIG. 2 is a schematic cross-sectional view of a space-division display device. The display device 200 illustrated in FIG. 2 includes a display screen 201 and a parallax barrier 202.
  • On the display screen 201, first pixels 201a for displaying the left image and second pixels 201b for displaying the right image are alternately arranged along the horizontal direction (left-right direction). The parallax barrier 202 allows the light of the first pixels 201a to pass toward the left seat while blocking the light of the second pixels 201b, and allows the light of the second pixels 201b to pass toward the right seat while blocking the light of the first pixels 201a.
  • As a result, the user 101a in the left seat can visually recognize the left image but not the right image, and the user 101b in the right seat can visually recognize the right image but not the left image.
  • The parallax barrier 202 allows the light from the plurality of first pixels 201a to pass toward the left seat, so the left icon is displayed visibly; likewise, it allows the light from the plurality of second pixels 201b to pass toward the right seat, so the right icon is displayed visibly.
  • Therefore, the outer edge of the left icon display area corresponds to the first pixels 201a located at the outer edge among the plurality of first pixels 201a used for displaying the left icon, and the outer edge of the right icon display area corresponds to the second pixels 201b located at the outer edge among the plurality of second pixels 201b used for displaying the right icon.
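  • As an illustrative aid (not part of the original disclosure), the following Python sketch shows how a left image and a right image can be interleaved column by column in the manner of the first pixels 201a and second pixels 201b described above; the function name and the list-of-rows image representation are assumptions made for illustration.

        def interleave_space_division(left_image, right_image):
            """Interleave two equally sized images column by column, the way
            the first pixels 201a (left image) and second pixels 201b (right
            image) alternate along the horizontal direction of the panel.
            Each image is a list of rows; each row is a list of pixel values.
            The resulting panel is twice as wide: even columns carry the left
            image and odd columns the right image, and the parallax barrier
            makes each set visible from only one seat."""
            panel = []
            for left_row, right_row in zip(left_image, right_image):
                row = []
                for l_px, r_px in zip(left_row, right_row):
                    row.extend([l_px, r_px])  # 201a then 201b, alternating
                panel.append(row)
            return panel

        # A WVGA source pair (800 x 480 each) would yield a 1600 x 480 panel,
        # matching the pixel counts given below for the space-division device.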
  • FIG. 3 is a diagram showing a display example of the space-division split view display unit 2, showing the left image and the right image of one frame.
  • A WVGA (Wide VGA) display device has 800 dots horizontally and 480 dots vertically as a whole. Although it differs depending on the performance of the display device, a space-division split-view display device corresponding to a WVGA display device, as shown in FIG. 3, has twice that number of horizontal pixels: the first and second pixels 201a and 201b together total 1600 dots horizontally and 480 dots vertically. For convenience of illustration, however, the following description assumes a split view display device composed of first pixels 201a of 13 dots horizontally and 4 dots vertically and the same number of second pixels 201b, with each icon displayed by first pixels 201a or second pixels 201b of four dots horizontally and one dot vertically.
  • Note that the 1-dot shift of the icons along the x axis (left-right direction) shown in FIG. 3 is imperceptible to the human eye from the normal viewing position; the icons appear to be displayed at the same position.
  • In FIG. 3, the outer edge (outer frame) of the left icon is indicated by a broken line, showing that four first pixels 201a arranged in the horizontal direction are used to display the left icon. Likewise, the outer edge (outer frame) of the right icon is indicated by a one-dot chain line, showing that four second pixels 201b arranged in the horizontal direction are used to display the right icon.
  • In the configuration in which the space-division display device 200 is applied to the split view display unit 2, when at least one of the plurality of (four in FIG. 3) first pixels 201a used for displaying the left icon is sandwiched between the second pixels 201b located at the outer edge (the second pixels 201b corresponding to the one-dot chain line in FIG. 3) among the plurality of second pixels 201b used for displaying the right icon, or when at least one of the second pixels 201b used for displaying the right icon is sandwiched between the first pixels 201a located at the outer edge (the first pixels 201a corresponding to the broken line in FIG. 3) among the plurality of first pixels 201a used for displaying the left icon, at least a part of the display area of the left icon and at least a part of the display area of the right icon overlap each other on the screen of the split view display unit 2. Otherwise, the display area of the left icon and the display area of the right icon are separated on the screen of the split view display unit 2.
  • FIG. 4 is a schematic cross-sectional view of a time-division display device. The display device 250 illustrated in FIG. 4 includes a display screen 251 and a parallax barrier 252.
  • The display screen 251 displays the left image with the pixels 251c in a first period, and displays the right image with the pixels 251c in a second period. The parallax barrier 252 allows the light of the pixels 251c to pass toward the left seat but blocks it toward the right seat in the first period, and allows the light of the pixels 251c to pass toward the right seat but blocks it toward the left seat in the second period.
  • FIG. 4 shows the state of the first period.
  • As a result, the user 101a in the left seat can visually recognize the left image but not the right image, and the user 101b in the right seat can visually recognize the right image but not the left image.
  • In the first period, the eyes of the user 101b in the right seat receive no light of the pixels 251c from the split view display unit 2. However, since the first period is set to be very short, the user 101b does not notice that his or her eyes receive no light during the first period; owing to the afterimage effect of the light received in the second period, the user 101b perceives the image of the second period as being displayed during the first period as well.
  • Similarly, the user 101a in the left seat does not notice that his or her eyes receive no light during the second period and, owing to the afterimage effect of the light received in the first period, perceives the image of the first period as being displayed during the second period as well.
  • The parallax barrier 252 allows the light from the plurality of pixels 251c to pass toward the left seat in the first period, so the left icon is displayed visibly, and allows it to pass toward the right seat in the second period, so the right icon is displayed visibly. Therefore, the outer edge of the left icon display area corresponds to the pixels 251c located at the outer edge among the plurality of pixels 251c used for displaying the left icon, and the outer edge of the right icon display area corresponds to the pixels 251c located at the outer edge among the plurality of pixels 251c used for displaying the right icon.
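  • As a rough illustration of the time-division scheme (again an assumption-laden sketch, not the patented implementation), the display can be modeled as an endless alternation of first and second periods:

        import itertools

        def time_division_frames(left_image, right_image):
            """Yield (image, barrier_direction) pairs: in the first period the
            panel shows the left image while the barrier passes light toward
            the left seat; in the second period it shows the right image
            toward the right seat. Each period is kept very short so that,
            by the afterimage effect, each viewer perceives a continuous
            image."""
            for period in itertools.cycle(("first", "second")):
                if period == "first":
                    yield left_image, "left_seat"
                else:
                    yield right_image, "right_seat"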
  • FIGS. 5(a) and 5(b) are diagrams showing a display example of the time-division split view display unit 2, showing the left image and the right image of one frame.
  • A WVGA display device has 800 dots horizontally (x axis) and 480 dots vertically (y axis) as a whole. Although it differs depending on the performance of the display device, a time-division split-view display device corresponding to a WVGA display device, as shown in FIGS. 5(a) and 5(b), is composed of, for example, pixels 251c of 800 dots horizontally and 480 dots vertically. For convenience of illustration, however, the following description assumes a split view display device composed of pixels 251c of 13 dots horizontally and 4 dots vertically, with each icon displayed by pixels 251c of 3 dots horizontally and 1 dot vertically.
  • In FIG. 5(a), the outer edge (outer frame) of the left icon displayed in the first period is indicated by a broken line, showing that three pixels 251c arranged in the horizontal direction are used to display the left icon. In FIG. 5(b), the outer edge (outer frame) of the right icon displayed in the second period is likewise indicated by a broken line, showing that three pixels 251c arranged in the horizontal direction are used to display the right icon.
  • In the configuration in which the time-division display device 250 is applied to the split view display unit 2, when there is a pixel 251c that is used for displaying the left icon in the first period and also used for displaying the right icon in the second period, at least a part of the display area of the left icon and at least a part of the display area of the right icon overlap each other on the screen of the split view display unit 2. Otherwise, the display area of the left icon and the display area of the right icon are separated on the screen of the split view display unit 2.
  • A display device combining the space division method and the time division method may also be applied to the split view display unit 2. In that case, when at least some of the pixels used for displaying the left icon in the first period are sandwiched between the pixels located at the outer edge among the plurality of pixels used for displaying the right icon in the second period, or when at least some of the pixels used for displaying the right icon in the second period are sandwiched between the pixels located at the outer edge among the plurality of pixels used for displaying the left icon in the first period, at least a part of the display area of the left icon and at least a part of the display area of the right icon overlap each other on the screen of the split view display unit 2. Otherwise, the display area of the left icon and the display area of the right icon are separated on the screen of the split view display unit 2.
  • pixel scanning is performed in a short period (for example, 1/30 [second]) in both the space division method and the time division method.
  • The touch panel 3 (input unit) uniformly receives a first operation on the left image for executing an application function (hereinafter referred to as the "left operation") and a second operation on the right image for executing an application function (hereinafter referred to as the "right operation").
  • the touch panel 3 periodically detects a two-dimensional position on the detection surface for an indicator such as one or more fingers that touch the detection surface. Then, the touch panel 3 outputs a signal indicating the position of the indicator to the operation input processing unit 9.
  • The touch panel 3 is not limited to one that detects the two-dimensional position of the indicator; it may instead detect, as the position of the indicator, a three-dimensional position consisting of the position of the point on the detection surface closest to the indicator (a two-dimensional position) and the distance between the indicator and that point on the detection surface (a further one-dimensional position).
  • The wireless communication unit 4 communicates with a server via, for example, DSRC (Dedicated Short Range Communications) or a mobile phone.
  • the wireless communication unit 4 outputs information received from the server (for example, downloaded information) to the control unit 14 or transmits information output from the control unit 14 to the server.
  • the wireless communication unit 4 receives radio broadcasts and television broadcasts and outputs information acquired from the broadcasts to the control unit 14.
  • Speaker 5 (audio output unit) outputs audio based on the audio signal output from the control unit 14.
  • the DVD player 6 reproduces AV (Audio-video) information recorded on the DVD, and outputs the AV information to the control unit 14.
  • the air conditioner 7 adjusts the temperature and humidity in the vehicle interior under the control of the control unit 14.
  • the in-vehicle LAN 8 communicates with the own vehicle ECU (Electronic Control Unit), GPS (Global Positioning System) device, and the like.
  • the in-vehicle LAN 8 outputs the speed of the own vehicle acquired from the ECU and the current position (for example, latitude and longitude) of the own vehicle acquired from the GPS device to the control unit 14.
  • the operation input processing unit 9 determines whether or not a gesture operation has been performed on the touch panel 3 based on an output signal of the touch panel 3 and determines the type of the gesture operation that has been performed.
  • The gesture operations include a touch operation in which the indicator touches the detection surface of the touch panel 3, and a gesture operation in which the indicator draws a predetermined trajectory on the detection surface of the touch panel 3 (hereinafter referred to as an "orbital gesture operation"). The orbital gesture operation may include a gesture operation that continues to use both points of a two-point touch after the two-point touch, or a gesture operation that leaves one of the two points after the two-point touch and continues to use that point.
  • the operation input processing unit 9 determines whether a touch operation has been performed as a gesture operation based on the output signal of the touch panel 3.
  • The operation input processing unit 9 also determines the number of points at which the detection surface of the touch panel 3 is touched (the number of indicators touching the detection surface). It can therefore determine whether a one-point touch operation, in which the indicator touches the detection surface of the touch panel 3 at one point, or a two-point touch operation, in which indicators touch the detection surface of the touch panel 3 at two points, has been performed.
  • In the following, the two-point touch operation is described as an operation in which two indicators touch the detection surface of the touch panel 3 at two points simultaneously; however, the present invention is not limited to this, and, for example, the operations described in the second modification of the first embodiment may be used.
  • the operation input processing unit 9 determines whether or not the orbital gesture operation is performed as the gesture operation based on the output signal of the touch panel 3.
  • The orbital gesture operations include, for example, a flick operation in which the indicator rubs the detection surface in a time shorter than a predetermined time, a drag operation in which the indicator rubs the detection surface in a time longer than the predetermined time, and a pinch operation in which two indicators in contact with the detection surface change the distance between them.
  • The drag operation is not limited to the above and may be treated broadly as an operation of rubbing the detection surface while the indicator remains in contact with the touch panel; likewise, the flick operation may be treated as an operation of flicking the detection surface from a state in which the indicator is in contact with the touch panel.
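  • The distinctions above (touch vs. flick vs. drag by duration, pinch by changing separation) can be sketched as follows; the thresholds and function names are illustrative assumptions, not values from the patent:

        import math

        FLICK_MAX_DURATION = 0.3  # seconds; threshold is an assumed example
        MOVE_DEAD_ZONE = 5.0      # position jitter tolerance; assumed

        def classify_one_point_trajectory(points, duration):
            """Classify a one-indicator trajectory: a flick rubs the detection
            surface in less than a predetermined time, a drag in more, and a
            stationary contact is a plain touch. `points` is a list of (x, y)
            samples periodically reported by the touch panel 3."""
            if math.dist(points[0], points[-1]) <= MOVE_DEAD_ZONE:
                return "touch"
            return "flick" if duration < FLICK_MAX_DURATION else "drag"

        def is_pinch(separation_start, separation_end):
            """Two indicators on the surface whose separation changes form a
            pinch operation."""
            return abs(separation_end - separation_start) > MOVE_DEAD_ZONE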
  • In the present embodiment, gesture operations are applied to the first prescribed operation and the second prescribed operation described later. Since the operation input processing unit 9 is configured to determine, for each type of gesture operation, whether that gesture operation has been performed, it can determine whether the first prescribed operation and the second prescribed operation have been performed.
  • Icon position information indicating the positions of the icons displayed on the split view display unit 2 is input from the control unit 14 to the operation input processing unit 9. Based on the icon position information and the output signal of the touch panel 3 (the signal indicating the position of the indicator), the operation input processing unit 9 determines whether a touch operation or a gesture operation has been performed on an icon displayed on the touch panel 3 and, by extension, on the split view display unit 2. Specifically, when the operation input processing unit 9 determines that the position of the indicator indicated by the output signal of the touch panel 3 overlaps the display area of the left icon (the indicator is located inside the left icon), or that the position of the indicator changes while remaining inside that display area, it determines that a gesture operation has been performed on the left icon.
  • the operation input processing unit 9 performs the same determination on the right icon as the determination on the left icon.
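  • A minimal hit-test sketch of that determination (the rectangle representation and names are assumptions for illustration):

        def hit_icon(icon_rects, x, y):
            """Return the id of the first icon whose display area contains the
            indicator position (x, y), mirroring how the operation input
            processing unit 9 matches the touch panel output against the icon
            position information supplied by the control unit 14.
            `icon_rects` maps an icon id to an (x0, y0, x1, y1) rectangle."""
            for icon_id, (x0, y0, x1, y1) in icon_rects.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    return icon_id
            return None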
  • the operation input processing unit 9 outputs the determination result of the above gesture operation and the like to the control unit 14.
  • the determination process may be performed by the control unit 14.
  • In FIG. 1, the operation input processing unit 9 is provided separately from the touch panel 3 and the control unit 14; however, it is not limited to this, and may be provided as a function of the touch panel 3 or as a function of the control unit 14.
  • the storage unit 11 includes, for example, a hard disk drive, a DVD and its drive device, a Blu-ray disc and its drive device, or a storage device such as a semiconductor memory.
  • The storage unit 11 stores, in addition to the programs necessary for the control unit 14 to operate, information used by the control unit 14. The information used by the control unit 14 includes, for example, applications (application software), images in which the icons to be operated when executing application functions are arranged, map information, and the like.
  • Hereinafter, an image in which icons are arranged (for example, the images corresponding to FIGS. 7(a) and 7(b)) is referred to as an "icon arrangement image"; the icon arrangement image also includes an image in which icons are displayed over map information.
  • the left image generation unit 12 generates a display signal for displaying the left image based on the display information output from the control unit 14, and outputs the display signal to the split view display unit 2.
  • the split view display unit 2 displays the left image based on the display signal.
  • the right image generation unit 13 generates a display signal for displaying the right image based on the display information output from the control unit 14, and outputs the display signal to the split view display unit 2.
  • the split view display unit 2 displays the right image based on the display signal.
  • The display signal generated by the left image generation unit 12 includes, for each of the plurality of pixels used in the left image, pixel numbers assigned in the order (1, 1), (2, 1), ..., (800, 1), (1, 2), ..., (800, 2), ..., (800, 480). The display signal generated by the right image generation unit 13 likewise includes pixel numbers assigned in the same manner for each of the plurality of pixels used in the right image. Therefore, when the pixel number of at least one pixel used for displaying the left icon matches the pixel number of at least one pixel used for displaying the right icon, at least a part of the display area of the left icon and at least a part of the display area of the right icon overlap each other on the screen. Here, (x, y) indicates the pixel position in xy coordinates in which the upper left of the screen is (1, 1), the x axis is positive in the right direction, and the y axis is positive in the downward direction.
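  • The overlap criterion above reduces to a set intersection over pixel numbers, as in this sketch (the set-of-tuples representation is an assumption):

        def icons_overlap(left_icon_pixels, right_icon_pixels):
            """The left-icon and right-icon display areas overlap exactly when
            at least one pixel number (x, y) used for the left icon also
            appears among those used for the right icon."""
            return not set(left_icon_pixels).isdisjoint(right_icon_pixels)

        # Example: icons sharing pixel number (12, 5) overlap on the screen.
        print(icons_overlap({(10, 5), (11, 5), (12, 5)},
                            {(12, 5), (13, 5), (14, 5)}))  # True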
  • The control unit 14 includes, for example, a CPU (Central Processing Unit). When the CPU executes programs stored in the storage unit 11, the navigation device 1 can execute various applications and control the speaker 5 and the like according to the executed application.
  • For example, when a navigation application is executed, the control unit 14 searches for a route from the current position to the destination based on the current position of the own vehicle, the destination set based on the output signal of the touch panel 3, and the map information, and generates display information for displaying guidance along the route and an audio signal for outputting the guidance by voice. As a result, the guidance is displayed as the left image or the right image, and the guidance voice is output from the speaker 5.
  • When a DVD playback application is executed, the control unit 14 generates display information for displaying the AV information from the DVD player 6 and an audio signal for outputting the AV information as audio. As a result, the video stored on the DVD is displayed as the left image or the right image, and the audio stored on the DVD is output from the speaker 5.
  • The control unit 14 acquires from the storage unit 11 one icon arrangement image corresponding to one or more applications that can be executed on the left image side (executable from the left image side), and displays the acquired icon arrangement image as the left image. As a result, icons to be operated for executing application functions on the left image side are displayed on the split view display unit 2 (left image). Hereinafter, an icon arrangement image that can be displayed as the left image (for example, the image corresponding to FIG. 7(a)) is referred to as the "left icon arrangement image"; the icons in the left icon arrangement image displayed as the left image correspond to the left icons described above.
  • Similarly, the control unit 14 acquires from the storage unit 11 one icon arrangement image corresponding to one or more applications that can be executed on the right image side (executable from the right image side), and displays the acquired icon arrangement image as the right image. As a result, icons to be operated for executing application functions on the right image side are displayed on the split view display unit 2 (right image). Hereinafter, an icon arrangement image that can be displayed as the right image (for example, the image corresponding to FIG. 7(b)) is referred to as the "right icon arrangement image"; the icons in the right icon arrangement image displayed as the right image correspond to the right icons described above.
  • When the operation input processing unit 9 determines that the first prescribed operation has been performed, the control unit 14 determines the first prescribed operation so determined as the left operation described above. On the other hand, when the operation input processing unit 9 determines that the second prescribed operation, which is different from the first prescribed operation, has been performed, the control unit 14 determines the second prescribed operation so determined as the right operation described above.
  • In the present embodiment, the first prescribed operation is a first touch operation in which the indicator touches the touch panel 3 at a predetermined first number of points, and the second prescribed operation is a second touch operation in which the indicator touches the touch panel 3 at a predetermined second number of points different from the first number. In the following, the first prescribed operation is described as a one-point touch operation (a first touch operation whose first number is one), and the second prescribed operation as a two-point touch operation (a second touch operation whose second number is two).
  • The first prescribed operation and the second prescribed operation are described below as operations on icons, but they may instead be operations on display areas other than icons (for example, a gesture area in which orbital gesture operations are performed).
  • FIG. 6 is a flowchart showing the operation of the navigation device 1 according to the first embodiment. The operation shown in FIG. 6 is performed by the CPU executing a program stored in the storage unit 11. Hereinafter, the operation of the navigation device 1 will be described with reference to FIG. 6.
  • In step S1, when an operation for executing the initial operation is performed, the control unit 14 executes the initial operation: it acquires from the storage unit 11 the applications to be initially executed on the left image side and the right image side, and executes them.
  • In step S2, the control unit 14 acquires from the storage unit 11 the left icon arrangement image corresponding to the application executed on the left image side and the right icon arrangement image corresponding to the application executed on the right image side. In step S3, the control unit 14 displays the acquired left icon arrangement image as the left image of the split view display unit 2, and displays the acquired right icon arrangement image as the right image of the split view display unit 2.
  • FIGS. 7(a) and 7(b) are diagrams showing a display example of the left image and the right image in step S3 of the navigation device 1 (split view display unit 2) according to the first embodiment.
  • In the left image, left icons L1, L2, L3, L4, and L5 (hereinafter sometimes collectively referred to as the "left icons L1 to L5") are displayed; in the right image, right icons R1, R2, R3, R4, and R5 (hereinafter sometimes collectively referred to as the "right icons R1 to R5") are displayed.
  • The left icons L1 to L5 and the right icons R1 to R5 are arranged so as to overlap each other on the screen of the split view display unit 2. That is, the control unit 14 displays the icons so that at least part of the display areas of the left icons L1 to L5 and at least part of the display areas of the right icons R1 to R5 overlap each other on the screen of the split view display unit 2.
  • In step S4 of FIG. 6, the operation input processing unit 9 determines whether or not a touch operation has been performed. If it is determined that a touch operation has been performed, the process proceeds to step S5; if not, step S4 is performed again. When step S4 is performed again, if a map is displayed as the left image or the right image and the position of the own vehicle has changed, the control unit 14 may scroll the map in response to the change.
  • In step S5, the operation input processing unit 9 determines whether the touch operation in step S4 was performed on a left icon or on a right icon. This determination result is used in step S8 or step S11. In step S6, the operation input processing unit 9 determines whether the number of touch points in step S4 was one, two, or neither. If it is determined to be one (a one-point touch operation), the process proceeds to step S7; if it is determined to be two (a two-point touch operation), the process proceeds to step S10; if neither, the process returns to step S4. Whenever the process returns to step S4, if a map is displayed as the left image or the right image and the position of the own vehicle has changed, the control unit 14 may scroll the map in response to the change.
  • In step S7, the control unit 14 determines the touch operation in step S4, that is, the one-point touch operation, as the left operation.
  • In step S8, the control unit 14 determines, based on the determination result of step S5, whether the one-point touch operation determined to be the left operation was performed on a left icon. If so, the process proceeds to step S9; if not, the process returns to step S4.
  • In step S9, the control unit 14 executes the function previously associated with the left icon on which the one-point touch operation was performed, and the process then returns to step S4. If an icon arrangement image is associated with the left icon and stored in the storage unit 11 in advance, the process may return from step S9 to step S3 so that that icon arrangement image is displayed on the split view display unit 2.
  • In step S10, the control unit 14 determines the touch operation in step S4, that is, the two-point touch operation, as the right operation.
  • In step S11, the control unit 14 determines, based on the determination result of step S5, whether the two-point touch operation determined to be the right operation was performed on a right icon. If so, the process proceeds to step S12; if not, the process returns to step S4.
  • In step S12, the control unit 14 executes the function previously associated with the right icon on which the two-point touch operation was performed, and the process then returns to step S4. If an icon arrangement image is associated with the right icon and stored in the storage unit 11 in advance, the process may return from step S12 to step S3 so that that icon arrangement image is displayed on the split view display unit 2.
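  • Condensed into code, the branching of steps S4 to S12 looks roughly like the following sketch (the callback parameters are illustrative assumptions):

        def dispatch_touch(touch_count, on_left_icon, on_right_icon,
                           run_left_function, run_right_function):
            """One-point touch -> left operation (steps S6, S7); if it hit a
            left icon (S8), run that icon's function (S9). Two-point touch ->
            right operation (S10); if it hit a right icon (S11), run that
            icon's function (S12). Anything else returns to detection (S4)."""
            if touch_count == 1:
                if on_left_icon:
                    run_left_function()
            elif touch_count == 2:
                if on_right_icon:
                    run_right_function()
            # otherwise: wait for the next touch operation (back to step S4)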
  • Next, an example of the operation of FIG. 6 will be described with reference to FIGS. 8(a) and 8(b). Suppose that a one-point touch operation with the finger 21 is performed on the left icon L1 and the right icon R1. In this case, the control unit 14 determines the one-point touch operation as a left operation. As a result, the control unit 14 executes the function associated with the left icon L1 without executing the function associated with the right icon R1.
  • On the other hand, as shown in FIGS. 9(a) and 9(b), suppose that a two-point touch operation with the fingers 21 is performed on the left icon L1 and the right icon R1. In this case, the control unit 14 determines the two-point touch operation as a right operation. As a result, the control unit 14 executes the function associated with the right icon R1 without executing the function associated with the left icon L1.
  • As described above, in the navigation device 1 according to the first embodiment, by performing the first prescribed operation (here, the one-point touch operation), the user in the left seat can execute his or her own application without the application of the user in the right seat being executed without that user's knowledge. Similarly, by performing the second prescribed operation (here, the two-point touch operation), the user in the right seat can execute his or her own application without the application of the user in the left seat being executed without that user's knowledge.
  • Moreover, since the left icons and the right icons can be arranged so as to overlap on the screen of the split view display unit 2, a shortage of area for arranging icons at the stage of generating the icon arrangement images can be suppressed, and the restrictions imposed on icon arrangement can be reduced.
  • In addition, the user in the left seat, that is, the driver, only needs to perform a one-point touch operation, which is simpler than the two-point touch operation, so the operational burden on the driver can be suppressed. As a result, the driver can concentrate on driving.
  • In the above description, both of the two touching fingers touch the right icon R1; however, the right-seat application may be executed on the assumption that the icon R1 has been operated even when only one of the two touching fingers touches the right icon R1. Alternatively, it may be determined that the icon R1 has been operated only when both of the two touching fingers touch the right icon R1, and the touch operation may otherwise be invalidated, with a notification that it is invalid.
  • A two-point touch is often performed with the middle finger and the index finger, and the middle finger can press (touch) more strongly than the index finger, so the touch by the middle finger may be given priority over the touch by the index finger. In that case, of the two touch points, the upper point (the touch point with the larger y-axis coordinate of the touch panel 3) or the left point (the touch point with the smaller x-axis coordinate of the touch panel 3) may be validated as the touch point of the middle finger. Since the index finger may be prioritized depending on the usage situation, which of the two touch points is prioritized may be made settable by the user.
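  • A sketch of that priority rule (the coordinate convention is taken from the text above; the user-settable preference is modeled as a parameter):

        def select_valid_touch_point(p1, p2, prefer="upper"):
            """Of two touch points (x, y), validate the one assumed to be the
            middle finger: the upper point (larger y-axis coordinate of the
            touch panel 3) or the left point (smaller x-axis coordinate)."""
            if prefer == "upper":
                return p1 if p1[1] > p2[1] else p2
            return p1 if p1[0] < p2[0] else p2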
  • In the first embodiment, when it is determined that the first prescribed operation has been performed, the first prescribed operation itself is determined as the left operation. However, the present invention is not limited to this: instead of determining the first prescribed operation as the left operation, the gesture operation (touch operation or orbital gesture operation) following the first prescribed operation may be determined as the left operation. That is, when the operation input processing unit 9 determines that a gesture operation following the first prescribed operation has been performed, the control unit 14 may determine that gesture operation as the left operation.
  • For example, suppose that a drag operation following a one-point touch operation with the finger 21 is performed on the point Dl of the left image and the point Dr of the right image (the arrow 21A in the figure indicates the trajectory of the finger 21 in the drag operation). In this case, the control unit 14 may determine the drag operation as a left operation on the point Dl of the left image. The same applies when a flick operation or the like is applied instead of the drag operation as the gesture operation following the first prescribed operation; this applies, for example, to a map scroll function operated outside the icons.
  • Similarly, the control unit 14 may determine a pinch operation following the first prescribed operation as a left operation on a point of the left image.
  • Likewise, in the first embodiment, when it is determined that the second prescribed operation has been performed, the second prescribed operation itself is determined as the right operation. However, the present invention is not limited to this: instead of determining the second prescribed operation as the right operation, the gesture operation (touch operation or orbital gesture operation) following the second prescribed operation may be determined as the right operation. That is, when the operation input processing unit 9 determines that a gesture operation following the second prescribed operation has been performed, the control unit 14 may determine that gesture operation as the right operation.
  • For example, suppose that a drag operation following a two-point touch operation with the fingers 21 is performed on the point Dl of the left image and the point Dr of the right image (the arrow 21A in the figure indicates the trajectory of the fingers 21 in the drag operation). In this case, the control unit 14 may determine the drag operation as a right operation on the point Dr of the right image. The same applies when a flick operation or the like is applied instead of the drag operation as the gesture operation following the second prescribed operation. Similarly, the control unit 14 may determine a pinch operation following the second prescribed operation as a right operation on a point of the right image.
  • In the first embodiment, the operation input processing unit 9 is configured to determine that a two-point touch operation has been performed when two indicators simultaneously touch the detection surface of the touch panel 3 at two points. However, the present invention is not limited to this, and the configurations described in the following first and second examples may be used.
  • First, the configuration of the first example will be described. Suppose that a series of operations is performed in which, while a specific point (the special button 22 shown in FIG. 12) on the touch panel 3 is touched with a finger 21, another point (the left icon L3) on the touch panel 3 is touched with another finger 21. In this case, the operation input processing unit 9 may determine that a two-point touch operation has been performed on the other point (the left icon L3). Even with such a configuration, the same effects as those of the first embodiment can be obtained.
  • Next, the configuration of the second example will be described. Suppose that a series of operations is performed in which, within a predetermined time (for example, within 2 to 5 seconds) after a specific point (the special button 22 shown in FIG. 12) on the touch panel 3 is touched with a finger 21, another point (the left icon L3) on the touch panel is touched with the finger 21. In this case as well, the operation input processing unit 9 may determine that a two-point touch operation has been performed on the other point (the left icon L3). Even with such a configuration, the same effects as those of the first embodiment can be obtained.
  • Note that the button 22 shown in FIGS. 12 to 14 is arranged so as to be separated from the display areas of the left icons and the right icons on the screen of the split view display unit 2; this can suppress an operation on the button 22 from being determined as an operation on a left icon or a right icon.
  • Instead of using one specific point (the special button 22) on the touch panel 3, any arbitrary point on the touch panel 3 may be used. That is, suppose that a series of operations is performed in which, after an arbitrary point (first point) on the touch panel 3 is touched with the indicator, another point (next point) on the touch panel is touched with the indicator within a predetermined time (for example, within 2 to 5 seconds). In this case, the operation input processing unit 9 may determine that a two-point touch operation (second prescribed operation) has been performed on the first point or the next point. On the other hand, when no other point is touched within the predetermined time after one point is touched, the operation input processing unit 9 may determine that a one-point touch operation (first prescribed operation) has been performed on that point. Even with such a configuration, the same effects as those of the first embodiment can be obtained.
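  • The time-window variant can be sketched as follows (timestamps in seconds; the window value is one of the examples given above):

        PRESCRIBED_WINDOW = 5.0  # seconds; the text gives 2 to 5 seconds

        def classify_touch_sequence(first_touch_time, second_touch_time=None):
            """A touch followed within the predetermined time by a touch on
            another point counts as a two-point touch operation (second
            prescribed operation); a touch with no follow-up inside the
            window counts as a one-point touch operation (first prescribed
            operation)."""
            if (second_touch_time is not None
                    and second_touch_time - first_touch_time <= PRESCRIBED_WINDOW):
                return "two_point_touch"  # determined as the right operation
            return "one_point_touch"      # determined as the left operation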
  • In the above description, the touch operation applied to the first prescribed operation and the touch operation applied to the second prescribed operation differ in the number of points at which the touch panel 3 is touched; however, the number of points may be the same.
  • For example, the first prescribed operation may be a two-point touch operation in which the indicator touches the touch panel 3 simultaneously at two points separated by less than a predetermined distance D, and the second prescribed operation may be a two-point touch operation in which the indicator touches the touch panel 3 simultaneously at two points separated by more than the predetermined distance D. Here, for example, the maximum distance between the tip of the index finger and the tip of the middle finger of one hand is applied to the predetermined distance D. With this configuration, when the control unit 14 determines that a one-hand two-point touch operation in which the distance D1 between the two points is smaller than the distance D has been performed, it can determine the two-point touch operation as a left operation. On the other hand, when the control unit 14 determines, as shown in FIGS. 16(a) and 16(b), that a both-hands two-point touch operation in which the distance D2 between the two points is equal to or greater than the distance D has been performed, it can determine the two-point touch operation as a right operation.
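  • A sketch of this distance-based discrimination (the value of D is an assumed example standing in for the index-to-middle-finger span):

        import math

        DISTANCE_D = 120.0  # assumed example of the predetermined distance D

        def classify_two_point_touch(p1, p2):
            """Two simultaneous touch points closer together than D are taken
            as a one-hand two-point touch (left operation); points D or more
            apart are taken as a both-hands two-point touch (right
            operation). p1 and p2 are (x, y) coordinates."""
            if math.dist(p1, p2) < DISTANCE_D:
                return "left_operation"   # one-hand two-point touch
            return "right_operation"      # both-hands two-point touch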
  • Even with such a configuration, the same effects as those of the first embodiment can be obtained. In addition, since the driver only needs to perform a two-point touch operation with one hand, which is simpler than a two-point touch operation with both hands, the operational burden on the driver can be suppressed, and the driver can concentrate on driving.
  • In the first embodiment, the first prescribed operation and the second prescribed operation are the first touch operation and the second touch operation, respectively. In the second embodiment, the first prescribed operation is the first touch operation, while the second prescribed operation is a first gesture operation in which the indicator draws a predetermined first trajectory on the touch panel 3 (hereinafter referred to as the "first orbital gesture operation"). In the following, the first prescribed operation is described as a one-point touch operation (a first touch operation whose first number is one), and a flick operation (an orbital gesture operation) is applied to the first orbital gesture operation.
  • FIG. 17 is a flowchart showing the operation of the navigation device 1 according to the second embodiment. Steps S1 to S3 in the flowchart shown in FIG. 17 are the same as steps S1 to S3 in the flowchart shown in FIG. 6.
  • In step S14 after step S3, the operation input processing unit 9 determines whether or not an operation on the touch panel 3 has been performed. If it is determined that an operation has been performed, the process proceeds to step S15; if not, step S14 is performed again. When step S14 is performed again, the control unit 14 may scroll the map in the same manner as when step S4 described above is performed again.
  • step S15 the operation input processing unit 9 determines whether or not the operation in step S14 has been performed on the left icon or the right icon. This determination result is used in step S18 or step S20.
  • In step S16, the operation input processing unit 9 determines whether the operation in step S14 was a one-point touch operation, a flick operation, or neither.
  • If it is determined that the operation was a one-point touch operation, the process proceeds to step S17; if a flick operation, the process proceeds to step S20; if neither, the process returns to step S14. Whenever the process returns to step S14, whether from this step or from a later step, the control unit 14 may scroll the map in the same manner as when returning to step S4 described above.
  • In step S17, the control unit 14 determines the operation of step S14, that is, the one-point touch operation, as a left operation.
  • In step S18, the control unit 14 determines, based on the determination result of step S15, whether the one-point touch operation determined to be a left operation has been performed on the left icon. If it is determined that the one-point touch operation has been performed on the left icon, the process proceeds to step S19; if not, the process returns to step S14.
  • In step S19, the control unit 14 executes the function associated in advance with the left icon on which the one-point touch operation was performed. Thereafter, the process returns to step S14. If an icon arrangement image is associated with the left icon and stored in the storage unit 11 in advance, the process may return from step S19 to step S3 so that the icon arrangement image is displayed on the split view display unit 2.
  • In step S20, the control unit 14 determines the operation of step S14, that is, the flick operation, as a right operation.
  • In step S21, the control unit 14 determines, based on the determination result of step S15, whether the flick operation determined to be a right operation has been performed on the right icon. If it is determined that the flick operation has been performed on the right icon, the process proceeds to step S22; otherwise, the process returns to step S14.
  • In step S22, the control unit 14 executes the function associated in advance with the right icon on which the flick operation was performed. Thereafter, the process returns to step S14. If an icon arrangement image is associated with the right icon and stored in the storage unit 11 in advance, the process may return from step S22 to step S3 so that the icon arrangement image is displayed on the split view display unit 2.
  • An example of the operation of FIG. 17 described above will now be given.
  • When a one-point touch operation is performed, the control unit 14 determines it as a left operation. As a result, the control unit 14 executes the function associated with the left icon L1 without executing the function associated with the right icon R1.
  • When a flick operation is performed, the control unit 14 determines it as a right operation. As a result, the control unit 14 executes the function associated with the right icon R1 without executing the function associated with the left icon L1.
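The branch structure of steps S14 to S22 can be summarized in a short sketch (illustrative names only; the patent specifies the steps, not an implementation):

```python
# Minimal sketch of the step S14-S22 flow of FIG. 17. The function and
# parameter names are assumptions for illustration.

def handle_operation(op_kind, icon_side, left_action, right_action):
    """op_kind: 'one-point' or 'flick' (step S16 result);
    icon_side: 'left', 'right', or None (step S15 result)."""
    if op_kind == "one-point":      # S16 -> S17: one-point touch = left operation
        if icon_side == "left":     # S18: was it on the left icon?
            left_action()           # S19: execute the left icon's function
    elif op_kind == "flick":        # S16 -> S20: flick = right operation
        if icon_side == "right":    # S21: was it on the right icon?
            right_action()          # S22: execute the right icon's function
    # In every other case the flow returns to waiting for input (S14).

# Example: a flick on the right icon R1 runs only the right function.
handle_operation("flick", "right",
                 left_action=lambda: print("function of left icon L1"),
                 right_action=lambda: print("function of right icon R1"))
```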
  • In addition, since the driver only needs to perform the first touch operation, which is simpler than the first trajectory gesture operation, the driver's operational burden can be reduced, and the driver can concentrate on driving.
  • In the third embodiment, the first prescribed operation is the first trajectory gesture operation described in the second embodiment, and the second prescribed operation is a second gesture operation that draws a predetermined second trajectory and is different from the first trajectory gesture operation (hereinafter referred to as a "second trajectory gesture operation").
  • Here, the first trajectory gesture operation and the second trajectory gesture operation being different means that at least one of the following holds: the first trajectory and the second trajectory are different, or the types of the trajectory gesture operations are different.
  • In the following, a case will be described in which the first trajectory gesture operation is a flick operation (one type of operation included in the trajectory gesture operations) and the second trajectory gesture operation is a drag operation (another type of operation included in the trajectory gesture operations), that is, a case in which the types of the trajectory gesture operations are different.
  • The case in which the first trajectory and the second trajectory are different will be described in Modification 1 of Embodiment 3.
  • When a flick operation is performed, the control unit 14 determines the flick operation as a left operation. As a result, the control unit 14 executes the function associated with the left icon L1 without executing the function associated with the right icon R1.
  • When a drag operation is performed, the control unit 14 determines the drag operation as a right operation. As a result, the control unit 14 executes the function associated with the right icon R1 without executing the function associated with the left icon L1.
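The text does not state how a flick is distinguished from a drag; one common heuristic, assumed here purely for illustration, is the stroke's speed: a short, fast stroke is treated as a flick and a slower one as a drag.

```python
# Assumed flick/drag heuristic (not taken from the source): classify by
# release speed. Threshold value and names are illustrative.

FLICK_SPEED_MM_PER_S = 300.0

def classify_stroke(length_mm, duration_s):
    speed = length_mm / duration_s if duration_s > 0 else float("inf")
    return "flick" if speed >= FLICK_SPEED_MM_PER_S else "drag"

# Per the example above, a flick maps to a left operation and a drag to
# a right operation; a 40 mm stroke in 0.08 s is fast enough to flick.
side = {"flick": "left", "drag": "right"}[classify_stroke(40.0, 0.08)]
assert side == "left"
```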
  • A touch panel that can discriminate between a light touch operation, in which the touch panel is touched lightly, and a push operation, in which the touch panel is pressed with a predetermined strength, may be applied to the touch panel 3. In this case, when the operation input processing unit 9 determines, based on the output signal from the touch panel 3, that a light touch operation has been performed on an icon, it may determine that the operation is an operation from the driver's seat side; when it determines that a push operation has been performed, it may determine that the operation is an operation from the passenger seat side.
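A sketch of this pressure-based variant, with an assumed force threshold standing in for the "predetermined strength":

```python
# Hypothetical sketch: a touch below a force threshold counts as a
# light touch (driver's-seat-side operation); at or above it, a push
# operation (passenger's-seat-side operation). Threshold and names are
# assumptions for illustration.

PUSH_THRESHOLD_N = 2.0  # "predetermined strength", value assumed

def seat_side_from_force(force_n):
    return "driver" if force_n < PUSH_THRESHOLD_N else "passenger"

assert seat_side_from_force(0.5) == "driver"     # light touch
assert seat_side_from_force(3.0) == "passenger"  # push operation
```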
  • In Modification 1 of Embodiment 3, the shape of the first trajectory of the first trajectory gesture operation is different from the shape of the second trajectory of the second trajectory gesture operation. For example, the first trajectory is linear. Note that the first trajectory is not limited to a perfectly linear shape; any shape contained in a linear region having a predetermined width (for example, a substantially linear shape with relatively small zigzags) may be used.
  • As shown in FIGS. 21A and 21B, suppose that a V-shaped drag operation with the finger 21 is performed on the left icon L1 and the right icon R1 (the arrow 21D in the figures indicates the trajectory of the finger 21 in the drag operation). In this case, the control unit 14 determines the drag operation as a right operation. As a result, the control unit 14 executes the function associated with the right icon R1 without executing the function associated with the left icon L1.
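One way to test whether a trajectory is "included in a linear region having a predetermined width" is to bound each sample's perpendicular deviation from the start-to-end chord; this criterion and the band width below are assumptions for illustration:

```python
import math

# Assumed linearity test: if every sample of the stroke stays within a
# band of the given width around the start-to-end chord, treat the
# trajectory as substantially linear; otherwise as a different shape
# such as the V-shaped trajectory above.

BAND_WIDTH_MM = 8.0  # illustrative width of the linear region

def is_substantially_linear(points, band_mm=BAND_WIDTH_MM):
    (x0, y0), (x1, y1) = points[0], points[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    if chord == 0:
        return True
    for (px, py) in points:
        # perpendicular distance from the sample to the chord line
        dev = abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0)) / chord
        if dev > band_mm / 2:
            return False
    return True

# A V-shaped drag bends far outside the band, so it is not linear:
assert not is_substantially_linear([(0, 0), (20, 25), (40, 0)])
```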
  • In another modification, the first trajectory of the first trajectory gesture operation and the second trajectory of the second trajectory gesture operation are both linear, and the extending directions of the first trajectory and the second trajectory are different. Note that the first trajectory and the second trajectory are not limited to perfectly linear shapes; shapes contained in a linear region having a predetermined width (for example, substantially linear shapes with relatively small zigzags) may be used.
  • For example, the first inclination angle θ1, measured counterclockwise from the horizontal direction of the touch panel 3 to the extending direction of the first trajectory, is defined within a predetermined range (θ1S to θ1L) that is larger than 0 degrees and smaller than 90 degrees.
  • Here, the horizontal X-axis direction corresponds to an inclination angle θ of 0 degrees, the vertical Y-axis direction corresponds to an inclination angle θ of 90 degrees, the −X-axis direction corresponds to an inclination angle θ of 180 degrees, and the −Y-axis direction corresponds to an inclination angle θ of 270 degrees.
  • Likewise, the second inclination angle θ2, measured counterclockwise from the horizontal direction of the touch panel 3 to the extending direction of the second trajectory, is defined within a predetermined range (θ2S to θ2L) that is larger than 90 degrees and smaller than 180 degrees.
  • For example, θ1S and θ1L may be set to 10 degrees and 80 degrees, respectively, but may be any angles larger than 0 degrees and smaller than 90 degrees; similarly, θ2S and θ2L may be set to 100 degrees and 170 degrees, respectively, but may be any angles larger than 90 degrees and smaller than 180 degrees.
  • Suppose that a linear drag operation with the finger 21 is performed on the left icon L1 and the right icon R1, and that the inclination angle θ of the straight line related to the drag operation is within θ1S to θ1L (the arrow 21E in the figure indicates the trajectory of the finger 21 in the drag operation). In this case, the control unit 14 determines the drag operation as a left operation. As a result, the control unit 14 executes the function associated with the left icon L1 without executing the function associated with the right icon R1.
  • As shown in FIGS. 24A and 24B, suppose that a linear drag operation with the finger 21 is performed on the left icon L1 and the right icon R1, and that the inclination angle θ of the straight line related to the drag operation is within θ2S to θ2L (the arrow 21F in the figure indicates the trajectory of the finger 21 in the drag operation). In this case, the control unit 14 determines the drag operation as a right operation. As a result, the control unit 14 executes the function associated with the right icon R1 without executing the function associated with the left icon L1.
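A sketch of the inclination-angle classification (the angle ranges use the example values from the text; the fold into [0, 180) and the coordinate orientation are illustrative assumptions):

```python
import math

# Minimal sketch of the inclination-angle rule. Coordinates assume the
# figure's axes: x to the right, y upward. The ranges are configurable,
# e.g. depending on the panel's position relative to the seats.

THETA1 = (10.0, 80.0)    # theta1S..theta1L -> first trajectory -> left
THETA2 = (100.0, 170.0)  # theta2S..theta2L -> second trajectory -> right

def classify_by_inclination(start, end, theta1=THETA1, theta2=THETA2):
    # Fold the stroke direction into [0, 180): a line's inclination
    # angle does not depend on which end the stroke started from.
    angle = math.degrees(math.atan2(end[1] - start[1],
                                    end[0] - start[0])) % 180.0
    if theta1[0] <= angle <= theta1[1]:
        return "left"
    if theta2[0] <= angle <= theta2[1]:
        return "right"
    return None  # outside both ranges: treat as neither

# A stroke rising to the right (about 45 degrees) is a left operation:
assert classify_by_inclination((0, 0), (30, 30)) == "left"
# A stroke rising to the left (about 135 degrees) is a right operation:
assert classify_by_inclination((0, 0), (-30, 30)) == "right"
```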
  • Here, consider the case where the user of the left seat performs a linear drag operation on the touch panel 3 arranged to the right of that user. The user of the left seat can rotate the forearm 26 of the right arm, the arm nearer the touch panel 3, around the elbow 27 so as to draw it toward himself or herself, and can thereby relatively easily perform a drag operation that draws a straight line along the arrow 21E. Therefore, the inclination angle θ of the straight line drawn by the user of the left seat in a drag operation is likely to fall within a range larger than 0 degrees and smaller than 90 degrees.
  • Since the first inclination angle θ1 is defined within θ1S to θ1L, a trajectory gesture operation that is natural from an ergonomic viewpoint and highly likely to be performed as a left operation can be applied to the first prescribed operation. Similarly, a trajectory gesture operation that is natural from an ergonomic viewpoint and highly likely to be performed as a right operation can be applied to the second prescribed operation.
  • The range (θ1S to θ1L) used to define the first inclination angle θ1 and the range (θ2S to θ2L) used to define the second inclination angle θ2 are not limited to the values above, and may be changed to other values based on the position of the touch panel 3 with respect to the driver seat and the passenger seat. For example, the first inclination angle θ1 may be defined within a predetermined range larger than −45 degrees and smaller than 45 degrees, and the second inclination angle θ2 may be defined within a predetermined range larger than 45 degrees and smaller than 135 degrees.
  • In another modification, the first prescribed operation is an operation that changes a slide bar displayed on the split view display unit 2 to a first display state, and the second prescribed operation is an operation that changes the slide bar to a second display state.
  • While the slide bar 31 is in the first display state, in which the ball 32 in the slide bar 31 is positioned on the left side, the control unit 14 determines subsequent gesture operations (touch operation, drag operation, flick operation, pinch operation, and the like) as left operations.
  • When an operation that changes the position of the ball 32 of the slide bar 31 to the second display state, in which the ball 32 in the slide bar 31 is positioned on the right side (for example, a trajectory gesture operation that moves the indicator from the left side to the right side on the slide bar 31), is performed, the control unit 14 determines subsequent gesture operations (touch operation, drag operation, flick operation, pinch operation, and the like) as right operations.
  • The position of the ball 32 of the slide bar 31 to be displayed immediately after the device is started is stored in, for example, the storage unit 11. For this reason, if, immediately after start-up, an operation on an icon is performed without first changing the position of the ball 32 of the slide bar 31, the operation is determined as a left operation or a right operation according to the stored position of the ball 32. In this respect, it is convenient for the driver if the initial position of the ball 32 of the slide bar 31 is set to the driver side, although the invention is not limited to this.
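The slide-bar variant behaves like a persistent mode switch; a minimal sketch (class and method names are illustrative, not from the patent):

```python
# Hypothetical sketch: the knob ("ball") position of the slide bar is a
# persistent mode; every subsequent gesture is attributed to the side
# the knob currently selects, and the initial position is restored from
# storage at start-up.

class SlideBarMode:
    def __init__(self, initial_side="left"):
        # initial_side would be read from storage (e.g. the driver side)
        self.side = initial_side

    def move_knob(self, side):
        """'left' knob position -> left operations; 'right' -> right."""
        self.side = side

    def attribute(self, gesture):
        """Any gesture (touch, drag, flick, pinch, ...) is determined as
        an operation from the currently selected side."""
        return (self.side, gesture)

bar = SlideBarMode()
bar.move_knob("right")
assert bar.attribute("flick") == ("right", "flick")
```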
  • The input unit is not limited to the touch panel 3, as long as it can accept, in a unified manner, an operation on the left image for executing a function of an application and an operation on the right image for executing a function of an application. For example, a touch pad provided separately from the split view display unit 2 may be applied to the input unit. In that case, the touch pad may be given a function of obtaining the three-dimensional position of the indicator, the position of the indicator on the operation area of the touch pad may be associated with the display area of the split view display unit 2, and the operation may be performed using a pointer that indicates the position of the indicator together with the icon display.
  • In the left image and the right image described so far (for example, FIGS. 7A and 7B), at least a part of the display area of each of the left icons L1 to L5 and at least a part of the display area of each of the right icons R1 to R5 are arranged so as to overlap on the screen of the split view display unit 2. However, the arrangement is not limited to this; it suffices that at least a part of the display area of at least one of the left icons L1 to L5 and at least a part of the display area of at least one of the right icons R1 to R5 are arranged so as to overlap on the screen of the split view display unit 2.
  • For a left icon and a right icon that are displayed without overlapping each other, the control unit 14 may execute the function of the displayed icon regardless of the type of the operation.
  • The display control device described above can be applied not only to the navigation device 1 provided in a vehicle but also to a display control device constructed as a system by appropriately combining a PND (Portable Navigation Device), a mobile terminal (for example, a mobile phone, a smartphone, or a tablet) that can be mounted on a vehicle, a server, and the like. In this case, the functions and components of the navigation device 1 described above are arranged in a distributed manner among the devices constituting the system.
  • The invention is not limited to this, and may also be applied to a so-called display audio device that has a display function but no navigation function, as well as to a PND, a portable terminal, a stationary display device, or a server.
  • The configuration in which the split view display unit 2 included in the navigation device 1 is applied to the split-view type display unit has been described above, but the configuration is not limited to this; for example, a split-view display unit included in a smartphone may be applied.
  • Within the scope of the invention, the embodiments and modifications may be freely combined, and each embodiment and each modification may be appropriately modified or omitted.
  • 1 navigation device, 2 split view display unit, 3 touch panel, 14 control unit, 21 finger, L1 to L5 left icons, R1 to R5 right icons.

Abstract

The invention concerns a technology capable of preventing the unintended execution of an application's function by a user's action. When it is determined that a one-point touch operation has been performed, a control unit (14) identifies the one-point touch operation as a left-side operation on a left-side image used to execute the function of the application. When it is determined that a two-point touch operation has been performed, the control unit (14) identifies the two-point touch operation as a right-side operation on a right-side image used to execute the function of the application.
PCT/JP2013/082688 2013-12-05 2013-12-05 Dispositif et procédé de commande d'affichage WO2015083267A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2013/082688 WO2015083267A1 (fr) 2013-12-05 2013-12-05 Dispositif et procédé de commande d'affichage
DE112013007666.7T DE112013007666T5 (de) 2013-12-05 2013-12-05 Anzeigesteuervorrichtung und Anzeigesteuerfahren
JP2015551343A JP6033465B2 (ja) 2013-12-05 2013-12-05 表示制御装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/082688 WO2015083267A1 (fr) 2013-12-05 2013-12-05 Dispositif et procédé de commande d'affichage

Publications (1)

Publication Number Publication Date
WO2015083267A1 true WO2015083267A1 (fr) 2015-06-11

Family

ID=53273060

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/082688 WO2015083267A1 (fr) 2013-12-05 2013-12-05 Dispositif et procédé de commande d'affichage

Country Status (3)

Country Link
JP (1) JP6033465B2 (fr)
DE (1) DE112013007666T5 (fr)
WO (1) WO2015083267A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017145746A1 (fr) * 2016-02-23 2017-08-31 京セラ株式会社 Unité de commande pour véhicule
CN108829332A (zh) * 2018-05-22 2018-11-16 珠海格力电器股份有限公司 触摸滑条的控制方法、遥控器、家用电器、计算机可读存储介质及触摸滑条装置
JP2020072943A (ja) * 2019-02-07 2020-05-14 グリー株式会社 表示制御プログラム、表示制御方法、及び表示制御システム
US11318371B2 (en) 2016-08-18 2022-05-03 Gree, Inc. Program, control method, and information processing apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006522397A (ja) * 2003-03-10 2006-09-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ マルチビュー・ディスプレイ
JP2007164614A (ja) * 2005-12-15 2007-06-28 Sharp Corp 表示装置及び表示装置の制御方法
JP2007207146A (ja) * 2006-02-06 2007-08-16 Alpine Electronics Inc 表示装置およびそのユーザインタフェース、メニュー提供方法
JP2008269225A (ja) * 2007-04-19 2008-11-06 Seiko Epson Corp 検出装置、および、その制御方法
WO2011102406A1 (fr) * 2010-02-18 2011-08-25 ローム株式会社 Dispositif d'entrée à panneau tactile

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101375235B (zh) * 2006-02-03 2011-04-06 松下电器产业株式会社 信息处理装置
JP2010061256A (ja) * 2008-09-02 2010-03-18 Alpine Electronics Inc 表示装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006522397A (ja) * 2003-03-10 2006-09-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ マルチビュー・ディスプレイ
JP2007164614A (ja) * 2005-12-15 2007-06-28 Sharp Corp 表示装置及び表示装置の制御方法
JP2007207146A (ja) * 2006-02-06 2007-08-16 Alpine Electronics Inc 表示装置およびそのユーザインタフェース、メニュー提供方法
JP2008269225A (ja) * 2007-04-19 2008-11-06 Seiko Epson Corp 検出装置、および、その制御方法
WO2011102406A1 (fr) * 2010-02-18 2011-08-25 ローム株式会社 Dispositif d'entrée à panneau tactile

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017145746A1 (fr) * 2016-02-23 2017-08-31 京セラ株式会社 Unité de commande pour véhicule
JP2017149225A (ja) * 2016-02-23 2017-08-31 京セラ株式会社 車両用コントロールユニット
US11221735B2 (en) 2016-02-23 2022-01-11 Kyocera Corporation Vehicular control unit
US11318371B2 (en) 2016-08-18 2022-05-03 Gree, Inc. Program, control method, and information processing apparatus
US11707669B2 (en) 2016-08-18 2023-07-25 Gree, Inc. Program, control method, and information processing apparatus
CN108829332A (zh) * 2018-05-22 2018-11-16 珠海格力电器股份有限公司 触摸滑条的控制方法、遥控器、家用电器、计算机可读存储介质及触摸滑条装置
CN108829332B (zh) * 2018-05-22 2023-10-03 珠海格力电器股份有限公司 触摸滑条的控制方法、遥控器、家用电器、计算机可读存储介质及触摸滑条装置
JP2020072943A (ja) * 2019-02-07 2020-05-14 グリー株式会社 表示制御プログラム、表示制御方法、及び表示制御システム

Also Published As

Publication number Publication date
DE112013007666T5 (de) 2016-08-25
JP6033465B2 (ja) 2016-11-30
JPWO2015083267A1 (ja) 2017-03-16

Similar Documents

Publication Publication Date Title
CN106062514B (zh) 便携式装置与车辆头端单元之间的交互
JP5225820B2 (ja) 入力装置、車両周辺監視装置、アイコンスイッチ選択方法及びプログラム
JP2007331692A (ja) 車載電子装置およびタッチパネル装置
JP2007042029A (ja) 表示装置およびプログラム
JP6033465B2 (ja) 表示制御装置
JP2019175449A (ja) 情報処理装置、情報処理システム、移動体、情報処理方法、及びプログラム
EP3236340B1 (fr) Appareil de traitement d'informations et procédé de commande d'appareil de traitement d'informations
JP2008129689A (ja) タッチパネルを備えた入力装置、その入力受付方法
JP2016097928A (ja) 車両用表示制御装置
JP2013161230A (ja) 入力装置
JP6147357B2 (ja) 表示制御装置及び表示制御方法
US20190155560A1 (en) Multi-display control apparatus and method thereof
WO2015083266A1 (fr) Dispositif et procédé de commande d'affichage
JP6180306B2 (ja) 表示制御装置及び表示制御方法
JP2014182808A (ja) タッチスクリーンユーザインターフェースのナビゲーション制御
JP5933468B2 (ja) 情報表示制御装置、情報表示装置および情報表示制御方法
JP6124777B2 (ja) 表示制御装置、表示制御方法及び画像設計方法
US8621347B2 (en) System for providing a handling interface
JP5901865B2 (ja) 表示制御装置及び表示制御方法
JP2015108984A (ja) 表示制御装置及び表示制御方法
JP5889230B2 (ja) 情報表示制御装置、情報表示装置および情報表示制御方法
JP5950851B2 (ja) 情報表示制御装置、情報表示装置および情報表示制御方法
JP6371589B2 (ja) 車載システム、視線入力受付方法及びコンピュータプログラム
JP5984718B2 (ja) 車載情報表示制御装置、車載情報表示装置および車載表示装置の情報表示制御方法
JP2014191818A (ja) 操作支援システム、操作支援方法及びコンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13898651

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015551343

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 112013007666

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13898651

Country of ref document: EP

Kind code of ref document: A1