WO2015151154A1 - Display apparatus, display method, and display program - Google Patents

Info

Publication number
WO2015151154A1
WO2015151154A1 (PCT/JP2014/059441)
Authority
WO
WIPO (PCT)
Prior art keywords
display
pressed
touch
finger
change
Prior art date
Application number
PCT/JP2014/059441
Other languages
French (fr)
Japanese (ja)
Inventor
晃生 川口
悟 池増
Original Assignee
パイオニア株式会社
Priority date
Filing date
Publication date
Application filed by パイオニア株式会社
Priority to PCT/JP2014/059441
Priority to JP2016511181A
Publication of WO2015151154A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to a display device that performs display control by a touch operation, a display method, and a display program.
  • the use of the present invention is not limited to the display device, the display method, and the display program.
  • conventionally, there is a touch panel that performs display control by operation with a finger or the like.
  • an operation type is acquired from the area touched by a finger operation, and an operation amount is determined based on the size of the touched area (see, for example, Patent Document 1 below).
  • the amount of operation cannot be varied by the manner of touch operation with a plurality of fingers.
  • the operation amount is different for each person depending on the size of the touched finger or the like.
  • the operation is troublesome; for example, the strength of each finger's touch must be changed to increase or decrease the touch area in order to change the operation amount.
  • a display device according to the invention is a display device having a touch panel, comprising: a first receiving unit that receives a first touch operation of touching a part of the touch panel; a second receiving unit that receives, within a reference time after the first touch operation, a second touch operation of touching a plurality of other parts of the touch panel; and a determining unit that determines processing content related to display from the first touch operation and determines a degree of change related to the processing content from the second touch operation.
  • a display method according to the invention is a display method implemented by a display device having a touch panel.
  • a display program according to the invention of claim 7 causes a computer to execute the display method of claim 6.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of the display device according to the embodiment.
  • FIG. 2 is a flowchart illustrating a processing example of the display device according to the embodiment.
  • FIG. 3 is a block diagram illustrating an example of a hardware configuration of the navigation device according to the embodiment.
  • FIG. 4 is a diagram for explaining display control for changing the magnification based on a finger operation according to the first embodiment.
  • FIG. 5 is a diagram for explaining display control of magnification change based on a finger operation according to the second embodiment.
  • FIG. 6 is a flowchart illustrating an example of display control for changing the magnification based on a finger operation according to the second embodiment.
  • FIG. 7 is a diagram for explaining display control for changing the scroll amount based on a finger operation according to the third embodiment.
  • FIG. 8 is a flowchart illustrating a display control example of scroll amount change based on a finger operation according to the third embodiment.
  • FIG. 9 is a diagram for explaining display control of scroll change based on a finger operation according to the fourth embodiment.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of a display device according to an embodiment.
  • the display device 100 is configured such that an operation unit 111 such as a touch panel is provided on the display unit 112.
  • the coordinate position of the display screen of the display unit 112 and the coordinate position of the operation surface of the operation unit 111 are set to the same coordinate by a touch panel control unit (not shown).
  • the display device 100 includes a first reception unit 101, a second reception unit 102, and a determination unit 103.
  • the first reception unit 101 receives a first operation in which a part of the operation unit 111 is pressed (touched) by the user.
  • the second receiving unit 102 receives a second operation of pressing a plurality of other parts of the operation unit 111 within a reference time after the first operation received by the first receiving unit 101.
  • since the second receiving unit 102 accepts the second operation within the reference time after the first operation, a press of another part of the operation unit 111 within the reference time is accepted as the second operation even if the user has once lifted the finger from the operation unit 111. As a result, the user can separate in time, for example, the operation of setting the display type (the first operation) and the operation of setting the display operation amount (the second operation), which makes the operation clear and easy to perform.
  • the determination unit 103 determines the processing content from the first operation, and determines the change direction and the degree of change of that processing from the second operation. For example, when the first operation selects enlargement or reduction of a scale such as a map scale, the second operation determines the degree of change of the scale for the determined enlargement or reduction. When the first operation selects scrolling of a map or the like, the second operation determines the scroll amount. The degree of change in the second operation indicates the speed of change of the display content per unit time.
  • the processing content determined by the determination unit 103 and the degree of change with respect to the processing content are output to the touch panel control unit, and the display content of the display unit 112 is controlled.
  • the determining unit 103 can determine the direction of processing based on the direction in which a plurality of fingers are pressed in the second operation (the direction in which the fingers are arranged), and can determine the degree of change based on the number of presses.
  • the determination unit 103 is not limited to determining the degree of change only while the second operation is in progress, that is, only while fingers are pressed (touching) on the operation unit 111. When fingers are pressed on the operation unit 111, the degree of change for the processing content is determined and the display content on the display unit 112 is actually changed; thereafter, when a finger is released from the operation unit 111, the degree of change of the processing may be determined again.
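  • as a concrete sketch of the units above (first receiving unit 101, second receiving unit 102, determination unit 103), the following Python fragment models the reference-time check; the class and method names and the 1-second reference time are illustrative assumptions, not taken from the publication.

```python
REFERENCE_TIME = 1.0  # seconds; the embodiment suggests e.g. 1 second

class DisplayDevice:
    """Illustrative model of the first/second receiving units and the
    determination unit (101-103); a sketch, not the implementation."""

    def __init__(self):
        self.first_op = None  # (timestamp, processing content)

    def receive_first_touch(self, now, processing):
        # First receiving unit 101: the first touch selects the
        # processing content, e.g. "enlarge", "reduce", or "scroll".
        self.first_op = (now, processing)

    def receive_second_touch(self, now, finger_count):
        # Second receiving unit 102: accepted only within the
        # reference time after the first operation, even if the
        # first finger was lifted in between.
        if self.first_op is None or now - self.first_op[0] > REFERENCE_TIME:
            return None
        # Determination unit 103: the finger count sets the degree of
        # change for the already-determined processing content.
        return (self.first_op[1], finger_count)
```

  • for example, a three-finger second touch at t = 0.5 s after an "enlarge" press at t = 0 yields ("enlarge", 3), while a second touch at t = 2.0 s falls outside the reference time and is rejected.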
  • FIG. 2 is a flowchart illustrating a processing example of the display device according to the embodiment.
  • the display device 100 waits, at the first receiving unit 101, for a first operation in which the user presses a part of the operation unit 111 (step S201: No loop). When the user performs the first operation on the operation unit 111 (step S201: Yes), the process proceeds to step S202.
  • the second reception unit 102 determines whether the user has performed a second operation on the operation unit 111 within a reference time (step S202).
  • if the second receiving unit 102 does not receive a second operation within the reference time, the process returns to step S201.
  • if a plurality of fingers are pressed at positions different from the first pressed position within a reference time (for example, within 1 second) after the first operation, it is determined that the second operation has been performed (step S202: Yes).
  • the determination unit 103 determines the processing content from the first operation (step S203). For example, if enlargement of the scale is selected for the displayed map in the first operation, scale enlargement is determined as the processing content.
  • the determination unit 103 determines the degree of change (change amount) for the processing content (scale enlargement of the map) from the second operation. For example, in the second operation, the enlargement ratio of the map scale is determined by the number of fingers (step S204). Specifically, the enlargement ratio increases with the number of fingers pressing the operation unit 111; for instance, the magnification is doubled when pressed with two fingers and tripled when pressed with three fingers.
  • the degree of change at the time of display processing can be changed based on the number of fingers pressing the operation unit.
  • after the processing content is determined by the first operation, the user may remove the finger from the operation unit and perform the second operation within the reference time.
  • the degree of change with respect to the processing content determined in the first operation is determined based on the number of fingers.
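  • the finger-count rule of steps S203 to S204 (two fingers double the magnification, three fingers triple it) can be sketched as follows; the helper name and the use of the reciprocal for reduction are assumptions for illustration.

```python
def magnification_from_fingers(finger_count, enlarge=True):
    # Degree of change set by the second operation: n pressed fingers
    # give an n-times enlargement (2 fingers -> 2x, 3 fingers -> 3x),
    # or the reciprocal 1/n when the determined processing is reduction.
    if finger_count < 2:
        return 1.0  # a single finger leaves the scale unchanged
    return float(finger_count) if enlarge else 1.0 / finger_count
```

  • a two-finger second operation thus yields a 2x enlargement, and a three-finger press with reduction selected yields 1/3.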
  • the determination unit can determine the processing direction from the direction of the plurality of presses in the second operation (the direction in which the fingers are arranged). For example, when a scale change of the map is determined by the first operation and a plurality of fingers are then pressed in order roughly along the rightward direction of the screen, enlargement of the scale by the number of pressed fingers is determined; conversely, when a plurality of fingers are pressed in order roughly along the leftward direction, reduction of the scale by the number of pressed fingers is determined. The degree of change (the scale amount when scaling, the scroll amount when scrolling) can be determined by the number of presses.
  • the determination unit can also determine the degree of change including the state after fingers are released: after a plurality of fingers are pressed once on the operation unit in the second operation and the display content is changed by the corresponding amount, releasing fingers can change it again. For example, if the number of pressed fingers decreases by one after the display has been enlarged to a predetermined ratio (3 times) by the second operation, the degree of change is reduced by the decrease and the display scale is reduced accordingly (changed to 2 times).
  • in this embodiment, the processing content and the degree of change for it can thus be determined easily, simply by pressing a plurality of fingers, without sliding a finger on the operation unit.
  • the display contents can be controlled with a simple operation.
  • FIG. 3 is a block diagram illustrating an example of a hardware configuration of the navigation device according to the embodiment.
  • a navigation device 300 includes a CPU 301, a ROM 302, a RAM 303, a magnetic disk drive 304, a magnetic disk 305, an optical disk drive 306, an optical disk 307, an audio I/F (interface) 308, a microphone 309, a speaker 310, an input device 311, a video I/F 312, a display 313, a communication I/F 314, a GPS unit 315, various sensors 316, and a camera 317.
  • Each component 301 to 317 is connected by a bus 320.
  • the CPU 301 governs overall control of navigation device 300.
  • the ROM 302 records a boot program, a route search program, and the like.
  • the RAM 303 is used as a work area for the CPU 301. That is, the CPU 301 controls the entire navigation device 300 by executing various programs recorded in the ROM 302 while using the RAM 303 as a work area.
  • the magnetic disk drive 304 controls reading/writing of data with respect to the magnetic disk 305 under the control of the CPU 301.
  • the magnetic disk 305 records data written under the control of the magnetic disk drive 304.
  • as the magnetic disk 305, for example, an HD (hard disk) or an FD (flexible disk) can be used.
  • the optical disk drive 306 controls reading / writing of data with respect to the optical disk 307 according to the control of the CPU 301.
  • the optical disk 307 is a detachable recording medium from which data is read according to the control of the optical disk drive 306.
  • a writable recording medium can be used as the optical disc 307.
  • an MO, a memory card, or the like can be used as a removable recording medium.
  • examples of information recorded on the magnetic disk 305 and the optical disk 307 include map data, vehicle information, images, and travel history. The map data, which is used when searching for routes in the navigation system, is vector data including background data that represents features such as buildings, rivers, ground surfaces, and energy supply facilities, and road shape data that represents road shapes with links and nodes.
  • the voice I / F 308 is connected to a microphone 309 for voice input and a speaker 310 for voice output.
  • the sound received by the microphone 309 is A / D converted in the sound I / F 308.
  • the microphone 309 is installed, for example, in the dashboard of the vehicle, and one or more microphones may be provided. The speaker 310 outputs sound obtained by D/A conversion of a predetermined sound signal in the audio I/F 308.
  • the input device 311 includes a remote controller, a keyboard, a touch panel, and the like provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like.
  • as the input device 311, a transparent touch panel provided on the display screen of the display 313 is used.
  • the video I/F 312 is connected to the display 313. Specifically, the video I/F 312 includes, for example, a graphic controller that controls the entire display 313, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a control IC that controls the display 313 based on image data output from the graphic controller.
  • the display 313 displays icons, cursors, menus, windows, or various data such as characters and images.
  • a TFT liquid crystal display, an organic EL display, or the like can be used as the display 313, for example.
  • the camera 317 shoots an image including, for example, a road outside the vehicle.
  • the image may be either a still image or a moving image.
  • the camera 317 photographs the outside of the vehicle; the photographed image is analyzed by the CPU 301, or is output via the video I/F 312 to a recording medium such as the magnetic disk 305 or the optical disk 307.
  • the communication I/F 314 is connected wirelessly to a network and functions as an interface between the navigation device 300 and the CPU 301.
  • networks to which it connects include in-vehicle communication networks such as CAN and LIN (Local Interconnect Network), public line networks, mobile phone networks, DSRC (Dedicated Short Range Communication), LANs, and WANs.
  • the communication I / F 314 is, for example, a public line connection module, an ETC (non-stop automatic fee payment system) unit, an FM tuner, a VICS (Vehicle Information and Communication System) / beacon receiver, or the like.
  • the GPS unit 315 receives radio waves from GPS satellites and outputs information indicating the current position of the vehicle.
  • the output information of the GPS unit 315 is used when the CPU 301 calculates the current position of the vehicle together with output values of various sensors 316 described later.
  • the information indicating the current position is information for specifying one point on the map data such as latitude / longitude and altitude.
  • Various sensors 316 output information for determining the position and behavior of the vehicle, such as a vehicle speed sensor, an acceleration sensor, an angular velocity sensor, and a tilt sensor.
  • the output values of the various sensors 316 are used for the calculation of the current position of the vehicle by the CPU 301 and the amount of change in speed and direction.
  • the CPU 301 executes predetermined programs using programs and data recorded in the ROM 302, the RAM 303, the magnetic disk 305, the optical disk 307, and the like of the navigation device 300 shown in FIG. 3, thereby realizing the functions of the first receiving unit 101 to the determination unit 103 in FIG. 1 described above (determination of display processing content, determination of the degree of change for the determined processing content, and display control on the display 313 according to the determined content).
  • FIG. 4 is a diagram for explaining display control for changing the magnification based on a finger operation according to the first embodiment.
  • on the display screen of the display 313, an input device (touch panel) 311 is provided so as to overlap it.
  • the map data is displayed on the display screen of the display 313, and an operation button (group) 401 for determining the processing content for the display (scale change in the illustrated example) is displayed on the left side of the display screen.
  • the input device (touch panel) 311 detects the pressed coordinate position when the operation button 401 or the like displayed on the display 313 is pressed.
  • the input device 311 detects a magnification enlargement operation when the magnification enlargement button 401a is pressed.
  • the display control performed by the navigation device 300 includes the determination of the processing contents described above and the determination of the degree of change with respect to the determined processing contents. These determination processes will be mainly described below.
  • as an example in which the user reduces the scale of the map data, the user first presses the magnification reduction button 401b as the first operation, for example with the index finger B (the location of touch1 in the figure). The touch panel 311 thereby detects the press of the magnification reduction button 401b, and the CPU 301 determines magnification reduction as the display processing content. At this time, the map data is reduced to 1/2 by the first operation.
  • when the user wants to change the reduction ratio further, the user performs a second operation.
  • the user presses, with fingers other than the finger used in the first operation (the index finger B), positions on the touch panel 311 other than where the operation button (group) 401 is arranged (an arbitrary part of the map data).
  • when the user presses the touch panel 311 with a finger other than the index finger B used for the first operation (in this example, the middle finger C; the location of touch2 in the figure), a total of two fingers are pressed and the reduction ratio can be further changed (for example, to 1/3).
  • if the touch panel 311 is further pressed with the ring finger D (the location of touch3 in the figure), a total of three fingers are pressed on the touch panel 311 and the reduction ratio is further changed (for example, to 1/4).
  • if the touch panel 311 is further pressed with the little finger E (the location of touch4 in the figure), a total of four fingers are pressed on the touch panel 311 and the reduction ratio is further changed (for example, to 1/5).
  • in this way, the CPU 301 changes the reduction ratio according to the number of pressed fingers. Even if the finger pressed in the first operation (the index finger B) is released, a press within the reference time on a part of the touch panel 311 other than the operation button 401 (an arbitrary part of the map data; the middle finger C in the above example) can be detected as the second operation.
  • the degree of the processing content determined by the first operation (magnification reduction in this example) can be increased simply by increasing the number of fingers pressed in the second operation.
  • the degree of magnification reduction can be increased as the number of fingers pressing the touch panel 311 is increased.
  • the degree of magnification enlargement can be increased as the number of fingers pressing down the touch panel 311 is increased.
  • the CPU 301 can also reverse the display control performed up to that point. For example, when one finger (for example, the little finger E) is released after pressing with the four fingers B, C, D, and E shown in FIG. 4, the reduction control returns to the level corresponding to three fingers (a reduction ratio of 1/4). In this way, the degree of change of the determined processing content can be arbitrarily increased or decreased merely by changing the number of fingers of one hand pressing the touch panel 311.
  • the degree of change can then be increased or decreased again according to the number of pressed fingers.
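  • under the first-embodiment numbers above (one finger on the reduction button gives 1/2, each additional finger deepens the reduction to 1/3, 1/4, 1/5, and lifting a finger steps back), the mapping from the current finger count to the reduction ratio can be sketched as follows; the function name is illustrative.

```python
def reduction_ratio(pressed_fingers):
    # FIG. 4 example: 1 finger -> 1/2, 2 -> 1/3, 3 -> 1/4, 4 -> 1/5.
    # Lifting one of four fingers returns to the three-finger ratio
    # (1/4), matching the reversal behavior described above.
    if pressed_fingers < 1:
        return 1.0  # no press: scale unchanged
    return 1.0 / (pressed_fingers + 1)
```

  • releasing the little finger E after a four-finger press simply re-evaluates the same mapping with three fingers.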
  • FIG. 5 is a diagram for explaining display control of magnification change based on a finger operation according to the second embodiment.
  • in the second embodiment, the direction of change related to the processing content and the degree of change related to the processing content are determined according to the direction in which the fingers are pressed in order by the second operation.
  • instead of the two buttons (the magnification enlargement button 401a and the magnification reduction button 401b), a single magnification change button 401c may be displayed.
  • the user first presses the magnification change button 401c with, for example, the index finger B (the location of touch 1 in the drawing).
  • the CPU 301 determines magnification change as the display processing content from the first operation, but the change direction of the processing content (enlargement or reduction of the magnification) is not yet determined.
  • the user then sequentially presses a plurality of fingers along different directions depending on whether the magnification is to be enlarged or reduced; the CPU 301 thereby determines the direction of change of the processing content.
  • when performing enlargement, the user presses, with a plurality of fingers other than the finger used in the first operation (the index finger B), parts of the touch panel 311 other than the operation button 401 (an arbitrary part of the map data).
  • the CPU 301 determines that the subsequent pressing direction is the right direction, and in this case, determines magnification enlargement as the direction of change in the processing content.
  • a coordinate position within a predetermined distance from the coordinate position of the magnification change button 401c is set in advance as a "close" place.
  • the CPU 301 determines the second processing content assuming that the degree of change in the processing content is doubled, for example.
  • the CPU 301 determines the second processing content by setting the degree of change in the processing content to, for example, a reduction ratio of 1/2.
  • the reduction ratio can then be changed further (for example, to 1/3); if the touch panel 311 is further pressed with the middle finger C located on the left side of the ring finger D (the location of touch2 in the drawing), the reduction ratio can be changed to 1/4.
  • FIG. 6 is a flowchart illustrating an example of display control for changing the magnification based on a finger operation according to the second embodiment.
  • the magnification change determination process executed by the CPU 301 is mainly described.
  • when the CPU 301 detects that the magnification change button 401c is pressed (step S601), the CPU 301 waits for detection of a second pressed position at a place other than the magnification change button 401c (step S602).
  • the user may release the magnification change button 401c after pressing it, or, as in the first embodiment, may keep pressing the magnification change button 401c and perform the second press with another finger.
  • in step S602, the CPU 301 waits for the next (second) press for the reference time after the press of the magnification change button 401c. If there is no next press within the reference time (step S602: Case 1), the magnification change operation is canceled (step S603) and the process ends.
  • if a position close to the magnification change button 401c (touch2 in the example of FIG. 5) is pressed within the reference time after the button is pressed (step S602: Case 2), the CPU 301 determines enlargement as the magnification change process (step S604). If a position far from the magnification change button 401c (touch4 in the example of FIG. 5) is pressed within the reference time (step S602: Case 3), the CPU 301 determines reduction as the magnification change process (step S610).
  • after step S604, it is determined whether there is a third press within a predetermined time from the second press in step S602 (step S605). If there is no third press (step S605: No), the number of pressed fingers is two, the enlargement magnification is determined to be 2 times (step S606), and the process ends. If there is a third press (for example, touch3 in FIG. 5; step S605: Yes), the process proceeds to step S607.
  • in step S607, it is determined whether there is a fourth press within a predetermined time after step S605. If there is no fourth press (step S607: No), the number of pressed fingers is three, the enlargement magnification is determined to be 3 times (step S608), and the process ends. If there is a fourth press (for example, touch4 in FIG. 5; step S607: Yes), the number of pressed fingers is four, the enlargement magnification is determined to be 4 times (step S609), and the process ends.
  • in step S611, it is determined whether there is a third press within a predetermined time from the second press in step S602. If there is no third press (step S611: No), the number of pressed fingers is two, the reduction ratio is determined to be 1/2 (step S612), and the process ends. If there is a third press (for example, touch3 in FIG. 5; step S611: Yes), the process proceeds to step S613.
  • in step S613, it is determined whether there is a fourth press within a predetermined time after step S611. If there is no fourth press (step S613: No), the number of pressed fingers is three, the reduction ratio is determined to be 1/3 (step S614), and the process ends. If there is a fourth press (for example, touch2 in FIG. 5; step S613: Yes), the number of pressed fingers is four, the reduction ratio is determined to be 1/4 (step S615), and the process ends.
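  • the FIG. 6 case logic (cancel on timeout; a second press near the change button selects enlargement, a distant press selects reduction; the final finger count fixes the ratio) might be condensed as below. The distance threshold and the function name are assumptions; the publication only says "within a predetermined distance".

```python
REFERENCE_TIME = 1.0   # s between button press and second press (assumed)
NEAR_THRESHOLD = 50.0  # px defining "close" to button 401c (assumed)

def decide_magnification(delay, distance, finger_count):
    # Case 1 (S602 -> S603): no second press within the reference time.
    if delay > REFERENCE_TIME:
        return None  # operation canceled
    # Case 2 (S604-S609): press near 401c -> enlarge, n fingers -> n times.
    if distance <= NEAR_THRESHOLD:
        return float(finger_count)
    # Case 3 (S610-S615): press far from 401c -> reduce, n fingers -> 1/n.
    return 1.0 / finger_count
```

  • for instance, a three-finger press 10 px from the button within the reference time yields a 3x enlargement, while a two-finger press 200 px away yields a 1/2 reduction.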
  • In addition, the processing content of the first operation can be determined at the same time by the direction in which the plurality of fingers are pressed during the second operation.
  • The finger may be released once after the first operation; it only needs to be pressed again within the reference time. Therefore, by releasing the finger, the determination of the processing content by the first operation and the determination of the degree of change by the second operation can be separated, both in feel and in time.
  • the finger used in the first operation may be used again during the second operation.
  • For example, after the magnification reduction button 401b is pressed (touch1) with the index finger B, the index finger B may be released once and the touch2 portion may be pressed again with the same index finger B in the second operation.
  • FIG. 7 is a diagram for explaining display control for changing the scroll amount based on a finger operation according to the third embodiment.
  • In the above examples, the magnification change has been described; in the third embodiment, the scroll amount change will be described.
  • In display control related to scrolling, the determination of the processing content and the determination of the change amount of the determined processing content are basically performed by pressing a plurality of fingers, as in the above embodiments.
  • For example, the CPU 301 displays the index X at the center position on the display 313, and when a position away from the index X is pressed, performs control to scroll the display screen (map data) so that the pressed position becomes the center position of the display 313.
  • control is performed to scroll the display screen (map data) so that the position touched by the finger first becomes the center position of the display 313.
  • the CPU 301 performs control so that the position of touch1 becomes the center position of the display 313 (corresponding to the first operation).
  • the scroll speed is set to a normal speed (1 time).
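The centering control described above can be sketched as a helper that computes how far the map must shift; the names and the screen-coordinate convention are assumptions for illustration:

```python
def scroll_offset(pressed, display_size):
    """Return the (dx, dy) by which the map should be shifted so that the
    pressed screen position ends up at the center of the display."""
    cx, cy = display_size[0] / 2, display_size[1] / 2
    px, py = pressed
    return (cx - px, cy - py)
```

Pressing the exact center yields a zero offset, i.e. no scrolling is needed.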
  • the CPU 301 determines the second operation, that is, the degree of change (scroll speed) by the number of subsequent fingers.
  • When the press is performed with two fingers, the scroll speed is changed (increased) to twice the normal speed.
  • Similarly, the scroll speed is changed to three times the normal speed when pressed with three fingers including the ring finger D, and to four times the normal speed when pressed with four fingers including the little finger E.
  • FIG. 8 is a flowchart illustrating a display control example of scroll amount change based on a finger operation according to the third embodiment.
  • The scroll speed change determination process executed by the CPU 301 is mainly described. First, when the CPU 301 detects a finger press on the map data (step S801), the CPU 301 performs control to scroll the display screen (map data) so that the pressed position becomes the center position of the display 313.
  • the CPU 301 determines whether or not there is a second finger press within a predetermined time from the first press (step S802). If there is no second press (step S802: No), scrolling is started with the scroll speed set to the normal speed (1 time) (step S803), and the process ends.
  • step S802 if there is a second press (touch2 in the example of FIG. 7, step S802: Yes), the process proceeds to step S804.
  • step S804 it is determined whether there is a third finger press within a predetermined time from the second press (step S804). If there is no third press (step S804: No), the number of pressed fingers is two, scrolling is started with the scroll speed set to twice the normal speed (step S805), and the process ends.
  • step S804 if there is a third press (touch3 in the example of FIG. 7, step S804: Yes), the process proceeds to step S806.
  • step S806 it is determined whether there is a fourth finger press within a predetermined time from the third press (step S806). If there is no fourth press (step S806: No), the number of pressed fingers is three, the scroll speed is set to three times the normal speed, scrolling is started (step S807), and the process ends. If there is a fourth press (touch4 in the example of FIG. 7, step S806: Yes), the number of pressed fingers is four, and scrolling is started with the scroll speed set to four times the normal speed (step S808). The process is terminated.
  • the scroll speed can be decreased every time the number of pressed fingers is reduced, and the scroll speed can be increased or decreased in accordance with the increase or decrease of the number of pressed fingers.
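A minimal mapping consistent with the description above (evaluating the current finger count so that the speed also falls when fingers are lifted) could look like this; the function name and the clamping to the 1x-4x range are assumptions:

```python
def scroll_speed(num_pressed_fingers):
    """Map the number of currently pressed fingers to a scroll-speed
    multiplier: 1 finger -> 1x ... 4 fingers -> 4x. Because the mapping
    is re-evaluated against the current count, lifting a finger lowers
    the speed again, as described above."""
    return float(max(1, min(num_pressed_fingers, 4)))
```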
  • (Example 4) The fourth embodiment is a modification of the third embodiment.
  • the scrolling speed is changed in accordance with the time interval between pressing of a plurality of fingers or the pressing width of each finger.
  • FIG. 9 is a diagram for explaining display control for scroll change based on finger operation according to the fourth embodiment.
  • the scroll speed may be changed based on the interval between adjacent positions pressed by a plurality of fingers.
  • Pressing at one place (touch1) scrolls at the normal speed (1 time). Pressing with two fingers (touch1 and touch2) is based on a scroll speed twice the normal speed, and depending on the interval L1 between them, the scroll speed is set to 1.5 to 2.5 times.
  • the scroll speed can be continuously varied according to the interval L1 between the two fingers.
  • the scroll speed can also be changed by changing the interval L1 while two fingers are pressed.
  • Similarly, pressing with three fingers is based on a scroll speed three times the normal speed, and the scroll speed is set to 2.5 to 3.5 times according to the interval L2.
  • The scroll speed need not be changed only by the adjacent intervals L1 and L2 each time an adjacent finger is pressed; instead, the scroll speed may be changed based on the entire interval of all the pressed fingers, that is, the interval L1 + L2.
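One possible sketch of this interval-based variant follows; the reference spacing REF_INTERVAL_MM, the linear +/-0.5 modulation, and all names are assumptions (the description only specifies the resulting 1.5-2.5x and 2.5-3.5x ranges):

```python
REF_INTERVAL_MM = 20.0  # assumed "standard" spacing between adjacent fingers (hypothetical)

def interval_scroll_speed(num_fingers, total_interval):
    """Start from the finger-count base speed (2 fingers -> 2x,
    3 -> 3x, 4 -> 4x) and shift it by up to +/-0.5 depending on how the
    overall spacing compares to the reference. total_interval is L1 for
    two fingers and L1 + L2 for three fingers."""
    base = float(max(2, min(num_fingers, 4)))
    expected = (num_fingers - 1) * REF_INTERVAL_MM
    ratio = (total_interval - expected) / expected
    return base + 0.5 * max(-1.0, min(ratio, 1.0))
```

With two fingers at the reference spacing this returns exactly 2.0; wider spacing pushes the speed toward 2.5, narrower spacing toward 1.5.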
  • Although the above examples described a configuration example in which a navigation apparatus is used as the display apparatus, the invention can be applied in the same way to any electronic device having a touch panel, for example, a smartphone, a tablet, or a PC display provided with a touch panel. If the display area of the display is large, the number of fingers that can be pressed increases accordingly, so the device can be operated using both hands.
  • the display method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
  • the program may be a transmission medium that can be distributed via a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display apparatus (100) is provided with an operation unit (111) configured from a touch panel, said operation unit being provided on a display unit (112) by overlapping the display unit. The display apparatus (100) has: a first receiving unit (101) that receives a first operation of depressing one part of the operation unit (111); a second receiving unit (102) that receives a second operation of depressing a plurality of other parts of the operation unit (111) within a reference time after the one part is depressed by means of the first operation; and a determining unit (103), which determines, by means of the first operation, processing contents relating to display, and which determines, by means of the second operation, the degree of a change relating to the processing contents.

Description

表示装置、表示方法および表示プログラム Display device, display method, and display program
 この発明は、タッチ操作により表示制御を行う表示装置、表示方法および表示プログラムに関する。ただし、この発明の利用は、表示装置、表示方法および表示プログラムに限らない。 The present invention relates to a display device that performs display control by a touch operation, a display method, and a display program. However, the use of the present invention is not limited to the display device, the display method, and the display program.
 表示装置として、指などの操作により表示制御を行うタッチパネルがある。例えば、指の操作により接触したタッチ領域で操作種別を取得し、指が触れたタッチ面積に基づき、操作量を決定する技術がある(例えば、下記特許文献1参照。)。 As a display device, there is a touch panel that performs display control by operating a finger or the like. For example, there is a technique in which an operation type is acquired from a touch area touched by a finger operation, and an operation amount is determined based on a touch area touched by the finger (see, for example, Patent Document 1 below).
特開2004-96267号公報JP 2004-96267 A
 しかしながら、上記従来の技術では、複数の指のタッチ操作の仕方で操作量を可変させることができなかった。例えば、従来の技術では、タッチする指の数で操作量を変更できても、タッチした指の大きさ等により各人別に操作量が異なることとなる。また、操作量の変更のためタッチ面積を増減させるべく各指のタッチの強さを可変させる等、操作に手間がかかる、という課題が一例として挙げられる。 However, with the conventional technology described above, the amount of operation cannot be varied by the manner of touch operation with a plurality of fingers. For example, in the conventional technique, even if the operation amount can be changed by the number of fingers to be touched, the operation amount is different for each person depending on the size of the touched finger or the like. Further, as an example, there is a problem that the operation is troublesome, such as changing the strength of the touch of each finger to increase or decrease the touch area for changing the operation amount.
 上述した課題を解決し、目的を達成するため、請求項1の発明にかかる表示装置は、タッチパネルを有する表示装置において、前記タッチパネルにタッチする第1のタッチ操作を受け付ける第1受付手段と、前記第1のタッチ操作後、基準時間内に前記タッチパネルの他の部分を複数タッチする第2のタッチ操作を受け付ける第2受付手段と、前記第1のタッチ操作により表示に関する処理内容を決定し、前記第2のタッチ操作により前記処理内容に関する変化の度合いを決定する決定手段と、を有することを特徴とする。 In order to solve the above-described problems and achieve the object, a display device according to the invention of claim 1 is a display device having a touch panel, comprising: first receiving means for receiving a first touch operation that touches the touch panel; second receiving means for receiving, after the first touch operation, a second touch operation that touches a plurality of other parts of the touch panel within a reference time; and determining means for determining, by the first touch operation, processing content related to display, and for determining, by the second touch operation, a degree of change related to the processing content.
 また、請求項6の発明にかかる表示方法は、タッチパネルを有する表示装置が実施する表示方法において、前記タッチパネルの一部をタッチする第1のタッチ操作を受け付ける第1受付工程と、前記第1のタッチ操作による一部のタッチ後、基準時間内に前記タッチパネルの他の部分を複数タッチする第2のタッチ操作を受け付ける第2受付工程と、前記第1のタッチ操作により表示に関する処理内容を決定し、前記第2のタッチ操作により前記処理内容に関する変化の度合いを決定する決定工程と、を含むことを特徴とする。 According to a sixth aspect of the present invention, there is provided a display method implemented by a display device having a touch panel. The first reception step of receiving a first touch operation for touching a part of the touch panel; After a part of the touch operation, a second reception step of accepting a second touch operation for touching a plurality of other parts of the touch panel within a reference time, and a processing content related to display by the first touch operation are determined. And a determination step of determining a degree of change related to the processing content by the second touch operation.
 また、請求項7の発明にかかる表示プログラムは、請求項6に記載の表示方法をコンピュータに実行させることを特徴とする。 A display program according to the invention of claim 7 causes a computer to execute the display method of claim 6.
図1は、実施の形態にかかる表示装置の機能的構成の一例を示すブロック図である。FIG. 1 is a block diagram illustrating an example of a functional configuration of the display device according to the embodiment. 図2は、実施の形態にかかる表示装置の処理例を示すフローチャートである。FIG. 2 is a flowchart illustrating a processing example of the display device according to the embodiment. 図3は、実施例にかかるナビゲーション装置のハードウェア構成の一例を示すブロック図である。FIG. 3 is a block diagram illustrating an example of a hardware configuration of the navigation device according to the embodiment. 図4は、実施例1にかかる指の操作に基づく倍率変更の表示制御を説明する図である。FIG. 4 is a diagram for explaining display control for changing the magnification based on a finger operation according to the first embodiment. 図5は、実施例2にかかる指の操作に基づく倍率変更の表示制御を説明する図である。FIG. 5 is a diagram for explaining display control of magnification change based on a finger operation according to the second embodiment. 図6は、実施例2にかかる指の操作に基づく倍率変更の表示制御例を示すフローチャートである。FIG. 6 is a flowchart illustrating an example of display control for changing the magnification based on a finger operation according to the second embodiment. 図7は、実施例3にかかる指の操作に基づくスクロール量変更の表示制御を説明する図である。FIG. 7 is a diagram for explaining display control for changing the scroll amount based on a finger operation according to the third embodiment. 図8は、実施例3にかかる指の操作に基づくスクロール量変更の表示制御例を示すフローチャートである。FIG. 8 is a flowchart illustrating a display control example of scroll amount change based on a finger operation according to the third embodiment. 図9は、実施例4にかかる指の操作に基づくスクロール変更の表示制御を説明する図である。FIG. 9 is a diagram for explaining display control of scroll change based on a finger operation according to the fourth embodiment.
(実施の形態)
 以下に添付図面を参照して、この発明にかかる表示装置、表示方法および表示プログラムの好適な実施の形態を詳細に説明する。
(Embodiment)
Exemplary embodiments of a display device, a display method, and a display program according to the present invention are explained in detail below with reference to the accompanying drawings.
 図1は、実施の形態にかかる表示装置の機能的構成の一例を示すブロック図である。表示装置100は、タッチパネル等の操作部111が表示部112上に重ねて設けられてなる。表示部112の表示画面の座標位置と、操作部111の操作面の座標位置はタッチパネル制御部(不図示)によって同一座標とされている。この表示装置100は、第1受付部101と、第2受付部102と、決定部103と、を有する。 FIG. 1 is a block diagram illustrating an example of a functional configuration of a display device according to an embodiment. The display device 100 is configured such that an operation unit 111 such as a touch panel is provided on the display unit 112. The coordinate position of the display screen of the display unit 112 and the coordinate position of the operation surface of the operation unit 111 are set to the same coordinate by a touch panel control unit (not shown). The display device 100 includes a first reception unit 101, a second reception unit 102, and a determination unit 103.
 第1受付部101は、ユーザにより操作部111の一部を押下(タッチ)する第1の操作を受け付ける。第2受付部102は、第1受付部101によって受け付けたユーザによる操作部111の一部の押下後、基準時間内に操作部111の他の部分を複数押下する第2の操作を受け付ける。 The first reception unit 101 receives a first operation in which a part of the operation unit 111 is pressed (touched) by the user. The second receiving unit 102 receives a second operation of pressing a plurality of other parts of the operation unit 111 within a reference time after the user has pressed down a part of the operation unit 111 received by the first receiving unit 101.
 ここで、第2受付部102は、第1の操作後、基準時間内での第2の操作を受け付けるため、ユーザが操作部111から一旦指を離した場合であっても、基準時間内に操作部111の他の部分の押下があれば第2の操作として受け付ける。これにより、ユーザは、例えば、第1の操作で表示に関する表示種別の設定の操作と、第2の操作で表示の操作量の設定の操作とを時間的に切り分けて操作することができ、異なる操作を明確かつ容易に切り分けて行うことができるようになる。 Here, since the second reception unit 102 receives the second operation within the reference time after the first operation, even if the user once lifts the finger from the operation unit 111, the second reception unit 102 is within the reference time. If another portion of the operation unit 111 is pressed, it is accepted as a second operation. As a result, the user can, for example, perform the operation of setting the display type related to the display in the first operation and the operation of setting the operation amount of the display in the second operation separately in time. The operation can be performed clearly and easily.
 決定部103は、第1の操作により処理内容を決定し、第2の操作により決定した処理の変化の方向と、変化の度合いを決定する。例えば、第1の操作が地図等縮尺の拡大/縮小である場合、第2の操作では、決定された拡大/あるいは縮小について、縮尺の変化の度合いを決定する。このほか、第1の操作が地図等のスクロールである場合、第2の操作では、このスクロール量を決定する。第2の操作の変化量は、時間あたりの表示内容の変化の速度を示す。決定部103により決定された処理内容と、処理内容に対する変化の度合いは、タッチパネル制御部に出力されて表示部112の表示内容が制御される。 The determination unit 103 determines the processing content by the first operation, and determines the change direction and the degree of change of the processing determined by the second operation. For example, when the first operation is an enlargement / reduction of a scale such as a map, the second operation determines the degree of change of the scale for the determined enlargement / reduction. In addition, when the first operation is scrolling a map or the like, the scroll amount is determined in the second operation. The amount of change in the second operation indicates the speed of change in display content per time. The processing content determined by the determination unit 103 and the degree of change with respect to the processing content are output to the touch panel control unit, and the display content of the display unit 112 is controlled.
 また、決定部103は、第2の操作による複数の指の押下の方向(指が並ぶ方向)により、処理の方向を決定し、押下の数により変化の度合いを決定することもできる。 Also, the determining unit 103 can determine the direction of processing based on the direction of pressing a plurality of fingers by the second operation (the direction in which the fingers are arranged), and can determine the degree of change based on the number of presses.
 また、決定部103は、第2の操作に基づく操作量を操作時、すなわち指を操作部111に押下(接触)させたときのみ変化の度合いを決定するに限らない。すなわち、指を操作部111に押下したことにより、処理内容に対する変化の度合いを決定して表示部112上での実際の表示内容を変化させるが、この後、操作部111から指を離したときにも処理の変化の度合いを再度決定してもよい。 Further, the determination unit 103 is not limited to determining the degree of change only when the operation amount based on the second operation is operated, that is, when the finger is pressed (touched) on the operation unit 111. That is, when the finger is pressed on the operation unit 111, the degree of change with respect to the processing content is determined to change the actual display content on the display unit 112. Thereafter, when the finger is released from the operation unit 111 Alternatively, the degree of change in processing may be determined again.
 図2は、実施の形態にかかる表示装置の処理例を示すフローチャートである。表示装置100は、第1受付部101により、ユーザによる操作部111の一部を押下する第1の操作を待機する(ステップS201:Noのループ)。ここで、ユーザによる操作部111の第1の操作があれば(ステップS201:Yes)、ステップS202に移行する。 FIG. 2 is a flowchart illustrating a processing example of the display device according to the embodiment. The display device 100 waits for a first operation by the user to depress a part of the operation unit 111 by the first reception unit 101 (step S201: No loop). If there is a first operation of the operation unit 111 by the user (step S201: Yes), the process proceeds to step S202.
 次に、第2受付部102が第1の操作後、基準時間内で第2受付部102がユーザによる操作部111に対する第2の操作があったかを判断する(ステップS202)。第2受付部102は、第1の操作後、基準時間内(例えば1秒以内)で、第2の操作がなかった場合(ステップS202:No)、ステップS201に戻る。一方、第1の操作後、基準時間内(例えば1秒以内)で、第1の押下位置と異なる位置で複数の指の押下があった場合、第2の操作があったと判断する(ステップS202:Yes)。 Next, after the second operation is performed by the second reception unit 102, the second reception unit 102 determines whether the user has performed a second operation on the operation unit 111 within a reference time (step S202). When the second operation is not performed within the reference time (for example, within 1 second) after the first operation (No in step S202), the second reception unit 102 returns to step S201. On the other hand, if a plurality of fingers are pressed at a position different from the first pressed position within a reference time (for example, within 1 second) after the first operation, it is determined that the second operation has been performed (step S202). : Yes).
 次に、決定部103は、第1の操作により処理内容を決定する(ステップS203)。例えば、第1の操作で表示されている地図に対する処理として縮尺を拡大する処理が選択されればこの縮尺拡大を決定する。 Next, the determination unit 103 determines the processing content by the first operation (step S203). For example, if the process for enlarging the scale is selected as the process for the map displayed in the first operation, the scale enlargement is determined.
 次に、決定部103は、第2の操作により処理内容(地図の縮尺拡大)に対する変化の度合い(変化量)を決定する。例えば、第2の操作では、複数の指の数により地図の縮尺の拡大率を決定する(ステップS204)。具体的には、操作部111を押下した指の数が多いほど、拡大率を増加させる。例えば、2本の指で押下されれば拡大率は2倍、3本の指で押下されれば拡大率は3倍、と決定する。 Next, the determination unit 103 determines the degree of change (change amount) with respect to the processing content (scale enlargement of the map) by the second operation. For example, in the second operation, the enlargement ratio of the map scale is determined by the number of fingers (step S204). Specifically, the enlargement ratio is increased as the number of fingers pressing the operation unit 111 is increased. For example, it is determined that the magnification rate is doubled when pressed with two fingers, and the magnification rate is tripled when pressed with three fingers.
 以上の実施の形態によれば、操作部を押下する指の本数に基づいて表示処理する際の変化の度合いを変更することができる。この際、ユーザは、先に第1の操作で処理内容を決定した後、基準時間内であれば一旦指を操作部から離してもよい。この基準時間内で再度指を操作部に触れることにより、第2の操作として検出され、この第2の操作では、第1の操作で決定された処理内容に対する変化の度合いを指の本数に基づき決定することができる。すなわち、第1の操作後操作部から指を離してもよく、基準時間内であれば、決定部は、第1の操作で決定した処理内容を保持し、第2の操作を受け付けるため、ユーザは、第1の操作による処理内容の決定のための操作と、第2の操作による変化の度合いの決定のための操作と、を分離して操作できるようになり、明確で容易な操作性を実現できるようになる。 According to the embodiment described above, the degree of change in display processing can be changed based on the number of fingers pressing the operation unit. At this time, after first determining the processing content by the first operation, the user may release the finger from the operation unit as long as it is within the reference time. Touching the operation unit again within this reference time is detected as the second operation, and in this second operation, the degree of change with respect to the processing content determined by the first operation can be determined based on the number of fingers. That is, the finger may be released from the operation unit after the first operation; as long as it is within the reference time, the determination unit holds the processing content determined by the first operation and accepts the second operation. The user can therefore operate separately the operation for determining the processing content by the first operation and the operation for determining the degree of change by the second operation, realizing clear and easy operability.
 また、決定部は、第2の操作による複数の押下の方向(指が並ぶ方向)により、処理の方向を決定することもできる。例えば、第1の操作で地図の縮尺が決定された場合、画面上でほぼ右方向に沿って順番に複数の指が押下された場合には、押下された指の数だけ縮尺を拡大する決定を行い、逆に画面上でほぼ左方向に沿って順番に複数の指が押下された場合には、押下された指の数だけ縮尺を縮小する決定を行う。この際、複数の押下の回数により変化の度合い(2つの指であれば縮尺時の縮尺量やスクロール時のスクロール量)を決定することができる。 Also, the determination unit can determine the processing direction based on the plurality of pressing directions (directions in which fingers are arranged) by the second operation. For example, when the scale of the map is determined by the first operation, and when a plurality of fingers are pressed in order substantially along the right direction on the screen, the determination is made to enlarge the scale by the number of pressed fingers. On the contrary, when a plurality of fingers are pressed in order almost along the left direction on the screen, it is determined to reduce the scale by the number of pressed fingers. At this time, the degree of change (the amount of scale at the time of scaling and the amount of scrolling at the time of scrolling) can be determined by the number of times of pressing a plurality of times.
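A minimal sketch of this direction-based decision might read the x coordinates of the successive presses; the function name, the coordinate convention, and the return labels are illustrative assumptions:

```python
def press_direction(xs):
    """Classify the direction in which second-operation presses line up,
    from their x coordinates: rightward -> enlarge, leftward -> reduce."""
    if len(xs) < 2:
        return None
    if all(b > a for a, b in zip(xs, xs[1:])):
        return "enlarge"
    if all(b < a for a, b in zip(xs, xs[1:])):
        return "reduce"
    return None  # mixed arrangement: direction undetermined
```

The number of presses passed in (`len(xs)`) would then determine the degree of change, as described above.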
 このほか、決定部は、第2の操作で一旦複数の指を操作部に押下させて変化量を決定し、表示内容をこの変化量で変化させた後、指を操作部から離したときの状態も含めて決定した処理の変化の度合いを決定することができる。例えば、第2の操作で操作部に押下させて所定の拡大率(3倍)として表示制御した後、押下している指の数を1本減らせば、減らした分だけ変化の度合いを減らし、表示の縮尺を縮小(2倍に変更)させることができる。 In addition, the determination unit determines a change amount by pressing a plurality of fingers once on the operation unit in the second operation, changes the display content by the change amount, and then releases the finger from the operation unit. It is possible to determine the degree of change in processing determined including the state. For example, if the number of pressed fingers is reduced by 1 after the operation unit is pressed down by the second operation and displayed as a predetermined enlargement ratio (3 times), the degree of change is reduced by the reduced amount, The scale of the display can be reduced (changed to 2 times).
 このように、実施の形態によれば、操作部上で指をスライド等させずに、複数の指を押下するだけで、処理内容と、処理内容に対する変化の度合いを簡単に決定することができ、簡単な操作で表示内容を制御できるようになる。 As described above, according to the embodiment, it is possible to easily determine the processing content and the degree of change with respect to the processing content by simply pressing a plurality of fingers without sliding the finger on the operation unit. The display contents can be controlled with a simple operation.
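As a rough illustration of this two-step flow, the following sketch separates the content-deciding first touch from the degree-deciding second touches; the class name, the 1-second window, and the degree bookkeeping are assumptions for illustration only:

```python
class TwoStepTouchController:
    """Minimal sketch of the flow of FIG. 2: the first touch fixes the
    processing content (step S203); presses arriving within the
    reference time form the second operation, whose press count fixes
    the degree of change (steps S202/S204)."""

    REFERENCE_TIME = 1.0  # assumed reference time in seconds

    def __init__(self):
        self.content = None      # e.g. "zoom-in", set by the first operation
        self.first_time = None
        self.degree = 1          # change factor grows with each valid press

    def first_touch(self, content, t):
        self.content = content
        self.first_time = t
        self.degree = 1

    def second_touch(self, t):
        """Return True if this press counts toward the second operation."""
        if self.first_time is not None and t - self.first_time <= self.REFERENCE_TIME:
            self.degree += 1
            return True
        return False
```

Because `second_touch` only checks the time stamp, the finger may be lifted and pressed again within the window, matching the behavior described above.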
 次に、本発明の各実施例について説明する。各実施例では、上述の表示装置100をユーザの車両に搭載されたナビゲーション装置300を用いて構成した例を説明する。 Next, each embodiment of the present invention will be described. In each embodiment, an example in which the above-described display device 100 is configured using a navigation device 300 mounted on a user's vehicle will be described.
(ナビゲーション装置300のハードウェア構成)
 図3は、実施例にかかるナビゲーション装置のハードウェア構成の一例を示すブロック図である。図3において、ナビゲーション装置300は、CPU301、ROM302、RAM303、磁気ディスクドライブ304、磁気ディスク305、光ディスクドライブ306、光ディスク307、音声I/F(インターフェース)308、マイク309、スピーカ310、入力デバイス311、映像I/F312、ディスプレイ313、通信I/F314、GPSユニット315、各種センサ316、カメラ317、を備えている。各構成部301~317は、バス320によってそれぞれ接続されている。
(Hardware configuration of navigation device 300)
FIG. 3 is a block diagram illustrating an example of a hardware configuration of the navigation device according to the embodiment. In FIG. 3, a navigation device 300 includes a CPU 301, ROM 302, RAM 303, magnetic disk drive 304, magnetic disk 305, optical disk drive 306, optical disk 307, audio I / F (interface) 308, microphone 309, speaker 310, input device 311, A video I / F 312, a display 313, a communication I / F 314, a GPS unit 315, various sensors 316, and a camera 317 are provided. Each component 301 to 317 is connected by a bus 320.
 CPU301は、ナビゲーション装置300の全体の制御を司る。ROM302は、ブートプログラム、経路探索プログラム等を記録している。RAM303は、CPU301のワークエリアとして使用される。すなわち、CPU301は、RAM303をワークエリアとして使用しながら、ROM302に記録された各種プログラムを実行することによって、ナビゲーション装置300の全体の制御を司る。 CPU 301 governs overall control of navigation device 300. The ROM 302 records a boot program, a route search program, and the like. The RAM 303 is used as a work area for the CPU 301. That is, the CPU 301 controls the entire navigation device 300 by executing various programs recorded in the ROM 302 while using the RAM 303 as a work area.
 磁気ディスクドライブ304は、CPU301の制御にしたがって磁気ディスク305に対するデータの読み取り/書き込みを制御する。磁気ディスク305は、磁気ディスクドライブ304の制御で書き込まれたデータを記録する。磁気ディスク305としては、例えば、HD(ハードディスク)やFD(フレキシブルディスク)を用いることができる。 The magnetic disk drive 304 controls the reading / writing of the data with respect to the magnetic disk 305 according to control of CPU301. The magnetic disk 305 records data written under the control of the magnetic disk drive 304. As the magnetic disk 305, for example, an HD (hard disk) or an FD (flexible disk) can be used.
 また、光ディスクドライブ306は、CPU301の制御にしたがって光ディスク307に対するデータの読み取り/書き込みを制御する。光ディスク307は、光ディスクドライブ306の制御にしたがってデータが読み出される着脱自在な記録媒体である。光ディスク307は、書き込み可能な記録媒体を利用することもできる。着脱可能な記録媒体として、光ディスク307のほか、MO、メモリカードなどを用いることができる。 The optical disk drive 306 controls reading / writing of data with respect to the optical disk 307 according to the control of the CPU 301. The optical disk 307 is a detachable recording medium from which data is read according to the control of the optical disk drive 306. As the optical disc 307, a writable recording medium can be used. In addition to the optical disk 307, an MO, a memory card, or the like can be used as a removable recording medium.
 磁気ディスク305および光ディスク307に記録される情報の一例としては、地図データ、車両情報、画像、走行履歴などが挙げられる。地図データは、ナビゲーションシステムにおいて経路探索するときに用いられ、建物、河川、地表面、エネルギー補給施設などの地物(フィーチャ)をあらわす背景データ、道路の形状をリンクやノードなどであらわす道路形状データなどを含むベクタデータである。 Examples of information recorded on the magnetic disk 305 and the optical disk 307 include map data, vehicle information, images, and travel history. Map data is used when searching for routes in the navigation system. Background data that represents features (features) such as buildings, rivers, ground surfaces, and energy supply facilities, and road shape data that represents road shapes with links and nodes. Vector data including
 音声I/F308は、音声入力用のマイク309および音声出力用のスピーカ310に接続される。マイク309に受音された音声は、音声I/F308内でA/D変換される。マイク309は、例えば、車両のダッシュボード部などに設置され、その数は単数でも複数でもよい。スピーカ310からは、所定の音声信号を音声I/F308内でD/A変換した音声が出力される。 The voice I / F 308 is connected to a microphone 309 for voice input and a speaker 310 for voice output. The sound received by the microphone 309 is A / D converted in the sound I / F 308. For example, the microphone 309 is installed in a dashboard portion of a vehicle, and the number thereof may be one or more. From the speaker 310, a sound obtained by D / A converting a predetermined sound signal in the sound I / F 308 is output.
 入力デバイス311は、文字、数値、各種指示などの入力のための複数のキーを備えたリモコン、キーボード、タッチパネルなどが挙げられる。入力デバイス311は、ディスプレイ313の表示画面上に重ねて設けられる透明なタッチパネルを用いる。 The input device 311 includes a remote controller, a keyboard, a touch panel, and the like provided with a plurality of keys for inputting characters, numerical values, various instructions, and the like. As the input device 311, a transparent touch panel provided on the display screen of the display 313 is used.
 映像I/F312は、ディスプレイ313に接続される。映像I/F312は、具体的には、例えば、ディスプレイ313全体を制御するグラフィックコントローラと、即時表示可能な画像情報を一時的に記録するVRAM(Video RAM)などのバッファメモリと、グラフィックコントローラから出力される画像データに基づいてディスプレイ313を制御する制御ICなどによって構成される。 The video I / F 312 is connected to the display 313. Specifically, the video I / F 312 is output from, for example, a graphic controller that controls the entire display 313, a buffer memory such as a VRAM (Video RAM) that temporarily records image information that can be displayed immediately, and a graphic controller. And a control IC for controlling the display 313 based on the image data to be processed.
 ディスプレイ313には、アイコン、カーソル、メニュー、ウインドウ、あるいは文字や画像などの各種データが表示される。ディスプレイ313としては、例えば、TFT液晶ディスプレイ、有機ELディスプレイなどを用いることができる。 The display 313 displays icons, cursors, menus, windows, or various data such as characters and images. As the display 313, for example, a TFT liquid crystal display, an organic EL display, or the like can be used.
 カメラ317は、車両外部の例えば、道路を含む映像を撮影する。映像は静止画あるいは動画のどちらでもよく、例えば、カメラ317によって車両外部を撮影し、撮影した画像をCPU301において画像解析したり、映像I/F312を介して磁気ディスク305や光ディスク307などの記録媒体に出力したりする。 The camera 317 shoots an image including, for example, a road outside the vehicle. The image may be either a still image or a moving image. For example, the outside of the vehicle is photographed by the camera 317, and the photographed image is analyzed by the CPU 301, or a recording medium such as the magnetic disk 305 or the optical disk 307 via the video I / F 312. Or output to
 通信I/F314は、無線を介してネットワークに接続され、ナビゲーション装置300およびCPU301のインターフェースとして機能する。ネットワークとして機能する通信網には、CANやLIN(Local Interconnect Network)などの車内通信網や、公衆回線網や携帯電話網、DSRC(Dedicated Short Range Communication)、LAN、WANなどがある。通信I/F314は、例えば、公衆回線用接続モジュールやETC(ノンストップ自動料金支払いシステム)ユニット、FMチューナー、VICS(Vehicle Information and Communication System:登録商標)/ビーコンレシーバなどである。 The communication I / F 314 is connected to the network via wireless and functions as an interface between the navigation device 300 and the CPU 301. Communication networks that function as networks include in-vehicle communication networks such as CAN and LIN (Local Interconnect Network), public line networks and mobile phone networks, DSRC (Dedicated Short Range Communication), LAN, and WAN. The communication I / F 314 is, for example, a public line connection module, an ETC (non-stop automatic fee payment system) unit, an FM tuner, a VICS (Vehicle Information and Communication System) / beacon receiver, or the like.
 GPSユニット315は、GPS衛星からの電波を受信し、車両の現在位置を示す情報を出力する。GPSユニット315の出力情報は、後述する各種センサ316の出力値とともに、CPU301による車両の現在位置の算出に際して利用される。現在位置を示す情報は、例えば、緯度・経度、高度などの、地図データ上の1点を特定する情報である。 The GPS unit 315 receives radio waves from GPS satellites and outputs information indicating the current position of the vehicle. The output information of the GPS unit 315 is used when the CPU 301 calculates the current position of the vehicle together with output values of various sensors 316 described later. The information indicating the current position is information for specifying one point on the map data such as latitude / longitude and altitude.
 The various sensors 316, such as a vehicle speed sensor, an acceleration sensor, an angular velocity sensor, and a tilt sensor, output information for determining the position and behavior of the vehicle. The output values of the various sensors 316 are used by the CPU 301 to calculate the current position of the vehicle and the amounts of change in speed and heading.
 In the display device 100 shown in FIG. 1, the CPU 301 executes a predetermined program using programs and data recorded in the ROM 302, the RAM 303, the magnetic disk 305, the optical disc 307, and the like of the navigation device 300 shown in FIG. 3, thereby realizing the display control functions performed by the first accepting unit 101 through the determining unit 103 of FIG. 1 described above (determining the processing content for the display, determining the degree of change for the determined processing content, and controlling the display on the display 313 according to those determinations).
(Embodiment 1)
 FIG. 4 is a diagram explaining display control for changing the map scale based on finger operations according to Embodiment 1. An input device (touch panel) 311 is provided so as to overlie the display 313 of the navigation device 300.
 Map data is displayed on the display screen of the display 313, and on the left side of the display screen, operation buttons (group) 401 for determining the processing content for the display (a scale change in the illustrated example) are displayed. Assume that, among the operation buttons 401, a scale-enlargement button 401a for enlarging the map scale and a scale-reduction button 401b for reducing it are displayed.
 When one of the operation buttons 401 or the like displayed on the display 313 is pressed, the input device (touch panel) 311 detects the pressed coordinate position; for example, a press on the area of the enlargement button 401a is detected as an enlargement operation.
 The display control performed by the navigation device 300 includes the determination of the processing content described above and the determination of the degree of change for the determined processing content; these determination processes are mainly described below.
 Taking as an example a user reducing the map scale: as a first operation, the user presses the reduction button 401b with, for example, the index finger B (the touch1 location in the figure). The touch panel 311 thereby detects the press of the reduction button 401b, and the CPU 301 determines scale reduction as the display processing content. At this point, the first operation reduces the map display to 1/2.
 Next, to change the reduction further, the user performs a second operation. Here, using a finger other than the one pressed in the first operation (index finger B), the user presses a position on the touch panel 311 other than where the operation buttons (group) 401 are placed (any point on the map data).
 When the user presses the touch panel 311 with a finger other than the still-pressed index finger B used in the first operation (here, the middle finger C; the touch2 location in the figure), a total of two fingers are pressed and the reduction can be changed further (for example, to 1/3). If the ring finger D then also presses the touch panel 311 (the touch3 location), a total of three fingers are pressed and the reduction can be changed further (for example, to 1/4). If the little finger E then also presses the touch panel 311 (the touch4 location), a total of four fingers are pressed and the reduction can be changed further (for example, to 1/5).
 If the second operation occurs within a reference time (for example, within one second) after the first operation, the CPU 301 changes the reduction to a ratio corresponding to the degree of change given by the number of pressed fingers. Therefore, even if the finger pressed in the first operation (index finger B) is released, a press within the reference time on a location other than the operation buttons 401 (any point on the map data; the middle finger C in the above example) can be detected as the second operation.
 In this way, merely increasing the number of fingers pressed in the second operation increases the degree of the processing content determined by the first operation (scale reduction in this example).
 During such scale reduction, the degree of reduction can be increased as the number of fingers pressing the touch panel 311 increases. Likewise, during enlargement, the degree of enlargement can be increased as the number of fingers pressing the touch panel 311 increases.
 The above example described up to the state in which the touch panel 311 is pressed in the second operation; if the number of pressed fingers is reduced, the CPU 301 can perform display control in the reverse direction. For example, when only one finger (for example, the little finger E) is lifted after pressing with the four fingers B, C, D, and E shown in FIG. 4, the reduction control up to that point can be reversed toward enlargement (to the 1/4 corresponding to three fingers). Thus, merely changing the number of fingers of one hand pressing the touch panel 311 lets the user arbitrarily increase or decrease the degree of change of the determined processing content.
 Furthermore, as long as at least one finger is kept in the pressed state, the degree of change can subsequently be increased or decreased again according to the number of pressed fingers.
 In the second operation, the user only needs to press the touch panel 311 in sequence at each finger's natural position; the change to the display processing content (scale reduction in the above example) can be operated easily, without pressing predetermined button positions and without straining the fingers (for example, no need to spread the fingers).
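The Embodiment 1 behavior can be summarized as a small sketch. The function below is a hypothetical helper (not part of the patent); it only encodes the example values given in the text, where n fingers held down after the reduction button is pressed yield a scale of 1/(n + 1), and lifting a finger moves back toward the previous, larger scale:

```python
def reduction_scale(pressed_fingers: int) -> float:
    """Map the number of pressed fingers to the example map scale:
    1 finger -> 1/2, 2 fingers -> 1/3, 3 -> 1/4, 4 -> 1/5.

    Because the mapping depends only on the current finger count,
    lifting a finger automatically reverses toward the larger scale,
    as described for FIG. 4.
    """
    if pressed_fingers < 1:
        raise ValueError("at least the first press is required")
    return 1.0 / (pressed_fingers + 1)
```

For example, after four fingers are down (`reduction_scale(4)` gives 1/5), lifting the little finger E returns the scale to `reduction_scale(3)`, i.e. 1/4, matching the text.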
(Embodiment 2)
 FIG. 5 is a diagram explaining display control for changing the scale based on finger operations according to Embodiment 2. In Embodiment 2, the direction in which the fingers are pressed in sequence in the second operation determines both the direction of change of the processing content and the degree of change of the processing content.
 In Embodiment 2, the two buttons of Embodiment 1 (the enlargement button 401a and the reduction button 401b) are not provided on the display 313; only a single scale-change button 401c need be displayed.
 As the first operation, the user presses the scale-change button 401c with, for example, the index finger B (the touch1 location in the figure). In this state, the CPU 301 determines scale change as the processing content of the first operation, but does not yet determine the direction of change (enlargement or reduction of the scale).
 Next, the user presses a plurality of fingers in sequence along different directions depending on whether enlargement or reduction is desired. From this, the CPU 301 determines the direction of change of the processing content.
 For example, to enlarge the scale, the user presses locations other than the operation button 401 (any point on the map data) on the touch panel 311 with a plurality of fingers other than the one pressed in the first operation (index finger B).
 When the user presses a location on the touch panel 311 close to the scale-change button 401c with a finger other than the still-pressed index finger B used in the first operation (here, the middle finger C; the touch2 location in the figure), the CPU 301 judges that the subsequent pressing direction is rightward and, in this case, determines enlargement as the direction of change of the processing content. Coordinate positions within a predetermined distance of the coordinate position of the scale-change button 401c are set in advance as "close" locations. At the same time, the CPU 301 determines the degree of change of the processing content, for example 2× enlargement, as the second processing content.
 If, in this state, the ring finger D also presses the touch panel 311 (the touch3 location in the figure), a total of three fingers are pressed (the pressing direction here is also rightward), and the enlargement can be changed further (for example, to 3×). Similarly, if the little finger E then also presses the touch panel 311 (the touch4 location), the enlargement can be changed to 4×.
 Conversely, to reduce the scale, the user presses a location far from the position of the still-pressed index finger B used in the first operation (pressing the little finger E at the touch4 location in the figure). The CPU 301 then judges that the subsequent pressing direction is leftward and, in this case, determines, for example, scale reduction as the first processing content. At the same time, the CPU 301 determines the degree of change of the processing content, for example a reduction to 1/2, as the second processing content.
 If, in this state, the ring finger D presses a location to the left of the little finger E (the touch3 location in the figure), the reduction can be changed further (for example, to 1/3). Similarly, if the middle finger C then presses the touch panel 311 at a location to the left of the ring finger D (the touch2 location), the reduction can be changed to 1/4.
 FIG. 6 is a flowchart showing an example of display control for scale change based on finger operations according to Embodiment 2, mainly describing the scale-change determination processing executed by the CPU 301. First, when the CPU 301 detects a press of the scale-change button 401c (step S601), it waits for detection of a second press at a location other than the scale-change button 401c (step S602). At this time, as in Embodiment 2, the user may release the scale-change button 401c after pressing it; or, as in Embodiment 1, the user may perform the second press with another finger while keeping the scale-change button 401c pressed.
 In step S602, the CPU 301 waits for the next (second) press for the reference time after the press of the scale-change button 401c. If there is no next press within the reference time (step S602: Case 1), the scale-change operation is canceled (step S603) and the processing ends.
 If a location close to the scale-change button 401c (touch2 in the example of FIG. 5) is pressed within the reference time after the button is pressed (step S602: Case 2), the CPU 301 determines the scale-change processing to be enlargement (step S604). If a location far from the scale-change button 401c (touch4 in the example of FIG. 5) is pressed within the reference time (step S602: Case 3), the CPU 301 determines the scale-change processing to be reduction (step S610).
 In the enlargement branch from step S604, the CPU 301 judges whether a third press occurs within a predetermined time of the second press in step S602 (step S605). If there is no third press (step S605: No), two fingers are pressed, so the enlargement factor is set to 2× (step S606) and the processing ends. If there is a third press (for example, touch3 in FIG. 5; step S605: Yes), the processing proceeds to step S607.
 In step S607, the CPU 301 judges whether a fourth press occurs within a predetermined time after step S605. If there is no fourth press (step S607: No), three fingers are pressed, so the enlargement factor is set to 3× (step S608) and the processing ends. If there is a fourth press (for example, touch4 in FIG. 5; step S607: Yes), four fingers are pressed, so the enlargement factor is set to 4× (step S609) and the processing ends.
 In the reduction branch from step S610, the CPU 301 judges whether a third press occurs within a predetermined time of the second press in step S602 (step S611). If there is no third press (step S611: No), two fingers are pressed, so the reduction is set to 1/2 (step S612) and the processing ends. If there is a third press (for example, touch3 in FIG. 5; step S611: Yes), the processing proceeds to step S613.
 In step S613, the CPU 301 judges whether a fourth press occurs within a predetermined time after step S611. If there is no fourth press (step S613: No), three fingers are pressed, so the reduction is set to 1/3 (step S614) and the processing ends. If there is a fourth press (for example, touch2 in FIG. 5; step S613: Yes), four fingers are pressed, so the reduction is set to 1/4 (step S615) and the processing ends.
 Thus, in Embodiment 2, the direction in which the plurality of fingers are pressed in the second operation allows the processing content of the first operation to be determined at the same time.
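The decision flow of FIG. 6 can be sketched as follows. This is a hedged illustration, not the patent's implementation: the distance threshold `NEAR_THRESHOLD` is an assumed value (the text only says that coordinates within a "predetermined distance" of button 401c count as close), and the factor tables simply follow the example values 2×/3×/4× and 1/2, 1/3, 1/4:

```python
NEAR_THRESHOLD = 100.0  # assumed "predetermined distance" in pixels


def decide_scale(button_pos, second_press_pos, total_fingers):
    """Return a scale factor: > 1 enlarges, < 1 reduces, None if cancelled.

    total_fingers counts every finger pressed within the timing windows
    of FIG. 6 (2, 3, or 4).
    """
    if second_press_pos is None:        # Case 1: no press in reference time
        return None                     # cancel the scale change (step S603)
    dx = second_press_pos[0] - button_pos[0]
    dy = second_press_pos[1] - button_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= NEAR_THRESHOLD:      # Case 2: rightward -> enlargement
        return float(total_fingers)     # 2 fingers -> 2x, 3 -> 3x, 4 -> 4x
    return 1.0 / total_fingers          # Case 3: leftward -> reduction, 1/n
```

The key design point the sketch preserves is that one button press plus the position of the very next press decides the direction, so a single button 401c replaces the two buttons of Embodiment 1.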
 In both Embodiments 1 and 2, there is a grace period equal to the reference time between the first operation and the second operation. The finger may be lifted once after the first operation, since it suffices to press again within the reference time; by lifting the finger, the determination of the processing content by the first operation and the determination of the degree of change by the second operation can be separated both perceptually and temporally. For example, the finger used in the first operation may be used again in the second operation: after pressing the reduction button 401b (touch1) with the index finger B in the first operation, the user may lift the index finger B once and press the touch2 location again with the index finger B as the second operation.
(Embodiment 3)
 FIG. 7 is a diagram explaining display control for changing the scroll amount based on finger operations according to Embodiment 3. The embodiments above described scale change; Embodiment 3 describes changing the scroll amount. For scrolling as well, display control (determining the processing content and determining the amount of change of the determined content) is basically performed by pressing a plurality of fingers, as in the embodiments above.
 As illustrated, no button for determining the processing content is provided when scrolling the map data. In this case, the CPU 301 displays a cross-shaped indicator X at the center position of the display 313, and when a position away from the indicator X is pressed, performs control to scroll the display screen (map data) so that the pressed position becomes the center position of the display 313.
 That is, control is performed to scroll the display screen (map data) so that the position first touched by a finger becomes the center position of the display 313. In the example of FIG. 7, when the index finger B first presses the touch1 location, the CPU 301 performs control so that the touch1 position becomes the center position of the display 313 (corresponding to the first operation). At this time, the scroll speed is the normal speed (1×).
 When a plurality of fingers are pressed after the first operation, the CPU 301 determines the second operation, i.e., the degree of change (scroll speed), from the number of fingers thereafter. As shown in FIG. 7, when the middle finger C presses the touch2 location while the index finger B keeps pressing the touch1 location, two fingers are pressed, and the CPU 301, for example, changes (increases) the scroll speed to twice the normal speed as the second operation. Thereafter, pressing with three fingers (adding the ring finger D) changes the scroll speed to three times the normal speed, and pressing with four fingers (adding the little finger E) changes it to four times the normal speed.
 FIG. 8 is a flowchart showing an example of display control for changing the scroll amount based on finger operations according to Embodiment 3, mainly describing the scroll-speed determination processing executed by the CPU 301. First, when the CPU 301 detects a finger press on the map data (step S801), it performs control to scroll the display screen (map data) so that the pressed position becomes the center position of the display 313.
 The CPU 301 then judges whether a second finger press occurs within a predetermined time of the first press (step S802). If there is no second press (step S802: No), scrolling starts at the normal speed (1×) (step S803) and the processing ends.
 If there is a second press in step S802 (touch2 in the example of FIG. 7; step S802: Yes), the processing proceeds to step S804, where the CPU 301 judges whether a third finger press occurs within a predetermined time of the second press. If there is no third press (step S804: No), two fingers are pressed, so scrolling starts at twice the normal speed (step S805) and the processing ends.
 If there is a third press in step S804 (touch3 in the example of FIG. 7; step S804: Yes), the processing proceeds to step S806, where the CPU 301 judges whether a fourth finger press occurs within a predetermined time of the third press. If there is no fourth press (step S806: No), three fingers are pressed, so scrolling starts at three times the normal speed (step S807) and the processing ends. If there is a fourth press (touch4 in the example of FIG. 7; step S806: Yes), four fingers are pressed, so scrolling starts at four times the normal speed (step S808) and the processing ends.
 In Embodiment 3 as well, the scroll speed can be decreased each time the number of pressed fingers is reduced, and the scroll speed can be increased or decreased in accordance with increases and decreases in the number of pressed fingers.
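The FIG. 8 rule collapses to a very small sketch: the scroll-speed multiplier is simply the current finger count. The cap at four fingers below is a hypothetical simplification reflecting only the range the flowchart covers:

```python
def scroll_speed(pressed_fingers: int) -> int:
    """Scroll speed multiplier per FIG. 8: 1 finger -> 1x, ..., 4 fingers -> 4x.

    Since the speed is a function of the current count alone, lifting a
    finger reduces the speed by the same rule, as the text notes.
    """
    if not 1 <= pressed_fingers <= 4:
        raise ValueError("FIG. 8 covers one to four fingers")
    return pressed_fingers
```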
(Embodiment 4)
 Embodiment 4 is a modification of Embodiment 3. Embodiment 4 describes an example in which the scroll speed is changed according to the time interval between the presses of the plurality of fingers, or according to the spacing between the pressed positions.
 FIG. 9 is a diagram explaining display control for scroll change based on finger operations according to Embodiment 4. First, the scroll speed may be changed based on the spacing between adjacent pressed positions. In Embodiment 3, one press (touch1) produced scrolling at the normal speed (1×), and two fingers (touch1 and touch2) produced twice the scroll speed.
 In Embodiment 4, based on the positions pressed by two adjacent fingers (the spacing L1 between touch1 and touch2), the scroll speed is set, for example, to 1.5-2.5 times depending on the spacing L1, using the 2× speed for two fingers (touch1 and touch2) as the reference. This allows the scroll speed to be varied continuously according to the spacing L1 between the two fingers. The scroll speed can also be changed by changing the spacing L1 while the two fingers remain pressed.
 Thereafter, when a third finger is pressed, based on the second and third pressed positions (the spacing L2 between touch2 and touch3), the scroll speed is set, for example, to 2.5-3.5 times depending on the spacing L2, using the 3× speed for three fingers as the reference.
 When changing the scroll speed based on the spacings L1 and L2 each time an adjacent finger is pressed in this way, the speed need not be changed based only on the adjacent spacings L1 and L2; the scroll speed may instead be changed based on the overall spacing of all the pressed fingers, L1 + L2.
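The Embodiment 4 refinement can be sketched as treating the integer speed for n fingers as the midpoint of a ±0.5 band, with the spacing between the two most recently pressed fingers selecting a point inside that band. The linear mapping and the minimum/maximum spacings below are assumptions; the text only gives the resulting ranges (e.g., 1.5×-2.5× for two fingers):

```python
def interpolated_speed(n_fingers: int, spacing: float,
                       min_spacing: float = 20.0,
                       max_spacing: float = 80.0) -> float:
    """Map the finger spacing onto the (n - 0.5, n + 0.5) speed band.

    min_spacing and max_spacing are hypothetical bounds (in pixels) on
    how close or far apart two adjacent fingers can press; the spacing
    is clamped to this range before interpolating.
    """
    spacing = max(min_spacing, min(max_spacing, spacing))  # clamp
    fraction = (spacing - min_spacing) / (max_spacing - min_spacing)
    return (n_fingers - 0.5) + fraction  # n=2: 1.5..2.5, n=3: 2.5..3.5
```

Because the result varies continuously with `spacing`, moving the two pressed fingers closer together or further apart adjusts the speed without lifting either finger, as the text describes for L1.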
 Each of the embodiments described above explained operation using the fingers of one hand; however, the number of presses may be increased further by using both hands, allowing the degree of change of the determined processing content to be controlled over a larger range and more finely.
 As described in each embodiment, merely by pressing fingers on the touch panel over the display (with no sliding operations and no adjustment of press strength to vary the touch area), the user can easily operate both the processing content (enlargement/reduction, scroll amount, etc.) and the degree of change of the determined processing content (scale factor or scroll speed). The degree of change can be operated simply by changing the number of pressed fingers. Furthermore, since the degree of change can also be operated according to the direction in which the fingers are pressed in sequence, there is no need to think about how to press the fingers, and operation is intuitive and easy.
 The embodiments above described a configuration using a navigation device as the display device, but the invention can be applied similarly to other electronic devices having a touch panel, for example smartphones, tablets, and PC displays equipped with touch panels. The larger the display area, the more fingers can be pressed, so operation with both hands also becomes possible.
 The display method described in the present embodiments can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, flexible disk, CD-ROM, MO, or DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium distributable via a network such as the Internet.
 Description of reference numerals:
 100 Display apparatus
 101 First reception unit
 102 Second reception unit
 103 Determination unit
 111 Operation unit
 112 Display unit
 300 Navigation apparatus
 401 Operation buttons
 401a Magnification increase button
 401b Magnification decrease button
 401c Magnification change button

Claims (7)

  1.  A display apparatus having a touch panel, comprising:
      first reception means for receiving a first touch operation that touches the touch panel;
      second reception means for receiving, after the first touch operation, a second touch operation that touches a plurality of other portions of the touch panel within a reference time; and
      determination means for determining processing content related to display based on the first touch operation, and determining a degree of change related to the processing content based on the second touch operation.
  2.  The display apparatus according to claim 1, wherein the determination means determines a direction of change related to the processing content and a degree of change related to the processing content, based on a touch direction on the touch panel when the plurality of touches are made by the second touch operation.
  3.  The display apparatus according to claim 1, wherein the determination means determines the degree of change related to the processing content based on an interval between the touches when the plurality of touches are made by the second touch operation.
  4.  The display apparatus according to claim 2, wherein the determination means makes a determination such that, when the direction of change related to the processing content is increasing, the degree of change related to the processing content is made larger as the number of touches in the second touch operation increases, and, when the direction of change related to the processing content is decreasing, the degree of change related to the processing content is made smaller as the number of touches in the second touch operation increases.
  5.  The display apparatus according to claim 4, wherein, when the number of touches after the second touch operation decreases, the determination means changes the degree of change related to the processing content in a direction opposite to the preceding one.
  6.  A display method performed by a display apparatus having a touch panel, comprising:
      a first reception step of receiving a first touch operation that touches a portion of the touch panel;
      a second reception step of receiving, after the partial touch by the first touch operation, a second touch operation that touches a plurality of other portions of the touch panel within a reference time; and
      a determination step of determining processing content related to display based on the first touch operation, and determining a degree of change related to the processing content based on the second touch operation.
  7.  A display program causing a computer to execute the display method according to claim 6.
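The rules in claims 2 and 4 amount to a small mapping from a sequence of touch points to a signed degree of change. A minimal sketch under assumed conventions (touches represented as (x, y) tuples, a rightward run of touches taken to mean "increase"); this is an illustration only, not the patented implementation:

```python
def change_direction(touch_points):
    """Claim 2 (sketch): derive the direction of change from the direction
    in which the successive touch points run across the panel."""
    (x0, _), (x1, _) = touch_points[0], touch_points[-1]
    return +1 if x1 >= x0 else -1  # rightward = increase, leftward = decrease


def degree_of_change(touch_points):
    """Claim 4 (sketch): more touches give a larger step when the direction
    is increasing, and a smaller (more negative) step when decreasing."""
    return change_direction(touch_points) * len(touch_points)
```

Under these assumptions, three touches placed left to right would yield a degree of +3, while the same three touches placed right to left would yield -3.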
PCT/JP2014/059441 2014-03-31 2014-03-31 Display apparatus, display method, and display program WO2015151154A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2014/059441 WO2015151154A1 (en) 2014-03-31 2014-03-31 Display apparatus, display method, and display program
JP2016511181A JPWO2015151154A1 (en) 2014-03-31 2014-03-31 Display control apparatus, display control method, and display control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/059441 WO2015151154A1 (en) 2014-03-31 2014-03-31 Display apparatus, display method, and display program

Publications (1)

Publication Number Publication Date
WO2015151154A1 true WO2015151154A1 (en) 2015-10-08

Family

ID=54239525

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/059441 WO2015151154A1 (en) 2014-03-31 2014-03-31 Display apparatus, display method, and display program

Country Status (2)

Country Link
JP (1) JPWO2015151154A1 (en)
WO (1) WO2015151154A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018011879A (en) * 2016-07-22 2018-01-25 株式会社ユニバーサルエンターテインメント Game machine

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010108011A (en) * 2008-10-28 2010-05-13 Sony Corp Information processing apparatus, information processing method, and program
JP2010204729A (en) * 2009-02-27 2010-09-16 Sharp Corp Text display device, method, and program
JP2011248416A (en) * 2010-05-24 2011-12-08 Aisin Aw Co Ltd Device, method and program for displaying information
JP2012058819A (en) * 2010-09-06 2012-03-22 Mitsubishi Electric Corp Touch panel device
JP2012527696A (en) * 2009-05-21 2012-11-08 株式会社ソニー・コンピュータエンタテインメント Portable electronic device, method for operating portable electronic device, and recording medium
WO2013089070A1 (en) * 2011-12-16 2013-06-20 シャープ株式会社 Information display method, information display device, information display program, and computer-readable recording medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010086230A (en) * 2008-09-30 2010-04-15 Sony Corp Information processing apparatus, information processing method and program
JP2013206180A (en) * 2012-03-28 2013-10-07 Kyocera Corp Electronic device and display method


Also Published As

Publication number Publication date
JPWO2015151154A1 (en) 2017-04-13

Similar Documents

Publication Publication Date Title
CN106062514B (en) Interaction between a portable device and a vehicle head unit
US8108137B2 (en) Map scrolling method and navigation terminal
JP4943543B2 (en) MAP DISPLAY DEVICE, MAP DISPLAY METHOD, MAP DISPLAY PROGRAM, AND RECORDING MEDIUM
JP6429886B2 (en) Touch control system and touch control method
JP6838563B2 (en) On-board unit, display area division method, program and information control device
WO2013028364A2 (en) Hover based navigation user interface control
JP2012093802A (en) Image display device, image display method, and program
JP4783075B2 (en) Navigation device
CN107077281A (en) Sense of touch control system and sense of touch control method
WO2015151154A1 (en) Display apparatus, display method, and display program
JP2011080851A (en) Navigation system and map image display method
JP2013011573A (en) Operation device, operation method and program
WO2016147287A1 (en) Map display control device and map scroll operation sensitivity control method
JP5453069B2 (en) MAP DISPLAY DEVICE, MAP DISPLAY METHOD, MAP DISPLAY PROGRAM, AND RECORDING MEDIUM
WO2018123320A1 (en) User interface device and electronic apparatus
US20160253088A1 (en) Display control apparatus and display control method
JP2019036325A (en) Display device, method for display, and display program
JP2022111235A (en) Display device, method for display, and display program
JP2005128791A (en) Display unit
JP2009294132A (en) Navigation device
WO2007114067A1 (en) Map display device and map display method
WO2007105500A1 (en) Navigation device, navigation method, navigation program, and computer readable recording medium
JP2016201016A (en) On-vehicle device
JP2011196702A (en) Navigation device including touch panel and map image display method thereof
WO2015129170A1 (en) Operation system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14888520

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016511181

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase
122 Ep: pct application non-entry in european phase

Ref document number: 14888520

Country of ref document: EP

Kind code of ref document: A1