WO2015015733A1 - Image display device, image display method, and image display program product - Google Patents

Image display device, image display method, and image display program product

Info

Publication number
WO2015015733A1
Authority
WO
WIPO (PCT)
Prior art keywords
zoom
image
display
control unit
user
Prior art date
Application number
PCT/JP2014/003752
Other languages
English (en)
Japanese (ja)
Inventor
木村 洋介
拡基 鵜飼
Original Assignee
株式会社デンソー
Priority date
Filing date
Publication date
Application filed by 株式会社デンソー
Publication of WO2015015733A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/34Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators for rolling or scrolling
    • G09G5/346Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators for rolling or scrolling for systems having a bit-mapped display memory
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user

Definitions

  • The present disclosure relates to an image display device, an image display method, and an image display program product that display an image representing a vast information space, such as a map image.
  • The present disclosure has been made in view of the above-described circumstances, and an object thereof is to provide an image display device capable of improving operability by enhancing the performance of a zoom-in function. A further object is to provide a related image display method and image display program product.
  • the image display device is provided so as to include a display unit, an operation detection unit, and a control unit.
  • When zoom-out scrolling of the display image ends, the control unit zooms in on the display image.
  • When the operation detection unit detects that the user has touched the screen during the zoom-in, the control unit ends the zoom-in of the display image.
  • The information communication terminal 1 includes a control unit 2 (control device/means), a display 3 having a touch panel function (display unit/device/means), an operation detection unit 4 (operation detection device/means), various buttons 5, a communication unit 6, and a memory 7.
  • the control unit 2 is mainly composed of a microcomputer having a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • The control unit 2 controls the various processes of the overall operation of the information communication terminal 1 by having the CPU execute a control program (including an image display program) stored in the ROM. Alternatively, these various types of processing may be realized in hardware as one or a plurality of ICs.
  • The display 3 has a display area (also referred to as a display screen) with a predetermined screen resolution (number of pixels vertically and horizontally). When a display command signal is input from the control unit 2, the display 3 displays the display image corresponding to the input display command signal.
  • The display 3 has a touch panel function that allows the user to touch it with a finger, and its surface portion is configured as a touch screen.
  • The image display program includes procedures or instructions to be executed by a computer, and can be provided as a program product stored in a non-transitory computer-readable storage medium.
  • The operation detection unit 4 detects that the user has touched the screen with a finger using a capacitance method, and outputs to the control unit 2 an operation detection signal indicating the position where the finger touches and the time during which the finger remains touching.
  • The method for detecting that the user touches the screen with a finger is not limited to the capacitance method; other methods such as a resistive film method or an electromagnetic induction method may be used. In the present embodiment, assuming that the user may multi-touch with fingers (touch two or more points at the same time), a capacitance method capable of detecting two or more points is adopted.
  • The various buttons 5 are buttons mechanically arranged on the casing 1a of the information communication terminal 1 (see FIG. 9 described later), for example a “power” button for switching the power on and off and a “home” button for displaying a home image. When the user presses one of the various buttons 5, an operation detection signal indicating the pressed button is output to the control unit 2.
  • The various buttons 5 need not include all of the buttons exemplified here; some of their functions may be realized by the touch panel, and their types and number vary depending on the model. For example, in addition to the buttons described above, a “menu” button for displaying a menu image and a “return” button for displaying the previous display image (the display image shown until immediately before) may also be arranged.
  • The control unit 2 analyzes the input operation detection signal to determine the content of the user's operation, outputs a display command signal to the display 3 in response to the determination result, and switches the display image according to the user's operation.
  • the communication unit 6 communicates various data with the communication unit 13 of the center 11 through the communication network 21.
  • the communication network 21 includes a mobile communication network and a fixed communication network.
  • the memory 7 can store various data.
  • the center 11 includes a control unit 12, a communication unit 13, and a map database 14 that stores map data.
  • the control unit 12 is mainly configured by a microcomputer having a CPU, a ROM, a RAM, and the like.
  • the control unit 12 controls the overall operation of the center 11 by the CPU executing a control program stored in the ROM.
  • When the control unit 12 receives, by the communication unit 13 via the communication network 21, a map data request signal transmitted from the information communication terminal 1, it extracts the map data indicated by the received map data request signal from the map database 14 and transmits the extracted map data from the communication unit 13 to the information communication terminal 1 via the communication network 21.
  • When the control unit 2 receives, by the communication unit 6, the map data transmitted from the center 11 via the communication network 21, it stores the received map data in the memory 7. While the application for displaying the map image is being executed, the control unit 2 extracts from the memory 7 the map data designated by the user's touch operation on the touch screen, outputs a display command signal to the display 3, and displays the map image of that map data on the display 3.
  • The control unit 2 can display on the display 3 not only the map image of map data received (downloaded) from the center 11 in this way, but also the map image of map data stored in the memory 7 in advance (at the time of product shipment).
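The data flow just described — request map data from the center when it is not yet held locally, cache it in the memory 7, and draw it from there — can be sketched as follows. This is a minimal illustration only; the class and method names (MapCenter, Terminal, request, map_data) are assumptions made for the sketch and do not appear in the publication.

```python
class MapCenter:
    """Stands in for the center 11 and its map database 14."""
    def __init__(self):
        self._db = {("tokyo", 12): b"<map tile bytes>"}

    def request(self, area, scale):
        # corresponds to receiving a map data request signal and extracting the data
        return self._db.get((area, scale))

class Terminal:
    """Stands in for the information communication terminal 1."""
    def __init__(self, center):
        self._center = center
        self._memory = {}   # memory 7: holds downloaded map data (and could hold preinstalled data)

    def map_data(self, area, scale):
        key = (area, scale)
        if key not in self._memory:                   # not held locally: request it from the center
            self._memory[key] = self._center.request(area, scale)
        return self._memory[key]                      # the control unit 2 then draws it on display 3

terminal = Terminal(MapCenter())
print(terminal.map_data("tokyo", 12))
```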
  • The information communication terminal 1 also receives the voice spoken by the user from a microphone (not shown) and exchanges voice with the other party's telephone (not shown) via the communication network 21.
  • a tap is an operation in which a finger is lightly touched on the touch screen once.
  • Double tap is an operation of touching a finger lightly on the touch screen twice in succession.
  • a long tap is an operation (long pressing operation) in which a finger is kept touching the touch screen for a certain period of time.
  • Flicking is an operation of lightly sweeping a finger across the touch screen.
  • Dragging is an operation of moving (sliding) while keeping a finger touching the touch screen.
  • Pinch-in is an operation of narrowing the interval between fingers while touching two fingers on the touch screen.
  • Pinch out is an operation of increasing the distance between fingers while touching two fingers on the touch screen.
  • the rotation is an operation of rotating two fingers simultaneously while touching the touch screen.
  • operations for scrolling, reducing (zoom out), enlarging (zoom in) or rotating the display image include flicking, dragging, pinching in, pinching out, rotation, and the like.
  • When the controller 2 detects a flick operation while displaying the map image on the display 3, it activates the flick function and scrolls the map image in the direction in which the finger was swept.
  • When the control unit 2 detects a drag operation while displaying a map image, it activates the drag function and scrolls the map image in the direction in which the finger is moved.
  • When the control unit 2 detects a pinch-in operation while displaying a map image, it activates the zoom-out function and reduces the map image according to the operation amount (reduces the scale).
  • When the control unit 2 detects a pinch-out operation while displaying a map image, it activates the zoom-in function and enlarges the map image according to the operation amount (increases the scale).
  • When the control unit 2 detects a rotation operation while displaying a map image, it activates the rotation function and rotates the map image according to the operation amount. While an application for displaying the map image is running, the user can switch the display mode of the map image by using these operations appropriately and can display a target point. Here, "display mode" may also be read as display state or display manner.
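As a compact way of seeing the mapping above, the sketch below pairs each gesture with the display function it activates. The enum and constant names are illustrative assumptions, not identifiers from the publication.

```python
from enum import Enum, auto

class Gesture(Enum):
    FLICK = auto()
    DRAG = auto()
    PINCH_IN = auto()
    PINCH_OUT = auto()
    ROTATION = auto()

class DisplayFunction(Enum):
    SCROLL = auto()
    ZOOM_OUT = auto()
    ZOOM_IN = auto()
    ROTATE = auto()

# mapping of detected gestures to the display functions described above
GESTURE_TO_FUNCTION = {
    Gesture.FLICK: DisplayFunction.SCROLL,       # scroll in the swept direction
    Gesture.DRAG: DisplayFunction.SCROLL,        # scroll with the moving finger
    Gesture.PINCH_IN: DisplayFunction.ZOOM_OUT,  # reduce the scale by the operation amount
    Gesture.PINCH_OUT: DisplayFunction.ZOOM_IN,  # increase the scale by the operation amount
    Gesture.ROTATION: DisplayFunction.ROTATE,    # rotate by the operation amount
}

print(GESTURE_TO_FUNCTION[Gesture.PINCH_OUT])    # DisplayFunction.ZOOM_IN
```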
  • the control unit 2 can activate the zoom scroll function in addition to the various functions described above.
  • the zoom scroll function is a function of first scrolling a display image, subsequently zooming out the display image and simultaneously scrolling (zoom-out scroll), and subsequently zooming in the display image.
  • the control unit 2 activates the zoom scroll function when the following conditions are satisfied.
  • the control unit 2 executes the following process in connection with the present disclosure.
  • the control unit 2 has a function of measuring first to fifth set times described later.
  • The control unit 2 monitors whether or not the user's finger is touching the touch screen (S1). When the control unit 2 receives an operation detection signal from the operation detection unit 4 and determines that the user's finger is touching the touch screen (S1: YES), it determines whether or not the number of touching fingers is one (S2). When the control unit 2 determines that the number of touching fingers is not one (is two or more) (S2: NO), it proceeds to processing of a function different from the zoom scroll function and activates that other function (S3). Operations performed by the user with two or more fingers include pinch-in, pinch-out, and rotation.
  • If the control unit 2 determines that the user has performed a pinch-in operation, it activates the zoom-out function; if it determines that the user has performed a pinch-out operation, it activates the zoom-in function; and if it determines that the user has performed a rotation operation, it activates the rotation function. Then, when the activated function is completed, the control unit 2 returns to step S1 and continues to monitor whether or not the user's finger is touching the touch screen.
  • When the control unit 2 determines that the number of touching fingers is one (S2: YES), it determines whether or not the user's finger has remained touching for a first set time (a time shorter than the second set time described later) or longer (S4).
  • If the control unit 2 determines that the user's finger has not remained touching for the first set time or longer, that is, the user released the finger from the touch screen before the first set time elapsed (S4: NO), in this case as well it shifts to a function different from the zoom scroll function and activates that other function (S5). Operations in which the user does not keep one finger touching for the first set time or longer include tap and flick operations.
  • If the control unit 2 determines that the user has performed a tap operation, it activates the tap function; if it determines that the user has performed a flick operation, it activates the flick function. Then, when the activated function is completed, the control unit 2 returns to step S1 and continues to monitor whether or not the user's finger is touching the touch screen.
  • When the control unit 2 determines that the user's finger has remained touching for the first set time or longer, that is, the first set time has passed without the user releasing the finger from the touch screen (S4: YES), it determines whether or not the user's finger stays on the touch screen without moving (S6). If the control unit 2 determines that the user's finger has moved while touching the touch screen, that is, has not stayed in place (S6: NO), it also shifts to a function different from the zoom scroll function and activates that other function (S7). The operation in which the user moves one finger while keeping it touching for the first set time or longer is a drag operation.
  • If the control unit 2 determines that the user has performed a drag operation, it activates the drag function. Then, when the activated function is completed, the control unit 2 returns to step S1 and continues to monitor whether or not the user's finger is touching the touch screen.
  • Note that the control unit 2 determines that the user has not moved the finger when the finger moves only within a minute distance range (within an allowable range). That is, the control unit 2 treats the finger as not moved when, for example, the touching finger merely trembles (the user has no intention of moving it).
  • If the control unit 2 determines that the user's finger stays on the touch screen without moving (S6: YES), it determines whether or not a second set time (for example, 1 second) has elapsed since the user's finger touched the touch screen (S8). If the control unit 2 determines that the second set time has not yet elapsed since the user's finger touched the touch screen (S8: NO), it returns to step S4 and repeats steps S4, S6, and S8.
  • If the control unit 2 determines that the second set time has elapsed since the user's finger touched the touch screen (S8: YES), it proceeds to the zoom scroll process (see FIG. 3) and activates the zoom scroll function (S9). That is, when the control unit 2 determines that the user has kept one finger touching for the second set time or longer without moving it (a long press with one finger), it activates the zoom scroll function.
  • the position touched by the user may be any position on the touch screen.
  • As long as the control unit 2 can determine whether or not the user has kept one finger touching for the second set time or longer without moving it, the above-described steps S2, S4, S6, and S8 may be executed in any order.
  • By performing the processing described above, the control unit 2 activates the zoom scroll function only when it determines that the user has kept one finger touching for the second set time or longer without moving it; otherwise it activates a function other than the zoom scroll function, such as pinch-in or tap.
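A minimal sketch of that activation test (steps S2, S4, S6, and S8) follows, assuming a simple polled touch state. The Touch dataclass, the jitter allowance, and the helper name should_activate_zoom_scroll are illustrative assumptions; only the one-finger long-press condition and the 1-second example value come from the description.

```python
import math
from dataclasses import dataclass

SECOND_SET_TIME = 1.0   # seconds; the description gives 1 second as an example value
ALLOWED_JITTER = 5.0    # pixels treated as "not moved" (an assumed allowable range)

@dataclass
class Touch:
    finger_count: int
    start_pos: tuple
    current_pos: tuple
    held_seconds: float

def should_activate_zoom_scroll(touch: Touch) -> bool:
    if touch.finger_count != 1:                      # S2: two or more fingers -> other functions
        return False
    moved = math.dist(touch.start_pos, touch.current_pos) > ALLOWED_JITTER
    if moved:                                        # S6: the finger moved -> drag, not zoom scroll
        return False
    return touch.held_seconds >= SECOND_SET_TIME     # S4/S8: one-finger long press

print(should_activate_zoom_scroll(Touch(1, (100, 200), (101, 201), 1.2)))   # True
```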
  • the zoom scroll process includes a pre-zoom-out scroll process (see FIG. 4), a zoom-out scroll process (see FIG. 5), a post-zoom-out scroll process (see FIG. 6), and a zoom-in process (see FIG. 7).
  • the control unit 2 proceeds to the pre-zoom-out scroll process (S11).
  • In the pre-zoom-out scroll process, the control unit 2 identifies the position the user's finger is touching at that time and calculates the angle (direction) and distance of the identified position from the center of the display area (also referred to as the display screen center).
  • The control unit 2 calculates the scroll direction based on the calculated angle and the scroll speed based on the calculated distance (S21).
  • the control unit 2 sets the scroll speed to a relatively fast value if the distance is relatively long, and sets the scroll speed to a relatively slow value if the distance is relatively short.
  • The control unit 2 starts scrolling the map image according to the scroll direction and scroll speed calculated in this way (S22).
  • the control unit 2 calculates the scroll direction and the scroll speed with reference to the center of the display area, but any position on the display area may be used as a reference.
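The direction-and-speed rule just described can be sketched as follows: the angle of the touch point seen from the display-area center gives the scroll direction, and its distance gives the scroll speed. The linear speed mapping and the sign convention (the map content is translated opposite to the finger, matching the example in which touching the upper right scrolls the map toward the lower left) are assumptions for the sketch; the description only requires that a longer distance give a faster speed.

```python
import math

def scroll_vector(touch, center, max_speed=800.0, max_distance=400.0):
    """Return an (x, y) scroll velocity in px/s for a touch position and the display center."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    distance = math.hypot(dx, dy)
    if distance == 0.0:
        return (0.0, 0.0)
    # a longer distance from the center gives a faster scroll (linear mapping is an assumption)
    speed = max_speed * min(distance, max_distance) / max_distance
    # translate the map opposite to the finger: touching the upper right scrolls toward the lower left
    return (-speed * dx / distance, -speed * dy / distance)

print(scroll_vector((600, 100), (400, 300)))   # roughly (-400, 400): down and to the left on screen
```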
  • After starting the scroll, the control unit 2 determines whether or not a third set time (for example, 2 seconds) has elapsed since the scrolling of the map image started (S23), determines whether or not the user's finger has left the touch screen (S24), and determines whether or not the user's finger has been moved while touching the touch screen (S25).
  • When the control unit 2 determines that the third set time has elapsed without the user's finger leaving the touch screen (S23: YES), it ends the scrolling of the map image (S26), ends the pre-zoom-out scroll process, and returns to the zoom scroll process.
  • If the control unit 2 determines that the user's finger has left the touch screen before the third set time has elapsed (S24: YES), it likewise ends the scrolling of the map image (S26), ends the pre-zoom-out scroll process, and returns to the zoom scroll process. That is, the control unit 2 treats either the elapse of the third set time or the user's finger leaving the touch screen as an end condition for the pre-zoom-out scroll process.
  • If the control unit 2 determines that the user's finger has been moved while touching the touch screen (S25: YES), it identifies the position the finger is touching after the movement and recalculates the angle and distance of that post-movement position from the center of the display area.
  • The control unit 2 recalculates the scroll direction based on the recalculated angle and the scroll speed based on the recalculated distance (S27).
  • According to the scroll direction and scroll speed recalculated in this way, the control unit 2 dynamically changes the scroll direction and scroll speed to match the position after the movement (S28) and continues scrolling the map image.
  • After step S28, the process returns to step S23, and steps S23, S24, and S25 are executed repeatedly.
  • Each time the control unit 2 determines that the user's finger has been moved while touching the touch screen, it dynamically changes the scroll direction and scroll speed according to the position after the movement, as described above.
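Driven by sampled touch events, the pre-zoom-out scroll loop (S21 to S28) might look like the sketch below. The event format, the sampling, and the function names are assumptions; the 2-second example value and the two end conditions come from the description.

```python
THIRD_SET_TIME = 2.0   # seconds; the description gives 2 seconds as an example value

def pre_zoom_out_scroll(samples, scroll_vector, center):
    """samples: list of (elapsed_seconds, touch_position or None once the finger lifts)."""
    for elapsed, pos in samples:
        if elapsed >= THIRD_SET_TIME:      # S23: third set time elapsed -> end scrolling (S26)
            return "time_elapsed"
        if pos is None:                    # S24: finger left the touch screen -> end scrolling (S26)
            return "finger_released"
        vx, vy = scroll_vector(pos, center)   # S25/S27/S28: recalculated whenever the finger moves
        print(f"t={elapsed:.1f}s  scroll by ({vx:.0f}, {vy:.0f}) px/s")
    return "time_elapsed"

# usage with a trivial stand-in for the direction/speed rule sketched earlier
samples = [(0.0, (600, 100)), (1.0, (500, 150)), (2.1, (500, 150))]
print(pre_zoom_out_scroll(samples, lambda p, c: (c[0] - p[0], c[1] - p[1]), (400, 300)))
```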
  • When the pre-zoom-out scroll process ends, the control unit 2 determines whether or not the user's finger is away from the touch screen (S12). That is, the control unit 2 determines whether the factor that ended the pre-zoom-out scroll process was the elapse of the third set time or the user's finger leaving the touch screen.
  • If the control unit 2 determines that the user's finger is away from the touch screen, that is, that the pre-zoom-out scroll process ended because the user's finger left the touch screen (S12: YES), it ends the zoom scroll process (completes the zoom scroll function) and returns to the main process.
  • In the zoom-out scroll process, the control unit 2 first determines whether or not the zoom-out limit has been reached (S31).
  • The zoom-out limit is the scale at which zoom-out stops; an absolute scale (absolute value) set in advance by the user in a setting operation or at the time of product shipment may be adopted, or some other value may be employed.
  • If the zoom-out limit has already been reached (S31: YES), the control unit 2 ends the zoom-out scroll process and returns to the zoom scroll process.
  • Otherwise, the control unit 2 calculates the scroll direction and scroll speed as described for the pre-zoom-out scroll process (S32). Note that if the control unit 2 takes over the scroll direction and scroll speed in effect immediately before the pre-zoom-out scroll process ended, it may skip step S32 and adopt the inherited scroll direction and scroll speed.
  • The control unit 2 then starts zoom-out scrolling of the map image (S33). Specifically, the control unit 2 starts (resumes) the scrolling of the map image simultaneously with the start of zooming out (reducing) the map image. In this case, the control unit 2 zooms out the map image at a constant speed (zoom-out speed).
  • The control unit 2 determines whether or not the zoom-out limit has been reached (S34), whether or not the user's finger has left the touch screen (S35), and whether or not the user's finger has been moved while touching the touch screen (S36).
  • When the control unit 2 determines that the zoom-out limit has been reached without the user's finger leaving the touch screen (S34: YES), it ends the zoom-out scroll of the map image (S37), ends the zoom-out scroll process, and returns to the zoom scroll process.
  • If the control unit 2 determines that the user's finger moved away from the touch screen before the zoom-out limit was reached (S35: YES), it likewise ends the zoom-out scroll of the map image (S37), ends the zoom-out scroll process, and returns to the zoom scroll process. That is, the control unit 2 treats either reaching the zoom-out limit or the user's finger moving away from the touch screen as an end condition for the zoom-out scroll process.
  • If the control unit 2 determines that the user's finger has been moved while touching the touch screen (S36: YES), it recalculates the scroll direction and scroll speed in this case as well, as described for the pre-zoom-out scroll process (S38), dynamically changes the scroll direction and scroll speed according to the position after the movement (S39), and continues the zoom-out scroll of the map image. The process then returns to step S34, and steps S34, S35, and S36 are repeated. Each time the control unit 2 determines that the user's finger has been moved while touching the touch screen, it dynamically changes the scroll direction and scroll speed according to the position after the movement, as described above.
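Under the same sampled-event assumption as the earlier sketch, the zoom-out scroll step (S31 to S39) reduces the scale at a constant zoom-out speed until either the zoom-out limit is reached or the finger lifts. All numeric values below are illustrative assumptions; the description only requires a constant speed and a limit.

```python
ZOOM_OUT_SPEED = 0.5    # scale units per second (an assumed constant zoom-out speed)
ZOOM_OUT_LIMIT = 0.25   # scale at which zoom-out stops (an assumed absolute value)
FRAME = 1 / 30          # assumed update interval in seconds

def zoom_out_scroll(scale, samples):
    """samples: iterable of touch positions, None meaning the finger lifted."""
    for pos in samples:
        if scale <= ZOOM_OUT_LIMIT:      # S34: zoom-out limit reached -> end (S37)
            return scale, "limit_reached"
        if pos is None:                  # S35: finger left the touch screen -> end (S37)
            return scale, "finger_released"
        # S33: zoom out at a constant speed while the map keeps scrolling toward the
        # direction/speed derived from 'pos' (S36/S38/S39 recalculate on movement)
        scale = max(ZOOM_OUT_LIMIT, scale - ZOOM_OUT_SPEED * FRAME)
    return scale, "limit_reached"

print(zoom_out_scroll(1.0, [(600, 100)] * 60))   # reaches the assumed zoom-out limit
```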
  • When the zoom-out scroll process ends, the control unit 2 determines whether or not the user's finger is away from the touch screen (S14). That is, the control unit 2 determines whether the factor that ended the zoom-out scroll process was reaching the zoom-out limit or the user's finger leaving the touch screen.
  • If the control unit 2 determines that the user's finger is away from the touch screen, that is, that the zoom-out scroll process ended because the user's finger left the touch screen (S14: YES), it determines whether or not the user's finger has touched the touch screen again (S16) and whether or not a fourth set time (for example, 0.5 seconds) has elapsed since the user's finger left the touch screen (S17).
  • If the control unit 2 determines that the user's finger has not left the touch screen, that is, that the zoom-out scroll process ended because the zoom-out limit was reached (S14: NO), it proceeds to the post-zoom-out scroll process (S15).
  • In the post-zoom-out scroll process, the control unit 2 executes the same processing as the pre-zoom-out scroll process described above, except for the determination of whether the third set time has elapsed (S41 to S47). If the control unit 2 takes over the scroll direction and scroll speed in effect immediately before the zoom-out scroll process ended, it may omit step S41 and adopt the inherited scroll direction and scroll speed.
  • The control unit 2 uses the user's finger leaving the touch screen as the end condition for the post-zoom-out scroll process. When the post-zoom-out scroll process ends, the control unit 2 likewise determines whether or not the user's finger has touched the touch screen again (S16) and whether or not the fourth set time has elapsed since the user's finger left the touch screen (S17).
  • If the control unit 2 determines that the user's finger has touched the touch screen before the fourth set time has elapsed (S16: YES), it returns to step S13 described above.
  • If the control unit 2 determines that the fourth set time has elapsed without the user's finger touching the touch screen (S17: YES), it proceeds to the zoom-in process (S18).
  • When the zoom-in process is started, the control unit 2 starts zoom-in (enlargement) of the map image (S51), determines whether or not the zoom-in limit has been reached (S52), and monitors whether or not the user's finger touches the touch screen before a fifth set time elapses (S53, S54).
  • The zoom-in limit is the scale at which zoom-in stops; an absolute scale (absolute value) set in advance by the user in a setting operation or at the time of product shipment may be adopted, or the scale in use immediately before zoom-out started (a return value) may be adopted.
  • The control unit 2 zooms in the map image at a constant speed (zoom-in speed).
  • The control unit 2 may set the zoom-in speed to the same speed as the zoom-out speed described above or to a different speed.
  • When the control unit 2 determines that the zoom-in limit has been reached without the user's finger touching the touch screen (S52: YES), it ends the zoom-in of the map image (S55), ends the zoom-in process, and returns to the zoom scroll process.
  • If the control unit 2 determines that the user's finger has touched the touch screen before the zoom-in limit is reached and before the fifth set time has elapsed (S53: NO, S54: YES), it likewise ends the zoom-in of the map image (S55), ends the zoom-in process, and returns to the zoom scroll process. That is, the control unit 2 treats either reaching the zoom-in limit or the user's finger touching the touch screen before the fifth set time elapses as an end condition for the zoom-in process.
  • The procedure of determining YES in step S54 corresponds to the first procedure, and step S55 corresponds to the second procedure.
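A sketch of the zoom-in step (S51 to S55) under the same assumptions: the scale grows at a constant zoom-in speed and stops either at the zoom-in limit or as soon as the user touches the screen again, after which the zoom scroll function is restarted. Names and numeric values are illustrative assumptions.

```python
ZOOM_IN_SPEED = 0.5   # scale units per second (may equal the zoom-out speed or differ)
FRAME = 1 / 30        # assumed update interval in seconds

def zoom_in(scale, zoom_in_limit, touch_events):
    """touch_events: iterable of booleans, True meaning the user touched the screen again."""
    for touched in touch_events:
        if scale >= zoom_in_limit:       # S52: zoom-in limit reached -> end zoom-in (S55)
            return scale, "limit_reached"
        if touched:                      # S53/S54: touched before the fifth set time -> end (S55)
            return scale, "touched"      # the zoom scroll function is then restarted (back to S9)
        scale = min(zoom_in_limit, scale + ZOOM_IN_SPEED * FRAME)   # S51: constant-speed zoom-in
    return scale, "limit_reached"

# zoom back toward the scale used before zoom-out started (one possible zoom-in limit)
print(zoom_in(0.25, 1.0, [False] * 10 + [True]))   # ends early because the user touched again
```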
  • When the control unit 2 ends the zoom-in process and returns to the zoom scroll process, it ends the zoom scroll process (completes the zoom scroll function) and returns to the main process.
  • Back in the main process, the control unit 2 determines whether or not the user's finger is touching the touch screen (S10). In other words, the control unit 2 determines whether the factor that ended the zoom-in process (the factor that completed the zoom scroll function) was reaching the zoom-in limit or the user's finger touching the touch screen.
  • If the control unit 2 determines that the user's finger is not touching the touch screen, that is, that the zoom-in process ended because the zoom-in limit was reached (S10: NO), it returns to step S1 described above.
  • If the control unit 2 determines that the user's finger is touching the touch screen, that is, that the zoom-in process ended because the user's finger touched the touch screen (S10: YES), it returns to step S9 described above, re-enters the zoom scroll process, and restarts the zoom scroll function.
  • FIG. 8 shows an example of the processing described above in time series.
  • the control unit 2 activates the zoom scroll function when the second set time elapses without moving after the user touches the touch screen with one finger.
  • the controller 2 starts scrolling the map image first when the zoom scroll function is activated, and starts zoom-out scrolling of the map image when the third set time elapses.
  • the control unit 2 starts zooming in the map image when the fourth set time has elapsed since the user lifted the finger off the touch screen.
  • When the zoom-in limit is reached, the control unit 2 completes the zoom scroll function.
  • In this way, the user can activate the zoom scroll function by keeping one finger touching without moving it for the second set time or longer, can scroll the map image and then zoom-out scroll it by keeping the finger touching, and can zoom in on the map image by releasing the finger.
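The FIG. 8 time line can be condensed into a simple phase table, assuming the example set-time values quoted above (second: 1 s, third: 2 s, fourth: 0.5 s); the rest of the sketch merely restates the sequence already described.

```python
# Phase sequence condensed from FIG. 8; set-time values are the examples quoted above.
PHASES = [
    ("touch one finger, hold still",  "second set time (e.g. 1 s)",   "zoom scroll function activated"),
    ("keep touching",                 "third set time (e.g. 2 s)",    "scroll only (pre-zoom-out scroll)"),
    ("keep touching",                 "until the zoom-out limit",     "zoom-out scroll"),
    ("keep touching after the limit", "until the finger lifts",       "post-zoom-out scroll"),
    ("lift the finger",               "fourth set time (e.g. 0.5 s)", "zoom-in starts"),
    ("no further touch",              "until the zoom-in limit",      "zoom scroll function completes"),
    ("touch during zoom-in",          "before the fifth set time",    "zoom-in ends, function restarts"),
]

for action, duration, result in PHASES:
    print(f"{action:32s} | {duration:30s} | {result}")
```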
  • FIGS. 9 to 12 show the transition of the map image through the series of processes described above. Characters such as “A”, “B”, and “C” in FIG. 9 and the subsequent figures indicate blocks of the map image. The positions touched by the user shown in FIG. 9 and the subsequent figures are examples; the behavior is the same wherever the user touches on the touch screen.
  • When the user keeps one finger touching, for example at the upper right of the touch screen, the control unit 2 activates the zoom scroll function. Scrolling of the map image starts first, and the map image is scrolled in the direction from the center of the display area toward the lower left (the direction opposite to the upper right, where the user's finger touches, as seen from the display area center).
  • Subsequently, the control unit 2 starts zoom-out scrolling of the map image; the map image scrolls in the direction from the center of the display area toward the lower left while simultaneously zooming out, resulting in the display mode shown in FIG. That is, the “E” block displayed at the location the user keeps touching moves toward the lower left, and the “L” block is newly displayed at the location the user keeps touching.
  • Thereafter, the control unit 2 starts zooming in the map image, and the map image zooms in, resulting in the display mode shown in FIG. 10B. That is, the blocks around “I”, which were displayed near the center of the display area immediately before zoom-in, are enlarged and displayed.
  • In FIG. 10B, the block displayed near the center of the display area immediately before zoom-in is enlarged, but the block displayed near the location the user was touching just before zoom-in started may be enlarged and displayed instead.
  • As described above, the control unit 2 dynamically changes the scroll direction and scroll speed when the user moves the touching finger, both while scrolling the map image and during zoom-out scroll. That is, starting from the display mode shown in FIG. 11A, when the user moves one finger from the upper right to the upper left of the touch screen while the map image is scrolling, the control unit 2 dynamically changes the scroll direction and scroll speed, resulting in the display mode shown in FIG. 11B. In FIG. 11, the scroll direction is from the center of the display area toward the lower left before the user moves the finger, and from the center of the display area toward the lower right after the user moves the finger.
  • Likewise, starting from the display mode shown in FIG. 12A, when the user moves one finger from the upper right to the upper left of the touch screen during zoom-out scroll of the map image, the control unit 2 also dynamically changes the scroll direction and scroll speed, resulting in the display mode shown in FIG. 12B. In FIG. 12 as well, the scroll direction is from the center of the display area toward the lower left before the user moves the finger, and from the center of the display area toward the lower right after the user moves the finger.
  • FIGS. 11A and 12A show the display modes immediately before the user starts to move the touch position, and FIGS. 11B and 12B show the display modes immediately after the user finishes moving it; the control unit 2 also dynamically changes the scroll direction and scroll speed while the user is moving the touch position.
  • When the zoom-out limit is reached while the user keeps touching, the control unit 2 resumes scrolling following the zoom-out scroll of the map image (the post-zoom-out scroll).
  • That is, the user can continue scrolling the map image by keeping the finger touching even after the map image has been zoom-out scrolled.
  • FIG. 13 and FIG. 14 show the transition of the map image related to the series of processes described above.
  • When the user keeps touching after the zoom-out limit is reached, the control unit 2 resumes scrolling the map image; the map image scrolls in the direction from the center of the display area toward the lower left, resulting in the display mode shown in FIG. That is, the “L” block displayed at the location the user keeps touching moves toward the lower left, and the “N” block is newly displayed at the location the user keeps touching.
  • When the user then releases the finger, the control unit 2 starts zooming in the map image in this case as well, and the map image zooms in, resulting in the display mode shown in FIG.
  • When the user touches the touch screen while the map image is zooming in, the control unit 2 ends the zoom-in, completes the zoom scroll function once, and restarts the zoom scroll function from the scale at that time. Accordingly, the user can repeatedly restart the zoom scroll function by touching the screen during zoom-in.
  • FIG. 15 shows the transition of the map image related to the series of processes described above.
  • When the user touches the touch screen during zoom-in, the control unit 2 finishes zooming in the map image, temporarily ends the zoom scroll function, and restarts the zoom scroll function from the current scale. Thereafter, when the user keeps touching with one finger, the control unit 2 starts scrolling the map image, the display mode shown in FIG. 15B is obtained, and the map image is switched according to the user's operation.
  • As described above, the information communication terminal 1 ends the zoom-in of the map image when the user touches the touch screen with a finger while the map image is zooming in, even if the zoom-in limit has not been reached. Accordingly, it can meet the user's request to end zoom-in while the map image is being zoomed in, and as a result the operability can be improved by enhancing the performance of the zoom-in function. In addition, when the zoom-in of the map image ends and the zoom scroll function is completed, the zoom scroll function is restarted. This makes it possible to alternately repeat zoom-out scrolling, which is suited to displaying a faraway destination, and zooming in, which is suited to fine-tuning the display position of the destination, further improving operability.
  • For example, a usage pattern can be adopted in which the user zoom-out scrolls to move far toward a distant destination, zooms in a little, zooms out a little again if needed, then zooms in once more and finally displays the destination at the center of the display area.
  • Moreover, the zoom-in of the map image is ended by the simple operation of touching the touch screen before the fifth set time has elapsed since the user lifted the finger off the touch screen, and the zoom scroll function can then be restarted.
  • For the portable information communication terminal 1, a usage style is assumed in which, when operating with one hand, the user holds the casing 1a with the four fingers other than the thumb and touches the touch screen with only the thumb.
  • In such a usage style, where only the thumb touches the touch screen, it is difficult to activate zoom-out and zoom-in functions that conventionally require two-finger operation. With the present configuration, the zoom-out and zoom-in functions can be used without needing two fingers, and zoom-in and zoom-out can be repeated alternately; as a result, the operability can be greatly improved.
  • the present disclosure is not limited to the above-described embodiment, and can be modified or expanded as follows. A plurality of modified examples may be combined.
  • the present invention is not limited to a portable information communication terminal, and may be applied to a fixed type device. For example, the user may touch the touch screen using a pen-shaped tool.
  • the display image is not limited to a map image and may be any image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

When an information communication terminal (1) finishes zooming out and scrolling a map image, the information communication terminal starts (S51) zooming in on the map image; during the zoom-in, even if a zoom-in limit has not been reached ("NO" in S52), if a user touches a touch screen with a finger ("YES" in S54), the information communication terminal ends (S55) the zoom-in of the map image and reactivates the zoom scroll function. Thus, while meeting the user's request to stop zooming in on the map image while that zoom-in is in progress, the zoom scroll feature can be quickly reactivated.
PCT/JP2014/003752 2013-08-01 2014-07-16 Dispositif d'affichage d'image, procédé d'affichage d'image et produit programme d'affichage d'image WO2015015733A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-160402 2013-08-01
JP2013160402A JP2015032096A (ja) 2013-08-01 2013-08-01 画面表示装置、画面表示方法及び画面表示プログラム

Publications (1)

Publication Number Publication Date
WO2015015733A1 true WO2015015733A1 (fr) 2015-02-05

Family

ID=52431296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/003752 WO2015015733A1 (fr) 2013-08-01 2014-07-16 Dispositif d'affichage d'image, procédé d'affichage d'image et produit programme d'affichage d'image

Country Status (2)

Country Link
JP (1) JP2015032096A (fr)
WO (1) WO2015015733A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4358882B2 (ja) * 2007-12-28 2009-11-04 株式会社ナビタイムジャパン 地図表示システム、地図表示装置および地図表示方法
MX2012014258A (es) * 2010-06-30 2013-01-18 Koninkl Philips Electronics Nv Acercar una imagen presentada.

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006145930A (ja) * 2004-11-22 2006-06-08 Sony Corp 表示装置、表示方法、表示プログラム及び表示プログラムを記録した記録媒体
JP2009200640A (ja) * 2008-02-19 2009-09-03 Canon Inc 映像表示装置およびその制御方法、映像出力装置およびその制御方法、ならびに、映像出力システムおよび映像出力の制御方法
JP2009300328A (ja) * 2008-06-16 2009-12-24 Denso Corp 地図表示装置及び地図表示プログラム
JP2010107593A (ja) * 2008-10-28 2010-05-13 Nippon Telegr & Teleph Corp <Ntt> プレゼンテーション画像の提示システム、提示方法、提示プログラム及びその記録媒体

Also Published As

Publication number Publication date
JP2015032096A (ja) 2015-02-16

Similar Documents

Publication Publication Date Title
US11966558B2 (en) Application association processing method and apparatus
US10216407B2 (en) Display control apparatus, display control method and display control program
JP5970086B2 (ja) タッチスクリーンホバリング入力処理
JP5946462B2 (ja) 携帯端末機及びその画面制御方法
JP6130096B2 (ja) タッチスクリーン端末機及びその端末機における画面ディスプレーの制御方法
KR102097496B1 (ko) 폴더블 이동 단말기 및 그 제어 방법
US8847978B2 (en) Information processing apparatus, information processing method, and information processing program
  • WO2016138661A1 Procédé de traitement pour une interface utilisateur d'un terminal, interface utilisateur et terminal
JP2014505315A (ja) デバイスパネルの相対的な移動を用いたユーザー命令の入力方法及び装置
JP5638570B2 (ja) 画像表示装置、画像表示方法、及び、画像表示プログラム
JP6032702B2 (ja) タッチスクリーンを具備する電子機器における画面拡大装置及び方法
KR20150095541A (ko) 사용자 단말 장치 및 이의 디스플레이 방법
WO2015015732A1 (fr) Dispositif d&#39;affichage d&#39;image, procédé d&#39;affichage d&#39;image et produit-programme d&#39;affichage d&#39;image
JP6096100B2 (ja) 電子機器、制御方法、及び制御プログラム
JP5906344B1 (ja) 情報処理装置、情報表示プログラムおよび情報表示方法
KR101460363B1 (ko) 터치 스크린 입력을 이용한 화면 확대/축소 방법 및 장치
WO2015015733A1 (fr) Dispositif d&#39;affichage d&#39;image, procédé d&#39;affichage d&#39;image et produit programme d&#39;affichage d&#39;image
WO2015015731A1 (fr) Dispositif d&#39;affichage d&#39;images, procédé d&#39;affichage d&#39;images, et produit de programme d&#39;affichage d&#39;images
JP6730972B2 (ja) 情報制御プログラム、情報制御方法および端末装置
WO2018123701A1 (fr) Dispositif électronique, procédé de commande associé et programme
JP2014174779A (ja) 入力装置及びそれを用いた携帯端末装置
JP6194383B2 (ja) 情報処理装置、情報表示プログラムおよび情報表示方法
AU2015258317B2 (en) Apparatus and method for controlling motion-based user interface
JP5516794B2 (ja) 携帯情報端末、表示制御方法およびプログラム
JP2014044673A (ja) 携帯端末装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14832895

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14832895

Country of ref document: EP

Kind code of ref document: A1