CN113678089A - Control device, system, and program - Google Patents

Control device, system, and program

Info

Publication number
CN113678089A
Authority
CN
China
Prior art keywords
unit
notification
display
display unit
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080024345.4A
Other languages
Chinese (zh)
Inventor
野村圭司
高井敏仁
村田健二
中根启太
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tokai Rika Co Ltd
Original Assignee
Tokai Rika Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tokai Rika Co Ltd filed Critical Tokai Rika Co Ltd
Priority claimed from PCT/JP2020/006862 external-priority patent/WO2020195409A1/en
Publication of CN113678089A publication Critical patent/CN113678089A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The operability is further improved. Provided is a control device including: a reception unit that receives an operation performed on at least one operation unit; and a control unit that performs notification processing in which at least one of the images displayed on a display unit is treated as a target image and a notification unit is controlled so as to issue a notification in conjunction with at least one target image that moves in accordance with the content of the operation received by the reception unit, wherein the operation unit and the display unit are disposed at positions at which it is difficult for a user to intuitively grasp the correspondence between operation coordinates defining the position of an operation performed on the operation unit and display coordinates defining the position of an image displayed on the display unit.

Description

Control device, system, and program
Technical Field
The present invention relates to a control device, a system, and a program.
Background
In recent years, a large number of systems using a GUI (Graphical User Interface) have been developed. For example, patent document 1 discloses a system in which a function corresponding to an instruction input option displayed on a monitor can be executed by operating a touch panel.
The system disclosed in patent document 1 is characterized in that the shape of the touch panel substantially coincides with the shape of the display area. This allows a user operating the touch panel to intuitively grasp the cursor position on the display area and to perform an arbitrary operation without constantly watching the monitor.
However, in a system using a GUI, it is sometimes difficult for the user to intuitively grasp the correspondence between the operation coordinates defining the position of an operation performed on the operation unit and the display coordinates defining the position of an image displayed on the display unit. In such a system, the user may lose track of the operation position, and operability may be degraded.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open Publication No. 2003-108311
Disclosure of Invention
Problems to be solved by the invention
The present invention has been made in view of the above problems, and an object of the present invention is to provide a structure capable of further improving operability.
Means for solving the problems
In order to solve the above problem, according to an aspect of the present invention, there is provided a control device including: a reception unit that receives an operation performed on at least one operation unit; and a control unit that performs notification processing in which at least one of the images displayed on a display unit is treated as a target image and a notification unit is controlled so as to issue a notification in conjunction with at least one target image that moves in accordance with the content of the operation received by the reception unit, wherein the operation unit and the display unit are disposed at positions at which it is difficult for a user to intuitively grasp the correspondence between operation coordinates defining the position of an operation performed on the operation unit and display coordinates defining the position of an image displayed on the display unit.
In order to solve the above problem, according to another aspect of the present invention, there is provided a system including: at least one operation unit; a reception unit that receives an operation performed on the operation unit; a display unit that displays at least one image; and a control unit that performs notification processing in which at least one of the images displayed on the display unit is treated as a target image and a notification unit is controlled so as to issue a notification in conjunction with at least one target image that moves in accordance with the content of the operation received by the reception unit, wherein the operation unit and the display unit are disposed at positions at which it is difficult for a user to intuitively grasp the correspondence between operation coordinates defining the position of an operation performed on the operation unit and display coordinates defining the position of an image displayed by the display unit.
In order to solve the above problem, according to another aspect of the present invention, there is provided a program for causing a computer to realize: a reception function of receiving an operation performed on at least one operation unit; and a control function of performing notification processing in which at least one of the images displayed on a display unit is treated as a target image and a notification unit is controlled so as to issue a notification in conjunction with at least one target image that moves in accordance with the content of the operation received by the reception function, wherein the operation unit and the display unit are disposed at positions at which it is difficult for a user to intuitively grasp the correspondence between operation coordinates defining the position of an operation performed on the operation unit and display coordinates defining the position of an image displayed on the display unit.
Advantageous Effects of Invention
As described above, according to the present invention, a structure capable of further improving operability is provided.
Drawings
Fig. 1 is a diagram showing an example of a functional configuration of a system 1 according to an embodiment of the present invention.
Fig. 2 is a diagram for explaining the notification processing performed by the control unit 420 according to this embodiment.
Fig. 3 is a diagram for explaining the notification processing performed by the control unit 420 according to this embodiment.
Fig. 4 is a diagram for explaining the notification processing performed by the control unit 420 according to this embodiment.
Fig. 5 is a diagram for explaining the notification processing performed by the control unit 420 according to this embodiment.
Fig. 6 is a flowchart showing a flow of the operation of the system 1 according to the embodiment.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the present specification and the drawings, the same reference numerals are given to the constituent elements having substantially the same functional configurations, and the overlapping description is omitted.
< embodiment >
< configuration example >
First, a configuration example of the system 1 according to an embodiment of the present invention will be described. Fig. 1 is a diagram showing an example of a functional configuration of a system 1 according to the present embodiment. As shown in fig. 1, the system 1 according to the present embodiment includes, for example, an operation device 10, a display device 20, a notification device 30, and a control device 40.
(operation device 10)
The operation device 10 according to the present embodiment is a device to which a user performs an operation. The operation device 10 according to the present embodiment may be, for example, a touch panel, a trackball, a mouse, a wheel, a slide switch, or the like. The operation device 10 according to the present embodiment includes an operation unit 110.
The operation unit 110 according to the present embodiment has a function of detecting an operation by a user. Therefore, the operation unit 110 may include various detection mechanisms according to the form of the operation device 10. For example, when the operation device 10 is a touch panel, the operation unit 110 may include a pressure sensor for converting a pressure change caused by a user operation into an electric signal, and a capacitance sensor for converting a capacitance change caused by the user operation into an electric signal. The operation unit 110 may be provided with a switch for detecting a pressing operation by the user. The operation unit 110 transmits a signal related to the detected user operation to the control device 40.
(display device 20)
The display device 20 according to the present embodiment is a device that displays visual information such as an image. The display device 20 according to the present embodiment may be any of various display devices. The display device 20 according to the present embodiment includes a display unit 210.
The display unit 210 according to the present embodiment may display an image based on a signal input from the control device 40. For example, the display unit 210 moves at least one object image on the display area based on a signal input from the control device 40. Here, the above-described target image means an image to be operated by the user among the images displayed on the display unit 210. The object image according to the present embodiment may be, for example, an icon, a button, a thumbnail, a character string, or the like.
(Notification device 30)
The notification device 30 according to the present embodiment is a device that performs various notifications to the user in accordance with the control performed by the control device 40. The notification device 30 according to the present embodiment includes a notification unit 310.
The notification unit 310 according to the present embodiment performs notification based on a control signal input from the control device 40. To this end, the notification unit 310 includes various output means according to the form of notification. For example, when notification is performed through the user's sense of touch, the notification unit 310 may include various actuators that can generate vibration stimulation, electrical stimulation, pressure stimulation, thermal (warm/cold) stimulation, and the like. For example, when notification using a vibration stimulus is performed, the notification unit 310 may include an eccentric rotating mass motor (ERM), a linear resonant actuator (LRA), a piezoelectric (piezo) element, a voice coil motor, or the like. The notification unit 310 may include a plurality of actuators of the same type or of different types.
For example, when the notification is performed by the auditory sense of the user, the notification unit 310 may include a speaker or a microphone. For example, when a notification is performed by the user's vision, the notification unit 310 may be provided with various illuminations for emitting light.
Note that the notification by the notification unit 310 may combine a plurality of different types of stimuli. For example, the notification unit 310 may perform notification using vibration and sound. In addition, for example, the notification unit 310 may perform notification using electrical stimulation and light. The notification unit 310 according to the present embodiment performs notification using at least one of vibration, sound, and light based on control performed by the control device 40.
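The combination of stimuli described above can be sketched as a simple dispatch in which the notification unit receives a set of stimulus types and drives the corresponding output means. This is an illustrative model only: the names `Stimulus` and `NotificationUnit.notify` are hypothetical and do not appear in the patent, and real actuators are replaced by a log.

```python
from dataclasses import dataclass, field
from enum import Flag, auto

class Stimulus(Flag):
    """Stimulus types that a notification unit like 310 might combine."""
    VIBRATION = auto()
    SOUND = auto()
    LIGHT = auto()

@dataclass
class NotificationUnit:
    """Toy model of the notification unit: instead of driving real
    actuators, speakers, or lights, it records which stimuli were
    used for each notification."""
    log: list = field(default_factory=list)

    def notify(self, stimuli: Stimulus) -> None:
        # A real unit would drive an LRA, a speaker, illumination, etc.
        for s in Stimulus:
            if s in stimuli:
                self.log.append(s.name)

unit = NotificationUnit()
unit.notify(Stimulus.VIBRATION | Stimulus.SOUND)  # combined notification
```

A `Flag` enum is used so that stimulus types can be combined with bitwise OR, mirroring the "plurality of different types of stimuli" described above.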
(control device 40)
The control device 40 according to the present embodiment is a device that receives an operation performed by a user on the operation device 10 and controls notification in conjunction with a behavior of an image displayed on the display device 20. The control device 40 according to the present embodiment includes a reception unit 410, a control unit 420, and a storage unit 430.
The reception unit 410 according to the present embodiment has a function of receiving an operation performed on at least one operation unit 110. For example, the reception unit 410 receives an electric signal generated by the operation unit 110 based on the detected user operation, and receives an operation corresponding to the electric signal.
The control unit 420 according to the present embodiment performs notification processing in which at least one of the images displayed on the display unit 210 is treated as a target image, and the notification unit 310 is controlled so as to issue a notification in conjunction with at least one target image that moves in accordance with the content of the operation received by the reception unit 410. The function of the control unit 420 is realized by cooperation of a processor such as a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like. The notification processing according to the present embodiment will be described in detail later.
The storage unit 430 according to the present embodiment stores various information related to the operation of the control device 40. The storage unit 430 stores, for example, a program for the control unit 420 to execute notification processing.
In the above, an example of the functional configuration of the system 1 according to the present embodiment is described. The functional configuration shown in fig. 1 is merely an example, and the functional configuration of the system 1 according to the present embodiment is not limited to the above example. For example, the operation device 10 and the notification device 30 may be formed as an integrated device. In this case, for example, by executing notification using a vibration stimulus, notification as feedback on the operation performed by the user on the operation device 10 can be realized. The functional configuration of the system 1 according to the present embodiment can be flexibly modified in accordance with the specification and the operation.
In addition to the above features, the system 1 according to the present embodiment is characterized in that: the operation unit 110 and the display unit 210 according to the present embodiment are respectively disposed at positions where it is difficult for the user to intuitively grasp the correspondence between the operation coordinates for defining the position of the operation performed on the operation unit 110 and the display coordinates for defining the position of the image displayed on the display unit 210. For example, the operation unit 110 and the display unit 210 may be arranged at positions where the operation unit 110 and the display unit 210 are distant from each other, as positions where it is difficult for the user to intuitively grasp the correspondence between the operation coordinates and the display coordinates.
On the other hand, as a mechanism capable of intuitively grasping the correspondence between the operation coordinates and the display coordinates, for example, a touch panel used in a smartphone or the like can be cited. When a user operates the touch panel, the user can select an object image such as an icon displayed in the display area or the operation area by touching the display position of the object image while visually checking the object image. In this case, the operation coordinates and the display coordinates can be said to substantially completely coincide with each other.
On the other hand, in a case where the operation unit 110 and the display unit 210 are provided as separate devices, it may be difficult for the user to intuitively grasp the correspondence between the operation coordinates and the display coordinates. For example, assume the following case: in a vehicle compartment of a mobile body such as a vehicle, the operation unit 110 and the display unit 210 are provided at positions distant from each other. For example, the operation unit 110 may be disposed on a spoke of the steering wheel. In addition, the display unit 210 may be provided in front of the driver's seat as (a part of) an instrument panel, or may be disposed on a center console or the like, for example. In such a case, the user operates the object image displayed on the display unit 210 using the operation unit 110 disposed at a position distant from the display unit 210. However, in this case, it is difficult for the user to simultaneously watch the operation unit 110 and the display unit 210. In addition, when the user is a driver, it is difficult to keep looking at the display unit 210 and the operation unit 110. Therefore, the user cannot intuitively grasp the correspondence between the operation coordinates and the display coordinates, and thus there is a possibility that an intended operation cannot be accurately performed or a meaningless operation may be repeatedly performed.
For example, it is also possible to say that it is difficult to intuitively grasp the correspondence between the operation coordinates and the display coordinates when the touch panel is disposed on the back surface of a smartphone or a gamepad (gamepad) as a part of the operation unit 110. As described above, the operation unit 110 and the display unit 210 according to the present embodiment may not necessarily be provided in separate devices.
The technical idea of the present invention is to further improve operability in a system using a GUI, which is conceived in view of the above-described problems. Therefore, the control device 40 according to an embodiment of the present invention includes: a reception unit 410 that receives an operation performed on at least one operation unit 110; and a control unit 420 for performing a notification process for controlling the notification unit 310 such that the notification unit 310 performs notification in conjunction with at least one of the target images moving in accordance with the content of the operation received by the reception unit 410, with at least one of the images displayed on the display unit 210 being the target image. In addition, one of the features is: the operation unit 110 and the display unit 210 according to an embodiment of the present invention are disposed at positions where it is difficult for a user to intuitively grasp the correspondence between operation coordinates for defining the position of an operation performed on the operation unit 110 and display coordinates for defining the position of an image displayed on the display unit 210. The notification process performed by the control device 40 will be described in detail below.
< details >
Hereinafter, a case where the system 1 according to the present embodiment is installed in a vehicle cabin of a mobile body such as a vehicle will be mainly described as an example. For example, at least one or more operation portions 110 may be provided on a spoke of the steering wheel. The notification unit 310 is formed integrally with the operation unit 110 and is used to notify using a vibration stimulus. The display unit 210 may be (a part of) an instrument panel or may be provided on a center console.
The notification process performed by the control unit 420 will be described in detail below with specific examples. Fig. 2 to 5 are diagrams for explaining the notification processing performed by the control unit 420. For example, in the case of the example shown in fig. 2, the control unit 420 executes the notification process based on the range (movable range) in which the target image can move according to the content of the operation.
Fig. 2 shows a case where 3 object images I1 to I3 displayed in the display area DA of the display unit 210 move in accordance with the operation of the user. The object images I1 to I3 may be icons corresponding to predetermined functions. The functions corresponding to the object images I1 to I3 can be designed arbitrarily. For example, when the system 1 is mounted on a mobile body such as a vehicle, the object images I1 to I3 may correspond to functions such as vehicle body control, air conditioning control, navigation control, and volume control.
In this case, the user can move or select the object images I1 to I3 displayed in the display area DA of the display unit 210 by operating the operation unit 110. In the example shown in fig. 2, it is assumed that the user can move (scroll) the object images I1 to I3 in the vertical direction by performing a drawing operation (sliding operation) in the vertical direction on the operation unit 110. That is, the control unit 420 may instruct the display unit 210 to move the object images I1 to I3 based on the drawing operation received by the reception unit 410. At this time, the control unit 420 may instruct the display unit 210 to move the object images I1 to I3 in the same manner. In this case, the user can move the plurality of object images I1 to I3 simultaneously by a single drawing operation.
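The uniform movement of all object images under one drawing operation can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name and the coordinate values are hypothetical, and display coordinates are assumed to be pixel (x, y) pairs with y increasing downward.

```python
def scroll_objects(positions, delta_y):
    """Move every object image by the same vertical offset, so that a
    single drawing (sliding) operation scrolls all of them together.

    positions -- list of (x, y) display coordinates, e.g. for I1..I3
    delta_y   -- vertical displacement derived from the drawing operation
    """
    return [(x, y + delta_y) for (x, y) in positions]

# Hypothetical positions of I1, I2, I3 stacked vertically:
icons = [(100, 50), (100, 150), (100, 250)]
scrolled = scroll_objects(icons, -40)  # an upward swipe of 40 px
```

Because every image receives the same offset, the relative layout of I1 to I3 is preserved while the whole column scrolls.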
In the case of the example shown in fig. 2, the target image located near the center in the vertical direction of the display area DA is set to the selected state. That is, the user can select an object image by scrolling the object images I1 to I3 in the vertical direction by performing a drawing operation in the vertical direction on the operation unit 110, and moving an arbitrary object image to the vicinity of the center (hereinafter simply referred to as the vicinity of the center) in the vertical direction of the display area DA. For example, a state in which the object image I2 is selected is shown on the left side of fig. 2.
In the above state, when the user wants to select the object image I3 instead of the object image I2, the user performs a drawing operation in the upward direction (or in the downward direction) on the operation unit 110 to move the object image I3 to the vicinity of the center of the display area DA. However, if the movable range of the object images I1 to I3 in the vertical direction is not limited and the drawing operation is performed excessively, all of the object images I1 to I3 disappear from the display area DA, and the user may lose sight of them. Therefore, in this example, the movable range SR of the object images I1 to I3 is determined in advance. The movable range SR can be set by, for example, an upper movement boundary region SRu and a lower movement boundary region SRd.
In the case of the example shown in fig. 2, the movement boundary region SRu is set so that the object image I3 is located near the center of the display area DA when the upper end of the object image I1 reaches the movement boundary region SRu. Similarly, the movement boundary region SRd is set so that the object image I1 is located near the center of the display area DA when the lower end of the object image I3 reaches the movement boundary region SRd.
On the other hand, even when the movable range SR of the object images I1 to I3 is determined as described above, the following possibility remains: when the user is not looking at the display area DA, the user may continue the drawing operation without noticing that the object image I1 has reached the movement boundary region SRu (i.e., that the object image I3 is already in the selected state), as shown on the right side of fig. 2. Likewise, the user may continue the drawing operation without noticing that the object image I3 has reached the movement boundary region SRd (i.e., that the object image I1 is already in the selected state).
In order to avoid such a situation, the control unit 420 according to the present embodiment may execute the notification process when at least one of the plurality of target images reaches an end of the movable range SR. For example, in the case of the example shown on the right side of fig. 2, the control unit 420 may cause the notification unit 310 to perform notification using a vibration stimulus or the like in response to the drawing operation causing the target image I1 to reach the movement boundary region SRu, which is one end of the movable range SR. Similarly, the control unit 420 may cause the notification unit 310 to perform notification in response to the drawing operation causing the target image I3 to reach the movement boundary region SRd, which is the other end of the movable range SR. The above control performed by the control unit 420 achieves the following effect: the user can intuitively grasp that the target image has reached the end of the movable range SR, and thus refrains from further unnecessary drawing operations.
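A rough sketch of this boundary check follows. All names, the coordinate convention (y increasing downward, so the upper boundary has the smaller value), and the idea of passing a notifier callback are assumptions of this illustration, not part of the patent.

```python
def apply_scroll(top, bottom, delta_y, sr_upper, sr_lower, notifier):
    """Scroll the icon column by delta_y, clamped to movable range SR.

    top, bottom        -- current top of I1 and bottom of I3
    sr_upper, sr_lower -- movement boundary regions SRu and SRd
    notifier           -- callback standing in for notification unit 310

    When the column's top hits SRu (or its bottom hits SRd), the motion
    is clamped and a notification fires, so a user who is not watching
    the display can stop the now-useless drawing operation.
    Returns the (possibly clamped) offset actually applied.
    """
    if top + delta_y < sr_upper:       # I1 reached SRu
        notifier("SRu")                # e.g. one strong vibration pulse
        return sr_upper - top
    if bottom + delta_y > sr_lower:    # I3 reached SRd
        notifier("SRd")                # e.g. a different vibration form
        return sr_lower - bottom
    return delta_y                     # within SR: no notification

events = []
applied = apply_scroll(top=60, bottom=300, delta_y=-100,
                       sr_upper=0, sr_lower=400, notifier=events.append)
```

Passing a different label per boundary also accommodates the idea, described next, of using different notification forms for SRu and SRd.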
Further, the control unit 420 may cause the notification unit 310 to perform different forms of notification when the object image I1 reaches the movement boundary region SRu and when the object image I3 reaches the movement boundary region SRd. The form of notification may differ in the number of notifications, the notification duration, the notification intensity, or other parameters of the stimulus used for the notification. For example, when notification using a vibration stimulus is performed, the control unit 420 can cause the notification unit 310 to perform notification using vibration stimuli of different forms by changing parameters related to acceleration, frequency, and the like.
In fig. 2, a case is illustrated in which the movable range SR is determined by the movement boundary region SRu set for the upward direction and the movement boundary region SRd set for the downward direction. However, the setting of the movable range SR according to the present embodiment is not limited to the above example. Movement boundary regions are not limited to the upward and downward directions, and may be set arbitrarily for any direction. For example, the movable range SR may be determined by a movement boundary region set for the leftward direction and a movement boundary region set for the rightward direction. The movable range SR may also be determined by movement boundary regions set for eight directions: up, down, left, right, and the four diagonal directions. In addition, two movement boundary regions according to the present embodiment need not necessarily be set for opposite directions. For example, the movable range SR may be determined only by the movement boundary region SRu set for the upward direction. In this case, although the range in which the object image can be moved in the upward direction is limited, the object image can be moved in the downward direction without limitation.
In fig. 2, the case where the object image is an icon corresponding to a predetermined function is illustrated, but the object image according to the present embodiment is not limited to the above example. For example, when control based on the movable range SR is performed as described above, the target image may be a rotating (drum-roll) selector for selecting and inputting a number, a date, or the like.
The target image according to the present embodiment may be, for example, an image that is pulled out over the front of the image displayed in the display area DA by a drawing operation (referred to in this specification as a pull-out image). For example, while an image related to an arbitrary application is displayed, a drawing operation from one edge (first edge) of the display area DA toward the opposite edge (second edge) pulls the pull-out image out over the front of the application image, from the one edge toward the other. The pull-out image may be, for example, an image for performing general settings of the system 1.
The movable range SR may be set for the above-described pull-out image so that the pull-out image stays in the display area DA even when the drawing operation is performed excessively. The control unit 420 may execute the notification process to prevent the user from continuing an unnecessary drawing operation without noticing that the end of the pull-out image has reached the movement boundary region.
The left side of fig. 3 illustrates a pull-out image OS pulled out from the upper end of the display area DA. In this case, the user pulls the pull-out image OS downward by performing a drawing operation from the upper end of the display area DA, serving as the first edge, toward the second edge opposite to the first edge, that is, the lower end.
As shown on the right side of the figure, the control unit 420 may execute the notification process when the lower end of the pull-out image OS reaches the second edge (i.e., the lower end) of the display area DA by the drawing operation received by the reception unit 410. In this case, the movement boundary region may be the second edge of the display area DA. The above control performed by the control unit 420 provides the following effect: the user can intuitively grasp that the pull-out image OS has reached the movement boundary region, that is, that the entire pull-out image OS is displayed, and thus refrains from further unnecessary drawing operations.
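The pull-out case can be sketched the same way as the scroll case. Again the names, coordinates, and callback are hypothetical illustrations (y increases downward, so the second edge at the lower end has the largest coordinate).

```python
def drag_pullout(current_bottom, drag_delta, second_edge, notifier):
    """Drag a pull-out image (like OS) downward from the first edge.

    The movement boundary region here is the second edge itself: when
    the image's lower end reaches it, the whole pull-out image is
    displayed, so the motion is clamped and a notification fires to
    tell the user that further dragging is useless.
    Returns the new position of the image's lower end.
    """
    new_bottom = current_bottom + drag_delta
    if new_bottom >= second_edge:
        notifier("edge-reached")   # e.g. a vibration stimulus
        return second_edge
    return new_bottom

hits = []
# Hypothetical display: second edge (lower end of DA) at y = 480.
pos = drag_pullout(current_bottom=300, drag_delta=200,
                   second_edge=480, notifier=hits.append)
```

Replacing `second_edge` with a line a predetermined distance short of it covers the variation described below, where the movement boundary region lies between the two edges.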
In the example shown in fig. 3, the case where the movement boundary region is the second edge of the display area DA has been described, but the setting of the movement boundary region for the pull-out image is not limited to the above example. The movement boundary region associated with the pull-out image can be set anywhere between the first edge and the second edge. The movement boundary region associated with the pull-out image may be, for example, a position (line) located a predetermined distance from the second edge toward the first edge.
The first edge and the second edge according to the present embodiment are not limited to the upper end and the lower end of the display area DA, respectively. The first edge and the second edge may each be any edge that the display area DA has. For example, the pull-out image may be pulled out from the left end toward the right end of the display area DA, or may be pulled out from the upper right end toward the lower left end.
The notification processing based on the movable range SR of the target image is described above with specific examples. Next, the notification processing based on the positional relationship between the specific region set in the display region DA and the target image according to the present embodiment will be described.
The left side of fig. 4 shows a specific area SA set near the center of the display area DA and an object image I1 located outside the specific area SA. Here, when the user performs a drawing operation on the operation unit 110 and the object image I1 reaches the inside of the specific area SA as shown on the right side of the figure, the control unit 420 executes the notification process. Here, the object image I1 reaching the inside of the specific area SA may mean, for example, that the entire object image I1 is accommodated inside the specific area SA. Alternatively, the control unit 420 may determine that the object image I1 has reached the inside of the specific area SA when, in accordance with the drawing operation, an edge of the object image I1 touches the outer periphery of the specific area SA or a part of the object image I1 overlaps a part of the specific area SA.
The operation of moving the object image to the inside of the specific area SA as described above may be, for example, an operation for executing a function corresponding to the object image. It may also be, for example, an operation for deleting an icon, a file, a folder, or the like corresponding to the object image (for example, dragging the object image to a specific area SA represented by a trash-can image), or an operation for moving an icon, a file, a folder, or the like corresponding to the object image from its current storage location to another storage location (for example, dragging the object image to a specific area SA represented by a folder image).
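The determinations described above, whether the entire object image is accommodated in the specific area SA or merely touches or overlaps it, can be sketched with axis-aligned rectangles. The `Rect` type and all names are illustrative assumptions, not part of the specification:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def fully_inside(image: Rect, area: Rect) -> bool:
    # The entire object image is accommodated inside the specific area SA.
    return (image.left >= area.left and image.right <= area.right and
            image.top >= area.top and image.bottom <= area.bottom)

def overlaps(image: Rect, area: Rect) -> bool:
    # An edge of the object image touches the area, or a part of the
    # image overlaps a part of the area.
    return (image.left <= area.right and image.right >= area.left and
            image.top <= area.bottom and image.bottom >= area.top)

sa = Rect(40, 40, 60, 60)
assert fully_inside(Rect(45, 45, 55, 55), sa)       # wholly contained
assert not fully_inside(Rect(35, 45, 55, 55), sa)   # sticks out on the left
assert overlaps(Rect(35, 45, 55, 55), sa)           # partial overlap
assert not overlaps(Rect(0, 0, 30, 30), sa)         # completely outside
```

Which predicate the control unit 420 applies (full containment or mere overlap) determines how early during the drawing operation the notification fires.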
With the above-described control by the control unit 420 according to the present embodiment, the user who perceives the notification can intuitively grasp that the processing has been accepted, and can complete a desired operation without looking at the display area DA.
For example, the above-described control can also be applied to an operation of searching for an arbitrary point on a map displayed in the display area DA. In this case, the target image may be, for example, a point or the like that moves in accordance with the drawing operation, and the specific area SA may be an area corresponding to an arbitrary point on the map whose position the user wants to confirm. The arbitrary point may be, for example, a registered point such as a destination or the user's home, a point of an arbitrary category such as restaurants, or a favorite point preset by the user.
In this case as well, the control unit 420 executes the notification process when, as described above, the target image corresponding to such a point reaches the inside of the specific area SA by the user's drawing operation. With this control, the user can intuitively perceive that a desired point exists on the map without gazing at the map displayed in the display area DA.
In this case, the control unit 420 may cause the notification unit 310 to perform a different type of notification for each type of specific area SA. For example, the control unit 420 may cause the notification unit 310 to perform different notifications when the object image reaches a specific area SA corresponding to a restaurant and when it reaches a specific area SA corresponding to a gas station. With this control, the user can intuitively distinguish a plurality of kinds of points by the notification alone.
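One simple way to realize a different notification per type of specific area SA is a lookup from area category to notification pattern. The categories and vibration-pattern values below are purely hypothetical, shown only to illustrate the idea:

```python
# Hypothetical mapping from specific-area category to a vibration
# pattern (alternating on/off durations in milliseconds).
NOTIFICATION_PATTERNS = {
    "restaurant":  (100, 50, 100),        # short double pulse
    "gas_station": (300,),                # single long pulse
    "favorite":    (50, 50, 50, 50, 50),  # rapid triple pulse
}

def pattern_for(category: str) -> tuple:
    # Fall back to a default single pulse for unknown categories.
    return NOTIFICATION_PATTERNS.get(category, (200,))

assert pattern_for("restaurant") == (100, 50, 100)
assert pattern_for("museum") == (200,)  # unknown category -> default
```

The same table could map categories to sounds or light patterns instead, since the notification unit 310 may notify by vibration, sound, or light.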
On the other hand, the notification processing based on the positional relationship between the specific area SA and the object image is not limited to the case where the object image reaches the inside of the specific area SA.
The left side of fig. 5 shows a specific area SA set in the display area DA and an object image I1 located inside the specific area SA. When the user performs a drawing operation on the operation unit 110 and the object image I1 reaches the outside of the specific area SA as shown on the right side of the figure, the control unit 420 executes the notification process. Here, "the object image I1 reaches the outside of the specific area SA" may mean, for example, that the entire object image I1 is outside the specific area SA. Alternatively, the control unit 420 may determine that the object image I1 has reached the outside of the specific area SA when, in accordance with the drawing operation, an edge of the object image I1 touches the inner periphery of the specific area SA, or when a part of the object image I1 overlaps a part of the specific area SA.
The operation of moving the object image to the outside of the specific area SA as described above may be, for example, an operation for stopping a function corresponding to the object image or for releasing its selected state. It may also be, for example, an operation for moving an icon, a file, a folder, or the like corresponding to the object image from its current storage location to another storage location (for example, an operation of moving an icon stored on page A, which is displayed in the display area DA and in the specific area SA, to page B, which is not displayed in the display area DA).
With the above-described control by the control unit 420 according to the present embodiment, the user who perceives the notification can intuitively grasp that the processing has been accepted, and can complete a desired operation without looking at the display area DA.
< flow of operation >
Next, the flow of the operation of the system 1 according to the present embodiment will be described in detail. Fig. 6 is a flowchart showing a flow of the operation of the system 1 according to the present embodiment.
As shown in fig. 6, first, the operation unit 110 detects an operation by the user (S102). The operation unit 110 outputs a signal related to the detected operation to the reception unit 410.
Next, based on the signal input in step S102, the reception unit 410 receives the operation corresponding to that signal (S104). The reception unit 410 then inputs information on the received operation to the control unit 420.
Next, the control unit 420 controls the display unit 210 based on the information input in step S104 to move the target image in accordance with the operation received by the reception unit 410 (S106).
Next, the control unit 420 executes notification processing in conjunction with the movement of the target image in step S106 (S108).
The system 1 according to the present embodiment may repeatedly execute the above-described series of steps until an instruction to stop processing is given.
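Steps S102 to S108 can be summarized as a single processing cycle that is repeated until a stop instruction is given. The stub classes and method names below are illustrative assumptions, not identifiers from the specification:

```python
class OperationUnit:
    def detect_operation(self):
        # S102: detect the user's operation and emit a signal.
        return {"dx": 5, "dy": 0}  # e.g. a rightward drawing operation

class ReceptionUnit:
    def accept(self, signal):
        # S104: receive the operation corresponding to the signal.
        return signal

class ControlUnit:
    def __init__(self):
        self.notified = False

    def move_target(self, operation):
        # S106: move the target image; report whether it actually moved.
        return operation["dx"] != 0 or operation["dy"] != 0

    def notify_if_needed(self, moved):
        # S108: execute the notification process in conjunction with
        # the movement of the target image.
        if moved:
            self.notified = True

def run_cycle(op_unit, rx_unit, ctrl):
    signal = op_unit.detect_operation()   # S102
    operation = rx_unit.accept(signal)    # S104
    moved = ctrl.move_target(operation)   # S106
    ctrl.notify_if_needed(moved)          # S108

ctrl = ControlUnit()
run_cycle(OperationUnit(), ReceptionUnit(), ctrl)
assert ctrl.notified
```

In the actual system these roles correspond to the operation unit 110, reception unit 410, and control unit 420, with the cycle repeated until stopped.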
< supplement >
Although preferred embodiments of the present invention have been described in detail with reference to the accompanying drawings, the present invention is not limited to the above examples. It is obvious that a person having ordinary skill in the technical field to which the present invention belongs could conceive of various modifications and alterations within the scope of the technical idea described in the claims, and it should be understood that these also naturally fall within the technical scope of the present invention.
Note that the series of processes performed by each device described in this specification may be realized by software, hardware, or a combination of the two. The programs constituting the software are stored in advance in, for example, a recording medium (non-transitory medium) provided inside or outside each device. Each program is read into RAM when executed by a computer and executed by a processor such as a CPU. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program may also be distributed via a network, for example, without using a recording medium.
Description of the reference numerals
1: a system; 10: an operating device; 110: an operation section; 20: a display device; 210: a display unit; 30: a notification device; 310: a notification unit; 40: a control device; 410: a reception unit; 420: a control unit.

Claims (11)

1. A control device comprising:
a reception unit that receives an operation performed on at least one operation unit; and
a control unit that, taking at least one of images displayed on a display unit as a target image, performs a notification process of controlling a notification unit to perform a notification in conjunction with at least one of the target images, the target image moving in accordance with the content of the operation received by the reception unit,
the operation unit and the display unit are disposed at positions where it is difficult for a user to intuitively grasp a correspondence between an operation coordinate defining a position of an operation performed on the operation unit and a display coordinate defining a position of an image displayed on the display unit.
2. The control device according to claim 1,
the operation unit and the display unit are arranged at positions distant from each other, as positions at which it is difficult for the user to intuitively grasp the correspondence between the operation coordinates and the display coordinates.
3. The control device according to claim 1 or 2,
the control unit executes the notification process when at least one of the target images reaches an end of a movable range, the movable range being a range within which the target image can move in accordance with the content of the operation.
4. The control device according to claim 3,
in a case where there are a plurality of the target images, each of the target images moves in the same manner in accordance with the content of the operation, and
the control unit executes the notification process when at least one target image among the plurality of target images reaches an end of the movable range.
5. The control device according to any one of claims 1 to 4,
the control unit executes the notification process based on a positional relationship between a specific region set in a display region of the display unit and the target image.
6. The control device according to claim 5,
the control unit executes the notification process when the target image, originally located outside the specific region, reaches the inside of the specific region in accordance with the operation.
7. The control device according to claim 5,
the control unit executes the notification process when the target image, originally located inside the specific region, reaches the outside of the specific region in accordance with the operation.
8. The control device according to claim 3,
the control unit executes the notification process when the reception unit receives an operation to move the target image from a first edge of a display area of the display unit toward a second edge of the display area opposite to the first edge, and the target image reaches an end of the movable range.
9. The control device according to any one of claims 1 to 8,
the control unit causes the notification unit to perform notification using at least one of vibration, sound, and light as the notification processing.
10. A system comprising:
at least one operation unit;
a reception unit that receives an operation performed on the operation unit;
a display unit that displays at least one image; and
a control unit that, taking at least one of the images displayed on the display unit as a target image, performs a notification process of controlling a notification unit to perform a notification in conjunction with at least one of the target images, the target image moving in accordance with the content of the operation received by the reception unit,
the operation unit and the display unit are disposed at positions where it is difficult for a user to intuitively grasp a correspondence between an operation coordinate defining a position of an operation performed on the operation unit and a display coordinate defining a position of an image displayed on the display unit.
11. A program for causing a computer to realize:
a reception function of receiving an operation performed on at least one operation unit; and
a control function of, taking at least one of images displayed on a display unit as a target image, performing a notification process of controlling a notification unit to perform a notification in conjunction with at least one of the target images, the target image moving in accordance with the content of the operation received by the reception function,
the operation unit and the display unit are disposed at positions where it is difficult for a user to intuitively grasp a correspondence between an operation coordinate defining a position of an operation performed on the operation unit and a display coordinate defining a position of an image displayed on the display unit.

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2019-059342 2019-03-26
JP2019059342 2019-03-26
JP2020-016223 2020-02-03
JP2020016223A JP2020166831A (en) 2019-03-26 2020-02-03 Control device, system, and program
PCT/JP2020/006862 WO2020195409A1 (en) 2019-03-26 2020-02-20 Control device, system, and program

Publications (1)

Publication Number Publication Date
CN113678089A true CN113678089A (en) 2021-11-19

Family

ID=72714624

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080024345.4A Pending CN113678089A (en) 2019-03-26 2020-02-20 Control device, system, and program

Country Status (2)

Country Link
JP (1) JP2020166831A (en)
CN (1) CN113678089A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101398712A (en) * 2007-09-26 2009-04-01 达方电子股份有限公司 Image input control method and image input control system
JP2011517810A (en) * 2008-03-12 2011-06-16 イマージョン コーポレーション Tactilely enabled user interface
CN103314342A (en) * 2011-03-30 2013-09-18 本田技研工业株式会社 Operation device
CN103809839A (en) * 2012-11-01 2014-05-21 夏普株式会社 Information displaying apparatus and information displaying method
CN103927002A (en) * 2013-01-16 2014-07-16 阿自倍尔株式会社 Information displaying device, method, and program
CN108766096A (en) * 2018-03-21 2018-11-06 武汉理工大学 A kind of automatic Pilot human-computer interaction emulation test system based on driving simulator

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3282344A4 (en) * 2015-04-09 2018-04-11 Fujitsu Limited Drive control device, electronic equipment, drive control program, and drive control method
JP2016218564A (en) * 2015-05-15 2016-12-22 株式会社東海理化電機製作所 Tactile sense presentation device


Also Published As

Publication number Publication date
JP2020166831A (en) 2020-10-08

Similar Documents

Publication Publication Date Title
JP6315456B2 (en) Touch panel vehicle information display device
JP5132028B2 (en) User interface device
JP5555555B2 (en) In-vehicle device that cooperates with a portable device and realizes an input operation possible for the portable device
WO2009128148A1 (en) Remote control device
JP5644962B2 (en) Operating device
EP2447823A2 (en) Method and apparatus for gesture recognition
CN107797726B (en) Information terminal
CN107918504B (en) Vehicle-mounted operating device
US11132119B2 (en) User interface and method for adapting a view of a display unit
JP5452770B2 (en) Input device
US20140152600A1 (en) Touch display device for vehicle and display method applied for the same
KR101664038B1 (en) Concentration manipulation system for vehicle
US20130201126A1 (en) Input device
JP2018136616A (en) Display operation system
US20160154488A1 (en) Integrated controller system for vehicle
JP2018010472A (en) In-vehicle electronic equipment operation device and in-vehicle electronic equipment operation method
JP2015118507A (en) Method, device, and computer program for selecting object
CN113678089A (en) Control device, system, and program
US20160253088A1 (en) Display control apparatus and display control method
JP2014182808A (en) Navigation control of touch screen user interface
WO2020195409A1 (en) Control device, system, and program
KR20150088024A (en) System and method for converting AVN mode
KR101752579B1 (en) Method for operating a control device in a vehicle
US20220129124A1 (en) Control device, system, and program
US20220137803A1 (en) Control device, system, and progam

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination