WO2021059479A1 - Display control device for displaying operation subject within operable range - Google Patents

Display control device for displaying operation subject within operable range

Info

Publication number
WO2021059479A1
WO2021059479A1 (PCT/JP2019/038146)
Authority
WO
WIPO (PCT)
Prior art keywords
display
operator
unit
operation state
operation target
Application number
PCT/JP2019/038146
Other languages
French (fr)
Japanese (ja)
Inventor
Takumi Takei (匠 武井)
Masanobu Osawa (政信 大澤)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2019/038146
Publication of WO2021059479A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/16 Sound input; Sound output

Definitions

  • the present invention relates to a display control device and a display control method for controlling the display of a display object on a touch-operable display device.
  • the present invention assumes a display device that can be touch-operated and has a large screen.
  • the display object displayed on the display device may be displayed at a position far from the operator so that the operator of the display device cannot perform a touch operation. Therefore, the operator may not be able to touch the desired display object from the current position.
  • a display device having a large screen as described above may be provided in a vehicle. In that case, the driver, who is the operator, may not be able to touch the desired display object from the current position.
  • Patent Document 1 discloses a display device in which a touch panel is provided on the upper layer of a display unit and, when contact of the operator's finger or the like with the touch panel is detected, the input icons arranged on the display unit are moved closer to the finger in order of proximity to the contact coordinates of the finger or the like on the display unit.
  • in the display device of Patent Document 1, however, the number of input icons that can be arranged around the contact coordinates is limited.
  • the present invention has been made to solve the above-mentioned problem, and an object of the present invention is to provide a display control device that, in a touch-operable display device, can display a display object that the operator of the display device desires to touch at a position where the operator can touch-operate it.
  • the display control device according to the present invention includes: an acquisition unit that acquires remote control information indicating that a function executable via a touch-operable display device has been specified by remote control; an operation target identification unit that, based on the remote control information acquired by the acquisition unit, identifies the operation target that the operator of the display device is to touch-operate and specifies the initial display position of the operation target; an operation state identification unit that identifies the operation state of the operator who performed the remote control; a determination unit that determines, based on the operation state identified by the operation state identification unit, whether the initial display position of the operation target identified by the operation target identification unit is within the operable range in which the operator can perform a touch operation; and a display control unit that controls the display of the operation target on the display device so that the operation target is displayed within the operable range.
  • according to the present invention, a display object that the operator of the display device desires to touch can be displayed at a position where the operator can touch-operate it.
  • FIG. 3 is a diagram for explaining an example of the image of the operable range based on the operating state of the operator in the first embodiment; FIG. 3A explains the image of the operable range when the operating state is the position of the "left seat", and FIG. 3B explains the image of the operable range when the operating state is the position of the "right seat".
  • FIG. 4 is a diagram showing an image of an example of the display screen of the display device after the display position determination unit has determined the display position of the Audio icon so that the Audio icon is displayed within the operable range and the display control unit has displayed the Audio icon at the determined position. FIG. 5 is a flowchart for explaining the operation of the display control device according to the first embodiment.
  • further drawings show a configuration example of the display control device according to the second embodiment, an image of an example of the target identification information that the operation target identification unit refers to when identifying the operation target in the second embodiment, an image of an example of the initial display position of the audio menu screen in the second embodiment, an image of an example of the display screen of the display device after the display position determination unit has determined the display position of the audio menu screen so that it is displayed within the operable range and the display control unit has displayed the audio menu screen at the determined position, and a flowchart for explaining the operation of the display control device according to the second embodiment.
  • FIGS. 11A and 11B are diagrams showing an example of the hardware configuration of the display control device according to the first embodiment and the second embodiment.
  • FIG. 1 is a diagram showing a configuration example of the display control device 1 according to the first embodiment.
  • the display control device 1 is mounted on the vehicle.
  • in the first embodiment, it is assumed that a large-screen, touch-operable display device 2 is installed in front of the driver's seat and the passenger seat, which are located at the front of the vehicle in its direction of travel.
  • FIG. 2 is a diagram for explaining an image of the display device 2 in a state where the display device 2 is installed in the vehicle in the vehicle equipped with the display control device 1 according to the first embodiment.
  • the display device 2 is a touch panel type display having a touch panel on the surface.
  • One or more display objects are displayed on the display device 2.
  • the driver performs various functions by touch-operating any one of one or more display objects displayed on the display device 2.
  • the display device 2 is a large-screen display device 2 having a width such that a touch-operable area exists both in front of the driver's seat and in front of the passenger seat. Therefore, the display object displayed on the display device 2 can be displayed at a position so far away from the driver that the driver cannot perform a touch operation.
  • in the first embodiment, each seat is referred to as a "right seat" or a "left seat". The "right seat" refers to a seat located, with respect to the direction of travel of the vehicle, on the right side of a straight line that passes through the center of the vehicle in the width direction and is substantially parallel to the direction of travel; the "left seat" refers to a seat located on the left side of that straight line. That is, when the vehicle is a right-hand-drive vehicle, the driver's seat is the "right seat", and when the vehicle is a left-hand-drive vehicle, the driver's seat is the "left seat".
  • the display control device 1 controls the display device 2 as described above so that the display object touch-operated by the driver is displayed in a range that the driver can touch.
  • the display control of the display object displayed on the display device 2 is performed by the display control unit 15 described later in the display control device 1.
  • the display control device 1 acquires information (hereinafter referred to as "remote control information") indicating that a function executable via the display device 2 has been specified by remote control, and, based on the acquired remote control information, identifies the display object (hereinafter referred to as the "operation target") that the driver is to touch-operate.
  • the display control device 1 displays the identified operation target on the display device 2 within the range in which the driver can perform a touch operation (hereinafter referred to as the "operable range").
  • the remote control is an operation performed in a non-contact manner, for example, an operation performed by utterance.
  • a person who touch-operates an operation object displayed on the display device 2 is referred to as an "operator". That is, in the first embodiment, the driver is the "operator".
  • the functions that can be executed via the display device 2 include various functions that can be executed by an occupant in the vehicle, for example, a telephone function for making a call, a search function for causing a navigation device (not shown) to execute a route search, or an audio function for causing an audio device (not shown) to play music.
  • the operator can execute the above-mentioned functions via the display device 2. Specifically, the operator can execute the function by performing a determination operation on the operation target displayed on the display device 2.
  • the determination operation is, for example, a touch operation.
  • the display object displayed on the display device 2 may be displayed at a position so far from the operator that the operator cannot perform a touch operation. Therefore, the operator can specify the function to be executed by remote control, and the display control device 1 is configured to acquire the remote control information when the remote control is performed.
  • the operator specifies a function to be executed by remote control using the input device 3.
  • the input device 3 is, for example, an array microphone.
  • the array microphone can acquire spatial information of sound that cannot be obtained by one microphone, and can control directivity and the like.
  • the display control device 1 acquires remote control information from the input device 3.
  • the display control device 1 identifies the operation target based on the acquired remote control information, and controls the operation target so as to be displayed within the operable range.
  • the operation target displayed on the display device 2 is an icon.
  • the icon is a button or the like for the operator to execute various functions.
  • it is assumed that a TEL icon 201 for instructing execution of the telephone function, a Search icon 202 for instructing execution of the search function, and an Audio icon 203 for instructing execution of the audio function are displayed on the display device 2.
  • the icon is displayed in the initial state of the display device 2.
  • the initial state of the display device 2 is, for example, the state when the power is turned on.
  • the function to be executed when the icon is determined is determined in advance for each icon.
  • the display control device 1 includes an acquisition unit 11, an operation target object identification unit 12, an operation state identification unit 13, a determination unit 14, and a display control unit 15.
  • the operation target object identification unit 12 includes an initial display position identification unit 121.
  • the determination unit 14 includes an operation range determination unit 141.
  • the display control unit 15 includes a display position determination unit 151.
  • the acquisition unit 11 acquires remote control information indicating that a function that can be executed via the touch-operable display device 2 has been specified by remote control. Specifically, the acquisition unit 11 acquires, as remote control information from the array microphone, an audio signal based on the voice with which the operator specified execution of the function using the array microphone. For example, as shown in FIG. 2, the acquisition unit 11 acquires an audio signal based on the utterance "manipulate audio" from the array microphone as remote control information.
  • the acquisition unit 11 outputs the acquired remote control information to the operation target object identification unit 12 and the operation state identification unit 13.
  • based on the remote control information acquired by the acquisition unit 11, the operation target identification unit 12 identifies the operation target that the operator of the display device 2 is to touch-operate and specifies the initial display position of the operation target. First, based on the remote control information output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more display objects displayed on the display device 2, the operation target that the operator is to touch-operate. Specifically, based on the audio signal output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more icons displayed on the display device 2, the icon that the operator is to touch-operate as the operation target. The operation target identification unit 12 performs voice recognition processing on the audio signal by using a well-known voice recognition technique.
  • the operation target identification unit 12 identifies the operation target by referring to the target identification information, for example, based on the voice recognition result.
  • the target identification information is at least information in which keywords and icon identification information are associated with each other.
  • the icon identification information may be any information that can identify the icon.
  • the target identification information is generated in advance and stored in a storage unit (not shown) that can be referred to by the display control device 1.
  • the storage unit may be provided in the display control device 1, or may be provided in a place outside the display control device 1 where the display control device 1 can be referred to.
  • for example, the operation target identification unit 12 performs voice recognition processing on an audio signal representing "manipulate audio" and obtains the keyword "audio" as the voice recognition result. Further, assume that, in the target identification information, the identification information of the Audio icon 203 is associated with "audio". The operation target identification unit 12 then identifies the Audio icon 203 as the operation target with reference to the target identification information.
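The keyword-to-icon lookup just described can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the table contents and the names `TARGET_ID_INFO` and `identify_operation_target` are assumptions, and a real device would populate the table from the stored target identification information and feed it the output of the voice recognition processing.

```python
# Hypothetical sketch of the target identification information: a table
# associating recognized keywords with icon identification information.
TARGET_ID_INFO = {
    "telephone": "TEL_icon_201",
    "search": "Search_icon_202",
    "audio": "Audio_icon_203",
}

def identify_operation_target(recognized_keyword):
    """Return the icon identification associated with the recognized
    keyword, or None when no icon is associated with it."""
    return TARGET_ID_INFO.get(recognized_keyword.lower())
```

For the example in the text, `identify_operation_target("audio")` would yield the identification information of the Audio icon 203.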
  • the operation target object specifying unit 12 specifies the initial display position of the operation target object on the display screen of the display device 2 based on the information about the specified operation object. Specifically, the initial display position specifying unit 121 of the operation target object specifying unit 12 specifies the initial display position of the operation target object.
  • before the remote control is performed by the operator, the position at which each display object is currently displayed on the display device 2 is determined in advance according to the display object. In the first embodiment, this current display position, determined in advance according to the display object, is referred to as the initial display position of the display object.
  • the initial display position is, for example, a display position where a display object is displayed when the power is turned on.
  • the initial display position may also be, for example, a display position shifted, depending on the display status of other display objects and the like, from the display position at which the display object was displayed when the power was turned on.
  • the display position of the display object is represented by the coordinates on the display screen of the display device 2.
  • the initial display position specifying unit 121 specifies the initial display position of the operation target by referring to the initial display position specifying information based on the information about the operation target. For example, the initial display position specifying information in which the identification information of the display object that can be displayed on the display device 2 and the information regarding the initial display position of the display object are associated with each other is generated in advance and stored in the storage unit.
  • in the initial display position specifying information, the identification information of the icons that can be displayed on the display device 2 and the information regarding the initial display positions of those icons are defined in association with each other.
  • the display control unit 15 described later may have the function of the initial display position specifying unit 121, in which case the operation target identification unit 12 may specify the initial display position based on the information regarding the initial display position acquired from the display control unit 15.
  • alternatively, a management unit (not shown) may have the function of the initial display position specifying unit 121, in which case the operation target identification unit 12 may specify the initial display position based on the information regarding the initial display position acquired from the management unit.
  • the initial display position specifying unit 121 specifies the initial display position of the Audio icon 203.
  • the initial display position of the Audio icon 203 is the position shown in FIG.
  • the operation target object identification unit 12 outputs the specified information on the operation target object and the information on the initial display position of the operation target object to the determination unit 14 and the display control unit 15.
  • in the first embodiment, the target identification information and the initial display position specifying information are separate pieces of information, but this is only an example; the target identification information and the initial display position specifying information may be combined into one piece of information. In that case, the information regarding the initial display position of each icon may be defined in association with the identification information of that icon, and the initial display position specifying unit 121 may specify the initial display position of the icon based on the target identification information.
  • the operation state specifying unit 13 specifies the operation state of the operator based on the remote control information output from the acquisition unit 11.
  • the operation state of the operator is the position of the operator who has performed the remote control.
  • the operation state specifying unit 13 specifies the position of the operator when the remote control is performed, based on the audio signal output from the acquisition unit 11.
  • the position of the operator is, for example, the position of the seat in which the operator is seated.
  • the operation state specifying unit 13 analyzes the audio signal output from the acquisition unit 11 and specifies the direction of the source of the audio signal.
  • the operation state specifying unit 13 may analyze the voice signal and specify the direction of the source of the voice signal by using a well-known technique.
  • the operation state specifying unit 13 identifies the seat in which the operator is seated from the identified direction of the source of the audio signal, and sets the position of that seat as the position of the operator.
  • the position of each seat of the vehicle is predetermined, and the operation state specifying unit 13 can specify the position of the seat in which the operator is seated based on the predetermined information regarding the position of each seat.
  • for example, when the identified direction of the audio-signal source is on the right side of a straight line that passes through the center of the vehicle in the width direction and is substantially parallel to the direction of travel of the vehicle, the operation state specifying unit 13 determines that the "right seat" is the seat in which the operator is seated; when the identified direction is on the left side of that straight line, it determines that the "left seat" is the seat in which the operator is seated. In the latter case, the operation state specifying unit 13 uses the position of the "left seat" as the position of the operator.
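The seat identification from the sound-source direction described above can be sketched as follows. The angle convention (degrees, positive to the right of the vehicle's centerline as seen facing the direction of travel), the function names, and the seat coordinates are assumptions for illustration; a real implementation would use whatever direction estimate the array microphone provides.

```python
# Hypothetical sketch: map the sound-source direction estimated from the
# array microphone's audio signal to the seat in which the operator sits.
def identify_operator_seat(source_angle_deg):
    """Return "right seat" when the source lies right of the vehicle's
    centerline, otherwise "left seat"."""
    return "right seat" if source_angle_deg > 0 else "left seat"

# Predetermined seat positions (assumed coordinate values).
SEAT_POSITIONS = {"left seat": (-0.5, 0.0), "right seat": (0.5, 0.0)}

def operator_position(source_angle_deg):
    """Return the predetermined position of the operator's seat."""
    return SEAT_POSITIONS[identify_operator_seat(source_angle_deg)]
```

An utterance localized at, say, -15 degrees would thus yield the position of the "left seat" as the operator's position.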
  • the operation state specifying unit 13 outputs the specified information on the operation state of the operator, in other words, the specified information on the position of the operator, to the determination unit 14.
  • the information regarding the position of the operator's seat is, for example, the position information of the "right seat" or the position information of the "left seat".
  • the determination unit 14 determines, based on the operation state of the operator identified by the operation state identification unit 13, whether the initial display position of the operation target identified by the operation target identification unit 12 is within the operable range on the display screen of the display device 2, that is, the range in which the operator can perform a touch operation.
  • the operation range determination unit 141 of the determination unit 14 determines the operable range based on the information regarding the operation state of the operator output from the operation state identification unit 13.
  • the operable range is predetermined according to the position of the seat in which the operator is seated, and range-specifying information, in which the information regarding the position of each seat in the vehicle and the corresponding operable range are associated with each other, is generated in advance and stored in the storage unit.
  • the operable range is determined to be, for example, the range on the display screen of the display device 2 in which an operator having a standard physique can perform a touch operation without assuming an unreasonable posture while sitting in the seat.
  • the operable range is represented by the coordinates on the display screen of the display device 2.
  • FIG. 3 is a diagram for explaining an example of the image of the operable range based on the operating state of the operator in the first embodiment. FIG. 3A illustrates the image of the operable range (indicated by 301 in FIG. 3A) when the operator's operating state is the position of the "left seat" (see FIG. 2), and FIG. 3B illustrates the image of the operable range (indicated by 302 in FIG. 3B) when the operating state is the position of the "right seat" (see FIG. 2).
  • in FIG. 3, the operable range is shown as a circular range, but the operable range is not limited to a circular range; it may be any range on the display screen of the display device 2 that the seated operator can touch-operate without assuming an unreasonable posture.
  • the operation range determination unit 141 determines the operable range with reference to the range-specifying information, based on the information regarding the operation state of the operator. In the above example, the operation range determination unit 141 identifies, as the operable range, the operable range associated with the position of the "left seat" in the range-specifying information (see, for example, 301 in FIG. 3A).
  • the determination unit 14 determines the operation target object based on the information regarding the initial display position of the operation target object output from the operation target object identification unit 12 and the information regarding the operable range specified by the operation range determination unit 141. Determine if the initial display position is within the operable range.
  • the determination unit 14 determines whether the initial display position of the operation target is within the operable range based on, for example, whether the center coordinates of the operation target when displayed at the initial display position are within the operable range.
  • in the above example, the determination unit 14 determines whether the center coordinates of the Audio icon 203 when displayed at its initial display position (see FIG. 2) are within the operable range indicated by 301 in FIG. 3A.
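Modeling the circular operable range shown in FIG. 3, the determination unit's check can be sketched as a point-in-circle test on the icon's center coordinates. This is an illustrative sketch only; the concrete coordinate values and the circular model are assumptions (the text notes the range need not be circular).

```python
import math

# Hypothetical sketch of the determination unit's range check: the operable
# range is modeled as a circle (center, radius) in display-screen
# coordinates, and an icon is "within" the range when its center lies
# inside or on the circle.
def is_within_operable_range(icon_center, range_center, range_radius):
    """Return True if the icon's center coordinates lie in the circle."""
    return math.dist(icon_center, range_center) <= range_radius
```

With an assumed left-seat range centered at (400, 400) with radius 350, an icon centered at (1200, 300) would be judged out of range, so its display position must be moved.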
  • the determination unit 14 outputs a determination result (hereinafter referred to as “range determination result information”) as to whether or not the initial display position of the operation target is within the operable range to the display control unit 15.
  • when the determination unit 14 determines that the initial display position of the operation target is not within the operable range, the display control unit 15 controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range. Specifically, the display position determination unit 151 determines the display position of the operation target on the display device 2 based on the information regarding the initial display position of the operation target output from the operation target identification unit 12 and the range determination result information output from the determination unit 14. When the range determination result information indicates that the initial display position of the operation target is not within the operable range, the display position determination unit 151 determines the display position of the operation target so that the operation target is displayed within the operable range.
  • the state in which the operation target is displayed within the operable range means that at least an area of the operation target of a size sufficient for a touch operation on the operation target to be performed appropriately (hereinafter referred to as the "touch operation effective area") is displayed within the operable range.
  • in the above example, the display position determination unit 151 determines a display position of the Audio icon 203 at which the Audio icon 203 is displayed within the operable range.
  • when the determination unit 14 determines that the initial display position of the operation target is within the operable range, the display position determination unit 151 determines the initial display position of the operation target to be the display position of the operation target. In that case, the display position determination unit 151 may determine the display position based on the information regarding the initial display position of the operation target output from the operation target identification unit 12.
  • the information regarding the display position of the operation object determined by the display position determination unit 151 is, for example, the coordinates of the center of the operation object on the display screen of the display device 2.
  • the display control unit 15 controls the display device 2 so that the operation target is displayed at the display position determined by the display position determination unit 151, based on the information regarding the display position of the operation target determined by the display position determination unit 151. Specifically, the display control unit 15 moves the display position of the operation target so that, for example, the center coordinates of the operation target when displayed on the display device 2 become the center coordinates of the operation target determined by the display position determination unit 151. FIG. 4 shows an image of an example of the display screen of the display device 2 in the first embodiment after the display position determination unit 151 has determined the display position of the Audio icon 203 so that the Audio icon 203 is displayed within the operable range and the display control unit 15 has displayed the Audio icon 203 at the determined position.
• The display control unit 15 moves the display position of the Audio icon 203 from the position shown in FIG. 2 to the position shown in FIG. 4, based on the display position of the Audio icon 203 determined by the display position determination unit 151.
  • the driver can touch the Audio icon 203.
• When the display control unit 15 controls the display so that the operation target, which was displayed at its initial display position, is displayed at the display position determined by the display position determination unit 151, the display position of the operation target is moved.
• Alternatively, the display control unit 15 may keep the operation target displayed at its initial display position and control the display so that the operation target is additionally displayed at the display position determined by the display position determination unit 151. In other words, the display control unit 15 may display a duplicate of the operation target at the display position determined by the display position determination unit 151.
• When the displayed operation target overlaps another icon, the display control unit 15 can control the display position of the other icon. Specifically, the display control unit 15 shifts the display position of the other icon to a position that does not overlap the operation target, for example.
• Note that the display position determination unit 151 may determine whether or not the display position of the operation target overlaps another icon.
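The overlap handling described above can be sketched as follows. This is a minimal illustration only: the axis-aligned rectangle class and the simple rightward-shift strategy are assumptions for the sketch, not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Axis-aligned rectangle: (x, y) is the top-left corner.
    x: float
    y: float
    w: float
    h: float

    def overlaps(self, other: "Rect") -> bool:
        # Two rectangles overlap unless one lies entirely beside
        # or entirely above/below the other.
        return not (self.x + self.w <= other.x or
                    other.x + other.w <= self.x or
                    self.y + self.h <= other.y or
                    other.y + other.h <= self.y)

def shift_away(icon: Rect, target: Rect) -> Rect:
    """Shift `icon` so it no longer overlaps the operation target.

    The rightward shift is one illustrative rule; any position that
    removes the overlap would satisfy the description above."""
    if not icon.overlaps(target):
        return icon
    # Move the icon just past the right edge of the operation target.
    return Rect(target.x + target.w, icon.y, icon.w, icon.h)

target = Rect(100, 50, 80, 80)   # operation target moved into the operable range
icon = Rect(150, 60, 40, 40)     # another icon now overlapped by the target
moved = shift_away(icon, target)
print(moved.overlaps(target))    # False
```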
  • FIG. 5 is a flowchart for explaining the operation of the display control device 1 according to the first embodiment.
  • the acquisition unit 11 acquires remote control information indicating that a function that can be executed via the touch-operable display device 2 is specified by remote control (step ST501).
  • the acquisition unit 11 outputs the acquired remote control information to the operation target object identification unit 12 and the operation state identification unit 13.
• Based on the remote control information acquired by the acquisition unit 11 in step ST501, the operation target identification unit 12 identifies the operation target to be touch-operated by the operator of the display device 2 and specifies the initial display position of the operation target (step ST502). First, based on the remote control information output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more display objects displayed on the display device 2, the operation target to be touch-operated by the operator. Then, the operation target identification unit 12 specifies the initial display position of the operation target on the display screen of the display device 2 based on the information about the identified operation target. Specifically, the initial display position specifying unit 121 of the operation target identification unit 12 specifies the initial display position of the operation target. The operation target identification unit 12 outputs the information about the identified operation target and the information about its initial display position to the determination unit 14 and the display control unit 15.
• The operation state specifying unit 13 specifies the operation state of the operator based on the remote control information output from the acquisition unit 11 in step ST501 (step ST503). Specifically, the operation state specifying unit 13 specifies the position of the operator at the time the remote control was performed, based on the audio signal output from the acquisition unit 11. The operation state specifying unit 13 outputs the information about the specified operation state of the operator, in other words, the information about the specified position of the operator, to the determination unit 14.
• The determination unit 14 determines whether or not the initial display position of the operation target specified by the operation target identification unit 12 in step ST502 is within the operable range based on the operation state of the operator specified by the operation state specifying unit 13 in step ST503 (step ST504). Specifically, first, the operation range determination unit 141 of the determination unit 14 determines the operable range based on the information about the operation state of the operator output from the operation state specifying unit 13. Next, the determination unit 14 determines whether the initial display position of the operation target is within the operable range, based on the information about the initial display position output from the operation target identification unit 12 and the information about the operable range specified by the operation range determination unit 141. The determination unit 14 outputs range determination result information indicating whether or not the initial display position of the operation target is within the operable range to the display control unit 15.
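The determination in step ST504 can be sketched as a simple containment test. The circular operable range centered on a point reachable by the operator is an assumption for illustration only; the embodiment does not prescribe the shape of the range.

```python
import math

def within_operable_range(pos, center, radius):
    """Return True if the display position `pos` (x, y) lies inside a
    circular operable range with the given center and radius
    (an illustrative model of the range determined by unit 141)."""
    return math.hypot(pos[0] - center[0], pos[1] - center[1]) <= radius

# Illustrative values: a range reachable from the driver's seat.
operable_center = (900, 400)
operable_radius = 250

initial_position = (100, 80)  # initial display position of the icon
print(within_operable_range(initial_position, operable_center, operable_radius))  # False
```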
• When the determination unit 14 determines in step ST504 that the initial display position of the operation target is not within the operable range, the display control unit 15 controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range (step ST505). Specifically, in the display control unit 15, the display position determination unit 151 determines the display position of the operation target on the display device 2 based on the information about the initial display position of the operation target output from the operation target identification unit 12 and the range determination result information output from the determination unit 14.
• When the range determination result information indicates that the initial display position of the operation target is not within the operable range, in other words, when the determination unit 14 has determined that the initial display position of the operation target is not within the operable range, the display position determination unit 151 determines the display position of the operation target so that the operation target is displayed within the operable range.
• When the range determination result information indicates that the initial display position of the operation target is within the operable range, the display position determination unit 151 determines the initial display position of the operation target as the display position of the operation target.
• The display control unit 15 controls the display on the display device 2 so that the operation target is displayed at the display position determined by the display position determination unit 151, based on the information about that display position.
• The order of the operations of step ST502 and step ST503 does not matter. The operations of step ST502 and step ST503 may also be performed in parallel.
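The flow of steps ST501 to ST505 above can be summarized in a short sketch. The helper callables stand in for the acquisition unit 11, the identification units 12 and 13, the determination unit 14, and the display control unit 15; their internals (screen width, the x >= 500 range, the chosen coordinates) are illustrative assumptions.

```python
def control_display(remote_info, identify_target, identify_state,
                    determine_range, decide_position):
    """Sketch of the ST501-ST505 flow of the display control device 1."""
    # ST502: identify the operation target and its initial display position.
    target, initial_pos = identify_target(remote_info)
    # ST503: identify the operation state (e.g. the operator's position).
    state = identify_state(remote_info)
    # ST504: derive the operable range from the operation state.
    operable = determine_range(state)
    if operable(initial_pos):
        return target, initial_pos          # keep the initial position
    # ST505: move the target into the operable range.
    return target, decide_position(operable, initial_pos)

# Illustrative stand-ins: the operable range is x >= 500 on a 1000-px screen.
result = control_display(
    "audio",
    identify_target=lambda info: ("Audio icon", (100, 80)),
    identify_state=lambda info: "driver",
    determine_range=lambda state: (lambda p: p[0] >= 500),
    decide_position=lambda operable, p: (700, p[1]),
)
print(result)  # ('Audio icon', (700, 80))
```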
• As described above, even if the initial display position of the operation target that the operator of the touch-operable display device 2 desires to touch is a position where the operator cannot perform the touch operation, the display control device 1 can control the operation target to be displayed at a position where the operator can perform the touch operation. Specifically, in the display control device 1 according to the first embodiment, even if the initial display position of the icon for executing the function designated by remote control on the touch-operable display device 2 is a position where the operator cannot perform the touch operation, the icon can be controlled to be displayed at a position where the operator can perform the touch operation.
• In the conventional technique, the display object that the operator desires to touch is not always moved near the operator's finger. Furthermore, in the conventional technique, operation targets that the operator does not desire are also moved near the operator's finger. As a result, even if the operation target desired by the operator is moved near the finger, the operator may erroneously operate an undesired input icon.
• In contrast, in the display control device 1 according to the first embodiment, even if the initial display position of the operation target that the operator of the display device 2 desires to touch is a position where the operator cannot touch it, the operation target can be controlled to be displayed at a position where the operator can perform a touch operation.
• As described above, the display control device 1 according to the first embodiment includes: the acquisition unit 11 that acquires remote control information indicating that a function executable via the touch-operable display device 2 has been designated by remote control; the operation target identification unit 12 that, based on the remote control information, identifies the operation target to be touch-operated by the operator of the display device 2 and specifies the initial display position of the operation target; the determination unit 14 that determines whether or not the initial display position of the operation target identified by the operation target identification unit 12 is within the operable range in which the operator can perform a touch operation on the display screen of the display device 2; and the display control unit 15 that, when it is determined that the initial display position of the operation target is not within the operable range, controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range. Therefore, on the touch-operable display device 2, the operation target that the operator of the display device 2 desires to touch can be displayed at a position where the operator can touch-operate it.
• Embodiment 2. In the first embodiment, the display device 2 displays one or more icons as display objects, and the display control device 1 identifies, as the operation target, the icon for executing the function designated by remote control among the one or more icons displayed on the display device 2. In the second embodiment, the operation target identified by the display control device 1a is a display object to be displayed on the display device 2 after an icon is operated.
• In the second embodiment as well, the display device 2 displays one or more icons for executing functions as display objects (see FIG. 2). Further, in the second embodiment, when any one of the icons displayed on the display device 2 is operated, the function associated with that icon is executed, and a display object is displayed according to the executed function. For example, if the Audio icon 203 is operated, the audio menu screen is displayed. For example, if the audio menu screen is operated, the volume setting screen may be displayed. Note that a display object being "operated" means that the display object is confirmed, and the confirming operation is a touch operation or the like.
  • the display objects are defined in a hierarchical structure.
  • the audio menu screen is defined in the lower layer of the Audio icon 203.
• The hierarchical structure may have two or more layers. For example, when the function associated with an icon is executed, one of the display objects defined in the layers below that icon is displayed on the display device 2 according to the executed function.
  • the display object displayed when the function is executed is referred to as a "response display object".
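The hierarchical definition of display objects can be modeled as a small tree. The node names and the two-level depth below the Audio icon follow the example above; the concrete structure is an illustrative assumption.

```python
# Hierarchy of display objects: each icon maps to the response display
# objects defined in the layers below it (illustrative structure only).
display_hierarchy = {
    "Audio icon": {
        "audio menu screen": {
            "volume setting screen": {},
        },
    },
}

def response_display_object(icon: str, hierarchy: dict) -> str:
    """Return the response display object shown when the icon's function
    is executed, i.e. the object defined directly below the icon."""
    children = hierarchy[icon]
    return next(iter(children))

print(response_display_object("Audio icon", display_hierarchy))  # audio menu screen
```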
  • the input device 3 is, for example, an array microphone.
  • the operator uses the array microphone to specify the function to be executed by voice.
  • FIG. 6 is a diagram showing a configuration example of the display control device 1a according to the second embodiment.
  • the same reference numerals are given to the same configuration as the display control device 1 described with reference to FIG. 1 in the first embodiment, and duplicate description will be omitted.
  • the configuration of the display control device 1a according to the second embodiment is different from the configuration of the display control device 1 according to the first embodiment in that the display control unit 15a includes the response generation unit 152.
• In the display control device 1a according to the second embodiment, the specific operations of the operation target identification unit 12a, the initial display position specifying unit 121a, the determination unit 14a, the display control unit 15a, and the display position determination unit 151a differ from the specific operations of the operation target identification unit 12, the initial display position specifying unit 121, the determination unit 14, the display control unit 15, and the display position determination unit 151, respectively, in the display control device 1 according to the first embodiment.
• The operation of the operation target identification unit 12a is basically the same as the operation of the operation target identification unit 12 described in the first embodiment. The difference is that, while the operation target identified by the operation target identification unit 12 is an icon displayed on the display device 2 for executing the function designated by remote control, the operation target identified by the operation target identification unit 12a is the response display object to be displayed on the display device 2 according to the execution of the function when a remote operation designating the execution of that function is performed.
• For example, assume that the TEL icon 201, the Search icon 202, and the Audio icon 203 are displayed on the display device 2, and the operator utters "operate the audio". The acquisition unit 11 acquires, as remote control information, a voice signal based on the voice "operate the audio" from the array microphone, and outputs the remote control information to the operation target identification unit 12a and the operation state specifying unit 13.
• Based on the voice signal output from the acquisition unit 11, the operation target identification unit 12a identifies, as the operation target, the response display object to be displayed on the display device 2 according to the executed function when the function designated by remote control is executed, and specifies the initial display position of that operation target.
• Specifically, the operation target identification unit 12a first performs voice recognition processing on the voice signal using a well-known voice recognition technique. Then, based on the voice recognition result, the operation target identification unit 12a identifies the operation target by referring to, for example, the target identification information.
• In the target identification information, the identification information of an icon, the identification information of the function to be executed by operating that icon, and the identification information of the response display object to be displayed when that function is executed are defined in association with each other. The identification information of a function may be any information that can identify the function. Likewise, the identification information of a response display object may be any information that can identify the response display object.
  • FIG. 7 is a diagram for explaining an image of an example of target identification information referred to when the operation target object identification unit 12a identifies the operation target object in the second embodiment.
• In the above example, the operation target identification unit 12a first performs voice recognition processing on the voice signal indicating "operate the audio" and obtains the keyword "audio" as the voice recognition result. Then, by referring to the target identification information, the operation target identification unit 12a identifies the "audio menu screen", which is the response display object associated with "audio", as the operation target.
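The lookup from the recognized keyword to the response display object can be sketched as a table keyed by keywords. The single "audio" entry mirrors the example above as far as the text describes it; the table layout itself is an assumption for illustration.

```python
# Target identification information (illustrative): recognized keyword
# -> (function identification, response display object identification).
target_identification = {
    "audio": ("audio menu display", "audio menu screen"),
}

def identify_operation_target(recognized_keyword: str):
    """Return the response display object associated with the keyword
    obtained from voice recognition, or None if no entry matches."""
    entry = target_identification.get(recognized_keyword)
    return entry[1] if entry else None

# "operate the audio" -> voice recognition yields the keyword "audio".
print(identify_operation_target("audio"))  # audio menu screen
```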
  • the operation target object specifying unit 12a specifies the initial display position of the operation target object on the display screen of the display device 2 based on the information about the specified operation object. Specifically, the initial display position specifying unit 121a of the operation target object specifying unit 12a specifies the initial display position of the operation target object.
• The operation of the initial display position specifying unit 121a is basically the same as the operation of the initial display position specifying unit 121 described in the first embodiment. The difference is that, while the operation target for which the initial display position specifying unit 121 specifies the initial display position is an icon in the first embodiment, the operation target for which the initial display position specifying unit 121a specifies the initial display position is a response display object in the second embodiment.
  • the initial display position specifying unit 121a specifies the initial display position of the operation target, in other words, the response display object, with reference to the initial display position specifying information based on the information about the operation target, for example.
• In the initial display position specification information, the identification information of each response display object that can be displayed on the display device 2 and the information regarding the initial display position of that response display object are defined in association with each other.
  • the initial display position specifying unit 121a specifies the initial display position of the audio menu screen.
  • FIG. 8 is a diagram for explaining an image of an example of the initial display position of the audio menu screen 801 in the second embodiment.
• In the above example, the initial display position specifying unit 121a specifies the display position of the audio menu screen 801 shown in FIG. 8 as the initial display position of the audio menu screen 801. Note that FIG. 8 shows, for convenience, a state in which the audio menu screen 801 is displayed at the initial display position; the audio menu screen 801 is not a display object displayed in the initial state of the display device 2.
• The audio menu screen 801 is a display object displayed on the display device 2 after the icon for executing the audio function is operated. Specifically, the audio menu screen 801 is displayed on the display device 2 as a result of the execution of the function called "audio menu display" associated with the Audio icon 203 for executing the audio function.
  • the operation object specifying unit 12a outputs the specified information on the operation object and the information on the initial display position of the operation object to the determination unit 14a and the display control unit 15a.
  • the operation of the determination unit 14a is basically the same as the operation of the determination unit 14 described in the first embodiment.
• The difference is that, while the operation target for which the determination unit 14 determines whether or not the initial display position is within the operable range is an icon, the operation target for which the determination unit 14a makes that determination is the response display object.
• Since the specific operation of the operation range determination unit 141a for determining the operable range is the same as the specific operation of the operation range determination unit 141 described in the first embodiment, duplicate description is omitted.
• The determination unit 14a determines whether or not the initial display position of the operation target, in other words, the response display object, is within the operable range, based on the information about the initial display position of the operation target output from the operation target identification unit 12a and the information about the operable range specified by the operation range determination unit 141a. In the above example, the determination unit 14a determines whether or not the center coordinates of the audio menu screen 801 when displayed at the initial display position (see FIG. 8) are within the operable range (see 301 in FIG. 8). In the above example, the determination unit 14a outputs, to the display control unit 15a, range determination result information indicating that the center coordinates of the initial display position of the audio menu screen are not within the operable range.
  • the display control unit 15a includes a display position determination unit 151a and a response generation unit 152.
  • the response generation unit 152 generates response display information for displaying the response display object based on the information about the operation object output from the operation object identification unit 12a. In the above example, the response generation unit 152 generates response display information for displaying the “audio menu screen”.
• When the determination unit 14a determines that the initial display position of the operation target is not within the operable range, the display control unit 15a controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range. Specifically, in the display control unit 15a, the display position determination unit 151a determines the display position of the operation target on the display device 2 based on the information about the initial display position of the operation target output from the operation target identification unit 12a and the range determination result information output from the determination unit 14a. When the range determination result information indicates that the initial display position of the operation target is not within the operable range, in other words, when the determination unit 14a has determined that it is not within the operable range, the display position determination unit 151a determines the display position of the operation target so that the operation target is displayed within the operable range.
• In the above example, the determination unit 14a outputs range determination result information indicating that the center coordinates of the initial display position of the audio menu screen 801 are not within the operable range (see 301 in FIG. 8). Therefore, the display position determination unit 151a determines, as the display position of the audio menu screen 801, a position at which the audio menu screen 801 is displayed within the operable range.
  • the state in which the operation target is displayed within the operable range means that at least the touch operation effective area is displayed within the operable range.
• Since the audio menu screen 801 is a display object that includes a plurality of display objects, such as buttons, on which the operator performs touch operations, the display position determination unit 151a determines, as the display position of the audio menu screen 801, a position at which the touch operation effective areas of all the display objects included in the audio menu screen 801 are within the operable range.
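The requirement that every contained display object's touch operation effective area lie within the operable range can be sketched as follows, assuming rectangular effective areas whose corners are given as offsets relative to the screen's top-left corner, and a half-plane operable range; both are illustrative assumptions.

```python
def all_children_operable(screen_pos, child_areas, in_range):
    """Check that, with the menu screen placed at `screen_pos`, every
    child's touch operation effective area (corner offsets x0, y0, x1, y1
    relative to the screen's top-left corner) lies in the operable range."""
    ox, oy = screen_pos
    for (x0, y0, x1, y1) in child_areas:
        corners = [(ox + x0, oy + y0), (ox + x1, oy + y1)]
        if not all(in_range(c) for c in corners):
            return False
    return True

# Illustrative: the operable range is x >= 500 on a 1000-px-wide screen.
in_range = lambda p: p[0] >= 500
buttons = [(10, 10, 90, 50), (10, 60, 90, 100)]  # two buttons on the menu screen

print(all_children_operable((100, 50), buttons, in_range))  # False (initial position)
print(all_children_operable((600, 50), buttons, in_range))  # True  (moved position)
```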
• When the range determination result information indicates that the initial display position of the response display object is within the operable range, in other words, when the determination unit 14a has determined that the initial display position of the response display object is within the operable range, the display position determination unit 151a determines the initial display position of the response display object as the display position of the response display object.
• In this case, the display position determination unit 151a may determine the initial display position as the display position based on the information regarding the initial display position of the response display object output from the operation target identification unit 12a.
• The display control unit 15a controls the display of the response display object on the display device 2 so that the operation target, in other words, the response display object, is displayed at the display position determined by the display position determination unit 151a, based on the information about that display position and the response display information generated by the response generation unit 152.
• Specifically, when displaying the response display object on the display device 2 based on the response display information, the display control unit 15a displays the response display object so that, for example, the coordinates of its center become the center coordinates determined by the display position determination unit 151a.
• The response display information includes information such as the size of the response display object. Based on the response display information, the display control unit 15a can specify the display position coordinates of the response display object such that the center coordinates of the displayed response display object become the center coordinates determined by the display position determination unit 151a.
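Deriving the drawing coordinates from the determined center and the size carried in the response display information can be sketched as a one-line computation; the concrete numbers are illustrative assumptions.

```python
def top_left_from_center(center, size):
    """Given the center coordinates decided by the display position
    determination unit and the (width, height) from the response display
    information, return the top-left corner at which to draw the object."""
    cx, cy = center
    w, h = size
    return (cx - w / 2, cy - h / 2)

# Illustrative: a 400x300 audio menu screen centered at (700, 400).
print(top_left_from_center((700, 400), (400, 300)))  # (500.0, 250.0)
```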
• FIG. 9 is a diagram showing an image of an example display screen of the display device 2 after the display position determination unit 151a has determined the display position of the audio menu screen 801 so that the audio menu screen 801 is displayed within the operable range, and the display control unit 15a has displayed the audio menu screen 801 at the position determined by the display position determination unit 151a.
• The display control unit 15a displays the audio menu screen 801 at the position determined by the display position determination unit 151a, instead of at the initial display position of the audio menu screen 801 shown in FIG. 8.
  • the driver can perform a touch operation on the audio menu screen 801.
  • FIG. 10 is a flowchart for explaining the operation of the display control device 1a according to the second embodiment.
  • the acquisition unit 11 acquires remote control information indicating that a function that can be executed via the touch-operable display device 2 is specified by remote control (step ST1001).
  • the specific operation of step ST1001 is the same as the specific operation of step ST501 described with reference to FIG. 5 in the first embodiment.
  • the acquisition unit 11 outputs the acquired remote control information to the operation target object identification unit 12a and the operation state identification unit 13.
• The operation target identification unit 12a identifies the operation target to be touch-operated by the operator of the display device 2 based on the remote control information acquired by the acquisition unit 11 in step ST1001, and specifies the initial display position of the operation target (step ST1002).
  • the operation object specifying unit 12a outputs the specified information on the operation object and the information on the initial display position of the operation object to the determination unit 14a and the display control unit 15a.
  • the operation state specifying unit 13 specifies the operation state of the operator based on the remote control information output from the acquisition unit 11 in step ST1001 (step ST1003).
  • the specific operation of step ST1003 is the same as the specific operation of step ST503 described with reference to FIG. 5 in the first embodiment.
• The operation state specifying unit 13 outputs the information about the specified operation state, in other words, the information about the position of the seat where the operator is seated, to the determination unit 14a.
• The determination unit 14a determines whether or not the initial display position of the operation target identified by the operation target identification unit 12a in step ST1002 is within the operable range based on the operation state of the operator specified by the operation state specifying unit 13 in step ST1003 (step ST1004).
  • the determination unit 14a outputs the range determination result information as to whether or not the initial display position of the operation target is within the operable range to the display control unit 15a.
• When the determination unit 14a determines in step ST1004 that the initial display position of the operation target, in other words, the response display object, is not within the operable range, the display control unit 15a controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range (step ST1005).
• Specifically, the display control unit 15a controls the display of the response display object on the display device 2 so that the operation target, in other words, the response display object, is displayed at the display position determined by the display position determination unit 151a, based on the information about that display position and the response display information generated by the response generation unit 152.
• The order of the operations of step ST1002 and step ST1003 does not matter. The operations of step ST1002 and step ST1003 may also be performed in parallel.
• As described above, in the second embodiment as well, even if the initial display position of the operation target that the operator of the display device 2 desires to touch is a position where the operator cannot perform the touch operation, the operation target can be controlled to be displayed within the range where the operator can perform the touch operation.
• Specifically, for a response display object that is displayed on the touch-operable display device 2 according to the function designated by remote control and that the operator desires to touch, the display control device 1a according to the second embodiment can control the response display object to be displayed at a position where the operator can touch it, even if the initial display position of the response display object is a position where the operator cannot perform the touch operation.
• In the display control device 1 according to the first embodiment, the operation target is an icon displayed on the display device 2 for executing the function designated by remote control, and the icon associated with the function designated by remote control is displayed at a position where the operator can perform a touch operation. In this case, even though the function to be executed has been designated by remote control, the operator must touch the icon again to instruct the execution of the function associated with the icon.
• When the icon is touch-operated, the response display object to be displayed upon execution of the function designated by remote control is displayed. For example, in the display control device 1, the operator can designate by saying "operate the audio" that the audio function is to be executed, and can then execute the audio function by touching the Audio icon 203 again. In this case, before the audio function is executed, the operator needs to designate the audio function by remote control and then instruct the execution of the audio function by touch operation. That is, even though the audio function to be executed has already been designated, the operator must touch the Audio icon 203 after it has been displayed at a touch-operable position in order to execute the audio function, which can be said to require double effort.
  • in the display control device 1a, when remote control designating the execution of a function is performed, the response display object to be displayed on the display device 2 in accordance with the execution of that function is displayed at a display position where the operator can touch it. Specifically, for example, when remote control designating the execution of a function called "audio menu display" is performed, the display control device 1a displays the "audio menu screen", which is to be displayed on the display device 2 in accordance with the execution of that function, at a display position where the operator can perform the touch operation.
  • as a result, the operator can touch-operate the "audio menu screen", which is displayed when the audio function is executed, without first performing the operation of waiting for the Audio icon 203 associated with the "audio menu display" function to be displayed at a touch-operable position and then touch-operating the Audio icon 203.
  • as described above, when remote control designating the execution of a function is performed, the display control device 1a according to the second embodiment displays the response display object, which is to be displayed on the display device 2 in accordance with the execution of that function, within the operable range.
  • the operator can therefore execute the desired function without performing the double operation of remote control followed by touch operation.
  • the display of the display object on the display device 2 can thus be controlled so that the display object resulting from the execution of the function can be quickly touch-operated.
  • as described above, the display control device 1a according to the second embodiment includes: the acquisition unit 11, which acquires remote control information indicating that a function executable via the touch-operable display device 2 has been designated by remote control; the operation target object identification unit 12a, which, based on the remote control information acquired by the acquisition unit 11, identifies the operation target object to be touch-operated by the operator of the display device 2 and its initial display position; the operation state identification unit 13, which identifies the operation state of the operator who performed the remote control; the determination unit 14a, which, based on the operation state of the operator identified by the operation state identification unit 13, determines whether or not the initial display position of the operation target object identified by the operation target object identification unit 12a is within the operable range on the display screen of the display device 2 that the operator can touch; and the display control unit 15a, which, when it is determined that the initial display position of the operation target object is not within the operable range, controls the display of the operation target object on the display device 2 so that the operation target object is displayed within the operable range. Therefore, on the touch-operable display device 2, even if the initial display position of the operation target object that the operator of the display device 2 desires to touch is a position that the operator cannot touch, the operation target object can be controlled so as to be displayed within the operator's touchable range.
  • in the second embodiment, the display device 2 can display an icon for executing a function, and the operation target object is the response display object that is displayed on the display device 2 after the icon is operated. That is, when the display control device 1a determines that the initial display position of the response display object is not within the operable range, it controls the display on the display device 2 so that the response display object is displayed within the operable range. As a result, the display control device 1a allows the operator to execute the desired function without performing the double operation of remote control followed by touch operation, and controls the display of the display object on the display device 2 so that the response display object resulting from the execution of the function can be quickly touch-operated.
  • in the first and second embodiments described above, the operation state specifying unit 13 of the display control devices 1 and 1a specifies the operation state of the operator based on the voice signal output from the array microphone serving as the input device 3, but this is merely an example.
  • the operation state specifying unit 13 may specify the operation state based on the captured image captured inside the vehicle.
  • the acquisition unit 11 of the display control devices 1 and 1a is connected to an image pickup device (not shown) that images the inside of the vehicle, and acquires the captured image from the image pickup device.
  • the imaging device is, for example, a visible light camera or an infrared camera.
  • the image pickup device may be shared with the image pickup device included in the so-called "driver monitoring system” mounted on the vehicle for monitoring the state of the driver in the vehicle.
  • in this case, the operation state specifying unit 13 calculates the degree of mouth opening of each occupant of the vehicle based on the captured image acquired by the acquisition unit 11.
  • the operation state specifying unit 13 may calculate the degree of mouth opening by performing image recognition processing using a well-known image recognition technique.
  • the operation state specifying unit 13 regards the occupant whose calculated degree of opening is equal to or greater than a preset threshold value (hereinafter referred to as the "opening degree determination threshold value") as the operator, and specifies the position of the seat in which that operator is seated as the position of the operator.
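The occupant-selection step above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the threshold value, the seat labels, and the tie-breaking rule (taking the widest opening when several occupants exceed the threshold) are all assumptions introduced here.

```python
# Assumed value of the "opening degree determination threshold value".
OPENING_DEGREE_THRESHOLD = 0.6

def identify_operator(opening_by_seat):
    """Return the seat of the occupant whose mouth-opening degree is at or
    above the threshold (the occupant regarded as the operator), or None."""
    candidates = {seat: deg for seat, deg in opening_by_seat.items()
                  if deg >= OPENING_DEGREE_THRESHOLD}
    if not candidates:
        return None
    # Assumed tie-break: if several occupants exceed the threshold,
    # take the one with the widest opening.
    return max(candidates, key=candidates.get)

print(identify_operator({"right_seat": 0.8, "left_seat": 0.1}))  # right_seat
```

The seat returned here plays the role of "the position of the operator" in the surrounding description.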
  • when the operation state specifying unit 13 specifies the operation state of the operator based on the captured image, it is not essential that the microphone serving as the input device 3 be an array microphone.
  • in the first and second embodiments described above, the operator designates the function to be executed by voice, but this is merely an example.
  • the operator may instead designate the function to be executed by a gesture.
  • the input device 3 is an imaging device as described above.
  • the acquisition unit 11 of the display control devices 1 and 1a acquires the captured image as remote control information, and the operation target identification units 12 and 12a specify the operation target based on the captured image acquired by the acquisition unit 11.
  • assume that the gesture performed by the operator is a pointing action toward the icon associated with the function to be executed. In the display control device 1 according to the first embodiment, the operation target identification unit 12 specifies the icon pointed to by the operator as the operation target.
  • in the display control device 1a according to the second embodiment, the operation target object identification unit 12a performs image recognition processing using a well-known image recognition technique, identifies the operator's pointing direction to specify the icon pointed to by the operator, and, from the specified icon, specifies the response display object associated with the icon as the operation target object.
  • Gestures performed by the operator are not limited to pointing movements.
  • for example, a gesture such as holding up a hand may be determined in advance for each function to be executed, and the operator performs the gesture corresponding to the function to be executed.
  • in this case, the operation target identification units 12 and 12a perform image recognition processing using a well-known image recognition technique, identify the gesture from the captured image, and identify the predetermined function associated with that gesture. The operation target identification units 12 and 12a then specify the operation target from the identified function.
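The gesture-to-target mapping described above can be sketched as a pair of lookup tables. This is an illustrative sketch only: the gesture names, function names, and table contents are assumptions, and the image recognition step is replaced by an already-recognized gesture label.

```python
# Assumed predetermined association of gestures to functions.
GESTURE_TO_FUNCTION = {
    "hold_up_hand": "audio_menu_display",
    "point_at_audio_icon": "audio_menu_display",
}

# Assumed association of functions to the icon and its response display object.
FUNCTION_TO_TARGET = {
    "audio_menu_display": {"icon": "Audio icon 203",
                           "response_display": "audio menu screen"},
}

def operation_target_for(gesture, embodiment):
    """Embodiment 1 (unit 12) treats the icon as the operation target;
    embodiment 2 (unit 12a) treats the response display object as the target."""
    func = GESTURE_TO_FUNCTION.get(gesture)
    if func is None:
        return None
    target = FUNCTION_TO_TARGET[func]
    return target["icon"] if embodiment == 1 else target["response_display"]

print(operation_target_for("hold_up_hand", 2))  # audio menu screen
```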
  • when the operator designates the function by a gesture, the operation state specifying unit 13 of the display control devices 1 and 1a identifies the operation state of the operator based on the captured image. Specifically, the operation state specifying unit 13 performs image recognition processing using, for example, a well-known image recognition technique, identifies the operator performing the gesture, and sets the position of the seat in which that operator is seated as the operation state of the operator.
  • the input device 3 may also be a remote controller (hereinafter referred to as "remote controller"), and the operator may designate the function to be executed using the remote controller.
  • in this case, the acquisition unit 11 acquires, as remote control information, the information indicating the designated function that is output from the remote controller.
  • the operation target identification units 12 and 12a of the display control devices 1, 1a specify the operation target based on the remote control information acquired by the acquisition unit 11.
  • further, the display control devices 1 and 1a are connected to, for example, an imaging device as described above, and the operation state specifying unit 13 acquires a captured image from the imaging device via the acquisition unit 11. The operation state specifying unit 13 then identifies the operation state of the operator based on the acquired captured image. Specifically, the operation state specifying unit 13 performs image recognition processing using, for example, a well-known image recognition technique, identifies the operator who is operating the remote controller, and sets the position of the seat in which that operator is seated as the operation state of the operator.
  • in the first and second embodiments described above, it was explained that the input device 3 is an array microphone, a microphone, an image pickup device, or a remote controller, and that the acquisition unit 11 may use, as remote control information, the audio signal output from the array microphone or the microphone, the captured image output from the image pickup device, or the remote control information output from the remote controller; however, this is merely an example.
  • the acquisition unit 11 may acquire, for example, any one or more of the audio signal, the captured image, and the remote control information as remote control information.
  • when the operation state specifying unit 13 specifies the position of the operator as the operation state of the operator based on the captured image, the position of the operator in the vehicle interior space may also be used as the position of the operator.
  • when the operation state specifying unit 13 analyzes the voice signal and specifies the position of the operator by identifying the direction of the source of the voice signal, strictly speaking, the position of the operator in the vehicle interior space cannot be specified. Therefore, in the first and second embodiments described above, the position of the seat in which the operator is seated is set as the position of the operator based on the direction of the source of the voice signal.
  • the operation state specifying unit 13 can specify the position of the operator in the space inside the vehicle based on the captured image.
  • the operation state specifying unit 13 may specify the face position of the operator in the captured image based on the captured image, and set the face position as the position of the operator in the space inside the vehicle. Since the imaging range of the imaging device is known in advance, the operation state specifying unit 13 can identify the position of the operator in the space inside the vehicle if the face position of the operator in the captured image can be specified.
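The conversion from a face position in the captured image to a position in the vehicle interior space can be sketched as follows, assuming (as the text states) that the imaging range of the imaging device is known in advance. The linear mapping and the calibration numbers are illustrative assumptions; a real system would use the camera's calibrated geometry.

```python
def pixel_to_cabin(face_px, image_size, cabin_extent):
    """Linearly map the operator's face position in the captured image
    (pixels) to a position in the vehicle interior (metres), given the
    known span of the cabin covered by the imaging range."""
    (px, py), (w, h) = face_px, image_size
    cab_w, cab_h = cabin_extent
    return (px / w * cab_w, py / h * cab_h)

# Face detected at the centre of a 1280x480 image covering a
# 1.6 m x 1.2 m region of the cabin (assumed calibration values).
print(pixel_to_cabin((640, 240), (1280, 480), (1.6, 1.2)))  # (0.8, 0.6)
```

The resulting in-space position is what allows the operable range to be specified more accurately than a seat position alone, as the following passage explains.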
  • when the position of the operator in the vehicle interior space is used as the position of the operator, the operation range determination units 141 and 141a of the determination units 14 and 14a can determine the operable range more accurately than when the position of the seat in which the operator is seated is used as the position of the operator.
  • as a result, the display control units 15 and 15a can more reliably determine the display position of the operation target and control the display of the operation target so that the operator can perform the touch operation more reliably.
  • in this case, it is assumed that the range specifying information referred to when the operation range determination units 141 and 141a specify the operable range associates spatial positions in the vehicle with operable ranges.
  • further, when the operation state specifying unit 13 specifies the operation state of the operator based on the captured image, it may specify both the position of the operator and the posture of the operator as the operation state of the operator.
  • the posture of the operator is, for example, the orientation of the body of the operator.
  • the orientation of the operator's body is expressed, for example, as the amount of angular change from a reference orientation, where the vehicle traveling direction is taken as the front and the orientation of the operator sitting straight and facing the front is taken as the reference orientation.
  • for example, the angle is 0 degrees when the operator faces the reference orientation, positive by the amount the operator turns to the right of the front, and negative by the amount the operator turns to the left of the front.
  • the operation state specifying unit 13 specifies the position of the operator and the angle as described above as the operation state of the operator.
  • the operation range determination units 141 and 141a determine the operable range based on the position of the operator and the posture of the operator specified by the operation state identification unit 13.
  • specifically, for example, the operation range determination units 141 and 141a shift the operable range associated with the position of the operator to the right or left on the display screen of the display device 2 according to the posture of the operator. It is assumed that how far the operation range determination units 141 and 141a shift the operable range for a given posture is determined in advance.
  • the posture of the operator may be, for example, the amount of inclination of the seat in which the operator is seated.
  • the operation state specifying unit 13 may calculate the amount of inclination of the seat in which the operator is seated, for example, based on the captured image. Further, the operation state specifying unit 13 may acquire information regarding the amount of inclination of the seat in which the operator is seated, for example, from a sensor installed in the seat. The operation state specifying unit 13 specifies the position of the operator and the amount of inclination calculated based on the captured image as the operation state of the operator. Alternatively, the operation state specifying unit 13 specifies the position of the operator and the amount of inclination acquired from the sensor as the operation state of the operator.
  • in this case too, the operation range determination units 141 and 141a determine the operable range based on the position of the operator and the posture of the operator specified by the operation state identification unit 13. Specifically, for example, the operation range determination units 141 and 141a shift the operable range associated with the position of the operator to the right or left on the display screen of the display device 2 and reduce it according to the posture of the operator. It is assumed that how far the operable range is shifted and how much it is reduced for a given posture are determined in advance.
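The posture-based adjustment of the operable range described above can be sketched as follows. The patent only states that the shift and reduction amounts are predetermined; the per-degree factors, the 110-degree upright baseline, and the one-dimensional range used here are all illustrative assumptions.

```python
def adjust_range(base, body_angle_deg=0.0, recline_deg=None):
    """Adjust a base operable range (x_min, x_max) on the display screen,
    shifting it by the operator's body orientation and shrinking it by the
    seat recline amount. All factors are assumed, not from the patent."""
    x_min, x_max = base
    # Positive angle = operator turned right; shift the range accordingly.
    shift = body_angle_deg * 2.0          # assumed pixels per degree
    x_min, x_max = x_min + shift, x_max + shift
    # A reclined seat moves the operator away from the screen; shrink
    # the reachable range beyond an assumed upright baseline of 110 deg.
    if recline_deg is not None and recline_deg > 110:
        shrink = (recline_deg - 110) * 1.5  # assumed pixels per degree
        x_min, x_max = x_min + shrink, x_max - shrink
    return (x_min, x_max)

print(adjust_range((100, 500), body_angle_deg=10))  # (120.0, 520.0)
```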
  • the display device 2 is installed in the vehicle in front of the driver's seat and the passenger's seat located in front of the traveling direction of the vehicle.
  • the installation position of the display device 2 assumed in the first embodiment and the second embodiment is only an example.
  • the display device 2 may be installed on the right side surface or the left side surface of the vehicle interior with respect to the traveling direction of the vehicle, or may be installed on the ceiling of the vehicle interior.
  • further, a plurality of display devices 2 may be installed in the vehicle. Specifically, for example, display devices 2 may be installed in the vehicle in front of the driver's seat and the passenger seat with respect to the traveling direction, and on both side surfaces of the vehicle interior.
  • the operation range determination units 141 and 141a determine the operable range based on the position of the operator and the posture of the operator specified by the operation state specifying unit 13.
  • the operation range determination units 141 and 141a select the display device 2 to be touch-operated by the operator from among the plurality of display devices 2, and then determine the operable range in the selected display device 2.
  • the posture of the operator is the orientation of the body of the operator.
  • the operation state specifying unit 13 specifies that the position of the driver's seat where the operator is seated is the position of the "right seat” and the posture of the operator is plus 90 degrees.
  • the display device 2 is installed in the vehicle in front of the driver's seat with respect to the traveling direction of the vehicle and on both side surfaces of the vehicle interior.
  • in this case, the operation range determination units 141 and 141a select the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction. For example, it is assumed that first display device selection information, which associates information on the position of the operator, information on the posture of the operator, and the display device 2 determined to be touch-operated by the operator among the plurality of display devices 2, is generated in advance and stored in the storage unit.
  • in the first display device selection information, it is assumed that the operator position "right seat" and the operator posture "plus 90 degrees" are associated with the display device 2 installed on the right side surface.
  • the operation range determination units 141 and 141a refer to the first display device selection information based on the operator position "right seat" and the operator posture "plus 90 degrees" specified by the operation state identification unit 13, and select the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction. The operation range determination units 141 and 141a then determine the operable range on the selected display device 2 installed on the right side surface.
  • the first display device selection information may define, in addition to the display device 2, the operable range on that display device 2 for each combination of the position and posture of the operator, and the operation range determination units 141 and 141a may determine the operable range by referring to the first display device selection information.
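The first display device selection information described above is essentially a lookup keyed by operator position and posture. A minimal sketch, in which the table keys and device names are illustrative assumptions:

```python
# Assumed "first display device selection information": operator position
# and body-orientation angle mapped to the display device to touch-operate.
FIRST_SELECTION_INFO = {
    ("right_seat", 90):  "display_right_side",
    ("right_seat", 0):   "display_front",
    ("right_seat", -90): "display_left_side",
}

def select_display(position, body_angle_deg):
    """Select the display device 2 the operator is assumed to face,
    or None if the combination is not registered."""
    return FIRST_SELECTION_INFO.get((position, body_angle_deg))

# Operator in the right seat, turned plus 90 degrees (facing right).
print(select_display("right_seat", 90))  # display_right_side
```

A real table would likely match angle ranges rather than exact angles; exact keys are used here only to keep the sketch short.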
  • the posture of the operator may be the amount of inclination of the seat in which the operator is seated.
  • in this case, it is assumed that second display device selection information, which associates in advance the position of the operator, the amount of inclination of the seat in which the operator is seated, and the display device 2 determined to be touch-operated by the operator among the plurality of display devices 2, is generated and stored in the storage unit.
  • the operation range determination units 141 and 141a select the display device 2 to be touch-operated by the operator with reference to the second display device selection information. Then, the operation range determination units 141 and 141a determine the operable range in the selected display device 2.
  • for example, assume that the operation state specifying unit 13 specifies that the position of the driver's seat in which the operator is seated is the "right seat" position and that the amount of inclination of the "right seat" is "140 degrees". Further, assume that display devices 2 are installed in the vehicle in front of the driver's seat with respect to the traveling direction and on both side surfaces of the vehicle interior. In addition, assume that in the second display device selection information, the "right seat" position and the seat inclination amount "130 to 145 degrees" are associated with the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction.
  • in this case, the operation range determination units 141 and 141a refer to the second display device selection information based on the position of the operator and the amount of inclination of the seat specified by the operation state identification unit 13, select the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction, and determine the operable range.
  • the second display device selection information may also associate the operable range on the display device 2 with the information on the position of the operator, the information on the amount of seat inclination, and the information on the display device 2, and the operation range determination units 141 and 141a may determine the operable range by referring to the second display device selection information.
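Unlike the posture-keyed lookup, the second display device selection information matches the seat inclination against a range (e.g. "130 to 145 degrees"). A minimal sketch, with table entries and device names as illustrative assumptions:

```python
# Assumed "second display device selection information":
# (seat position, inclination range lo..hi, display device).
SECOND_SELECTION_INFO = [
    ("right_seat", 130, 145, "display_right_side"),
    ("right_seat", 90, 129, "display_front"),
]

def select_display_by_recline(position, recline_deg):
    """Select the display device 2 whose registered inclination range
    contains the operator's seat inclination, or None if no entry matches."""
    for seat, lo, hi, device in SECOND_SELECTION_INFO:
        if seat == position and lo <= recline_deg <= hi:
            return device
    return None

# Right seat reclined to 140 degrees falls in the 130-145 range.
print(select_display_by_recline("right_seat", 140))  # display_right_side
```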
  • the operator is the driver, but the operator is not limited to the driver.
  • the operator may be an occupant other than the driver.
  • the operator may be an occupant seated in the passenger seat, or the operator may be an occupant seated in the rear seat.
  • in the first and second embodiments described above, the number of operators is one, but the number of operators is not limited to one; a plurality of occupants may be operators.
  • the operation state specifying unit 13 can identify a plurality of operators. Then, when a plurality of operators are specified, the operation state specifying unit 13 specifies the operation state of each of the specified plurality of operators.
  • assume that each of the plurality of operators is a person who can perform touch operations on the display device 2, and that the remote control on one or more display objects displayed on the display device 2 is performed by one of the plurality of operators.
  • for example, the display control devices 1 and 1a are connected to an imaging device as described above, and the operation state specifying unit 13 acquires a captured image from the imaging device via the acquisition unit 11, identifies a plurality of operators based on the captured image, and specifies the operation state of each identified operator.
  • specifically, the operation state specifying unit 13 detects, for example, a plurality of occupants present in the vehicle by performing image recognition processing on the captured image acquired from the image pickup device using a well-known image recognition technique.
  • the operation state specifying unit 13 detects the driver and the passenger and identifies them as the operators, respectively.
  • the operation state specifying unit 13 specifies the operation state of the driver and the operation state of the passenger. For example, when the operating state is the position of the operator, the operating state specifying unit 13 specifies the position of the driver and the position of the passenger. Then, the operation state specifying unit 13 outputs the identified information on the driver's operation state and the information on the passenger's operation state to the determination units 14 and 14a. In this case, the operation range determination units 141 and 141a of the determination units 14 and 14a determine the operable range based on the operation states of a plurality of operators specified by the operation state identification unit 13.
  • specifically, the operation range determination units 141 and 141a first determine the operable range of each of the plurality of operators (hereinafter referred to as the "operator-unit operable range") based on the operation states of the plurality of operators.
  • the method by which the operation range determination units 141 and 141a determine the operable range for each operator is the same as the method of determining the operable range described in the first and second embodiments.
  • the operation range determination units 141 and 141a calculate the range in which the operator unit operable range corresponding to each of the plurality of operators overlaps. Then, the operation range determination units 141 and 141a determine the calculated overlapping range as the operable range.
  • for example, when the operating state is the position of the operator, the operation range determination units 141 and 141a first determine the operator-unit operable range corresponding to the driver based on the position of the driver, and the operator-unit operable range corresponding to the passenger based on the position of the passenger. Next, the operation range determination units 141 and 141a calculate the range in which the operator-unit operable range corresponding to the driver and the operator-unit operable range corresponding to the passenger overlap. Then, the operation range determination units 141 and 141a determine the calculated overlapping range as the operable range.
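The overlap calculation above is an intersection of per-operator ranges on the display screen. A minimal sketch, assuming each operator-unit operable range is an axis-aligned rectangle in screen pixels (the coordinate values are illustrative):

```python
def intersect(ranges):
    """Intersect operator-unit operable ranges (x_min, y_min, x_max, y_max);
    return None when the ranges do not overlap."""
    x_min = max(r[0] for r in ranges)
    y_min = max(r[1] for r in ranges)
    x_max = min(r[2] for r in ranges)
    y_max = min(r[3] for r in ranges)
    if x_min >= x_max or y_min >= y_max:
        return None  # no region touchable by every operator
    return (x_min, y_min, x_max, y_max)

driver = (0, 0, 800, 480)       # assumed reach of the driver
passenger = (400, 0, 1280, 480)  # assumed reach of the passenger
print(intersect([driver, passenger]))  # (400, 0, 800, 480)
```

When the intersection is empty, a real implementation would need a fallback, such as the alternative described next of using only the remote-controlling operator's range.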
  • alternatively, the operation range determination units 141 and 141a may determine that the operator-unit operable range corresponding to the operator who performed the remote control is the operable range. For example, when the operation state specifying unit 13 specifies the operation state, it also identifies the operator who performed the remote control.
  • the operation range determination units 141 and 141a may acquire information about the operator who performed the remote control from the operation state specifying unit 13. For example, when the operator performs remote control by voice, the operation state specifying unit 13 can identify, among the plurality of operators, the operator who performed the remote control based on the direction of the source of the voice signal and the captured image acquired from the imaging device. Further, for example, when the operator performs remote control by gesture, the operation state specifying unit 13 can identify, among the plurality of operators, the operator who performed the remote control based on the captured image acquired from the imaging device.
  • the operation state specifying unit 13 acquires information on the function associated with the operation target from the operation target identification units 12 and 12a.
  • the operation target object specifying units 12 and 12a can specify the function associated with the operation target object based on the target identification information.
  • it is assumed that the target identification information associates at least the keyword, the identification information of the icon, the identification information of the function executed by operating the icon, and the identification information of the response display object displayed when the function is executed.
  • the operation state specifying unit 13 detects the occupants present in the vehicle when the function associated with the operation target is a function that can be targeted by a plurality of operators. Specifically, the operation state specifying unit 13 acquires a captured image from the image pickup device as described above via, for example, the acquisition unit 11, performs image recognition processing on the captured image using a well-known image recognition technique, and detects one or more occupants present in the vehicle. The operation state specifying unit 13 treats the detected one or more occupants as operators, and specifies the operation state of each of them. Further, the operation range determination units 141 and 141a determine the operable range based on the operation states of the one or more operators specified by the operation state identification unit 13.
  • the specific operation by which the operation state specifying unit 13 specifies the operation state of each of the one or more operators, and the specific operation by which the operation range determination units 141 and 141a determine the operable range based on the operation states of the one or more operators, have already been described, and duplicate explanations are therefore omitted.
  • otherwise, the operation state specifying unit 13 sets only the operator who performed the remote control as the operator, irrespective of the number of occupants present in the vehicle, and specifies the operation state of that operator. The specific operation by which the operation state specifying unit 13 identifies the operator who performed the remote control has already been described, and a duplicate explanation is therefore omitted.
  • the display control devices 1 and 1a may re-specify the operation state of the operator at a preset cycle (hereinafter referred to as the "range re-determination cycle") and re-determine the operable range based on the re-specified operation state of the operator.
  • a specific operation in which the display control devices 1 and 1a re-specify the operation state of the operator at the range re-determination cycle and re-determine the operable range based on the specified operation state of the operator will now be described.
  • the operations described below assume that the operations described using the flowcharts shown in FIGS. 5 and 10 have already been performed in the display control devices 1 and 1a, respectively. In the following description, it is also assumed that the image pickup device as described above is connected to the display control devices 1 and 1a, and that the display control devices 1 and 1a acquire the captured image of the inside of the vehicle from the image pickup device.
  • the operation range determination units 141 and 141a wait until the range re-determination cycle is reached, and when the range re-determination cycle is reached, acquire the captured image from the imaging device via the acquisition unit 11.
  • the operation state specifying unit 13 specifies the current operation state of the operator based on the acquired captured image. Since the specific operation of specifying the operation state of the operator based on the captured image has already been explained, duplicate description will be omitted.
  • the operation state specifying unit 13 specifies the position of the operator as the operation state of the operator.
  • the operation state specifying unit 13 determines whether or not the current operation state of the operator has changed from the most recently determined operation state. Specifically, here, the operation state specifying unit 13 determines whether or not the current position of the operator has changed from the most recently determined position of the operator. Each time the operation state of the operator is specified, the operation state specifying unit 13 stores the specified operation state in the storage unit. The operation state specifying unit 13 compares the specified current position of the operator with the stored most recent position of the operator to determine whether or not the current position has changed from the most recent position.
  • When the operation state specifying unit 13 determines that there is no change between the most recently specified operation state and the current operation state of the operator, it again waits until the range re-determination cycle is reached. On the other hand, when the operation state specifying unit 13 determines that there is a change, it outputs information on the current operation state of the operator to the operation range determination units 141 and 141a. Here, the operation state specifying unit 13 outputs information on the current position of the operator to the operation range determination units 141 and 141a.
  • When information on the current operation state of the operator is output from the operation state identification unit 13, the operation range determination units 141 and 141a re-determine the operator's operable range based on the output information on the operation state (see, for example, step ST504 in FIG. 5 and step ST1004 in FIG. 10). Since the subsequent specific operation is the same as the specific operation of step ST505 described with reference to FIG. 5 or of step ST1005 described with reference to FIG. 10, duplicated description is omitted. In this way, the display control devices 1 and 1a can re-specify the operation state of the operator at the range re-determination cycle and re-determine the operable range based on the specified operation state of the operator.
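The cycle described in the steps above can be sketched as follows. This is an illustrative sketch only: the function names and the Python stand-ins for the captured image, the operation state identification, and the range re-determination are hypothetical and not part of the disclosed embodiments.

```python
# Illustrative sketch of one iteration of the range re-determination cycle:
# acquire a captured image, identify the operator's current operation state
# (here, the position), compare it with the most recently stored state, and
# re-determine the operable range only when the state has changed.

def redetermination_step(capture_image, identify_position, state_store,
                         redetermine_range):
    """Run one iteration of the range re-determination cycle."""
    image = capture_image()                 # acquired via the acquisition unit
    current = identify_position(image)      # operation state identification
    previous = state_store.get("position")  # most recently stored state
    state_store["position"] = current       # remember the current state
    if current == previous:
        return None                         # no change: wait for the next cycle
    return redetermine_range(current)       # re-determine the operable range

# Stand-in example: the "captured image" is reduced to the position itself.
store = {}
step = lambda pos: redetermination_step(
    lambda: pos, lambda img: img, store, lambda p: f"range near {p}")
assert step("left seat") == "range near left seat"  # first cycle: change detected
assert step("left seat") is None                    # same state: nothing re-determined
assert step("right seat") == "range near right seat"
```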
  • As described above, the display control devices 1 and 1a re-identify the operation state of the operator at the range re-determination cycle, and re-determine the operable range based on the identified operation state of the operator.
  • the display control devices 1 and 1a are connected to the image pickup device, and the operation state specifying unit 13 specifies the current operation state of the operator based on the captured image acquired from the image pickup device.
  • Alternatively, the display control devices 1 and 1a may be connected to the array microphone, and the operation state specifying unit 13 may acquire a voice signal based on an utterance from the array microphone via the acquisition unit 11 and specify the current operation state of the operator based on the voice signal.
  • In the above description, the operation state of the operator is the position of the operator, but this is only an example.
  • the operating state of the operator may be the posture of the operator.
  • the operation state specifying unit 13 specifies the current posture of the operator when the range re-determination cycle is reached, and determines whether or not there is a change in the posture of the operator.
  • Specifically, the operation state specifying unit 13 acquires a captured image of the inside of the vehicle from the imaging device as described above via the acquisition unit 11, and specifies the current posture of the operator based on the acquired captured image.
  • When the operation state specifying unit 13 determines that the posture of the operator has changed, the operation range determination units 141 and 141a re-determine the operator's operable range based on the current posture of the operator specified by the operation state specifying unit 13.
  • the re-determination of the operable range may be performed depending on whether or not the number of operators changes.
  • In this case, the operation state specifying unit 13 detects the operators and their current number when the range re-determination cycle is reached, and determines whether or not the current number of operators has changed.
  • Specifically, the operation state specifying unit 13 acquires a captured image of the inside of the vehicle from the imaging device as described above via the acquisition unit 11, and detects the current operators and the number of operators based on the acquired captured image.
  • The operation range determination units 141 and 141a may re-determine the operable range of the operators when the operation state specifying unit 13 determines that the number of operators has changed. In that case, the operation state specifying unit 13 specifies the operation state of each of the detected one or more operators.
  • The operation range determination units 141 and 141a then determine the operable range based on the operation state of each of the one or more operators specified by the operation state identification unit 13. Since the specific operation in which the operation range determination units 141 and 141a determine the operable range based on the operation states of one or more operators has already been described, duplicated description is omitted.
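As a minimal sketch of the idea above (the reach value, the coordinates, and the choice of a union of per-operator ranges are assumptions made only for illustration, not details taken from this disclosure), the operable range for one or more operators could be combined as follows:

```python
# Illustrative sketch: when the number of operators changes, the operable
# range is re-determined from the operation states (here, positions) of all
# detected operators, e.g. as the union of each operator's reachable range.

def operable_range_for(operator_position, reach=300):
    """Hypothetical reachable x-range on the screen for one operator."""
    return set(range(max(0, operator_position - reach), operator_position + reach))

def redetermine_for_operators(positions):
    """Operable range based on the operation states of one or more operators."""
    combined = set()
    for pos in positions:
        combined |= operable_range_for(pos)
    return combined

one_operator = redetermine_for_operators([200])
two_operators = redetermine_for_operators([200, 900])  # a second operator detected
assert one_operator < two_operators                    # the operable range grew
```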
  • Note that a range in the vehicle within which the operation state specifying unit 13 detects the number of operators (hereinafter referred to as the "number-of-persons specifying range") may be predetermined, and the operation state specifying unit 13 may detect the current number of operators within the number-of-persons specifying range.
  • The number-of-persons specifying range is, for example, a predetermined range centered on the position, in the captured image, of the operator who performed the remote operation designating the execution of the function.
  • As described above, the display control devices 1 and 1a may determine whether or not there is a change in the operation state, and re-determine the operable range based on the result of that determination. Further, the display control devices 1 and 1a may determine whether or not the number of operators has changed, and re-determine the operable range based on the result of that determination. In this way, in the display control devices 1 and 1a, the operation range determination units 141 and 141a can dynamically determine the operable range according to the current operation state of the operator specified by the operation state identification unit 13.
  • The display control devices 1 and 1a determine whether or not there is a change in the operation state, or whether or not there is a change in the number of operators, and re-determine the operable range based on the determination result. By doing so, the display control devices 1 and 1a can prevent a situation in which, after the operation target has once been displayed within the operable range, the operator can no longer touch it because the operation state has changed.
  • FIGS. 11A and 11B are diagrams showing an example of the hardware configuration of the display control devices 1 and 1a according to Embodiments 1 and 2.
  • The functions of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a are realized by a processing circuit 1101.
  • That is, the display control devices 1 and 1a include a processing circuit 1101 for identifying the operation target and controlling the display position of the identified operation target on the display device 2 when a remote operation designating the execution of a function is performed.
  • the processing circuit 1101 may be dedicated hardware as shown in FIG. 11A, or may be a CPU (Central Processing Unit) 1105 that executes a program stored in the memory 1106 as shown in FIG. 11B.
  • When the processing circuit 1101 is dedicated hardware, the processing circuit 1101 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • When the processing circuit is the CPU 1105, the functions of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a are realized by software, firmware, or a combination of software and firmware. That is, the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a are realized by a processing circuit such as the CPU 1105 or a system LSI (Large-Scale Integration) that executes programs stored in the HDD (Hard Disk Drive) 1102, the memory 1106, or the like. It can also be said that the programs stored in the HDD 1102, the memory 1106, and the like cause a computer to execute the procedures and methods of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a.
  • The memory 1106 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • Note that the functions of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a may be realized partly by dedicated hardware and partly by software or firmware.
  • For example, the function of the operation target identification units 12 and 12a can be realized by the processing circuit 1101 as dedicated hardware, and the functions of the acquisition unit 11, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a can be realized by the processing circuit 1101 reading and executing a program stored in the memory 1106.
  • Further, the display control devices 1 and 1a include an input interface device 1103 and an output interface device 1104 that perform wired or wireless communication with devices such as the display device 2 and the input device.
  • In the above embodiments, the display control devices 1 and 1a are in-vehicle devices mounted on the vehicle, and the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a are assumed to be provided in the display control devices 1 and 1a. The configuration is not limited to this: some of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a may be mounted on the in-vehicle device of the vehicle, with the others provided in a server connected to the in-vehicle device via a network, and an automatic driving control system may be configured by the in-vehicle device and the server.
  • the operation target identification units 12 and 12a may be provided in the server.
  • the response display information corresponding to the response display object may be generated in advance and stored in the storage unit.
  • the display control unit 15a acquires the generated response display information and controls the display of the response display object so that the response display object is displayed at the display position determined by the display position determination unit 151a.
  • the display control device 1a does not have to include the response generation unit 152.
  • The display control device according to the present invention can be applied to a display control device that controls the display of a display object on a touch-operable display device.
  • 1, 1a display control device, 2 display device, 3 input device, 11 acquisition unit, 12, 12a operation target identification unit, 121, 121a initial display position identification unit, 13 operation state identification unit, 14, 14a determination unit, 141, 141a operation range determination unit, 15, 15a display control unit, 151, 151a display position determination unit, 152 response generation unit, 1101 processing circuit, 1102 HDD, 1103 input interface device, 1104 output interface device, 1105 CPU, 1106 memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display control device comprising: an acquisition unit (11) that acquires remote operation information indicating that a function executable by a touch-operable display device (2) has been designated by a remote operation; an operation subject identification unit (12, 12a) that, on the basis of the remote operation information, identifies an operation subject to be touch-operated by an operator of the display device (2) and identifies the initial display position of the operation subject; an operation state identification unit (13) that identifies the operation state of the operator who performed the remote operation; a determination unit (14, 14a) that, on the basis of the operation state of the operator, determines whether the initial display position of the operation subject on the display screen of the display device (2) is within an operable range of touch-operability by the operator; and a display control unit (15, 15a) that, if the determination unit (14, 14a) determines that the initial display position of the operation subject is not within the operable range, controls display of the operation subject by the display device (2) such that the operation subject is displayed within the operable range.

Description

[Title of invention determined by ISA under Rule 37.2] Display control device for displaying an operation target within an operable range
The present invention relates to a display control device and a display control method for controlling the display of a display object on a touch-operable display device.
In recent years, touch-operable display devices having large screens have become known. On a display device having a large screen, a display object displayed on the display device may be displayed at a position so far from the operator of the display device that the operator cannot perform a touch operation on it. Therefore, the operator may be unable to touch-operate a desired display object from his or her current position.
For example, a display device having a large screen as described above may be provided in a vehicle. In that case, the driver, who is the operator, may be unable to touch-operate a desired display object from the current position.
To address this, for example, Patent Document 1 discloses a display device that includes a touch panel on an upper layer of a display unit and, upon detecting that an operator's finger or the like has touched the touch panel, moves the input icons arranged on the display unit toward the finger, in order of the proximity of each input icon's position to the contact coordinates of the finger or the like on the display unit. In the display device disclosed in Patent Document 1, the number of input icons that can be arranged around the contact coordinates is limited.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2011-210083
In the prior art as disclosed in Patent Document 1, as described above, because the number of input icons that can be arranged around the contact coordinates is limited, a display object that the operator wishes to touch-operate may not be moved near the operator's finger.
Therefore, there remains the problem that the operator may be unable to touch-operate a desired display object from his or her current position.
The present invention has been made to solve the above problems, and an object of the present invention is to provide a display control device that, in a touch-operable display device, can display a display object that the operator of the display device wishes to touch-operate at a position where the operator can perform a touch operation on it.
A display control device according to the present invention includes: an acquisition unit that acquires remote operation information indicating that a function executable via a touch-operable display device has been designated by a remote operation; an operation target identification unit that, based on the remote operation information acquired by the acquisition unit, identifies an operation target to be touch-operated by an operator of the display device and identifies an initial display position of the operation target; an operation state identification unit that identifies the operation state of the operator who performed the remote operation; a determination unit that, based on the operation state of the operator identified by the operation state identification unit, determines whether or not the initial display position of the operation target identified by the operation target identification unit is within an operable range, on the display screen of the display device, in which the operator can perform a touch operation; and a display control unit that, when the determination unit determines that the initial display position of the operation target is not within the operable range, controls the display of the operation target on the display device so that the operation target is displayed within the operable range.
According to the present invention, in a touch-operable display device, a display object that the operator of the display unit wishes to touch-operate can be displayed at a position where the operator can perform a touch operation on it.
FIG. 1 is a diagram showing a configuration example of the display control device according to Embodiment 1.
FIG. 2 is a diagram for explaining an image of the display device installed inside a vehicle equipped with the display control device according to Embodiment 1.
FIGS. 3A and 3B are diagrams for explaining an example of the operable range based on the operation state of the operator in Embodiment 1; FIG. 3A explains the operable range when the operation state is the "left seat" position, and FIG. 3B explains the operable range when the operation state is the "right seat" position.
FIG. 4 is a diagram showing an example of the display screen of the display device in Embodiment 1 after the display position determination unit determines the display position of the Audio icon so that the Audio icon is displayed within the operable range and the display control unit displays the Audio icon at the determined position.
FIG. 5 is a flowchart for explaining the operation of the display control device according to Embodiment 1.
FIG. 6 is a diagram showing a configuration example of the display control device according to Embodiment 2.
FIG. 7 is a diagram for explaining an example of the target identification information referred to by the operation target identification unit when identifying the operation target in Embodiment 2.
FIG. 8 is a diagram for explaining an example of the initial display position of the audio menu screen in Embodiment 2.
FIG. 9 is a diagram showing an example of the display screen of the display device in Embodiment 2 after the display position determination unit determines the display position of the audio menu screen so that the audio menu screen is displayed within the operable range and the display control unit displays the audio menu screen at the determined position.
FIG. 10 is a flowchart for explaining the operation of the automatic driving control device according to Embodiment 2.
FIGS. 11A and 11B are diagrams showing an example of the hardware configuration of the display control devices according to Embodiments 1 and 2.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing a configuration example of the display control device 1 according to Embodiment 1.
In Embodiment 1, it is assumed that the display control device 1 is mounted on a vehicle. Further, it is assumed that, in the vehicle equipped with the display control device 1, a large-screen, touch-operable display device 2 is installed further forward of the driver's seat and the front passenger seat, which are located toward the front of the vehicle in its traveling direction.
Here, FIG. 2 is a diagram for explaining an image of the display device 2 in a state where the display device 2 is installed inside the vehicle equipped with the display control device 1 according to Embodiment 1.
The display device 2 is a touch-panel display having a touch panel on its surface. One or more display objects are displayed on the display device 2. The driver causes various functions to be executed by touch-operating any one of the one or more display objects displayed on the display device 2. The display device 2 is a large-screen display device wide enough that a touch-operable area exists both in front of the driver's seat and in front of the front passenger seat. Therefore, a display object displayed on the display device 2 may be displayed at a position so far from the driver that the driver cannot perform a touch operation on it.
In the following description, the driver's seat is referred to as either the "right seat" or the "left seat". The "right seat" is a seat located, facing the traveling direction of the vehicle, on the right side of a straight line that passes through the center of the vehicle in its width direction and is substantially parallel to the traveling direction, and the "left seat" is a seat located on the left side of that straight line. That is, when the vehicle is a right-hand-drive vehicle, the driver's seat is the "right seat", and when the vehicle is a left-hand-drive vehicle, the driver's seat is the "left seat".
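The seat-dependent operable range illustrated by FIGS. 3A and 3B can be sketched as follows; the screen width and the per-seat x-ranges are hypothetical values chosen only for illustration.

```python
# Illustrative sketch: the operable range is the part of the wide display
# screen that an operator seated in the left or right seat can reach.

SCREEN_WIDTH = 1200  # hypothetical: screen x-coordinates run from 0 to 1199

# Hypothetical operable x-ranges per seat, in the spirit of FIGS. 3A/3B.
OPERABLE_RANGE = {
    "left seat": (0, 600),      # a left-seat occupant reaches the left side
    "right seat": (600, 1200),  # a right-seat occupant reaches the right side
}

def is_within_operable_range(seat, icon_x):
    """Return True if an icon at x-coordinate icon_x is touchable from seat."""
    lo, hi = OPERABLE_RANGE[seat]
    return lo <= icon_x < hi

# An Audio icon displayed at x=1000 is out of reach from the left seat:
assert not is_within_operable_range("left seat", 1000)
assert is_within_operable_range("right seat", 1000)
```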
In Embodiment 1, the display control device 1 controls the display device 2 as described above so that a display object to be touch-operated by the driver is displayed within a range that the driver can touch. The display control of the display objects displayed on the display device 2 is performed by the display control unit 15, described later, in the display control device 1.
Specifically, the display control device 1 acquires information indicating that a function executable via the display device 2 has been designated by a remote operation (hereinafter referred to as "remote operation information"), and identifies, based on the acquired remote operation information, the display object to be touch-operated by the driver (hereinafter referred to as the "operation target"). The display control device 1 then displays the identified operation target on the display device 2 within a range in which the driver can perform a touch operation (hereinafter referred to as the "operable range").
In Embodiment 1, a remote operation is an operation performed without contact, for example, an operation performed by utterance. Further, in the following Embodiment 1, a person who touch-operates an operation target displayed on the display device 2 is referred to as an "operator". That is, in Embodiment 1, the driver is the "operator".
Further, in Embodiment 1, the functions that can be executed via the display device 2 include various functions that an occupant can execute in the vehicle, for example, a telephone function for making a call, a search function for causing a navigation device (not shown) to perform a route search or the like, and an audio function for causing an audio device (not shown) to play music or the like.
The operator can cause the functions described above to be executed via the display device 2. Specifically, the operator can cause a function to be executed by performing a determination operation, for example a touch operation, on the operation target displayed on the display device 2. However, as described above, a display object displayed on the display device 2 may be displayed at a position so far from the operator that the operator cannot perform a touch operation on it.
Therefore, the operator can designate a function to be executed by a remote operation, and when the remote operation is performed, the display control device 1 can acquire the corresponding remote operation information. Specifically, the operator designates the function to be executed by a remote operation using the input device 3. In Embodiment 1, the input device 3 is, for example, an array microphone. An array microphone can acquire spatial information about sound that cannot be obtained with a single microphone, and can perform directivity control and the like.
The display control device 1 acquires the remote operation information from the input device 3, identifies the operation target based on the acquired remote operation information, and performs control so that the operation target is displayed within the operable range.
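The flow just described, from remote operation information to displaying the operation target within the operable range, can be sketched end to end. Everything below is hypothetical: the keyword table, the coordinates, and the policy of moving the target to the middle of the operable range are illustrative stand-ins, not details of the disclosed embodiments.

```python
# Illustrative end-to-end sketch: identify the operation target from an
# utterance, judge its initial display position against the operable range,
# and move it within reach if necessary.

KEYWORD_TO_ICON = {"audio": "Audio icon", "phone": "TEL icon", "search": "Search icon"}
INITIAL_POSITION = {"Audio icon": 1000, "TEL icon": 100, "Search icon": 550}
OPERABLE_RANGE = {"left seat": range(0, 600), "right seat": range(600, 1200)}

def display_position(utterance, operator_seat):
    """Decide where the operation target should be displayed."""
    # Operation target identification: a keyword lookup stands in for the
    # voice recognition processing described in the text.
    icon = next(v for k, v in KEYWORD_TO_ICON.items() if k in utterance.lower())
    pos = INITIAL_POSITION[icon]

    # Determination: is the initial display position within the operable range?
    reachable = OPERABLE_RANGE[operator_seat]
    if pos in reachable:
        return icon, pos                       # keep the initial position
    # Display control: move the target into the operable range (middle here).
    return icon, reachable[len(reachable) // 2]

# The left-seat operator utters "Operate the audio"; the Audio icon's initial
# position (x=1000) is out of reach, so it is moved into the operable range.
icon, pos = display_position("Operate the audio", "left seat")
assert icon == "Audio icon" and pos in OPERABLE_RANGE["left seat"]
```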
In Embodiment 1, the operation targets displayed on the display device 2 are icons. An icon is a button or the like with which the operator causes various functions to be executed.
In FIG. 2, as an example, a TEL icon 201 for instructing execution of the telephone function, a Search icon 202 for instructing execution of the search function, and an Audio icon 203 for instructing execution of the audio function are displayed on the display device 2.
The icons are displayed in the initial state of the display device 2, for example, when the power is turned on. In addition, the function to be executed when an icon receives a determination operation is determined in advance for each icon.
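The predetermined per-icon association described above can be sketched as a simple lookup; the icon names and function labels follow the example in the text, while the structure itself is only an illustrative assumption.

```python
# Illustrative sketch: the predetermined per-icon association between an icon
# and the function executed when that icon receives a determination operation.
ICON_FUNCTIONS = {
    "TEL icon 201": "telephone function",
    "Search icon 202": "search function",
    "Audio icon 203": "audio function",
}

def function_for(icon_name):
    """Return the function executed when the given icon is touch-operated."""
    return ICON_FUNCTIONS[icon_name]

assert function_for("Audio icon 203") == "audio function"
```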
Returning to the description of FIG. 1.
The display control device 1 includes an acquisition unit 11, an operation target identification unit 12, an operation state identification unit 13, a determination unit 14, and a display control unit 15. The operation target identification unit 12 includes an initial display position identification unit 121. The determination unit 14 includes an operation range determination unit 141. The display control unit 15 includes a display position determination unit 151.
The acquisition unit 11 acquires remote operation information indicating that a function executable via the touch-operable display device 2 has been designated by a remote operation.
Specifically, the acquisition unit 11 acquires, from the array microphone, a voice signal based on the voice with which the operator designated execution of a function using the array microphone, as the remote operation information. For example, as shown in FIG. 2, assume that the operator utters "Operate the audio" while the TEL icon 201, the Search icon 202, and the Audio icon 203 are displayed on the display device 2. At this time, the driver's seat in which the operator is seated is the "left seat", and the operator cannot touch-operate the Audio icon 203. The acquisition unit 11 acquires the voice signal based on the utterance "Operate the audio" from the array microphone as the remote operation information.
The acquisition unit 11 outputs the acquired remote operation information to the operation target identification unit 12 and the operation state identification unit 13.
Based on the remote control information acquired by the acquisition unit 11, the operation target identification unit 12 identifies the operation target to be touch-operated by the operator of the display device 2, and also identifies the initial display position of that operation target.
First, based on the remote control information output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more display objects displayed on the display device 2, the operation target that the operator will touch-operate. Specifically, based on the voice signal output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more icons displayed on the display device 2, the icon that the operator will touch-operate as the operation target. The operation target identification unit 12 performs voice recognition processing on the voice signal using a well-known voice recognition technique. Then, the operation target identification unit 12 identifies the operation target by, for example, referring to target identification information based on the voice recognition result. The target identification information is information in which at least keywords and icon identification information are associated with each other. The icon identification information may be any information that can identify an icon. The target identification information is generated in advance and stored in a storage unit (not shown) that the display control device 1 can refer to. The storage unit may be provided in the display control device 1, or may be provided outside the display control device 1 in a location that the display control device 1 can refer to.
For example, in the above example, the operation target identification unit 12 performs voice recognition processing on the voice signal representing "operate audio" and obtains the keyword "audio" as the voice recognition result. Further, in the target identification information, it is assumed that the identification information of the Audio icon 203 is associated with the keyword "audio". The operation target identification unit 12 refers to the target identification information and identifies the Audio icon 203 as the operation target.
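The keyword-to-icon lookup described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the table contents and the names `TARGET_INFO` and `identify_target` are assumptions.

```python
# Hypothetical target identification information: a recognized keyword is
# associated with the identification information of an icon (values assumed).
TARGET_INFO = {
    "telephone": "TEL icon 201",
    "search": "Search icon 202",
    "audio": "Audio icon 203",
}

def identify_target(recognized_keyword):
    """Return the icon identified by the speech-recognition keyword,
    or None when no icon is associated with the keyword."""
    return TARGET_INFO.get(recognized_keyword.lower())
```

For example, the utterance "operate audio" yields the keyword "audio", and `identify_target("audio")` identifies the Audio icon 203 as the operation target.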
Then, the operation target identification unit 12 identifies the initial display position of the operation target on the display screen of the display device 2 based on the information about the identified operation target. Specifically, the initial display position identification unit 121 of the operation target identification unit 12 identifies the initial display position of the operation target.
Before the operator performs a remote operation, the current display position at which a display object is displayed on the display device 2 is predetermined for that display object. In the first embodiment, this predetermined current display position is referred to as the initial display position of the display object. The initial display position is, for example, the display position at which the display object is displayed when the power is turned on. The initial display position may also be, for example, a position shifted from the power-on display position according to the display status of other display objects and the like.
The display position of a display object is represented by coordinates on the display screen of the display device 2.
The initial display position identification unit 121 identifies the initial display position of the operation target by referring to initial display position identification information based on the information about the operation target. For example, initial display position identification information, in which identification information of each display object that can be displayed on the display device 2 is associated with information on the initial display position of that display object, is generated in advance and stored in the storage unit. In the first embodiment, it is assumed that the initial display position identification information defines at least the identification information of each icon that can be displayed on the display device 2 in association with information on the initial display position of that icon.
Note that, for example, the display control unit 15 described later may have the function of the initial display position identification unit 121, and the operation target identification unit 12 may identify the initial display position based on information on the initial display position acquired from the display control unit 15. Alternatively, for example, a management unit (not shown) may have the function of the initial display position identification unit 121, and the operation target identification unit 12 may identify the initial display position based on information on the initial display position acquired from the management unit.
In the above example, the initial display position identification unit 121 identifies the initial display position of the Audio icon 203. Here, it is assumed that the initial display position of the Audio icon 203 is the position shown in FIG. 2.
The operation target identification unit 12 outputs the identified information about the operation target and the information about the initial display position of that operation target to the determination unit 14 and the display control unit 15.
Note that the target identification information and the initial display position identification information are treated here as separate pieces of information, but this is only an example; they may be combined into a single piece of information. For example, in the target identification information, the information on the initial display position of each icon may be defined in association with the identification information of that icon. In this case, the initial display position identification unit 121 may identify the initial display position of the icon based on the target identification information.
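The combined table suggested here can be sketched as a single mapping. The coordinate values and the names `COMBINED_INFO` and `initial_display_position` are illustrative assumptions only.

```python
# Hypothetical combined target/initial-display-position information: each
# keyword maps to an icon's identification info and its initial display
# position (screen coordinates; all values assumed for illustration).
COMBINED_INFO = {
    "telephone": {"icon": "TEL icon 201", "initial_pos": (120, 80)},
    "search":    {"icon": "Search icon 202", "initial_pos": (480, 80)},
    "audio":     {"icon": "Audio icon 203", "initial_pos": (840, 80)},
}

def initial_display_position(keyword):
    """Return the initial display position defined for the icon that the
    keyword identifies, or None when the keyword is unknown."""
    entry = COMBINED_INFO.get(keyword)
    return entry["initial_pos"] if entry else None
```

With one table, a single lookup yields both the operation target and its initial display position.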
The operation state identification unit 13 identifies the operation state of the operator based on the remote control information output from the acquisition unit 11. In the first embodiment, the operation state of the operator is the position of the operator who performed the remote operation.
Specifically, the operation state identification unit 13 identifies the position of the operator at the time the remote operation was performed, based on the voice signal output from the acquisition unit 11. The position of the operator is, for example, the position of the seat in which the operator is seated.
The operation state identification unit 13 analyzes the voice signal output from the acquisition unit 11 and identifies the direction of the source of that voice signal. The operation state identification unit 13 may analyze the voice signal and identify the direction of its source using a well-known technique. From the identified direction of the voice signal source, the operation state identification unit 13 identifies the seat in which the operator is seated, and sets the position of that seat as the position of the operator. The position of each seat in the vehicle is predetermined, and the operation state identification unit 13 can identify the position of the seat in which the operator is seated based on the predetermined information on the position of each seat.
For example, when the identified direction of the voice signal source points into the space to the right of a straight line that passes through the center of the vehicle in its width direction and runs substantially parallel to the traveling direction of the vehicle, the operation state identification unit 13 takes the "right seat" as the seat in which the operator is seated and takes the position of the "right seat" as the position of the operator. Conversely, when the identified direction of the voice signal source points into the space to the left of that straight line, the operation state identification unit 13 takes the "left seat" as the seat in which the operator is seated and takes the position of the "left seat" as the position of the operator. In the above example, the operation state identification unit 13 takes the position of the "left seat" as the position of the operator.
The operation state identification unit 13 outputs the identified information on the operation state of the operator, in other words, the identified information on the position of the operator, to the determination unit 14. In the first embodiment, the information on the position of the operator is, for example, the position information of the "right seat" or the position information of the "left seat".
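The left/right classification described above reduces to a sign test on the estimated source direction. The angle convention below (positive angles to the right of the vehicle's centerline) is an assumption for illustration, not a convention stated in the text.

```python
def identify_operator_seat(source_angle_deg):
    """Classify the operator's seat from the direction of the voice signal
    source, measured relative to a straight line through the center of the
    vehicle's width and substantially parallel to its traveling direction.
    Assumed convention: positive angle = right of the line, otherwise left."""
    return "right seat" if source_angle_deg > 0 else "left seat"
```

In the running example, a source direction on the left side of the centerline yields "left seat" as the operator's position.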
Based on the operation state of the operator identified by the operation state identification unit 13, the determination unit 14 determines whether the initial display position of the operation target identified by the operation target identification unit 12 is within the operable range on the display screen of the display device 2, that is, the range within which the operator can perform touch operations.
First, in the determination unit 14, the operation range determination unit 141 determines the operable range based on the information on the operation state of the operator output from the operation state identification unit 13.
In the first embodiment, the operable range is predetermined according to the position of the seat in which the operator is seated, and range identification information, in which information on the seat positions in the vehicle is associated with operable ranges, is generated in advance and stored in the storage unit. The operable range is determined as, for example, the range on the display screen of the display device 2 within which an operator of standard physique can perform touch operations while seated, without assuming an unreasonable posture. The operable range is represented by coordinates on the display screen of the display device 2.
Here, FIG. 3 is a diagram for explaining an example of the operable range based on the operation state of the operator in the first embodiment. FIG. 3A illustrates the operable range (indicated by 301 in FIG. 3A) when the operation state of the operator is the position of the "left seat" (see FIG. 2), and FIG. 3B illustrates the operable range (indicated by 302 in FIG. 3B) when the operation state of the operator is the position of the "right seat" (see FIG. 2). In FIG. 3, as an example, the operable range is a circular range, but the operable range is not limited to a circular range. The operable range need only be a range on the display screen of the display device 2 that the operator can touch-operate while seated, without assuming an unreasonable posture.
The operation range determination unit 141 determines the operable range by referring to the range identification information based on the information on the operation state of the operator.
In the above example, the operation range determination unit 141 identifies, as the operable range, the operable range associated with the position of the "left seat" in the range identification information (for example, see 301 in FIG. 3A).
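A minimal sketch of the range identification information as a seat-to-range lookup, with the circular ranges of FIG. 3 represented as (center_x, center_y, radius). All concrete numbers and the names `RANGE_INFO` and `determine_operable_range` are assumptions.

```python
# Hypothetical range identification information: seat position -> circular
# operable range on the display screen, as (center_x, center_y, radius).
RANGE_INFO = {
    "left seat":  (160, 300, 260),   # cf. range 301 in FIG. 3A
    "right seat": (800, 300, 260),   # cf. range 302 in FIG. 3B
}

def determine_operable_range(seat_position):
    """Return the operable range associated with the operator's seat."""
    return RANGE_INFO[seat_position]
```

In the running example, the "left seat" position selects the range corresponding to 301 in FIG. 3A.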
Next, the determination unit 14 determines whether the initial display position of the operation target is within the operable range, based on the information on the initial display position of the operation target output from the operation target identification unit 12 and the information on the operable range identified by the operation range determination unit 141. In the first embodiment, the determination unit 14 determines whether the initial display position of the operation target is within the operable range by, for example, determining whether the center coordinates of the operation target, when displayed at the initial display position, fall within the operable range.
In the above example, the determination unit 14 determines whether the center coordinates of the Audio icon 203, when displayed at the initial display position (see FIG. 2), are within the operable range indicated by 301 in FIG. 3A.
The determination unit 14 outputs the result of determining whether the initial display position of the operation target is within the operable range (hereinafter referred to as "range determination result information") to the display control unit 15.
In the above example, since the center coordinates of the Audio icon 203 displayed at its initial display position are not within the operable range, the determination unit 14 outputs, to the display control unit 15, range determination result information indicating that the initial display position of the Audio icon 203 is not within the operable range.
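For a circular operable range, the center-coordinate test described above reduces to a point-in-circle check; a sketch under that assumption (coordinate values in the usage note are assumed):

```python
import math

def is_within_operable_range(icon_center, operable_range):
    """Determine whether the icon's center coordinates fall inside a
    circular operable range given as (center_x, center_y, radius)."""
    cx, cy, r = operable_range
    x, y = icon_center
    return math.hypot(x - cx, y - cy) <= r
```

With an assumed Audio icon 203 center of (840, 80) and an assumed left-seat range of (160, 300, 260), the check returns False, matching the determination in the example above.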
When the determination unit 14 determines that the initial display position of the operation target is not within the operable range, the display control unit 15 controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range.
Specifically, in the display control unit 15, the display position determination unit 151 determines the display position of the operation target on the display device 2 based on the information on the initial display position of the operation target output from the operation target identification unit 12 and the range determination result information output from the determination unit 14.
When the range determination result information indicates that the initial display position of the operation target is not within the operable range, in other words, when the determination unit 14 has determined that the initial display position of the operation target is not within the operable range, the display position determination unit 151 determines a display position at which the operation target will be displayed within the operable range.
In the first embodiment, the state in which the operation target is displayed within the operable range means at least a state in which a sufficiently large region of the operation target, a region whose touch operation properly triggers execution of the function (hereinafter referred to as the "touch operation effective region"), is displayed within the operable range.
In the above example, since the determination unit 14 outputs range determination result information indicating that the center coordinates of the initial display position of the Audio icon 203 are not within the operable range (see 301 in FIG. 3), the display position determination unit 151 determines a display position at which the Audio icon 203 will be displayed within the operable range.
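One simple way to realize "determine a display position within the operable range", assuming a circular range, is to project the icon's center toward the range center. This is an illustrative strategy, not the method claimed by the patent:

```python
import math

def determine_display_position(initial_center, operable_range):
    """Return the initial position unchanged when it is already inside the
    circular operable range (cx, cy, r); otherwise move the icon's center
    to the nearest point of the range, along the line to the range center."""
    cx, cy, r = operable_range
    x, y = initial_center
    d = math.hypot(x - cx, y - cy)
    if d <= r:
        return initial_center
    scale = r / d
    return (cx + (x - cx) * scale, cy + (y - cy) * scale)
```

The resulting position lies on the boundary of the operable range, so the icon's center (and, with a small extra margin, its touch operation effective region) becomes reachable by the operator.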
On the other hand, when the range determination result information indicates that the initial display position of the operation target is within the operable range, in other words, when the determination unit 14 has determined that the initial display position of the operation target is within the operable range, the display position determination unit 151 determines the initial display position of the operation target as its display position. The display position determination unit 151 may determine the initial display position based on the information on the initial display position of the operation target output from the operation target identification unit 12. The information on the display position of the operation target determined by the display position determination unit 151 is, for example, the coordinates of the center of the operation target on the display screen of the display device 2.
The display control unit 15 controls the display of the operation target on the display device 2 so that the operation target is displayed at the display position determined by the display position determination unit 151, based on the information on that display position. Specifically, the display control unit 15 moves the display position of the operation target so that, for example, the coordinates of the center of the operation target displayed on the display device 2 coincide with the center coordinates determined by the display position determination unit 151.
FIG. 4 is a diagram showing an example of the display screen of the display device 2 in the first embodiment, after the display position determination unit 151 has determined the display position of the Audio icon 203 so that the Audio icon 203 is displayed within the operable range and the display control unit 15 has displayed the Audio icon 203 at the determined position.
As shown in FIG. 4, because the display control unit 15 has moved the display position of the Audio icon 203 from the position shown in FIG. 2 to the position shown in FIG. 4 based on the display position determined by the display position determination unit 151, the driver can now touch-operate the Audio icon 203.
Note that, as an example, the display control unit 15 here moves the display position of the operation target when controlling its display so that it is displayed at the display position determined by the display position determination unit 151 rather than at the initial display position. However, this is only an example; the display control unit 15 may leave the operation target displayed at the initial display position and additionally display the operation target at the display position determined by the display position determination unit 151. It suffices that the display control unit 15 causes the operation target to be displayed at the display position determined by the display position determination unit 151.
When the displayed operation target overlaps another icon or the like, the display control unit 15 can control the display position of that other icon. Specifically, the display control unit 15, for example, shifts the display position of the other icon to a position that does not overlap the operation target. Whether the display position of the operation target overlaps another icon may be determined by the display position determination unit 151.
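The overlap handling described above needs an overlap test. For rectangular icons of a common size this can be sketched as an axis-aligned bounding-box check; the icon size parameter is an assumption for illustration.

```python
def icons_overlap(center_a, center_b, size=(120, 120)):
    """Axis-aligned overlap test between two icons, given their center
    coordinates and a shared (width, height); the default size is assumed.
    Two icons overlap when their centers are closer than one icon width
    horizontally and one icon height vertically."""
    w, h = size
    ax, ay = center_a
    bx, by = center_b
    return abs(ax - bx) < w and abs(ay - by) < h
```

When this test is true for the moved operation target and another icon, the display control unit 15 could shift the other icon until the test becomes false.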
The operation of the display control device 1 according to the first embodiment will be described.
FIG. 5 is a flowchart for explaining the operation of the display control device 1 according to the first embodiment.
The acquisition unit 11 acquires remote control information indicating that a function executable via the touch-operable display device 2 has been specified by remote operation (step ST501).
The acquisition unit 11 outputs the acquired remote control information to the operation target identification unit 12 and the operation state identification unit 13.
Based on the remote control information acquired by the acquisition unit 11 in step ST501, the operation target identification unit 12 identifies the operation target to be touch-operated by the operator of the display device 2 and identifies the initial display position of that operation target (step ST502).
First, based on the remote control information output from the acquisition unit 11, the operation target identification unit 12 identifies, among the one or more display objects displayed on the display device 2, the operation target that the operator will touch-operate.
Then, the operation target identification unit 12 identifies the initial display position of the operation target on the display screen of the display device 2 based on the information about the identified operation target. Specifically, the initial display position identification unit 121 of the operation target identification unit 12 identifies the initial display position of the operation target.
The operation target identification unit 12 outputs the identified information about the operation target and the information about its initial display position to the determination unit 14 and the display control unit 15.
The operation state identification unit 13 identifies the operation state of the operator based on the remote control information output from the acquisition unit 11 in step ST501 (step ST503).
Specifically, the operation state identification unit 13 identifies the position of the operator at the time the remote operation was performed, based on the voice signal output from the acquisition unit 11.
The operation state identification unit 13 outputs the identified information on the operation state of the operator, in other words, the identified information on the position of the operator, to the determination unit 14.
Based on the operation state of the operator identified by the operation state identification unit 13 in step ST503, the determination unit 14 determines whether the initial display position of the operation target identified by the operation target identification unit 12 in step ST502 is within the operable range (step ST504).
Specifically, first, in the determination unit 14, the operation range determination unit 141 determines the operable range based on the information on the operation state of the operator output from the operation state identification unit 13.
Next, the determination unit 14 determines whether the initial display position of the operation target is within the operable range, based on the information on the initial display position output from the operation target identification unit 12 and the information on the operable range identified by the operation range determination unit 141.
The determination unit 14 outputs the range determination result information, indicating whether the initial display position of the operation target is within the operable range, to the display control unit 15.
When the determination unit 14 determines in step ST504 that the initial display position of the operation target is not within the operable range, the display control unit 15 displays the operation target so that it is displayed within the operable range. The display of the operation target object on the device 2 is controlled (step ST505).
Specifically, in the display control unit 15, the display position determination unit 151 outputs information on the initial display position of the operation object output from the operation object identification unit 12, and the range determination result output from the determination unit 14. Based on the information, the display position of the operation target on the display device 2 is determined.
When the display position determination unit 151 indicates that the range determination result information indicates that the initial display position of the operation target is not within the operable range, in other words, the determination unit 14 can operate the initial display position of the operation object. If it is determined that the operation target is not within the range, the display position of the operation target is determined so that the operation target is displayed within the operable range.
On the other hand, when the range determination result information indicates that the initial display position of the operation target is within the operable range, the display position determination unit 151, in other words, the determination unit 14 determines the initial display position of the operation target. When it is determined that the operation target is within the operable range, the initial display position of the operation target is determined to be the display position of the operation target.
The display control unit 15 determines that the operation object is displayed at the display position determined by the display position determination unit 151 based on the information regarding the display position of the operation object determined by the display position determination unit 151. Controls the display on the display device 2.
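The two branches of step ST505 can be sketched as follows. Clamping into the operable rectangle is only one possible repositioning policy, chosen here for brevity; the disclosure only requires that the resulting position lie within the operable range, and all names are illustrative:

```python
def decide_display_position(initial_pos, in_range, operable_rect):
    """Step ST505 sketch of the display position determination unit 151:
    keep the initial display position when it is already operable,
    otherwise move it into the operable range."""
    if in_range:
        # Initial display position is used as the display position as-is.
        return initial_pos
    x, y = initial_pos
    left, top, right, bottom = operable_rect
    # Clamp each coordinate into the operable rectangle.
    new_x = min(max(x, left), right)
    new_y = min(max(y, top), bottom)
    return (new_x, new_y)
```

For example, with an operable range covering the left half of a 1280-pixel-wide screen, a target at x = 1000 would be moved to the range boundary, while an in-range target is left untouched.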
 In the flowchart of FIG. 5, the order of the operation of step ST502 and the operation of step ST503 is not limited. The operation of step ST502 and the operation of step ST503 may be performed in parallel.
 As described above, in the touch-operable display device 2, the display control device 1 according to Embodiment 1 can control an operation target that the operator of the display device 2 desires to touch-operate so that the operation target is displayed at a position where the operator can perform a touch operation, even when the initial display position of the operation target is a position where the operator cannot perform a touch operation. Specifically, for an icon for executing a function designated by remote operation on the touch-operable display device 2, the display control device 1 according to Embodiment 1 can control the icon so that it is displayed at a position where the operator can perform a touch operation, even when the initial display position of the icon is a position where the operator cannot perform a touch operation.
 In the above-described conventional technique of moving the position of an input icon close to the finger, the display object that the operator desires to touch-operate is not necessarily moved close to the finger.
 Further, in this conventional technique, even operation targets that the operator does not desire are moved close to the operator's finger. As a result, even if the operation target desired by the operator is moved close to the finger, a problem may arise in which the operator erroneously operates an undesired input icon.
 In contrast, the display control device 1 according to Embodiment 1 can control an operation target that the operator of the display device 2 desires to touch-operate so that the operation target is displayed at a position where the operator can perform a touch operation, even when the initial display position of the operation target is a position where the operator cannot perform a touch operation.
 As described above, according to Embodiment 1, the display control device 1 is configured to include: the acquisition unit 11 that acquires remote operation information indicating that a function executable via the touch-operable display device 2 has been designated by remote operation; the operation target identification unit 12 that, based on the remote operation information acquired by the acquisition unit 11, identifies the operation target to be touch-operated by the operator of the display device 2 and identifies the initial display position of the operation target; the operation state identification unit 13 that identifies the operation state of the operator who performed the remote operation; the determination unit 14 that determines, based on the operation state of the operator identified by the operation state identification unit 13, whether the initial display position of the operation target identified by the operation target identification unit 12 is within the operable range on the display screen of the display device 2 in which the operator can perform a touch operation; and the display control unit 15 that, when the determination unit 14 determines that the initial display position of the operation target is not within the operable range, controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range.
 Therefore, in the touch-operable display device 2, an operation target that the operator of the display device 2 desires to touch-operate can be displayed at a position where the operator can perform a touch operation.
Embodiment 2.
 In Embodiment 1, one or more icons are displayed on the display device 2 as display objects, and the display control device 1 identifies, as the operation target, the icon for executing the function designated by remote operation among the one or more icons displayed on the display device 2.
 In Embodiment 2, an embodiment will be described in which the operation target identified by the display control device 1a is a display object that is displayed on the display device 2 after an icon is operated.
 In Embodiment 2, as in Embodiment 1, one or more icons for executing functions are displayed on the display device 2 as display objects (see FIG. 2).
 Further, in Embodiment 2, for example, when any one of the icons displayed on the display device 2 is operated, the function associated with that icon is executed, and a display object corresponding to the executed function is displayed. For example, when the Audio icon 203 is operated, an audio menu screen is displayed. Further, for example, when the audio menu screen is operated, a volume setting screen may be displayed. Note that "when a display object is operated" means "when a determination operation is performed on the display object", and the determination operation is a touch operation or the like.
 As described above, in Embodiment 2, a certain display object and the display object to be displayed next when that display object is operated are determined in advance. That is, the display objects are defined in a hierarchical structure. In the above example, the audio menu screen is defined in the layer below the Audio icon 203. Note that the hierarchical structure has two or more layers. For example, when a function associated with a certain icon is executed, among the display objects defined in the plurality of layers below that icon, the display object of the layer corresponding to the executed function is displayed on the display device 2.
 In Embodiment 2, a display object displayed when a function is executed is referred to as a "response display object".
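The hierarchical structure described above can be modeled as a nested mapping. The dictionary layout and lookup helper below are illustrative assumptions; only the object names are taken from the text:

```python
# Illustrative hierarchy: each key's children are the display objects
# defined one layer below it, shown after a determination operation.
DISPLAY_HIERARCHY = {
    "Audio icon 203": {
        "audio menu screen": {
            "volume setting screen": {},
        },
    },
}

def next_display_objects(hierarchy, current):
    """Return the display objects defined in the layer directly below
    `current`, i.e. what may be displayed after `current` is operated.
    Returns None if `current` is not found in the hierarchy."""
    if current in hierarchy:
        return list(hierarchy[current].keys())
    for children in hierarchy.values():
        found = next_display_objects(children, current)
        if found is not None:
            return found
    return None
```

Operating the Audio icon 203 thus leads to the audio menu screen, and operating the audio menu screen leads to the volume setting screen, matching the two-or-more-layer structure described above.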
 Also in Embodiment 2, as in Embodiment 1, the input device 3 is, for example, an array microphone. The operator uses the array microphone to designate, by voice, the function to be executed.
 FIG. 6 is a diagram showing a configuration example of the display control device 1a according to Embodiment 2.
 Regarding the display control device 1a according to Embodiment 2, the same components as those of the display control device 1 described with reference to FIG. 1 in Embodiment 1 are denoted by the same reference numerals, and duplicate description is omitted. The configuration of the display control device 1a according to Embodiment 2 differs from that of the display control device 1 according to Embodiment 1 in that the display control unit 15a includes a response generation unit 152. Further, in the display control device 1a according to Embodiment 2, the specific operations of the operation target identification unit 12a, the initial display position identification unit 121a, the determination unit 14a, the display control unit 15a, and the display position determination unit 151a differ from those of the operation target identification unit 12, the initial display position identification unit 121, the determination unit 14, the display control unit 15, and the display position determination unit 151 in the display control device 1 according to Embodiment 1, respectively.
 The operation of the operation target identification unit 12a is basically the same as that of the operation target identification unit 12 described in Embodiment 1.
 The difference is that, whereas in Embodiment 1 the operation target identified by the operation target identification unit 12 is an icon displayed on the display device 2 for executing the function designated by remote operation, in Embodiment 2 the operation target identified by the operation target identification unit 12a is the response display object displayed on the display device 2 in accordance with the execution of a function when a remote operation designating the execution of that function is performed.
 For example, as in the example given in Embodiment 1, suppose that, with the TEL icon 201, the Search icon 202, and the Audio icon 203 displayed on the display device 2 as shown in FIG. 2, the operator utters "operate the audio". At this time, the driver's seat in which the operator is seated is the left seat, and the operator cannot touch-operate the Audio icon 203. The acquisition unit 11 acquires, from the array microphone, a voice signal based on the utterance "operate the audio" as remote operation information, and outputs the remote operation information to the operation target identification unit 12a and the operation state identification unit 13.
 Based on the voice signal output from the acquisition unit 11, the operation target identification unit 12a identifies, as the operation target, the response display object to be displayed on the display device 2 in accordance with the executed function when the function designated by remote operation is executed, and identifies the initial display position of the operation target.
 Specifically, the operation target identification unit 12a first performs voice recognition processing on the voice signal using a well-known voice recognition technique. Then, the operation target identification unit 12a identifies the operation target by referring to the target identification information based on, for example, the voice recognition result. In Embodiment 2, the target identification information associates at least a keyword, identification information of an icon, identification information of the function executed when the icon is operated, and identification information of the response display object displayed when that function is executed. The identification information of the function may be any information that can identify the function. Similarly, the identification information of the response display object may be any information that can identify the response display object.
 FIG. 7 is a diagram for explaining an image of an example of the target identification information referred to by the operation target identification unit 12a when identifying the operation target in Embodiment 2.
 For example, in the example given in Embodiment 1, the operation target identification unit 12a first performs voice recognition processing on the voice signal indicating "operate the audio" and obtains the keyword "audio" as the voice recognition result. The operation target identification unit 12a refers to the target identification information and identifies the "audio menu screen", which is the response display object associated with "audio", as the operation target.
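The keyword lookup described above can be sketched as a table modeled after the association in the target identification information (keyword, icon, function, response display object). The concrete table entries and names are illustrative assumptions:

```python
# Target identification information, sketched:
# keyword -> (icon id, function id, response display object id).
TARGET_IDENTIFICATION_INFO = {
    "audio": ("Audio icon 203", "audio menu display", "audio menu screen"),
    "phone": ("TEL icon 201", "phone menu display", "phone menu screen"),
}

def identify_operation_target(recognized_keyword):
    """Operation target identification unit 12a, sketched: map the
    keyword obtained by voice recognition to the response display
    object shown when the associated function is executed."""
    icon_id, function_id, response_object_id = (
        TARGET_IDENTIFICATION_INFO[recognized_keyword]
    )
    return response_object_id
```

With the utterance "operate the audio", the recognized keyword "audio" resolves to the "audio menu screen" as the operation target.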
 Then, the operation target identification unit 12a identifies the initial display position of the operation target on the display screen of the display device 2 based on the information on the identified operation target. Specifically, the initial display position identification unit 121a of the operation target identification unit 12a identifies the initial display position of the operation target. The operation of the initial display position identification unit 121a is basically the same as that of the initial display position identification unit 121 described in Embodiment 1.
 The difference is that, whereas in Embodiment 1 the operation target whose initial display position is identified by the initial display position identification unit 121 is an icon, in Embodiment 2 the operation target whose initial display position is identified by the initial display position identification unit 121a is a response display object.
 The initial display position identification unit 121a identifies the initial display position of the operation target, in other words, the response display object, by referring to the initial display position identification information based on, for example, the information on the operation target. In Embodiment 2, the initial display position identification information defines at least identification information of each response display object that can be displayed on the display device 2 in association with information on the initial display position of that response display object.
 In the above example, the initial display position identification unit 121a identifies the initial display position of the audio menu screen.
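The initial display position identification information can likewise be sketched as a lookup table from response display object to a predefined position. The coordinates and names below are illustrative assumptions:

```python
# Initial display position identification information, sketched:
# response display object id -> initial display position
# (here, center coordinates on the display screen, in pixels).
INITIAL_DISPLAY_POSITION_INFO = {
    "audio menu screen": (960, 240),
    "phone menu screen": (960, 240),
}

def identify_initial_display_position(response_object_id):
    """Initial display position identification unit 121a, sketched:
    resolve a response display object to its defined initial
    display position."""
    return INITIAL_DISPLAY_POSITION_INFO[response_object_id]
```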
 FIG. 8 is a diagram for explaining an image of an example of the initial display position of the audio menu screen 801 in Embodiment 2. The initial display position identification unit 121a identifies the display position of the audio menu screen 801 shown in FIG. 8 as the initial display position of the audio menu screen 801. Note that, for convenience, FIG. 8 shows a state in which the audio menu screen 801 is displayed at the initial display position, but the audio menu screen 801 is not a display object displayed in the initial state of the display device 2. The audio menu screen 801 is displayed on the display device 2 after the icon for executing the audio function is operated. Specifically, when the function "audio menu display" associated with the Audio icon 203 for executing the audio function is executed, the audio menu screen 801 is displayed on the display device 2 as a result of the execution of that "audio menu display".
 The operation target identification unit 12a outputs the identified information on the operation target and the information on the initial display position of the operation target to the determination unit 14a and the display control unit 15a.
 The operation of the determination unit 14a is basically the same as that of the determination unit 14 described in Embodiment 1.
 The difference is that, whereas in Embodiment 1 the operation target for which the determination unit 14 determines whether the initial display position is within the operable range is an icon, in Embodiment 2 the operation target for which the determination unit 14a makes this determination is a response display object. Note that, in Embodiment 2, the specific operation by which the operation range determination unit 141a determines the operable range is the same as that of the operation range determination unit 141 described in Embodiment 1, and duplicate description is therefore omitted.
 The determination unit 14a determines whether the initial display position of the operation target, in other words, the response display object, is within the operable range, based on the information on the initial display position of the operation target output from the operation target identification unit 12a and the information on the operable range determined by the operation range determination unit 141a.
 In the above example, the determination unit 14a determines whether the center coordinates of the audio menu screen 801 when displayed at the initial display position (see FIG. 8) are within the operable range (see 301 in FIG. 8).
 In the above example, the determination unit 14a outputs, to the display control unit 15a, range determination result information indicating that the center coordinates of the initial display position of the audio menu screen are not within the operable range.
 In Embodiment 2, the display control unit 15a includes a display position determination unit 151a and a response generation unit 152.
 In the display control unit 15a, the response generation unit 152 generates response display information for displaying the response display object based on the information on the operation target output from the operation target identification unit 12a. In the above example, the response generation unit 152 generates response display information for displaying the "audio menu screen".
 When the determination unit 14a determines that the initial display position of the operation target, in other words, the response display object, is not within the operable range, the display control unit 15a controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range.
 Specifically, in the display control unit 15a, the display position determination unit 151a determines the display position of the operation target on the display device 2 based on the information on the initial display position of the operation target output from the operation target identification unit 12a and the range determination result information output from the determination unit 14a.
 When the range determination result information indicates that the initial display position of the operation target is not within the operable range, in other words, when the determination unit 14a has determined that the initial display position of the operation target is not within the operable range, the display position determination unit 151a determines a display position at which the operation target is displayed within the operable range.
 In the above example, since the determination unit 14a outputs range determination result information indicating that the center coordinates of the initial display position of the audio menu screen 801 are not within the operable range (see 301 in FIG. 8), the display position determination unit 151a determines a display position of the audio menu screen 801 at which the audio menu screen 801 is displayed within the operable range.
 Also in Embodiment 2, as in Embodiment 1, the state in which the operation target is displayed within the operable range means at least the state in which the touch operation effective area is displayed within the operable range. For example, when the audio menu screen 801 is a display object containing a plurality of display objects such as buttons to be touch-operated by the operator, the display position determination unit 151a determines, as the display position of the audio menu screen 801, a position at which the touch operation effective areas of all the display objects included in the audio menu screen 801 are within the operable range.
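The all-buttons-within-range condition described above can be sketched as a containment test over every contained touch operation effective area. The (left, top, right, bottom) rectangle convention and all names are illustrative assumptions:

```python
def position_is_operable(button_touch_areas, offset, operable_rect):
    """Check whether, with the response display object placed at `offset`
    (its top-left corner), the touch operation effective area of every
    button it contains lies within the operable range."""
    ox, oy = offset
    o_left, o_top, o_right, o_bottom = operable_rect
    for left, top, right, bottom in button_touch_areas:
        # Translate each button's touch area by the screen's offset and
        # require full containment in the operable rectangle.
        if not (o_left <= left + ox and right + ox <= o_right
                and o_top <= top + oy and bottom + oy <= o_bottom):
            return False
    return True
```

A display position determination unit could iterate over candidate offsets and accept only those for which this check passes.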
 On the other hand, when the range determination result information indicates that the initial display position of the response display object is within the operable range, in other words, when the determination unit 14a has determined that the initial display position of the response display object is within the operable range, the display position determination unit 151a sets the initial display position of the response display object as the display position of the response display object. The display position determination unit 151a may determine the initial display position of the response display object based on the information on the initial display position of the response display object output from the operation target identification unit 12a.
 The display control unit 15a controls the display of the response display object on the display device 2, based on the information on the display position of the operation target determined by the display position determination unit 151a and the response display information generated by the response generation unit 152, so that the operation target, in other words, the response display object, is displayed at the display position determined by the display position determination unit 151a. Specifically, for example, based on the response display information, the display control unit 15a displays the response display object so that, when the response display object is displayed on the display device 2, the coordinates of the center of the response display object coincide with the center coordinates determined by the display position determination unit 151a. The response display information includes information such as the size of the response display object; based on the response display information, the display control unit 15a can therefore identify the coordinates of the display position of the response display object at which the center of the displayed response display object coincides with the center coordinates determined by the display position determination unit 151a.
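The coordinate computation described above, deriving the drawing position from the decided center and the size carried in the response display information, amounts to the following arithmetic (function name and tuple conventions are illustrative):

```python
def display_origin_from_center(center, size):
    """Sketch of the display control unit 15a's computation: given the
    center coordinates decided by the display position determination
    unit and the response display object's size from the response
    display information, derive the top-left drawing origin."""
    cx, cy = center
    width, height = size
    return (cx - width / 2, cy - height / 2)
```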
 FIG. 9 is a diagram showing an image of an example of the display screen of the display device 2 after, in Embodiment 2, the display position determination unit 151a has determined the display position of the audio menu screen 801 so that the audio menu screen 801 is displayed within the operable range and the display control unit 15a has displayed the audio menu screen 801 at the position determined by the display position determination unit 151a.
 As shown in FIG. 9, because the display control unit 15a has displayed the audio menu screen 801, based on the display position determined by the display position determination unit 151a, at the position shown in FIG. 9 rather than at the initial display position of the audio menu screen 801 shown in FIG. 8, the driver can touch-operate the audio menu screen 801.
 The operation of the display control device 1a according to Embodiment 2 will be described.
 FIG. 10 is a flowchart for explaining the operation of the display control device 1a according to Embodiment 2.
 The acquisition unit 11 acquires remote operation information indicating that a function executable via the touch-operable display device 2 has been designated by remote operation (step ST1001). The specific operation of step ST1001 is the same as that of step ST501 described with reference to FIG. 5 in Embodiment 1.
 The acquisition unit 11 outputs the acquired remote operation information to the operation target identification unit 12a and the operation state identification unit 13.
The operation target identification unit 12a identifies the operation target to be touch-operated by the operator of the display device 2 based on the remote control information acquired by the acquisition unit 11 in step ST1001, and also specifies the initial display position of the operation target (step ST1002).
The operation target identification unit 12a outputs the identified information on the operation target and the information on the initial display position of the operation target to the determination unit 14a and the display control unit 15a.
The operation state specifying unit 13 specifies the operation state of the operator based on the remote control information output from the acquisition unit 11 in step ST1001 (step ST1003). The specific operation of step ST1003 is the same as the specific operation of step ST503 described with reference to FIG. 5 in the first embodiment.
The operation state specifying unit 13 outputs information on the specified operation state, in other words, information on the specified position of the seat in which the operator is seated, to the determination unit 14a.
Based on the operation state of the operator specified by the operation state specifying unit 13 in step ST1003, the determination unit 14a determines whether or not the initial display position of the operation target identified by the operation target identification unit 12a in step ST1002 is within the operable range (step ST1004).
The determination unit 14a outputs range determination result information indicating whether or not the initial display position of the operation target is within the operable range to the display control unit 15a.
When the determination unit 14a determines in step ST1004 that the initial display position of the operation target, in other words, the response display object, is not within the operable range, the display control unit 15a controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range (step ST1005).
Based on the information on the display position of the operation target determined by the display position determination unit 151a and the response display information generated by the response generation unit 152, the display control unit 15a controls the display of the response display object on the display device 2 so that the operation target, in other words, the response display object, is displayed at the display position determined by the display position determination unit 151a.
In the flowchart of FIG. 10, the order of the operation of step ST1002 and the operation of step ST1003 does not matter. The operation of step ST1002 and the operation of step ST1003 may be performed in parallel.
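The flow of FIG. 10 (steps ST1001 to ST1005) can be illustrated with a minimal Python sketch. All function names here are hypothetical illustrations, not names from the embodiment; the decision logic is passed in as callables because the embodiment leaves the concrete identification and determination methods open.

```python
# Minimal sketch of the flow of FIG. 10 (steps ST1001 to ST1005).
# All names are hypothetical; the callables stand in for the operation
# target identification unit, the operation state specifying unit, the
# determination unit, and the display position determination unit.
def control_display(remote_op_info, identify_target, identify_state,
                    within_operable_range, choose_position):
    # ST1002: identify the operation target and its initial display position
    target, initial_pos = identify_target(remote_op_info)
    # ST1003: specify the operator's operation state (e.g. seat position);
    # ST1002 and ST1003 may run in either order, or in parallel
    operator_state = identify_state(remote_op_info)
    # ST1004: is the initial display position within the operable range?
    if within_operable_range(initial_pos, operator_state):
        return target, initial_pos  # no repositioning needed
    # ST1005: display the target at a position within the operable range
    return target, choose_position(operator_state)
```

The sketch returns the target unchanged when its initial position is already reachable, mirroring the branch at step ST1004.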
As described above, in the touch-operable display device 2, the display control device 1a according to the second embodiment can control an operation target that the operator of the display device 2 desires to touch so that the operation target is displayed within the range in which the operator can perform a touch operation, even if the initial display position of the operation target is a position where the operator cannot perform a touch operation. Specifically, for a response display object that is displayed when a function designated by remote control is executed on the touch-operable display device 2 and that the operator desires to touch, the display control device 1a according to the second embodiment can control the response display object to be displayed at a position where the operator can perform a touch operation, even if the initial display position of the response display object is a position where the operator cannot perform a touch operation.
In the display control device 1 according to the first embodiment, the operation target is an icon displayed on the display device 2 for executing a function designated by remote control; when execution of the function associated with the icon is designated by remote control, the icon is displayed at a position where the operator can perform a touch operation.
In this case, even though the operator has designated the function to be executed by remote control, the operator must touch the icon again to instruct execution of the function associated with the icon. Only after the operator touches the icon again is the response display object displayed that should appear when the function designated by remote control is executed.
In the above example, the operator designates the audio function to be executed by saying "operate the audio", and then executes the audio function by touching the Audio icon 203 again.
That is, before the audio function is executed, the operator needs to both designate the audio function by remote control and instruct execution of the audio function by touch operation. Even though the audio function to be executed has already been designated, waiting for the Audio icon 203 to be displayed at a touch-operable position and then touching the Audio icon 203 to execute the audio function is, for the operator, a duplicated effort.
In contrast, as described above, when a remote control operation designating execution of a function is performed, the display control device 1a according to the second embodiment displays the response display object that should be displayed on the display device 2 in response to execution of that function at a display position where the operator can perform a touch operation.
Specifically, for example, when a remote control operation designating execution of the function "display the audio menu" is performed, the display control device 1a displays the "audio menu screen" that should be displayed on the display device 2 in response to execution of that function at a display position where the operator can perform a touch operation. The operator can touch the "audio menu screen", which is displayed when the audio function is executed by a touch operation on the Audio icon 203, without waiting for the Audio icon 203 associated with the "display the audio menu" function to be displayed at a touch-operable position and then touching the Audio icon 203.
As described above, when a remote control operation designating execution of a function is performed, the display control device 1a according to the second embodiment displays the response display object that should be displayed on the display device 2 in response to execution of that function at a display position where the operator can perform a touch operation. As a result, the display control device 1a can control the display of display objects on the display device 2 so that the operator can execute a desired function without performing the double operation of remote control and touch operation, and can promptly touch the response display object that results from execution of that function.
As described above, according to the second embodiment, the display control device 1a includes: the acquisition unit 11 that acquires remote control information indicating that a function that can be executed via the touch-operable display device 2 has been designated by remote control; the operation target identification unit 12a that, based on the remote control information acquired by the acquisition unit 11, identifies the operation target to be touch-operated by the operator of the display device 2 and specifies the initial display position of the operation target; the operation state specifying unit 13 that specifies the operation state of the operator who performed the remote control; the determination unit 14a that, based on the operation state of the operator specified by the operation state specifying unit 13, determines whether or not the initial display position of the operation target identified by the operation target identification unit 12a is within the operable range, on the display screen of the display device 2, in which the operator can perform a touch operation; and the display control unit 15a that, when the determination unit 14a determines that the initial display position of the operation target is not within the operable range, controls the display of the operation target on the display device 2 so that the operation target is displayed within the operable range.
Therefore, in the touch-operable display device 2, even if the initial display position of an operation target that the operator of the display device 2 desires to touch is a position where the operator cannot perform a touch operation, the operation target can be controlled to be displayed within the range in which the operator can perform a touch operation.
Further, in the display control device 1a according to the second embodiment, an icon for executing a function can be displayed on the display device 2, and the operation target is a response display object that is displayed on the display device 2 after the icon is operated. That is, when the display control device 1a determines that the initial display position of the response display object is not within the operable range, it controls the display of the response display object on the display device 2 so that the response display object is displayed within the operable range. As a result, the display control device 1a can control the display of display objects on the display device 2 so that the operator can execute a desired function without performing the double operation of remote control and touch operation, and can promptly touch the response display object that results from execution of that function.
In the first and second embodiments described above, the operation state specifying unit 13 of the display control devices 1 and 1a specifies the operation state of the operator based on the voice signal output from the array microphone serving as the input device 3, but this is only an example.
For example, the operation state specifying unit 13 may specify the operation state based on a captured image of the vehicle interior.
In this case, the acquisition unit 11 of the display control devices 1 and 1a is connected to an imaging device (not shown) that images the vehicle interior, and acquires the captured image from the imaging device. The imaging device is, for example, a visible light camera or an infrared camera. The imaging device may be shared with the imaging device of a so-called "driver monitoring system" mounted on the vehicle to monitor the state of the driver.
The operation state specifying unit 13 calculates the degree of mouth opening of each occupant of the vehicle based on the captured image acquired by the acquisition unit 11. The operation state specifying unit 13 may perform image recognition processing using a well-known image recognition technique to calculate the degree of mouth opening of each occupant. The operation state specifying unit 13 regards an occupant whose calculated degree of mouth opening is equal to or greater than a preset threshold value (hereinafter referred to as the "opening degree determination threshold value") as the operator, and specifies the position of the seat in which that operator is seated as the position of the operator.
As described above, when the operation state specifying unit 13 specifies the operation state of the operator based on the captured image, the microphone serving as the input device 3 need not be an array microphone.
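The opening-degree-based identification of the operator described above can be sketched as follows. The threshold value and the data shape are illustrative assumptions, not values from the embodiment.

```python
# Hypothetical sketch: regard the occupant whose degree of mouth opening is
# at or above the opening degree determination threshold as the operator,
# and take that occupant's seat position as the operator's position.
OPENING_DEGREE_THRESHOLD = 0.5  # illustrative value, not from the embodiment

def identify_operator_seat(occupants):
    """occupants: iterable of (seat_position, opening_degree) pairs obtained
    by image recognition on the captured image of the vehicle interior."""
    for seat, opening_degree in occupants:
        if opening_degree >= OPENING_DEGREE_THRESHOLD:
            return seat  # operator's seat position = operator's position
    return None  # no occupant appears to be speaking
```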
Further, in the first and second embodiments described above, the operator designates the function to be executed by voice, but this is only an example.
For example, the operator may designate the function to be executed by a gesture. In this case, the input device 3 is an imaging device as described above.
The acquisition unit 11 of the display control devices 1 and 1a acquires the captured image as the remote control information, and the operation target identification units 12 and 12a identify the operation target based on the captured image acquired by the acquisition unit 11.
Specifically, for example, the gesture performed by the operator is a pointing action toward the icon associated with the function to be executed. In the display control device 1 according to the first embodiment, the operation target identification unit 12 performs image recognition processing using a well-known image recognition technique and specifies the content of the operator's pointing action and the pointing direction, thereby identifying the icon pointed at by the operator as the operation target. Also, for example, in the display control device 1a according to the second embodiment, the operation target identification unit 12a performs image recognition processing using a well-known image recognition technique and specifies the operator's pointing action and pointing direction, thereby identifying the icon pointed at by the operator, and further identifies, from the identified icon, the response display object associated with that icon as the operation target.
The gesture performed by the operator is not limited to a pointing action. For example, gestures such as holding up a hand may be determined in advance according to the functions to be executed, and the operator may perform the gesture corresponding to the function the operator wants to execute. The operation target identification units 12 and 12a perform image recognition processing using a well-known image recognition technique, identify the gesture from the captured image, and identify the predetermined function associated with that gesture. The operation target identification units 12 and 12a then identify the operation target from the identified function.
When the input device 3 is an imaging device and the operator designates the function to be executed by a gesture, the operation state specifying unit 13 of the display control devices 1 and 1a specifies the operation state of the operator based on the captured image. Specifically, the operation state specifying unit 13 performs image recognition processing using, for example, a well-known image recognition technique, identifies the operator who is performing the gesture, and takes the position of the seat in which that operator is seated as the operation state of the operator.
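The gesture-based identification of the operation target can be illustrated with a small lookup sketch: a recognized gesture maps to a predetermined function, and, as in the second embodiment, the function maps to the response display object that becomes the operation target. All table entries are illustrative assumptions.

```python
# Hypothetical sketch: gesture -> predetermined function -> operation target
# (the response display object, as in the second embodiment). The gestures,
# functions, and screen names below are illustrative, not from the embodiment.
GESTURE_TO_FUNCTION = {
    "point_at_audio_icon": "audio",  # pointing action toward the Audio icon
    "hold_up_hand": "map",           # a predetermined non-pointing gesture
}
FUNCTION_TO_RESPONSE_DISPLAY = {
    "audio": "audio_menu_screen",
    "map": "map_screen",
}

def identify_operation_target(gesture):
    function = GESTURE_TO_FUNCTION.get(gesture)
    if function is None:
        return None  # gesture not recognized as designating a function
    return FUNCTION_TO_RESPONSE_DISPLAY[function]
```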
Further, in the first and second embodiments described above, for example, the input device 3 may be a remote controller, and the operator may designate the function to be executed with the remote controller. When the operator designates a function to be executed by operating the remote controller, the acquisition unit 11 acquires, from the remote controller, information indicating the designated function (hereinafter referred to as "remote controller information") as the remote control information.
The operation target identification units 12 and 12a of the display control devices 1 and 1a identify the operation target based on the remote controller information acquired by the acquisition unit 11.
In this case, the display control devices 1 and 1a are, for example, connected to an imaging device as described above, and the operation state specifying unit 13 acquires the captured image from the imaging device via the acquisition unit 11. The operation state specifying unit 13 then specifies the operation state of the operator based on the acquired captured image. Specifically, the operation state specifying unit 13 performs image recognition processing using, for example, a well-known image recognition technique, identifies the operator who is operating the remote controller, and takes the position of the seat in which that operator is seated as the operation state of the operator.
In the above description, the input device 3 is an array microphone, a microphone, an imaging device, or a remote controller, and the voice signal output from the array microphone or the microphone, the captured image output from the imaging device, or the remote controller information output from the remote controller may each serve as the remote control information; however, this is only an example.
The acquisition unit 11 may, for example, acquire any one or more of the voice signal, the captured image, and the remote controller information as the remote control information.
Further, in the first and second embodiments described above, when the operation state specifying unit 13 specifies the position of the operator as the operation state of the operator based on the captured image, the position of the operator in the space inside the vehicle can also be taken as the position of the operator.
When the operation state specifying unit 13 specifies the position of the operator by analyzing the voice signal and specifying the direction of the source of the voice signal, strictly speaking, it cannot specify the exact position of the operator inside the vehicle. Therefore, in the first and second embodiments described above, the position of the seat in which the operator is seated was taken as the position of the operator, based on the direction of the source of the voice signal.
In contrast, based on the captured image, the operation state specifying unit 13 can specify the position of the operator in the space inside the vehicle. For example, the operation state specifying unit 13 may specify the face position of the operator in the captured image based on the captured image, and take that face position as the position of the operator in the space inside the vehicle. Since the imaging range of the imaging device is known in advance, the operation state specifying unit 13 can specify the position of the operator in the space inside the vehicle if it can specify the face position of the operator in the captured image.
By taking the operator's position in the space inside the vehicle as the position of the operator, rather than the position of the seat in which the operator is seated, the operation range determination units 141 and 141a of the determination units 14 and 14a can specify the operator's operable range more accurately when determining the operable range. As a result, the display control units 15 and 15a can determine the display position of the operation target and control the display of the operation target so that the operator can perform a touch operation more reliably. In this case, for example, the range specifying information referred to by the operation range determination units 141 and 141a when specifying the operable range associates spatial positions inside the vehicle with operable ranges.
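The mapping from a face position in the captured image to a cabin position, and from there to an operable range, can be sketched as below. The linear camera model, the dimensions, and the lookup shape are illustrative assumptions only.

```python
# Hypothetical sketch: map the operator's face position in the captured
# image to a position in the cabin (the imaging range is known in advance),
# then look up the operable range from range specifying information that
# associates cabin positions with operable ranges. Numbers are illustrative.
def face_to_cabin_position(face_x_px, image_width_px, cabin_width_mm):
    # simple linear mapping, assuming the camera covers the full cabin width
    return face_x_px / image_width_px * cabin_width_mm

def look_up_operable_range(cabin_x_mm, range_info):
    # range_info: list of ((x_min_mm, x_max_mm), operable_range) entries
    for (x_min, x_max), operable in range_info:
        if x_min <= cabin_x_mm < x_max:
            return operable
    return None  # position outside all defined buckets
```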
Further, in the first and second embodiments described above, when the operation state specifying unit 13 specifies the operation state of the operator based on the captured image, it may specify the position of the operator and the posture of the operator as the operation state of the operator. The posture of the operator is, for example, the orientation of the operator's body. The orientation of the operator's body is expressed, for example, as the amount of angular change from a reference orientation, where the front is the traveling direction of the vehicle and the reference orientation is the state in which the operator is seated facing straight toward that front. As a specific example, the angle is 0 degrees when the operator faces the reference orientation, a positive angle proportional to the turn when the operator turns to the right of the front, and a negative angle proportional to the turn when the operator turns to the left of the front. The operation state specifying unit 13 specifies the position of the operator and the angle described above as the operation state of the operator.
The operation range determination units 141 and 141a determine the operable range based on the position and posture of the operator specified by the operation state specifying unit 13. Specifically, for example, the operation range determination units 141 and 141a shift the operable range associated with the position of the operator to the right or left on the display screen of the display device 2 according to the posture of the operator. How far the operation range determination units 141 and 141a shift the operable range for a given posture is determined in advance.
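The posture-dependent shift of the operable range can be sketched as follows; the pixels-per-degree factor is an illustrative stand-in for the predetermined relationship between posture and shift amount.

```python
# Hypothetical sketch: shift the operable range on the display screen
# according to the operator's body orientation. The angle is 0 degrees at
# the reference orientation, positive when turned right, negative when
# turned left. The pixels-per-degree factor is an illustrative assumption.
SHIFT_PX_PER_DEGREE = 2

def shift_operable_range(base_range, body_angle_deg):
    left, right = base_range
    shift = int(body_angle_deg * SHIFT_PX_PER_DEGREE)
    return (left + shift, right + shift)
```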
The posture of the operator may also be, for example, the amount of recline of the seat in which the operator is seated.
The operation state specifying unit 13 may calculate the amount of recline of the seat in which the operator is seated based on, for example, the captured image. Alternatively, the operation state specifying unit 13 may acquire information on the amount of recline of the seat in which the operator is seated from, for example, a sensor installed in the seat. The operation state specifying unit 13 specifies the position of the operator and the amount of recline calculated based on the captured image, or the position of the operator and the amount of recline acquired from the sensor, as the operation state of the operator.
The operation range determination units 141 and 141a determine the operable range based on the position and posture of the operator specified by the operation state specifying unit 13. Specifically, for example, the operation range determination units 141 and 141a shift the operable range associated with the position of the operator to the right or left on the display screen of the display device 2 according to the posture of the operator, and also shrink it. How far the operation range determination units 141 and 141a shift the operable range for a given posture, and how much they shrink it, is determined in advance.
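The recline-dependent shift and shrink can be sketched as below; both per-degree factors are illustrative stand-ins for the predetermined shift and shrink amounts.

```python
# Hypothetical sketch: when the operator's posture is the amount of seat
# recline, shift the operable range and also shrink it, since a reclined
# operator can reach less of the screen. Both per-degree factors are
# illustrative assumptions standing in for predetermined values.
SHIFT_PX_PER_TILT_DEGREE = 3
SHRINK_PX_PER_TILT_DEGREE = 4

def adjust_operable_range_for_tilt(base_range, tilt_deg):
    left, right = base_range
    left += int(tilt_deg * SHIFT_PX_PER_TILT_DEGREE)
    right -= int(tilt_deg * SHRINK_PX_PER_TILT_DEGREE)
    return (left, right) if left < right else None  # None: nothing reachable
```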
Further, the first and second embodiments described above assumed that the display device 2 is installed in the vehicle further forward than the driver's seat and the passenger seat, which are located at the front with respect to the traveling direction of the vehicle; however, the installation position of the display device 2 assumed in the first and second embodiments is only an example.
The display device 2 may, for example, be installed on the right or left side surface of the vehicle cabin with respect to the traveling direction of the vehicle, or may be installed on the ceiling of the vehicle cabin.
A plurality of display devices 2 may also be installed in the vehicle. Specifically, for example, display devices 2 may be installed in the vehicle further forward than the driver's seat and the passenger seat, which are located at the front with respect to the traveling direction of the vehicle, and on both side surfaces of the vehicle cabin.
When a plurality of display devices 2 are installed in the vehicle and the operation range determination units 141, 141a determine the operable range based on the operator's position and posture specified by the operation state specifying unit 13, the operation range determination units 141, 141a may first select, from among the plurality of display devices 2, the display device 2 that the operator will touch-operate, and then determine the operable range on the selected display device 2.
For example, suppose the operator's posture is the orientation of the operator's body, and the operation state specifying unit 13 specifies that the operator is seated in the driver's seat at the "right seat" position with a posture of plus 90 degrees. Suppose also that display devices 2 are installed in front of the driver's seat with respect to the traveling direction and on both side surfaces of the vehicle interior.
In this case, the operation range determination units 141, 141a first select the display device 2 installed on the right side surface of the vehicle interior. For example, first display device selection information, which associates information on the operator's position, information on the operator's posture, and the display device 2 (among the plurality of display devices 2) that the operator is determined to touch-operate, is generated in advance and stored in the storage unit. Here, suppose that in the first display device selection information, the operator position "right seat" combined with the operator posture "plus 90 degrees" is associated with the display device 2 installed on the right side surface. Based on the position "right seat" and the posture "plus 90 degrees" specified by the operation state specifying unit 13, the operation range determination units 141, 141a refer to the first display device selection information and select the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction.
The operation range determination units 141, 141a then determine the operable range on the selected display device 2 installed on the right side surface. For example, the first display device selection information may define, in addition to the operator's position, the operator's posture, and the display device 2, the operable range on that display device 2; the operation range determination units 141, 141a may then determine the operable range by referring to the first display device selection information.
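The lookup described above can be sketched as a simple table keyed by operator position and posture. All identifiers, coordinate values, and display names below are illustrative assumptions for this sketch, not part of the embodiment:

```python
# Hypothetical sketch of the "first display device selection information":
# a pre-built table associating (operator position, body orientation)
# with a display device and its operable range. Values are invented
# for illustration only.
FIRST_SELECTION_TABLE = {
    # (seat position, orientation in degrees) -> (display id, operable range)
    ("right_seat", 90):  ("right_side_display", {"x": (0, 400), "y": (100, 500)}),
    ("right_seat", 0):   ("front_display",      {"x": (200, 600), "y": (0, 300)}),
    ("left_seat", -90):  ("left_side_display",  {"x": (0, 400), "y": (100, 500)}),
}

def select_display_and_range(position, orientation):
    """Select the display the operator is assumed to touch-operate,
    together with its operable range, from the pre-stored selection
    information; return None for combinations not in the table."""
    return FIRST_SELECTION_TABLE.get((position, orientation))
```

With such a table, the "right seat" position and "plus 90 degrees" posture from the example resolve directly to the right side-surface display and its operable range in a single lookup.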
In the above example with a plurality of display devices 2 installed in the vehicle, the operator's posture may instead be the amount of tilt of the seat in which the operator is seated.
In this case, for example, second display device selection information, which associates the operator's position, the tilt amount of the seat in which the operator is seated, and the display device 2 (among the plurality of display devices 2) that the operator is determined to touch-operate, is generated in advance and stored in the storage unit. The operation range determination units 141, 141a refer to the second display device selection information to select the display device 2 that the operator will touch-operate, and then determine the operable range on the selected display device 2.
For example, suppose the operation state specifying unit 13 specifies that the operator is seated in the driver's seat at the "right seat" position and that the tilt amount of the "right seat" is "140 degrees". Suppose also that display devices 2 are installed in front of the driver's seat with respect to the traveling direction and on both side surfaces of the vehicle interior, and that the second display device selection information associates the "right seat" position and a seat tilt amount of "130 to 145 degrees" with the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction.
In this case, the operation range determination units 141, 141a refer to the second display device selection information on the basis of the operator's position and the seat tilt amount specified by the operation state specifying unit 13, select the display device 2 installed on the right side surface of the vehicle interior with respect to the traveling direction, and determine the operable range. For example, the second display device selection information may define, in addition to information on the operator's position, information on the seat tilt amount, and information on the display device 2, the operable range on that display device 2; the operation range determination units 141, 141a may then determine the operable range by referring to the second display device selection information.
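Unlike the posture-based table, the seat-tilt variant matches a measured value against stored intervals such as "130 to 145 degrees". A minimal sketch, with all names and the second interval invented for illustration:

```python
# Hypothetical sketch of the "second display device selection information":
# each entry associates a seat position and a seat-tilt interval with a
# display device. Only the 130-145 degree interval mirrors the example
# in the text; everything else is an assumption.
SECOND_SELECTION_TABLE = [
    # (seat position, (tilt_min, tilt_max) in degrees, display id)
    ("right_seat", (130, 145), "right_side_display"),
    ("right_seat", (90, 129),  "front_display"),
]

def select_by_tilt(position, tilt):
    """Return the display associated with the operator's seat position
    and the interval containing the measured seat-tilt amount."""
    for seat, (lo, hi), display in SECOND_SELECTION_TABLE:
        if seat == position and lo <= tilt <= hi:
            return display
    return None
```

A tilt of 140 degrees in the "right seat" then falls inside the 130-145 interval and selects the right side-surface display, matching the example above.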
Further, although the first and second embodiments described above assume that the operator is the driver, the operator is not limited to the driver and may be an occupant other than the driver.
For example, the operator may be an occupant seated in the passenger seat or an occupant seated in a rear seat.
Further, although the first and second embodiments described above assume a single operator, the number of operators is not limited to one; the operators may be a plurality of occupants. As a specific example, the driver and an occupant seated in the passenger seat (hereinafter simply referred to as the "passenger") may both be operators. In this case, the operation state specifying unit 13 is able to identify a plurality of operators, and when it identifies a plurality of operators, it specifies the operation state of each of them.
Note that while all of the plurality of operators are persons who can touch-operate the display device 2, only one of them performs the remote operation on the one or more display objects displayed on the display device 2.
In this case, the display control devices 1, 1a are assumed to be connected to an imaging device such as the one described above, and the operation state specifying unit 13 acquires a captured image from the imaging device via the acquisition unit 11, identifies the plurality of operators on the basis of the captured image, and specifies the operation state of each identified operator.
Specifically, the operation state specifying unit 13 performs image recognition processing on the captured image acquired from the imaging device, using a well-known image recognition technique, and detects the plurality of occupants present in the vehicle. In the above example, the operation state specifying unit 13 detects the driver and the passenger and identifies each of them as an operator. The operation state specifying unit 13 then specifies the driver's operation state and the passenger's operation state. For example, when the operation state is the operator's position, the operation state specifying unit 13 specifies the driver's position and the passenger's position, and outputs the specified information on the driver's operation state and the passenger's operation state to the determination units 14, 14a.
In this case, the operation range determination units 141, 141a of the determination units 14, 14a determine the operable range on the basis of the operation states of the plurality of operators specified by the operation state specifying unit 13. Specifically, the operation range determination units 141, 141a first determine the operable range of each of the plurality of operators (hereinafter referred to as the "per-operator operable range") on the basis of their operation states. The method by which the operation range determination units 141, 141a determine a per-operator operable range is the same as the method of determining the operable range described in the first and second embodiments. Next, the operation range determination units 141, 141a calculate the range in which the per-operator operable ranges of the plurality of operators overlap, and determine the calculated overlapping range to be the operable range.
In the above example, the operation range determination units 141, 141a first determine the driver's per-operator operable range on the basis of the driver's position and the passenger's per-operator operable range on the basis of the passenger's position. Next, the operation range determination units 141, 141a calculate the range in which the driver's per-operator operable range and the passenger's per-operator operable range overlap, and determine the calculated overlapping range to be the operable range.
If there is no range in which the per-operator operable ranges overlap, the operation range determination units 141, 141a determine the per-operator operable range of the operator who performed the remote operation to be the operable range. For example, when specifying the operation state, the operation state specifying unit 13 also identifies the operator who performed the remote operation, and the operation range determination units 141, 141a acquire information on that operator from the operation state specifying unit 13. For example, when the operator performs the remote operation by voice, the operation state specifying unit 13 may identify the operator who performed the remote operation, from among the plurality of operators, on the basis of the direction of the source of the voice signal and the captured image acquired from the imaging device. When the operator performs the remote operation by gesture, the operation state specifying unit 13 may identify that operator on the basis of the captured image acquired from the imaging device.
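The overlap-then-fallback behaviour described above can be sketched by intersecting axis-aligned rectangles, one per operator. Representing each operable range as `(x1, y1, x2, y2)` in display coordinates is an assumption of this sketch, not a detail of the embodiment:

```python
def intersect(a, b):
    """Intersect two rectangles (x1, y1, x2, y2); None if disjoint."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    return (x1, y1, x2, y2) if x1 < x2 and y1 < y2 else None

def operable_range(per_operator_ranges, remote_operator_index):
    """Overlap all per-operator operable ranges; if no common region
    exists, fall back to the range of the operator who performed the
    remote operation, as described in the text."""
    result = per_operator_ranges[0]
    for r in per_operator_ranges[1:]:
        result = intersect(result, r)
        if result is None:
            return per_operator_ranges[remote_operator_index]
    return result
```

For the driver/passenger example, the two per-operator rectangles are intersected once; with more operators the intersection simply shrinks monotonically, which is why computing it pairwise in a loop is sufficient.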
Further, in the display control devices 1, 1a according to the first and second embodiments, whether only a single operator or a plurality of operators are allowed may be decided according to the function to be executed.
Specifically, when specifying the operation state, the operation state specifying unit 13 acquires information on the function associated with the operation target from the operation target specifying units 12, 12a. The operation target specifying units 12, 12a can specify the function associated with the operation target on the basis of the target identification information. In this case, the target identification information associates at least a keyword, the identification information of an icon, the identification information of the function executed when the icon is operated, and the identification information of the response display object displayed when the function is executed.
When the function associated with the operation target is a function for which a plurality of operators are allowed, the operation state specifying unit 13 detects the occupants present in the vehicle. Specifically, the operation state specifying unit 13 acquires a captured image from the imaging device described above via the acquisition unit 11, for example, performs image recognition processing on the captured image using a well-known image recognition technique, and detects one or more occupants present in the vehicle. The operation state specifying unit 13 treats the detected one or more occupants as operators and specifies the operation state of each of them. The operation range determination units 141, 141a then determine the operable range on the basis of the operation states of the one or more operators specified by the operation state specifying unit 13. The specific operation by which the operation state specifying unit 13 specifies the operation state of each of one or more operators, and the specific operation by which the operation range determination units 141, 141a determine the operable range on the basis of those operation states, have already been described, so duplicate description is omitted.
On the other hand, when the function associated with the operation target is not a function for which a plurality of operators are allowed, the operation state specifying unit 13 treats only the operator who performed the remote operation as the operator, regardless of the number of occupants present in the vehicle, and specifies the operation state of that operator. The specific operation by which the operation state specifying unit 13 identifies the operator who performed the remote operation has already been described, so duplicate description is omitted.
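The function-dependent decision above amounts to a membership test followed by a branch. In this sketch, `MULTI_OPERATOR_FUNCTIONS` and the function identifiers are hypothetical; the embodiment would derive the equivalent flag from the target identification information:

```python
# Hypothetical set of functions for which a plurality of operators are
# allowed. The identifiers are invented for illustration.
MULTI_OPERATOR_FUNCTIONS = {"audio_volume", "map_scale"}

def operators_for_function(function_id, detected_occupants, remote_operator):
    """Return the operators whose operation states should be specified:
    all detected occupants when the function allows multiple operators,
    otherwise only the operator who performed the remote operation."""
    if function_id in MULTI_OPERATOR_FUNCTIONS:
        return list(detected_occupants)
    return [remote_operator]
```

The returned list then feeds the per-operator operable-range determination and the overlap calculation described earlier.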
Further, in the first and second embodiments described above, the display control devices 1, 1a may re-specify the operator's operation state at a preset cycle (hereinafter referred to as the "range re-determination cycle") and re-determine the operable range on the basis of the re-specified operation state.
A specific operation in which the display control devices 1, 1a re-specify the operator's operation state and re-determine the operable range on that basis is described below. The operations described below assume that the display control devices 1, 1a have already performed the operations described with reference to the flowcharts of FIGS. 5 and 10, respectively. The following description also assumes that an imaging device such as the one described above is connected to the display control devices 1, 1a and that the display control devices 1, 1a acquire a captured image of the vehicle interior from the imaging device.
After the display control units 15, 15a control the display of the operation target on the display device 2 (see step ST505 in FIG. 5 and step ST1005 in FIG. 10), the operation range determination units 141, 141a wait until the range re-determination cycle elapses and then acquire a captured image from the imaging device via the acquisition unit 11.
The operation state specifying unit 13 specifies the operator's current operation state on the basis of the acquired captured image. The specific operation of specifying the operator's operation state on the basis of a captured image has already been described, so duplicate description is omitted. Here, as an example, the operation state specifying unit 13 specifies the operator's position as the operator's operation state.
Here, the operation state specifying unit 13 determines whether there is a change between the most recently determined operation state and the operator's current operation state. Specifically, in this example, the operation state specifying unit 13 determines whether the operator's current position has changed from the most recently determined position.
Note that each time the operation state specifying unit 13 specifies the operator's operation state, it stores the specified operation state in the storage unit. The operation state specifying unit 13 determines whether the operator's current position has changed from the most recently determined position by comparing the specified current position with the stored latest position.
When the operation state specifying unit 13 determines that there is no change between the most recently determined operation state and the current operation state, it waits again until the next range re-determination cycle.
On the other hand, when the operation state specifying unit 13 determines that there is a change between the most recently determined operation state and the current operation state, it outputs information on the operator's current operation state to the operation range determination units 141, 141a. In this example, the operation state specifying unit 13 outputs information on the operator's current position.
When information on the operator's current operation state is output from the operation state specifying unit 13, the operation range determination units 141, 141a determine the operator's operable range on the basis of that information (see, for example, step ST504 in FIG. 5 and step ST1004 in FIG. 10).
The subsequent specific operations are the same as those of step ST505 described with reference to FIG. 5 or step ST1005 described with reference to FIG. 10, so duplicate description is omitted.
In this way, the display control devices 1, 1a can re-specify the operator's operation state at the range re-determination cycle and re-determine the operable range on the basis of the re-specified operation state.
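The re-determination cycle described above is, in essence, a timed polling loop that only triggers re-determination when the specified state differs from the stored one. In this sketch, `capture_image`, `specify_position`, and `determine_operable_range` are stand-ins, assumed for illustration, for the imaging device, the operation state specifying unit 13, and the operation range determination units 141, 141a:

```python
import time

def redetermination_loop(capture_image, specify_position,
                         determine_operable_range, cycle_s, stop):
    """Every cycle_s seconds, re-specify the operator's position and
    re-determine the operable range only when the position has changed
    from the stored latest value."""
    last_position = None
    while not stop():
        time.sleep(cycle_s)                     # wait for the re-determination cycle
        position = specify_position(capture_image())
        if position != last_position:           # change from the stored state?
            determine_operable_range(position)  # re-determine the operable range
            last_position = position
```

Skipping the re-determination when nothing changed is what keeps the cycle cheap: image acquisition and comparison run every period, while the heavier range determination runs only on an actual state change.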
In the above description of the case in which the display control devices 1, 1a re-specify the operator's operation state at the range re-determination cycle and re-determine the operable range, the display control devices 1, 1a are connected to an imaging device and the operation state specifying unit 13 specifies the operator's current operation state on the basis of the captured image acquired from the imaging device; however, this is only an example. For example, the display control devices 1, 1a may be connected to an array microphone, and the operation state specifying unit 13 may acquire a voice signal based on an utterance from the array microphone via the acquisition unit 11 and specify the operator's current operation state on the basis of that voice signal.
Also, in the above description, the operator's operation state was the operator's position, but this too is only an example. The operator's operation state may instead be the operator's posture.
In this case, in the display control devices 1, 1a, the operation state specifying unit 13 specifies the operator's current posture each time the range re-determination cycle elapses and determines whether the operator's posture has changed. For example, the operation state specifying unit 13 acquires, via the acquisition unit 11, a captured image of the vehicle interior from the imaging device described above and specifies the operator's current posture on the basis of the acquired image.
When the operation state specifying unit 13 determines that the operator's posture has changed, the operation range determination units 141, 141a re-determine the operator's operable range on the basis of the current posture specified by the operation state specifying unit 13.
Also, for example, the display control devices 1, 1a may re-determine the operable range depending on whether the number of operators has changed.
In this case, in the display control devices 1, 1a, the operation state specifying unit 13 detects the operators and their current number each time the range re-determination cycle elapses, and determines whether the current number of operators has changed. For example, the operation state specifying unit 13 acquires, via the acquisition unit 11, a captured image of the vehicle interior from the imaging device described above and detects the current operators and their number on the basis of the acquired image.
When the operation state specifying unit 13 determines that the number of operators has changed, the operation range determination units 141, 141a may re-determine the operators' operable range. In that case, the operation state specifying unit 13 specifies the operation state of each of the detected one or more operators, and the operation range determination units 141, 141a determine the operable range on the basis of those operation states. The specific operation by which the operation range determination units 141, 141a determine the operable range on the basis of the operation states of one or more operators has already been described, so duplicate description is omitted.
For example, a range within the vehicle in which the operation state specifying unit 13 detects the number of operators at each range re-determination cycle (hereinafter referred to as the "number-of-operators identification range") may be determined in advance, and the operation state specifying unit 13 may detect the current number of operators within that range. The number-of-operators identification range is, for example, a predetermined range in the captured image centered on the position of the operator who performed the remote operation designating the execution of the function.
In this way, the display control devices 1, 1a may determine whether the operation state has changed and re-determine the operable range on the basis of that determination result. The display control devices 1, 1a may also determine whether the number of operators has changed and re-determine the operable range on the basis of that determination result. As in the examples described above, in the display control devices 1, 1a, the operation range determination units 141, 141a can dynamically determine the operable range according to the operator's current operation state specified by the operation state specifying unit 13.
By determining whether the operation state, or the number of operators, has changed and re-determining the operable range on the basis of the determination result, the display control devices 1, 1a can prevent a situation in which, after the operation target has once been displayed within the operable range, a subsequent change in the operation state leaves the operator unable to touch-operate the operation target.
FIGS. 11A and 11B are diagrams showing an example of the hardware configuration of the display control devices 1, 1a according to the first and second embodiments.
In the first and second embodiments, the functions of the acquisition unit 11, the operation target specifying units 12, 12a, the operation state specifying unit 13, the determination units 14, 14a, and the display control units 15, 15a are realized by a processing circuit 1101. That is, the display control devices 1, 1a include the processing circuit 1101, which specifies the operation target when a remote operation designating the execution of a function is performed and controls the display position of the specified operation target on the display device 2.
The processing circuit 1101 may be dedicated hardware as shown in FIG. 11A, or may be a CPU (Central Processing Unit) 1105 that executes a program stored in a memory 1106 as shown in FIG. 11B.
 処理回路1101が専用のハードウェアである場合、処理回路1101は、例えば、単一回路、複合回路、プログラム化したプロセッサ、並列プログラム化したプロセッサ、ASIC(Application Specific Integrated Circuit)、FPGA(Field-Programmable Gate Array)、またはこれらを組み合わせたものが該当する。 When the processing circuit 1101 is dedicated hardware, the processing circuit 1101 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable). Gate Array) or a combination of these is applicable.
 処理回路1101がCPU1105の場合、取得部11と、操作対象物特定部12,12aと、操作状態特定部13と、判定部14,14aと、表示制御部15,15aの機能は、ソフトウェア、ファームウェア、または、ソフトウェアとファームウェアとの組み合わせにより実現される。すなわち、取得部11と、操作対象物特定部12,12aと、操作状態特定部13と、判定部14,14aと、表示制御部15,15aは、HDD(Hard Disk Drive)1102、メモリ1106等に記憶されたプログラムを実行するCPU1105、システムLSI(Large-Scale Integration)等の処理回路により実現される。また、HDD1102、メモリ1106等に記憶されたプログラムは、取得部11と、操作対象物特定部12,12aと、操作状態特定部13と、判定部14,14aと、表示制御部15,15aの手順または方法をコンピュータに実行させるものであるとも言える。ここで、メモリ1106とは、例えば、RAM、ROM(Read Only Memory)、フラッシュメモリ、EPROM(Erasable Programmable Read Only Memory)、EEPROM(Electrically Erasable Programmable Read-Only Memory)等の、不揮発性もしくは揮発性の半導体メモリ、または、磁気ディスク、フレキシブルディスク、光ディスク、コンパクトディスク、ミニディスク、DVD(Digital Versatile Disc)等が該当する。 When the processing circuit 1101 is the CPU 1105, the functions of the acquisition unit 11, the operation target identification unit 12, 12a, the operation state identification unit 13, the determination unit 14, 14a, and the display control unit 15, 15a are software and firmware. Or, it is realized by a combination of software and firmware. That is, the acquisition unit 11, the operation target object identification unit 12, 12a, the operation state identification unit 13, the determination unit 14, 14a, and the display control unit 15, 15a are HDD (Hard Disk Drive) 1102, memory 1106, etc. It is realized by a processing circuit such as a CPU 1105 that executes a program stored in the above and a system LSI (Large-Scale Integration). Further, the programs stored in the HDD 1102, the memory 1106, etc. are the acquisition unit 11, the operation target object identification unit 12, 12a, the operation state identification unit 13, the determination unit 14, 14a, and the display control unit 15, 15a. It can also be said to cause a computer to perform a procedure or method. 
Here, the memory 1106 is, for example, a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EPROM (Electrically Emergency Memory), an EPROM (Electrically Emergency Memory), a volatile Optical, etc. A semiconductor memory, a magnetic disk, a flexible disk, an optical disk, a compact disk, a mini disk, a DVD (Digital Versaille Disc), or the like is applicable.
 なお、取得部11と、操作対象物特定部12,12aと、操作状態特定部13と、判定部14,14aと、表示制御部15,15aの機能について、一部を専用のハードウェアで実現し、一部をソフトウェアまたはファームウェアで実現するようにしてもよい。例えば、操作対象物特定部12,12aについては専用のハードウェアとしての処理回路1101でその機能を実現し、取得部11と、操作状態特定部13と、判定部14,14aと、表示制御部15,15aについては処理回路1101がメモリ1106に格納されたプログラムを読み出して実行することによってその機能を実現することが可能である。
 また、表示制御装置1,1aは、表示装置2または入力装置等の装置と、有線通信または無線通信を行う入力インタフェース装置1103および出力インタフェース装置1104を備える。
Some of the functions of the acquisition unit 11, the operation target identification unit 12, 12a, the operation state identification unit 13, the determination unit 14, 14a, and the display control unit 15, 15a are realized by dedicated hardware. However, a part may be realized by software or firmware. For example, the operation target identification units 12 and 12a are realized by the processing circuit 1101 as dedicated hardware, and the acquisition unit 11, the operation state identification unit 13, the determination units 14, 14a, and the display control unit are realized. The functions of 15 and 15a can be realized by the processing circuit 1101 reading and executing the program stored in the memory 1106.
Further, the display control devices 1 and 1a include a device such as a display device 2 or an input device, and an input interface device 1103 and an output interface device 1104 that perform wired communication or wireless communication.
In the first and second embodiments described above, the display control devices 1 and 1a are in-vehicle devices mounted on a vehicle, and the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a are provided in the display control devices 1 and 1a.

The configuration is not limited to this: some of the acquisition unit 11, the operation target identification units 12 and 12a, the operation state identification unit 13, the determination units 14 and 14a, and the display control units 15 and 15a may be mounted on the in-vehicle device of the vehicle, and the others may be provided in a server connected to the in-vehicle device via a network, so that the in-vehicle device and the server constitute an automatic driving control system. Specifically, for example, the operation target identification units 12 and 12a may be provided in the server.
Further, in the second embodiment described above, the response display information corresponding to the response display object may be generated in advance and stored in a storage unit. In this case, the display control unit 15a acquires the already generated response display information and controls the display of the response display object so that the response display object is displayed at the display position determined by the display position determination unit 151a. With such a configuration, the display control device 1a need not include the response generation unit 152.
Note that, within the scope of the invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component of each embodiment may be omitted.
The display control device according to the present invention can be applied to a display control device that controls the display of a display object on a touch-operable display device.
1, 1a display control device; 2 display device; 3 input device; 11 acquisition unit; 12, 12a operation target identification unit; 121, 121a initial display position identification unit; 13 operation state identification unit; 14, 14a determination unit; 141, 141a operation range determination unit; 15, 15a display control unit; 151, 151a display position determination unit; 152 response generation unit; 1101 processing circuit; 1102 HDD; 1103 input interface device; 1104 output interface device; 1105 CPU; 1106 memory.

Claims (7)

  1.  A display control device comprising:
      an acquisition unit to acquire remote operation information indicating that a function executable via a touch-operable display device has been specified by remote operation;
      an operation target identification unit to identify, on the basis of the remote operation information acquired by the acquisition unit, an operation target to be touch-operated by an operator of the display device, and to identify an initial display position of the operation target;
      an operation state identification unit to identify an operation state of the operator who performed the remote operation;
      a determination unit to determine, on the basis of the operation state of the operator identified by the operation state identification unit, whether or not the initial display position of the operation target identified by the operation target identification unit is within an operable range, on a display screen of the display device, in which the operator can perform a touch operation; and
      a display control unit to control, when the determination unit determines that the initial display position of the operation target is not within the operable range, the display of the operation target on the display device so that the operation target is displayed within the operable range.
  2.  The display control device according to claim 1, wherein an icon for executing a function can be displayed on the display device, and the operation target is a response display object displayed on the display device after the icon has been operated.
  3.  The display control device according to claim 1, wherein the remote operation includes any one or more of a voice operation, a gesture, and a remote controller operation.
  4.  The display control device according to claim 1, wherein the operation state identification unit identifies a position and a posture of the operator who performed the remote operation, and
      the determination unit determines, on the basis of the position and the posture of the operator identified by the operation state identification unit, whether or not the initial display position of the operation target is within the operable range.
  5.  The display control device according to claim 1, wherein the operation state identification unit is capable of identifying a plurality of operators and, when a plurality of operators are identified, identifies the operation state of each of the identified operators, and
      the determination unit includes an operation range determination unit to identify, for each operator, on the basis of the operation state of the operator identified by the operation state identification unit, a per-operator operable range on the display screen of the display device in which that operator can perform a touch operation, and to determine the operable range on the basis of the identified per-operator operable ranges.
  6.  The display control device according to claim 5, wherein the operation range determination unit sets, as the operable range, a range in which a plurality of the per-operator operable ranges overlap.
  7.  A display control method comprising:
      acquiring, by an acquisition unit, remote operation information indicating that a function executable via a touch-operable display device has been specified by remote operation;
      identifying, by an operation target identification unit, on the basis of the remote operation information acquired by the acquisition unit, an operation target to be touch-operated by an operator of the display device, and identifying an initial display position of the operation target;
      identifying, by an operation state identification unit, an operation state of the operator who performed the remote operation;
      determining, by a determination unit, on the basis of the operation state of the operator identified by the operation state identification unit, whether or not the initial display position of the operation target identified by the operation target identification unit is within an operable range, on a display screen of the display device, in which the operator can perform a touch operation; and
      controlling, by a display control unit, when the determination unit determines that the initial display position of the operation target is not within the operable range, the display of the operation target on the display device so that the operation target is displayed within the operable range.
PCT/JP2019/038146 2019-09-27 2019-09-27 Display control device for displaying operation subject within operable range WO2021059479A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/038146 WO2021059479A1 (en) 2019-09-27 2019-09-27 Display control device for displaying operation subject within operable range

Publications (1)

Publication Number Publication Date
WO2021059479A1

Family

Family ID: 75166025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/038146 WO2021059479A1 (en) 2019-09-27 2019-09-27 Display control device for displaying operation subject within operable range

Country Status (1)

Country Link
WO (1) WO2021059479A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013099042A1 (en) * 2011-12-27 2013-07-04 パナソニック株式会社 Information terminal, method of controlling information terminal, and program
JP2015014933A (en) * 2013-07-05 2015-01-22 Necカシオモバイルコミュニケーションズ株式会社 Information processing apparatus, and control method and program of the same
WO2015198729A1 (en) * 2014-06-25 2015-12-30 ソニー株式会社 Display control device, display control method, and program

Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 19946466; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 19946466; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: JP)