CN107831975B - Information providing device, control method and terminal equipment - Google Patents


Publication number
CN107831975B
CN107831975B
Authority
CN
China
Prior art keywords
button
terminal
information providing
CPU
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710833212.1A
Other languages
Chinese (zh)
Other versions
CN107831975A (en)
Inventor
平間美香
藤原彰彦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba TEC Corp
Original Assignee
Toshiba TEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba TEC Corp filed Critical Toshiba TEC Corp
Publication of CN107831975A
Application granted
Publication of CN107831975B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention discloses an information providing device, a control method, and a terminal device that keep the content displayed on the screen of a display device clearly visible. The information providing device comprises: a screen display unit, an object display unit, a movement unit, and a movement control unit. The screen display unit causes the display device to display an information providing screen including the content. The object display unit displays an object linked with a terminal on the information providing screen in accordance with the connection of the terminal. The movement unit moves the display position of the object based on an instruction from the terminal. The movement control unit controls the movement unit so that the object does not enter a predetermined prohibited area within the information providing screen.

Description

Information providing device, control method and terminal equipment
This application claims priority to Japanese application JP2016-180601, filed on September 15, 2016, the contents of which are incorporated herein by reference in their entirety.
Technical Field
Embodiments of the invention relate to an information providing device, a control method, and a terminal device.
Background
Conventionally, there are display devices such as digital signage that display content such as advertisements, tourist guides, entertainment, and other various information, and such a display device can be linked to a plurality of terminals. A terminal linked to the digital signage can move an object such as a cursor displayed on the screen of the digital signage based on an operation performed by the operator of the terminal. However, when the object is superimposed on a character string, an image, a moving image (video), or the like displayed on the screen of the digital signage, that character string, image, or moving image becomes hard to see (obscured).
Disclosure of Invention
In view of the above problem, an object of the present invention is to provide an information providing apparatus, a control method, and a terminal device that keep the content displayed on the screen of a display apparatus clearly visible.
In order to solve the above problem, an embodiment of the present invention provides an information providing apparatus, including: a screen display unit, an object display unit, a movement unit, and a movement control unit. The screen display unit causes the display device to display an information providing screen including content. The object display unit displays an object linked with a terminal on the information providing screen in accordance with the connection of the terminal. The movement unit moves the display position of the object based on an instruction from the terminal. The movement control unit controls the movement unit so that the object does not enter a predetermined prohibited area within the information providing screen.
With this configuration, the content displayed on the screen of the display device becomes clearly visible.
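The movement control described above can be sketched as a position filter: a requested move is accepted only if it stays inside the screen and outside every prohibited rectangle, otherwise the object keeps its current position. This is an illustrative sketch, not the patent's implementation; the names `Rect`, `SCREEN`, `PROHIBITED`, and `move_object`, and all coordinates, are assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

SCREEN = Rect(0, 0, 1920, 1080)          # the information providing screen
PROHIBITED = [Rect(100, 100, 200, 80),   # e.g. a button's display range
              Rect(800, 50, 120, 120)]   # e.g. the two-dimensional code

def move_object(cur, req):
    """Return the object's new display position: the requested position if
    allowed, otherwise the unchanged current position."""
    x, y = req
    if not SCREEN.contains(x, y):
        return cur   # outside the screen is treated as a prohibited area
    if any(r.contains(x, y) for r in PROHIBITED):
        return cur   # the move would enter a prohibited area
    return req
```

A moving unit built this way never places the object over a prohibited range, which is exactly the property the movement control unit enforces.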
In one possible embodiment, the information providing apparatus further includes: a selection control unit that sets a button in the selected state when the object enters a predetermined selection area surrounding the button displayed on the information providing screen.
According to this configuration, even when the button is set as a prohibited area, the operator of the terminal can set the button to the selected state by moving the object into the selection area.
In one possible embodiment, the information providing apparatus further includes: a highlighting unit that highlights the button in the selected state.
With this configuration, the operator of the terminal can easily distinguish which button is in the selected state.
In one possible embodiment, the information providing apparatus further includes: a notification unit configured to notify the terminal of information specifying the button in the selected state.
With this configuration, the operator of the terminal can easily distinguish which button is in the selected state by the object linked to the terminal.
In one possible embodiment, the information providing apparatus further includes: a changing unit that changes the appearance of the object based on an operation of the terminal.
With this configuration, the operator of the terminal can easily recognize which of the plurality of displayed objects is the one the operator is operating.
Another embodiment of the present invention provides a method of controlling an information providing apparatus, including: a screen display step of causing a display device to display an information providing screen including content; an object display step of displaying an object linked with a terminal on the information providing screen in accordance with the connection of the terminal; a movement step of moving a display position of the object based on an instruction from the terminal; and a movement control step of controlling the movement step so that the object does not enter a predetermined prohibited area within the information providing screen.
According to such a control method, the content displayed on the screen of the display device becomes clearly visible.
In one possible embodiment, the control method further comprises: a selection control step of setting a button in the selected state when the object enters a predetermined selection area surrounding the button displayed on the information providing screen.
According to such a control method, even when a button is set as a prohibited area, the operator of the terminal can move an object into the selection area to set the button to the selected state.
In one possible embodiment, the control method further comprises: a highlighting step of highlighting the button in the selected state.
According to such a control method, the operator of the terminal can easily distinguish which button is in the selected state.
In one possible embodiment, the control method further comprises: a notification step of notifying the terminal of information specifying the button in the selected state.
According to such a control method, the operator of the terminal can easily distinguish which button is in the selected state by the object linked with the terminal.
A third embodiment of the present invention provides a terminal device, including: a processor, a memory, an interface, and a bus, wherein the processor, the memory, and the interface communicate with one another via the bus, and the memory stores at least one executable instruction that causes the processor to perform the operations corresponding to the above control method.
With such a configuration, a function of clearly viewing the content displayed on the screen of the display device can be realized.
Drawings
Next, an information providing apparatus and a program according to an embodiment will be described with reference to the drawings. A more complete appreciation of the invention and many of its attendant advantages will be readily obtained from the following detailed description considered in connection with the accompanying drawings. The accompanying drawings are included to provide a further understanding of the invention and form a part of this application; the illustrated embodiments and their description are intended to illustrate, not to limit, the invention. In the drawings:
fig. 1 is a block diagram showing the main circuit configuration of each of the devices included in the information providing system according to the embodiment;
fig. 2 is a diagram showing an example of a screen displayed by the display device in fig. 1;
fig. 3 is a flowchart of a control process performed by the CPU of the server in fig. 1;
fig. 4 is a diagram showing an example of a screen displayed on the touch panel in fig. 1;
fig. 5 is a flowchart of a control process performed by the CPU of the server in fig. 1;
fig. 6 is a diagram showing an example of a screen displayed by the display device in fig. 1;
fig. 7 is a flowchart of a control process performed by the CPU of the information terminal in fig. 1;
fig. 8 is a diagram showing an example of a screen displayed on the touch panel in fig. 1;
fig. 9 is a flowchart of a control process performed by the CPU of the server in fig. 1; and
fig. 10 is a flowchart of a control process performed by the CPU of the server in fig. 1.
Description of the reference numerals
1: information providing system
10: server
11, 31: CPU
12, 32: main memory
13, 33: auxiliary storage device
14: output interface
15, 35: communication interface
16, 37: bus
20: display device
30: information terminal
34: touch panel
36: camera
Detailed Description
Various exemplary embodiments, features and aspects of the present invention will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present invention. It will be understood by those skilled in the art that the present invention may be practiced without some of these specific details. In some instances, methods, procedures, components, and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present invention.
Next, an information providing system according to an embodiment will be described with reference to the drawings.
Fig. 1 is a block diagram showing the main circuit configuration of each of the devices included in the information providing system 1 according to the embodiment. The information providing system 1 includes a server 10, a display device 20, and an information terminal 30. The server 10 and the information terminal 30 are connected to a network NW. The network NW is typically a communication network including the internet. Alternatively, the network NW may be a communication network including a LAN (Local Area Network) or other communication lines. For example, in the information providing system 1, HTTP (Hypertext Transfer Protocol) is used as the communication protocol for communication via the network NW. Accordingly, a request in the operation description below is an HTTP request, and a response is an HTTP response. Although only one information terminal 30 is shown in fig. 1, the number is not limited to one, and typically a plurality of information terminals 30 are present.
The information providing system 1 causes the display device 20 to display an information providing screen including content under the control of the server 10. In addition, the server 10 can link the information terminal 30 and the display device 20 with each other while the display device 20 displays the information providing screen. When the information terminal 30 and the display device 20 are linked, an object linked with the information terminal 30 is displayed on the screen of the display device 20. The operator of the information terminal 30 can then operate the object displayed on the screen of the display device 20 by operating the information terminal 30 linked to the display device 20. Further, the operator of the information terminal 30 can change the icon image used for the appearance (outline) of the object. The operator of the information terminal 30 can also set a button displayed on the screen of the display device 20 to the selected state by moving the object, and can download the content corresponding to the button in the selected state from the server 10 to the information terminal 30.
The server 10 is, for example, a Web server. The server 10 publishes web pages. A web page is composed of data such as HTML or XML together with data such as images, video, or audio (voice). The server 10 is an example of an information providing apparatus.
The server 10 includes a CPU (Central Processing Unit) 11, a main memory 12, an auxiliary storage device 13, an output interface 14, a communication interface 15, and a bus 16.
The CPU11 corresponds to a central part of a computer that performs processing and control necessary for the operation of the server 10. The CPU11 controls each unit to realize various functions of the server 10 based on programs such as an operating system and application software stored in the main memory 12.
The main memory 12 corresponds to a main memory portion of the above-described computer. The main memory 12 stores programs such as an operating system and application software. The main memory 12 stores data to be referred to when the CPU11 performs various kinds of processing. The main memory 12 is used as a work area for temporarily storing data to be used when the CPU11 performs various processes.
The auxiliary storage device 13 corresponds to an auxiliary storage section of the computer. The auxiliary storage device 13 is, for example, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an HDD (Hard Disk Drive), or an SSD (Solid State Drive). The auxiliary storage device 13 stores data used when the CPU11 performs various processes, data generated by processes performed by the CPU11, and the like. The auxiliary storage device 13 may store programs such as the operating system and application software.
In addition, the auxiliary storage device 13 stores a user information DB (database) 131. The user information DB131 stores information on each user of the service provided by the server 10 in association with a unique user ID (identifier) identifying that user. The user information DB131 may also store an icon image in association with each user ID.
The programs stored in the main memory 12 or the auxiliary storage device 13 include the control program described in connection with the control processing described later. For example, the server 10 is transferred to the operator (carrier) of the server 10 with the control program already stored in the main memory 12 or the auxiliary storage device 13. However, the server 10 may also be transferred to the operator without the control program stored in the main memory 12 or the auxiliary storage device 13. In that case, a control program transferred separately to the operator may be written into the main memory 12 or the auxiliary storage device 13 by an operation performed by the operator or a service person. The transfer of the control program can be realized by recording the program on a removable storage medium such as a magnetic disk, an optical disk, or a semiconductor memory, or by downloading the program via a network.
The output interface 14 is an interface for outputting data such as video. The output interface 14 may output audio (voice) data in addition to the video data.
The communication interface 15 is an interface for the server 10 to communicate via a network.
The bus 16 includes an address bus, a data bus, and the like, and transmits signals transmitted and received by each unit of the server 10.
The display device 20 is, for example, a digital signage. The display device 20 is connected to the output interface 14 of the server 10. The display device 20 and the output interface 14 may be connected by a wired communication path, a wireless communication path, or a communication path combining wired and wireless segments. The display device 20 receives the video data output from the output interface 14 and displays a screen based on the input video data. The display device 20 may also be connected to the network NW; in that case, the video data and the like may be output through the communication interface 15 instead of the output interface 14 and input to the display device 20 through the network NW.
The information terminal 30 is operated by a user or the like who uses the service provided by the server 10.
The information terminal 30 is a portable Computer such as a portable phone, a smart phone, a tablet PC (Personal Computer), a notebook PC, or the like.
The information terminal 30 includes a CPU31, a main memory 32, an auxiliary storage device 33, a touch panel 34, a communication interface 35, a camera 36, and a bus 37.
CPU31 corresponds to a central part of a computer that performs processing and control necessary for the operation of information terminal 30. CPU31 controls each unit to realize various functions of information terminal 30 based on programs such as an operating system and application software stored in main memory 32.
The main memory 32 corresponds to a main memory portion of the above-described computer. The main memory 32 stores programs such as an operating system and application software. The main memory 32 stores data to be referred to when the CPU31 performs various kinds of processing. The main memory 32 is used as a work area for temporarily storing data to be used when the CPU31 performs various processes.
The auxiliary storage device 33 corresponds to an auxiliary storage section of the computer. The auxiliary storage device 33 is, for example, an EEPROM, an HDD, or an SSD. The auxiliary storage device 33 stores data used when the CPU31 performs various processes, data generated by processes performed by the CPU31, and the like. The auxiliary storage device 33 may store programs such as the operating system and application software.
In addition, the auxiliary storage device 33 stores icon images.
The programs stored in the main memory 32 or the auxiliary storage device 33 include a web browser. For example, the information terminal 30 is transferred to the user with a web browser already stored in the main memory 32 or the auxiliary storage device 33. However, the information terminal 30 may also be transferred to the user without the web browser stored in the main memory 32 or the auxiliary storage device 33. In that case, a web browser transferred separately to the user may be written into the main memory 32 or the auxiliary storage device 33 by an operation performed by the user or the like. The transfer of the web browser can be realized by recording it on a removable storage medium such as a magnetic disk, an optical disk, or a semiconductor memory, or by downloading it via a network.
The web browser stored in the main memory 32 or the auxiliary storage device 33 causes a web page or the like published by the server 10 to be displayed on the touch panel 34.
The touch panel 34 has a function as a display device that displays a screen for notifying an operator of the information terminal 30 of various information. The touch panel 34 also functions as an input device that receives a touch operation by the operator.
The communication interface 35 is an interface for the information terminal 30 to communicate via a network.
The camera 36 is used for taking images such as still images or moving images (videos).
Bus 37 includes an address bus, a data bus, and the like, and transmits signals transmitted and received to and from each unit of information terminal 30.
Next, the operation of the information providing system 1 according to the embodiment will be described with reference to fig. 2 to 10. The processing contents in the following operation description are examples, and various processes that can obtain the same result can be appropriately used.
While the server 10 is operating, the display device 20 displays, under the control of the server 10, a screen containing advertisements, tourist guides, entertainment, or other various information.
Fig. 2 is a diagram illustrating an example of a display screen SC1 displayed on the display device 20. While the server 10 is operating, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1. Upon receiving the instruction, the output interface 14 outputs the video data, which is input to the display device 20. The display device 20 displays the display screen SC1 based on the input video data. The display screen SC1 is an example of an information providing screen including content. By performing this processing, the computer having the CPU11 as a main component therefore operates as a screen display means (screen display unit, screen display step) for causing the display device to display an information providing screen.
The display screen SC1 includes buttons B11 to B15, a code C, and areas A11 and A12. Each of the buttons B11 to B15 is a button for accessing various contents. By operating the buttons B11 to B15, it is possible to access a web page or the like related to the content or to display details of the content. Each of the buttons B11 to B15 displays a character string, an image, a moving image (video), or the like indicating which content the button accesses. As an example, the code C is a two-dimensional code. The code C includes a URL for accessing a web page published by the server 10. When the information terminal 30 accesses the URL, the information terminal 30 transmits a connection request to the server 10. Area A11 displays a character string serving as the password. Area A12 displays content such as an image or a moving image.
The display screen SC1 further includes areas A13 to A17. The areas A13 to A17 are not actually displayed; in fig. 2 they are shown by imaginary lines. A unique area number is associated with each of the areas A13 to A17, and every area number is a number other than 0. The areas A13 to A17 are examples of selection areas.
The ranges in which the buttons B11 to B15, the code C, and the area A11 are displayed are set as prohibited areas. The outside of the display screen SC1 is also set as a prohibited area.
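The selection areas and their non-zero area numbers can be modeled with a simple hit test: given the object's position, return the number of the selection area it lies in, or 0 for none. The rectangle coordinates and all names below are illustrative assumptions; only the convention that area numbers are non-zero comes from the description above.

```python
# Hypothetical selection areas keyed by area number (e.g. for A13, A14).
SELECTION_AREAS = {
    13: (100, 100, 260, 140),   # (x, y, w, h) surrounding one button
    14: (100, 260, 260, 140),   # surrounding another button
}

def area_number_at(px: int, py: int) -> int:
    """Return the number of the selection area containing the point,
    or 0 when the point is in no selection area (numbers are non-zero)."""
    for number, (x, y, w, h) in SELECTION_AREAS.items():
        if x <= px < x + w and y <= py < y + h:
            return number
    return 0
```

A selection control unit could run this test after every object move and set the button associated with the returned area number to the selected state.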
Next, a process of connecting the information terminal 30 and the server 10 will be described. In order to link the information terminal 30 and the display device 20, the information terminal 30 and the server 10 need to be connected.
The operator of the information terminal 30 causes the information terminal 30 to access the web page indicated by the URL contained in the code C, for example by reading the code C with the camera 36 of the information terminal 30. The CPU31 instructs the communication interface 35 to send a connection request to the address indicated by the URL, that is, to the server 10. Upon receiving the instruction, the communication interface 35 transmits the connection request to the server 10. The transmitted connection request is received through the communication interface 15 of the server 10.
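The terminal-side step amounts to: decode the two-dimensional code C to a URL, then send the HTTP connection request to the host that URL names. A hedged sketch; `connection_target` is an illustrative helper, and the actual code decoder and HTTP client are not specified in the source.

```python
from urllib.parse import urlparse

def connection_target(decoded_url: str) -> str:
    """Return the host the connection request will be sent to, given the
    URL decoded from the two-dimensional code C."""
    return urlparse(decoded_url).netloc
```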
Fig. 3 is a flowchart of the control process by the CPU11 of the server 10. The CPU11 executes the control processing based on the control program stored in the main memory 12 or the secondary storage device 13.
In Act1 of fig. 3, the CPU11 of the server 10 confirms whether a connection request is received through the communication interface 15. If the connection request is not received, the CPU11 determines No in Act1 and proceeds to Act 2.
In Act2, the CPU11 confirms whether an operation request is received through the communication interface 15. The operation request will be described later. If the operation request is not received, the CPU11 determines No in Act2 and proceeds to Act 3.
In Act3, the CPU11 confirms whether a setting request is received through the communication interface 15. The setting request will be described later. If the setting request is not received, the CPU11 determines No in Act3 and returns to Act1. Thus, the CPU11 repeats Act1 to Act3 until a connection request, an operation request, or a setting request is received. If the CPU11 receives a connection request while in the standby state of Act1 to Act3, it determines Yes in Act1 and proceeds to Act4.
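The Act1 to Act3 standby loop is a three-way dispatch on the type of the received request. A minimal sketch with hypothetical request encodings and handler names; the patent does not prescribe these.

```python
def dispatch(request: dict) -> str:
    """Map a received request to the branch taken (Yes in Act1, Act2, or
    Act3), or stay in standby when nothing was received."""
    kind = request.get("type")
    if kind == "connection":
        return "handle_connection"  # Yes in Act1 -> Act4
    if kind == "operation":
        return "handle_operation"   # Yes in Act2 -> Act7
    if kind == "setting":
        return "handle_setting"     # Yes in Act3
    return "standby"                # repeat Act1 to Act3
```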
In Act4, the CPU11 confirms whether the information terminal 30 that is the transmission source of the connection request confirmed in Act1 (hereinafter referred to as the "transmission source terminal") is registered with the service provided by the server 10. The CPU11 confirms whether the transmission source terminal is in the registered state, for example by using a Cookie. If the transmission source terminal is in the registered state, the CPU11 can determine from the Cookie which user ID the transmission source terminal is registered under. If the transmission source terminal is not in the registered state, the CPU11 determines No in Act4 and proceeds to Act5.
In Act5, the CPU11 performs processing for registering the transmission source terminal with the service provided by the server 10. Since a known method can be used for this registration processing, its description is omitted. If the operator of the transmission source terminal has not yet completed user registration, the CPU11 first performs processing that prompts the operator to register as a user. At this time, the CPU11 stores the information relating to the user registration in the user information DB131. The CPU11 proceeds to Act6 after the processing of Act5. If the transmission source terminal is in the registered state, the CPU11 determines Yes in Act4 and proceeds to Act6.
In Act6, the CPU11 generates a my page response for causing the touch panel 34 of the transmission source terminal to display a my page. The my page response includes my page data for displaying the my page. The CPU11 then instructs the communication interface 15 to transmit the generated my page response to the transmission source terminal. Upon receiving the instruction, the communication interface 15 transmits the my page response to the transmission source terminal, and the transmitted response is received through the communication interface 35 of the transmission source terminal. The CPU11 returns to Act1 after the processing of Act6.
Meanwhile, the CPU31 of the transmission source terminal that transmitted the connection request waits for and receives the my page response through the communication interface 35. When the my page response has been received, the CPU31 generates an image corresponding to the my page based on the my page data contained in the my page response. The CPU31 then instructs the touch panel 34 to display the image, and the touch panel 34 displays the my page upon receiving the instruction.
Fig. 4 is a diagram showing an example of the my page SC2 displayed on the touch panel 34. The my page SC2 includes a character string CH, an input field F, and buttons B21 to B25. The character string CH prompts the operator of the transmission source terminal to input the password in the input field F. The input field F is a field in which the operator of the transmission source terminal inputs a character string. The operator of the transmission source terminal touches the button B21 to operate the transmission source terminal in conjunction with the display device 20, and touches the button B22 to change the appearance (icon image) of the object. Buttons B23 to B25 are buttons having other functions and are not described in detail.
When the operator of the transmission source terminal attempts to perform an operation in conjunction with the display device 20 at the transmission source terminal, the operator views the display device 20 and confirms the password displayed in the area a11 of the display screen SC 1. Then, the operator of the transmission source terminal enters the confirmed password in the input field F and touches the button B21.
The CPU31 of the transmission source terminal generates an operation request when the button B21 is touched. The operation request includes the character string input in the input field F. Further, the CPU31 instructs the communication interface 35 to transmit the operation request to the server 10. Upon receiving the instruction, the communication interface 35 transmits the operation request to the server 10. The transmitted operation request is received through the communication interface 15 of the server 10.
If the operation request is received while in the standby accepted state of Act1 to Act3 in fig. 3, the CPU11 of the server 10 determines Yes in Act2 and proceeds to Act 7.
In Act7, the CPU11 confirms whether the password is correct. That is, the CPU11 confirms whether the character string included in the operation request whose reception was confirmed in Act2 is the same as the character string displayed in the area a11 of the display screen SC 1. By doing so, only an information terminal 30 operated by a person who can see the display screen SC1 displayed on the display device 20 can be connected to the server 10. If the password is not correct, the CPU11 determines No in Act7 and proceeds to Act 8.
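The password confirmation of Act7 amounts to a string equality check between the character string sent by the terminal and the one shown on the display screen. A minimal sketch follows; the function name and argument shapes are illustrative assumptions, not the embodiment's actual implementation.

```python
def verify_operation_request(request_password: str, displayed_password: str) -> bool:
    """Act7: return True only when the password typed on the terminal
    matches the one shown in area a11 of the display screen SC1."""
    return request_password == displayed_password

# A terminal whose operator can see the display screen sends the right string.
assert verify_operation_request("1234", "1234") is True
# Any other terminal is rejected and receives the error response of Act8.
assert verify_operation_request("9999", "1234") is False
```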
In Act8, the CPU11 instructs the communication interface 15 to transmit an error response to the transmission source terminal, the error response being used to display an error screen indicating that the password is different on the touch panel 34 of the transmission source terminal. The error response includes error picture data for displaying an error picture. Upon receiving the instruction, the communication interface 15 transmits the error response to the transmission source terminal. The transmitted error response is received through the communication interface 35 of the transmission source terminal. The CPU11 returns to Act1 after the processing of Act 8.
On the other hand, the CPU31 of the transmission source terminal that transmitted the operation request waits to receive a response through the communication interface 35. If an error response has been received, the CPU31 generates an image corresponding to the error screen based on the error screen data contained in the error response. Further, the CPU31 instructs the touch panel 34 to display the image. Upon receiving the instruction, the touch panel 34 displays the error screen.
For example, the error screen is a screen in which a character string indicating that the password is incorrect is added to the my page SC2 shown in fig. 4.
On the other hand, if the password is correct, the CPU11 determines Yes in Act7 of fig. 3 and proceeds to Act 9.
In Act9, the CPU11 connects the server 10 and the transmission source terminal so as to be capable of bidirectional communication. As an example, WebSocket or the like is used for the bidirectional communication.
In Act10, the CPU11 starts the interlocking process for the transmission source terminal.
The CPU11 returns to Act1 after the processing of Act 10.
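The standby loop of Act1 to Act10 can be sketched as a small dispatcher: a connection request yields the my page response, an operation request is checked against the displayed password, and only a match opens the bidirectional connection. The names and return values below are illustrative assumptions.

```python
def handle_request(kind: str, password: str = "", displayed: str = "") -> str:
    """Dispatch one received request the way Act1 to Act10 do."""
    if kind == "connection":          # Act1 Yes -> Act6: send my page response
        return "my_page_response"
    if kind == "operation":           # Act2 Yes -> Act7: check the password
        if password != displayed:     # Act7 No -> Act8: send error response
            return "error_response"
        return "open_bidirectional"   # Act7 Yes -> Act9/Act10
    return "keep_waiting"             # neither request: stay in the loop

assert handle_request("connection") == "my_page_response"
assert handle_request("operation", "1234", "1234") == "open_bidirectional"
assert handle_request("operation", "0000", "1234") == "error_response"
```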
Next, an operation in which the information terminal 30 and the display device 20 of the information providing system 1 are linked will be described.
The CPU11 of the server 10 starts the interlocking process shown in fig. 5 for each of the information terminals 30 connected to the server 10 so as to be capable of bidirectional communication. However, in the following description, the processing for one information terminal 30 will be described.
Fig. 5 is a flowchart of the control process by the CPU11 of the server 10. The CPU11 executes the control process based on the control program stored in the main memory 12 or the auxiliary storage device 13. The CPU11 executes the control process shown in fig. 5 in a thread different from that of the control process shown in fig. 3, for example, and performs it in parallel with the control process shown in fig. 3. Further, when there are a plurality of information terminals 30 connected to the server 10 so as to be capable of bidirectional communication, the CPU11 executes a control process such as the one shown in fig. 5 for each of the information terminals 30 in a different thread, and performs these processes in parallel. When the CPU11 starts the control process shown in fig. 5, it secures a variable a with an initial value of 0 in the memory. The variable a indicates in which selection area the object corresponding to the information terminal 30 is located, by storing the area number associated with that selection area.
In Act21 of fig. 5, the CPU11 adds one object linked to the information terminal 30 to the display screen SC 1. When an icon image is associated with the user ID of the information terminal 30, the CPU11 uses that icon image as the appearance of the added object. When no icon image is associated with the user ID of the information terminal 30, the CPU11 uses a default icon image as the appearance of the object. The default icon image is stored in the auxiliary storage device 13 or the like in advance. In addition, there may be several kinds of default icon images. In this case, the CPU11 changes the kind of icon image used for the appearance of each object so that objects of the same appearance are not displayed on the display screen SC 1. Alternatively, the CPU11 may change the color of each object so that objects having the same appearance are not displayed on the display screen SC 1.
That is, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1 to which one object has been added. The output interface 14 outputs the video data after receiving the instruction. The outputted video data is inputted to the display device 20. The display device 20 displays the display screen SC1 to which one object has been added, based on the inputted video data. Fig. 6 is a diagram showing an example of the display screen SC1 to which objects have been added. Fig. 6 shows a state in which three objects OB11 to OB13 are displayed on the display screen SC 1. The number of objects displayed on the display screen SC1 corresponds to the number of information terminals 30 connected to the server 10 so as to be capable of bidirectional communication. Each of the objects corresponds one-to-one to each of the information terminals 30 connected to the server 10 so as to be capable of bidirectional communication. As an example, the position of an object is represented by coordinates (x, y), with the upper left corner of the display screen as the origin (0, 0); the coordinates of the object are (kx, ky). Further, the x coordinate increases toward the right of the screen, and the y coordinate increases toward the bottom of the screen. By performing the process of Act21, the computer having the CPU11 as the center operates as an object display means (object display unit, object display step) for displaying an object linked to the terminal on the information providing screen in accordance with the connection of the terminal.
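The coordinate convention above can be sketched as a small data structure; the class name, field names, and the concrete coordinate values are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class ScreenObject:
    """Position of an object on display screen SC1: the origin (0, 0) is
    the upper-left corner, x grows to the right, y grows downward."""
    kx: int
    ky: int

ob = ScreenObject(kx=3, ky=5)
ob.kx += 1   # one step to the right (right command)
ob.ky += 1   # one step downward (down command)
assert (ob.kx, ob.ky) == (4, 6)
```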
In Act22 of fig. 5, the CPU11 generates an operation screen command for causing the touch panel 34 of the information terminal 30 to display an operation screen. The operation screen command includes operation screen data for displaying an operation screen. Further, the CPU11 instructs the communication interface 15 to transmit the generated operation screen command to the information terminal 30. Upon receiving the instruction, the communication interface 15 transmits the operation screen command to the information terminal 30. The operation screen command that has been transmitted is received through the communication interface 35 of the information terminal 30.
On the other hand, the CPU31 of the information terminal 30 that has received the operation screen command starts the control process shown in fig. 7.
Fig. 7 is a flowchart of control processing by the CPU31 of the information terminal 30. The CPU31 executes the control processing based on the web browser stored in the main memory 32 or the auxiliary storage device 33 and the received operation screen command.
In Act41 of fig. 7, the CPU31 of the information terminal 30 generates an image corresponding to the operation screen based on the operation screen data contained in the operation screen command. Further, the CPU31 instructs the touch panel 34 to display the image. Upon receiving the instruction, the touch panel 34 displays the operation screen.
Fig. 8 is a diagram showing an example of an operation screen SC3 displayed on the touch panel 34 of the information terminal 30. The operation screen SC3 includes a right button B31, a left button B32, a down button B33, an up button B34, a DL button B35, a change button B36, an end button B37, and an area a 31. The right button B31, left button B32, lower button B33, and upper button B34 are cross keys. The right button B31 is operated by the operator of the information terminal 30 when the operator attempts to move the object displayed on the display device to the right. The left button B32 is operated by the operator of the information terminal 30 when the object displayed on the display device is to be moved to the left. The down button B33 is operated by the operator of the information terminal 30 when the operator intends to move the object displayed on the display device downward. The up button B34 is operated by the operator of the information terminal 30 when the object displayed on the display device is to be moved upward. The DL button B35 is operated by the operator of the information terminal 30 when the content corresponding to the button that has been currently selected on the display screen SC1 is to be downloaded. The change button B36 is operated by the operator of the information terminal 30 when the appearance of the object displayed on the display device is to be changed. The appearance of the change button B36 is the same as that of the object corresponding to the information terminal 30 displayed on the display device 20. Alternatively, the appearance of the change button B36 may be an appearance for enlarging or reducing the object. In this way, it becomes easy for the operator of the information terminal 30 to understand what appearance the object displayed on the display device 20 is operating by himself or herself. 
The end button B37 is operated by the operator of the information terminal 30 when the operation linked with the display device 20 in the information terminal 30 is to be ended. The area a31 displays information specifying the button that has been currently selected on the display screen SC 1.
In Act42 of fig. 7, the CPU31 confirms whether a selection command is received through the communication interface 35. The selection command will be described later. If the selection command is not received, the CPU31 determines No in Act42 and proceeds to Act 43.
In Act43, the CPU31 confirms whether the right button B31 has been touched. If the right button B31 is not touched, the CPU31 determines No in Act43 and proceeds to Act 44.
In Act44 the CPU31 confirms whether the left button B32 has been touched. If the left button B32 is not touched, the CPU31 determines No in Act44 and proceeds to Act 45.
In Act45, the CPU31 confirms whether the down button B33 has been touched. If the down button B33 is not touched, the CPU31 determines No in Act45 and proceeds to Act 46.
In Act46, the CPU31 confirms whether the up button B34 has been touched. If the up button B34 is not touched, the CPU31 determines No in Act46 and proceeds to Act 47.
In Act47, the CPU31 confirms whether the DL button B35 shown in fig. 8 has been touched. If the DL button B35 is not touched, the CPU31 determines No in Act47 and proceeds to Act 48.
In Act48, the CPU31 confirms whether the change button B36 has been touched. If the change button B36 is not touched, the CPU31 determines No in Act48 and proceeds to Act 49.
In Act49, the CPU31 confirms whether the end button B37 has been touched. If the end button B37 is not touched, the CPU31 determines No in Act49 and returns to Act 42. Thus, the CPU31 repeats Act42 to Act49 until a selection command is received or any one of the button B31 to button B37 is touched. If the right button B31 has been touched while in the standby accepted state of Act42 to Act49, the CPU31 determines Yes in Act43 and proceeds to Act 50.
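The loop of Act42 to Act49 repeatedly checks each button and, on a touch, branches to the Act that transmits the corresponding command. This can be sketched as a lookup table; the string identifiers for buttons and commands are illustrative assumptions.

```python
from typing import Optional

# Which command the terminal sends for each touched button (Act43-Act49
# branching into Act50 and following).
BUTTON_COMMANDS = {
    "B31": "right",   # move the object to the right
    "B32": "left",    # move the object to the left
    "B33": "down",    # move the object downward
    "B34": "up",      # move the object upward
    "B35": "dl",      # download content for the selected button
    "B36": "change",  # change the appearance of the object
    "B37": "end",     # end the operation linked with the display device
}

def command_for(button: str) -> Optional[str]:
    """Return the command to transmit to the server, or None to keep
    waiting in the Act42-Act49 loop."""
    return BUTTON_COMMANDS.get(button)

assert command_for("B31") == "right"
assert command_for("B99") is None  # unknown touch: stay in the loop
```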
In Act50, the CPU31 instructs the communication interface 35 to transmit a right command instructing to move the object to the right to the server 10. Upon receiving the instruction, the communication interface 35 transmits the right command to the server 10. The right command that has been sent is received through the communication interface 15.
Further, if the left button B32 is touched while in the standby accepted state of Act42 to Act49, the CPU31 makes a determination of Yes in Act44 and proceeds to Act 51.
In Act51, the CPU31 instructs the communication interface 35 to transmit a left command to the server 10 to instruct the object to move to the left. Upon receiving the instruction, the communication interface 35 transmits the left command to the server 10. The left command that has been sent is received through the communication interface 15.
Further, if the down button B33 is touched while in the standby accepted state of Act42 to Act49, the CPU31 makes a determination of Yes in Act45 and proceeds to Act 52.
In Act52, the CPU31 instructs the communication interface 35 to transmit a down command to the server 10 to instruct the object to move downward. The communication interface 35 transmits the down command to the server 10 upon receiving the instruction. The down command that has been transmitted is received through the communication interface 15.
Further, if the up button B34 has been touched while in the standby accepted state of Act42 to Act49, the CPU31 makes a determination of Yes in Act46 and proceeds to Act 53.
In Act53, the CPU31 instructs the communication interface 35 to transmit an up command instructing the server 10 to move the object upward. Upon receiving the instruction, the communication interface 35 transmits the up command to the server 10. The up command that has been sent is received through the communication interface 15.
On the other hand, in Act23 of fig. 5, the CPU11 of the server 10 confirms whether a right command or a left command is received through the communication interface 15. If the left or right command is not received, the CPU11 determines No in Act23 and proceeds to Act 24.
In Act24, the CPU11 confirms whether a down command or an up command is received through the communication interface 15. If the down command or the up command is not received, the CPU11 determines No in Act24 and proceeds to Act 25.
In Act25, the CPU11 confirms whether a DL command is received through the communication interface 15. The DL command will be described later. If a DL command is not received through the communication interface 15, the CPU11 determines No in Act25 and proceeds to Act 26.
In Act26, the CPU11 confirms whether a change command is received through the communication interface 15. The change command will be described later. If the change command is not received through the communication interface 15, the CPU11 determines No in Act26 and proceeds to Act 27.
In Act27, the CPU11 confirms whether an end command is received through the communication interface 15. The end command will be described later. If the end command is not received through the communication interface 15, the CPU11 determines No in Act27 and returns to Act 23. Thus, the CPU11 repeats Act23 to Act27 until a right command, a left command, a down command, an up command, a DL command, a change command, or an end command is received. If the CPU11 receives a right command or a left command while in the standby accepted state of Act23 to Act27, it determines Yes in Act23 and proceeds to Act 28.
In Act28, the CPU11 performs the left-right movement processing shown in fig. 9. However, if the command whose reception was confirmed in Act23 is a right command, the CPU11 performs the left-right movement processing with n = 1. Further, if the command whose reception was confirmed in Act23 is a left command, the CPU11 performs the left-right movement processing with n = -1.
Fig. 9 is a flowchart of control processing by the CPU11 of the server 10. The CPU11 executes the control processing based on the control program stored in the main memory 12 or the secondary storage device 13.
In Act81 of fig. 9, the CPU11 checks whether the position indicated by the coordinates (kx + n, ky) is within a range that has been set as a prohibited area. If (kx + n, ky) is a coordinate within the prohibited area, the CPU11 determines Yes in Act81 and ends the left-right movement processing. On the other hand, if (kx + n, ky) is not a coordinate within the prohibited area, the CPU11 determines No in Act81 and proceeds to Act 82.
In Act82, the CPU11 sets kx to kx + n. Then, the CPU11 moves the display position of the object linked to the information terminal 30 that is the transmission source of the right command or the left command to the right by n. That is, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1 in which the display position of the object has been shifted to the right by n. The output interface 14 outputs the video data after receiving the instruction. The outputted video data is inputted to the display device 20. The display device 20 displays the display screen SC1 in which the display position of the object is shifted to the right by n, based on the inputted video data. Therefore, by performing the process of Act82, the computer having the CPU11 as the center operates as a moving means (moving unit, moving step) for moving the display position of the object. In addition, the process of Act82 is performed only when the determination in Act81 is No. That is, the CPU11 does not perform the process of Act82 when the movement would place the object within the predetermined prohibited area. Therefore, by performing the process of Act81, the computer having the CPU11 as the center operates as a movement control means (movement control unit, movement control step) for controlling the moving unit so that the object does not enter a predetermined prohibited area within the information providing screen.
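The combination of Act81 and Act82 can be sketched as follows, with the prohibited area modeled as a set of coordinates; that representation and the function name are assumptions made for the sketch.

```python
def move_horizontally(kx: int, ky: int, n: int, prohibited) -> tuple:
    """Act81/Act82: move the object n steps in x (n = 1 for a right
    command, n = -1 for a left command), unless the destination lies in
    a prohibited area. `prohibited` is a set of (x, y) coordinates."""
    if (kx + n, ky) in prohibited:   # Act81 Yes: do not move
        return kx, ky
    return kx + n, ky                # Act82: kx becomes kx + n

forbidden = {(5, 2)}
assert move_horizontally(4, 2, 1, forbidden) == (4, 2)   # blocked by (5, 2)
assert move_horizontally(4, 2, -1, forbidden) == (3, 2)  # left move allowed
```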
In Act83, if (kx, ky) is within any one of the areas a13 to a17, the CPU11 acquires the area number associated with that area. If (kx, ky) is not within any of the areas a13 to a17, the CPU11 acquires 0 as the area number.
In Act84, the CPU11 confirms whether or not the area number acquired in Act83 is different from the value of the variable a. If it is the same as the value of the variable a, the CPU11 determines No in Act84 and ends the left-right movement processing. In contrast, if it is different from the value of the variable a, the CPU11 determines Yes in Act84 and proceeds to Act 85.
In Act85, the CPU11 changes the value of the variable a to the area number acquired in Act 83. By this processing, the button in the area associated with that area number enters the selected state. Therefore, by performing the process of Act85, the computer having the CPU11 as the center operates as a selection control means (selection control unit, selection control step) for setting the button in the selected state when an object enters a predetermined selection area surrounding the button displayed on the information providing screen.
In Act86, the CPU11 highlights the button displayed in the area whose area number equals the value of the variable a, that is, the button in the selected state. That is, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1 on which the button is highlighted. The output interface 14 outputs the video data after receiving the instruction. The outputted video data is inputted to the display device 20. The display device 20 displays the display screen SC1 in which the button is highlighted, based on the inputted video data. In addition, the highlighting of the button is, for example, blinking the button, changing the color of the button, thickening the frame of the button, or the like. Therefore, by performing the process of Act86, the computer having the CPU11 as the center operates as a highlighting means (highlighting unit, highlighting step) for highlighting the button in the selected state.
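Act83 to Act86 together map the object's coordinates to a selection area and decide whether the highlighted button changed. A sketch follows; the concrete area bounds and the rectangle representation are assumptions, since the embodiment does not specify how areas a13 to a17 are stored.

```python
# Illustrative selection areas: area number -> (x_min, x_max, y_min, y_max).
SELECTION_AREAS = {13: (0, 3, 0, 3), 14: (4, 7, 0, 3)}

def area_number(kx: int, ky: int) -> int:
    """Act83: the area number containing (kx, ky), or 0 when outside all."""
    for number, (x0, x1, y0, y1) in SELECTION_AREAS.items():
        if x0 <= kx <= x1 and y0 <= ky <= y1:
            return number
    return 0

def update_selection(variable_a: int, kx: int, ky: int) -> tuple:
    """Act84/Act85: return the new value of the variable a and whether
    the selected button changed (which triggers the redraw of Act86)."""
    number = area_number(kx, ky)
    return number, number != variable_a

assert area_number(2, 1) == 13
assert area_number(9, 9) == 0
assert update_selection(0, 2, 1) == (13, True)
assert update_selection(13, 2, 2) == (13, False)
```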
In Act87, the CPU11 changes the appearance of the object. That is, the CPU11 displays the graphic of a dialog box on the object. The dialog box includes a character string or the like indicating that a button is being selected. The dialog box OB121 shown in fig. 6 is an example of a dialog box displayed for the object OB12. The CPU11 then rotates the object as necessary. For example, when the appearance of the object indicates a predetermined direction like an arrow, the CPU11 rotates the object so that the direction of the arrow is toward the button being selected. The object OB12 shown in fig. 6 is an example: it has the same appearance as the object OB11, but is rotated so that it is oriented in the direction of the button being selected.
In Act88, the CPU11 generates a selection command containing a summary (brief description) of the content corresponding to the button in the selected state. However, if the area number acquired in Act83 is 0, the CPU11 generates the selection command so as not to include the summary of the content. Further, the CPU11 instructs the communication interface 15 to transmit the generated selection command to the information terminal 30. Upon receiving the instruction, the communication interface 15 transmits the selection command to the information terminal 30. The transmitted selection command is received through the communication interface 35 of the information terminal 30. After the process of Act88, the CPU11 ends the left-right movement processing. The CPU11 returns to Act23 after the left-right movement processing is completed.
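The selection command of Act88 carries the summary only when a button is actually selected. The dictionary keys below are an assumed wire format, not the one defined by the embodiment.

```python
def build_selection_command(area_no: int, summaries: dict) -> dict:
    """Act88: build the selection command sent to the terminal. When the
    area number is 0 no summary is included, so area a31 on the
    operation screen is left empty."""
    command = {"type": "selection"}
    if area_no != 0:
        command["summary"] = summaries[area_no]
    return command

summaries = {13: "Coupon: 10% off"}
assert build_selection_command(13, summaries) == {
    "type": "selection", "summary": "Coupon: 10% off"}
assert build_selection_command(0, summaries) == {"type": "selection"}
```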
On the other hand, in the standby accepted states of Act42 to Act49 in fig. 7, if the CPU31 of the information terminal 30 receives the selection command, it is determined Yes in Act42 and proceeds to Act 54.
In Act54, the CPU31 causes the summary of the content included in the selection command whose reception was confirmed in Act42 to be displayed in the area a31 of the operation screen SC 3. In addition, if the summary of the content is not contained in the selection command, the CPU31 displays nothing in the area a31 of the operation screen SC 3. That is, the CPU31 generates an image corresponding to the operation screen SC3 with the area a31 displayed as described above. Further, the CPU31 instructs the touch panel 34 to display the image. Upon receiving the instruction, the touch panel 34 displays the operation screen SC 3. The CPU31 returns to Act42 after the process of Act 54. The summary of the content included in the selection command is a summary of the content corresponding to the button in the selected state. That is, the content displayed in the area a31 is information specifying the button in the selected state. Therefore, the information specifying the button in the selected state is notified to the operator of the information terminal 30 or the like by the processing of Act 54. The selection command including the summary of the content is the command transmitted in Act 88. Therefore, by performing the process of Act88, the computer having the CPU11 as the center operates as a notification means (notification unit, notification step) for causing the terminal to notify information specifying the button in the selected state.
On the other hand, if the CPU11 of the server 10 receives a down command or an up command in the standby accepted states of Act23 to Act27 in fig. 5, it is determined as Yes in Act24 and the process proceeds to Act 29.
In Act29, the CPU11 performs the up-down movement processing shown in fig. 10. However, if the command whose reception was confirmed in Act24 is a down command, the CPU11 performs the up-down movement processing with m = 1. Further, if the command whose reception was confirmed in Act24 is an up command, the CPU11 performs the up-down movement processing with m = -1.
Fig. 10 is a flowchart of control processing by the CPU11 of the server 10. The CPU11 executes the control processing based on the control program stored in the main memory 12 or the secondary storage device 13.
In Act91 of fig. 10, the CPU11 checks whether the position indicated by the coordinates (kx, ky + m) is within a range that has been set as a prohibited area. If (kx, ky + m) is a coordinate within the prohibited area, the CPU11 determines Yes in Act91 and ends the up-down movement processing. On the other hand, if (kx, ky + m) is not a coordinate within the prohibited area, the CPU11 determines No in Act91 and proceeds to Act 92.
In Act92, the CPU11 sets ky to ky + m. Then, the CPU11 moves the display position of the object linked to the information terminal 30 that is the transmission source of the down command or the up command downward by m. That is, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1 in which the display position of the object is moved downward by m. The output interface 14 outputs the video data after receiving the instruction. The outputted video data is inputted to the display device 20. The display device 20 displays the display screen SC1 in which the display position of the object is moved downward by m, based on the inputted video data. Therefore, by performing the process of Act92, the computer having the CPU11 as the center operates as a moving means (moving unit, moving step) for moving the display position of the object. Further, the process of Act92 is performed only when the determination in Act91 is No. That is, the CPU11 does not perform the process of Act92 when the movement would place the object within the predetermined prohibited area. Therefore, by performing the process of Act91, the computer having the CPU11 as the center operates as a movement control means (movement control unit, movement control step) for controlling the moving unit so that the object does not enter a predetermined prohibited area within the information providing screen.
The CPU11 performs, as the processing of Act93 to Act98, the same processing as Act83 to Act88 of the left-right movement processing shown in fig. 9. However, when the CPU11 determines No in Act94, the up-down movement processing is ended. Further, the CPU11 ends the up-down movement processing after the processing of Act 98. The CPU11 returns to Act23 after the up-down movement processing is finished.
Further, if the DL button B35 is touched while the CPU31 of the information terminal 30 is in the standby accepted state of Act42 to Act49 in fig. 7, it is determined as Yes in Act47 and the process proceeds to Act 55.
In Act55, the CPU31 instructs the communication interface 35 to cause a DL command for downloading content corresponding to the button in the selected state to be transmitted to the server 10. Upon receiving the instruction, the communication interface 35 transmits the DL command to the server 10. The transmitted DL command is received through the communication interface 15 of the server 10.
On the other hand, if the CPU11 of the server 10 receives a DL command while in the standby accepted state of Act23 to Act27 in fig. 5, it determines Yes in Act25 and proceeds to Act 30.
In Act30, the CPU11 of the server 10 generates a content command containing content corresponding to the button in the selected state. However, if the value of the variable a is 0, the CPU11 generates a content command without including content. Further, the CPU11 instructs the communication interface 15 to cause the generated content command to be transmitted to the information terminal 30. Upon receiving the instruction, the communication interface 15 transmits the content command to the information terminal 30. The content command that has been transmitted is received through the communication interface 35 of the information terminal 30.
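The content command of Act30 carries the content only when a button is selected (variable a != 0). The payload shape and field names below are illustrative assumptions.

```python
def build_content_command(variable_a: int, contents: dict) -> dict:
    """Act30: include the content for the selected button, or send a
    content command without content when no button is selected
    (variable a == 0)."""
    if variable_a == 0:
        return {"type": "content"}
    return {"type": "content", "content": contents[variable_a]}

# Hypothetical content entry keyed by area number (an assumption).
contents = {13: {"kind": "coupon", "url": "https://example.com/coupon"}}
assert build_content_command(13, contents)["content"]["kind"] == "coupon"
assert "content" not in build_content_command(0, contents)
```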
On the other hand, in Act56 of fig. 7, the CPU31 of the information terminal 30 waits to receive a content command through the communication interface 35. If the content command is received, the CPU31 determines Yes in Act56 and proceeds to Act 57.
In Act57, the CPU31 performs processing corresponding to the content included in the content command. For example, the CPU31 performs processing for accessing a designated URL, processing for saving the content, and the like. The CPU31 stores content such as coupons and the like. In addition, if the content is not contained in the content command, the CPU31 ends the process of Act 57. The CPU31 returns to Act42 after the processing of Act 57.
Further, if the change button B36 shown in fig. 8 is touched while in the standby accepted states of Act42 to Act49, the CPU31 determines Yes in Act48 and proceeds to Act 58.
In Act58, the CPU31 causes a list of icon images stored in the auxiliary storage device 33 to be displayed on the operation screen SC3 shown in fig. 8. That is, the CPU31 generates an image corresponding to the screen on which the list of icon images is displayed on the operation screen SC 3. Further, the CPU31 instructs the touch panel 34 to cause the generated image to be displayed. Upon receiving the instruction, the touch panel 34 displays the screen.
The operator of information terminal 30 touches an icon image to be used as the appearance of the object from among the icon images displayed on operation screen SC 3.
In Act59 of fig. 7, the CPU31 waits for any one of the icon images in the list of icon images to be accepted to be touched. If the icon image has been touched, the CPU31 determines Yes in Act59 and proceeds to Act 60.
In Act60, the CPU31 generates a change command including an icon image touched by the operator of the information terminal 30. Further, the CPU31 instructs the communication interface 35 to transmit the generated change command to the server 10. Upon receiving the instruction, the communication interface 35 transmits the change command to the server 10. The transmitted change command is received through the communication interface 15 of the server 10.
On the other hand, if the CPU11 of the server 10 receives a change command while in the standby accepted states of Act23 to Act27 in fig. 5, it is determined as Yes in Act26 and the process proceeds to Act 31.
In Act31, the CPU11 stores the icon image in the user information DB131 in association with the user ID of the information terminal 30 that is the transmission source of the change command. Then, the CPU11 changes the appearance of the object linked to the information terminal 30 that is the transmission source of the change command to the icon image included in the change command whose reception was confirmed in Act 26. That is, the CPU11 instructs the output interface 14 to output video data corresponding to the display screen SC1 in which the appearance of the object is changed to the icon image. The output interface 14 outputs the video data after receiving the instruction. The outputted video data is inputted to the display device 20. The display device 20 displays the display screen SC1 in which the appearance of the object is changed to the icon image, based on the inputted video data. Therefore, by performing the process of Act31, the computer having the CPU11 as the center operates as a changing means (changing unit, changing step) for changing the appearance of the object based on the operation on the terminal.
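Act31 performs two updates: persisting the icon image against the user ID, and repainting the linked object. The sketch below stands in for both with plain dictionaries; the user information DB131 and the display state are of course not dicts, so this is an assumed simplification.

```python
def apply_change_command(user_db: dict, objects: dict,
                         user_id: str, icon: str) -> None:
    """Act31: persist the chosen icon image for the user ID and update
    the appearance of that user's object on the display screen."""
    user_db[user_id] = icon    # store the icon with the user ID (DB131)
    objects[user_id] = icon    # change the linked object's appearance

db, objs = {}, {"user-1": "default"}
apply_change_command(db, objs, "user-1", "cat_icon")
assert db["user-1"] == "cat_icon"
assert objs["user-1"] == "cat_icon"
```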
Further, if the end button B37 is touched while the CPU31 of the information terminal 30 is in the standby state of Act42 to Act49 in fig. 7, the CPU31 determines Yes in Act49 and proceeds to Act 61.
In Act61, the CPU31 instructs the communication interface 35 to transmit to the server 10 an end command for ending the operation linked with the display device 20. Upon receiving the instruction, the communication interface 35 transmits the end command, which is received through the communication interface 15 of the server 10.
On the other hand, if the CPU11 of the server 10 receives an end command while in the standby state of Act23 to Act27 in fig. 5, it determines Yes in Act27 and proceeds to Act 32.
In Act32, the CPU11 cuts off the bidirectional communication connection between the server 10 and the information terminal 30.
In Act33, the CPU11 erases the object linked to the information terminal 30 that is the transmission source of the end command. That is, the CPU11 instructs the output interface 14 to output the video data corresponding to the display screen SC1 from which the object has been erased. Upon receiving the instruction, the output interface 14 outputs the video data, which is input to the display device 20. Based on the input video data, the display device 20 displays the display screen SC1 from which the object has been erased. After the process of Act33, the CPU11 ends the linkage process for the information terminal 30 that is the transmission source of the end command.
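The end-command sequence (Act32 and Act33) can be sketched in the same illustrative style. The set-of-connections and dictionary-of-objects representation is an assumption for the example, not the patent's data model.

```python
# Hypothetical sketch of the server's end-command handling:
# Act32 cuts the connection, Act33 erases the object and re-renders SC1.

def handle_end_command(connections, objects, sender_id):
    # Act32: cut off the bidirectional connection to the sending terminal.
    connections.discard(sender_id)
    # Act33: erase the object linked to the sending terminal.
    objects.pop(sender_id, None)
    # Output new video data for display screen SC1 without the erased object.
    return {oid: obj["appearance"] for oid, obj in objects.items()}
```

After this runs, the linkage process for that terminal is finished; other terminals' objects remain on the screen untouched.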
The above description covers the process of changing the appearance of the object while an operation linked with the display device 20 is being performed. The following description covers the process of changing the appearance of the object while no operation linked with the display device 20 is being performed.
When the operator of the transmission source terminal wants to change the appearance of the object, the operator touches the button B22 on the my page SC2 shown in fig. 4.
When the button B22 is touched, the CPU31 of the transmission source terminal displays a list of icon images stored in the auxiliary storage device 33 on the my page SC2 shown in fig. 8, in the same manner as Act58 in fig. 7. The operator of the transmission source terminal then touches the icon image to be used as the appearance of the object from among the icon images displayed on the my page SC2.
When any one of the icon images in the list is touched, the CPU31 generates a change request including the touched icon image. The CPU31 then instructs the communication interface 35 to transmit the change request to the server 10. Upon receiving the instruction, the communication interface 35 transmits the change request, which is received through the communication interface 15 of the server 10.
On the other hand, if the CPU11 of the server 10 receives a change request while in the standby state of Act1 to Act3 in fig. 3, it determines Yes in Act3 and proceeds to Act 11.
In Act11, the CPU11 stores the icon image in the user information DB131 in association with the user ID of the transmission source terminal. In this way, when an operation linked with the display device 20 is later performed by the information terminal 30 whose user ID is registered in the service provided by the server 10, the appearance of the object displayed on the display device 20 becomes that icon image. Therefore, by performing the process of Act11, the computer centered on the CPU11 operates as a changing means (changing unit, changing step) for changing the appearance of the object based on the operation of the terminal. After the process of Act11, the CPU11 returns to Act1.
According to the information providing system 1 of the embodiment, the server 10 sets the prohibited area in advance in the display screen SC1 displayed on the display device 20. The server 10 then causes the display screen SC1 to display an object that moves on the display screen SC1 based on an operation performed on the information terminal 30. However, the server 10 controls the movement of the object so that the object does not enter the prohibited area. Therefore, the object does not overlap the character string, image, moving image (video), or the like displayed in the prohibited area, and that content is not obscured (made difficult to see).
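The movement control described above can be sketched as a check on the destination of each requested move. The rectangle representation `(x, y, w, h)` and the function names are assumptions for illustration; the patent does not specify how prohibited areas are stored.

```python
# Minimal sketch of prohibited-area movement control: a requested move is
# applied only if the destination stays outside every prohibited area.

def inside(rect, x, y):
    rx, ry, rw, rh = rect
    return rx <= x < rx + rw and ry <= y < ry + rh

def move_object(pos, dx, dy, prohibited_areas):
    """Return the object's new position, refusing moves into a prohibited area."""
    nx, ny = pos[0] + dx, pos[1] + dy
    if any(inside(area, nx, ny) for area in prohibited_areas):
        return pos  # destination is prohibited: the object stays put
    return (nx, ny)
```

Because the check runs before the position is updated, the object can travel along the edge of a prohibited rectangle but never enter it.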
Further, according to the information providing system 1 of the embodiment, the server 10 brings the button in any one of the areas A13 to A17 into the selected state when the object is located in that area. Therefore, even when a button is set as a prohibited area, the operator of the information terminal 30 can place the object in the surrounding area to set the button in the selected state.
Further, according to the information providing system 1 of the embodiment, the server 10 highlights the button in the selected state. Therefore, the operator of the information terminal 30 can easily tell which button is in the selected state.
Further, according to the information providing system 1 of the embodiment, the server 10 notifies the information terminal 30 linked with the object of information specifying which button has been set in the selected state by the object. Therefore, the operator of the information terminal 30 can easily identify which button has been set in the selected state by the object linked to the information terminal 30.
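The selection and notification behaviour can be illustrated together in one sketch. The mapping of button names to selection rectangles and the `notify` callback are assumptions made for the example, not part of the patent.

```python
# Illustrative sketch: when the object enters a button's surrounding selection
# area, that button becomes selected (and would be highlighted on SC1), and
# the linked terminal is notified which button is selected.

def update_selection(obj_pos, selection_areas, notify):
    """selection_areas: mapping of button name -> (x, y, w, h) rectangle."""
    for button, (rx, ry, rw, rh) in selection_areas.items():
        if rx <= obj_pos[0] < rx + rw and ry <= obj_pos[1] < ry + rh:
            notify(button)   # tell the linked terminal which button is selected
            return button    # this button is now in the selected state
    return None              # the object is outside every selection area
```

Returning the selected button's name stands in for the highlighting step; a real renderer would redraw SC1 with that button emphasized.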
Further, according to the information providing system 1 of the embodiment, the operator of the information terminal 30 can change the appearance of the object by operating the information terminal 30, and the server 10 changes the appearance of the object based on that operation. When appearances can be changed, a plurality of objects having different appearances may be displayed on the display device 20. In this case, the operator of the information terminal 30 can easily determine which of the plurality of objects he or she is operating.
The above-described embodiment may also be modified as follows.
In the above embodiment, the object is movable in four directions: up, down, left, and right. However, the object may also be moved in directions other than up, down, left, and right. In that case, the CPU31 causes the touch panel 34 to display buttons capable of indicating four or more directions. When the operated button indicates movement in a direction rotated counterclockwise by an angle θ from the rightward direction (taken as 0 degrees), the CPU31 generates a command including the angle θ. The CPU31 then instructs the communication interface 35 to transmit the generated command to the server 10. Upon receiving the instruction, the communication interface 35 transmits the command, which is received through the communication interface 15 of the server 10. The CPU11 of the server 10 that has received the command moves the object in the direction of the angle θ, and therefore performs the processing of Act81 and Act82 in fig. 9 and Act91 and Act92 in fig. 10 with, for example, n = cos θ and m = sin θ. After that, the CPU11 performs the processing of Act83 to Act88 of fig. 9.
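The angle-to-step conversion above can be sketched directly. Note that on a screen whose y axis grows downward, the sign of m may need to be flipped; this sketch uses plain mathematical coordinates, and the function name is an assumption for the example.

```python
import math

# Sketch of the generalized movement: the terminal sends the angle θ
# (counterclockwise from the rightward direction taken as 0 degrees),
# and the server moves the object by (n, m) = (cos θ, sin θ) per step.

def step_from_angle(theta_degrees):
    theta = math.radians(theta_degrees)
    n = math.cos(theta)  # horizontal component of one movement step
    m = math.sin(theta)  # vertical component of one movement step
    return n, m
```

For θ = 0 this reduces to a pure rightward step and for θ = 90 to a pure upward step, matching the four-direction behaviour of the embodiment as special cases.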
In the above-described embodiment, the information terminal 30 displays a list of icon images stored in the auxiliary storage device 33 when the appearance of the object is to be changed, and transmits the icon image selected by the operator from the list to the server 10. However, instead of selecting an icon image, the information terminal 30 may activate the camera 36 based on an operation performed by the operator and transmit the image captured by the camera 36 to the server 10. In this way, the server 10 that has received the image can use the image captured by the camera 36 as the appearance of the object. In this case, if the camera 36 captures the face of the operator of the information terminal 30 or the like, the appearance of the object can be made the face of the operator or the like.
In the above-described embodiment, in Act54 of fig. 7, the CPU31 causes the summary of the content included in the selection icon whose reception is confirmed in Act42 to be displayed in the area A31 of the operation screen SC3. However, the CPU31 may present the summary of the content in other ways. For example, the CPU31 may cause a speaker, not shown, to output the summary of the content as sound.
The information terminal 30 may use an application other than the web browser, such as an application dedicated to the information providing system 1, instead of the web browser.
While several embodiments of the invention have been described, these embodiments have been presented by way of example, and are not intended to limit the scope of the invention. These novel embodiments may be embodied in other various forms, and various omissions, substitutions, and changes may be made without departing from the spirit of the invention. These embodiments and modifications are included in the scope and gist of the invention, and are included in the invention described in the scope of claims and the equivalent scope thereof.
In the present invention, there is provided a terminal device including a processor, a memory, an interface, and a bus, wherein the processor, the memory, and the interface communicate with one another via the bus, and the memory stores at least one executable instruction that causes the processor to perform the operations corresponding to the control method. With such a configuration, it is possible to realize a function of effectively playing (reproducing) video data related to a transaction.
One or more embodiments of the present invention may be implemented as a computer-readable recording medium on which commands or instructions, such as program modules executed by a computer, are recorded. The computer-readable recording medium may be any medium that can be accessed by the computer, including volatile media. The computer-readable recording medium may also be a computer storage medium, or a communication medium, which may be any information transmission medium.
A computer-readable recording medium of the present invention stores a program for causing a computer provided in an information providing apparatus to function as: a screen display unit that causes a display device to display an information provision screen including content; an object display unit that displays an object linked with a terminal on the information providing screen, in accordance with a state in which the terminal is connected; a moving unit that moves a display position of the object based on an instruction from the terminal; and a movement control unit that controls the movement unit so that the object does not enter a predetermined prohibited area within the information providing screen.

Claims (8)

1. An information providing apparatus comprising:
a screen display unit that causes a display device to display an information provision screen including content;
an object display unit that displays an object linked with a terminal on the information providing screen, in accordance with a state in which the terminal is connected;
a moving unit that moves a display position of the object based on an instruction from the terminal;
a movement control unit that controls the movement unit so that the object does not enter a range in which a button for accessing the content is displayed, the range being set as a prohibited area within the information providing screen; and
a selection control unit that sets the button in a selected state when the object enters a predetermined selection area surrounding the button displayed on the information providing screen.
2. The information providing apparatus according to claim 1, further comprising:
a highlighting unit that highlights the button in the selected state.
3. The information providing apparatus according to claim 1, further comprising:
a notification unit configured to notify the terminal of information specifying the button in the selected state.
4. The information providing apparatus according to any one of claims 1 to 3, further comprising:
a changing unit that changes an appearance of the object based on an operation of the terminal.
5. A control method of an information providing apparatus, comprising the steps of:
a screen display step of causing a display device to display an information providing screen including content;
an object display step of displaying an object linked with a terminal on the information providing screen, in accordance with a state in which the terminal is connected;
a moving step of moving a display position of the object based on an instruction from the terminal;
a movement control step of controlling the movement step so that the object does not enter a range in which a button for accessing the content is displayed, the range being set as a prohibited area within the information providing screen; and
a selection control step of setting the button in a selected state when the object enters a predetermined selection area surrounding the button displayed on the information providing screen.
6. The control method according to claim 5, further comprising the steps of:
a highlighting step of highlighting the button in the selected state.
7. The control method according to claim 5, further comprising the steps of:
a notification step of notifying the terminal of information specifying the button in the selected state.
8. A terminal device, comprising: a processor, a memory, an interface and a bus, through which the processor, the memory and the interface communicate with each other,
the memory stores at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the control method according to any one of claims 5 to 7.
CN201710833212.1A 2016-09-15 2017-09-15 Information providing device, control method and terminal equipment Active CN107831975B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPJP2016-180601 2016-09-15
JP2016180601A JP6794200B2 (en) 2016-09-15 2016-09-15 Information providing equipment and programs

Publications (2)

Publication Number Publication Date
CN107831975A CN107831975A (en) 2018-03-23
CN107831975B true CN107831975B (en) 2021-06-08

Family

ID=61643856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710833212.1A Active CN107831975B (en) 2016-09-15 2017-09-15 Information providing device, control method and terminal equipment

Country Status (2)

Country Link
JP (1) JP6794200B2 (en)
CN (1) CN107831975B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101765826A (en) * 2007-09-27 2010-06-30 株式会社日立医药 Information display device
CN102204224A (en) * 2008-10-27 2011-09-28 日本电气株式会社 Information processing device
CN103616996A (en) * 2013-11-28 2014-03-05 Tcl集团股份有限公司 Mouse displaying method and system capable of displaying covered content
CN104423847A (en) * 2013-09-11 2015-03-18 联想(北京)有限公司 Information processing method, electronic device and external device
CN104881423A (en) * 2014-02-28 2015-09-02 东芝泰格有限公司 Information Providing Method And System Using Signage Device


Also Published As

Publication number Publication date
JP6794200B2 (en) 2020-12-02
CN107831975A (en) 2018-03-23
JP2018045506A (en) 2018-03-22

Similar Documents

Publication Publication Date Title
US9569097B2 (en) Video streaming in a web browser
JP5772023B2 (en) Information processing system and information processing method
US11693535B2 (en) Display apparatus, user terminal, control method, and computer-readable medium
US8495495B2 (en) Information processing apparatus, bookmark setting method, and program
US20120085819A1 (en) Method and apparatus for displaying using image code
KR20140144104A (en) Electronic apparatus and Method for providing service thereof
KR20040038860A (en) Method, apparatus, and program for image processing
JP2016157156A (en) Information processing apparatus, information processing system, information processing method, and program
TW200807246A (en) A two-way data transmission system and method, a display device and a microcomputer
US10575030B2 (en) System, method, and program for distributing video
EP2605527B1 (en) A method and system for mapping visual display screens to touch screens
US20140104183A1 (en) Method and device for controlling at least one apparatus by at least one other apparatus, system implementing such a device
JP6062984B2 (en) Information processing apparatus and information display method
CN111064983B (en) Display device
CN107037961B (en) Display device, image forming apparatus, and control method of display device
CN107831975B (en) Information providing device, control method and terminal equipment
US10114518B2 (en) Information processing system, information processing device, and screen display method
KR102117452B1 (en) Electronic Device and the Method for Producing Contents
KR102168340B1 (en) Video display device
JP6779778B2 (en) Display control device and its control method
KR100739767B1 (en) Method for providing photo development service in digital TV and apparatus therefor
JP2018205825A (en) Workflow server, information processing method, and program
KR101875485B1 (en) Electronic apparatus and Method for providing service thereof
JP6174515B2 (en) Operation support system
TW201741958A (en) Single page shopping system converging video streaming and commerce processes and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant