WO2014136400A1 - Portable terminal device, image generation method, and non-transitory computer-readable medium storing a program - Google Patents
- Publication number
- WO2014136400A1 (PCT application PCT/JP2014/000970)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- unit
- display unit
- screen
- image data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1641—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being formed by a plurality of foldable display components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1431—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/239—Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
- H04N21/2393—Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44209—Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/61—Network physical structure; Signal processing
- H04N21/6106—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
- H04N21/6131—Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via a mobile phone network
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6581—Reference data, e.g. a movie identifier for ordering a movie or a product identifier in a home shopping application
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1446—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2300/00—Aspects of the constitution of display devices
- G09G2300/02—Composition of display devices
- G09G2300/026—Video wall, i.e. juxtaposition of a plurality of screens to create a display screen of bigger dimensions
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0421—Horizontal resolution change
Definitions
- The present invention relates to a portable terminal device, an image generation method, and a non-transitory computer-readable medium storing a program, and in particular to a portable terminal device, an image generation method, and a program that generate image data indicating a displayed screen.
- In a mobile terminal device such as a mobile phone or a smartphone, user convenience can be improved by enlarging the screen.
- However, if the screen is enlarged too much, the entire mobile terminal device becomes large, which may make it inconvenient to carry. Therefore, by providing a plurality of screens on the mobile terminal device and adopting a structure that can be folded or slid, inconvenience in carrying can be avoided.
- There is a technique for generating and storing image data indicating a screen displayed on a display device, commonly called a screenshot, screen capture, screen dump, or print screen. This technique is also applied to mobile phones and the like.
- Patent Document 1 discloses a mobile phone that displays program content on a first display unit and displays, on a second display unit, an image capture button for capturing an image of the program content displayed on the first display unit.
- Patent Document 2 discloses a portable terminal device that can capture at least one of an image input by a first camera unit, an image input by a second camera unit, and a received image, store it in a storage unit as a still image or a moving image, and that can also capture a plurality of images, combine them into one image, and store the result.
- In the technique of Patent Document 1, only an image of the content displayed on one display unit is captured.
- In the technique of Patent Document 2, when images taken by two cameras are displayed on two screens, they are merely captured and combined into one image. That is, with the techniques of Patent Document 1 and Patent Document 2, when content is displayed on at least one of the plurality of display units of a mobile terminal device using application software, it is difficult to select the display unit to be captured and to generate data indicating the displayed screen.
- An object of the present invention is to solve such problems and to provide a portable terminal device, an image generation method, and a non-transitory computer-readable medium storing a program that can easily generate, by software, data indicating the screen displayed on a display unit according to the selection state of the display unit.
- A mobile terminal device includes: a plurality of display units that display images; display control means that displays content on at least one of the plurality of display units using one or more pieces of software; selection means that selects one or more of the plurality of display units; display screen acquisition means that acquires screen data indicating the screen displayed on the display unit selected by the selection means; and image data generation means that generates image data based on the screen data acquired by the display screen acquisition means.
- The image generation method displays content on at least one of a plurality of display units of the mobile terminal device using one or more pieces of software, selects one or more of the display units, acquires screen data indicating the screen displayed on the selected display unit, and generates image data based on the acquired screen data.
- A program stored in a non-transitory computer-readable medium causes a computer to execute: a display step of displaying content on at least one of a plurality of display units of a mobile terminal device using one or more pieces of software; a selection step of selecting one or more of the display units; a display screen acquisition step of acquiring screen data indicating the screen displayed on the selected display unit; and an image data generation step of generating image data based on the screen data acquired in the display screen acquisition step.
- According to the present invention, a portable terminal device, an image generation method, and a non-transitory computer-readable medium storing a program that can easily generate, by software, data indicating the screen displayed on a display unit according to the selection state of the display unit can be provided.
- FIG. 1 is a plan view showing an external appearance of a mobile terminal device according to a first embodiment
- 1 is a diagram illustrating a configuration of a mobile terminal device according to a first exemplary embodiment
- FIG. 3 is a diagram illustrating each component realized by a control unit according to the first embodiment.
- 3 is a flowchart showing processing according to the first exemplary embodiment
- 3 is a flowchart showing processing according to the first exemplary embodiment
- FIG. 6 is a diagram for explaining processing according to the first embodiment
- FIG. 6 is a diagram for explaining processing according to the first embodiment
- FIG. 6 is a diagram for explaining processing according to the first embodiment
- FIG. 6 is a diagram for explaining processing according to the first embodiment
- FIG. 6 is a diagram for explaining processing according to the first embodiment
- FIG. 1 is a diagram showing an outline of a mobile terminal device 1 according to an embodiment of the present invention.
- The mobile terminal device 1 includes a first display unit 2 and a second display unit 4, which are a plurality of display units, a display control unit 6, a selection unit 8, a display screen acquisition unit 10, and an image data generation unit 12.
- the first display unit 2 and the second display unit 4 display images.
- the display control means 6 displays content on at least one of the first display unit 2 and the second display unit 4 using one or more software.
- the selection unit 8 selects one or more of the first display unit 2 and the second display unit 4.
- the display screen acquisition unit 10 acquires screen data indicating the screen displayed on the display unit selected by the selection unit 8.
- the image data generation unit 12 generates image data based on the screen data acquired by the display screen acquisition unit 10.
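The cooperation of these means can be illustrated with a minimal, runnable Python sketch. This is a toy model only: the names `Display`, `MobileTerminal`, and the string-joining "image data" are illustrative assumptions, not part of the disclosed embodiment.

```python
class Display:
    """Stand-in for one display unit: it holds the screen it currently shows."""
    def __init__(self, name):
        self.name = name
        self.screen = None  # None means the display shows no content

class MobileTerminal:
    """Toy model of the means shown in FIG. 1."""
    def __init__(self):
        self.displays = [Display("first"), Display("second")]

    def display_content(self, index, content):
        # display control means 6: put content on one of the display units
        self.displays[index].screen = content

    def select(self):
        # selection means 8: here, select every display currently showing content
        return [d for d in self.displays if d.screen is not None]

    def acquire_screens(self, selected):
        # display screen acquisition means 10: read back the displayed screens
        return [d.screen for d in selected]

    def generate_image_data(self, screens):
        # image data generation means 12: stand-in for format conversion
        return "|".join(screens)

terminal = MobileTerminal()
terminal.display_content(0, "content A")
image = terminal.generate_image_data(
    terminal.acquire_screens(terminal.select()))
```

With only the first display showing content, the generated "image data" contains that screen alone; if content is later shown on the second display as well, both screens are included.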
- According to the mobile terminal device 1 of the embodiment of the present invention, it is possible to easily generate, by software, data indicating the screen displayed on a display unit according to the selection state of the display unit.
- FIG. 2 is a plan view illustrating an appearance of the mobile terminal device 100 according to the first embodiment.
- FIG. 3 is a diagram illustrating the configuration of the mobile terminal device 100 according to the first embodiment.
- substantially the same components are denoted by the same reference numerals.
- the mobile terminal device 100 is, for example, a mobile phone, a smartphone, a tablet PC (Personal Computer), a mobile game machine, or the like.
- The mobile terminal device 100 includes a first housing 102 and a second housing 104.
- a first display portion 112 is provided in the first housing 102.
- the second housing 104 is provided with a second display unit 114.
- At least one of the first display unit 112 and the second display unit 114 is configured as a touch screen in which a display device and an input device are integrated. That is, the mobile terminal device 100 is configured so that the user can perform touch operations by bringing an operating body such as a finger into contact with the surface of the first display unit 112 or the second display unit 114.
- The portable terminal device 100 is configured to be easily carried by stacking the first housing 102 and the second housing 104 when not in use. For example, it may be foldable via a hinge (not shown) provided between the first housing 102 and the second housing 104. Alternatively, the first housing 102 and the second housing 104 may slide relative to each other so that, when the portable terminal device 100 is not in use, the second housing 104 is stored under the first housing 102, or the first housing 102 is stored under the second housing 104.
- an input key 120 is provided on the side surface of the second housing 104.
- the input key 120 includes, for example, a power button 120a, a volume increase button 120b, and a volume decrease button 120c.
- the mobile terminal device 100 can be turned on and off by pressing the power button 120a.
- the volume of the mobile terminal device 100 can be increased by pressing the volume increase button 120b.
- the volume of the mobile terminal device 100 can be reduced by pressing the volume reduction button 120c.
- Display screen acquisition processing is started by operating the input key 120 in a predetermined manner.
- Display screen acquisition processing is processing that generates image data (screen data) indicating a screen displayed on at least one of the first display unit 112 and the second display unit 114, such as a screenshot, screen capture, screen dump, or print screen.
- An imaging unit 130 is provided in the first casing 102 or the second casing 104.
- The imaging unit 130 is, for example, a camera; it images a subject and generates image data of the captured image.
- the first housing 102 or the second housing 104 is provided with a microphone 132 and a speaker 134. Furthermore, the first housing 102 or the second housing 104 is provided with an LED (Light Emitting Diode) 136 that is turned on under the control of the control unit 200 described later.
- The wireless communication unit 140 performs wireless communication with other devices via the antenna 140a to transmit and receive information.
- The voice conversion unit 144 converts the voice data received by the wireless communication unit 140 into sound and causes the speaker 134 to output it.
- The voice conversion unit 144 also converts the voice data output from the control unit 200 into sound and causes the speaker 134 to output it. Further, the voice conversion unit 144 converts the voice collected by the microphone 132 into voice data and outputs the voice data to the wireless communication unit 140 or the control unit 200.
- the storage unit 146 is, for example, a ROM (Read Only Memory) or a RAM (Random Access Memory), and stores application software that executes various functions under the control of the control unit 200.
- the memory 148 is a nonvolatile memory, for example, and stores various data under the control of the control unit 200.
- The control unit 200 includes, for example, a central processing unit (CPU), a main storage device, input/output ports, and the like, and executes various controls using various application software (programs) stored in the storage unit 146.
- The control unit 200 controls the operations of the first display unit 112, the second display unit 114, the imaging unit 130, the LED 136, the wireless communication unit 140, the storage unit 146, and the memory 148 in accordance with the user's operation on the input key 120 or on the input device provided in the first display unit 112 or the second display unit 114.
- control unit 200 executes application software (hereinafter simply referred to as “application”) to generate various contents. Further, the control unit 200 displays the generated content on the first display unit 112 and the second display unit 114, or any one of them.
- the control unit 200 can simultaneously execute a plurality of applications. In this case, the control unit 200 can display the generated plurality of contents on the first display unit 112 and the second display unit 114, respectively.
- the control unit 200 can perform display screen acquisition processing to acquire screen data indicating a screen including content displayed on the first display unit 112 and the second display unit 114. Details will be described below.
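As a rough illustration of how concurrently executed applications each supply content to one of the two display units, consider the following hedged Python sketch. The function name `run_apps` and the content strings are invented for illustration; real content generation and display control are far richer.

```python
def run_apps(apps):
    """Each application generates its own content (e.g. 'content A' for
    application A); contents are then assigned to the first and second
    display units in order. Displays with no content are omitted."""
    contents = ["content " + app for app in apps]
    return dict(zip(["first display unit", "second display unit"], contents))
```

Running two applications fills both display units; running one fills only the first, mirroring the single-display cases of FIGS. 7A and 7B.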
- FIG. 4 is a diagram illustrating each component realized by the control unit 200 according to the first embodiment illustrated in FIG. 3.
- 5 and 6 are flowcharts illustrating processing performed by the control unit 200 according to the first embodiment.
- FIGS. 7A to 13 are diagrams for explaining processing performed by the control unit 200 according to the first embodiment.
- The control unit 200 includes an application execution unit 202, a display control unit 204, a screen acquisition operation reception unit 206, a display screen acquisition unit 210, a selection unit 212, a selection operation reception unit 214, an image data generation unit 220, and an image data storage unit 222.
- Each of these components can be realized, for example, by executing a program under the control of an arithmetic device (not shown) provided in the control unit 200, which is a computer. More specifically, each component is realized by loading a program stored in the storage unit 146 into a main storage device (not shown) and executing it under the control of the arithmetic device.
- Each component is not limited to being realized in software by a program; it may be realized by any combination of hardware, firmware, and software.
- the control unit 200 executes the application and displays the content on at least one of the first display unit 112 and the second display unit 114 (S102). Specifically, the application execution unit 202 acquires application data from the storage unit 146 and executes the application. The application execution unit 202 generates content corresponding to the application by executing the application. Further, the application execution unit 202 outputs the generated content to the display control unit 204. The display control unit 204 performs necessary processing such as conversion processing on the content generated by the application execution unit 202. The display control unit 204 controls at least one of the first display unit 112 and the second display unit 114 so as to display the content.
- For example, when the application execution unit 202 executes the application A and generates the content A, the display control unit 204 may cause the first display unit 112 to display the content A as illustrated in FIG. 7A. At this time, the display control unit 204 may control the second display unit 114, which is not displaying content, to turn off its display, thereby saving power. Similarly, when the application execution unit 202 executes the application B and generates the content B, the display control unit 204 may cause the second display unit 114 to display the content B as illustrated in FIG. 7B. At this time, the display control unit 204 may control the first display unit 112, which is not displaying content, to turn off its display.
- For example, when the application execution unit 202 executes the application A and generates the content A, the display control unit 204 may display the content A on both the first display unit 112 and the second display unit 114 as illustrated in FIG. 8. At this time, the display control unit 204 may generate content Aa indicating the right side of the content A and content Ab indicating the left side of the content A, and may then display the content Aa on the first display unit 112 and the content Ab on the second display unit 114.
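The left/right split described above can be modeled, under the simplifying assumption that content is a one-dimensional row of pixels, by a small sketch; `split_content` is a hypothetical helper, not a function of the disclosed device.

```python
def split_content(content):
    """Split content A into content Ab (left half, for the second display
    unit) and content Aa (right half, for the first display unit)."""
    mid = len(content) // 2
    content_ab, content_aa = content[:mid], content[mid:]
    return content_ab, content_aa
```

The two halves together reproduce the original content, so displaying Ab on the second display unit and Aa on the first yields one continuous screen.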
- the application execution unit 202 may execute the application A and the application B at the same time.
- the application execution unit 202 may generate the content A by executing the application A and generate the content B by executing the application B.
- the display control unit 204 may display the content A on the first display unit 112 and display the content B on the second display unit 114.
- The control unit 200 receives a display screen acquisition operation for starting the display screen acquisition processing (S104). Specifically, for example, when the user performs a predetermined operation on the input key 120, the screen acquisition operation reception unit 206 receives a signal indicating the screen acquisition operation and outputs this signal to the display screen acquisition unit 210. Thereby, the display screen acquisition unit 210 starts the display screen acquisition processing.
- Alternatively, when the user performs a predetermined operation on the input device provided in the first display unit 112 or the second display unit 114, the screen acquisition operation reception unit 206 may receive a signal indicating the screen acquisition operation.
- The control unit 200 determines whether content is displayed on both of the first display unit 112 and the second display unit 114 or on only one of them (S106). Specifically, the display screen acquisition unit 210 determines whether the display control unit 204 is displaying content on both the first display unit 112 and the second display unit 114.
- When content is displayed on only one display unit, the control unit 200 determines on which of the first display unit 112 and the second display unit 114 the content is displayed (S108). Specifically, for example, as illustrated in FIG. 7A, when the display control unit 204 displays content on the first display unit 112, the selection unit 212 selects the “first display unit 112”. On the other hand, as illustrated in FIG. 7B, when the display control unit 204 displays content on the second display unit 114, the selection unit 212 selects the “second display unit 114”. Then, the display screen acquisition unit 210 determines which of the first display unit 112 and the second display unit 114 the selection unit 212 has selected.
- When the first display unit 112 is selected, the control unit 200 acquires screen data indicating the screen displayed on the first display unit 112 (S110). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112. As an example of the method for acquiring the screen data, a signal transmitted from the control unit 200 to the first display unit 112 may be acquired and the screen data may be generated from that signal; alternatively, a capture of the screen displayed on the first display unit 112 may be taken.
- Next, the control unit 200 generates image data corresponding to the screen displayed on the first display unit 112 (S112). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate the image data. For example, as illustrated in FIG. 7A, when the screen including the content A is displayed on the first display unit 112, the image data generation unit 220 generates image data indicating an image of the screen including the content A, as illustrated in FIG. 10A. Next, the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S114).
- When the second display unit 114 is selected, the control unit 200 acquires screen data indicating the screen displayed on the second display unit 114 (S120). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the second display unit 114 by the same method as in the process of S110.
- the control unit 200 generates image data corresponding to the screen displayed on the second display unit 114 (S122). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate image data. For example, when a screen including the content B is displayed on the second display unit 114 as illustrated in FIG. 7B, the image data generation unit 220 generates image data indicating an image of the screen including the content B, as illustrated in FIG. 10B.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S114).
- In this way, image data relating to the screen displayed on the display unit that is displaying content can be generated. Further, in this case, after the user performs the display screen acquisition operation, the image data can be generated without the user performing any further operation.
- the control unit 200 determines whether content is displayed on both the first display unit 112 and the second display unit 114 by the same application (S130). Specifically, when the display control unit 204 displays content on both the first display unit 112 and the second display unit 114 by one application (application A in the example of FIG. 8), the selection unit 212 selects both the “first display unit 112” and the “second display unit 114”. That is, when content is displayed on both display units by the same application, the selection unit 212 automatically selects both.
- On the other hand, when the display control unit 204 displays content on the first display unit 112 and the second display unit 114 by a plurality of applications (application A and application B in the example of FIG. 9), the selection unit 212 does not select a display unit at this point. In this case, as will be described later, after a selection operation is performed, the selection unit 212 selects at least one of the “first display unit 112” and the “second display unit 114” according to the selection operation.
- the display screen acquisition unit 210 determines which of the first display unit 112 and the second display unit 114 the selection unit 212 has selected.
- Suppose the selection unit 212 has selected both the “first display unit 112” and the “second display unit 114”.
- the control unit 200 acquires two pieces of screen data indicating the screens displayed on the first display unit 112 and the second display unit 114, respectively (S132).
- the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112 by the same method as the process of S110. Similarly, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the second display unit 114. For example, in the example of FIG. 8, the display screen acquisition unit 210 acquires screen data a indicating the screen a including the content Aa displayed on the first display unit 112. Similarly, the display screen acquisition unit 210 acquires screen data b indicating the screen b including the content Ab displayed on the second display unit 114.
- the control unit 200 generates image data corresponding to the screens displayed on the first display unit 112 and the second display unit 114 (S134). Specifically, the image data generation unit 220 combines the two pieces of screen data acquired by the display screen acquisition unit 210. Further, the image data generation unit 220 performs necessary processing such as data format conversion on the combined screen data to generate image data.
- For example, when the screen a including the content Aa is displayed on the first display unit 112 and the screen b including the content Ab is displayed on the second display unit 114, the image data generation unit 220 generates image data indicating an image obtained by combining the screen a and the screen b, as illustrated in FIG. 11. That is, although the content A is displayed separated across the two display units by the single application A as illustrated in FIG. 8, the image data generated by the display screen acquisition process is not separated but combined into one, as illustrated in FIG. 11.
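A minimal sketch of this combining step, with screens modeled as lists of pixel rows; this data shape is an assumption for illustration, as the actual device would merge bitmap data:

```python
# Illustrative sketch: combine the screens of the two display units into a
# single image so the content is no longer separated (S132-S134).
# Modeling screens as lists of pixel rows is an assumption for illustration.

def combine_screens(screen_a, screen_b, layout="vertical"):
    """Join two screens into one image."""
    if layout == "vertical":
        # screen b is placed below screen a
        return screen_a + screen_b
    # horizontal layout: each row of screen b is placed beside screen a's row
    return [row_a + row_b for row_a, row_b in zip(screen_a, screen_b)]

screen_a = [["Aa", "Aa"], ["Aa", "Aa"]]  # screen a including content Aa
screen_b = [["Ab", "Ab"], ["Ab", "Ab"]]  # screen b including content Ab
combined = combine_screens(screen_a, screen_b)
```

Whether the two screens are stacked vertically or side by side would depend on the physical arrangement of the display units; both variants are shown above.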
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S114).
- the control unit 200 shifts to the display unit selection operation mode (S140).
- the selection unit 212 controls the selection operation reception unit 214 to enter a state (display unit selection operation mode) for receiving an operation for selecting a display unit.
- In this mode, when the user performs a predetermined operation, the user can select the display unit whose screen is to be acquired.
- Since the first display unit 112 is configured as a touch screen, the user can perform the selection operation (first operation) by tapping the surface of the first display unit 112.
- “tapping” refers to an operation of bringing a finger into contact with the surface of the display unit and releasing the finger within a predetermined time (for example, within 1 second).
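This definition of a tap can be expressed as a small check; the constant and function names are assumptions for illustration:

```python
# Illustrative sketch: a touch counts as a "tap" only if the finger is
# released within a predetermined time (1 second in the description above).
TAP_THRESHOLD_SECONDS = 1.0  # "within 1 second"; the value is an example

def is_tap(touch_down_time, touch_up_time):
    """Return True if the contact qualifies as a tap operation."""
    return (touch_up_time - touch_down_time) <= TAP_THRESHOLD_SECONDS
```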
- When the user taps the first display unit 112, the selection operation reception unit 214 receives the selection operation.
- the display control unit 204 controls the display brightness of the second display unit 114 to be darker than that of the first display unit 112, as illustrated in FIG. 12B. If a tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects the “first display unit 112”.
- When the user taps again, the display control unit 204 controls the display brightness of the first display unit 112 to be darker than that of the second display unit 114 and the display brightness of the second display unit 114 to be brighter than that of the first display unit 112, as illustrated in FIG. 12C. If a tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects the “second display unit 114”.
- When the user taps once more, the display control unit 204 controls both the first display unit 112 and the second display unit 114 to be equally bright, as illustrated in FIG. 12A. If a tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects “the first display unit 112 and the second display unit 114”.
- In this way, the display of the first display unit 112 and the second display unit 114 changes each time a selection operation (for example, a tap operation) is performed, so the user can easily grasp which display unit will be selected if no selection operation is performed for a certain time in this state (that is, which display unit is the current selection candidate). Further, the user can switch the selection candidate by a simple operation such as a tap operation.
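The cycling of selection candidates with each tap can be sketched as a small state machine; the names and the list encoding are assumptions for illustration, not the patent's implementation:

```python
# Illustrative sketch of the first embodiment's selection cycle: each tap
# advances the selection candidate, and display units that are not
# candidates are dimmed.

CANDIDATE_CYCLE = [
    {"first"},            # 1st tap: first display unit 112 (FIG. 12B)
    {"second"},           # 2nd tap: second display unit 114 (FIG. 12C)
    {"first", "second"},  # 3rd tap: both display units (FIG. 12A)
]

def candidate_after(tap_count):
    """Selection candidate reached after tap_count taps (tap_count >= 1)."""
    return CANDIDATE_CYCLE[(tap_count - 1) % len(CANDIDATE_CYCLE)]

def brightness(display, candidates):
    """Candidates stay bright; non-candidates are darkened."""
    return "bright" if display in candidates else "dark"
```

If no further tap arrives within the confirmation period (5 seconds in the description), the current candidate becomes the selection.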
- Next, the control unit 200 determines whether the display unit selection process has been completed (S142). Specifically, as described above, the selection unit 212 determines whether a certain period of time has elapsed without a selection operation being performed. When it is determined that the selection process has not been completed (that is, when a selection operation is performed within the predetermined time) (NO in S142), the control unit 200 repeats the process of S140.
- When it is determined that the selection process has been completed (YES in S142), the control unit 200 determines which of the first display unit 112 and the second display unit 114 has been selected (S146). Specifically, the display screen acquisition unit 210 determines whether the selection unit 212 has selected the “first display unit 112”, the “second display unit 114”, or the “first display unit 112 and the second display unit 114”.
- the control unit 200 acquires screen data indicating the screen displayed on the first display unit 112 (S150). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112 as in the process of S110. Next, the control unit 200 generates image data corresponding to the screen displayed on the first display unit 112 (S152). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate image data.
- For example, suppose that the screen including the content A is displayed on the first display unit 112 and the screen including the content B is displayed on the second display unit 114, as illustrated in FIG. 12B. In this case, when the first display unit 112 is selected, the image data generation unit 220 generates image data indicating an image of the screen including the content A, as illustrated in FIG. 10A.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S154).
- the control unit 200 acquires screen data indicating the screen displayed on the second display unit 114 (S160). Specifically, the display screen acquisition unit 210 acquires the screen data as in the process of S120. Next, the control unit 200 generates image data corresponding to the screen displayed on the second display unit 114 (S162). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate image data.
- For example, suppose that the screen including the content A is displayed on the first display unit 112 and the screen including the content B is displayed on the second display unit 114, as illustrated in FIG. 12C. In this case, when the second display unit 114 is selected, the image data generation unit 220 generates image data indicating an image of the screen including the content B, as illustrated in FIG. 10B.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S154).
- When both display units are selected, the control unit 200 acquires two pieces of screen data indicating the screens displayed on both the first display unit 112 and the second display unit 114 (S170).
- the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112 as in the process of S132.
- the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the second display unit 114.
- the control unit 200 generates image data corresponding to the screens displayed on the first display unit 112 and the second display unit 114 (S172). Specifically, the image data generation unit 220 combines the two pieces of screen data acquired by the display screen acquisition unit 210. Further, the image data generation unit 220 performs necessary processing such as data format conversion on the combined screen data to generate image data.
- For example, when the screen A including the content A is displayed on the first display unit 112 and the screen B including the content B is displayed on the second display unit 114, the image data generation unit 220 generates image data indicating an image obtained by combining the screen A and the screen B, as illustrated in FIG. 13.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S154).
- the user can repeatedly perform a selection operation (for example, a tap operation) and select a screen to be acquired according to the number of selection operations.
- In the example of FIGS. 12A to 12C, assuming the initial state is that of FIG. 12A, when the user performs the selection operation once, the state of FIG. 12B is obtained, so the selection unit 212 selects the first display unit 112. That is, the display screen acquisition unit 210 can acquire the display screen of the first display unit 112. When the user performs the selection operation twice, the state of FIG. 12C is obtained, so the selection unit 212 selects the second display unit 114. That is, the display screen acquisition unit 210 can acquire the display screen of the second display unit 114.
- When the user performs the selection operation three times, the selection unit 212 selects the first display unit 112 and the second display unit 114. That is, the display screen acquisition unit 210 can acquire the display screen of the first display unit 112 and the display screen of the second display unit 114.
- In this case, the image data generation unit 220 can generate image data in which the display screen of the first display unit 112 and the display screen of the second display unit 114 are combined.
- In this way, the display unit is automatically selected according to the number of selection operations, so the user can easily obtain image data relating to a desired screen simply by performing a single operation such as a tap operation.
- FIG. 14 is a flowchart of a process performed by the control unit 200 according to the second embodiment.
- The flowchart of FIG. 14 is obtained by replacing S130 to S134 of FIG. 5. FIGS. 15A to 17B are diagrams for explaining the processing according to the second embodiment.
- the control unit 200 determines whether content is displayed on both the first display unit 112 and the second display unit 114 by the same application (S202). Specifically, the display screen acquisition unit 210 determines whether the display control unit 204 is displaying content on both the first display unit 112 and the second display unit 114 by one application (application A in the example of FIG. 8).
- the selection unit 212 controls the selection operation reception unit 214 to enter a state (display unit selection operation mode) for receiving an operation for selecting a display unit.
- In this display unit selection operation mode, when the user performs a predetermined operation, the user can select the display unit whose screen is to be acquired. Furthermore, when acquiring the screens displayed on both the first display unit 112 and the second display unit 114, the user can select whether to generate image data in which the two screens are combined or to generate image data for each screen separately.
- When the user taps the first display unit 112, the selection operation reception unit 214 receives the selection operation.
- the display control unit 204 controls the display brightness of the second display unit 114 to be darker than that of the first display unit 112 as illustrated in FIG. 15B. If a tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects the “first display unit 112”.
- When the user taps again, the display control unit 204 controls the display brightness of the first display unit 112 to be darker than that of the second display unit 114 and the display brightness of the second display unit 114 to be brighter than that of the first display unit 112, as illustrated in FIG. 15C.
- If the tap operation (selection operation) is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects the “second display unit 114”.
- When the user taps again, the display control unit 204 controls the display units so that the state in which the display brightness of the first display unit 112 is brighter than that of the second display unit 114 and the state in which the display brightness of the second display unit 114 is brighter than that of the first display unit 112 alternate at predetermined intervals (for example, every 0.2 seconds), as illustrated in FIG. 16.
- That is, the display control unit 204 first makes the display brightness of the first display unit 112 brighter than that of the second display unit 114. When, for example, 0.2 seconds have elapsed, the display control unit 204 makes the display brightness of the second display unit 114 brighter than that of the first display unit 112. After another 0.2 seconds, the display control unit 204 again makes the display brightness of the first display unit 112 brighter than that of the second display unit 114. Thereafter, the process of making one side brighter and the other darker is repeated. If the tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects “the first display unit 112 and the second display unit 114” and “do not combine”.
- When the user taps once more, the display control unit 204 controls both the first display unit 112 and the second display unit 114 to be equally bright, as illustrated in FIG. 15A. If the tap operation is not performed for a certain time (for example, 5 seconds) in this state, the selection unit 212 selects “the first display unit 112 and the second display unit 114” and “combine”.
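The second embodiment's cycle therefore has four outcomes rather than three. As a hedged sketch, with the state encoding an assumption for illustration:

```python
# Illustrative sketch of the second embodiment's selection cycle: taps
# cycle through four outcomes, the last two adding a combine/do-not-combine
# choice for the two display units.

STATE_CYCLE = [
    ({"first"}, None),             # 1st tap: first display unit only (FIG. 15B)
    ({"second"}, None),            # 2nd tap: second display unit only (FIG. 15C)
    ({"first", "second"}, False),  # 3rd tap: both units, "do not combine" (FIG. 16)
    ({"first", "second"}, True),   # 4th tap: both units, "combine" (FIG. 15A)
]

def outcome_after(tap_count):
    """(selected display units, combine flag) after tap_count taps (>= 1)."""
    return STATE_CYCLE[(tap_count - 1) % len(STATE_CYCLE)]
```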
- In this way, the display of the first display unit 112 and the second display unit 114 changes each time a selection operation (for example, a tap operation) is performed, so the user can easily grasp which display unit will be selected if no selection operation is performed for a certain time in this state (that is, which display unit is the current selection candidate).
- Next, the control unit 200 determines whether the display unit selection process has been completed (S206). Specifically, as described above, the selection unit 212 determines whether a certain period of time has elapsed without a selection operation being performed. When it is determined that the selection process has not been completed (that is, when a selection operation is performed within the predetermined time) (NO in S206), the control unit 200 repeats the process of S204.
- When it is determined that the selection process has been completed (YES in S206), the control unit 200 determines which of the first display unit 112 and the second display unit 114 has been selected (S208). Specifically, the display screen acquisition unit 210 determines whether the selection unit 212 has selected the “first display unit 112”, the “second display unit 114”, or the “first display unit 112 and the second display unit 114”.
- the control unit 200 acquires screen data indicating the screen displayed on the first display unit 112 (S210). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112, as in the processing of S110 of the first embodiment. Next, the control unit 200 generates image data corresponding to the screen displayed on the first display unit 112 (S212). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate image data.
- For example, suppose that a screen including the content Aa, which is a part of the content A, is displayed on the first display unit 112 and a screen including the content Ab, which is a part of the content A, is displayed on the second display unit 114. In this case, when the first display unit 112 is selected, the image data generation unit 220 generates image data indicating an image of the screen including the content Aa, as illustrated in FIG. 17A.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S214).
- the control unit 200 acquires screen data indicating the screen displayed on the second display unit 114 (S220). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the second display unit 114, as in the process of S120 of the first embodiment. Next, the control unit 200 generates image data corresponding to the screen displayed on the second display unit 114 (S222). Specifically, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data acquired by the display screen acquisition unit 210 to generate image data.
- For example, suppose that a screen including the content Aa, which is a part of the content A, is displayed on the first display unit 112 and a screen including the content Ab, which is a part of the content A, is displayed on the second display unit 114. In this case, when the second display unit 114 is selected, the image data generation unit 220 generates image data indicating an image of the screen including the content Ab, as illustrated in FIG. 17B.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S214).
- When both display units are selected, the control unit 200 determines whether to combine the screens displayed on the two display units (S230). Specifically, the display screen acquisition unit 210 determines whether the selection unit 212 has selected “combine” or “do not combine”.
- When “combine” is selected, the control unit 200 acquires two pieces of screen data indicating the screens displayed on both the first display unit 112 and the second display unit 114, as in S132 of the first embodiment (S240).
- the control unit 200 generates image data corresponding to the screens displayed on the first display unit 112 and the second display unit 114, similarly to S134 of the first embodiment (S242).
- Specifically, the image data generation unit 220 combines the two pieces of screen data acquired by the display screen acquisition unit 210 and performs necessary processing such as data format conversion on the combined screen data to generate image data.
- For example, when the screen a including the content Aa is displayed on the first display unit 112 and the screen b including the content Ab is displayed on the second display unit 114, the image data generation unit 220 generates image data indicating an image obtained by combining the screen a and the screen b, as illustrated in FIG. 11. That is, although the content A is displayed separated across the two display units by the single application A as illustrated in FIG. 8, the generated image data is not separated but combined into one.
- the image data storage unit 222 of the control unit 200 stores the image data generated by the image data generation unit 220 in the memory 148 (S214).
- When “do not combine” is selected, the control unit 200 acquires two pieces of screen data indicating the screens displayed on both the first display unit 112 and the second display unit 114 (S250). Specifically, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the first display unit 112 by the same method as the process of S110 of the first embodiment. Similarly, the display screen acquisition unit 210 acquires screen data indicating the screen displayed on the second display unit 114. For example, in the example of FIG. 8, the display screen acquisition unit 210 acquires screen data a indicating the screen a including the content Aa displayed on the first display unit 112 and screen data b indicating the screen b including the content Ab displayed on the second display unit 114.
- the control unit 200 separately generates image data corresponding to the screens displayed on the first display unit 112 and the second display unit 114 (S252). Specifically, the image data generation unit 220 generates image data by performing necessary processing such as data format conversion on the screen data indicating the screen displayed on the first display unit 112. Similarly, the image data generation unit 220 performs necessary processing such as data format conversion on the screen data indicating the screen displayed on the second display unit 114 to generate image data.
- For example, when the screen a including the content Aa is displayed on the first display unit 112 and the screen b including the content Ab is displayed on the second display unit 114, the image data generation unit 220 separately generates image data indicating an image of the screen including the content Aa (FIG. 17A) and image data indicating an image of the screen including the content Ab (FIG. 17B).
- the image data storage unit 222 of the control unit 200 stores the two image data generated by the image data generation unit 220 in the memory 148 (S214).
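The branch between S240-S242 (combined) and S250-S252 (separate) can be sketched as follows; the function name and data shapes are assumptions for illustration:

```python
# Illustrative sketch: depending on the "combine" selection, generate either
# one combined image or two separate images from the two screens (S230 onward).

def generate_images(screen_a, screen_b, combine):
    """Return the list of generated images: one if combined, two if separate."""
    if combine:
        return [screen_a + screen_b]   # S240-S242: one combined image
    return [screen_a, screen_b]        # S250-S252: two separate images

screen_a = ["content Aa"]  # screen of the first display unit 112
screen_b = ["content Ab"]  # screen of the second display unit 114
```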
- As described above, the user can repeatedly perform a selection operation (for example, a tap operation), select the screen to be acquired according to the number of selection operations, and further select whether to combine a plurality of screens.
- For example, when the user performs the selection operation once, the selection unit 212 selects the first display unit 112; that is, the display screen acquisition unit 210 can acquire the display screen of the first display unit 112.
- When the user performs the selection operation twice, the selection unit 212 selects the second display unit 114. That is, the display screen acquisition unit 210 can acquire the display screen of the second display unit 114.
- When the user performs the selection operation three times, the selection unit 212 selects the first display unit 112 and the second display unit 114. That is, the display screen acquisition unit 210 can acquire the display screen of the first display unit 112 and the display screen of the second display unit 114, and the image data generation unit 220 can generate image data separately for each display screen.
- When the user performs the selection operation four times, the selection unit 212 likewise selects the first display unit 112 and the second display unit 114, and in this case the image data generation unit 220 can generate image data in which the display screen of the first display unit 112 and the display screen of the second display unit 114 are combined.
- In this way, the display unit is automatically selected according to the number of selection operations, and whether to combine the screens is also selected, so the user can easily obtain image data relating to a desired screen simply by repeating a single operation such as a tap operation.
- the present invention is not limited to the above-described embodiment, and can be modified as appropriate without departing from the spirit of the present invention.
- For example, in the embodiments described above there are two housings, but there may be one housing, or three or more.
- the number of display units is two, but the number of display units may be three or more.
- the operating body is a user's finger, but the operating body is not limited to a finger.
- the operating body may be a touch pen or a stylus.
- the selection operation is a tap operation, but the selection operation is not limited to a tap operation.
- the selection operation may be a flick operation or the input key 120 may be pressed.
- two or more of the constituent elements shown in the above-described embodiments may be integrally formed.
- the display screen acquisition unit 210 and the selection unit 212 may be integrated.
- the display screen acquisition unit 210 may have the function of the selection unit 212.
- In the flowcharts shown in FIGS. 5, 6, and 14, the order of the processes (steps) can be changed as appropriate.
- another process (step) may be executed while one process (step) is being executed.
- One or more of the plurality of processes (steps) may be omitted.
- In the embodiments described above, a display unit that is not a selection candidate is made darker than a display unit that is a selection candidate, but selection candidates need not be distinguished by brightness.
- the display control unit 204 may add a predetermined mark (for example, a cross or a character such as “non-selected”) to the screen of the display unit that is not a selection candidate.
- Alternatively, the display control unit 204 may add a predetermined mark (for example, a circle mark or a character string such as “selected”) to the screen of the display unit that is a selection candidate.
- In the embodiments described above, as illustrated in FIGS. 12A to 12C, 15A to 15C, and 16, the display state of the display units and the selection candidates transition sequentially according to the number of selection operations. However, the order of the transitions is not limited to that of these figures and can be set as appropriate by those skilled in the art.
- Non-transitory computer-readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (read-only memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory)).
- The program may also be supplied to the computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
- A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
Description
Meanwhile, there are techniques, such as screenshot, screen capture, screen dump, and print screen, for generating and storing image data indicating a screen displayed on a display device. Such techniques are also applied to mobile phones and the like.
Patent Literature 2 discloses a mobile terminal that can capture at least one of an image input by a first camera unit, an image input by a second camera unit, and a received image and save it in a storage unit as a still image or a moving image, and that can also capture a plurality of images, combine them into one image, and save the result.
Prior to describing the embodiments, an overview of a mobile terminal apparatus according to an embodiment of the present invention will be given with reference to FIG. 1. FIG. 1 is a diagram showing an overview of a mobile terminal apparatus 1 according to an embodiment of the present invention. As shown in FIG. 1, the mobile terminal apparatus 1 includes a first display unit 2 and a second display unit 4, which are a plurality of display units, display control means 6, selection means 8, display screen acquisition means 10, and image data generation means 12.
Hereinafter, a first embodiment will be described with reference to the drawings. FIG. 2 is a plan view showing the appearance of a mobile terminal apparatus 100 according to the first embodiment. FIG. 3 is a diagram showing the mobile terminal apparatus 100 according to the first embodiment. In the following, substantially identical components are denoted by the same reference signs.
As shown in FIG. 4, the control unit 200 includes an application execution unit 202, a display control unit 204, a screen acquisition operation reception unit 206, a display screen acquisition unit 210, a selection unit 212, a selection operation reception unit 214, an image data generation unit 220, and an image data storage unit 222.
Next, a second embodiment will be described. The second embodiment modifies the processing of the first embodiment for the case where content is displayed on both the first display unit 112 and the second display unit 114 by the same application. The following describes the processing that replaces the steps from "YES in S106" of FIG. 5 onward.
2 first display unit
4 second display unit
6 display control means
8 selection means
10 display screen acquisition means
12 image data generation means
100 mobile terminal apparatus
112 first display unit
114 second display unit
120 input key
120a power button
120b volume up button
120c volume down button
146 storage unit
148 memory
200 control unit
202 application execution unit
204 display control unit
206 screen acquisition operation reception unit
210 display screen acquisition unit
212 selection unit
214 selection operation reception unit
220 image data generation unit
222 image data storage unit
Claims (13)
- A mobile terminal apparatus comprising:
a plurality of display units that display images;
display control means for causing at least one of the plurality of display units to display content using one or more pieces of software;
selection means for selecting one or more of the plurality of display units;
display screen acquisition means for acquiring screen data representing a screen displayed on the display unit selected by the selection means; and
image data generation means for generating image data based on the screen data acquired by the display screen acquisition means. - The mobile terminal apparatus according to claim 1, further comprising
operation reception means for receiving a first operation by a user,
wherein the selection means selects one or more of the plurality of display units according to the number of times the operation reception means has received the first operation. - The mobile terminal apparatus according to claim 2, wherein, when the display control means is causing a plurality of the display units to each display content using a plurality of pieces of software, the selection means selects one or more of the plurality of display units according to the number of times the operation reception means has received the first operation.
- The mobile terminal apparatus according to claim 2 or 3, wherein, when the operation reception means does not receive a next first operation within a predetermined time after receiving the first operation, the selection means selects one or more of the plurality of display units. - The mobile terminal apparatus according to any one of claims 1 to 4, further comprising
operation reception means for receiving a first operation by a user,
wherein the display control means switches a display state of at least one of the plurality of display units when the operation reception means receives the first operation. - The mobile terminal apparatus according to claim 5, wherein, when the display control means switches the display state of at least one of the plurality of display units in response to the operation reception means receiving the first operation, the display control means controls the plurality of display units such that the display state of a display unit that is a candidate for selection by the selection means differs from the display state of a display unit that is not such a candidate.
- The mobile terminal apparatus according to any one of claims 1 to 6, wherein at least one of the plurality of display units is configured with display means and input means integrated, and
the selection means switches the display unit that is a candidate for selection in response to an operating body touching the input means of the display unit. - The mobile terminal apparatus according to any one of claims 1 to 7, wherein, when the display control means is causing a plurality of the display units to display content using one piece of software, the image data generation means combines the screens displayed on the respective display units to generate the image data.
- The mobile terminal apparatus according to claim 2, wherein, when the display control means is causing a plurality of the display units to display content using one piece of software, the selection means selects one or more of the plurality of display units according to the number of times the operation reception means has received the first operation and, when two or more of the plurality of display units have been selected, further selects whether the image data generation means is to combine the screens displayed on the respective selected display units. - An image generation method comprising:
causing at least one of a plurality of display units that display images, provided in a mobile terminal apparatus, to display content using one or more pieces of software;
selecting one or more of the plurality of display units;
acquiring screen data representing a screen displayed on the selected display unit; and
generating image data based on the acquired screen data. - The image generation method according to claim 10, further comprising:
receiving a first operation by a user; and
selecting one or more of the plurality of display units according to the number of times the first operation has been received. - A non-transitory computer-readable medium storing a program that causes a computer of a mobile terminal apparatus to execute:
a display step of causing at least one of a plurality of display units that display images, provided in the mobile terminal apparatus, to display content using one or more pieces of software;
a selection step of selecting one or more of the plurality of display units;
a display screen acquisition step of acquiring screen data representing a screen displayed on the display unit selected in the selection step; and
an image data generation step of generating image data based on the screen data acquired in the display screen acquisition step. - The non-transitory computer-readable medium according to claim 12, wherein the program further causes the computer of the mobile terminal apparatus to execute
an operation reception step of receiving a first operation by a user, and
in the selection step, one or more of the plurality of display units is selected according to the number of times the first operation has been received in the operation reception step.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201480012077.9A CN105103116B (zh) | 2013-03-05 | 2014-02-25 | Mobile terminal device, image generation method, and non-transitory computer-readable medium storing a program |
JP2015504158A JP6245250B2 (ja) | 2013-03-05 | 2014-02-25 | Mobile terminal apparatus, image generation method, and program |
EP14760340.1A EP2966556B1 (en) | 2013-03-05 | 2014-02-25 | Mobile terminal apparatus, image generation method, and non-transitory computer-readable medium storing program |
US14/760,110 US10222940B2 (en) | 2013-03-05 | 2014-02-25 | Handheld terminal, image generation method, and non-transitory computer-readable medium containing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013042735 | 2013-03-05 | ||
JP2013-042735 | 2013-03-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014136400A1 true WO2014136400A1 (ja) | 2014-09-12 |
Family
ID=51490939
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/000970 WO2014136400A1 (ja) | 2013-03-05 | 2014-02-25 | Mobile terminal apparatus, image generation method, and non-transitory computer-readable medium storing a program |
Country Status (5)
Country | Link |
---|---|
US (1) | US10222940B2 (ja) |
EP (1) | EP2966556B1 (ja) |
JP (1) | JP6245250B2 (ja) |
CN (1) | CN105103116B (ja) |
WO (1) | WO2014136400A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017211494A (ja) * | 2016-05-25 | 2017-11-30 | Ricoh Co., Ltd. | Image processing apparatus, image processing system, image processing method, and program |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9880799B1 (en) * | 2014-08-26 | 2018-01-30 | Sprint Communications Company L.P. | Extendable display screens of electronic devices |
WO2016182133A1 (ko) * | 2015-05-14 | 2016-11-17 | LG Electronics Inc. | Display device and operating method thereof |
US11093197B2 (en) * | 2017-07-31 | 2021-08-17 | Stmicroelectronics, Inc. | System and method to increase display area utilizing a plurality of discrete displays |
US10230826B1 (en) | 2017-08-22 | 2019-03-12 | Futurewei Technologies, Inc. | Foldable mobile device |
JP7290095B2 (ja) * | 2019-09-30 | 2023-06-13 | Fujitsu Limited | Display control program, display control method, and display control device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09102918A (ja) * | 1995-10-05 | 1997-04-15 | Matsushita Electric Ind Co Ltd | Video signal display device |
JP2007082223A (ja) | 2005-09-09 | 2007-03-29 | Lg Electronics Inc | Mobile terminal and display method using the same |
WO2008016031A1 (fr) | 2006-07-31 | 2008-02-07 | Access Co., Ltd. | Electronic device, display system, display method, and program |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4189315B2 (ja) | 2003-12-26 | 2008-12-03 | Nintendo Co., Ltd. | Game device and game program |
JP4012933B2 (ja) | 2004-03-22 | 2007-11-28 | Nintendo Co., Ltd. | Game device, game program, storage medium storing game program, and game control method |
JP5004435B2 (ja) * | 2005-05-13 | 2012-08-22 | Capcom Co., Ltd. | Game program and game device |
JP4643484B2 (ja) | 2006-03-29 | 2011-03-02 | Toshiba Corporation | Screen transmission device and screen display method |
KR20100074160A (ko) | 2007-09-04 | 2010-07-01 | International Business Machines Corporation | System and method for validating electronic documents |
JP2010157189A (ja) | 2009-01-05 | 2010-07-15 | Sony Corp | Information processing device, information processing method, and program |
CN102023834B (zh) * | 2009-09-21 | 2013-03-20 | Lenovo (Beijing) Co., Ltd. | Background picture display method and system |
JP4818427B2 (ja) * | 2009-12-22 | 2011-11-16 | Toshiba Corporation | Information processing device and screen selection method |
KR101087479B1 (ko) | 2010-01-29 | 2011-11-25 | Pantech Co., Ltd. | Multi-display device and control method thereof |
KR101642722B1 (ko) * | 2010-02-04 | 2016-07-27 | Samsung Electronics Co., Ltd. | Mobile terminal having dual display units and display control method for the display units |
KR20110092802A (ko) * | 2010-02-10 | 2011-08-18 | Samsung Electronics Co., Ltd. | Data operation method for a terminal having a plurality of display units, and terminal supporting the same |
US20110209101A1 (en) * | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen pinch-to-pocket gesture |
JP5635850B2 (ja) * | 2010-09-16 | 2014-12-03 | Nintendo Co., Ltd. | Information processing device, information processing program, information processing system, and information processing method |
JP5598207B2 (ja) | 2010-09-24 | 2014-10-01 | NEC Corporation | Display device, display method, and program |
JP5749043B2 (ja) * | 2011-03-11 | 2015-07-15 | Kyocera Corporation | Electronic device |
JP5805428B2 (ja) * | 2011-04-26 | 2015-11-04 | Kyocera Corporation | Mobile terminal device and program |
2014
- 2014-02-25 EP EP14760340.1A patent/EP2966556B1/en active Active
- 2014-02-25 JP JP2015504158A patent/JP6245250B2/ja active Active
- 2014-02-25 CN CN201480012077.9A patent/CN105103116B/zh active Active
- 2014-02-25 WO PCT/JP2014/000970 patent/WO2014136400A1/ja active Application Filing
- 2014-02-25 US US14/760,110 patent/US10222940B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09102918A (ja) * | 1995-10-05 | 1997-04-15 | Matsushita Electric Ind Co Ltd | Video signal display device |
JP2007082223A (ja) | 2005-09-09 | 2007-03-29 | Lg Electronics Inc | Mobile terminal and display method using the same |
WO2008016031A1 (fr) | 2006-07-31 | 2008-02-07 | Access Co., Ltd. | Electronic device, display system, display method, and program |
Non-Patent Citations (3)
Title |
---|
"2106BPM,'SUMAHO DE KACHIKACHI DEKIRU! ''ORITATAMI NO N'' GA H anatsu, Masaka no Sotogawa 2 Gamen no Oritatami Smartphone 'MEDIAS W N-05E' o Shashin to Doga de Check [Report", GADGET NEWS, 22 January 2013 (2013-01-22), pages 1 - 7, XP055290535, Retrieved from the Internet <URL:HTTP://GETNEWS.JP/ARCHIVES/285450> [retrieved on 20140320] * |
See also references of EP2966556A4 |
YOICHI HIRAGA: "2 Gamen de Kitto Hakadoru: Shashin to Doga de Kaisetsu suru 'MEDIAS W N-05E (2/3", IT MEDIA MOBILE, 23 January 2013 (2013-01-23), pages 1 - 5, XP055290537, Retrieved from the Internet <URL:HTTP://WWW.ITMEDIA.CO.JP/MOBILE/ARTICLES/1301/23/NEWS144_2.HTML> [retrieved on 20140320] * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017211494A (ja) * | 2016-05-25 | 2017-11-30 | Ricoh Co., Ltd. | Image processing apparatus, image processing system, image processing method, and program |
US10725653B2 (en) | 2016-05-25 | 2020-07-28 | Ricoh Company, Ltd. | Image processing device, image processing system, and image processing method |
Also Published As
Publication number | Publication date |
---|---|
CN105103116B (zh) | 2018-09-21 |
US10222940B2 (en) | 2019-03-05 |
EP2966556A4 (en) | 2016-10-26 |
JP6245250B2 (ja) | 2017-12-13 |
EP2966556B1 (en) | 2019-10-09 |
JPWO2014136400A1 (ja) | 2017-02-09 |
US20150355791A1 (en) | 2015-12-10 |
EP2966556A1 (en) | 2016-01-13 |
CN105103116A (zh) | 2015-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6245250B2 (ja) | Mobile terminal apparatus, image generation method, and program | |
US10754532B2 (en) | Method and apparatus for operating function in touch device | |
EP3246802A1 (en) | Mobile terminal and method for controlling the same | |
JP6007469B2 (ja) | Information processing device, display control method, and program | |
JP6196398B2 (ja) | Device, method, terminal equipment, program, and recording medium for realizing a touch button and fingerprint authentication | |
KR20170006559A (ko) | Mobile terminal and control method thereof | |
KR20180095331A (ko) | Mobile terminal and control method thereof | |
CN102981750A (zh) | Method and apparatus for editing text in a mobile terminal | |
KR20180133743A (ko) | Mobile terminal and control method thereof | |
JP2019159261A (ja) | Electronic whiteboard, video display method, and program | |
US9621809B2 (en) | Display control apparatus and method for controlling the same | |
WO2011067985A1 (ja) | Mobile terminal device and function setting method for mobile terminal device | |
US10855731B2 (en) | Information processing apparatus, data processing method thereof, and program | |
US10075645B2 (en) | Control of display of plurality of items and associated information | |
KR101591329B1 (ko) | Mobile terminal and control method of mobile terminal | |
US10432848B2 (en) | Electronic apparatus and method for controlling the same | |
KR100749481B1 (ko) | Device and method for performing a contactless function | |
JP6457170B2 (ja) | Portable electronic device | |
CN117519565A (zh) | Device control method and electronic device | |
JP2014021893A (ja) | Information processing device, operation signal generation method, and program | |
KR20080018332A (ko) | Method for displaying an information list in a communication terminal | |
JP2005237042A (ja) | Mobile phone | |
JP2015184938A (ja) | Information processing device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480012077.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14760340 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014760340 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14760110 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2015504158 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |