US20150154730A1 - Data processing device - Google Patents


Info

Publication number
US20150154730A1
Authority
US
United States
Prior art keywords
data
region
sensing
display portion
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/553,288
Other languages
English (en)
Inventor
Yoshiharu Hirakata
Shunpei Yamazaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Semiconductor Energy Laboratory Co Ltd
Original Assignee
Semiconductor Energy Laboratory Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Semiconductor Energy Laboratory Co Ltd filed Critical Semiconductor Energy Laboratory Co Ltd
Assigned to SEMICONDUCTOR ENERGY LABORATORY CO., LTD. Assignment of assignors interest (see document for details). Assignors: YAMAZAKI, SHUNPEI; HIRAKATA, YOSHIHARU
Publication of US20150154730A1
Priority to US16/450,049 (granted as US11475532B2)
Priority to US17/960,984 (granted as US11983793B2)

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/002: Specific input/output arrangements not covered by G06F 3/01-G06F 3/16
          • G06F 1/00: Details not covered by groups G06F 3/00-G06F 13/00 and G06F 21/00
            • G06F 1/16: Constructional details or arrangements
          • G06F 13/00: Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
            • G06F 13/14: Handling requests for interconnection or transfer
              • G06F 13/20: Handling requests for access to input/output bus
                • G06F 13/24: Handling requests for access to input/output bus using interrupt
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 1/00: General purpose image data processing
            • G06T 1/20: Processor architectures; processor configuration, e.g. pipelining

Definitions

  • One embodiment of the present invention relates to a method and a program for processing and displaying image information, and a device including a storage medium in which the program is stored.
  • one embodiment of the present invention relates to a method for processing and displaying image data by which an image including information processed by a data processing device provided with a display portion is displayed, a program for displaying an image including information processed by a data processing device provided with a display portion, and a data processing device including a storage medium in which the program is stored.
  • one embodiment of the present invention is not limited to the above technical field.
  • the technical field of one embodiment of the invention disclosed in this specification and the like relates to an object, a method, or a manufacturing method.
  • one embodiment of the present invention relates to a process, a machine, manufacture, or a composition of matter.
  • examples of the technical field of one embodiment of the present invention disclosed in this specification include a semiconductor device, a display device, a light-emitting device, a power storage device, a memory device, a method for driving any of them, and a method for manufacturing any of them.
  • the social infrastructures relating to means for transmitting information have advanced. This has made it possible to acquire, process, and send out many pieces and various kinds of information with the use of a data processing device not only at home or office but also at other visiting places.
  • Portable data processing devices are often used while being carried around, and force might be accidentally applied, by dropping for example, to the data processing devices and to display devices included in them.
  • As a display device that is not easily broken, a display device having high adhesiveness between a structure body by which a light-emitting layer is divided and a second electrode layer is known (Patent Document 1).
  • Also known is a cellular phone in which a display device is provided on a front side and on an upper side in the longitudinal direction of a housing (Patent Document 2).
  • Patent Document 1: Japanese Published Patent Application No. 2012-190794
  • Patent Document 2: Japanese Published Patent Application No. 2010-153813
  • An object of one embodiment of the present invention is to provide a novel human interface with excellent operability. Another object is to provide a novel data processing device with excellent operability. Another object is to provide a novel data processing device, a novel display device, or the like.
  • One embodiment of the present invention is a data processing device including an input and output device supplied with first image data and second image data and capable of supplying first sensing data, and an arithmetic device capable of supplying the first image data and the second image data and supplied with the first sensing data.
  • the input and output device includes a first display portion supplied with and capable of displaying the first image data, a second display portion supplied with and capable of displaying the second image data, a first sensing portion capable of sensing an object obscuring the first display portion and supplying the first sensing data, a first region provided with the first display portion, a second region provided with the second display portion, and a first curved portion between the first region and the second region.
  • the arithmetic device includes an arithmetic portion and a memory portion capable of storing a program to be executed by the arithmetic portion.
  • the arithmetic portion is capable of generating the first image data or the second image data based on the first sensing data.
  • the above-described data processing device of one embodiment of the present invention includes the input and output device supplied with image data and capable of supplying sensing data, and the arithmetic device capable of supplying the image data and supplied with the sensing data.
  • the input and output device includes a plurality of display portions capable of displaying display data and a sensing portion capable of sensing an object obscuring one of the plurality of display portions, and includes one region provided with the one of the plurality of display portions and the sensing portion, another region provided with the other display portions, and a curved portion between the one region and the other region.
  • the arithmetic device includes the arithmetic portion and the memory portion capable of storing a program to be executed by the arithmetic portion.
  • the input and output device may include a second sensing portion capable of sensing an object obscuring the second display portion and supplying second sensing data.
  • the arithmetic device is supplied with the second sensing data.
  • the arithmetic portion is capable of generating the first image data and/or the second image data based on the first sensing data and/or the second sensing data.
  • the above-described data processing device of one embodiment of the present invention may include the second display portion, the second sensing portion capable of sensing an object obscuring the second display portion, and the second region provided with the second display portion and the second sensing portion.
  • image data based on sensing data supplied from one of the regions can be generated and displayed by the input and output device. Consequently, a novel data processing device can be provided.
  • the first region can be folded or unfolded.
  • the data processing device of one embodiment of the present invention includes the first region which can be folded or unfolded. Accordingly, the data processing device can be used with the first region having a highly portable size or a highly browsable size. Consequently, a novel data processing device can be provided.
  • Another embodiment of the present invention is a data processing device including an input and output device supplied with first image data and second image data and capable of supplying first sensing data, and an arithmetic device capable of supplying the first image data and the second image data and supplied with the first sensing data.
  • the input and output device includes a terminal supplied with the first image data and the second image data, a first display portion supplied with and capable of displaying the first image data, a second display portion supplied with and capable of displaying the second image data, a first sensing portion capable of sensing an object obscuring the first display portion and supplying the first sensing data, a first region provided with the first display portion, a second region provided with the second display portion, a third region provided with the terminal, a first curved portion between the first region and the second region, and a second curved portion between the first region and the third region.
  • the third region is capable of supplying the first image data and the second image data.
  • the first region is supplied with the first image data and the second image data and is capable of supplying the second image data.
  • the second region is supplied with the second image data.
  • an arithmetic portion is capable of generating the first image data or the second image data based on the first sensing data.
  • the above-described data processing device of one embodiment of the present invention includes the first region provided with the first display portion, the second region provided with the second display portion, the third region provided with the terminal, the first curved portion between the first region and the second region, and the second curved portion between the first region and the third region. Accordingly, the terminal is capable of supplying the first image data and the second image data.
  • the first region is capable of displaying the first image data and supplying the second image data
  • the second region is capable of displaying the second image data. Consequently, a novel data processing device can be provided.
  • the input and output device may be capable of supplying first positional data and second positional data.
  • the arithmetic device may be supplied with the first positional data and the second positional data.
  • the input and output device may include a first positional data input portion capable of supplying the first positional data and a second positional data input portion capable of supplying the second positional data.
  • the first region may include the first positional data input portion overlapping with the first display portion.
  • the second region may include the second positional data input portion overlapping with the second display portion.
  • the first region includes the first positional data input portion overlapping with the first display portion
  • the second region includes the second positional data input portion overlapping with the second display portion. Accordingly, image data based on positional data supplied from one data input portion can be generated and displayed on the first display portion or the second display portion. Consequently, a novel data processing device can be provided.
  • Another embodiment of the present invention is the above-described data processing device with a program including a first step of acquiring initial data including status data; a second step of allowing an interrupt processing; a third step of acquiring predetermined data; a fourth step of selecting a fifth step when the status data shows a first status or a sixth step when the status data shows a second status; the fifth step of generating first image data based on the predetermined data and displaying the first image data on the first display portion; the sixth step of generating second image data based on the predetermined data and displaying the second image data on the second display portion; a seventh step of selecting an eighth step when a termination instruction is supplied in the interrupt processing or the third step when no termination instruction is supplied in the interrupt processing; and the eighth step of terminating the program.
  • the interrupt processing includes a ninth step of acquiring first sensing data and second sensing data; a tenth step of determining candidate data based on the first sensing data and the second sensing data; an eleventh step of selecting a twelfth step when the candidate data differs from the status data or the ninth step when the candidate data is the same as the status data; the twelfth step of updating the status data with the candidate data; and a thirteenth step of returning from the interrupt processing.
  • the program includes the step of determining candidate data by acquiring the first sensing data and the second sensing data; the step of updating the status data with the candidate data when the status data differs from the candidate data; and the step of generating and displaying image data including predetermined data based on the updated status data.
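The program flow described above (a main loop that displays predetermined data on the first or second display portion according to status data, plus interrupt processing that derives candidate data from sensing data and updates the status only when it changes) can be sketched in Python. This is an illustrative sketch only: the class and method names, the two-status mapping from sensing data, and the display stand-ins are assumptions, not part of the disclosure.

```python
# Illustrative sketch (not from the disclosure) of the described program.

STATUS_FIRST, STATUS_SECOND = 1, 2  # "first status" / "second status"

class DataProcessingProgram:
    def __init__(self, initial_status):
        # First step: acquire initial data including status data.
        self.status = initial_status
        self.displayed = {}        # stand-in for the two display portions
        self.terminate = False     # set by a termination instruction

    def interrupt(self, s1, s2):
        # Ninth and tenth steps: acquire sensing data S1 and S2 and
        # determine candidate data (the mapping below is an assumption:
        # an obscured display selects the second status).
        candidate = STATUS_SECOND if (s1 or s2) else STATUS_FIRST
        # Eleventh and twelfth steps: update the status data only when
        # the candidate data differs from it.
        if candidate != self.status:
            self.status = candidate
        # Thirteenth step: return from the interrupt processing.

    def step(self, data):
        # Third to sixth steps: acquire predetermined data, then generate
        # and display image data on the portion selected by the status.
        target = "first" if self.status == STATUS_FIRST else "second"
        self.displayed[target] = data
        # Seventh and eighth steps: continue unless a termination
        # instruction was supplied in the interrupt processing.
        return not self.terminate
```

Calling `step` repeatedly models the third-to-seventh-step loop, with `interrupt` invoked asynchronously between iterations.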
  • a novel human interface with excellent operability can be provided.
  • a novel data processing device with excellent operability can be provided.
  • a novel data processing device, a novel display device, or the like can be provided. Note that the description of these effects does not preclude the existence of other effects.
  • One embodiment of the present invention does not necessarily achieve all the above effects. Other effects will be apparent from and can be derived from the description of the specification, the drawings, the claims, and the like.
  • FIG. 1 is a block diagram illustrating a structure of a data processing device of an embodiment.
  • FIGS. 2A, 2B, 2C1, 2C2, and 2D are schematic diagrams illustrating a structure of a data processing device of an embodiment.
  • FIGS. 3A1, 3A2, 3A3, 3B, and 3C are schematic diagrams illustrating a structure of a data processing device of an embodiment.
  • FIG. 4 is a block diagram illustrating a structure of a data processing device of an embodiment.
  • FIGS. 5A1, 5A2, 5B, 5C, and 5D are schematic diagrams illustrating a structure of a data processing device of an embodiment.
  • FIGS. 6A1, 6A2, 6B1, and 6B2 are schematic diagrams illustrating a structure of a data processing device of an embodiment.
  • FIG. 7 is a flowchart illustrating a program stored in a memory portion of a data processing device of an embodiment.
  • FIG. 8 is a flowchart illustrating a program stored in a memory portion of a data processing device of an embodiment.
  • FIGS. 9A to 9C illustrate a structure of a touch panel that can be used in a data processing device of an embodiment.
  • FIGS. 10A and 10B illustrate a structure of a touch panel that can be used in a data processing device of an embodiment.
  • FIGS. 11A to 11C each illustrate a structure of a touch panel that can be used in a data processing device of an embodiment.
  • FIGS. 12A to 12C each illustrate a structure of a touch panel that can be used in a data processing device of an embodiment.
  • FIGS. 13A to 13D illustrate a method for manufacturing a bendable or foldable device of an embodiment.
  • FIGS. 14A to 14D illustrate a method for manufacturing a bendable or foldable device of an embodiment.
  • FIGS. 15A to 15D illustrate a method for manufacturing a bendable or foldable device of an embodiment.
  • a data processing device of one embodiment of the present invention includes an input and output device supplied with image data and capable of supplying sensing data, and an arithmetic device capable of supplying the image data and supplied with the sensing data.
  • the input and output device includes a plurality of display portions capable of displaying display data and a sensing portion capable of sensing an object obscuring one of the plurality of display portions, and includes one region provided with the one of the display portions and the sensing portion, another region provided with the other display portions, and a curved portion between the one region and the other region.
  • the arithmetic device includes an arithmetic portion and a memory portion capable of storing a program to be executed by the arithmetic portion.
  • image data based on sensing data supplied from a first region can be generated and displayed on the first region and/or a second region.
  • a novel human interface with excellent operability can be provided.
  • a novel data processing device with excellent operability can be provided.
  • a novel data processing device, a novel display device, or the like can be provided.
  • In this embodiment, a structure of a data processing device of one embodiment of the present invention will be described with reference to FIG. 1 and FIGS. 2A, 2B, 2C1, 2C2, and 2D.
  • FIG. 1 is a block diagram illustrating a structure of a data processing device 100 of one embodiment of the present invention.
  • FIG. 2A is a schematic diagram illustrating the appearance of the data processing device 100 of one embodiment of the present invention
  • FIG. 2B is a cross-sectional view illustrating a cross-sectional structure along a cutting-plane line X1-X2 in FIG. 2A.
  • FIG. 2C1 is a schematic diagram illustrating the appearance of a positional data input portion and a display portion which can be used in the data processing device 100.
  • FIG. 2C2 is a schematic diagram illustrating the appearance of a proximity sensor 142 which can be used in the positional data input portion.
  • FIG. 2D is a cross-sectional view illustrating a cross-sectional structure of the proximity sensor 142 along a cutting-plane line X3-X4 in FIG. 2C2.
  • the data processing device 100 described in this embodiment includes an input and output device 120 that is supplied with first image data V 1 and second image data V 2 and supplies first sensing data S 1 , and an arithmetic device 110 that supplies the first image data V 1 and the second image data V 2 and is supplied with the first sensing data S 1 (see FIG. 1 ).
  • the input and output device 120 includes a first display portion 130 ( 1 ) that is supplied with and displays the first image data V 1 , a second display portion 130 ( 2 ) that is supplied with and displays the second image data V 2 , and a first sensing portion 150 ( 1 ) that senses an object obscuring the first display portion 130 ( 1 ) and supplies the first sensing data S 1 .
  • the input and output device 120 also includes a first region 120 ( 1 ) provided with the first display portion 130 ( 1 ) and the first sensing portion 150 ( 1 ), a second region 120 ( 2 ) provided with the second display portion 130 ( 2 ), and a first curved portion 120 c ( 1 ) between the first region 120 ( 1 ) and the second region 120 ( 2 ) (see FIG. 1 and FIGS. 2A and 2B ).
  • the arithmetic device 110 includes an arithmetic portion 111 and a memory portion 112 that stores a program to be executed by the arithmetic portion 111 .
  • the arithmetic portion 111 generates the first image data V 1 or the second image data V 2 based on the first sensing data S 1 (see FIG. 1 ).
  • the above-described data processing device of one embodiment of the present invention includes the input and output device 120 that is supplied with image data and supplies sensing data, and the arithmetic device 110 that supplies the image data and is supplied with the sensing data.
  • the input and output device 120 includes a plurality of display portions that display display data and a sensing portion that senses an object obscuring one of the display portions, and includes one region provided with the one of the display portions and the sensing portion, another region provided with the other display portions, and a curved portion between the one region and the other region.
  • the arithmetic device includes an arithmetic portion and a memory portion that stores a program to be executed by the arithmetic portion.
  • the input and output device 120 may be configured to supply first positional data L 1 and second positional data L 2
  • the arithmetic device 110 may be configured to be supplied with the first positional data L 1 and the second positional data L 2 (see FIG. 1 ).
  • the input and output device 120 may include a positional data input portion 140 capable of supplying positional data.
  • the first region 120 ( 1 ) may include a first positional data input portion 140 ( 1 ) overlapping with the first display portion 130 ( 1 )
  • the second region 120 ( 2 ) may include a second positional data input portion 140 ( 2 ) overlapping with the second display portion 130 ( 2 ).
  • the first positional data input portion 140 ( 1 ) may be configured to supply the first positional data L 1
  • the second positional data input portion 140 ( 2 ) may be configured to supply the second positional data L 2 .
  • the data processing device 100 described in this embodiment as an example can generate image data based on positional data supplied from the positional data input portion and display it on the first display portion or the second display portion. Consequently, a novel data processing device can be provided.
  • the input and output device 120 may include an input and output portion 145 that supplies and is supplied with data and a communication portion 160 that supplies and is supplied with communication data COM.
  • the arithmetic device 110 may include a transmission path 114 that supplies and is supplied with data, and an input and output interface 115 that supplies and is supplied with data.
  • a touch panel in which a display portion overlaps with a touch sensor serves as the positional data input portion 140 as well as a display portion 130 .
  • Although this embodiment describes a touch sensor having a structure where the positional data input portion 140 is placed on a display surface side of the display portion 130 as an example, one embodiment of the present invention is not limited to this structure. Specifically, the display portion 130 may be placed on a sensing surface side of the positional data input portion 140 , or the display portion 130 and the positional data input portion 140 may be integrated into one unit. In other words, either an on-cell touch panel or an in-cell touch panel may be employed.
  • the data processing device 100 includes the input and output device 120 and the arithmetic device 110 (see FIG. 1 ).
  • the input and output device 120 includes the display portion 130 and a sensing portion 150 .
  • the input and output device 120 is supplied with the first image data V 1 and the second image data V 2 and supplies the first sensing data S 1 and the second sensing data S 2 .
  • the input and output device 120 may include the positional data input portion 140 , the input and output portion 145 , and the communication portion 160 .
  • the input and output device 120 includes the first region 120 ( 1 ), the second region 120 ( 2 ), the first curved portion 120 c ( 1 ), and a second curved portion 120 c ( 2 ) (see FIGS. 2A and 2B ).
  • a portion showing the most significant change in curvature between the first region 120 ( 1 ) and the second region 120 ( 2 ) is referred to as the first curved portion 120 c ( 1 ).
  • the first curved portion 120 c ( 1 ) includes a portion with the smallest curvature radius that appears in a section of the curved surface.
  • the curvature radius of the curved portion is 10 mm or less, preferably 8 mm or less, further preferably 5 mm or less, particularly preferably 4 mm or less.
  • the first curved portion 120 c ( 1 ) and/or the second curved portion 120 c ( 2 ) may have a display portion and a positional data input portion that overlaps with the display portion. With such a structure, positional data supplied from the first curved portion 120 c ( 1 ) and/or the second curved portion 120 c ( 2 ) may be used instead of the second positional data L 2 .
  • the first region 120 ( 1 ) includes the first display portion 130 ( 1 ) and the first sensing portion 150 ( 1 ).
  • the first region 120 ( 1 ) may include the first positional data input portion 140 ( 1 ).
  • the second region 120 ( 2 ) includes the second display portion 130 ( 2 ).
  • the second region 120 ( 2 ) may also include the second positional data input portion 140 ( 2 ) and/or a second sensing portion 150 ( 2 ) that senses an object obscuring the second region.
  • Although an example in which the input and output device 120 has two second regions 120 ( 2 ) is shown in FIG. 2B , one embodiment of the present invention is not limited to this example.
  • the input and output device 120 may have only one second region 120 ( 2 ), or three or more second regions 120 ( 2 ).
  • two second regions 120 ( 2 ) may be arranged to face each other (see FIG. 2B ).
  • the distance between the two second regions 120 ( 2 ) is, for example, 17 cm or shorter, preferably 9 cm or shorter, further preferably 7 cm or shorter.
  • In that case, the thumb of the hand holding the data processing device 100 can be used to obtain positional data in a large area of the first positional data input portion 140 ( 1 ).
  • There is no particular limitation on the display portion 130 as long as the display portion 130 can display supplied image data (see FIG. 2 C 1 ).
  • the display portion 130 includes the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ).
  • the first display portion 130 ( 1 ) displays the first image data V 1 that is supplied thereto, and the second display portion 130 ( 2 ) displays the second image data V 2 that is supplied thereto.
  • the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ) may be driven as one display portion.
  • one driver circuit may supply signals to select scan lines.
  • the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ) may be driven as different display portions.
  • separate driver circuits may be provided for the display portions, and the driver circuits may supply signals to select scan lines to the corresponding display portions.
  • When the data processing device 100 is in a standby state, only the second display portion 130 ( 2 ) may be driven, and drive of the first display portion 130 ( 1 ) may be stopped. Stopping drive of the first display portion 130 ( 1 ) can reduce power consumption.
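The standby driving scheme just described can be summarized in a short sketch. The portion names and power figures below are arbitrary assumptions used only to illustrate that stopping drive of the first display portion lowers total consumption.

```python
# Illustrative sketch of standby driving: in the standby state only the
# second display portion is driven and drive of the first display
# portion is stopped. The power values are hypothetical.

DRIVE_POWER_MW = {"first": 50, "second": 10}  # assumed figures, in mW

def driven_display_portions(standby):
    # Stop driving the first display portion while in standby.
    return ["second"] if standby else ["first", "second"]

def display_power_mw(standby):
    # Total drive power for the portions currently driven.
    return sum(DRIVE_POWER_MW[p] for p in driven_display_portions(standby))
```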
  • a flexible display portion which can be bent at a position overlapping with the first curved portion 120 c ( 1 ) can be used as the display portion 130 .
  • Specific examples of structures that can be employed in the display portion 130 are described in Embodiments 4 to 6.
  • the sensing portion 150 senses the states of the data processing device 100 and the circumstances and supplies sensing data (see FIG. 1 ).
  • the sensing portion 150 includes the first sensing portion 150 ( 1 ), and the first sensing portion senses an object obscuring the first display portion 130 ( 1 ). Then, the first sensing portion 150 ( 1 ) supplies the first sensing data S 1 including data about whether the first display portion 130 ( 1 ) is obscured or not.
  • any of a variety of sensing elements such as a photoelectric conversion element, an imaging element, a magnetic sensor, and a proximity sensor can be used in the first sensing portion 150 ( 1 ).
  • For example, a photoelectric conversion element 150 PD may be provided in the first region 120 ( 1 ) so as to sense the intensity of light incident from the side where the first display portion 130 ( 1 ) displays image data (see FIG. 2A ).
  • the first sensing portion 150 ( 1 ) is not necessarily provided in the first region 120 ( 1 ) and may be provided in another place as long as the first sensing portion 150 ( 1 ) can sense an object obscuring the first display portion 130 ( 1 ).
  • the first sensing portion 150 ( 1 ) may be provided in the second region, or data supplied from another device may be used as the first sensing data S 1 .
  • a sensing element capable of sensing a much wider range with use of a fish-eye lens may be provided in the second region and used as the first sensing portion 150 ( 1 ).
  • an image taken by a monitoring camera may be obtained through a communication network and used as the first sensing data S 1 .
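As a minimal sketch of how the first sensing data S1 might be derived from such a sensing element, one could threshold a light-intensity reading: an object covering the display blocks ambient light, so a low reading is interpreted as "obscured". The normalized scale and the threshold value are assumptions for illustration only.

```python
# Minimal sketch: first sensing data S1 judged from the intensity of
# light reaching a photoelectric conversion element on the display side
# of the first display portion. Threshold and scale are hypothetical.

OBSCURED_THRESHOLD = 0.2  # assumed normalized-intensity threshold

def first_sensing_data(light_intensity):
    """Return S1: True when the first display portion is obscured."""
    # Low incident light implies an object is covering the display.
    return light_intensity < OBSCURED_THRESHOLD
```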
  • the sensing portion 150 may sense acceleration, a direction, pressure, a global positioning system (GPS) signal, temperature, humidity, or the like and supply data thereon.
  • the positional data input portion 140 senses an approaching object and supplies positional data of the approaching object to the arithmetic device 110 . Note that when the positional data input portion 140 is positioned closer to the user than the display portion 130 is, the positional data input portion 140 has a light-transmitting property.
  • the user of the data processing device 100 can give a variety of operating instructions to the data processing device 100 by making his/her finger, palm, or the like in proximity to the positional data input portion 140 .
  • As an operating instruction, a termination instruction (an instruction to terminate the program) can be supplied.
  • the proximity sensors 142 may be arranged in a matrix over a flexible substrate 141 to constitute the positional data input portion 140 (see FIGS. 2 C 1 , 2 C 2 , and 2 D).
  • the positional data input portion 140 includes the first positional data input portion 140 ( 1 ) and the second positional data input portion 140 ( 2 ).
  • the first positional data input portion 140 ( 1 ) supplies the first positional data L 1
  • the second positional data input portion 140 ( 2 ) supplies the second positional data L 2 .
  • the first positional data input portion 140 ( 1 ) and the second positional data input portion 140 ( 2 ) may be driven as one positional data input portion.
  • the positional data input portion 140 may be divided into the first positional data input portion 140 ( 1 ) and the second positional data input portion 140 ( 2 ) which are partially driven. In other words, the second positional data input portion 140 ( 2 ) may be driven independently of the first positional data input portion 140 ( 1 ).
  • Here, the X1-X2 direction is set as a row direction, and the direction crossing the row direction is set as a column direction.
  • a plurality of scan lines extending in the row direction so as to cross the first positional data input portion 140 ( 1 ) and the second positional data input portion 140 ( 2 ), a plurality of signal lines extending in the column direction, and the proximity sensors 142 each electrically connected to one scan line and one signal line are provided in a matrix.
  • the positional data input portion 140 may be partially driven in the following manner: a proximity sensor connected to a first signal line provided in the first positional data input portion 140 ( 1 ) and a proximity sensor connected to a second signal line provided in the second positional data input portion 140 ( 2 ) are driven independently of each other.
  • the scan line is shared by the first positional data input portion 140 ( 1 ) and the second positional data input portion 140 ( 2 ); thus, the proximity sensor provided in the first positional data input portion 140 ( 1 ) and the proximity sensor provided in the second positional data input portion 140 ( 2 ) are driven at different times.
  • When the data processing device 100 is used with its housing 101 being held by the user's hand, only the first positional data input portion 140 ( 1 ) may be driven and drive of the second positional data input portion 140 ( 2 ) may be stopped. Stopping drive of the second positional data input portion 140 ( 2 ) can reduce malfunctions due to the second positional data L 2 supplied from the second positional data input portion 140 ( 2 ) as a result of sensing the hand holding the data processing device 100 .
  • the sum of power consumed by the first positional data input portion 140 ( 1 ) and power consumed by the second positional data input portion 140 ( 2 ) is larger than the power consumed by the first positional data input portion 140 ( 1 ) alone.
  • only the second positional data input portion 140 ( 2 ) may be driven and drive of the first positional data input portion 140 ( 1 ) may be stopped in a standby state of the data processing device 100 . Stopping drive of the first positional data input portion 140 ( 1 ) can reduce power consumption.
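The partial-driving policy described above amounts to a small decision rule. The following is a minimal sketch in Python; the boolean status flags and the dictionary of region names are illustrative assumptions, since the embodiment specifies only that the two regions can be driven independently:

```python
def regions_to_drive(held_in_hand: bool, standby: bool) -> dict:
    """Decide which positional data input regions to drive.

    `held_in_hand` and `standby` are hypothetical status flags, and the
    region labels are illustrative; the embodiment only states that the
    first and second regions can be driven independently of each other.
    """
    if held_in_hand:
        # The hand holding the housing would be sensed by the second
        # region and produce spurious positional data L2, so stop it.
        return {"140(1)": True, "140(2)": False}
    if standby:
        # In standby, driving only the second region avoids the power
        # the larger first region would otherwise consume.
        return {"140(1)": False, "140(2)": True}
    # Default: drive both regions as one positional data input portion.
    return {"140(1)": True, "140(2)": True}
```

With both flags false, both regions are driven, matching the case where the two portions operate as a single positional data input portion.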
  • the proximity sensor 142 senses proximity or touch of an object (e.g., a finger or a palm), and a capacitor or an imaging element can be used as the proximity sensor.
  • a substrate provided with capacitors arranged in a matrix can be referred to as a capacitive touch sensor, and a substrate provided with an imaging element can be referred to as an optical touch sensor.
  • For the flexible substrate 141 , a resin that is thin enough to have flexibility can be used.
  • Examples of the resin include a polyester, a polyolefin, a polyamide (such as a nylon or an aramid), a polyimide, a polycarbonate, and an acrylic resin.
  • As a normal non-flexible substrate, a glass substrate, a quartz substrate, a semiconductor substrate, or the like can be used.
  • a flexible positional data input portion which can be bent at a position overlapping with the first curved portion 120 c ( 1 ) can be used as the positional data input portion 140 .
  • Specific examples of structures that can be employed in the positional data input portion 140 are described in Embodiments 4 to 6.
  • the communication portion 160 supplies the data COM supplied by the arithmetic device 110 to a device or a communication network outside the data processing device 100 . Furthermore, the communication portion 160 acquires the data COM from the device or communication network outside the data processing device 100 and supplies the data COM.
  • the data COM can include a variety of instructions or the like in addition to audio data, image data, and the like.
  • the data COM can include an operating instruction to make the arithmetic portion 111 generate or delete the first image data V 1 and the second image data V 2 .
  • a communication unit for connection to the external device or external communication network (e.g., a hub, a router, or a modem) can be used for the communication portion 160 .
  • the connection method is not limited to a method using a wire, and a wireless method (e.g., radio wave or infrared rays) may be used.
  • As the input and output portion 145 , for example, a camera, a microphone, a read-only external memory portion, an external memory portion, a scanner, a speaker, or a printer can be used (see FIG. 1 ).
  • As the camera, a digital camera, a digital video camera, or the like can be used.
  • As the external memory portion, a hard disk, a removable memory, or the like can be used.
  • As the read-only external memory portion, a CD-ROM, a DVD-ROM, or the like can be used.
  • the arithmetic device 110 includes the arithmetic portion 111 and the memory portion 112 .
  • the arithmetic device 110 supplies the first image data V 1 and the second image data V 2 and is supplied with the first sensing data S 1 and the second sensing data S 2 (see FIG. 1 ).
  • the arithmetic device 110 supplies the first image data V 1 and the second image data V 2 including an image used for operation of the data processing device 100 .
  • first image data V 1 is displayed on the first display portion 130 ( 1 )
  • second image data V 2 is displayed on the second display portion 130 ( 2 ).
  • the arithmetic device 110 may be configured to be supplied with the first positional data L 1 and the second positional data L 2 .
  • the user of the data processing device 100 can supply an operating instruction associated with the image to the arithmetic device 110 .
  • the arithmetic device 110 may further include the transmission path 114 and the input and output interface 115 .
  • the arithmetic portion 111 executes the program stored in the memory portion 112 . For example, in response to supply of positional data that is associated with a position in which an image used for operation is displayed, the arithmetic portion 111 executes a program associated in advance with the image.
  • the memory portion 112 stores the program to be executed by the arithmetic portion 111 .
  • the input and output interface 115 supplies data and is supplied with data.
  • the transmission path 114 can supply data, and the arithmetic portion 111 , the memory portion 112 , and the input and output interface 115 are supplied with the data.
  • the arithmetic portion 111 , the memory portion 112 , and the input and output interface 115 can supply data, and the transmission path 114 is supplied with the data.
  • the data processing device 100 includes the arithmetic device 110 , the input and output device 120 , and the housing 101 (see FIG. 2B ).
  • the housing 101 protects the arithmetic device 110 and the like from external stress.
  • the housing 101 can be formed using metal, plastic, glass, ceramics, or the like.
  • Another structure of a data processing device of one embodiment of the present invention will be described with reference to FIGS. 3 A 1 , 3 A 2 , 3 A 3 , 3 B, and 3 C.
  • FIGS. 3 A 1 , 3 A 2 , 3 A 3 , 3 B, and 3 C illustrate a structure of a data processing device 100 B of one embodiment of the present invention.
  • FIGS. 3 A 1 and 3 A 2 are front and rear perspective views, respectively, of the data processing device 100 B of one embodiment of the present invention.
  • FIG. 3 A 3 is a top view thereof.
  • FIG. 3B is a schematic diagram illustrating the appearance of the positional data input portion 140 and the display portion 130 which can be used in the data processing device 100 B.
  • FIG. 3C illustrates a usage state of the data processing device 100 B.
  • the data processing device 100 B described in this embodiment differs from the data processing device 100 described with reference to FIGS. 2A , 2 B, 2 C 1 , 2 C 2 , and 2 D, in including the second sensing portion 150 ( 2 ) that senses an object obscuring the second display portion 130 ( 2 ) and supplies the second sensing data S 2 .
  • Different parts are described in detail below, and the above description is referred to for the other similar parts.
  • the input and output device 120 includes the second sensing portion 150 ( 2 ) that senses an object obscuring the second display portion 130 ( 2 ) and supplies the second sensing data S 2 .
  • the arithmetic device 110 is supplied with the second sensing data S 2 .
  • the arithmetic portion 111 generates the first image data V 1 and/or the second image data V 2 based on the first sensing data S 1 and/or the second sensing data S 2 .
  • the data processing device 100 B described in this embodiment can generate image data based on sensing data supplied from one region and display it on the input and output device. Consequently, a novel data processing device can be provided.
  • When the data processing device 100 B is put in a breast pocket of the user's clothes with the second region 120 ( 2 ) facing upward, the user can easily see text or image information displayed on the second region 120 ( 2 ) while the data processing device 100 B is placed in the pocket (see FIG. 3C ).
  • the user can see, from above, the second region 120 ( 2 ) displaying the phone number, name, and the like of the caller of an incoming call.
  • the data processing device 100 B can be provided with a vibration sensor or the like and a memory device that stores a program for shifting a mode into an incoming call rejection mode in accordance with vibration sensed by the vibration sensor or the like.
  • the user can shift the mode into the incoming call rejection mode by tapping the data processing device 100 B over his/her clothes so as to apply vibration.
  • There is no particular limitation on the display portion 130 as long as the display portion 130 can display supplied image data (see FIG. 3B ).
  • the display portion that can be used in the data processing device 100 can be used in the data processing device 100 B.
  • the display portion 130 includes the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ). Note that a plurality of second display portions 130 ( 2 ) may be provided.
  • the first display portion 130 ( 1 ) displays the first image data V 1 that is supplied thereto, and the second display portion 130 ( 2 ) displays the second image data V 2 that is supplied thereto.
  • the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ) may be driven as one display portion.
  • one driver circuit may supply signals to select scan lines.
  • the first display portion 130 ( 1 ) and the second display portion 130 ( 2 ) may be driven as different display portions.
  • separate driver circuits may be provided for the display portions, and the driver circuits may supply signals to select scan lines to the corresponding display portions.
  • When the data processing device 100 B is in a standby state, only the second display portion 130 ( 2 ) may be driven, and drive of the first display portion 130 ( 1 ) may be stopped. Stopping drive of the first display portion 130 ( 1 ) can reduce power consumption.
  • a flexible display portion which can be bent at positions overlapping with the first curved portion 120 c ( 1 ) and the second curved portion 120 c ( 2 ) can be used as the display portion 130 .
  • Specific examples of structures that can be employed in the display portion 130 are described in Embodiments 4 to 6.
  • the sensing portion 150 senses the states of the data processing device 100 B and the circumstances and supplies sensing data (see FIG. 1 and FIGS. 3 A 1 and 3 A 2 ).
  • the sensing portion 150 includes the first sensing portion 150 ( 1 ) and the second sensing portion 150 ( 2 ).
  • the first sensing portion senses an object obscuring the first display portion 130 ( 1 )
  • the second sensing portion senses an object obscuring the second display portion 130 ( 2 ).
  • the first sensing portion 150 ( 1 ) supplies the first sensing data S 1 including data about whether the first display portion 130 ( 1 ) is obscured or not
  • the second sensing portion 150 ( 2 ) supplies the second sensing data S 2 including data about whether the second display portion 130 ( 2 ) is obscured or not.
  • the second sensing data includes data about whether any one of the second display portions is obscured or not.
  • a sensing element that can be used in the first sensing portion 150 ( 1 ) can be used in the second sensing portion 150 ( 2 ).
  • a photoelectric conversion element provided so as to sense an object obscuring the second display portion 130 ( 2 ) can be used in the second sensing portion 150 ( 2 ).
  • a photoelectric conversion element 150 PD( 1 ) is provided in the first region 120 ( 1 ) so as to sense the intensity of light incident from a side where the first region 120 ( 1 ) displays an image
  • a photoelectric conversion element 150 PD( 2 ) is provided in the second region 120 ( 2 ) so as to sense the intensity of light incident from a side where the second display portion 130 ( 2 ) displays an image (see FIG. 3 A 1 or 3 A 2 ).
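One way the sensing data about whether a display portion is obscured could be derived from a photoelectric conversion element is a simple intensity threshold. A minimal sketch, noting that the cutoff value is an assumption (the embodiment gives no numeric value):

```python
def is_obscured(incident_light: float, threshold: float = 5.0) -> bool:
    """Return True when a display portion appears to be covered.

    `incident_light` is the intensity sensed by the region's
    photoelectric conversion element, in arbitrary units; `threshold`
    is a hypothetical cutoff below which the region is assumed to be
    covered (e.g., by a pocket, a protective cover, or clothes).
    """
    return incident_light < threshold
```

The first sensing data S 1 and the second sensing data S 2 would then each carry the result of such a comparison for their respective regions.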
  • In some cases, the first region including the photoelectric conversion element 150 PD( 1 ) and/or the second region including the photoelectric conversion element 150 PD( 2 ) of the data processing device 100 B are/is covered with a protective case or cover for the data processing device 100 B, clothes, or the like.
  • the sensing portion 150 may be configured to sense an object obscuring another display portion.
  • There is no particular limitation on the positional data input portion 140 as long as the positional data input portion 140 can supply positional data (see FIG. 3B ).
  • the positional data input portion that can be used in the data processing device 100 can be used in the data processing device 100 B.
  • a flexible positional data input portion which can be bent at a position overlapping with the first curved portion 120 c ( 1 ) can be used as the positional data input portion 140 .
  • Specific examples of structures that can be employed in the positional data input portion 140 are described in Embodiments 4 to 6.
  • A structure of a data processing device of one embodiment of the present invention will be described with reference to FIG. 4 and FIGS. 5 A 1 , 5 A 2 , 5 B, 5 C, and 5 D.
  • FIG. 4 illustrates that a display portion 130 , a positional data input portion 140 , and a sensing portion 150 of a data processing device 100 C of one embodiment of the present invention differ from those of the data processing device 100 illustrated in FIG. 1 .
  • FIGS. 5 A 1 , 5 A 2 , 5 B, 5 C, and 5 D illustrate a structure of the data processing device 100 C of one embodiment of the present invention.
  • FIG. 5 A 1 is a top view of the data processing device 100 C in an unfolded state
  • FIG. 5 A 2 is a bottom view of the data processing device 100 C in the unfolded state.
  • FIG. 5B is a side view of the data processing device 100 C
  • FIG. 5C is a side view including a cross section taken along a cutting-plane line Y1-Y2 in FIG. 5 A 1 .
  • FIGS. 6 A 1 , 6 A 2 , 6 B 1 , and 6 B 2 illustrate the data processing device 100 C in half-folded states.
  • FIGS. 6 A 1 and 6 A 2 are side views illustrating a folded state in which a display portion in a first region 120 ( 1 ) faces inward.
  • FIGS. 6 B 1 and 6 B 2 are side views illustrating a folded state in which the display portion in the first region 120 ( 1 ) faces outward.
  • the data processing device 100 C described in this embodiment differs from the data processing device 100 described in Embodiment 1 with reference to FIG. 1 , in the following points: the input and output device 120 is supplied with first image data V 1 (V 1 includes V 1 a and V 1 b ) and the second image data V 2 and supplies first positional data L 1 (L 1 includes L 1 a and L 1 b ), the second positional data L 2 , first sensing data S 1 (S 1 includes S 1 a and S 1 b ), and the second sensing data S 2 ;
  • the first display portion 130 ( 1 ) includes a display portion 130 ( 1 a ) and a display portion 130 ( 1 b );
  • the first positional data input portion 140 ( 1 ) includes a positional data input portion 140 ( 1 a ) and a positional data input portion 140 ( 1 b );
  • the first sensing portion 150 ( 1 ) includes a sensing portion 150 ( 1 a ) and a sensing portion 150 ( 1 b );
  • the input and output device 120 includes the first region 120 ( 1 ) and the second region 120 ( 2 ).
  • the first region 120 ( 1 ) includes the region 120 ( 1 a ) and the region 120 ( 1 b ).
  • the first region 120 ( 1 ) can be folded at a portion between the region 120 ( 1 a ) and the region 120 ( 1 b ) (see FIG. 4 ).
  • the region 120 ( 1 a ) includes the display portion 130 ( 1 a ) and the positional data input portion 140 ( 1 a ), and the region 120 ( 1 b ) includes the display portion 130 ( 1 b ) and the positional data input portion 140 ( 1 b ) (see FIG. 4 and FIG. 5C ).
  • the second region 120 ( 2 ) includes the display portion 130 ( 2 ) and the positional data input portion 140 ( 2 ).
  • the sensing portion 150 includes the sensing portion 150 ( 1 a ), the sensing portion 150 ( 1 b ), and the sensing portion 150 ( 2 ).
  • the sensing portion 150 ( 1 a ) is provided in a housing 15 a so as to be able to sense an object obscuring the display portion in the region 120 ( 1 a ), and the sensing portion 150 ( 1 b ) is provided in a housing 15 b so as to be able to sense an object obscuring the display portion in the region 120 ( 1 b ) (see FIG. 5 A 1 ).
  • FIGS. 6 A 1 and 6 A 2 are side views of the data processing device 100 C in a half-folded state in which the sensing portion 150 ( 1 a ) is located on the inner side.
  • the region 120 ( 1 a ) faces the region 120 ( 1 b ), and the region 120 ( 1 a ) is obscured by the region 120 ( 1 b ).
  • the region 120 ( 1 b ) is obscured by the region 120 ( 1 a ).
  • the second region 120 ( 2 ) of the data processing device 100 C in this folded state can display an image in one direction indicated by an arrow in FIG. 6 A 1 .
  • the folded state in which the first region 120 ( 1 ) faces inward can be found from the sensing data S 1 a supplied from the sensing portion 150 ( 1 a ) and/or the sensing data S 1 b supplied from the sensing portion 150 ( 1 b ). Then, drive of an obscured portion of the first display portion 130 ( 1 ) may be stopped. This can reduce power consumption.
  • FIGS. 6 B 1 and 6 B 2 illustrate the data processing device 100 C in a half-folded state in which the sensing portion 150 ( 1 a ) is located on the outer side.
  • a back side of the region 120 ( 1 a ) faces a back side of the region 120 ( 1 b ), and neither the region 120 ( 1 a ) nor the region 120 ( 1 b ) is obscured by the other.
  • the first region 120 ( 1 ) of the data processing device 100 C in this folded state can display an image in three directions indicated by arrows in FIG. 6 B 2 .
  • the second region 120 ( 2 ) can display an image in another direction.
  • the orientation of the data processing device 100 C or the like can be found from the sensing data S 1 a supplied from the sensing portion 150 ( 1 a ), the sensing data S 1 b supplied from the sensing portion 150 ( 1 b ), or sensing data supplied from a gravity sensor or a gyro sensor. Then, a portion where display is not necessary in the first region 120 ( 1 ) may be determined from a combination of these pieces of sensing data, and its drive may be stopped. This can reduce power consumption.
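The selection of which parts of the first display portion to keep driving in the half-folded states can be sketched as follows; the boolean flags and portion labels are illustrative stand-ins for the sensing data S 1 a and S 1 b , which the embodiment does not specify at this level of detail:

```python
def first_display_portions_to_drive(s1a_obscured: bool,
                                    s1b_obscured: bool) -> set:
    """Select the parts of the first display portion 130(1) to drive.

    `s1a_obscured` / `s1b_obscured` stand in for the sensing data S1a
    and S1b (True meaning the corresponding region is obscured, as in
    the inward-folded state); the labels are illustrative. Drive of an
    obscured portion is stopped to reduce power consumption.
    """
    driven = set()
    if not s1a_obscured:
        driven.add("130(1a)")
    if not s1b_obscured:
        driven.add("130(1b)")
    return driven
```

In the inward-folded state both flags would be true and the whole first display portion could be stopped; in the outward-folded state both parts remain driven.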
  • the data processing device 100 C includes the first region 120 ( 1 ) which can be folded or unfolded. Accordingly, the data processing device 100 C can be used with the first region having a highly portable size or a highly browsable size. Consequently, a novel data processing device can be provided.
  • the data processing device 100 C includes the input and output device 120 that is supplied with the first image data V 1 and the second image data V 2 and supplies the first sensing data S 1 , and an arithmetic device 110 that supplies the first image data V 1 and the second image data V 2 and is supplied with the first sensing data S 1 (see FIG. 4 ).
  • the input and output device 120 includes a terminal 125 that is supplied with the first image data V 1 and the second image data V 2 , the first display portion 130 ( 1 ) that is supplied with and displays the first image data V 1 , the second display portion 130 ( 2 ) that is supplied with and displays the second image data V 2 , and the first sensing portion 150 ( 1 ) that senses an object obscuring the first display portion 130 ( 1 ) and supplies the first sensing data S 1 ( FIG. 4 and FIG. 5C ).
  • the input and output device 120 also includes the first region 120 ( 1 ) provided with the first display portion 130 ( 1 ), the second region 120 ( 2 ) provided with the second display portion 130 ( 2 ), a third region 120 ( 3 ) provided with the terminal 125 , the first curved portion 120 c ( 1 ) between the first region 120 ( 1 ) and the second region 120 ( 2 ), and the second curved portion 120 c ( 2 ) between the first region 120 ( 1 ) and the third region 120 ( 3 ) ( FIG. 5C ).
  • the third region 120 ( 3 ) supplies the first image data V 1 and the second image data V 2 .
  • the first region 120 ( 1 ) is supplied with the first image data V 1 and the second image data V 2 and supplies the second image data V 2 .
  • the second region 120 ( 2 ) is supplied with the second image data V 2 ( FIG. 5D ).
  • the arithmetic portion 111 generates the first image data V 1 or the second image data V 2 based on the first sensing data S 1 .
  • the data processing device 100 C includes the first region 120 ( 1 ) provided with the first display portion 130 ( 1 ), the second region 120 ( 2 ) provided with the second display portion 130 ( 2 ), the third region 120 ( 3 ) provided with the terminal 125 , the first curved portion 120 c ( 1 ) between the first region 120 ( 1 ) and the second region 120 ( 2 ), and the second curved portion 120 c ( 2 ) between the first region 120 ( 1 ) and the third region 120 ( 3 ). Accordingly, the terminal 125 can supply the first image data V 1 and the second image data V 2 .
  • the first region 120 ( 1 ) can display the first image data V 1 and supplies the second image data V 2
  • the second region 120 ( 2 ) can display the second image data V 2 . Consequently, a novel data processing device can be provided.
  • the data processing device 100 C differs from the data processing device described in Embodiment 1 in that a foldable housing is included and that the first region 120 ( 1 ) can be folded. Different parts are described in detail below, and the above description is referred to for the other similar parts.
  • the data processing device 100 C includes the input and output device 120 , and the input and output device 120 includes the first region 120 ( 1 ) which can be folded or unfolded.
  • the second region 120 ( 2 ) is provided such that the first curved portion 120 c ( 1 ) is located between the first region 120 ( 1 ) and the second region 120 ( 2 )
  • the third region 120 ( 3 ) is provided such that the second curved portion 120 c ( 2 ) is located between the first region 120 ( 1 ) and the third region 120 ( 3 ) (see FIG. 4 and FIGS. 5 A 1 , 5 A 2 , 5 B, 5 C, and 5 D).
  • a signal line is provided in the first region 120 ( 1 ), the second region 120 ( 2 ), and the third region 120 ( 3 ), and the first region 120 ( 1 ) is electrically connected to the second region 120 ( 2 ) and the third region 120 ( 3 ).
  • the first region 120 ( 1 ) can be folded or unfolded and is held in a foldable housing.
  • The sensing portion 150 ( 1 a ) and the sensing portion 150 ( 1 b ) may be provided.
  • the housing allows the first region 120 ( 1 ) to be folded or unfolded.
  • the data processing device 100 C includes housings 13 a and 13 b which are flexible and the housings 15 a and 15 b which are less flexible than the housings 13 a and 13 b.
  • a flexible member or a hinge can be used for the foldable housing.
  • the housing may be folded or unfolded by a method using user's hands, a spring, a motor, a piezoelectric element, or the like.
  • a resin, a rubber, a silicone rubber, or the like can be used for the flexible member.
  • a metal, an alloy, an engineering plastic, or the like can be used for the hinge.
  • the data processing device 100 C may include a housing that is more rigid than the foldable housing.
  • the housing 13 a is shaped so as not to obscure the first region 120 ( 1 ) and the second region 120 ( 2 ) (see FIG. 5 A 1 ), and the display portion 130 and the positional data input portion 140 are provided between the housing 13 a and the housing 13 b (see FIG. 5B ).
  • the housings 13 a and 13 b connect the housings 15 a and 15 b (see FIGS. 5 A 1 and 5 A 2 ).
  • the housing 15 a is shaped so as not to obscure the first region 120 ( 1 ) (see FIGS. 5 A 1 and 5 C).
  • the arithmetic device 110 is stored in the housing 15 a .
  • the arithmetic device 110 includes the terminal that supplies the first image data V 1 and the second image data V 2 and is supplied with the first sensing data S 1 .
  • the housing 15 b has an opening so as not to obscure the first region 120 ( 1 ) and the second region 120 ( 2 ). Specifically, the housing 15 a has an opening so as not to obscure the first region 120 ( 1 ), and the housing 15 b has an opening at a right-hand side surface so as not to obscure the second region 120 ( 2 ).
  • a user of the data processing device 100 C can hold the data processing device 100 C with the other hand such that the second region 120 ( 2 ) is positioned on a left-hand side.
  • the first image data V 1 or the second image data V 2 may be generated on the basis of sensing data about the orientation of the data processing device 100 C which is supplied from the sensing portion 150 . Accordingly, favorable display can be performed according to which hand is used to hold the data processing device 100 C. For example, a user can hold the housing 15 a with his/her left hand so that his/her right hand can be used to supply positional data from the positional data input portion 140 ( 2 ) in the second region 120 ( 2 ).
  • the input and output device 120 includes the first region 120 ( 1 ) that can be folded. Note that the first region 120 ( 1 ) includes the first display portion 130 ( 1 ) and the first sensing portion 150 ( 1 ).
  • an input and output device including a flexible substrate and a thin film element formed over the flexible substrate can be used as the input and output device 120 .
  • the first region 120 ( 1 ) and the second region 120 ( 2 ) can be integrated. Note that specific examples of structures that can be employed in the foldable input and output device 120 are described in Embodiments 4 to 6.
  • the input and output device 120 includes the terminal 125 in the third region 120 ( 3 ).
  • the third region 120 ( 3 ) includes the terminal 125 that is supplied with the first image data V 1 and the second image data V 2 and supplies the first sensing data S 1 (see FIG. 5D ).
  • the input and output device 120 includes a plurality of wirings.
  • a wiring 126 is electrically connected to the terminal 125 , through which a signal, a power supply potential, or the like can be supplied to the terminal.
  • the first image data V 1 and the second image data V 2 supplied thereto are supplied to the first region 120 ( 1 ).
  • the second image data V 2 supplied thereto is supplied to the second region 120 ( 2 ).
  • A structure of a data processing device of one embodiment of the present invention will be described with reference to FIGS. 7 and 8 .
  • FIG. 7 is a flowchart showing a program for the data processing device of one embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating an interrupt processing of the program described with reference to FIG. 7 .
  • the data processing devices 100 , 100 B, and 100 C described in this embodiment each include the memory portion 112 that stores the program including the following steps.
  • initial data including status data is acquired (S 1 in FIG. 7 ).
  • the initial data used in a later step is acquired.
  • predetermined data may be used, or sensing data supplied from the sensing portion may be used.
  • an interrupt processing is allowed (S 2 in FIG. 7 ).
  • the arithmetic portion 111 can receive an instruction to execute the interrupt processing.
  • the arithmetic portion 111 that has received the instruction to execute the interrupt processing stops the main processing and executes the interrupt processing.
  • the arithmetic portion 111 that has received an event associated with the instruction executes the interrupt processing, and stores the execution result in the memory portion. Then, the arithmetic portion 111 that has returned from the interrupt processing can resume the main processing on the basis of the execution result of the interrupt processing.
  • predetermined data is acquired (S 3 in FIG. 7 ).
  • Predetermined data which is the basis of first image data or second image data generated in a later step is acquired. For example, image data or text data whose size has not yet been optimized for the first region 120 ( 1 ) or the second region 120 ( 2 ) is acquired. Note that an operating instruction or data supplied in the interrupt processing is reflected in the third and subsequent steps.
  • a fifth step is selected when the status data shows a first status, or a sixth step is selected when the status data shows a second status (S 4 in FIG. 7 ).
  • the fifth step is selected when the first region 120 ( 1 ) is not obscured according to the status data determined on the basis of the first sensing data S 1 supplied from the first sensing portion 150 ( 1 ), or the sixth step is selected when the first region 120 ( 1 ) is obscured.
  • the first image data V 1 is generated on the basis of the data acquired in the third step, and the first image data V 1 is displayed on the first display portion 130 ( 1 ) (S 5 in FIG. 7 ).
  • the first image data V 1 is generated such that text information is displayed in a single line or a plurality of lines. It can also be generated on the basis of the orientation or size of the first display portion 130 ( 1 ) or a preferred design set by a user.
  • the second image data V 2 is generated on the basis of the data acquired in the third step, and the second image data V 2 is displayed on the second display portion 130 ( 2 ) (S 6 in FIG. 7 ).
  • the second image data V 2 is generated such that text information is displayed so as to move from one side to the other. It can also be generated on the basis of the orientation or size of the second display portion 130 ( 2 ) or a preferred design set by a user.
  • an eighth step is selected when a termination instruction is supplied in the interrupt processing, or the third step is selected when no termination instruction is supplied in the interrupt processing (S 7 in FIG. 7 ).
  • the program terminates (S 8 in FIG. 7 ).
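The main processing S 1 to S 8 above can be sketched as a loop. The four callables are hypothetical stand-ins for the hardware and are not named in the embodiment: `get_status()` returns status data derived from the sensing data, `get_data()` acquires the predetermined data, `display(portion, data)` drives a display portion, and `termination_requested()` reflects a termination instruction supplied in the interrupt processing:

```python
def run_program(get_status, get_data, display, termination_requested):
    """Sketch of the main processing S1-S8 shown in FIG. 7.

    All four callables are hypothetical stand-ins for the device
    hardware; the status value "first" marks the state in which the
    first region is not obscured.
    """
    status = get_status()              # S1: acquire initial status data
    # S2: from here on, interrupt processing is allowed
    while True:
        data = get_data()              # S3: acquire predetermined data
        if status == "first":          # S4: branch on the status data
            display("130(1)", data)    # S5: generate and display V1
        else:
            display("130(2)", data)    # S6: generate and display V2
        if termination_requested():    # S7: branch on termination
            break
    # S8: the program terminates
```

In the real device the status data would be refreshed by the interrupt processing rather than read once, so the branch at S 4 can change between iterations of the loop.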
  • the interrupt processing includes the following steps.
  • the first sensing data S 1 and the second sensing data S 2 are acquired (T 9 in FIG. 8 ).
  • the first sensing data S 1 supplied from the first sensing portion 150 ( 1 ) and the second sensing data S 2 supplied from the second sensing portion 150 ( 2 ) are acquired using a timer or the like.
  • candidate data based on the first sensing data S 1 is determined (T 10 in FIG. 8 ).
  • a twelfth step is selected when the candidate data differs from the status data, or the ninth step is selected when the candidate data is the same as the status data (T 11 in FIG. 8 ).
  • the status data is updated with the candidate data (T 12 in FIG. 8 ).
  • the status data is updated when there is a change in the first sensing data S 1 .
  • the operation returns from the interrupt processing (T 13 in FIG. 8 ).
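The interrupt processing T 9 to T 13 above can be sketched as follows. `read_s1` is a hypothetical callable returning the first sensing data S 1 , and the status labels are illustrative; only S 1 feeds the candidate data, matching the tenth step:

```python
def interrupt_processing(status, read_s1):
    """Sketch of the interrupt processing T9-T13 shown in FIG. 8.

    `read_s1` is a hypothetical callable returning the first sensing
    data S1 (True when the first region is obscured). The return value
    is the possibly updated status data, which the main processing
    uses when it resumes.
    """
    s1 = read_s1()                           # T9: acquire the sensing data
    candidate = "second" if s1 else "first"  # T10: determine candidate data
    if candidate != status:                  # T11: compare with status data
        status = candidate                   # T12: update the status data
    return status                            # T13: return from the interrupt
```

When the candidate data equals the current status data, the status is left unchanged and the operation simply returns, as in the eleventh step.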
  • the program includes the step of determining the candidate data by acquiring the first sensing data; the step of updating the status data with the candidate data when the status data differs from the candidate data; and the step of generating and displaying image data including predetermined data based on the updated status data.
  • a structure of a bendable or foldable touch panel that can be used in the display portion 130 and the positional data input portion 140 of the data processing device of one embodiment of the present invention will be described with reference to FIGS. 9A to 9C.
  • FIG. 9A is a top view illustrating the structure of the touch panel that can be used in the data processing device of one embodiment of the present invention.
  • FIG. 9B is a cross-sectional view taken along cutting-plane lines A-B and C-D in FIG. 9A .
  • FIG. 9C is a cross-sectional view taken along a cutting-plane line E-F in FIG. 9A .
  • a touch panel 300 described as an example in this embodiment includes a display portion 301 (see FIG. 9A ).
  • the display portion 301 includes a plurality of pixels 302 and a plurality of imaging pixels 308 .
  • the imaging pixels 308 can sense a touch of a finger or the like on the display portion 301 .
  • a touch sensor can be formed using the imaging pixels 308 .
  • Each of the pixels 302 includes a plurality of sub-pixels (e.g., a sub-pixel 302 R).
  • in the sub-pixels (e.g., the sub-pixel 302 R), light-emitting elements and pixel circuits that can supply electric power for driving the light-emitting elements are provided.
  • the pixel circuits are electrically connected to wirings through which selection signals and image signals are supplied.
  • the touch panel 300 is provided with a scan line driver circuit 303 g ( 1 ) that can supply selection signals to the pixels 302 and an image signal line driver circuit 303 s ( 1 ) that can supply image signals to the pixels 302 .
  • the imaging pixels 308 include photoelectric conversion elements and imaging pixel circuits that drive the photoelectric conversion elements.
  • the imaging pixel circuits are electrically connected to wirings through which control signals and power supply potentials are supplied.
  • control signals include a signal for selecting an imaging pixel circuit from which a recorded imaging signal is read, a signal for initializing an imaging pixel circuit, and a signal for determining the time for an imaging pixel circuit to sense light.
  • the touch panel 300 is provided with an imaging pixel driver circuit 303 g ( 2 ) that can supply control signals to the imaging pixels 308 and an imaging signal line driver circuit 303 s ( 2 ) that reads out imaging signals.
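The three control signals listed above correspond to the usual reset/integrate/read sequence of a photodiode-based imaging pixel. The following behavioral sketch is an assumption for illustration only; the class and method names are invented and not taken from the disclosure.

```python
class ImagingPixel:
    # Behavioral model of the described control signals: an initialize
    # signal resets the stored charge, a control signal sets how long the
    # pixel senses light (the exposure), and a select signal reads out
    # the recorded imaging signal.
    def __init__(self):
        self.charge = 0.0

    def initialize(self):
        # "signal for initializing an imaging pixel circuit"
        self.charge = 0.0

    def expose(self, light, duration):
        # "signal for determining the time ... to sense light";
        # photocurrent integrates linearly over the exposure time.
        self.charge += light * duration

    def read(self):
        # "signal for selecting an imaging pixel circuit from which
        # a recorded imaging signal is read"
        return self.charge
```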
  • the touch panel 300 includes a substrate 310 and a counter substrate 370 opposite to the substrate 310 (see FIG. 9B ).
  • the touch panel 300 can have flexibility.
  • a functional element is preferably positioned in the center between the substrate 310 and the counter substrate 370 because a change in shape of the functional element can be prevented.
  • the substrate 310 is preferably formed using a material whose coefficient of linear expansion is substantially equal to that of the counter substrate 370 .
  • the coefficients of linear expansion of the materials are preferably lower than or equal to 1 × 10⁻³/K, further preferably lower than or equal to 5 × 10⁻⁵/K, and still further preferably lower than or equal to 1 × 10⁻⁵/K.
  • materials that include polyester, polyolefin, polyamide (e.g., nylon, aramid), polyimide, polycarbonate, or a resin having an acrylic bond, a urethane bond, an epoxy bond, or a siloxane bond can be used for the substrate 310 and the counter substrate 370 .
  • the substrate 310 is a stacked body in which a substrate 310 b having flexibility, a barrier film 310 a that prevents unintentional diffusion of impurities to the light-emitting elements, and a resin layer 310 c that attaches the barrier film 310 a to the substrate 310 b are stacked.
  • the counter substrate 370 is a stacked body including a substrate 370 b having flexibility, a barrier film 370 a that prevents unintentional diffusion of impurities to the light-emitting elements, and a resin layer 370 c that attaches the barrier film 370 a to the substrate 370 b (see FIG. 9B ).
  • a sealant 360 attaches the counter substrate 370 to the substrate 310 .
  • the sealant 360 , which also serves as an optical adhesive layer, has a refractive index higher than that of air.
  • the pixel circuits and the light-emitting elements (e.g., a first light-emitting element 350 R) are provided between the substrate 310 and the counter substrate 370 .
  • Each of the pixels 302 includes the sub-pixel 302 R, a sub-pixel 302 G, and a sub-pixel 302 B (see FIG. 9C ).
  • the sub-pixel 302 R includes a light-emitting module 380 R, the sub-pixel 302 G includes a light-emitting module 380 G, and the sub-pixel 302 B includes a light-emitting module 380 B.
  • the sub-pixel 302 R includes the first light-emitting element 350 R and the pixel circuit that can supply electric power to the first light-emitting element 350 R and includes a transistor 302 t (see FIG. 9B ).
  • the light-emitting module 380 R includes the first light-emitting element 350 R and an optical element (e.g., a first coloring layer 367 R).
  • the first light-emitting element 350 R includes a first lower electrode 351 R, an upper electrode 352 , and a layer 353 containing a light-emitting organic compound between the first lower electrode 351 R and the upper electrode 352 (see FIG. 9C ).
  • the layer 353 containing a light-emitting organic compound includes a light-emitting unit 353 a , a light-emitting unit 353 b , and an intermediate layer 354 between the light-emitting units 353 a and 353 b.
  • the first coloring layer 367 R of the light-emitting module 380 R is provided on the counter substrate 370 .
  • the coloring layer transmits light of a particular wavelength and is, for example, a layer that selectively transmits light of red, green, or blue color. A region that transmits light emitted from the light-emitting element as it is may be provided.
  • the light-emitting module 380 R, for example, includes the sealant 360 that is in contact with the first light-emitting element 350 R and the first coloring layer 367 R.
  • the first coloring layer 367 R is positioned in a region overlapping with the first light-emitting element 350 R. Accordingly, part of light emitted from the first light-emitting element 350 R passes through the sealant 360 that also serves as an optical adhesive layer and through the first coloring layer 367 R and is emitted to the outside of the light-emitting module 380 R as indicated by arrows in FIGS. 9B and 9C .
  • the touch panel 300 includes a light-blocking layer 367 BM on the counter substrate 370 .
  • the light-blocking layer 367 BM is provided so as to surround the coloring layer (e.g., the first coloring layer 367 R).
  • the touch panel 300 includes an anti-reflective layer 367 p positioned in a region overlapping with the display portion 301 .
  • as the anti-reflective layer 367 p, a circular polarizing plate can be used, for example.
  • the touch panel 300 includes an insulating film 321 .
  • the insulating film 321 covers the transistor 302 t .
  • the insulating film 321 can be used as a layer for planarizing unevenness caused by the pixel circuits.
  • A stacked insulating film that includes a layer capable of preventing diffusion of impurities to the transistor 302 t and the like can be used as the insulating film 321 .
  • the touch panel 300 includes the light-emitting elements (e.g., the first light-emitting element 350 R) over the insulating film 321 .
  • the touch panel 300 includes, over the insulating film 321 , a partition wall 328 that overlaps with an end portion of the first lower electrode 351 R (see FIG. 9C ).
  • a spacer 329 that controls the distance between the substrate 310 and the counter substrate 370 is provided over the partition wall 328 .
  • the image signal line driver circuit 303 s ( 1 ) includes a transistor 303 t and a capacitor 303 c . Note that the driver circuit can be formed in the same process and over the same substrate as those of the pixel circuits.
  • the imaging pixels 308 each include a photoelectric conversion element 308 p and an imaging pixel circuit for sensing light received by the photoelectric conversion element 308 p .
  • the imaging pixel circuit includes a transistor 308 t.
  • a PIN photodiode can be used as the photoelectric conversion element 308 p.
  • the touch panel 300 includes a wiring 311 through which a signal is supplied.
  • the wiring 311 is provided with a terminal 319 .
  • an FPC 309 ( 1 ) through which a signal such as an image signal or a synchronization signal is supplied is electrically connected to the terminal 319 .
  • a printed wiring board (PWB) may be attached to the FPC 309 ( 1 ).
  • Transistors formed in the same process can be used as the transistor 302 t , the transistor 303 t , the transistor 308 t , and the like.
  • Transistors of a bottom-gate type, a top-gate type, or the like can be used.
  • any of various kinds of semiconductors can be used in the transistors.
  • an oxide semiconductor, single crystal silicon, polysilicon, amorphous silicon, or the like can be used.
  • a structure of a bendable or foldable touch panel that can be used in the data processing device of one embodiment of the present invention will be described with reference to FIGS. 10A and 10B and FIGS. 11A to 11C.
  • FIG. 10A is a perspective view of a touch panel 500 described as an example in this embodiment. Note that FIGS. 10A and 10B illustrate only main components for simplicity. FIG. 10B is a developed perspective view of the touch panel 500 .
  • FIGS. 11A to 11C are cross-sectional views of the touch panel 500 taken along line X1-X2 in FIG. 10A .
  • the touch panel 500 includes a display portion 501 and a touch sensor 595 (see FIG. 10B ).
  • the touch panel 500 includes a substrate 510 , a substrate 570 , and a substrate 590 . Note that the substrate 510 , the substrate 570 , and the substrate 590 each have flexibility.
  • the display portion 501 includes the substrate 510 , a plurality of pixels over the substrate 510 , a plurality of wirings 511 through which signals are supplied to the pixels, and an image signal line driver circuit 503 s ( 1 ).
  • the plurality of wirings 511 are led to a peripheral portion of the substrate 510 , and parts of the plurality of wirings 511 form a terminal 519 .
  • the terminal 519 is electrically connected to an FPC 509 ( 1 ).
  • the substrate 590 includes the touch sensor 595 and a plurality of wirings 598 electrically connected to the touch sensor 595 .
  • the plurality of wirings 598 are led to a peripheral portion of the substrate 590 , and parts of the plurality of wirings 598 form a terminal.
  • the terminal is electrically connected to an FPC 509 ( 2 ). Note that in FIG. 10B , electrodes, wirings, and the like of the touch sensor 595 provided on the back side of the substrate 590 (the side facing the substrate 510 ) are indicated by solid lines for clarity.
  • a capacitive touch sensor can be used as the touch sensor 595 .
  • Examples of the capacitive touch sensor are a surface capacitive touch sensor and a projected capacitive touch sensor.
  • Examples of the projected capacitive touch sensor are a self capacitive touch sensor and a mutual capacitive touch sensor, which differ mainly in the driving method.
  • the use of a mutual capacitive type is preferable because multiple points can be sensed simultaneously.
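A mutual capacitive sensor drives one electrode set and measures the charge coupled onto the intersecting set; a touch lowers the mutual capacitance at that intersection only. Because every intersection is read independently, several touch points can be located at once, which is why the mutual capacitive type suits multi-point sensing. The following minimal scan sketch is illustrative; the function names and the 20% detection threshold are assumptions, not from the disclosure.

```python
def scan_mutual(measure, n_tx, n_rx, baseline, threshold=0.2):
    # measure(tx, rx) returns the mutual capacitance at one intersection;
    # a nearby finger lowers it below the untouched baseline value.
    # Scanning all (tx, rx) pairs yields one reading per intersection,
    # so multiple simultaneous touches are resolved unambiguously
    # (unlike self-capacitance row/column sums, which can alias).
    touches = []
    for tx in range(n_tx):
        for rx in range(n_rx):
            if baseline - measure(tx, rx) > threshold * baseline:
                touches.append((tx, rx))
    return touches
```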
  • the projected capacitive touch sensor 595 includes electrodes 591 and electrodes 592 .
  • the electrodes 591 are electrically connected to any of the plurality of wirings 598 , and the electrodes 592 are electrically connected to any of the other wirings 598 .
  • the electrodes 592 each have a shape of a plurality of quadrangles arranged in one direction with one corner of a quadrangle connected to one corner of another quadrangle as illustrated in FIGS. 10A and 10B .
  • the electrodes 591 each have a quadrangular shape and are arranged in a direction intersecting with the direction in which the electrodes 592 extend.
  • a wiring 594 electrically connects two electrodes 591 between which the electrode 592 is positioned.
  • the intersecting area of the electrode 592 and the wiring 594 is preferably as small as possible.
  • Such a structure allows a reduction in the area of a region where the electrodes are not provided, reducing unevenness in transmittance. As a result, unevenness in luminance of light passing through the touch sensor 595 can be reduced.
  • the shapes of the electrodes 591 and the electrodes 592 are not limited thereto and can be any of a variety of shapes.
  • a structure may be employed in which the plurality of electrodes 591 are arranged so that gaps between the electrodes 591 are reduced as much as possible, and the electrodes 592 are spaced apart from the electrodes 591 with an insulating layer interposed therebetween to have regions not overlapping with the electrodes 591 .
  • the structure of the touch sensor 595 is described with reference to FIGS. 11A to 11C .
  • the touch sensor 595 includes the substrate 590 , the electrodes 591 and the electrodes 592 provided in a staggered arrangement on the substrate 590 , an insulating layer 593 covering the electrodes 591 and the electrodes 592 , and the wiring 594 that electrically connects the adjacent electrodes 591 to each other.
  • a resin layer 597 attaches the substrate 590 to the substrate 570 so that the touch sensor 595 overlaps with the display portion 501 .
  • the electrodes 591 and the electrodes 592 are formed using a light-transmitting conductive material.
  • as the light-transmitting conductive material, a conductive oxide such as indium oxide, indium tin oxide, indium zinc oxide, zinc oxide, or zinc oxide to which gallium is added can be used.
  • a film including graphene may be used as well.
  • the film including graphene can be formed, for example, by reducing a film containing graphene oxide. As a reducing method, a method with application of heat or the like can be employed.
  • the electrodes 591 and the electrodes 592 may be formed by depositing a light-transmitting conductive material on the substrate 590 by a sputtering method and then removing an unnecessary portion by any of various patterning techniques such as photolithography.
  • Examples of a material for the insulating layer 593 are a resin such as an acrylic resin or an epoxy resin, a resin having a siloxane bond, and an inorganic insulating material such as silicon oxide, silicon oxynitride, or aluminum oxide.
  • Openings reaching the electrodes 591 are formed in the insulating layer 593 , and the wiring 594 electrically connects the adjacent electrodes 591 .
  • a light-transmitting conductive material can be favorably used as the wiring 594 because the aperture ratio of the touch panel can be increased.
  • a material with higher conductivity than the conductivities of the electrodes 591 and 592 can be favorably used for the wiring 594 because electric resistance can be reduced.
  • One electrode 592 extends in one direction, and a plurality of electrodes 592 are provided in the form of stripes.
  • the wiring 594 intersects with the electrode 592 .
  • Adjacent electrodes 591 are provided with one electrode 592 provided therebetween.
  • the wiring 594 electrically connects the adjacent electrodes 591 .
  • the plurality of electrodes 591 are not necessarily arranged in the direction orthogonal to one electrode 592 and may be arranged to intersect with one electrode 592 at an angle of less than 90 degrees.
  • One wiring 598 is electrically connected to any of the electrodes 591 and 592 . Part of the wiring 598 functions as a terminal.
  • for the wiring 598 , a metal material such as aluminum, gold, platinum, silver, nickel, titanium, tungsten, chromium, molybdenum, iron, cobalt, copper, or palladium or an alloy material containing any of these metal materials can be used.
  • an insulating layer that covers the insulating layer 593 and the wiring 594 may be provided to protect the touch sensor 595 .
  • a connection layer 599 electrically connects the wiring 598 to the FPC 509 ( 2 ).
  • as the connection layer 599 , any of various anisotropic conductive films (ACF), anisotropic conductive pastes (ACP), or the like can be used.
  • the resin layer 597 has a light-transmitting property.
  • a thermosetting resin or an ultraviolet curable resin can be used; specifically, a resin such as an acrylic resin, a urethane resin, an epoxy resin, or a resin having a siloxane bond can be used.
  • the display portion 501 includes a plurality of pixels arranged in a matrix. Each of the pixels includes a display element and a pixel circuit for driving the display element.
  • organic electroluminescent elements that emit light of different colors may be included in sub-pixels so that the light of different colors can be emitted from the respective sub-pixels.
  • any of various display elements such as display elements (electronic ink) that perform display by an electrophoretic method, an electronic liquid powder method, an electrowetting method, or the like; MEMS shutter display elements; optical interference type MEMS display elements; and liquid crystal elements can be used.
  • this embodiment can be used in a transmissive liquid crystal display, a transflective liquid crystal display, a reflective liquid crystal display, a direct-view liquid crystal display, or the like.
  • some or all of the pixel electrodes function as reflective electrodes.
  • some or all of pixel electrodes are formed to contain aluminum, silver, or the like.
  • a memory circuit such as an SRAM can be provided under the reflective electrodes, leading to lower power consumption.
  • a structure suitable for employed display elements can be selected from a variety of structures of pixel circuits.
  • an active matrix method in which an active element is included in a pixel or a passive matrix method in which an active element is not included in a pixel can be used.
  • as an active element, not only a transistor but also various other active elements (non-linear elements) can be used.
  • a metal-insulator-metal (MIM) element, a thin film diode (TFD), or the like can also be used. Since such elements require fewer manufacturing steps, manufacturing cost can be reduced or yield can be improved. In addition, the aperture ratio can be improved, so that power consumption can be reduced or higher luminance can be achieved.
  • the passive matrix method in which an active element (a non-linear element) is not used can also be used. Since an active element (a non-linear element) is not used, the number of manufacturing steps is small, so that manufacturing cost can be reduced or yield can be improved. Alternatively, since an active element (a non-linear element) is not used, the aperture ratio can be improved, so that power consumption can be reduced or higher luminance can be achieved, for example.
  • Flexible materials can be favorably used for the substrate 510 and the substrate 570 .
  • Materials with which unintended passage of impurities is inhibited can be favorably used for the substrate 510 and the substrate 570 .
  • materials with a vapor permeability lower than or equal to 10⁻⁵ g/(m²·day), preferably lower than or equal to 10⁻⁶ g/(m²·day), can be favorably used.
  • the substrate 510 can be favorably formed using a material whose coefficient of linear expansion is substantially equal to that of the substrate 570 .
  • the coefficients of linear expansion of the materials are preferably lower than or equal to 1 × 10⁻³/K, further preferably lower than or equal to 5 × 10⁻⁵/K, and still further preferably lower than or equal to 1 × 10⁻⁵/K.
  • the substrate 510 is a stacked body in which a substrate 510 b having flexibility, a barrier film 510 a that prevents unintentional diffusion of impurities to the light-emitting elements, and a resin layer 510 c that attaches the barrier film 510 a to the substrate 510 b are stacked.
  • materials that include polyester, polyolefin, polyamide (e.g., nylon, aramid), polyimide, polycarbonate, or a resin having an acrylic bond, a urethane bond, an epoxy bond, or a siloxane bond can be used for the resin layer 510 c.
  • the substrate 570 is a stacked body including a substrate 570 b having flexibility, a barrier film 570 a that prevents unintentional diffusion of impurities to the light-emitting elements, and a resin layer 570 c that attaches the barrier film 570 a to the substrate 570 b.
  • a sealant 560 attaches the substrate 570 to the substrate 510 .
  • the sealant 560 has a refractive index higher than that of air. In the case where light is extracted to the sealant 560 side, the sealant 560 serves as an optical adhesive layer.
  • the pixel circuits and the light-emitting elements (e.g., a first light-emitting element 550 R) are provided between the substrate 510 and the substrate 570 .
  • a pixel includes a sub-pixel 502 R, and the sub-pixel 502 R includes a light-emitting module 580 R.
  • the sub-pixel 502 R includes the first light-emitting element 550 R and the pixel circuit that can supply electric power to the first light-emitting element 550 R and includes a transistor 502 t .
  • the light-emitting module 580 R includes the first light-emitting element 550 R and an optical element (e.g., a first coloring layer 567 R).
  • the first light-emitting element 550 R includes a lower electrode, an upper electrode, and a layer containing a light-emitting organic compound between the lower electrode and the upper electrode.
  • the light-emitting module 580 R includes the first coloring layer 567 R on the light extraction side.
  • the coloring layer transmits light of a particular wavelength and is, for example, a layer that selectively transmits light of red, green, or blue color. Note that in another sub-pixel, a region that transmits light emitted from the light-emitting element as it is may be provided as well.
  • in the case where the sealant 560 is provided on the light extraction side, the sealant 560 is in contact with the first light-emitting element 550 R and the first coloring layer 567 R.
  • the first coloring layer 567 R is positioned in a region overlapping with the first light-emitting element 550 R. Accordingly, part of light emitted from the first light-emitting element 550 R passes through the first coloring layer 567 R and is emitted to the outside of the light-emitting module 580 R as indicated by an arrow in FIG. 11A .
  • the display portion 501 includes a light-blocking layer 567 BM on the light extraction side.
  • the light-blocking layer 567 BM is provided so as to surround the coloring layer (e.g., the first coloring layer 567 R).
  • the display portion 501 is provided with an anti-reflective layer 567 p positioned in a region overlapping with pixels.
  • as the anti-reflective layer 567 p, a circular polarizing plate can be used, for example.
  • the display portion 501 includes an insulating film 521 .
  • the insulating film 521 covers the transistor 502 t .
  • the insulating film 521 can be used as a layer for planarizing unevenness caused by the pixel circuits.
  • a stacked film including a layer that can prevent diffusion of impurities can be used as the insulating film 521 . This can prevent the reliability of the transistor 502 t or the like from being lowered by unintentional diffusion of impurities.
  • the display portion 501 includes the light-emitting elements (e.g., the first light-emitting element 550 R) over the insulating film 521 .
  • the display portion 501 includes, over the insulating film 521 , a partition wall 528 that overlaps with an end portion of the lower electrode.
  • a spacer that controls the distance between the substrate 510 and the substrate 570 is provided over the partition wall 528 .
  • a scan line driver circuit 503 g ( 1 ) includes a transistor 503 t and a capacitor 503 c . Note that the driver circuit can be formed in the same process and over the same substrate as those of the pixel circuits.
  • the display portion 501 includes the wiring 511 through which a signal is supplied.
  • the wiring 511 is provided with the terminal 519 .
  • the FPC 509 ( 1 ) through which a signal such as an image signal or a synchronization signal is supplied is electrically connected to the terminal 519 .
  • a printed wiring board (PWB) may be attached to the FPC 509 ( 1 ).
  • the display portion 501 includes wirings such as scan lines, signal lines, and power supply lines. Any of various conductive films can be used as the wirings.
  • a metal element selected from aluminum, chromium, copper, tantalum, titanium, molybdenum, tungsten, nickel, yttrium, zirconium, silver, and manganese; an alloy including any of the above-described metal elements; an alloy including any of the above-described metal elements in combination; or the like can be used.
  • one or more elements selected from aluminum, chromium, copper, tantalum, titanium, molybdenum, and tungsten are preferably included.
  • an alloy of copper and manganese is suitably used in microfabrication with the use of a wet etching method.
  • a two-layer structure in which a titanium film is stacked over an aluminum film, a two-layer structure in which a titanium film is stacked over a titanium nitride film, a two-layer structure in which a tungsten film is stacked over a titanium nitride film, a two-layer structure in which a tungsten film is stacked over a tantalum nitride film or a tungsten nitride film, a three-layer structure in which a titanium film, an aluminum film, and a titanium film are stacked in this order, or the like can be used.
  • a stacked structure in which an alloy film or a nitride film containing one or more elements selected from titanium, tantalum, tungsten, molybdenum, chromium, neodymium, and scandium is stacked over an aluminum film can be used.
  • a light-transmitting conductive material including indium oxide, tin oxide, or zinc oxide may be used.
  • any of various kinds of transistors can be used in the display portion 501 .
  • A structure in which bottom-gate transistors are used in the display portion 501 is illustrated in FIGS. 11A and 11B .
  • a semiconductor layer containing an oxide semiconductor, amorphous silicon, or the like can be used in the transistor 502 t and the transistor 503 t shown in FIG. 11A .
  • a film represented by an In-M-Zn oxide that contains at least indium (In), zinc (Zn), and M (M is a metal such as Al, Ga, Ge, Y, Zr, Sn, La, Ce, or Hf) is preferably included.
  • both In and Zn are preferably contained.
  • as M, gallium (Ga), tin (Sn), hafnium (Hf), aluminum (Al), zirconium (Zr), or the like can be given.
  • alternatively, as M, a lanthanoid such as lanthanum (La), cerium (Ce), praseodymium (Pr), neodymium (Nd), samarium (Sm), europium (Eu), gadolinium (Gd), terbium (Tb), dysprosium (Dy), holmium (Ho), erbium (Er), thulium (Tm), ytterbium (Yb), or lutetium (Lu) can be given.
  • any of the following can be used, for example: an In—Ga—Zn-based oxide, an In—Al—Zn-based oxide, an In—Sn—Zn-based oxide, an In—Hf—Zn-based oxide, an In—La—Zn-based oxide, an In—Ce—Zn-based oxide, an In—Pr—Zn-based oxide, an In—Nd—Zn-based oxide, an In—Sm—Zn-based oxide, an In—Eu—Zn-based oxide, an In—Gd—Zn-based oxide, an In—Tb—Zn-based oxide, an In—Dy—Zn-based oxide, an In—Ho—Zn-based oxide, an In—Er—Zn-based oxide, an In—Tm—Zn-based oxide, an In—Yb—Zn-based oxide, an In—Lu—Zn-based oxide, an In—Sn—Ga—Zn-based oxide, an In—Hf
  • an “In—Ga—Zn-based oxide” means an oxide containing In, Ga, and Zn as its main components and there is no limitation on the ratio of In:Ga:Zn.
  • the In—Ga—Zn-based oxide may contain another metal element in addition to In, Ga, and Zn.
  • a semiconductor layer containing polycrystalline silicon that is obtained by crystallization process such as laser annealing can be used in the transistor 502 t and the transistor 503 t shown in FIG. 11B .
  • A structure in which top-gate transistors are used in the display portion 501 is shown in FIG. 11C .
  • a semiconductor layer including polycrystalline silicon, a single crystal silicon film that is transferred from a single crystal silicon substrate, or the like can be used in the transistor 502 t and the transistor 503 t shown in FIG. 11C .
  • a structure of a bendable or foldable touch panel that can be used in a data processing device of one embodiment of the present invention will be described with reference to FIGS. 12A to 12C.
  • FIGS. 12A to 12C are cross-sectional views illustrating a touch panel 500 B.
  • the touch panel 500 B described in this embodiment is different from the touch panel 500 described in Embodiment 5 in that the display portion 501 displays supplied image data on the side where the transistors are provided and that the touch sensor is provided on the substrate 510 side of the display portion.
  • Different parts are described in detail below, and the above description is referred to for the other similar parts.
  • the display portion 501 includes a plurality of pixels arranged in a matrix. Each of the pixels includes a display element and a pixel circuit for driving the display element.
  • a pixel includes a sub-pixel 502 R, and the sub-pixel 502 R includes a light-emitting module 580 R.
  • the sub-pixel 502 R includes a first light-emitting element 550 R and a pixel circuit that can supply electric power to the first light-emitting element 550 R and includes a transistor 502 t.
  • the light-emitting module 580 R includes the first light-emitting element 550 R and an optical element (e.g., a first coloring layer 567 R).
  • the first light-emitting element 550 R includes a lower electrode, an upper electrode, and a layer containing a light-emitting organic compound between the lower electrode and the upper electrode.
  • the light-emitting module 580 R includes the first coloring layer 567 R on the light extraction side.
  • the coloring layer transmits light of a particular wavelength and is, for example, a layer that selectively transmits light of red, green, or blue color. Note that in another sub-pixel, a region that transmits light emitted from the light-emitting element as it is may be provided as well.
  • the first coloring layer 567 R is positioned in a region overlapping with the first light-emitting element 550 R.
  • the first light-emitting element 550 R shown in FIG. 12A emits light to the side where the transistor 502 t is provided. Accordingly, part of light emitted from the first light-emitting element 550 R passes through the first coloring layer 567 R and is emitted to the outside of the light-emitting module 580 R as indicated by an arrow in FIG. 12A .
  • the display portion 501 includes a light-blocking layer 567 BM on the light extraction side.
  • the light-blocking layer 567 BM is provided so as to surround the coloring layer (e.g., the first coloring layer 567 R).
  • the display portion 501 includes an insulating film 521 .
  • the insulating film 521 covers the transistor 502 t .
  • the insulating film 521 can be used as a layer for planarizing unevenness caused by the pixel circuits.
  • a stacked film including a layer that can prevent diffusion of impurities can be used as the insulating film 521 . This can prevent the reliability of the transistor 502 t or the like from being lowered by unintentional diffusion of impurities from the first coloring layer 567 R, for example.
  • a touch sensor 595 is provided on the substrate 510 side of the display portion 501 (see FIG. 12A ).
  • a resin layer 597 is provided between the substrate 510 and the substrate 590 and attaches the touch sensor 595 to the display portion 501 .
  • any of various kinds of transistors can be used in the display portion 501 .
  • a structure in which bottom-gate transistors are used in the display portion 501 is illustrated in FIGS. 12A and 12B .
  • a semiconductor layer containing an oxide semiconductor, amorphous silicon, or the like can be used in the transistor 502 t and the transistor 503 t shown in FIG. 12A .
  • a channel formation region may be sandwiched between upper and lower gate electrodes, in which case variations in characteristics of the transistors can be prevented and thus the reliability can be increased.
  • a semiconductor layer containing polycrystalline silicon or the like can be used in the transistor 502 t and the transistor 503 t shown in FIG. 12B .
  • a structure in which top-gate transistors are used in the display portion 501 is shown in FIG. 12C .
  • a semiconductor layer including polycrystalline silicon, a transferred single crystal silicon film, or the like can be used in the transistor 502 t and the transistor 503 t shown in FIG. 12C .
  • examples of the bendable or foldable device include a display device, a light-emitting device, an input device, and the like.
  • examples of the input device include a touch sensor, a touch panel, and the like.
  • Examples of the light-emitting device include an organic EL panel, a lighting device, and the like.
  • Examples of the display device include a light-emitting device, an organic EL panel, a liquid crystal display device, and the like.
  • a function of the input device such as a touch sensor may be provided in a display device or a light-emitting device.
  • a counter substrate (e.g., a substrate not provided with a transistor)
  • an element substrate (e.g., a substrate provided with a transistor)
  • the counter substrate and the element substrate of the display device or the light-emitting device may be provided with touch sensors.
  • a separation layer 703 is formed over a formation substrate 701 , and a layer 705 to be separated (hereinafter referred to as a layer 705 ) is formed over the separation layer 703 ( FIG. 13A ).
  • a separation layer 723 is formed over a formation substrate 721 , and a layer 725 to be separated (hereinafter referred to as a layer 725 ) is formed over the separation layer 723 ( FIG. 13B ).
  • a tungsten oxide film can be formed between the layer to be separated and the tungsten film by an oxidation method such as performing plasma treatment on the tungsten film with a gas containing oxygen such as N2O, annealing the tungsten film in a gas atmosphere containing oxygen, or forming the tungsten film by sputtering or the like in a gas atmosphere containing oxygen.
  • examples of the tungsten oxide film include tungsten oxide with a composition in which the ratio of oxygen to tungsten is lower than 3.
  • when the tungsten oxide is WnO(3n-1) or WnO(3n-2), each of which belongs to a homologous series, shear is easily caused by heating because a crystallographic shear plane exists in the crystal.
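As a quick arithmetic aside (not part of the patent text), the oxygen-to-tungsten ratio of the homologous compositions WnO(3n-1) and WnO(3n-2) always stays below 3 and approaches 3 as n grows, consistent with the "ratio of oxygen to tungsten lower than 3" condition above; the helper function name here is mine:

```python
# O/W ratio for the homologous tungsten oxide series WnO(3n-1) and WnO(3n-2).
# Both compositions are oxygen-deficient relative to stoichiometric WO3 (O/W = 3).
def ow_ratio(n, deficit):
    """O/W ratio of WnO(3n - deficit) for member n of the series."""
    return (3 * n - deficit) / n

for n in (5, 10, 20, 50):
    r1 = ow_ratio(n, 1)  # WnO(3n-1)
    r2 = ow_ratio(n, 2)  # WnO(3n-2)
    assert r2 < r1 < 3   # always below 3, approaching 3 as n grows
    print(f"n={n}: O/W = {r1:.3f} and {r2:.3f}")
```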
  • the tungsten oxide film can be directly formed without forming the tungsten film.
  • the tungsten oxide film may be formed as the separation layer by performing plasma treatment on a sufficiently thin tungsten film with a gas containing oxygen, by annealing a sufficiently thin tungsten film in a gas atmosphere containing oxygen, or by forming the tungsten oxide film directly by sputtering or the like in a gas atmosphere containing oxygen.
  • a step of removing the remaining tungsten oxide film is preferably performed after the step of separating the separation layer and the layer to be separated. Note that the above method for separation from the substrate does not necessarily require N2O plasma treatment, in which case the step of removing the tungsten oxide film can also be omitted and the device can be fabricated more simply.
  • a tungsten film with a thickness of greater than or equal to 0.1 nm and less than 200 nm is formed over the substrate.
  • a film containing molybdenum, titanium, vanadium, tantalum, silicon, aluminum, or an alloy thereof can be used, besides a tungsten film. Furthermore, it is also possible to use a stack of such a film and its oxide film.
  • the separation layer is not limited to an inorganic film, and an organic film such as polyimide may be used.
  • a process temperature needs to be lower than or equal to 350° C. when low-temperature polysilicon is used as an active layer of a transistor.
  • dehydrogenation baking for silicon crystallization, hydrogenation for termination of defects in silicon, or activation of a doped region cannot be performed sufficiently, so that the performance of the transistor is limited.
  • when an inorganic film is used for the separation layer, the process temperature is not limited to 350° C., and excellent transistor characteristics can be obtained.
  • the organic resin or a functional element is damaged in some cases by laser irradiation at the time of crystallization; thus, it is preferable to use an inorganic film for the separation layer because such a problem is not caused.
  • the organic resin shrinks under the laser irradiation used to separate the resin, and contact failure occurs at the contact portion of a terminal of an FPC or the like; this makes it difficult to separate and transfer functional elements with many terminals, such as those of a high-definition display, with high yield.
  • with an inorganic film used for the separation layer, there is no such limitation, and functional elements with many terminals, such as those of a high-definition display, can be separated and transferred with high yield.
  • an insulating layer and a transistor can be formed over a formation substrate at a temperature of lower than or equal to 600° C.
  • high-temperature polysilicon can be used for a semiconductor layer.
  • a semiconductor device with a high operation speed, a high gas barrier property, and high reliability can be mass-produced.
  • insulating layers having an excellent gas barrier property formed at a temperature of lower than or equal to 600° C. can be provided above and below an organic EL element. Accordingly, entry of impurities such as moisture into the organic EL element or the semiconductor layer can be suppressed, whereby an extraordinarily reliable light-emitting device can be obtained as compared with the case of using the organic resin or the like as the separation layer.
  • the insulating layer and the transistor can be formed over the formation substrate at 500° C. or lower.
  • low-temperature polysilicon or an oxide semiconductor can be used for the semiconductor layer, and mass production is possible with use of a conventional production line for low-temperature polysilicon.
  • insulating layers having an excellent gas barrier property formed at 500° C. or lower can be provided above and below the organic EL element. Accordingly, the entry of impurities such as moisture into the organic EL element or the semiconductor layer is suppressed, whereby a highly reliable light-emitting device can be obtained as compared with the case of using the organic resin as the separation layer.
  • the insulating layer and the transistor can be formed over the formation substrate at 400° C. or lower.
  • amorphous silicon or an oxide semiconductor can be used for the semiconductor layer, and mass production is possible with use of a conventional production line for amorphous silicon.
  • insulating layers having an excellent gas barrier property formed at 400° C. or lower can be provided above and below the organic EL element. Accordingly, the entry of impurities such as moisture into the organic EL element or the semiconductor layer can be suppressed, whereby a reliable light-emitting device can be obtained as compared with the case of using the organic resin or the like as the separation layer.
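The three process-temperature tiers described above can be summarized as a small lookup. This is an illustrative sketch only; the tier boundaries and material names come from the passage, while the `TIERS` table and `usable_layers` function are assumptions of mine:

```python
# Illustrative mapping (not from the patent) from the maximum process
# temperature available on the formation substrate to the semiconductor
# layers the passage says become usable at that temperature.
TIERS = [
    (600, ["high-temperature polysilicon"]),
    (500, ["low-temperature polysilicon", "oxide semiconductor"]),
    (400, ["amorphous silicon", "oxide semiconductor"]),
]

def usable_layers(max_temp_c):
    """Collect layers from every tier whose temperature fits within max_temp_c."""
    usable = []
    for tier_temp, layers in TIERS:
        if max_temp_c >= tier_temp:
            for layer in layers:
                if layer not in usable:
                    usable.append(layer)
    return usable

print(usable_layers(500))
```

For example, `usable_layers(500)` yields low-temperature polysilicon, an oxide semiconductor, and amorphous silicon, matching the passage's statement that a conventional low-temperature polysilicon line suffices when the process stays at 500° C. or lower.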
  • the formation substrate 701 and the formation substrate 721 are attached to each other by using a bonding layer 707 and a frame-like bonding layer 711 so that the surfaces over which the layers to be separated are formed face each other, and then, the bonding layer 707 and the frame-like bonding layer 711 are cured ( FIG. 13C ).
  • the frame-like bonding layer 711 and the bonding layer 707 in a region surrounded by the frame-like bonding layer 711 are provided over the layer 725 and after that, the formation substrate 701 and the formation substrate 721 face each other and are attached to each other.
  • the formation substrate 701 and the formation substrate 721 are preferably attached to each other in a reduced-pressure atmosphere.
  • FIG. 13C illustrates the case where the separation layer 703 and the separation layer 723 are different in size
  • separation layers having the same size as illustrated in FIG. 13D may be used.
  • the bonding layer 707 is provided so as to overlap with the separation layer 703 , the layer 705 , the layer 725 , and the separation layer 723 . The edges of the bonding layer 707 are preferably positioned inside the edges of at least one of the separation layer 703 and the separation layer 723 (preferably the separation layer that is to be separated from its substrate first). Accordingly, strong direct adhesion between the formation substrate 701 and the formation substrate 721 can be avoided, and a decrease in the yield of the subsequent separation process can be suppressed.
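The edge-positioning condition above is essentially a rectangle-containment check: the bonding layer's footprint should lie strictly inside the footprint of the separation layer that is to be separated first. A hedged sketch with made-up coordinates (the patent gives no dimensions):

```python
# Hedged sketch: the bonding layer footprint should lie strictly inside the
# footprint of the separation layer that is to be separated first.
# Rectangles are (x0, y0, x1, y1); all coordinates here are hypothetical.
def inside(inner, outer):
    """True if rectangle `inner` lies strictly inside rectangle `outer`."""
    (ix0, iy0, ix1, iy1) = inner
    (ox0, oy0, ox1, oy1) = outer
    return ox0 < ix0 and oy0 < iy0 and ix1 < ox1 and iy1 < oy1

bonding_layer_707 = (1.0, 1.0, 9.0, 9.0)       # hypothetical footprint
separation_layer_703 = (0.0, 0.0, 10.0, 10.0)  # hypothetical footprint
assert inside(bonding_layer_707, separation_layer_703)
```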
  • a first trigger 741 for separation from the substrate is formed by laser irradiation ( FIGS. 14A and 14B ).
  • Either the formation substrate 701 or the formation substrate 721 may be separated first.
  • a substrate over which a larger separation layer is formed may be separated first or a substrate over which a smaller separation layer is formed may be separated first.
  • an element such as a semiconductor element, a light-emitting element, or a display element is formed over only one of the substrates
  • the substrate on the side where the element is formed may be separated first or the other substrate may be separated first.
  • an example in which the formation substrate 701 is separated first is described.
  • a region where the bonding layer 707 in a cured state or the frame-like bonding layer 711 in a cured state, the layer 705 , and the separation layer 703 overlap with one another is irradiated with laser light.
  • the bonding layer 707 is in a cured state and the frame-like bonding layer 711 is not in a cured state, and the bonding layer 707 in a cured state is irradiated with laser light (see an arrow P 3 in FIG. 14A ).
  • the first trigger 741 for separation from the substrate can be formed (see a region surrounded by a dashed line in FIG. 14B ).
  • the separation layer 703 , the bonding layer 707 , or another layer included in the layer 705 may be partly removed.
  • laser light irradiation is preferably performed from the side of the substrate provided with the separation layer that is to be separated first.
  • the formation substrate 701 and the separation layer 703 can be selectively separated by cracking only the layer 705 of the layers 705 and 725 (see the region surrounded by the dotted line in FIG. 14B ).
  • the method for forming the first trigger 741 for separation from the substrate is not limited to laser light irradiation, and the first trigger 741 may be formed by a sharp knife such as a cutter.
  • the layer 705 and the formation substrate 701 are separated from each other from the first trigger 741 for separation from the substrate ( FIGS. 14C and 14D ). Consequently, the layer 705 can be transferred from the formation substrate 701 to the formation substrate 721 .
  • the layer 705 that is separated from the formation substrate 701 in the step in FIG. 14D is attached to a substrate 731 with a bonding layer 733 , and the bonding layer 733 is cured ( FIG. 15A ).
  • a second trigger 743 for separation from the substrate is formed by a sharp knife such as a cutter ( FIGS. 15B and 15C ).
  • the method for forming the second trigger 743 for separation from the substrate is not limited to a sharp knife such as a cutter, and the second trigger 743 may be formed by laser light irradiation or the like.
  • in the case where the substrate 731 , on the side where the separation layer 723 is not provided, can be cut with a knife or the like, a cut may be made through the substrate 731 , the bonding layer 733 , and the layer 725 (see arrows P 5 in FIG. 15B ). Consequently, part of the layer 725 can be removed; thus, the second trigger 743 for separation from the substrate can be formed (see a region surrounded by a dashed line in FIG. 15C ).
  • a cut is preferably made in a frame shape in a region where the bonding layer 733 in a cured state and the separation layer 723 overlap with each other to form the second trigger 743 for separation from the substrate in the form of a solid line. This can improve the yield of the process of separation from the substrate.
  • the layer 725 and the formation substrate 721 are separated from each other from the second trigger 743 for separation from the substrate ( FIG. 15D ), so that the layer 725 can be transferred from the formation substrate 721 to the substrate 731 .
  • when the tungsten oxide film, which is tightly anchored by N2O plasma treatment or the like, is formed on an inorganic film such as a tungsten film, adhesion can be relatively high at the time of deposition.
  • the formation substrate 721 and the layer 725 may be separated from each other by filling the interface between the separation layer 723 and the layer 725 with a liquid such as water.
  • a portion between the separation layer 723 and the layer 725 absorbs the liquid through capillary action. Accordingly, an adverse effect on a functional element such as an FET included in the layer 725 due to static electricity caused at the time of separation from the substrate (e.g., damage to a semiconductor element by static electricity) can be suppressed.
  • a liquid may be sprayed in an atomized form or in a vaporized form.
  • liquids include pure water, an organic solvent, a neutral, alkali, or acid aqueous solution, and an aqueous solution in which a salt is dissolved.
  • the temperature of the liquid and the substrate at the time of dynamic separation is set in the range from room temperature to 120° C., and preferably set to 60° C. to 90° C.
  • separation of the formation substrate is performed in such a manner that the second trigger 743 for separation from the substrate is formed with a sharp knife or the like so that the separation layer and the layer to be separated are brought into a separable state. This can improve the yield of the process of separation from the substrate.
  • bonding of a substrate with which a device is to be formed can be performed after the following procedure: a pair of formation substrates each provided with a layer to be separated are attached to each other and the formation substrates are individually separated. Therefore, formation substrates having low flexibility can be attached to each other when the layers to be separated are attached to each other, whereby alignment accuracy at the time of attachment can be improved compared with the case where flexible substrates are attached to each other.
  • a layer to be separated over an oxide layer includes a first layer and a second layer from which hydrogen is released by heat treatment.
  • WO3 in the oxide layer can be reduced by hydrogen released by heat treatment from the layer to be separated, so that the oxide layer can have a high WO2 content. Consequently, separation from a substrate can be facilitated.
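A plausible reaction for the reduction described above is WO3 + H2 → WO2 + H2O; this specific equation is my assumption, not stated in the patent. A tiny atom-balance check:

```python
# Atom-balance check for the assumed reduction WO3 + H2 -> WO2 + H2O.
# The equation itself is an assumption consistent with the passage, which
# says hydrogen released from the layer to be separated reduces WO3 to WO2.
reactant_atoms = {"W": 1, "O": 3, "H": 2}      # WO3 + H2
product_atoms = {"W": 1, "O": 2 + 1, "H": 2}   # WO2 + H2O
assert reactant_atoms == product_atoms
print("balanced reduction of WO3 to WO2")
```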

US14/553,288 2013-12-02 2014-11-25 Data processing device Abandoned US20150154730A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/450,049 US11475532B2 (en) 2013-12-02 2019-06-24 Foldable display device comprising a plurality of regions
US17/960,984 US11983793B2 (en) 2013-12-02 2022-10-06 Foldable display device including a plurality of regions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013249677 2013-12-02
JP2013-249677 2013-12-02

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/450,049 Continuation US11475532B2 (en) 2013-12-02 2019-06-24 Foldable display device comprising a plurality of regions

Publications (1)

Publication Number Publication Date
US20150154730A1 (en) 2015-06-04

Family

ID=53265734

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/553,288 Abandoned US20150154730A1 (en) 2013-12-02 2014-11-25 Data processing device
US16/450,049 Active US11475532B2 (en) 2013-12-02 2019-06-24 Foldable display device comprising a plurality of regions
US17/960,984 Active US11983793B2 (en) 2013-12-02 2022-10-06 Foldable display device including a plurality of regions

Country Status (3)

Country Link
US (3) US20150154730A1 (ja)
JP (5) JP2015129917A (ja)
KR (6) KR20150063937A (ja)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI730975B (zh) * 2015-08-19 2021-06-21 日商半導體能源研究所股份有限公司 資訊處理裝置
JPWO2019208125A1 (ja) * 2018-04-25 2021-04-22 富士フイルム株式会社 タッチセンサおよびタッチパネル
JP2021119427A (ja) * 2018-04-25 2021-08-12 富士フイルム株式会社 タッチセンサおよびタッチパネル
WO2021005798A1 (ja) * 2019-07-11 2021-01-14 堺ディスプレイプロダクト株式会社 フレキシブル発光デバイスの製造方法及び支持基板
JP7354647B2 (ja) * 2019-07-24 2023-10-03 セイコーエプソン株式会社 端末装置、表示制御プログラムおよび表示制御方法

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100117975A1 (en) * 2008-11-10 2010-05-13 Lg Electronics Inc. Mobile terminal using flexible display and method of controlling the mobile terminal
US20110007042A1 (en) * 2009-07-07 2011-01-13 Semiconductor Energy Laboratory Co., Ltd. Display device
US20120050135A1 (en) * 2010-08-27 2012-03-01 Advanced Micro Devices, Inc. Method and apparatus for configuring a plurality of displays into a single large surface display
US20120212430A1 (en) * 2011-02-23 2012-08-23 Lg Electronics Inc. Mobile terminal
US20120236484A1 (en) * 2009-12-01 2012-09-20 Hideyuki Miyake Foldable mobile terminal
US20120256896A1 (en) * 2005-08-12 2012-10-11 Semiconductor Energy Laboratory Co., Ltd. Display module, and cellular phone and electronic device provided with display module
US20130002583A1 (en) * 2011-06-30 2013-01-03 Jin Dong-Un Flexible display panel and display apparatus including the flexible display panel
US20130002133A1 (en) * 2011-06-30 2013-01-03 Jin Dong-Un Flexible display panel and display apparatus including the flexible display panel
US20130076649A1 (en) * 2011-09-27 2013-03-28 Scott A. Myers Electronic Devices With Sidewall Displays
US20130080762A1 (en) * 2010-09-17 2013-03-28 Apple Inc. Sensor fusion
US20130135182A1 (en) * 2011-11-28 2013-05-30 Samsung Electronics Co., Ltd. Apparatus and method for displaying an application in a wireless terminal
US20130222354A1 (en) * 2010-09-17 2013-08-29 Nokia Corporation Adjustment of Display Brightness
US20130300697A1 (en) * 2012-05-14 2013-11-14 Samsung Electronics Co. Ltd. Method and apparatus for operating functions of portable terminal having bended display
US20140132481A1 (en) * 2012-11-09 2014-05-15 Microsoft Corporation Mobile devices with plural displays
US20140183342A1 (en) * 2013-01-02 2014-07-03 Apple Inc. Electronic Devices With Light Sensors And Displays
US20140333542A1 (en) * 2013-05-10 2014-11-13 Research In Motion Limited Carrying case used with a portable electronic device

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5259374A (en) 1975-11-10 1977-05-16 Hitachi Plant Eng & Constr Co Ltd Dust-collector electrode for wet-type electric dust collector
JPS5319825U (ja) 1976-07-30 1978-02-20
JPS597343A (ja) * 1982-07-05 1984-01-14 Seiko Epson Corp 液晶表示装置
US6268857B1 (en) 1997-08-29 2001-07-31 Xerox Corporation Computer user interface using a physical manipulatory grammar
US6297838B1 (en) 1997-08-29 2001-10-02 Xerox Corporation Spinning as a morpheme for a physical manipulatory grammar
US6297805B1 (en) 1997-08-29 2001-10-02 Xerox Corporation Multiple interacting computers interfaceable through a physical manipulatory grammar
JP4149574B2 (ja) * 1997-08-29 2008-09-10 ゼロックス コーポレイション ユーザインターフェースサポートデバイス、及び情報入力方法
US6340957B1 (en) 1997-08-29 2002-01-22 Xerox Corporation Dynamically relocatable tileable displays
US6243074B1 (en) 1997-08-29 2001-06-05 Xerox Corporation Handedness detection for a physical manipulatory grammar
US6243075B1 (en) 1997-08-29 2001-06-05 Xerox Corporation Graspable device manipulation for controlling a computer display
TW496043B (en) * 2000-01-27 2002-07-21 Kyocera Corp Portable wireless equipment
JP2001255513A (ja) * 2000-03-13 2001-09-21 Minolta Co Ltd 液晶表示装置
JP2001282145A (ja) * 2000-03-30 2001-10-12 Sanyo Electric Co Ltd 表示装置
US6577496B1 (en) 2001-01-18 2003-06-10 Palm, Inc. Non-rigid mounting of a foldable display
JP2002278466A (ja) 2001-03-15 2002-09-27 Minolta Co Ltd 表示装置
JP4027740B2 (ja) 2001-07-16 2007-12-26 株式会社半導体エネルギー研究所 半導体装置の作製方法
US8415208B2 (en) 2001-07-16 2013-04-09 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device and peeling off method and method of manufacturing semiconductor device
WO2003019349A1 (en) 2001-08-25 2003-03-06 Si Han Kim Portable multi-display device
US6953735B2 (en) 2001-12-28 2005-10-11 Semiconductor Energy Laboratory Co., Ltd. Method for fabricating a semiconductor device by transferring a layer to a support with curvature
US20030201974A1 (en) 2002-04-26 2003-10-30 Yin Memphis Zhihong Apparatus display
CN1685388A (zh) * 2002-09-25 2005-10-19 西铁城时计株式会社 显示设备
JP2005012528A (ja) * 2003-06-19 2005-01-13 Nec Saitama Ltd 画像表示部切替機能つき折畳型携帯電話機
JP2005114759A (ja) 2003-10-02 2005-04-28 Canon Inc ディスプレイ装置、携帯電話機、及び電子機器
JP4131237B2 (ja) 2003-12-24 2008-08-13 カシオ計算機株式会社 電子機器及びプログラム
JP2006005712A (ja) 2004-06-18 2006-01-05 Konica Minolta Holdings Inc 携帯端末
JP2006072115A (ja) * 2004-09-03 2006-03-16 Fuji Photo Film Co Ltd 画像表示装置
JP2007326259A (ja) 2006-06-07 2007-12-20 Yoshida Industry Co Ltd ディスプレイの製造方法及びディスプレイ
CA2653169A1 (en) * 2007-03-30 2008-10-09 Hitachi Chemical Co., Ltd. Optical connecting member and display apparatus
TWI424194B (zh) * 2007-07-20 2014-01-21 Ind Tech Res Inst 電子元件、顯示器及其製作方法
US8610155B2 (en) 2008-11-18 2013-12-17 Semiconductor Energy Laboratory Co., Ltd. Light-emitting device, method for manufacturing the same, and cellular phone
CN103325312A (zh) 2008-11-24 2013-09-25 柳相圭 用于可折叠便携终端的软性显示装置
JP5302027B2 (ja) * 2009-01-30 2013-10-02 ソフトバンクモバイル株式会社 情報通信端末
JP2010204497A (ja) * 2009-03-04 2010-09-16 Panasonic Corp 電子装置及び焼き付き防止方法
JP2010256660A (ja) * 2009-04-27 2010-11-11 Hitachi Displays Ltd 表示装置
CN104597651B (zh) 2009-05-02 2017-12-05 株式会社半导体能源研究所 显示设备
KR101155907B1 (ko) * 2009-06-04 2012-06-20 삼성모바일디스플레이주식회사 유기 발광 표시 장치 및 그 제조 방법
CN101924816B (zh) 2009-06-12 2013-03-20 清华大学 柔性手机
US20100317407A1 (en) * 2009-06-16 2010-12-16 Bran Ferren Secondary display device
CN101996535A (zh) 2009-08-25 2011-03-30 精工爱普生株式会社 电光学装置和电子设备
JP2012057662A (ja) * 2010-09-06 2012-03-22 Sony Corp 電子機器
KR101780419B1 (ko) 2011-02-15 2017-10-11 삼성디스플레이 주식회사 플렉시블 표시장치
WO2012115016A1 (en) 2011-02-25 2012-08-30 Semiconductor Energy Laboratory Co., Ltd. Light-emitting device and electronic device using light-emitting device
NL2006862C2 (en) 2011-05-30 2012-12-03 Polymer Vision Bv Display device with flexible display.
US8787016B2 (en) * 2011-07-06 2014-07-22 Apple Inc. Flexible display devices
US8929085B2 (en) * 2011-09-30 2015-01-06 Apple Inc. Flexible electronic devices
KR101515629B1 (ko) 2012-01-07 2015-04-27 삼성전자주식회사 플렉서블 표시부를 갖는 휴대단말의 이벤트 제공 방법 및 장치
KR101935553B1 (ko) 2012-02-01 2019-01-07 삼성디스플레이 주식회사 플렉시블 디스플레이 장치 및 그 제조방법
JP2013164775A (ja) * 2012-02-13 2013-08-22 Nec Casio Mobile Communications Ltd 携帯情報端末、表示制御装置、表示制御方法、及びプログラム
CN103294113A (zh) 2012-03-02 2013-09-11 联想(北京)有限公司 具有可折叠显示屏幕的电子装置
KR101661526B1 (ko) 2012-04-08 2016-10-04 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 ui 방법
JP5319825B1 (ja) 2012-05-22 2013-10-16 株式会社東芝 電子機器
KR20150143638A (ko) 2013-04-15 2015-12-23 가부시키가이샤 한도오따이 에네루기 켄큐쇼 발광 장치
KR102633904B1 (ko) 2013-04-24 2024-02-07 가부시키가이샤 한도오따이 에네루기 켄큐쇼 표시 장치
KR101810049B1 (ko) 2013-04-26 2018-01-19 삼성디스플레이 주식회사 가요성 표시 패널 및 상기 가요성 표시 패널을 포함하는 표시 장치
KR102081931B1 (ko) * 2013-06-19 2020-02-26 엘지전자 주식회사 폴더블 디스플레이 디바이스 및 제어 방법
CN103399684A (zh) * 2013-07-03 2013-11-20 惠州Tcl移动通信有限公司 一种大小可变的显示屏幕、移动终端及其实现方法
KR20150042705A (ko) 2013-10-11 2015-04-21 가부시키가이샤 한도오따이 에네루기 켄큐쇼 정보 처리 장치
WO2015071800A1 (en) 2013-11-15 2015-05-21 Semiconductor Energy Laboratory Co., Ltd. Data processor


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9847380B2 (en) 2013-11-27 2017-12-19 Simiconductor Enargy Laboratory Co., Ltd. Touch panel
US11785826B2 (en) 2013-11-27 2023-10-10 Semiconductor Energy Laboratory Co., Ltd. Touch panel
US11177328B2 (en) 2013-11-27 2021-11-16 Semiconductor Energy Laboratory Co., Ltd. Touch panel
US9952626B2 (en) 2013-12-20 2018-04-24 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device
US10139879B2 (en) 2014-02-28 2018-11-27 Semiconductor Energy Laboratory Co., Ltd. Electronic device
US11474646B2 (en) 2014-02-28 2022-10-18 Semiconductor Energy Laboratory Co., Ltd. Electronic device
US10809784B2 (en) 2014-02-28 2020-10-20 Semiconductor Energy Laboratory Co., Ltd. Electronic device
US9710033B2 (en) 2014-02-28 2017-07-18 Semiconductor Energy Laboratory Co., Ltd. Electronic device
US11899886B2 (en) 2014-02-28 2024-02-13 Semiconductor Energy Laboratory Co., Ltd. Electronic device
US10073571B2 (en) 2014-05-02 2018-09-11 Semiconductor Energy Laboratory Co., Ltd. Touch sensor and touch panel including capacitor
US10068315B2 (en) * 2014-05-26 2018-09-04 Samsung Electronics Co., Ltd. Electronic device and method for operating display
US20150339804A1 (en) * 2014-05-26 2015-11-26 Samsung Electronics Co., Ltd. Electronic device and method for operating display
US11696481B2 (en) 2015-07-23 2023-07-04 Semiconductor Energy Laboratory Co., Ltd. Display device, module, and electronic device
US11101333B2 (en) 2015-07-23 2021-08-24 Semiconductor Energy Laboratory Co., Ltd. Display device, module, and electronic device
WO2017017558A1 (en) * 2015-07-24 2017-02-02 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device, display panel, method for manufacturing semiconductor device, method for manufacturing display panel, and information processing device
US20170025444A1 (en) * 2015-07-24 2017-01-26 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device, display panel, method for manufacturing semiconductor device, method for manufacturing display panel, and information processing device
US10978489B2 (en) * 2015-07-24 2021-04-13 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device, display panel, method for manufacturing semiconductor device, method for manufacturing display panel, and information processing device
US10558265B2 (en) 2015-12-11 2020-02-11 Semiconductor Energy Laboratory Co., Ltd. Input device and system of input device
US10976872B2 (en) 2015-12-18 2021-04-13 Semiconductor Energy Laboratory Co., Ltd. Display panel, input/output device, data processing device, and method for manufacturing display panel
US10429999B2 (en) 2015-12-18 2019-10-01 Semiconductor Energy Laboratory Co., Ltd. Display panel, input/output device, data processing device, and method for manufacturing display panel
US10545605B2 (en) * 2016-07-27 2020-01-28 Samsung Electronics Co., Ltd. Electronic device having input sensing panels and method
US20180032202A1 (en) * 2016-07-27 2018-02-01 Samsung Electronics Co., Ltd. Electronic device having input sensing panels and method
US11209877B2 (en) 2018-03-16 2021-12-28 Semiconductor Energy Laboratory Co., Ltd. Electrical module, display panel, display device, input/output device, data processing device, and method of manufacturing electrical module
US11663990B2 (en) 2018-11-09 2023-05-30 Semiconductor Energy Laboratory Co., Ltd. Display apparatus and electronic device

Also Published As

Publication number Publication date
US11475532B2 (en) 2022-10-18
JP2022153449A (ja) 2022-10-12
US20190378237A1 (en) 2019-12-12
KR102446013B1 (ko) 2022-09-22
KR20240051095A (ko) 2024-04-19
KR20220130650A (ko) 2022-09-27
KR20230092859A (ko) 2023-06-26
JP2015129917A (ja) 2015-07-16
US11983793B2 (en) 2024-05-14
KR20230149790A (ko) 2023-10-27
KR102657219B1 (ko) 2024-04-16
US20230032671A1 (en) 2023-02-02
JP7105860B2 (ja) 2022-07-25
KR20150063937A (ko) 2015-06-10
JP7321330B2 (ja) 2023-08-04
JP2023153146A (ja) 2023-10-17
JP2019175489A (ja) 2019-10-10
KR20210127651A (ko) 2021-10-22
KR102593651B1 (ko) 2023-10-26
JP2021064391A (ja) 2021-04-22
KR102548559B1 (ko) 2023-06-29

Similar Documents

Publication Publication Date Title
US11983793B2 (en) Foldable display device including a plurality of regions
JP7482289B2 (ja) 情報処理装置
US10254796B2 (en) Information processing device, display device, and electronic device
US11244648B2 (en) Data processor
US10007299B2 (en) Display device and data processing device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SEMICONDUCTOR ENERGY LABORATORY CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HIRAKATA, YOSHIHARU;YAMAZAKI, SHUNPEI;SIGNING DATES FROM 20141114 TO 20141118;REEL/FRAME:034263/0710

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION