US20230393727A1 - Ferroelectric device and semiconductor device - Google Patents

Ferroelectric device and semiconductor device

Info

Publication number
US20230393727A1
Authority
US
United States
Prior art keywords
light, emitting, layer, screen, receiving element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/246,367
Inventor
Fumiyasu SEINO
Tetsuji Ishitani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Semiconductor Energy Laboratory Co Ltd
Original Assignee
Semiconductor Energy Laboratory Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Semiconductor Energy Laboratory Co Ltd filed Critical Semiconductor Energy Laboratory Co Ltd
Assigned to SEMICONDUCTOR ENERGY LABORATORY CO., LTD. reassignment SEMICONDUCTOR ENERGY LABORATORY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ISHITANI, TETSUJI, SEINO, Fumiyasu
Publication of US20230393727A1 publication Critical patent/US20230393727A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643: Details related to the display arrangement, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 1/1652: Details related to the display arrangement, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412: Digitisers structurally integrated in a display
    • G06F 3/042: Digitisers characterised by the transducing means by opto-electronic means
    • G06F 3/043: Digitisers characterised by the transducing means using propagating acoustic waves
    • G06F 3/044: Digitisers characterised by the transducing means by capacitive means
    • G06F 3/045: Digitisers characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
    • G06F 3/046: Digitisers characterised by the transducing means by electromagnetic means
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 2203/04101: 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • G06F 2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • One embodiment of the present invention relates to an electronic device.
  • One embodiment of the present invention relates to a display device.
  • One embodiment of the present invention relates to a program.
  • Note that one embodiment of the present invention is not limited to the above technical field.
  • Examples of the technical field of one embodiment of the present invention include a semiconductor device, a display device, a light-emitting apparatus, a power storage device, a memory device, an electronic device, a lighting device, an input device (e.g., a touch sensor), an input/output device (e.g., a touch panel), a driving method thereof, and a manufacturing method thereof.
  • In a display including a touch sensor that detects an object in contact with the display, various operations are performed by a fingertip or the like touching a surface of the display, whereby it is easy to move the position of an object displayed on the display or to zoom in or out on the object, for example.
  • Patent Document 1 discloses an electronic device including a touch sensor.
  • One embodiment of the present invention is a display device including a control portion, a display portion, and a detection portion.
  • the display portion includes a screen displaying an image.
  • the detection portion has a function of obtaining information about contact on the screen or positional information of a detection target approaching the screen in the normal direction and outputting the information to the control portion.
  • The control portion has a function of executing first processing when a first operation is performed, a function of executing second processing when a second operation is successively performed after the first operation, and a function of executing third processing when a third operation is successively performed after the second operation.
  • The first operation is an operation in which two pointed positions in contact with the screen are detected.
  • The second operation is an operation in which the two pointed positions move so as to reduce the distance therebetween.
  • The third operation is an operation in which the two pointed positions move in the normal direction with respect to the screen from the state where they are in contact with the screen.
  • The first processing is processing by which a selection region in the screen is determined.
  • The second processing is processing by which an object positioned in the selection region is selected.
  • The third processing is processing by which the object is picked up.
  • The control portion can further have a function of executing fourth processing when a fourth operation is performed after the third operation.
  • The fourth operation is an operation in which the two pointed positions come in contact with the screen.
  • The control portion can further have a function of executing fifth processing when a fifth operation is performed after the third operation.
  • The fifth operation is an operation in which the two pointed positions move to a height from the screen that exceeds a threshold value.
  • The control portion can further have a function of executing sixth processing when a sixth operation is performed after the third operation.
  • The sixth operation is an operation in which the two pointed positions move so as to increase the distance therebetween in the state where the height of the two pointed positions from the screen is less than the threshold value and the two pointed positions are not in contact with the screen.
  • The control portion preferably has a function of executing seventh processing when a seventh operation is successively performed after the third operation.
  • The seventh operation is an operation in which the two pointed positions move in a region where the height of the two pointed positions from the screen is less than the threshold value and the two pointed positions are not in contact with the screen.
  • The fourth processing is processing by which the selection of the object in the screen is canceled at the two pointed positions in contact with the screen.
  • The fifth processing is processing by which the selection of the object is canceled at the two-dimensional position in the screen at the time when the height of the two pointed positions from the screen exceeds the threshold value, or at the two pointed positions that were in contact with the screen in the third operation.
  • The sixth processing is processing by which the selection of the object is canceled at the two-dimensional position in the screen at the time when the two pointed positions move to increase the distance therebetween, or at the two pointed positions that were in contact with the screen in the third operation (an illustrative sketch of this operation-to-processing mapping is given immediately below).
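  • As an aid to reading the above list of operations and processing, the following is a minimal, hypothetical sketch (not part of the patent disclosure) of how the first to seventh operations could be mapped onto the first to seventh processing steps. The class name, the single height threshold, and the treatment of the seventh processing as moving the picked-up object are assumptions made only for illustration.

```python
# Hypothetical sketch: mapping the first to seventh operations onto the
# first to seventh processing steps described above. Not the patent's own code.

class PinchGestureProcessor:
    def __init__(self, threshold):
        self.threshold = threshold  # height threshold used in the fifth to seventh operations
        self.state = "idle"

    def handle(self, a, b, prev_distance):
        """a, b: (x, y, z, touching) samples for the two pointed positions."""
        (ax, ay, az, a_touch), (bx, by, bz, b_touch) = a, b
        distance = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
        both_touching = a_touch and b_touch
        below_threshold = az < self.threshold and bz < self.threshold

        if self.state == "idle" and both_touching:                                   # first operation
            print("first processing: determine the selection region")
            self.state = "region"
        elif self.state == "region" and both_touching and distance < prev_distance:  # second operation
            print("second processing: select the object in the selection region")
            self.state = "selected"
        elif self.state == "selected" and not a_touch and not b_touch:               # third operation
            print("third processing: pick up the object")
            self.state = "picked_up"
        elif self.state == "picked_up":
            if both_touching:                                                        # fourth operation
                print("fourth processing: cancel the selection (object put down)")
                self.state = "idle"
            elif az > self.threshold or bz > self.threshold:                         # fifth operation
                print("fifth processing: cancel the selection (object dropped)")
                self.state = "idle"
            elif below_threshold and distance > prev_distance:                       # sixth operation
                print("sixth processing: cancel the selection (object dropped)")
                self.state = "idle"
            elif below_threshold:                                                    # seventh operation
                print("seventh processing: assumed here to move the picked-up object")
        return distance
```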
  • the display portion includes a light-emitting element.
  • the detection portion includes a photoelectric conversion element.
  • the light-emitting element and the photoelectric conversion element are preferably provided on the same plane.
  • The detection portion preferably includes a touch sensor of a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an electromagnetic type, or an optical type.
  • Another embodiment of the present invention is a display module including the above display device and a connector or an integrated circuit.
  • Another embodiment of the present invention is an electronic device including the above display module and at least one of an antenna, a battery, a housing, a camera, a speaker, a microphone, and an operation button.
  • an electronic device capable of detecting a three-dimensional movement can be provided.
  • an electronic device capable of executing various types of processing by simple operation can be provided.
  • an electronic device capable of intuitive operation can be provided.
  • a novel electronic device can be provided.
  • FIG. 1 A and FIG. 1 B are diagrams each illustrating a structure example of a device.
  • FIG. 2 A to FIG. 2 C are diagrams illustrating movements of a finger.
  • FIG. 3 A to FIG. 3 C are diagrams illustrating methods for selecting an object.
  • FIG. 4 A to FIG. 4 C are diagrams illustrating methods for selecting an object.
  • FIG. 5 A to FIG. 5 C are diagrams illustrating detection of approach.
  • FIG. 6 A and FIG. 6 B are diagrams illustrating selection of an object.
  • FIG. 7 A and FIG. 7 B are diagrams illustrating moves of an object.
  • FIG. 8 A and FIG. 8 B are diagrams illustrating moves of an object.
  • FIG. 9 A and FIG. 9 B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 10 A and FIG. 10 B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 11 A to FIG. 11 C are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 12 A and FIG. 12 B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 13A, FIG. 13B, and FIG. 13D are cross-sectional views each illustrating an example of a display device.
  • FIG. 13 C and FIG. 13 E are diagrams each illustrating an example of an image captured by the display device.
  • FIG. 13 F to FIG. 13 H are top views each illustrating an example of a pixel.
  • FIG. 14 A is a cross-sectional view of a structure example of a display device.
  • FIG. 14 B to FIG. 14 D are top views each illustrating an example of a pixel.
  • FIG. 15 A is a cross-sectional view illustrating a structure example of a display device.
  • FIG. 15 B to FIG. 15 I are top views each illustrating an example of a pixel.
  • FIG. 16 A and FIG. 16 B are diagrams each illustrating a structure example of a display device.
  • FIG. 17 A to FIG. 17 G are diagrams illustrating structure examples of display devices.
  • FIG. 18 A to FIG. 18 C are diagrams each illustrating a structure example of a display device.
  • FIG. 19 A to FIG. 19 C are diagrams each illustrating a structure example of a display device.
  • FIG. 20 A and FIG. 20 B are diagrams each illustrating a structure example of a display device.
  • FIG. 21 is a diagram illustrating a structure example of a display device.
  • FIG. 22 A is a diagram illustrating a structure example of a display device.
  • FIG. 22 B and FIG. 22 C are diagrams illustrating structure examples of transistors.
  • FIG. 23 A and FIG. 23 B are diagrams illustrating an example of an electronic device.
  • FIG. 24 A to FIG. 24 D are diagrams illustrating examples of electronic devices.
  • FIG. 25 A to FIG. 25 F are diagrams illustrating examples of electronic devices.
  • The term “film” and the term “layer” can be interchanged with each other depending on the case or circumstances.
  • For example, the term “conductive layer” can be replaced with the term “conductive film”.
  • As another example, the term “insulating film” can be replaced with the term “insulating layer”.
  • In this embodiment, a structure example and an actuation method of an electronic device of one embodiment of the present invention are described with reference to FIG. 1A to FIG. 12B.
  • components are classified according to their functions and shown as independent blocks; however, it is practically difficult to completely separate the components according to their functions, and one component may have a plurality of functions. Alternatively, a plurality of components may achieve one function.
  • The electronic device of one embodiment of the present invention can detect contact and approach of a detection target with/to a screen. That is, positional information (X,Y), which represents coordinates in the plane parallel to the screen, and positional information (Z), which represents the height from the screen, can each be detected. Accordingly, three-dimensional operation is possible; for example, an object displayed on a display can be displayed as if it is moved three-dimensionally.
  • FIG. 1 A illustrates a block diagram of a device 10 of one embodiment of the present invention.
  • the device 10 includes a control portion 11 and a display portion 12 .
  • the display portion 12 includes a detection portion 21 .
  • the device 10 can be used as an electronic device such as an information terminal device, for example.
  • the display portion 12 has a function of displaying an image and a function of detecting contact and approach of a detection target with/to a screen.
  • contact means the state where the detection target is in contact with the screen
  • approach means the state where the detection target is not in contact with the screen but is positioned near and above the screen in a sensing range of a sensor.
  • the detection portion 21 is a portion having, out of the above functions of the display portion 12 , the function of detecting contact and approach of a detection target with/to a screen.
  • the display portion 12 can also be referred to as a touch panel.
  • a display device described in detail in Embodiment 2 can be used for the display portion 12 . In this manner, the device 10 can detect two kinds of information, i.e., contact and approach of the detection target with/to the screen with one detection portion 21 , which is preferable because component cost and manufacturing cost of the device 10 can be reduced.
  • the detection portion 21 has a function of outputting, to the control portion 11 , two-dimensional positional information (X,Y) on the screen about the detection target whose contact has been detected and three-dimensional positional information (X,Y,Z) on the screen about the detection target whose approach has been detected.
  • Z represents the distance (height) in the normal direction with respect to a detection surface (screen).
  • the origin (reference point) of the positional information (X,Y) on the screen is at a given position, e.g., a corner or the center of the screen.
  • the origin (reference point) of the coordinate Z is on the surface of the screen; that is, the height of 0 is the reference point.
  • Although FIG. 1A illustrates an example where the display portion 12 includes the detection portion 21, they may be provided separately. That is, the screen and an operation portion can be separated.
  • examples of the detection portion 21 include a touch pad that does not have an image display function.
  • the control portion 11 can function as, for example, a central processing unit (CPU).
  • the control portion 11 interprets and executes instructions from various programs with a processor to process various kinds of data and control programs. Furthermore, for example, the control portion 11 can control a movement of the object in the screen, a display change, and the like by processing a signal from the detection portion 21 .
  • a touch sensor capable of position detection in a contact and noncontact state can be used.
  • A touch sensor of any of various types, such as a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an infrared type, an electromagnetic type, or an optical type, can be used.
  • FIG. 1 B illustrates a block diagram of a device 20 of one embodiment of the present invention.
  • the device 20 includes the control portion 11 and the display portion 12 .
  • the display portion 12 includes a detection portion 22 and a detection portion 23 .
  • the device 20 can be used as an electronic device such as an information terminal device, for example.
  • the display portion 12 has a function of displaying an image and a function of detecting contact and approach of a detection target with/to a screen.
  • In FIG. 1B, an example in which the display portion 12 includes the detection portion 22 and the detection portion 23 is illustrated.
  • the detection portion 22 is a portion having, out of the above functions of the display portion 12 , the function of detecting contact of a detection target with a screen.
  • the detection portion 23 is a portion having, out of the above functions of the display portion 12 , the function of detecting approach of a detection target to a screen.
  • the display portion 12 can also be referred to as a touch panel.
  • a display device described in detail in Embodiment 2 can be used for the display portion 12 .
  • The device 20 includes two detection portions: the detection portion 22, which detects contact of the detection target with the screen, and the detection portion 23, which detects approach of the detection target to the screen. Thus, contact and approach can each be detected with high accuracy and more accurate operation is possible, which is preferable.
  • The detection portion 22 has a function of obtaining two-dimensional positional information (X,Y) on the screen about the detection target that is in contact with the screen and outputting the information to the control portion 11. Furthermore, the detection portion 23 has a function of obtaining three-dimensional positional information (X,Y,Z) about the detection target that approaches the screen and outputting the information to the control portion 11. Note that Z represents the distance in the normal direction with respect to a detection surface (screen). (An illustrative sketch of how such positional information can be represented is given below, after the description of the detection portions.)
  • Although FIG. 1B illustrates an example where the display portion 12 includes the detection portion 22 and the detection portion 23, they may be provided separately. That is, the screen and an operation portion can be separated.
  • examples of the detection portion 22 and the detection portion 23 include a touch pad that does not have an image display function.
  • The control portion 11 can control a movement of the object in the screen, a display change, and the like by processing signals from the detection portion 22 and the detection portion 23.
  • a touch sensor capable of position detection in a contact and noncontact state may be used.
  • a touch sensor with various types such as a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an infrared type, an electromagnetic type, or an optical type can be used.
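  • As an illustration of the positional information described above, the following hypothetical sketch (not part of the patent disclosure) shows one way the two-dimensional (X,Y) contact output and the three-dimensional (X,Y,Z) approach output of the detection portions could be represented before being processed by the control portion 11. The type and field names are assumptions.

```python
# Hypothetical sketch: representing the output of the detection portions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectionSample:
    x: float                    # in-plane position; origin at, e.g., a corner or the center of the screen
    y: float
    z: Optional[float] = None   # height in the normal direction; 0 corresponds to the screen surface
    contact: bool = False       # True when the detection target is in contact with the screen

def from_contact_sensor(x, y):
    """Two-dimensional (X,Y) output, e.g. from the detection portion 21 or 22."""
    return DetectionSample(x=x, y=y, z=0.0, contact=True)

def from_proximity_sensor(x, y, z):
    """Three-dimensional (X,Y,Z) output, e.g. from the detection portion 21 or 23."""
    return DetectionSample(x=x, y=y, z=z, contact=(z == 0.0))
```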
  • the device 10 can select an object (e.g., an icon) displayed on the screen by detecting contact and approach of a detection target by the detection portion 21 , and move the selected object to a given position in the screen.
  • the device 20 can select an object (e.g., an icon) displayed on the screen by detecting contact of a detection target by the detection portion 22 and detecting approach of the detection target by the detection portion 23 , and move the selected object to a given position in the screen.
  • the object in the screen can be selected, and actuation of pinching, picking up, moving, and putting down the selected object can be executed.
  • the actuation such as pinching, picking up, moving, and putting down the object refers to display processing in the screen of the display portion 12 .
  • “pinching an object” refers to processing of displaying the object as if the object is pinched
  • “picking up” refers to processing of displaying the object as if the object is picked up
  • “moving” refers to processing of displaying the object as if the object moves in the screen
  • “putting down” refers to processing of displaying the picked-up object as if the object is put down to the screen from above the screen.
  • the device can sense coordinates of fingertips of two fingers, and the coordinates are each referred to as a pointed position in some cases.
  • the coordinates of the contact portions correspond to the pointed positions.
  • the coordinates of points at which the fingertips and the screen are the closest to each other or the coordinates of positions where the intensities of detecting the fingertips peak can be the pointed positions.
  • part of a fingertip of an index finger and part of a fingertip of a thumb are made to be in contact with a coordinate A 1 and a coordinate B 1 on the screen, respectively.
  • the fingertips are moved to positions of a coordinate A 2 and a coordinate B 2 such that the fingertips are close to each other.
  • Instead of the coordinate A2 and the coordinate B2 remaining apart from each other, the fingers may be moved until they are in contact with each other as illustrated in FIG. 2C. In this case, the coordinate A2 and the coordinate B2 are in close contact with each other.
  • the part of the fingertip of the index finger and the part of the fingertip of the thumb may be in contact with the coordinate A 2 and the coordinate B 2 from the beginning, respectively, without being in contact with the coordinate A 1 and the coordinate B 1 .
  • the above is the pinching actuation.
  • The series of actuation illustrated in FIG. 2A to FIG. 2C cannot be distinguished from what is called pinch-in in some cases. Therefore, in the case where processing linked to pinch-in (e.g., zooming out on a screen) is separately set, it is preferable that the input of pinch-in be temporarily inactivated when the pinching actuation is performed. For example, an icon image linked to processing of temporarily turning the pinch-in function on or off may be displayed on the screen.
  • Alternatively, the actuation illustrated in FIG. 2A to FIG. 2C may be distinguished from pinch-in by a movement of a fingertip performed after the fingertips are held in the state of FIG. 2A for a certain time (also referred to as a long-tap), for example.
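  • The following is a hypothetical sketch (an assumption, not the patent's implementation) of how the pinching actuation of FIG. 2A to FIG. 2C could be distinguished from an ordinary pinch-in by using the long-tap described above, or by temporarily inactivating pinch-in. The dwell-time value is illustrative only.

```python
# Hypothetical sketch: distinguishing the pinching actuation from pinch-in.
LONG_TAP_SECONDS = 0.5  # assumed dwell time in the FIG. 2A state; not specified in the patent

def classify_two_finger_gesture(dwell_time_s, distance_decreasing, pinch_in_enabled=True):
    """Return 'pinching' (grab) or 'pinch-in' (e.g. zoom out) for a converging two-finger movement."""
    if not distance_decreasing:
        return None
    if not pinch_in_enabled:
        # Pinch-in input has been temporarily inactivated, e.g. via an on-screen icon,
        # so every converging movement is treated as the pinching actuation.
        return "pinching"
    # Otherwise, a long-tap before the fingers start to move selects the pinching actuation.
    return "pinching" if dwell_time_s >= LONG_TAP_SECONDS else "pinch-in"
```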
  • With reference to FIG. 3A, a method for selecting an object is described.
  • a plurality of objects 100 displayed on the display portion 12 are illustrated as rectangles with rounded corners.
  • a dashed-dotted rectangular frame is a rectangle in which the coordinate A 1 and the coordinate B 1 are diagonally opposite, and the object 100 at least partly included therein is selected.
  • a selected object is indicated by a solid line, and a non-selected object is indicated by a dotted line.
  • the object 100 completely included in the rectangle in which the coordinate A 1 and the coordinate B 1 are diagonally opposite may be selected.
  • the object 100 overlapping with the dashed-dotted line is not selected.
  • the object 100 overlapping with any of two lines which are a line connecting the coordinate A 1 and the coordinate A 2 and a line connecting the coordinate B 1 and the coordinate B 2 may be selected. That is, an object that is positioned on a path corresponding to a movement of a finger may be selected.
  • the object 100 that overlaps with an arrow connecting the coordinate A 1 and the coordinate A 2 and the object 100 that overlaps with an arrow connecting the coordinate B 1 and the coordinate B 2 are selected.
  • the object 100 at least partly included in a rectangle in which the coordinate A 2 and the coordinate B 2 after pinching actuation are diagonally opposite may be selected.
  • two objects 100 are selected. In such a manner, the area of a rectangle becomes small, and thus the objects to be selected can be narrowed down.
  • the object 100 completely included in the rectangle in which the coordinate A 2 and the coordinate B 2 after pinching actuation are diagonally opposite may be selected.
  • one object 100 is selected. In such a manner, the area of a rectangle becomes smaller, and thus an intended object can be selected accurately.
  • When the coordinate A2 and the coordinate B2 are in close contact with each other as illustrated in FIG. 4C, e.g., when an index finger and a thumb are in contact with each other, only the object 100 overlapping with both the coordinate A2 and the coordinate B2 may be selected. Accordingly, an intended object can be selected accurately. Alternatively, even an intended object with a small size can be selected accurately. Furthermore, also when the part of the index finger and the part of the thumb are in contact with the coordinate A2 and the coordinate B2 from the beginning, without passing through the coordinate A1 and the coordinate B1, an object can be selected as in FIG. 4A to FIG. 4C. The above is the description of the method for selecting an object.
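  • The following is a hypothetical sketch (not part of the patent disclosure) of two of the selection rules described above: selecting every object at least partly included in the rectangle whose diagonally opposite corners are the two pointed positions, and selecting only objects completely included in that rectangle. Axis-aligned rectangular objects and the helper names are assumptions.

```python
# Hypothetical sketch: rectangle-based object selection.
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    bottom: float
    right: float
    top: float

def rect_from_corners(a, b):
    """Rectangle in which coordinate a and coordinate b are diagonally opposite, e.g. A1 and B1."""
    (ax, ay), (bx, by) = a, b
    return Rect(min(ax, bx), min(ay, by), max(ax, bx), max(ay, by))

def overlaps(r1, r2):
    return r1.left <= r2.right and r2.left <= r1.right and r1.bottom <= r2.top and r2.bottom <= r1.top

def contains(outer, inner):
    return (outer.left <= inner.left and inner.right <= outer.right
            and outer.bottom <= inner.bottom and inner.top <= outer.top)

def select_partially_included(objects, a, b):
    """Objects (as Rects) at least partly included in the selection rectangle are selected."""
    region = rect_from_corners(a, b)
    return [obj for obj in objects if overlaps(region, obj)]

def select_fully_included(objects, a, b):
    """Only objects completely included in the selection rectangle are selected."""
    region = rect_from_corners(a, b)
    return [obj for obj in objects if contains(region, obj)]
```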
  • FIG. 5 A to FIG. 5 C are schematic cross-sectional views seen in the direction indicated by an arrow 50 in FIG. 2 B .
  • FIG. 5 A is a diagram illustrating a state of pinching after the index finger in contact with the coordinate A 1 and the thumb in contact with the coordinate B 1 are moved to the positions of the coordinate A 2 and the coordinate B 2 such that the fingers are close to each other, and an object is selected.
  • a movement in which the part of the index finger and the part of the thumb having been in contact with the screen are lifted up (in the normal direction) is performed.
  • the part of the index finger and the part of the thumb become apart from the screen, whereby an object is displayed as if it is picked up.
  • the pinched object is displayed as if it is lifted up (floats) in the screen.
  • other display may be set.
  • the object may be displayed in a different color or a smaller size.
  • the object may be displayed in a different shape.
  • the operation in which the part of the index finger and the part of the thumb become apart from the screen from the state where they are in contact with the screen can be detected when a detection position of a contact sensor (the detection portion 21 or the detection portion 22 ) in the screen disappears, for example.
  • Since the device 10 or the device 20 can obtain three-dimensional positional information over the screen, it is preferable that the height up to which a detection target is regarded as being in contact (referred to as a lower-limit threshold value Th1) be set in advance, and that the detection target be regarded as having become apart from the screen when the lower-limit threshold value Th1 is exceeded.
  • When the height H of the part of the index finger and the part of the thumb from the screen surface exceeds the lower-limit threshold value Th1, the object may be displayed as if it is picked up. When the lower-limit threshold value Th1 is set in this manner, unintentional picking up of the object can be reduced, which is preferable. That is, when the height H of the part of the index finger and the part of the thumb from the screen surface is more than or equal to the threshold value Th1 and less than or equal to a threshold value Th2, the object is displayed as if it is pinched and lifted up.
  • the threshold value Th 2 represents the upper detection limit in the Z direction.
  • When the height H exceeds the threshold value Th2, the object is displayed as if it is dropped.
  • In this case, the object remains at the height H it had just before being dropped; without rising any higher, the object is displayed as if it is dropped from that height H. Therefore, when the height H of the part of the index finger and the part of the thumb from the screen surface does not exceed the threshold value Th2, the display in which the object is in a picked-up state is maintained. Furthermore, the object can be moved on the screen while the state where the object is pinched is maintained.
  • When the picked-up object 100 is dropped, it may be put down not at the XY point of the drop position but at the XY point where the object was originally picked up (A2 and B2).
  • The above is the description of the actuation of picking up and putting down the object.
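  • The following is a hypothetical sketch (not part of the patent disclosure) of how the lower-limit threshold value Th1 and the threshold value Th2 described above could be applied to the height H of the fingertips to decide how the pinched object is displayed. The numeric values are assumptions made only for illustration.

```python
# Hypothetical sketch: applying the threshold values Th1 and Th2 to the fingertip height H.
TH1 = 2.0   # assumed lower-limit threshold Th1 (height still regarded as contact)
TH2 = 30.0  # assumed threshold Th2 (upper detection limit in the Z direction)

def object_display_state(height_h):
    """Return how the pinched object is displayed for a fingertip height H above the screen."""
    if height_h < TH1:
        # Regarded as still in contact with the screen: the object is pinched but not lifted.
        return "pinched on screen"
    if height_h <= TH2:
        # Th1 <= H <= Th2: the object is displayed as if it is picked up (floating).
        return "picked up"
    # H exceeds Th2: the object is displayed as if it is dropped from the height it had
    # just before Th2 was exceeded.
    return "dropped"
```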
  • Next, the method for canceling selection of an object is described.
  • When the part of the index finger and the part of the thumb are put down to be in contact with the screen as in FIG. 5A so as to put down the object, the selection of the object is canceled.
  • Alternatively, when the object is dropped, the selection of the object is canceled. That is, either putting down or dropping the object makes it possible to cancel the selection of the object.
  • The above is the description of the method for canceling selection of an object.
  • FIG. 6 A to FIG. 8 B are perspective views of the display portion 12 in the device 10 or the device 20 .
  • As illustrated in FIG. 6A, two fingertips (not illustrated) are made to be in contact with the coordinate A1 and the coordinate B1 having the object 100 therebetween on the display portion 12.
  • Next, as illustrated in FIG. 6B, the balls of the fingers are moved close to each other while being in contact with the display portion 12, so that the two fingertips are moved to the coordinate A2 and the coordinate B2.
  • the above movements can be detected by the detection portion 21 and the detection portion 22 in the device 10 and the device 20 , respectively. Accordingly, the object 100 can be selected, i.e., the object 100 can be pinched.
  • Next, a movement is performed in which the object 100 is picked up from the position of the coordinate A2 and the coordinate B2 to the position of a coordinate A3 and a coordinate B3 with the balls of the two fingers being in contact with each other.
  • This movement is detected by the detection portion 21 and the detection portion 23 in the device 10 and the device 20 , respectively. Therefore, the object 100 can be picked up.
  • the height to which the object 100 is picked up is H
  • the height H is more than the threshold value Th 1 (not illustrated) and less than the threshold value Th 2 .
  • When the height H exceeds the threshold value Th2, the picked-up object becomes apart from the two fingers and is dropped.
  • Next, the two fingers are moved, with their balls being in contact with each other, from the position of the coordinate A3 and the coordinate B3 to the position of a coordinate A4 and a coordinate B4.
  • This movement is detected by the detection portion 21 and the detection portion 23 in the device 10 and the device 20 , respectively. Therefore, the object 100 can be moved.
  • Although the object is moved linearly in FIG. 7B, the movement is not limited thereto.
  • the object 100 may be swung up and down, left and right. However, when the height H of the pinched object 100 from the screen surface exceeds the threshold value Th 2 , the object is dropped.
  • Next, the two fingers are put down, with their balls being in contact with each other, from the position of the coordinate A4 and the coordinate B4 to the position of a coordinate A5 and a coordinate B5 so as to be in contact with the surface of the display portion 12.
  • This movement is detected by the detection portion 21 and the detection portion 22 in the device 10 and the device 20 , respectively. Therefore, the object 100 can be put down.
  • As illustrated in FIG. 8B, when the two fingers become apart from each other before being put down from the position of the coordinate A4 and the coordinate B4 to the position of the coordinate A5 and the coordinate B5, the object 100 is dropped.
  • the device 10 and the device 20 are each an electronic device capable of object operation with an intuitive movement of a finger in order to hold, lift up, move, and put down an object on a screen, for example.
  • operation using a detection target other than a finger can be performed.
  • As a detection target other than a finger, a writing material such as a stylus pen, a brush, a glass pen, or a quill pen can be used, for example.
  • operation may be performed using a finger and the above detection target other than a finger or using two detection targets other than a finger.
  • a detection target having two or more detection portions can be used.
  • Equipment in which the distance between two tips changes, such as tweezers, scissors, or chopsticks, can be used for operation.
  • FIG. 9 A illustrates an example where an object 100 a such as an icon displayed on the screen of the display portion 12 is moved to a given position.
  • a user selects the object 100 a on the screen with the above pinching actuation and picks up the object. Then, the user can move the object 100 a such as an icon in the screen by putting down or dropping the object to a given position.
  • FIG. 9 B illustrates an example where a destination, a starting point, or the like is specified in a map application.
  • a pin-shaped object 100 b drawn by a solid line is the object displayed on the screen, and the pin-shaped object 100 b drawn by a dotted line is the picked up object.
  • a user can accurately select the pin-shaped object 100 b displayed on the screen with the pinching actuation and intuitively put down the object to a given position on the screen.
  • the pin-shaped object 100 b on the lower right of the screen is picked up and put down to the destination. Since the device 10 and the device 20 each have a function of detecting contact of a detection target, a setting of the destination, the starting point, or the like can be changed by a finger made to be in contact with the pin-shaped object 100 b.
  • FIG. 10 A illustrates an example where a page is turned in an e-book reader application.
  • a user can perform page-turning actuation more naturally as if he or she turns a page of a real book.
  • the user can turn up an object 100 c that is part of the page with actuation in which an edge portion of the screen is picked up.
  • the page can be turned by actuation in which the fingers in a state of picking up the page are moved to the opposite page side and the fingers become apart from each other at the position or come in contact with the page.
  • FIG. 10 B illustrates an example where an object position is changed back and forth in editing software such as document creation software or presentation manuscript creation software.
  • a circular object 100 d positioned behind a triangle object and a quadrangular object is moved to the foreground.
  • In general, a plurality of contact operations are needed in order to move an object positioned on the back side to the foreground; however, when the above pinching actuation is applied, the object position can be easily changed back and forth.
  • When the circular object 100d is picked up and then put down at the same position, the position of the circular object 100d can be changed only back and forth.
  • Examples of a game application to which the pinching actuation is applied are illustrated in FIG. 11A to FIG. 12A.
  • FIG. 11 A is an example where the pinching actuation is applied to a plant growing game.
  • A user can perform operations that are necessary for growing the plant, such as pulling out a weed object 100e, giving the plant a water object 100g, or sowing a seed object 100f, with the pinching actuation.
  • In FIG. 11A, containers corresponding to the weeds (Weeds), water (Water), and seeds (Seeds) are illustrated. Since a movement close to a real movement can be employed in the game, the user can enjoy the game more intuitively.
  • FIG. 11 B and FIG. 11 C are each an example where the pinching actuation is applied to a game to interact with an animal.
  • a user can wave a stick-like toy object 100 i to an animal object 100 h or roll a ball-like toy object 100 j to the animal, for example.
  • the user can move the object with his/her intention and thus can find more pleasure.
  • display of the animal object 100 h being pinched and swung left and right is possible.
  • the animal object 100 h is swung by the movement of user's fingers, and the user can feel relaxed.
  • FIG. 12 A is an example where the pinching actuation is applied to a game to pull out sticks stacked with each other, in which scroll and pinching actuation are combined.
  • a given stacked surface is selected by scrolling, and a stick object 100 j to be pulled out is selected by pinching. After that, the stick object 100 j is picked up and then can be pulled out.
  • The speed of picking up, the overall balance, and the like are processed; when a certain condition is exceeded, the stacked sticks collapse.
  • FIG. 12 B illustrates an example where the pinching actuation is applied to application switching in an electronic device such as a smartphone or a tablet.
  • Picking-up actuation is performed at a given position in the display portion 12.
  • Then, a list of the applications running on the electronic device is displayed.
  • When the fingers are turned while the picked-up state is held, the running applications are selected one by one in order.
  • a solid line represents a selected application object 100 k .
  • the selected application can be activated on the screen.
  • a remotely-connected electronic device to which the pinching actuation is applied can be used for remote treatment in the medical field.
  • The device 10 and the device 20 each have a function of detecting contact of a detection target with a screen; thus, when the screen on which an affected part is displayed is touched, an area to be treated can be selected, for example.
  • movements such as pinching, picking up, and cutting the treatment area can be remotely performed.
  • a robot arm can perform the actual treatment, for example.
  • a program in which the processing method, detection method, operation method, actuation method, display method, or the like that is described above as an example and executed by the device 10 and the like is written can be stored in a non-transitory storage medium and can be read and executed by an arithmetic device or the like included in the control portion 11 of the device 10 . That is, a program that makes hardware execute the processing method, detection method, operation method, actuation method, display method, or the like described above as an example and a non-transitory storage medium storing the program are embodiments of the present invention.
  • a display device exemplified below can be favorably used for a light-emitting and light-receiving portion of the electronic device described in Embodiment 1.
  • a light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus of one embodiment of the present invention includes a light-receiving element (also referred to as a light-receiving device) and a light-emitting element (also referred to as a light-emitting device).
  • the light-emitting and light-receiving portion has a function of displaying an image with the use of the light-emitting element.
  • the light-emitting and light-receiving portion has one or both of a function of capturing an image with the use of the light-receiving element and a sensing function.
  • the light-emitting and light-receiving apparatus of one embodiment of the present invention can be expressed as a display device, and the light-emitting and light-receiving portion can be expressed as a display portion.
  • the light-emitting and light-receiving apparatus of one embodiment of the present invention may have a structure including a light-emitting and light-receiving element (also referred to as a light-emitting and light-receiving device) and a light-emitting element.
  • a light-emitting and light-receiving apparatus including a light-receiving element and a light-emitting element is described.
  • the light-emitting and light-receiving apparatus of one embodiment of the present invention includes a light-receiving element and a light-emitting element in a light-emitting and light-receiving portion.
  • the light-emitting elements are arranged in a matrix in the light-emitting and light-receiving portion, and an image can be displayed on the light-emitting and light-receiving portion.
  • the light-receiving elements are arranged in a matrix in the light-emitting and light-receiving portion, and the light-emitting and light-receiving portion has one or both of an image capturing function and a sensing function.
  • the light-emitting and light-receiving portion can be used as an image sensor, a touch sensor, or the like. That is, light is detected in the light-emitting and light-receiving portion, whereby an image can be captured.
  • Alternatively, touch operation of an object (e.g., a finger or a pen) can be detected.
  • the light-emitting elements can be used as a light source of the sensor. Accordingly, a light-receiving portion and a light source do not need to be provided separately from the light-emitting and light-receiving apparatus; hence, the number of components of an electronic device can be reduced.
  • the light-receiving element when an object reflects (or scatters) light emitted from the light-emitting element included in the light-emitting and light-receiving portion, the light-receiving element can detect the reflected light (or the scattered light); thus, image capturing and touch operation detection are possible even in a dark place.
  • the light-emitting element included in the light-emitting and light-receiving apparatus of one embodiment of the present invention functions as a display element (also referred to as a display device).
  • an EL element such as an OLED (Organic Light Emitting Diode) or a QLED (Quantum-dot Light Emitting Diode) is preferably used.
  • Examples of a light-emitting substance contained in the EL element include a substance exhibiting fluorescence (a fluorescent material), a substance exhibiting phosphorescence (a phosphorescent material), an inorganic compound (such as a quantum dot material), and a substance exhibiting thermally activated delayed fluorescence (a thermally activated delayed fluorescence (TADF) material).
  • an LED such as a micro-LED (Light Emitting Diode) can be used as the light-emitting element.
  • the light-emitting and light-receiving apparatus of one embodiment of the present invention has a function of detecting light with the use of a light-receiving element.
  • the light-emitting and light-receiving apparatus can capture an image using the light-receiving element.
  • the light-emitting and light-receiving apparatus can be used as a scanner.
  • An electronic device including the light-emitting and light-receiving apparatus of one embodiment of the present invention can obtain data related to biological information such as a fingerprint or a palm print by using a function of an image sensor. That is, a biometric authentication sensor can be incorporated in the light-emitting and light-receiving apparatus.
  • When the light-emitting and light-receiving apparatus incorporates a biometric authentication sensor, the number of components of an electronic device can be reduced as compared to the case where a biometric authentication sensor is provided separately from the light-emitting and light-receiving apparatus; thus, the size and weight of the electronic device can be reduced.
  • the light-emitting and light-receiving apparatus can detect touch operation of an object with the use of the light-receiving element.
  • As the light-receiving element, a pn photodiode or a pin photodiode can be used, for example.
  • the light-receiving element functions as a photoelectric conversion element (also referred to as a photoelectric conversion device) that detects light entering the light-receiving element and generates electric charge.
  • the amount of electric charge generated from the light-receiving element depends on the amount of light entering the light-receiving element.
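  • As a rough illustration of the relation in the preceding item, the generic photodiode expressions below may help; the responsivity R(λ), incident power P_in, and integration time t are standard photodiode quantities assumed for illustration and are not values given in this document.

```latex
% Generic photodiode relation (illustrative; symbols are assumptions, not from the source):
% I_{ph}: photocurrent, R(\lambda): responsivity, P_{in}: incident optical power,
% Q: charge accumulated over an exposure (integration) time t.
I_{ph} = R(\lambda)\, P_{in}, \qquad
Q = \int_{0}^{t} I_{ph}\, dt \approx R(\lambda)\, P_{in}\, t
```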
  • For example, an organic photodiode including a layer containing an organic compound can be used as the light-receiving element.
  • An organic photodiode, which is easily made thin, lightweight, and large in area and has a high degree of freedom for shape and design, can be used in a variety of devices.
  • In one embodiment of the present invention, organic EL elements (also referred to as organic EL devices) are used as the light-emitting elements, and organic photodiodes are used as the light-receiving elements.
  • the organic EL elements and the organic photodiodes can be formed over one substrate.
  • the organic photodiodes can be incorporated in the display device including the organic EL elements.
  • If all the layers of the organic photodiodes are deposited separately from the layers of the organic EL elements, the number of deposition steps becomes extremely large.
  • a large number of layers of the organic photodiodes can have a structure in common with the organic EL elements; thus, concurrently depositing the layers that can have a common structure can inhibit an increase in the number of deposition steps.
  • one of a pair of electrodes can be a layer shared by the light-receiving element and the light-emitting element.
  • at least one of a hole-injection layer, a hole-transport layer, an electron-transport layer, and an electron-injection layer is preferably a layer shared by the light-receiving element and the light-emitting element.
  • the light-receiving element and the light-emitting element can have the same structure except that the light-receiving element includes an active layer and the light-emitting element includes a light-emitting layer.
  • the light-receiving element can be manufactured by only replacing the light-emitting layer of the light-emitting element with an active layer.
  • When the light-receiving element and the light-emitting element include common layers in such a manner, the number of deposition steps and the number of masks can be reduced, whereby the number of manufacturing steps and the manufacturing cost of the light-emitting and light-receiving apparatus can be reduced.
  • the light-emitting and light-receiving apparatus including the light-receiving element can be manufactured using an existing manufacturing apparatus and an existing manufacturing method for the display device.
  • Note that a layer shared by the light-receiving element and the light-emitting element might have different functions in the light-receiving element and the light-emitting element.
  • In such a case, the name of a component is based on its function in the light-emitting element.
  • a hole-injection layer functions as a hole-injection layer in the light-emitting element and functions as a hole-transport layer in the light-receiving element.
  • an electron-injection layer functions as an electron-injection layer in the light-emitting element and functions as an electron-transport layer in the light-receiving element.
  • a layer shared by the light-receiving element and the light-emitting element may have the same functions in the light-receiving element and the light-emitting element.
  • For example, a hole-transport layer functions as a hole-transport layer in both the light-emitting element and the light-receiving element, and an electron-transport layer functions as an electron-transport layer in both elements.
  • In one embodiment of the present invention, a subpixel exhibiting one color includes a light-emitting and light-receiving element instead of a light-emitting element, and subpixels exhibiting the other colors each include a light-emitting element.
  • the light-emitting and light-receiving element has both a function of emitting light (a light-emitting function) and a function of receiving light (a light-receiving function).
  • the light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus of one embodiment of the present invention has a function of displaying an image using both a light-emitting and light-receiving element and a light-emitting element.
  • the light-emitting and light-receiving element functions as both a light-emitting element and a light-receiving element, whereby the pixel can have a light-receiving function without an increase in the number of subpixels included in the pixel.
  • the light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus can be provided with one or both of an image capturing function and a sensing function while keeping the aperture ratio of the pixel (aperture ratio of each subpixel) and the resolution of the light-emitting and light-receiving apparatus.
  • Moreover, the aperture ratio of the pixel and the resolution can be increased more easily than in a light-emitting and light-receiving apparatus in which a subpixel including a light-receiving element is provided separately from subpixels including light-emitting elements.
  • the light-emitting and light-receiving elements and the light-emitting elements are arranged in a matrix, and an image can be displayed on the light-emitting and light-receiving portion.
  • the light-emitting and light-receiving portion can be used as an image sensor and a touch sensor.
  • the light-emitting elements can be used as a light source of the sensor.
  • The light-emitting and light-receiving element can be manufactured by combining an organic EL element and an organic photodiode. For example, by adding an active layer of an organic photodiode to a layered structure of an organic EL element, the light-emitting and light-receiving element can be manufactured. Furthermore, in the light-emitting and light-receiving element formed of a combination of an organic EL element and an organic photodiode, concurrently depositing the layers that can be shared with the organic EL element can inhibit an increase in the number of deposition steps.
  • For example, one of a pair of electrodes can be a layer shared by the light-emitting and light-receiving element and the light-emitting element.
  • Furthermore, at least one of a hole-injection layer, a hole-transport layer, an electron-transport layer, and an electron-injection layer is preferably a layer shared by the light-emitting and light-receiving element and the light-emitting element.
  • the light-emitting and light-receiving element and the light-emitting element can have the same structure except for the presence or absence of an active layer of the light-receiving element.
  • the light-emitting and light-receiving element can be manufactured by only adding the active layer of the light-receiving element to the light-emitting element.
  • When the light-emitting and light-receiving element and the light-emitting element include common layers in such a manner, the number of deposition steps and the number of masks can be reduced, thereby reducing the number of manufacturing steps and the manufacturing cost of the light-emitting and light-receiving apparatus.
  • In addition, the light-emitting and light-receiving apparatus including the light-emitting and light-receiving element can be manufactured using an existing manufacturing apparatus and an existing manufacturing method for the display device.
  • a layer included in the light-emitting and light-receiving element may have a different function between the case where the light-emitting and light-receiving element functions as a light-receiving element and the case where the light-emitting and light-receiving element functions as a light-emitting element.
  • the name of a component is based on its function in the case where the light-emitting and light-receiving element functions as a light-emitting element.
  • the light-emitting and light-receiving apparatus of this embodiment has a function of displaying an image with the use of a light-emitting element and a light-emitting and light-receiving element. That is, the light-emitting element and the light-emitting and light-receiving element function as display elements.
  • the light-emitting and light-receiving apparatus of this embodiment has a function of detecting light with the use of a light-emitting and light-receiving element.
  • the light-emitting and light-receiving element can detect light having a shorter wavelength than light emitted by the light-emitting and light-receiving element itself.
  • the light-emitting and light-receiving apparatus of this embodiment can capture an image using the light-emitting and light-receiving element.
  • the light-emitting and light-receiving apparatus of this embodiment can detect touch operation of an object with the use of the light-emitting and light-receiving element.
  • the light-emitting and light-receiving element functions as a photoelectric conversion element.
  • the light-emitting and light-receiving element can be manufactured by adding an active layer of the light-receiving element to the above-described structure of the light-emitting element.
  • As the active layer, an active layer of a pn photodiode or a pin photodiode can be used, for example.
  • An active layer of an organic photodiode including a layer containing an organic compound is preferably used.
  • An organic photodiode, which is easily made thin, lightweight, and large in area and has a high degree of freedom for shape and design, can be used in a variety of devices.
  • the display device that is an example of the light-emitting and light-receiving apparatus of one embodiment of the present invention is specifically described below with reference to drawings.
  • FIG. 13 A is a schematic view of a display panel 200 .
  • the display panel 200 includes a substrate 201 , a substrate 202 , a light-receiving element 212 , a light-emitting element 211 R, a light-emitting element 211 G, a light-emitting element 211 B, a functional layer 203 , and the like.
  • The light-emitting element 211 R, the light-emitting element 211 G, the light-emitting element 211 B, and the light-receiving element 212 are provided between the substrate 201 and the substrate 202 .
  • the light-emitting element 211 R, the light-emitting element 211 G, and the light-emitting element 211 B emit red (R) light, green (G) light, and blue (B) light, respectively.
  • the term “light-emitting element 211 ” may be used when the light-emitting element 211 R, the light-emitting element 211 G, and the light-emitting element 211 B are not distinguished from each other.
  • the display panel 200 includes a plurality of pixels arranged in a matrix.
  • One pixel includes one or more subpixels.
  • One subpixel includes one light-emitting element.
  • the pixel can have a structure including three subpixels (e.g., three colors of R, G, and B or three colors of yellow (Y), cyan (C), and magenta (M)) or four subpixels (e.g., four colors of R, G, B, and white (W) or four colors of R, G, B, and Y).
  • the pixel further includes the light-receiving element 212 .
  • the light-receiving element 212 may be provided in all the pixels or may be provided in some of the pixels.
  • one pixel may include a plurality of light-receiving elements 212 .
  • FIG. 13 A illustrates a finger 220 touching a surface of the substrate 202 .
  • Part of light emitted by the light-emitting element 211 G is reflected at a contact portion of the substrate 202 and the finger 220 .
  • When the light-receiving element 212 detects this reflected light, the contact of the finger 220 with the substrate 202 can be detected. That is, the display panel 200 can function as a touch panel.
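  • As an illustration of this detection principle, touch positions can be found by comparing each light-receiving element's signal with a threshold. The following is a minimal sketch assuming a hypothetical readout in which the photocurrents of the light-receiving elements 212 are available as a 2-D array; the threshold and array shape are placeholders rather than values from this document.

```python
import numpy as np

def detect_touches(photocurrents: np.ndarray, threshold: float) -> list[tuple[int, int]]:
    """Return (row, column) indices of light-receiving elements whose signal
    exceeds the threshold, i.e., elements receiving light reflected by an
    object in contact with the substrate. All values are illustrative."""
    rows, cols = np.nonzero(photocurrents > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

# Example: a 4x4 readout in which one element under the contact point is bright.
readout = np.zeros((4, 4))
readout[2, 1] = 0.8                               # reflected-light signal (arbitrary units)
print(detect_touches(readout, threshold=0.5))     # -> [(2, 1)]
```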
  • the functional layer 203 includes a circuit for driving the light-emitting element 211 R, the light-emitting element 211 G, and the light-emitting element 211 B and a circuit for driving the light-receiving element 212 .
  • the functional layer 203 is provided with a switch, a transistor, a capacitor, a wiring, and the like. Note that in the case where the light-emitting element 211 R, the light-emitting element 211 G, the light-emitting element 211 B, and the light-receiving element 212 are driven by a passive-matrix method, a structure not provided with a switch and a transistor may be employed.
  • the display panel 200 preferably has a function of detecting a fingerprint of the finger 220 .
  • FIG. 13 B schematically illustrates an enlarged view of the contact portion in a state where the finger 220 touches the substrate 202 .
  • FIG. 13 B illustrates the light-emitting elements 211 and the light-receiving elements 212 that are alternately arranged.
  • the fingerprint of the finger 220 is formed of depressions and projections. Therefore, as illustrated in FIG. 13 B , the projections of the fingerprint touch the substrate 202 .
  • Reflection of light from a surface or an interface is categorized into regular reflection and diffuse reflection.
  • Regularly reflected light is highly directional light with an angle of reflection equal to the angle of incidence. Diffusely reflected light has low directionality and low angular dependence of intensity.
  • Of regular reflection and diffuse reflection, diffuse reflection components are dominant in the light reflected from the surface of the finger 220 . Meanwhile, regular reflection components are dominant in the light reflected from the interface between the substrate 202 and the air.
  • the intensity of light that is reflected from contact surfaces or non-contact surfaces between the finger 220 and the substrate 202 and is incident on the light-receiving elements 212 positioned directly below the contact surfaces or the non-contact surfaces is the sum of intensities of regularly reflected light and diffusely reflected light.
  • regularly reflected light (indicated by solid arrows) is dominant near the depressions of the finger 220 , where the finger 220 is not in contact with the substrate 202 ; whereas diffusely reflected light (indicated by dashed arrows) from the finger 220 is dominant near the projections of the finger 220 , where the finger 220 is in contact with the substrate 202 .
  • the intensity of light received by the light-receiving element 212 positioned directly below the depression is higher than the intensity of light received by the light-receiving element 212 positioned directly below the projection. Accordingly, a fingerprint image of the finger 220 can be captured.
  • When an arrangement interval between the light-receiving elements 212 is smaller than a distance between two projections of a fingerprint, preferably a distance between a depression and a projection adjacent to each other, a clear fingerprint image can be obtained.
  • The distance between a depression and a projection of a human's fingerprint is approximately 200 μm; thus, the arrangement interval between the light-receiving elements 212 is, for example, less than or equal to 400 μm, preferably less than or equal to 200 μm, further preferably less than or equal to 150 μm, still further preferably less than or equal to 100 μm, yet still further preferably less than or equal to 50 μm and greater than or equal to 1 μm, preferably greater than or equal to 10 μm, further preferably greater than or equal to 20 μm.
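  • To relate these arrangement intervals to an image-capturing resolution, the short sketch below converts an interval between the light-receiving elements 212 into pixels per inch; the intervals are those listed above, and the 25.4 mm-per-inch factor is a standard constant rather than a value from this document.

```python
# Convert the arrangement interval between light-receiving elements into an
# image-capturing resolution in pixels per inch (ppi).
MM_PER_INCH = 25.4

def interval_to_ppi(interval_um: float) -> float:
    """One light-receiving element every 'interval_um' micrometers."""
    return MM_PER_INCH / (interval_um / 1000.0)

for interval in (400, 200, 100, 50):              # micrometers, from the ranges above
    print(f"{interval:>4} um -> {interval_to_ppi(interval):6.1f} ppi")
# 400 um -> 63.5 ppi, 200 um -> 127.0 ppi, 100 um -> 254.0 ppi, 50 um -> 508.0 ppi
```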
  • FIG. 13 C illustrates an example of a fingerprint image captured by the display panel 200 .
  • the outline of the finger 220 is indicated by a dashed line and the outline of a contact portion 221 is indicated by a dashed-dotted line.
  • a high-contrast image of a fingerprint 222 can be captured owing to a difference in the amount of light incident on the light-receiving elements 212 .
  • the display panel 200 can also function as a touch panel or a pen tablet.
  • FIG. 13 D illustrates a state where a tip of a stylus 225 slides in a direction indicated with a dashed arrow while the tip of the stylus 225 touches the substrate 202 .
  • FIG. 13 E illustrates an example of a path 226 of the stylus 225 that is detected by the display panel 200 .
  • the display panel 200 can detect the position of a detection target, such as the stylus 225 , with high position accuracy, so that high-definition drawing can be performed using a drawing application or the like.
  • Since the display panel 200 can detect even the position of a highly insulating object to be detected, the material of a tip portion of the stylus 225 is not limited, and a variety of writing materials (e.g., a brush, a glass pen, and a quill pen) can be used.
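  • One common way to locate a detection target with finer granularity than the element pitch is an intensity-weighted centroid over the light-receiving element signals; the sketch below is a generic illustration under that assumption, not a method stated in this document, and the pitch and signal values are placeholders.

```python
import numpy as np

def estimate_position(photocurrents: np.ndarray, pitch_um: float) -> tuple[float, float]:
    """Estimate the (x, y) position of a reflective object such as a stylus tip
    as the intensity-weighted centroid of the light-receiving element signals.
    'pitch_um' is the arrangement interval between light-receiving elements."""
    rows, cols = np.indices(photocurrents.shape)
    total = photocurrents.sum()
    if total == 0:
        raise ValueError("no signal detected")
    y_um = (rows * photocurrents).sum() / total * pitch_um
    x_um = (cols * photocurrents).sum() / total * pitch_um
    return x_um, y_um

# A tip straddling two adjacent elements yields a sub-pitch position estimate.
frame = np.zeros((3, 3))
frame[1, 1] = frame[1, 2] = 0.5
print(estimate_position(frame, pitch_um=100.0))   # -> (150.0, 100.0)
```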
  • FIG. 13 F to FIG. 13 H illustrate examples of a pixel that can be used in the display panel 200 .
  • the pixels illustrated in FIG. 13 F and FIG. 13 G each include the light-emitting element 211 R for red (R), the light-emitting element 211 G for green (G), the light-emitting element 211 B for blue (B), and the light-receiving element 212 .
  • the pixels each include a pixel circuit for driving the light-emitting element 211 R, the light-emitting element 211 G, the light-emitting element 211 B, and the light-receiving element 212 .
  • FIG. 13 F illustrates an example in which three light-emitting elements and one light-receiving element are provided in a matrix of 2 ⁇ 2.
  • FIG. 13 G illustrates an example in which three light-emitting elements are arranged in one line and one laterally long light-receiving element 212 is provided below the three light-emitting elements.
  • the pixel illustrated in FIG. 13 H is an example including a light-emitting element 211 W for white (W).
  • In FIG. 13 H, four light-emitting elements including the light-emitting element 211 W are arranged in one line, and the light-receiving element 212 is provided below the four light-emitting elements.
  • the pixel structure is not limited to the above structure, and a variety of arrangement methods can be employed.
  • A display panel 200 A illustrated in FIG. 14 A includes a light-emitting element 211 IR in addition to the components of the display panel 200 illustrated in FIG. 13 A as an example.
  • the light-emitting element 211 IR is a light-emitting element emitting infrared light IR.
  • an element capable of receiving at least the infrared light IR emitted by the light-emitting element 211 IR is preferably used as the light-receiving element 212 .
  • the light-receiving element 212 an element capable of receiving visible light and infrared light is further preferably used.
  • the infrared light IR emitted from the light-emitting element 211 IR is reflected by the finger 220 and part of reflected light is incident on the light-receiving element 212 , so that the positional information of the finger 220 can be obtained.
  • FIG. 14 B to FIG. 14 D illustrate examples of a pixel that can be used in the display panel 200 A.
  • FIG. 14 B illustrates an example in which three light-emitting elements are arranged in one line and the light-emitting element 211 IR and the light-receiving element 212 are arranged below the three light-emitting elements in a horizontal direction.
  • FIG. 14 C illustrates an example in which four light-emitting elements including the light-emitting element 211 IR are arranged in one line and the light-receiving element 212 is provided below the four light-emitting elements.
  • FIG. 14 D shows an example in which three light-emitting elements and the light-receiving element 212 are arranged around the light-emitting element 211 IR placed at the center.
  • In each of the pixels, the positions of the light-emitting elements may be interchanged, or the positions of a light-emitting element and the light-receiving element may be interchanged.
  • a display panel 200 B illustrated in FIG. 15 A includes the light-emitting element 211 B, the light-emitting element 211 G, and a light-emitting and light-receiving element 213 R.
  • the light-emitting and light-receiving element 213 R has a function of a light-emitting element that emits red (R) light, and a function of a photoelectric conversion element that receives visible light.
  • FIG. 15 A illustrates an example in which the light-emitting and light-receiving element 213 R receives green (G) light emitted by the light-emitting element 211 G.
  • the light-emitting and light-receiving element 213 R may receive blue (B) light emitted by the light-emitting element 211 B.
  • the light-emitting and light-receiving element 213 R may receive both green light and blue light.
  • the light-emitting and light-receiving element 213 R preferably receives light having a shorter wavelength than light emitted from itself.
  • the light-emitting and light-receiving element 213 R may receive light (e.g., infrared light) having a longer wavelength than light emitted from itself.
  • the light-emitting and light-receiving element 213 R may receive light having approximately the same wavelength as light emitted from itself; however, in that case, the light-emitting and light-receiving element 213 R also receives light emitted from itself, whereby its emission efficiency might be decreased. Therefore, the peak of the emission spectrum and the peak of the absorption spectrum of the light-emitting and light-receiving element 213 R preferably overlap as little as possible.
  • light emitted by the light-emitting and light-receiving element is not limited to red light.
  • the light emitted by the light-emitting elements is not limited to the combination of green light and blue light.
  • the light-emitting and light-receiving element can be an element that emits green or blue light and receives light having a different wavelength from light emitted from itself.
  • the light-emitting and light-receiving element 213 R serves as both a light-emitting element and a light-receiving element as described above, whereby the number of elements provided in one pixel can be reduced. Thus, higher definition, a higher aperture ratio, higher resolution, and the like can be easily achieved.
  • FIG. 15 B to FIG. 15 I illustrate examples of a pixel that can be used in the display panel 200 B.
  • FIG. 15 B illustrates an example in which the light-emitting and light-receiving element 213 R, the light-emitting element 211 G, and the light-emitting element 211 B are arranged in one column.
  • FIG. 15 C illustrates an example in which the light-emitting element 211 G and the light-emitting element 211 B are alternately arranged in the vertical direction and the light-emitting and light-receiving element 213 R is provided alongside the light-emitting elements.
  • FIG. 15 D illustrates an example in which three light-emitting elements (the light-emitting element 211 G, the light-emitting element 211 B, and a light-emitting element 211 X) and one light-emitting and light-receiving element are arranged in a matrix of 2 × 2.
  • the light-emitting element 211 X is an element that emits light of a color other than R, G, and B.
  • the light of a color other than R, G, and B can be white (W) light, yellow (Y) light, cyan (C) light, magenta (M) light, infrared light (IR), ultraviolet light (UV), or the like.
  • the light-emitting and light-receiving element preferably has a function of detecting infrared light or a function of detecting both visible light and infrared light.
  • the wavelength of light detected by the light-emitting and light-receiving element can be determined depending on the application of a sensor.
  • FIG. 15 E illustrates two pixels. A region that includes three elements and is enclosed by a dotted line corresponds to one pixel.
  • Each of the pixels includes the light-emitting element 211 G, the light-emitting element 211 B, and the light-emitting and light-receiving element 213 R.
  • In one of the pixels, the light-emitting element 211 G is provided in the same row as the light-emitting and light-receiving element 213 R, and the light-emitting element 211 B is provided in the same column as the light-emitting and light-receiving element 213 R.
  • In the other pixel, the light-emitting element 211 G is provided in the same row as the light-emitting and light-receiving element 213 R, and the light-emitting element 211 B is provided in the same column as the light-emitting element 211 G.
  • In FIG. 15 E, the light-emitting and light-receiving element 213 R, the light-emitting element 211 G, and the light-emitting element 211 B are repeatedly arranged in both the odd-numbered rows and the even-numbered rows, and in each column, the elements arranged in the odd-numbered row and the even-numbered row emit light of different colors.
  • FIG. 15 F illustrates four pixels which employ PenTile arrangement; two adjacent pixels have different combinations of light-emitting elements or light-emitting and light-receiving elements that emit light of two different colors.
  • FIG. 15 F illustrates the top-surface shapes of the light-emitting elements or light-emitting and light-receiving elements.
  • the upper left pixel and the lower right pixel in FIG. 15 F each include the light-emitting and light-receiving element 213 R and the light-emitting element 211 G.
  • the upper right pixel and the lower left pixel each include the light-emitting element 211 G and the light-emitting element 211 B. That is, in the example illustrated in FIG. 15 F , the light-emitting element 211 G is provided in each pixel.
  • the top surface shape of the light-emitting elements and the light-emitting and light-receiving elements is not particularly limited and can be a circular shape, an elliptical shape, a polygonal shape, a polygonal shape with rounded corners, or the like.
  • FIG. 15 F and the like illustrate examples in which the top surface shapes of the light-emitting elements and the light-emitting and light-receiving elements are each a square tilted at approximately 45° (a diamond shape).
  • The top surface shape of the light-emitting elements and the light-emitting and light-receiving elements may vary depending on the color thereof, or the light-emitting elements and the light-emitting and light-receiving elements of some colors or every color may have the same top surface shape.
  • the sizes of light-emitting regions (or light-emitting and light-receiving regions) of the light-emitting elements and the light-emitting and light-receiving elements may vary depending on the color thereof, or the light-emitting elements and the light-emitting and light-receiving elements of some colors or every color may have light-emitting regions of the same size.
  • the light-emitting region of the light-emitting element 211 G provided in each pixel may have a smaller area than the light-emitting region (or the light-emitting and light-receiving region) of the other elements.
  • FIG. 15 G is a modification example of the pixel arrangement of FIG. 15 F . Specifically, the structure of FIG. 15 G is obtained by rotating the structure of FIG. 15 F by 45°. Although one pixel is regarded as including two elements in FIG. 15 F , one pixel can be regarded as being formed of four elements as illustrated in FIG. 15 G .
  • FIG. 15 H is a modification example of the pixel arrangement of FIG. 15 F .
  • the upper left pixel and the lower right pixel in FIG. 15 H each include the light-emitting and light-receiving element 213 R and the light-emitting element 211 G.
  • the upper right pixel and the lower left pixel each include the light-emitting and light-receiving element 213 R and the light-emitting element 211 B. That is, in the example illustrated in FIG. 15 H , the light-emitting and light-receiving element 213 R is provided in each pixel.
  • the structure illustrated in FIG. 15 H achieves higher-resolution image capturing than the structure illustrated in FIG. 15 F because of having the light-emitting and light-receiving element 213 R in each pixel. Thus, the accuracy of biometric authentication can be increased, for example.
  • FIG. 15 I shows a modification example of the pixel arrangement in FIG. 15 H , obtained by rotating the pixel arrangement in FIG. 15 H by 45°.
  • In FIG. 15 I, one pixel can be regarded as being formed of four elements (two light-emitting elements and two light-emitting and light-receiving elements).
  • One pixel including a plurality of light-emitting and light-receiving elements having a light-receiving function allows high-resolution image capturing. Accordingly, the accuracy of biometric authentication can be increased.
  • For example, the resolution of image capturing can be the square root of 2 times (approximately 1.4 times) the resolution of display.
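  • The factor of the square root of 2 can be illustrated from the capture-site density; the sketch below assumes the 45°-rotated arrangement of FIG. 15 H and FIG. 15 I in which each pixel contributes two light-emitting and light-receiving elements, and the symbols are introduced only for this illustration.

```latex
% Illustrative estimate (assumes two light-receiving sites per display pixel):
% p: display pixel pitch, n: capture-site density per unit area.
n = \frac{2}{p^{2}}
\;\Rightarrow\;
p_{\mathrm{capture}} = \frac{1}{\sqrt{n}} = \frac{p}{\sqrt{2}}
\;\Rightarrow\;
\frac{\text{capture resolution}}{\text{display resolution}}
  = \frac{p}{p_{\mathrm{capture}}} = \sqrt{2}
```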
  • a display device that employs the structure illustrated in FIG. 15 H or FIG. 15 I includes p (p is an integer greater than or equal to 2) first light-emitting elements, q (q is an integer greater than or equal to 2) second light-emitting elements, and r (r is an integer greater than p and q) light-emitting and light-receiving elements.
  • Either the first light-emitting elements or the second light-emitting elements emit green light, and the other light-emitting elements emit blue light.
  • the light-emitting and light-receiving elements emit red light and have a light-receiving function.
  • In the case where touch operation is detected with the light-emitting and light-receiving elements, for example, it is preferable that light emitted from a light source be hard for a user to recognize. Since blue light has lower visibility than green light, light-emitting elements that emit blue light are preferably used as a light source. Accordingly, the light-emitting and light-receiving elements preferably have a function of receiving blue light. Note that, without limitation to the above, light-emitting elements used as a light source can be selected as appropriate depending on the sensitivity of the light-emitting and light-receiving elements.
  • the display device of this embodiment can employ any of various types of pixel arrangements.
  • the display device of one embodiment of the present invention can have any of the following structures: a top-emission structure in which light is emitted in a direction opposite to the substrate where the light-emitting elements are formed, a bottom-emission structure in which light is emitted toward the substrate where the light-emitting elements are formed, and a dual-emission structure in which light is emitted toward both surfaces.
  • a top-emission display device is described as an example.
  • a display device 280 A illustrated in FIG. 16 A includes a light-receiving element 270 PD, a light-emitting element 270 R that emits red (R) light, a light-emitting element 270 G that emits green (G) light, and a light-emitting element 270 B that emits blue (B) light.
  • Each of the light-emitting elements includes a pixel electrode 271 , a hole-injection layer 281 , a hole-transport layer 282 , a light-emitting layer, an electron-transport layer 284 , an electron-injection layer 285 , and a common electrode 275 , which are stacked in this order.
  • The light-emitting element 270 R includes a light-emitting layer 283 R, the light-emitting element 270 G includes a light-emitting layer 283 G, and the light-emitting element 270 B includes a light-emitting layer 283 B.
  • The light-emitting layer 283 R includes a light-emitting substance that emits red light, the light-emitting layer 283 G includes a light-emitting substance that emits green light, and the light-emitting layer 283 B includes a light-emitting substance that emits blue light.
  • the light-emitting elements are electroluminescent elements that emit light to the common electrode 275 side by voltage application between the pixel electrodes 271 and the common electrode 275 .
  • the light-receiving element 270 PD includes the pixel electrode 271 , the hole-injection layer 281 , the hole-transport layer 282 , an active layer 273 , the electron-transport layer 284 , the electron-injection layer 285 , and the common electrode 275 , which are stacked in this order.
  • the light-receiving element 270 PD is a photoelectric conversion element that receives light entering from the outside of the display device 280 A and converts it into an electric signal.
  • the pixel electrode 271 functions as an anode and the common electrode 275 functions as a cathode in both of the light-emitting element and the light-receiving element.
  • When the light-receiving element is driven by application of reverse bias between the pixel electrode 271 and the common electrode 275 , light incident on the light-receiving element can be detected, and charge can be generated and extracted as current.
  • an organic compound is used for the active layer 273 of the light-receiving element 270 PD.
  • the layers other than the active layer 273 can have structures in common with the layers in the light-emitting elements. Therefore, the light-receiving element 270 PD can be formed concurrently with the formation of the light-emitting elements only by adding a step of depositing the active layer 273 in the manufacturing process of the light-emitting elements.
  • the light-emitting elements and the light-receiving element 270 PD can be formed over one substrate. Accordingly, the light-receiving element 270 PD can be incorporated into the display device without a significant increase in the number of manufacturing steps.
  • the display device 280 A is an example in which the light-receiving element 270 PD and the light-emitting elements have a common structure except that the active layer 273 of the light-receiving element 270 PD and the light-emitting layers 283 of the light-emitting elements are separately formed.
  • the structures of the light-receiving element 270 PD and the light-emitting elements are not limited thereto.
  • the light-receiving element 270 PD and the light-emitting elements may include separately formed layers other than the active layer 273 and the light-emitting layers 283 .
  • the light-receiving element 270 PD and the light-emitting elements preferably include at least one layer used in common (common layer). Thus, the light-receiving element 270 PD can be incorporated into the display device without a significant increase in the number of manufacturing steps.
  • a conductive film that transmits visible light is used as the electrode through which light is extracted, which is either the pixel electrode 271 or the common electrode 275 .
  • a conductive film that reflects visible light is preferably used as the electrode through which light is not extracted.
  • The light-emitting elements included in the display device of this embodiment preferably employ a micro optical resonator (microcavity) structure.
  • one of the pair of electrodes of the light-emitting elements is preferably an electrode having properties of transmitting and reflecting visible light (a semi-transmissive and semi-reflective electrode), and the other is preferably an electrode having a property of reflecting visible light (a reflective electrode).
  • the light-emitting elements have a microcavity structure, light obtained from the light-emitting layers can be resonated between both of the electrodes, whereby light emitted from the light-emitting elements can be intensified.
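  • For reference, the standard resonance condition for such an optical resonator is sketched below; the symbols are generic assumptions (they are not defined in this document), and a practical optical design may also account for phase shifts on reflection at the electrodes.

```latex
% Standard microcavity resonance condition (generic illustration, not a design rule from the source):
% n: effective refractive index of the layers between the electrodes,
% L: cavity length, m: a positive integer, \lambda: wavelength to be intensified.
2 n L = m \lambda
```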
  • the semi-transmissive and semi-reflective electrode can have a stacked-layer structure of a reflective electrode and an electrode having a property of transmitting visible light (also referred to as a transparent electrode).
  • the transparent electrode has a light transmittance higher than or equal to 40%.
  • an electrode having a visible light (light with a wavelength greater than or equal to 400 nm and less than 750 nm) transmittance higher than or equal to 40% is preferably used in the light-emitting elements.
  • the semi-transmissive and semi-reflective electrode has a visible light reflectance of higher than or equal to 10% and lower than or equal to 95%, preferably higher than or equal to 30% and lower than or equal to 80%.
  • the reflective electrode has a visible light reflectance of higher than or equal to 40% and lower than or equal to 100%, preferably higher than or equal to 70% and lower than or equal to 100%.
  • These electrodes preferably have a resistivity lower than or equal to 1 × 10⁻² Ωcm. Note that in the case where any of the light-emitting elements emits near-infrared light (light with a wavelength greater than or equal to 750 nm and less than or equal to 1300 nm), the near-infrared light transmittance and reflectance of these electrodes preferably satisfy the above-described numerical ranges of the visible light transmittance and reflectance.
  • the light-emitting element includes at least the light-emitting layer 283 .
  • the light-emitting element may further include, as a layer other than the light-emitting layer 283 , a layer containing a substance with a high hole-injection property, a substance with a high hole-transport property, a hole-blocking material, a substance with a high electron-transport property, a substance with a high electron-injection property, an electron-blocking material, a substance with a bipolar property (a substance with a high electron- and hole-transport property), or the like.
  • the light-emitting elements and the light-receiving element can share at least one of the hole-injection layer, the hole-transport layer, the electron-transport layer, and the electron-injection layer. Furthermore, at least one of the hole-injection layer, the hole-transport layer, the electron-transport layer, and the electron-injection layer can be separately formed for the light-emitting elements and the light-receiving element.
  • the hole-injection layer is a layer injecting holes from an anode to the hole-transport layer, and a layer containing a material with a high hole-injection property.
  • As a material with a high hole-injection property, an aromatic amine compound or a composite material containing a hole-transport material and an acceptor material (electron-accepting material) can be used.
  • the hole-transport layer is a layer transporting holes, which are injected from the anode by the hole-injection layer, to the light-emitting layer.
  • In the light-receiving element, the hole-transport layer is a layer transporting holes, which are generated in the active layer on the basis of incident light, to the anode.
  • the hole-transport layer is a layer containing a hole-transport material.
  • As the hole-transport material, a substance having a hole mobility greater than or equal to 1 × 10⁻⁶ cm²/Vs is preferable. Note that other substances can also be used as long as they have a property of transporting more holes than electrons.
  • As the hole-transport material, materials having a high hole-transport property, such as a π-electron rich heteroaromatic compound (e.g., a carbazole derivative, a thiophene derivative, and a furan derivative) and an aromatic amine (a compound having an aromatic amine skeleton), are preferable.
  • the electron-transport layer is a layer transporting electrons, which are injected from the cathode by the electron-injection layer, to the light-emitting layer.
  • In the light-receiving element, the electron-transport layer is a layer transporting electrons, which are generated in the active layer on the basis of incident light, to the cathode.
  • the electron-transport layer is a layer containing an electron-transport material.
  • As the electron-transport material, a substance having an electron mobility greater than or equal to 1 × 10⁻⁶ cm²/Vs is preferable. Note that other substances can also be used as long as they have a property of transporting more electrons than holes.
  • As the electron-transport material, it is possible to use a material having a high electron-transport property, such as a metal complex having a quinoline skeleton, a metal complex having a benzoquinoline skeleton, a metal complex having an oxazole skeleton, a metal complex having a thiazole skeleton, an oxadiazole derivative, a triazole derivative, an imidazole derivative, an oxazole derivative, a thiazole derivative, a phenanthroline derivative, a quinoline derivative having a quinoline ligand, a benzoquinoline derivative, a quinoxaline derivative, a dibenzoquinoxaline derivative, a pyridine derivative, a bipyridine derivative, a pyrimidine derivative, or a π-electron deficient heteroaromatic compound such as a nitrogen-containing heteroaromatic compound.
  • the electron-injection layer is a layer injecting electrons from a cathode to the electron-transport layer, and a layer containing a material with a high electron-injection property.
  • As a material with a high electron-injection property, an alkali metal, an alkaline earth metal, or a compound thereof can be used.
  • a composite material containing an electron-transport material and a donor material can also be used.
  • the light-emitting layer 283 is a layer including a light-emitting substance.
  • the light-emitting layer 283 can include one or more kinds of light-emitting substances.
  • As the light-emitting substance, a substance that exhibits an emission color of blue, purple, bluish purple, green, yellowish green, yellow, orange, red, or the like is appropriately used.
  • a substance that emits near-infrared light can also be used.
  • Examples of the light-emitting substance include a fluorescent material, a phosphorescent material, a TADF material, and a quantum dot material.
  • Examples of the fluorescent material include a pyrene derivative, an anthracene derivative, a triphenylene derivative, a fluorene derivative, a carbazole derivative, a dibenzothiophene derivative, a dibenzofuran derivative, a dibenzoquinoxaline derivative, a quinoxaline derivative, a pyridine derivative, a pyrimidine derivative, a phenanthrene derivative, and a naphthalene derivative.
  • Examples of the phosphorescent material include an organometallic complex (particularly an iridium complex) having a 4H-triazole skeleton, a 1H-triazole skeleton, an imidazole skeleton, a pyrimidine skeleton, a pyrazine skeleton, or a pyridine skeleton; an organometallic complex (particularly an iridium complex) having a phenylpyridine derivative including an electron-withdrawing group as a ligand; a platinum complex; and a rare earth metal complex.
  • the light-emitting layer 283 may include one or more kinds of organic compounds (e.g., a host material and an assist material) in addition to the light-emitting substance (a guest material).
  • As the one or more kinds of organic compounds, the hole-transport material and the electron-transport material described above can be used.
  • a bipolar material or a TADF material may be used as one or more kinds of organic compounds.
  • the light-emitting layer 283 preferably includes a phosphorescent material and a combination of a hole-transport material and an electron-transport material that easily forms an exciplex.
  • Such a combination enables light emission utilizing ExTET (Exciplex-Triplet Energy Transfer), in which excitation energy is transferred from an exciplex to the light-emitting substance.
  • When a combination of materials is selected so as to form an exciplex that exhibits light emission whose wavelength overlaps the wavelength of a lowest-energy-side absorption band of the light-emitting substance, energy can be transferred smoothly and light emission can be obtained efficiently.
  • With this structure, high efficiency, low-voltage driving, and a long lifetime of the light-emitting element can be achieved at the same time.
  • the HOMO level (highest occupied molecular orbital level) of the hole-transport material is preferably higher than or equal to the HOMO level of the electron-transport material.
  • the LUMO level (lowest unoccupied molecular orbital level) of the hole-transport material is preferably higher than or equal to the LUMO level of the electron-transport material.
  • the LUMO levels and the HOMO levels of the materials can be derived from the electrochemical characteristics (reduction potentials and oxidation potentials) of the materials that are measured by cyclic voltammetry (CV).
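  • A commonly used conversion from such CV potentials to orbital energy levels, referenced to the ferrocene/ferrocenium couple, is sketched below; the 4.8 eV reference value is a widely used approximation and is not taken from this document.

```latex
% Common approximation (illustrative; not from the source):
% E_{ox}, E_{red}: oxidation and reduction potentials (in volts) vs. Fc/Fc+ measured by CV.
E_{\mathrm{HOMO}} \approx -\left(E_{\mathrm{ox}} + 4.8\right)\ \mathrm{eV}, \qquad
E_{\mathrm{LUMO}} \approx -\left(E_{\mathrm{red}} + 4.8\right)\ \mathrm{eV}
```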
  • The formation of an exciplex can be confirmed by a phenomenon in which the emission spectrum of a mixed film in which the hole-transport material and the electron-transport material are mixed is shifted to the longer wavelength side than the emission spectrum of each of the materials (or has another peak on the longer wavelength side), observed by comparison of the emission spectrum of the hole-transport material, the emission spectrum of the electron-transport material, and the emission spectrum of the mixed film of these materials, for example.
  • the formation of an exciplex can be confirmed by a difference in transient response, such as a phenomenon in which the transient photoluminescence (PL) lifetime of the mixed film has longer lifetime components or has a larger proportion of delayed components than that of each of the materials, observed by comparison of the transient PL of the hole-transport material, the transient PL of the electron-transport material, and the transient PL of the mixed film of these materials.
  • the transient PL can be rephrased as transient electroluminescence (EL). That is, the formation of an exciplex can also be confirmed by a difference in transient response observed by comparison of the transient EL of the hole-transport material, the transient EL of the electron-transport material, and the transient EL of the mixed film of these materials.
  • the active layer 273 includes a semiconductor.
  • Examples of the semiconductor include an inorganic semiconductor such as silicon and an organic semiconductor including an organic compound.
  • This embodiment shows an example in which an organic semiconductor is used as the semiconductor included in the active layer 273 .
  • the use of an organic semiconductor is preferable because the light-emitting layer 283 and the active layer 273 can be formed by the same method (e.g., a vacuum evaporation method) and thus the same manufacturing apparatus can be used.
  • Examples of an n-type semiconductor material contained in the active layer 273 are electron-accepting organic semiconductor materials such as fullerene (e.g., C60 and C70) and a fullerene derivative.
  • Fullerene has a soccer ball-like shape, which is energetically stable. Both the HOMO level and the LUMO level of fullerene are deep (low). Having a deep LUMO level, fullerene has an extremely high electron-accepting property (acceptor property). When π-electron conjugation (resonance) spreads in a plane as in benzene, the electron-donating property (donor property) usually increases.
  • C60 and C70 have a wide absorption band in the visible light region, and C70 is especially preferable because of having a larger π-electron conjugation system and a wider absorption band in the long wavelength region than C60.
  • Other examples of an n-type semiconductor material include a metal complex having a quinoline skeleton, a metal complex having a benzoquinoline skeleton, a metal complex having an oxazole skeleton, a metal complex having a thiazole skeleton, an oxadiazole derivative, a triazole derivative, an imidazole derivative, an oxazole derivative, a thiazole derivative, a phenanthroline derivative, a quinoline derivative, a benzoquinoline derivative, a quinoxaline derivative, a dibenzoquinoxaline derivative, a pyridine derivative, a bipyridine derivative, a pyrimidine derivative, a naphthalene derivative, an anthracene derivative, a coumarin derivative, a rhodamine derivative, a triazine derivative, and a quinone derivative.
  • Examples of a p-type semiconductor material contained in the active layer 273 include electron-donating organic semiconductor materials such as copper(II) phthalocyanine (CuPc), tetraphenyldibenzoperiflanthene (DBP), zinc phthalocyanine (ZnPc), tin phthalocyanine (SnPc), and quinacridone.
  • Examples of a p-type semiconductor material include a carbazole derivative, a thiophene derivative, a furan derivative, and a compound having an aromatic amine skeleton.
  • Other examples of the p-type semiconductor material include a naphthalene derivative, an anthracene derivative, a pyrene derivative, a triphenylene derivative, a fluorene derivative, a pyrrole derivative, a benzofuran derivative, a benzothiophene derivative, an indole derivative, a dibenzofuran derivative, a dibenzothiophene derivative, an indolocarbazole derivative, a porphyrin derivative, a phthalocyanine derivative, a naphthalocyanine derivative, a quinacridone derivative, a polyphenylene vinylene derivative, a polyparaphenylene derivative, a polyfluorene derivative, a polyvinylcarbazole derivative, and a polythiophene derivative.
  • the HOMO level of the electron-donating organic semiconductor material is preferably shallower (higher) than the HOMO level of the electron-accepting organic semiconductor material.
  • the LUMO level of the electron-donating organic semiconductor material is preferably shallower (higher) than the LUMO level of the electron-accepting organic semiconductor material.
  • Fullerene having a spherical shape is preferably used as the electron-accepting organic semiconductor material, and an organic semiconductor material having a substantially planar shape is preferably used as the electron-donating organic semiconductor material.
  • Molecules of similar shapes tend to aggregate, and aggregated molecules of similar kinds, which have molecular orbital energy levels close to each other, can improve the carrier-transport property.
  • the active layer 273 is preferably formed by co-evaporation of an n-type semiconductor and a p-type semiconductor.
  • the active layer 273 may be formed by stacking an n-type semiconductor and a p-type semiconductor.
  • Either a low molecular compound or a high molecular compound can be used for the light-emitting element and the light-receiving element, and an inorganic compound may also be contained.
  • Each of the layers included in the light-emitting element and the light-receiving element can be formed by an evaporation method (including a vacuum evaporation method), a transfer method, a printing method, an inkjet method, a coating method, or the like.
  • a display device 280 B illustrated in FIG. 16 B is different from the display device 280 A in that the light-receiving element 270 PD and the light-emitting element 270 R have the same structure.
  • the light-receiving element 270 PD and the light-emitting element 270 R share the active layer 273 and the light-emitting layer 283 R.
  • The light-receiving element 270 PD preferably has a structure in common with the light-emitting element that emits light with a wavelength longer than that of the light desired to be detected.
  • the light-receiving element 270 PD having a structure in which blue light is detected can have a structure which is similar to that of one or both of the light-emitting element 270 R and the light-emitting element 270 G.
  • the light-receiving element 270 PD having a structure in which green light is detected can have a structure similar to that of the light-emitting element 270 R.
  • With such a structure, the number of deposition steps and the number of masks can be smaller than those for the structure in which the light-receiving element 270 PD and the light-emitting element 270 R include separately formed layers. As a result, the number of manufacturing steps and the manufacturing cost of the display device can be reduced.
  • When the light-receiving element 270 PD and the light-emitting element 270 R have a common structure, a margin for misalignment can be narrower than that for the structure in which the light-receiving element 270 PD and the light-emitting element 270 R include separately formed layers. Accordingly, the aperture ratio of a pixel can be increased, so that the light extraction efficiency of the display device can be increased. This can extend the life of the light-emitting element. Furthermore, the display device can exhibit a high luminance. Moreover, the resolution of the display device can also be increased.
  • the light-emitting layer 283 R includes a light-emitting material that emits red light.
  • the active layer 273 includes an organic compound that absorbs light with a wavelength shorter than that of red light (e.g., one or both of green light and blue light).
  • the active layer 273 preferably includes an organic compound that does not easily absorb red light and that absorbs light with a wavelength shorter than that of red light. In this way, red light can be efficiently extracted from the light-emitting element 270 R, and the light-receiving element 270 PD can detect light with a wavelength shorter than that of red light at high accuracy.
  • Although the light-emitting element 270 R and the light-receiving element 270 PD have the same structure in the example of the display device 280 B, the light-emitting element 270 R and the light-receiving element 270 PD may include optical adjustment layers with different thicknesses.
  • a display device 280 C illustrated in FIG. 17 A and FIG. 17 B includes a light-emitting and light-receiving element 270 SR that emits red (R) light and has a light-receiving function, the light-emitting element 270 G, and the light-emitting element 270 B.
  • the above description of the display device 280 A and the like can be referred to for the structures of the light-emitting element 270 G and the light-emitting element 270 B.
  • the light-emitting and light-receiving element 270 SR includes the pixel electrode 271 , the hole-injection layer 281 , the hole-transport layer 282 , the active layer 273 , the light-emitting layer 283 R, the electron-transport layer 284 , the electron-injection layer 285 , and the common electrode 275 , which are stacked in this order.
  • the light-emitting and light-receiving element 270 SR has the same structure as the light-emitting element 270 R and the light-receiving element 270 PD in the display device 280 B.
  • FIG. 17 A shows a case where the light-emitting and light-receiving element 270 SR functions as a light-emitting element.
  • the light-emitting element 270 B emits blue light
  • the light-emitting element 270 G emits green light
  • the light-emitting and light-receiving element 270 SR emits red light.
  • FIG. 17 B illustrates a case where the light-emitting and light-receiving element 270 SR functions as a light-receiving element.
  • the light-emitting and light-receiving element 270 SR detects blue light emitted by the light-emitting element 270 B and green light emitted by the light-emitting element 270 G.
  • the light-emitting element 270 B, the light-emitting element 270 G, and the light-emitting and light-receiving element 270 SR each include the pixel electrode 271 and the common electrode 275 .
  • the case where the pixel electrode 271 functions as an anode and the common electrode 275 functions as a cathode is described as an example.
  • When the light-emitting and light-receiving element 270 SR is driven by application of reverse bias between the pixel electrode 271 and the common electrode 275 , light incident on the light-emitting and light-receiving element 270 SR can be detected, and charge can be generated and extracted as current.
  • the light-emitting and light-receiving element 270 SR has a structure in which the active layer 273 is added to the light-emitting element. That is, the light-emitting and light-receiving element 270 SR can be formed concurrently with the formation of the light-emitting element only by adding a step of depositing the active layer 273 in the manufacturing process of the light-emitting element.
  • the light-emitting element and the light-emitting and light-receiving element can be formed over one substrate.
  • the display portion can be provided with one or both of an image capturing function and a sensing function without a significant increase in the number of manufacturing steps.
  • FIG. 17 A and FIG. 17 B each illustrate an example in which the active layer 273 is provided over the hole-transport layer 282 , and the light-emitting layer 283 R is provided over the active layer 273 .
  • the stacking order of the light-emitting layer 283 R and the active layer 273 may be reversed.
  • the light-emitting and light-receiving element may exclude at least one layer of the hole-injection layer 281 , the hole-transport layer 282 , the electron-transport layer 284 , and the electron-injection layer 285 . Furthermore, the light-emitting and light-receiving element may include another functional layer such as a hole-blocking layer or an electron-blocking layer.
  • a conductive film that transmits visible light is used as the electrode through which light is extracted.
  • a conductive film that reflects visible light is preferably used as the electrode through which light is not extracted.
  • the functions and materials of the layers constituting the light-emitting and light-receiving element are similar to those of the layers constituting the light-emitting elements and the light-receiving element and are not described in detail.
  • FIG. 17 C to FIG. 17 G illustrate examples of layered structures of light-emitting and light-receiving elements.
  • the light-emitting and light-receiving element illustrated in FIG. 17 C includes a first electrode 277 , the hole-injection layer 281 , the hole-transport layer 282 , the light-emitting layer 283 R, the active layer 273 , the electron-transport layer 284 , the electron-injection layer 285 , and a second electrode 278 .
  • FIG. 17 C illustrates an example in which the light-emitting layer 283 R is provided over the hole-transport layer 282 , and the active layer 273 is stacked over the light-emitting layer 283 R.
  • the active layer 273 and the light-emitting layer 283 R may be in contact with each other.
  • a buffer layer is preferably provided between the active layer 273 and the light-emitting layer 283 R.
  • the buffer layer preferably has a hole-transport property and an electron-transport property.
  • a substance with a bipolar property is preferably used for the buffer layer.
  • as the buffer layer, at least one layer of a hole-injection layer, a hole-transport layer, an electron-transport layer, an electron-injection layer, a hole-blocking layer, an electron-blocking layer, and the like can be used.
  • FIG. 17 D illustrates an example in which the hole-transport layer 282 is used as the buffer layer.
  • the buffer layer provided between the active layer 273 and the light-emitting layer 283 R can inhibit transfer of excitation energy from the light-emitting layer 283 R to the active layer 273 . Furthermore, the buffer layer can also be used to adjust the optical path length (cavity length) of the microcavity structure. Thus, high emission efficiency can be obtained from a light-emitting and light-receiving element including the buffer layer between the active layer 273 and the light-emitting layer 283 R.
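For reference, the cavity-length adjustment mentioned above follows the standard microcavity resonance condition from thin-film optics; the symbols below are textbook notation and are not taken from the patent text.

```latex
% Standard microcavity resonance condition (one common sign convention):
% the optical path between the reflective electrodes, including the buffer layer,
% is tuned so that the wavelength \lambda to be intensified satisfies
2\sum_i n_i d_i \;=\; \left(m - \frac{\varphi_1 + \varphi_2}{2\pi}\right)\lambda,
\qquad m = 1, 2, 3, \ldots
% n_i, d_i: refractive index and thickness of each layer between the electrodes
% \varphi_1, \varphi_2: reflection phase shifts at the two electrodes
```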
  • FIG. 17 E illustrates an example of a stacked-layer structure in which a hole-transport layer 282 - 1 , the active layer 273 , a hole-transport layer 282 - 2 , and the light-emitting layer 283 R are stacked in this order over the hole-injection layer 281 .
  • the hole-transport layer 282 - 2 functions as a buffer layer.
  • the hole-transport layer 282 - 1 and the hole-transport layer 282 - 2 may include the same material or different materials. Instead of the hole-transport layer 282 - 2 , any of the above layers that can be used as the buffer layer may be used.
  • the positions of the active layer 273 and the light-emitting layer 283 R may be interchanged.
  • the light-emitting and light-receiving element illustrated in FIG. 17 F is different from the light-emitting and light-receiving element illustrated in FIG. 17 A in not including the hole-transport layer 282 .
  • the light-emitting and light-receiving element may exclude at least one layer of the hole-injection layer 281 , the hole-transport layer 282 , the electron-transport layer 284 , and the electron-injection layer 285 .
  • the light-emitting and light-receiving element may include another functional layer such as a hole-blocking layer or an electron-blocking layer.
  • the light-emitting and light-receiving element illustrated in FIG. 17 G is different from the light-emitting and light-receiving element illustrated in FIG. 17 A in including a layer 289 serving as both a light-emitting layer and an active layer instead of including the active layer 273 and the light-emitting layer 283 R.
  • for the layer 289 , a layer containing three materials, which are an n-type semiconductor that can be used for the active layer 273 , a p-type semiconductor that can be used for the active layer 273 , and a light-emitting substance that can be used for the light-emitting layer 283 R, can be used, for example.
  • an absorption band on the lowest energy side of an absorption spectrum of a mixed material of the n-type semiconductor and the p-type semiconductor and a maximum peak of an emission spectrum (PL spectrum) of the light-emitting substance preferably do not overlap each other and are further preferably positioned sufficiently far apart from each other.
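As a purely illustrative reading of the condition above, the sketch below checks that the lowest-energy absorption band of the mixed n-type/p-type material and the PL peak of the light-emitting substance stay apart; the wavelength values are made-up placeholders, not measured data.

```python
# Illustrative check (hypothetical numbers): the lowest-energy absorption band of
# the n-type/p-type mixed material should not overlap the PL peak of the emitter.
def bands_are_separated(absorption_band_nm, pl_peak_nm, margin_nm=0.0):
    """absorption_band_nm is a (short_edge, long_edge) wavelength pair in nm."""
    short_edge, long_edge = absorption_band_nm
    return pl_peak_nm > long_edge + margin_nm or pl_peak_nm < short_edge - margin_nm

# Placeholder example: absorption band at 450-550 nm, red PL peak at 620 nm.
print(bands_are_separated((450.0, 550.0), 620.0))                 # True: no overlap
print(bands_are_separated((450.0, 550.0), 620.0, margin_nm=30.0)) # True: well apart
```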
  • a detailed structure of the display device of one embodiment of the present invention will be described below.
  • an example of the display device including light-receiving elements and light-emitting elements will be described.
  • FIG. 18 A illustrates a cross-sectional view of a display device 300 A.
  • the display device 300 A includes a substrate 351 , a substrate 352 , a light-receiving element 310 , and a light-emitting element 390 .
  • the light-emitting element 390 includes a pixel electrode 391 , a buffer layer 312 , a light-emitting layer 393 , a buffer layer 314 , and a common electrode 315 , which are stacked in this order.
  • the buffer layer 312 can include one or both of a hole-injection layer and a hole-transport layer.
  • the light-emitting layer 393 includes an organic compound.
  • the buffer layer 314 can include one or both of an electron-injection layer and an electron-transport layer.
  • the light-emitting element 390 has a function of emitting visible light 321 .
  • the display device 300 A may also include a light-emitting element having a function of emitting infrared light.
  • the light-receiving element 310 includes a pixel electrode 311 , the buffer layer 312 , an active layer 313 , the buffer layer 314 , and the common electrode 315 , which are stacked in this order.
  • the active layer 313 includes an organic compound.
  • the light-receiving element 310 has a function of detecting visible light. Note that the light-receiving element 310 may also have a function of detecting infrared light.
  • the buffer layer 312 , the buffer layer 314 , and the common electrode 315 are common layers shared by the light-emitting element 390 and the light-receiving element 310 and provided across them.
  • the buffer layer 312 , the buffer layer 314 , and the common electrode 315 each include a portion overlapping with the active layer 313 and the pixel electrode 311 , a portion overlapping with the light-emitting layer 393 and the pixel electrode 391 , and a portion overlapping with none of them.
  • the pixel electrode functions as an anode and the common electrode 315 functions as a cathode in both of the light-emitting element 390 and the light-receiving element 310 .
  • in the display device 300 A, the light-receiving element 310 is driven by application of a reverse bias between the pixel electrode 311 and the common electrode 315 , so that light incident on the light-receiving element 310 can be detected and charge can be generated and extracted as current.
  • the pixel electrode 311 , the pixel electrode 391 , the buffer layer 312 , the active layer 313 , the buffer layer 314 , the light-emitting layer 393 , and the common electrode 315 may each have a single-layer structure or a stacked-layer structure.
  • the pixel electrode 311 and the pixel electrode 391 are each positioned over an insulating layer 414 .
  • the pixel electrodes can be formed using the same material in the same step.
  • An end portion of the pixel electrode 311 and an end portion of the pixel electrode 391 are covered with a partition 416 .
  • Two adjacent pixel electrodes are electrically insulated (electrically isolated) from each other by the partition 416 .
  • An organic insulating film is suitable for the partition 416 .
  • materials that can be used for the organic insulating film include an acrylic resin, a polyimide resin, an epoxy resin, a polyamide resin, a polyimide-amide resin, a siloxane resin, a benzocyclobutene-based resin, a phenol resin, and precursors of these resins.
  • the partition 416 is a layer that transmits visible light. A partition that blocks visible light may be provided instead of the partition 416 .
  • the common electrode 315 is a layer shared by the light-receiving element 310 and the light-emitting element 390 .
  • the material, thickness, and the like of the pair of electrodes can be the same between the light-receiving element 310 and the light-emitting element 390 . Accordingly, the manufacturing cost of the display device can be reduced, and the manufacturing process of the display device can be simplified.
  • the display device 300 A includes the light-receiving element 310 , the light-emitting element 390 , a transistor 331 , a transistor 332 , and the like between a pair of substrates (the substrate 351 and the substrate 352 ).
  • the buffer layer 312 , the active layer 313 , and the buffer layer 314 which are positioned between the pixel electrode 311 and the common electrode 315 , can each be referred to as an organic layer (a layer including an organic compound).
  • the pixel electrode 311 preferably has a function of reflecting visible light.
  • the common electrode 315 has a function of transmitting visible light. Note that in the case where the light-receiving element 310 is configured to detect infrared light, the common electrode 315 has a function of transmitting infrared light. Furthermore, the pixel electrode 311 preferably has a function of reflecting infrared light.
  • the light-receiving element 310 has a function of detecting light. Specifically, the light-receiving element 310 is a photoelectric conversion element that receives light 322 incident from the outside of the display device 300 A and converts it into an electric signal. The light 322 can also be expressed as light that is emitted from the light-emitting element 390 and then reflected by an object. The light 322 may be incident on the light-receiving element 310 through a lens or the like provided in the display device 300 A.
  • the buffer layer 312 , the light-emitting layer 393 , and the buffer layer 314 which are positioned between the pixel electrode 391 and the common electrode 315 , can be collectively referred to as an EL layer.
  • the EL layer includes at least the light-emitting layer 393 .
  • the pixel electrode 391 preferably has a function of reflecting visible light.
  • the common electrode 315 has a function of transmitting visible light. Note that in the case where the display device 300 A includes a light-emitting element that emits infrared light, the common electrode 315 has a function of transmitting infrared light.
  • the pixel electrode 391 preferably has a function of reflecting infrared light.
  • the light-emitting elements included in the display device of this embodiment preferably employ a micro optical resonator (microcavity) structure.
  • the light-emitting element 390 may include an optical adjustment layer between the pixel electrode 391 and the common electrode 315 .
  • the use of the micro resonator structure enables light of a specific color to be intensified and extracted from each of the light-emitting elements.
  • the light-emitting element 390 has a function of emitting visible light.
  • the light-emitting element 390 is an electroluminescent element that emits light (here, the visible light 321 ) to the substrate 352 side when voltage is applied between the pixel electrode 391 and the common electrode 315 .
  • the pixel electrode 311 included in the light-receiving element 310 is electrically connected to a source or a drain of the transistor 331 through an opening provided in the insulating layer 414 .
  • the pixel electrode 391 included in the light-emitting element 390 is electrically connected to a source or a drain of the transistor 332 through an opening provided in the insulating layer 414 .
  • the transistor 331 and the transistor 332 are on and in contact with the same layer (the substrate 351 in FIG. 18 A ).
  • At least part of a circuit electrically connected to the light-receiving element 310 and a circuit electrically connected to the light-emitting element 390 are preferably formed using the same material in the same step. In that case, the thickness of the display device can be reduced compared with the case where the two circuits are separately formed, resulting in simplification of the manufacturing process.
  • the light-receiving element 310 and the light-emitting element 390 are each preferably covered with a protective layer 395 .
  • the protective layer 395 is provided on and in contact with the common electrode 315 . Providing the protective layer 395 can inhibit entry of impurities such as water into the light-receiving element 310 and the light-emitting element 390 , so that the reliability of the light-receiving element 310 and the light-emitting element 390 can be increased.
  • the protective layer 395 and the substrate 352 are bonded to each other with an adhesive layer 342 .
  • a light-blocking layer 358 is provided on the surface of the substrate 352 on the substrate 351 side.
  • the light-blocking layer 358 has openings in a position overlapping with the light-emitting element 390 and in a position overlapping with the light-receiving element 310 .
  • the light-receiving element 310 detects light that is emitted from the light-emitting element 390 and then reflected by an object.
  • light emitted from the light-emitting element 390 is reflected inside the display device 300 A and is incident on the light-receiving element 310 without passing through an object in some cases.
  • the light-blocking layer 358 can reduce the influence of such stray light.
  • light 323 emitted from the light-emitting element 390 is reflected by the substrate 352 and reflected light 324 is incident on the light-receiving element 310 in some cases.
  • Providing the light-blocking layer 358 can inhibit the reflected light 324 from being incident on the light-receiving element 310 . Consequently, noise can be reduced, and the sensitivity of a sensor using the light-receiving element 310 can be increased.
  • for the light-blocking layer 358 , a material that blocks light emitted from the light-emitting element can be used.
  • the light-blocking layer 358 preferably absorbs visible light.
  • a black matrix can be formed using a metal material or a resin material containing pigment (e.g., carbon black) or dye, for example.
  • the light-blocking layer 358 may have a stacked-layer structure of a red color filter, a green color filter, and a blue color filter.
  • a display device 300 B illustrated in FIG. 18 B differs from the display device 300 A mainly in including a lens 349 .
  • the lens 349 is provided on a surface of the substrate 352 on the substrate 351 side.
  • the light 322 from the outside is incident on the light-receiving element 310 through the lens 349 .
  • a material that has high visible-light-transmitting property is preferably used for each of the lens 349 and the substrate 352 .
  • the range of light incident on the light-receiving element 310 can be narrowed.
  • overlap of imaging ranges between a plurality of light-receiving elements 310 can be inhibited, whereby a clear image with little blurring can be captured.
  • the lens 349 can condense incident light. Accordingly, the amount of light to be incident on the light-receiving element 310 can be increased. This can increase the photoelectric conversion efficiency of the light-receiving element 310 .
  • a display device 300 C illustrated in FIG. 18 C differs from the display device 300 A in the shape of the light-blocking layer 358 .
  • the light-blocking layer 358 is provided so that an opening portion overlapping with the light-receiving element 310 is positioned on an inner side of the light-receiving region of the light-receiving element 310 in a plan view.
  • overlap of imaging ranges between a plurality of light-receiving elements 310 can be inhibited, whereby a clear image with little blurring can be captured.
  • the area of the opening portion of the light-blocking layer 358 can be less than or equal to 80%, less than or equal to 70%, less than or equal to 60%, less than or equal to 50%, or less than or equal to 40% and greater than or equal to 1%, greater than or equal to 5%, or greater than or equal to 10% of the area of the light-receiving region of the light-receiving element 310 .
  • a clearer image can be obtained as the area of the opening portion of the light-blocking layer 358 becomes smaller.
  • if the area of the opening portion is too small, the amount of light reaching the light-receiving element 310 might be reduced, lowering the light sensitivity. Therefore, the area of the opening portion is preferably set within the above-described range; the above upper limits and lower limits can be combined freely.
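The stated percentage limits translate directly into an allowed opening-area window. The sketch below combines one lower limit (10%) and one upper limit (50%) from the ranges above; the light-receiving-region area is a made-up placeholder.

```python
# The opening of the light-blocking layer is sized as a fraction of the
# light-receiving region, within the limits quoted above (here 10% to 50%).
def opening_area_window(receiving_area_um2, lower_frac=0.10, upper_frac=0.50):
    return lower_frac * receiving_area_um2, upper_frac * receiving_area_um2

low, high = opening_area_window(400.0)   # hypothetical 400 um^2 receiving region
print(f"opening area between {low:.0f} um^2 and {high:.0f} um^2")  # 40 to 200 um^2
```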
  • the light-receiving region of the light-receiving element 310 can be referred to as the opening portion of the partition 416 .
  • the center of the opening portion of the light-blocking layer 358 overlapping with the light-receiving element 310 may be shifted from the center of the light-receiving region of the light-receiving element 310 in a plan view.
  • a structure in which the opening portion of the light-blocking layer 358 does not overlap with the light-receiving region of the light-receiving element 310 in a plan view may be employed.
  • only oblique light that has passed through the opening portion of the light-blocking layer 358 can be received by the light-receiving element 310 . Accordingly, the range of light incident on the light-receiving element 310 can be limited more effectively, so that a clear image can be captured.
  • a display device 300 D illustrated in FIG. 19 A differs from the display device 300 A mainly in that the buffer layer 312 is not a common layer.
  • the light-receiving element 310 includes the pixel electrode 311 , the buffer layer 312 , the active layer 313 , the buffer layer 314 , and the common electrode 315 .
  • the light-emitting element 390 includes the pixel electrode 391 , a buffer layer 392 , the light-emitting layer 393 , the buffer layer 314 , and the common electrode 315 .
  • Each of the active layer 313 , the buffer layer 312 , the light-emitting layer 393 , and the buffer layer 392 has an island-shaped top surface.
  • the buffer layer 312 and the buffer layer 392 may contain different materials or the same material.
  • when the buffer layers are formed separately in the light-emitting element 390 and the light-receiving element 310 , the degree of freedom for selecting the materials of the buffer layers included in the light-emitting element 390 and the light-receiving element 310 is increased, which facilitates optimization.
  • the buffer layer 314 and the common electrode 315 are common layers, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • a display device 300 E illustrated in FIG. 19 B differs from the display device 300 A mainly in that the buffer layer 314 is not a common layer.
  • the light-receiving element 310 includes the pixel electrode 311 , the buffer layer 312 , the active layer 313 , the buffer layer 314 , and the common electrode 315 .
  • the light-emitting element 390 includes the pixel electrode 391 , the buffer layer 312 , the light-emitting layer 393 , a buffer layer 394 , and the common electrode 315 .
  • Each of the active layer 313 , the buffer layer 314 , the light-emitting layer 393 , and the buffer layer 394 has an island-shaped top surface.
  • the buffer layer 314 and the buffer layer 394 may include different materials or the same material.
  • when the buffer layers are formed separately in the light-emitting element 390 and the light-receiving element 310 , the degree of freedom for selecting the materials of the buffer layers included in the light-emitting element 390 and the light-receiving element 310 is increased, which facilitates optimization.
  • the buffer layer 312 and the common electrode 315 are common layers, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • a display device 300 F illustrated in FIG. 19 C differs from the display device 300 A mainly in that the buffer layer 312 and the buffer layer 314 are not common layers.
  • the light-receiving element 310 includes the pixel electrode 311 , the buffer layer 312 , the active layer 313 , the buffer layer 314 , and the common electrode 315 .
  • the light-emitting element 390 includes the pixel electrode 391 , the buffer layer 392 , the light-emitting layer 393 , the buffer layer 394 , and the common electrode 315 .
  • Each of the buffer layer 312 , the active layer 313 , the buffer layer 314 , the buffer layer 392 , the light-emitting layer 393 , and the buffer layer 394 has an island-shaped top surface.
  • the common electrode 315 is a common layer, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • a more detailed structure of the display device of one embodiment of the present invention will be described below.
  • an example of the display device including light-emitting and light-receiving elements and light-emitting elements will be described.
  • FIG. 20 A illustrates a cross-sectional view of a display device 300 G.
  • the display device 300 G includes a light-emitting and light-receiving element 390 SR, a light-emitting element 390 G, and a light-emitting element 390 B.
  • the light-emitting and light-receiving element 390 SR has a function of a light-emitting element that emits red light 321 R, and a function of a photoelectric conversion element that receives the light 322 .
  • the light-emitting element 390 G can emit green light 321 G.
  • the light-emitting element 390 B can emit blue light 321 B.
  • the light-emitting and light-receiving element 390 SR includes the pixel electrode 311 , the buffer layer 312 , the active layer 313 , a light-emitting layer 393 R, the buffer layer 314 , and the common electrode 315 .
  • the light-emitting element 390 G includes a pixel electrode 391 G, the buffer layer 312 , a light-emitting layer 393 G, the buffer layer 314 , and the common electrode 315 .
  • the light-emitting element 390 B includes a pixel electrode 391 B, the buffer layer 312 , a light-emitting layer 393 B, the buffer layer 314 , and the common electrode 315 .
  • the buffer layer 312 , the buffer layer 314 , and the common electrode 315 are common layers shared by the light-emitting and light-receiving element 390 SR, the light-emitting element 390 G, and the light-emitting element 390 B and provided across them.
  • Each of the active layer 313 , the light-emitting layer 393 R, the light-emitting layer 393 G, and the light-emitting layer 393 B has an island-shaped top surface. Note that although the stack body including the active layer 313 and the light-emitting layer 393 R, the light-emitting layer 393 G, and the light-emitting layer 393 B are provided separately from one another in the example illustrated in FIG. 20 A , two adjacent layers may include a region where they overlap with each other.
  • the display device 300 G can have a structure in which one or both of the buffer layer 312 and the buffer layer 314 are not used as common layers.
  • the pixel electrode 311 is electrically connected to one of the source and the drain of the transistor 331 .
  • the pixel electrode 391 G is electrically connected to one of a source and a drain of a transistor 332 G.
  • the pixel electrode 391 B is electrically connected to one of a source and a drain of a transistor 332 B.
  • a display device 300 H illustrated in FIG. 20 B differs from the display device 300 G mainly in the structure of the light-emitting and light-receiving element 390 SR.
  • the light-emitting and light-receiving element 390 SR includes a light-emitting and light-receiving layer 318 R instead of the active layer 313 and the light-emitting layer 393 R.
  • the light-emitting and light-receiving layer 318 R is a layer that has both a function of a light-emitting layer and a function of an active layer.
  • for the light-emitting and light-receiving layer 318 R, a layer including the above-described light-emitting substance, an n-type semiconductor, and a p-type semiconductor can be used.
  • with this structure, the manufacturing process can be simplified, facilitating cost reduction.
  • FIG. 21 illustrates a perspective view of a display device 400
  • FIG. 22 A illustrates a cross-sectional view of the display device 400 .
  • a substrate 353 and a substrate 354 are bonded to each other.
  • the substrate 354 is denoted by a dashed line.
  • the display device 400 includes a display portion 362 , a circuit 364 , a wiring 365 , and the like.
  • FIG. 21 illustrates an example in which the display device 400 is provided with an IC (integrated circuit) 373 and an FPC 372 .
  • the structure illustrated in FIG. 21 can also be regarded as a display module including the display device 400 , the IC, and the FPC.
  • a scan line driver circuit can be used as the circuit 364 .
  • the wiring 365 has a function of supplying a signal and power to the display portion 362 and the circuit 364 .
  • the signal and power are input to the wiring 365 from the outside through the FPC 372 or input to the wiring 365 from the IC 373 .
  • FIG. 21 illustrates an example in which the IC 373 is provided over the substrate 353 by a COG (Chip On Glass) method, a COF (Chip On Film) method, or the like.
  • An IC including a scan line driver circuit, a signal line driver circuit, or the like can be used as the IC 373 , for example.
  • the display device 400 and the display module are not necessarily provided with an IC.
  • the IC may be mounted on the FPC by a COF method or the like.
  • FIG. 22 A illustrates an example of cross-sections of part of a region including the FPC 372 , part of a region including the circuit 364 , part of a region including the display portion 362 , and part of a region including an end portion of the display device 400 illustrated in FIG. 21 .
  • the display device 400 illustrated in FIG. 22 A includes a transistor 408 , a transistor 409 , a transistor 410 , the light-emitting element 390 , the light-receiving element 310 , and the like between the substrate 353 and the substrate 354 .
  • the substrate 354 and the protective layer 395 are bonded to each other with the adhesive layer 342 , and a solid sealing structure is used for the display device 400 .
  • the substrate 353 and an insulating layer 412 are bonded to each other with an adhesive layer 355 .
  • in fabricating the display device 400 , a formation substrate provided with the insulating layer 412 , the transistors, the light-receiving element 310 , the light-emitting element 390 , and the like is first bonded to the substrate 354 provided with the light-blocking layer 358 and the like with the adhesive layer 342 .
  • then, the substrate 353 is attached to a surface exposed by separation of the formation substrate, whereby the components formed over the formation substrate are transferred onto the substrate 353 .
  • the substrate 353 and the substrate 354 preferably have flexibility. This can increase the flexibility of the display device 400 .
  • the light-emitting element 390 has a stacked-layer structure in which the pixel electrode 391 , the buffer layer 312 , the light-emitting layer 393 , the buffer layer 314 , and the common electrode 315 are stacked in this order from the insulating layer 414 side.
  • the pixel electrode 391 is electrically connected to one of a source and a drain of the transistor 408 through an opening provided in the insulating layer 414 .
  • the transistor 408 has a function of controlling a current flowing through the light-emitting element 390 .
  • the light-receiving element 310 has a stacked-layer structure in which the pixel electrode 311 , the buffer layer 312 , the active layer 313 , the buffer layer 314 , and the common electrode 315 are stacked in this order from the insulating layer 414 side.
  • the pixel electrode 311 is connected to one of a source and a drain of the transistor 409 through an opening provided in the insulating layer 414 .
  • the transistor 409 has a function of controlling transfer of charge accumulated in the light-receiving element 310 .
  • Light emitted by the light-emitting element 390 is emitted toward the substrate 354 side. Light is incident on the light-receiving element 310 through the substrate 354 and the adhesive layer 342 .
  • a material having a high visible-light-transmitting property is preferably used for the substrate 354 .
  • the pixel electrode 311 and the pixel electrode 391 can be formed using the same material in the same step.
  • the buffer layer 312 , the buffer layer 314 , and the common electrode 315 are shared by the light-receiving element 310 and the light-emitting element 390 .
  • the light-receiving element 310 and the light-emitting element 390 can have common components except the active layer 313 and the light-emitting layer 393 .
  • the light-receiving element 310 can be incorporated in the display device 400 without a significant increase in the number of manufacturing steps.
  • the light-blocking layer 358 is provided on a surface of the substrate 354 on the substrate 353 side.
  • the light-blocking layer 358 includes openings in a position overlapping with the light-emitting element 390 and in a position overlapping with the light-receiving element 310 .
  • Providing the light-blocking layer 358 can control the range where the light-receiving element 310 detects light. As described above, it is preferable to control light to be incident on the light-receiving element 310 by adjusting the position and area of the opening of the light-blocking layer provided in the position overlapping with the light-receiving element 310 . Furthermore, with the light-blocking layer 358 , light can be inhibited from being incident on the light-receiving element 310 directly from the light-emitting element 390 without passing through an object. Hence, a sensor with less noise and high sensitivity can be obtained.
  • An end portion of the pixel electrode 311 and an end portion of the pixel electrode 391 are each covered with the partition 416 .
  • the pixel electrode 311 and the pixel electrode 391 each include a material that reflects visible light, and the common electrode 315 includes a material that transmits visible light.
  • a region where part of the active layer 313 overlaps with part of the light-emitting layer 393 is included in the example illustrated in FIG. 22 A .
  • the portion where the active layer 313 overlaps with the light-emitting layer 393 preferably overlaps with the light-blocking layer 358 and the partition 416 .
  • the transistor 408 , the transistor 409 , and the transistor 410 are formed over the substrate 353 . These transistors can be formed using the same materials in the same steps.
  • the insulating layer 412 , an insulating layer 411 , an insulating layer 425 , an insulating layer 415 , an insulating layer 418 , and the insulating layer 414 are provided in this order over the substrate 353 with the adhesive layer 355 therebetween.
  • Each of the insulating layer 411 and the insulating layer 425 partially functions as a gate insulating layer for the transistors.
  • the insulating layer 415 and the insulating layer 418 are provided to cover the transistors.
  • the insulating layer 414 is provided to cover the transistors and has a function of a planarization layer. Note that there is no limitation on the number of gate insulating layers and the number of insulating layers covering the transistors, and each insulating layer may have either a single layer or two or more layers.
  • a material into which impurities such as water or hydrogen do not easily diffuse is preferably used for at least one of the insulating layers that cover the transistors. This allows the insulating layer to serve as a barrier layer. Such a structure can effectively inhibit diffusion of impurities into the transistors from the outside and increase the reliability of the display device.
  • An inorganic insulating film is preferably used as each of the insulating layer 411 , the insulating layer 412 , the insulating layer 425 , the insulating layer 415 , and the insulating layer 418 .
  • a silicon nitride film, a silicon oxynitride film, a silicon oxide film, a silicon nitride oxide film, an aluminum oxide film, or an aluminum nitride film can be used, for example.
  • a hafnium oxide film, a hafnium oxynitride film, a hafnium nitride oxide film, an yttrium oxide film, a zirconium oxide film, a gallium oxide film, a tantalum oxide film, a magnesium oxide film, a lanthanum oxide film, a cerium oxide film, a neodymium oxide film, or the like may be used.
  • a stack including two or more of the above insulating films may also be used.
  • an organic insulating film often has a lower barrier property than an inorganic insulating film. Therefore, the organic insulating film preferably has an opening in the vicinity of an end portion of the display device 400 . In a region 428 illustrated in FIG. 22 A , an opening is formed in the insulating layer 414 . This can inhibit entry of impurities from the end portion of the display device 400 through the organic insulating film.
  • the organic insulating film may be formed so that an end portion of the organic insulating film is positioned on the inner side compared to the end portion of the display device 400 , to prevent the organic insulating film from being exposed at the end portion of the display device 400 .
  • the insulating layer 418 and the protective layer 395 are preferably in contact with each other through the opening in the insulating layer 414 .
  • the inorganic insulating film included in the insulating layer 418 and the inorganic insulating film included in the protective layer 395 are preferably in contact with each other.
  • An organic insulating film is suitable for the insulating layer 414 functioning as a planarization layer.
  • materials that can be used for the organic insulating film include an acrylic resin, a polyimide resin, an epoxy resin, a polyamide resin, a polyimide-amide resin, a siloxane resin, a benzocyclobutene-based resin, a phenol resin, and precursors of these resins.
  • Providing the protective layer 395 covering the light-emitting element 390 and the light-receiving element 310 can inhibit impurities such as water from entering the light-emitting element 390 and the light-receiving element 310 and increase the reliability of the light-emitting element 390 and the light-receiving element 310 .
  • the protective layer 395 may have a single-layer structure or a stacked-layer structure.
  • the protective layer 395 may have a stacked-layer structure of an organic insulating film and an inorganic insulating film. In that case, an end portion of the inorganic insulating film preferably extends beyond an end portion of the organic insulating film.
  • FIG. 22 B is a cross-sectional view of a transistor 401 a that can be used as the transistor 408 , the transistor 409 , and the transistor 410 .
  • the transistor 401 a is provided over the insulating layer 412 (not illustrated) and includes a conductive layer 421 functioning as a first gate, the insulating layer 411 functioning as a first gate insulating layer, a semiconductor layer 431 , the insulating layer 425 functioning as a second gate insulating layer, and a conductive layer 423 functioning as a second gate.
  • the insulating layer 411 is positioned between the conductive layer 421 and the semiconductor layer 431 .
  • the insulating layer 425 is positioned between the conductive layer 423 and the semiconductor layer 431 .
  • the semiconductor layer 431 includes a region 431 i and a pair of regions 431 n .
  • the region 431 i functions as a channel formation region.
  • One of the pair of regions 431 n serves as a source and the other thereof serves as a drain.
  • the regions 431 n have higher carrier concentration and higher conductivity than the region 431 i .
  • a conductive layer 422 a and a conductive layer 422 b are connected to the regions 431 n through openings provided in the insulating layer 418 and the insulating layer 415 .
  • FIG. 22 C is a cross-sectional view of a transistor 401 b that can be used as the transistor 408 , the transistor 409 , and the transistor 410 . Unlike in the transistor 401 a , the insulating layer 415 is not provided in the example illustrated in FIG. 22 C . In the transistor 401 b , the insulating layer 425 is processed in the same manner as the conductive layer 423 , and the insulating layer 418 is in contact with the regions 431 n.
  • there is no particular limitation on the structure of the transistors included in the display device of this embodiment.
  • a planar transistor, a staggered transistor, or an inverted staggered transistor can be used.
  • a top-gate or a bottom-gate transistor structure may be employed.
  • gates may be provided above and below a semiconductor layer in which a channel is formed.
  • the structure in which the semiconductor layer where a channel is formed is provided between two gates is used for the transistor 408 , the transistor 409 , and the transistor 410 .
  • the two gates may be connected to each other and supplied with the same signal to drive the transistor.
  • a potential for controlling the threshold voltage may be supplied to one of the two gates and a potential for driving may be supplied to the other to control the threshold voltage of the transistor.
  • there is no particular limitation on the crystallinity of a semiconductor material used for the transistors; any of an amorphous semiconductor, a single crystal semiconductor, and a semiconductor having crystallinity (a microcrystalline semiconductor, a polycrystalline semiconductor, or a semiconductor partly including crystal regions) may be used.
  • a semiconductor having crystallinity is preferably used, in which case deterioration of the transistor characteristics can be suppressed.
  • the semiconductor layer of the transistor preferably includes a metal oxide (also referred to as an oxide semiconductor).
  • the semiconductor layer of the transistor may include silicon. Examples of silicon include amorphous silicon and crystalline silicon (e.g., low-temperature polysilicon or single crystal silicon).
  • the semiconductor layer preferably includes indium, M (M is one or more kinds selected from gallium, aluminum, silicon, boron, yttrium, tin, copper, vanadium, beryllium, titanium, iron, nickel, germanium, zirconium, molybdenum, lanthanum, cerium, neodymium, hafnium, tantalum, tungsten, and magnesium), and zinc, for example.
  • M is preferably one or more kinds selected from aluminum, gallium, yttrium, and tin.
  • in particular, an oxide containing indium (In), gallium (Ga), and zinc (Zn) (also referred to as IGZO) is preferably used for the semiconductor layer.
  • the atomic ratio of In is preferably greater than or equal to the atomic ratio of M in the In-M-Zn oxide.
  • for example, the case where the atomic ratio of In is 4, the atomic ratio of Ga is greater than or equal to 1 and less than or equal to 3, and the atomic ratio of Zn is greater than or equal to 2 and less than or equal to 4 is included.
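To make the composition window above concrete, the sketch below checks an In:Ga:Zn atomic ratio against the two conditions just stated (In at least equal to M, and, for In normalized to 4, Ga in [1, 3] and Zn in [2, 4]). It is only an illustrative reading of those ranges, not a deposition recipe.

```python
# Illustrative check of the In-Ga-Zn atomic-ratio window described above.
def in_ga_zn_ratio_ok(ratio_in, ratio_ga, ratio_zn):
    scale = 4.0 / ratio_in                 # normalize so that In is expressed as 4
    ga, zn = ratio_ga * scale, ratio_zn * scale
    in_at_least_m = ratio_in >= ratio_ga   # atomic ratio of In >= atomic ratio of M
    return in_at_least_m and 1.0 <= ga <= 3.0 and 2.0 <= zn <= 4.0

print(in_ga_zn_ratio_ok(4, 2, 3))   # True: In:Ga:Zn = 4:2:3 lies in the window
print(in_ga_zn_ratio_ok(1, 3, 2))   # False: Ga exceeds In
```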
  • the transistor 410 included in the circuit 364 and the transistor 408 and the transistor 409 included in the display portion 362 may have the same structure or different structures.
  • a plurality of transistors included in the circuit 364 may have the same structure or two or more kinds of structures.
  • a plurality of transistors included in the display portion 362 may have the same structure or two or more kinds of structures.
  • a connection portion 404 is provided in a region of the substrate 353 that does not overlap with the substrate 354 .
  • the wiring 365 is electrically connected to the FPC 372 through a conductive layer 366 and a connection layer 442 .
  • the conductive layer 366 obtained by processing the same conductive film as the pixel electrode 311 and the pixel electrode 391 is exposed on a top surface of the connection portion 404 .
  • the connection portion 404 and the FPC 372 can be electrically connected to each other through the connection layer 442 .
  • optical members can be arranged on the outer side of the substrate 354 .
  • the optical members include a polarizing plate, a retardation plate, a light diffusion layer (a diffusion film or the like), an anti-reflective layer, and a light-condensing film.
  • an antistatic film preventing the attachment of dust, a water repellent film inhibiting the attachment of stain, a hard coat film inhibiting generation of a scratch caused by the use, a shock absorption layer, or the like may be placed on the outer side of the substrate 354 .
  • when a flexible material is used for each of the substrate 353 and the substrate 354 , the flexibility of the display device can be increased; however, the material is not limited thereto, and glass, quartz, ceramic, sapphire, resin, or the like can be used for each of the substrate 353 and the substrate 354 .
  • for each of the adhesive layers, a variety of curable adhesives, e.g., a photocurable adhesive such as an ultraviolet curable adhesive, a reactive curable adhesive, a thermosetting adhesive, and an anaerobic adhesive, can be used.
  • these adhesives include an epoxy resin, an acrylic resin, a silicone resin, a phenol resin, a polyimide resin, an imide resin, a PVC (polyvinyl chloride) resin, a PVB (polyvinyl butyral) resin, and an EVA (ethylene vinyl acetate) resin.
  • a material with low moisture permeability such as an epoxy resin, is preferred.
  • a two-component resin may be used.
  • An adhesive sheet or the like may be used.
  • for the connection layer 442 , an anisotropic conductive film (ACF), an anisotropic conductive paste (ACP), or the like can be used.
  • Examples of materials that can be used for a gate, a source, and a drain of a transistor and conductive layers such as a variety of wirings and electrodes included in a display device include metals such as aluminum, titanium, chromium, nickel, copper, yttrium, zirconium, molybdenum, silver, tantalum, or tungsten, and an alloy containing any of these metals as its main component.
  • a film containing any of these materials can be used in a single layer or as a stacked-layer structure.
  • as a conductive material that transmits light, a conductive oxide such as indium oxide, indium tin oxide, indium zinc oxide, zinc oxide, or zinc oxide containing gallium, or graphene can be used.
  • alternatively, a metal material such as gold, silver, platinum, magnesium, nickel, tungsten, chromium, molybdenum, iron, cobalt, copper, palladium, or titanium, or an alloy material containing any of these metal materials can be used.
  • alternatively, a nitride of the metal material (e.g., titanium nitride) or the like may be used.
  • in the case of using a metal material or an alloy material (or a nitride thereof), the thickness is preferably set small enough to be able to transmit light.
  • a stacked-layer film of any of the above materials can be used as a conductive layer.
  • a stacked-layer film of indium tin oxide and an alloy of silver and magnesium, or the like is preferably used for increased conductivity.
  • These materials can also be used for conductive layers such as a variety of wirings and electrodes that constitute a display device, or conductive layers (conductive layers functioning as a pixel electrode or a common electrode) included in a light-emitting element and a light-receiving element (or a light-emitting and light-receiving element).
  • examples of insulating materials that can be used for the insulating layers include a resin such as an acrylic resin or an epoxy resin, and an inorganic insulating material such as silicon oxide, silicon oxynitride, silicon nitride oxide, silicon nitride, or aluminum oxide.
  • a metal oxide (also referred to as an oxide semiconductor) that can be used in the OS transistor described in the above embodiment is described below.
  • the metal oxide preferably contains at least indium or zinc.
  • indium and zinc are preferably contained.
  • aluminum, gallium, yttrium, tin, or the like is preferably contained.
  • one or more kinds selected from boron, silicon, titanium, iron, nickel, germanium, zirconium, molybdenum, lanthanum, cerium, neodymium, hafnium, tantalum, tungsten, magnesium, cobalt, and the like may be contained.
  • the metal oxide can be formed by a sputtering method, a chemical vapor deposition (CVD) method such as a metal organic chemical vapor deposition (MOCVD) method, an atomic layer deposition (ALD) method, or the like.
  • Amorphous (including a completely amorphous structure), CAAC (c-axis-aligned crystalline), nc (nanocrystalline), CAC (cloud-aligned composite), single-crystal, and polycrystalline (poly crystal) structures can be given as examples of a crystal structure of an oxide semiconductor.
  • a crystal structure of a film or a substrate can be evaluated with an X-ray diffraction (XRD) spectrum.
  • evaluation is possible using an XRD spectrum which is obtained by GIXD (Grazing-Incidence XRD) measurement.
  • a GIXD method is also referred to as a thin film method or a Seemann-Bohlin method.
  • the XRD spectrum of the quartz glass substrate shows a peak with a substantially bilaterally symmetrical shape.
  • the peak of the XRD spectrum of the IGZO film having a crystal structure has a bilaterally asymmetrical shape.
  • the asymmetrical peak of the XRD spectrum clearly shows the existence of crystal in the film or the substrate. In other words, the crystal structure of the film or the substrate cannot be regarded as “amorphous” unless it has a bilaterally symmetrical peak in the XRD spectrum.
  • a crystal structure of a film or a substrate can also be evaluated with a diffraction pattern obtained by a nanobeam electron diffraction (NBED) method (such a pattern is also referred to as a nanobeam electron diffraction pattern).
  • a halo pattern is observed in the diffraction pattern of the quartz glass substrate, which indicates that the quartz glass substrate is in an amorphous state.
  • not a halo pattern but a spot-like pattern is observed in the diffraction pattern of the IGZO film deposited at room temperature.
  • the IGZO film deposited at room temperature is in an intermediate state, which is neither a crystal state nor an amorphous state, and it cannot be concluded that the IGZO film is in an amorphous state.
  • Oxide semiconductors might be classified in a manner different from the above-described one when classified in terms of the structure. Oxide semiconductors are classified into a single crystal oxide semiconductor and a non-single-crystal oxide semiconductor, for example. Examples of the non-single-crystal oxide semiconductor include the above-described CAAC-OS and nc-OS. Other examples of the non-single-crystal oxide semiconductor include a polycrystalline oxide semiconductor, an amorphous-like oxide semiconductor (a-like OS), and an amorphous oxide semiconductor.
  • the CAAC-OS, the nc-OS, and the a-like OS are described in detail below.
  • the CAAC-OS is an oxide semiconductor that has a plurality of crystal regions each of which has c-axis alignment in a particular direction.
  • the particular direction refers to the film thickness direction of a CAAC-OS film, the normal direction of the surface where the CAAC-OS film is formed, or the normal direction of the surface of the CAAC-OS film.
  • the crystal region refers to a region having a periodic atomic arrangement. When an atomic arrangement is regarded as a lattice arrangement, the crystal region also refers to a region with a uniform lattice arrangement.
  • the CAAC-OS has a region where a plurality of crystal regions are connected in the a-b plane direction, and the region has distortion in some cases.
  • distortion refers to a portion where the direction of a lattice arrangement changes between a region with a uniform lattice arrangement and another region with a uniform lattice arrangement in a region where a plurality of crystal regions are connected.
  • the CAAC-OS is an oxide semiconductor having c-axis alignment and having no clear alignment in the a-b plane direction.
  • each of the plurality of crystal regions is formed of one or more fine crystals (crystals each of which has a maximum diameter of less than 10 nm).
  • the maximum diameter of the crystal region is less than 10 nm.
  • the size of the crystal region may be approximately several tens of nanometers.
  • the CAAC-OS tends to have a layered crystal structure (also referred to as a layered structure) in which a layer containing indium (In) and oxygen (hereinafter, an In layer) and a layer containing the element M, zinc (Zn), and oxygen (hereinafter, an (M,Zn) layer) are stacked.
  • Indium and the element M can be replaced with each other. Therefore, indium may be contained in the (M,Zn) layer.
  • the element M may be contained in the In layer.
  • Zn may be contained in the In layer.
  • Such a layered structure is observed as a lattice image in a high-resolution TEM (Transmission Electron Microscope) image, for example.
  • a peak indicating c-axis alignment is detected at 2θ of 31° or around 31°.
  • the position of the peak indicating c-axis alignment may change depending on the kind, composition, or the like of the metal element contained in the CAAC-OS.
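As a hedged side calculation (the text does not state the X-ray wavelength), Bragg's law gives the interplanar spacing implied by a peak at 2θ ≈ 31°, assuming Cu Kα radiation with λ ≈ 1.54 Å:

```latex
% Bragg's law; Cu K-alpha radiation is an assumption, not stated in the text.
d = \frac{\lambda}{2\sin\theta}
  = \frac{1.54\ \text{\AA}}{2\sin 15.5^{\circ}}
  \approx 2.9\ \text{\AA}
```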
  • a plurality of bright spots are observed in the electron diffraction pattern of the CAAC-OS film. Note that one spot and another spot are observed point-symmetrically with a spot of the incident electron beam passing through a sample (also referred to as a direct spot) as the symmetric center.
  • a lattice arrangement in the crystal region is basically a hexagonal lattice arrangement; however, a unit lattice is not always a regular hexagon and is a non-regular hexagon in some cases.
  • a pentagonal lattice arrangement, a heptagonal lattice arrangement, and the like are included in the distortion in some cases.
  • a clear crystal grain boundary (grain boundary) cannot be observed even in the vicinity of the distortion in the CAAC-OS. That is, formation of a crystal grain boundary is inhibited by the distortion of lattice arrangement. This is probably because the CAAC-OS can tolerate distortion owing to a low density of arrangement of oxygen atoms in the a-b plane direction, an interatomic bond distance changed by substitution of a metal atom, and the like.
  • the CAAC-OS in which no clear crystal grain boundary is observed is one of crystalline oxides having a crystal structure suitable for a semiconductor layer of a transistor.
  • Zn is preferably contained to form the CAAC-OS.
  • an In—Zn oxide and an In—Ga—Zn oxide are suitable because they can inhibit generation of a crystal grain boundary as compared with an In oxide.
  • the CAAC-OS is an oxide semiconductor with high crystallinity in which no clear crystal grain boundary is observed. Thus, in the CAAC-OS, a reduction in electron mobility due to the crystal grain boundary is unlikely to occur. Moreover, since the crystallinity of an oxide semiconductor might be decreased by entry of impurities, formation of defects, or the like, the CAAC-OS can be regarded as an oxide semiconductor that has small amounts of impurities and defects (e.g., oxygen vacancies). Thus, an oxide semiconductor including the CAAC-OS is physically stable. Therefore, the oxide semiconductor including the CAAC-OS is resistant to heat and has high reliability. In addition, the CAAC-OS is stable with respect to high temperature in the manufacturing process (what is called thermal budget). Accordingly, the use of the CAAC-OS for the OS transistor can extend the degree of freedom of the manufacturing process.
  • in the nc-OS, a microscopic region (e.g., a region with a size greater than or equal to 1 nm and less than or equal to 10 nm, in particular, a region with a size greater than or equal to 1 nm and less than or equal to 3 nm) has a periodic atomic arrangement.
  • the nc-OS includes a fine crystal.
  • the size of the fine crystal is, for example, greater than or equal to 1 nm and less than or equal to 10 nm, particularly greater than or equal to 1 nm and less than or equal to 3 nm; thus, the fine crystal is also referred to as a nanocrystal.
  • the nc-OS cannot be distinguished from an a-like OS or an amorphous oxide semiconductor by some analysis methods. For example, when an nc-OS film is subjected to structural analysis by Out-of-plane XRD measurement with an XRD apparatus using θ/2θ scanning, a peak indicating crystallinity is not detected.
  • a diffraction pattern like a halo pattern is observed when the nc-OS film is subjected to electron diffraction (also referred to as selected-area electron diffraction) using an electron beam with a probe diameter larger than the diameter of a nanocrystal (e.g., larger than or equal to 50 nm).
  • electron diffraction also referred to as selected-area electron diffraction
  • a plurality of spots in a ring-like region with a direct spot as the center are observed in a nanobeam electron diffraction pattern of the nc-OS film obtained using an electron beam with a probe diameter nearly equal to or smaller than the diameter of a nanocrystal (e.g., 1 nm or larger and 30 nm or smaller).
  • the a-like OS is an oxide semiconductor having a structure between those of the nc-OS and the amorphous oxide semiconductor.
  • the a-like OS contains a void or a low-density region. That is, the a-like OS has lower crystallinity than the nc-OS and the CAAC-OS. Moreover, the a-like OS has a higher hydrogen concentration in the film than the nc-OS and the CAAC-OS.
  • CAC-OS relates to the material composition.
  • the CAC-OS refers to one composition of a material in which elements constituting a metal oxide are unevenly distributed with a size greater than or equal to 0.5 nm and less than or equal to 10 nm, preferably greater than or equal to 1 nm and less than or equal to 3 nm, or a similar size, for example.
  • a state in which one or more metal elements are unevenly distributed and regions including the metal element(s) are mixed with a size greater than or equal to 0.5 nm and less than or equal to 10 nm, preferably greater than or equal to 1 nm and less than or equal to 3 nm, or a similar size in a metal oxide is hereinafter referred to as a mosaic pattern or a patch-like pattern.
  • the CAC-OS has a composition in which materials are separated into a first region and a second region to form a mosaic pattern, and the first regions are distributed in the film (this composition is hereinafter also referred to as a cloud-like composition). That is, the CAC-OS is a composite metal oxide having a composition in which the first regions and the second regions are mixed.
  • the atomic ratios of In, Ga, and Zn to the metal elements contained in the CAC-OS in an In—Ga—Zn oxide are denoted by [In], [Ga], and [Zn], respectively.
  • the first region in the CAC-OS in the In—Ga—Zn oxide has [In] higher than that in the composition of the CAC-OS film.
  • the second region has [Ga] higher than that in the composition of the CAC-OS film.
  • the first region has higher [In] and lower [Ga] than the second region.
  • the second region has higher [Ga] and lower [In] than the first region.
  • the first region contains indium oxide, indium zinc oxide, or the like as its main component; that is, the first region can be referred to as a region containing In as its main component.
  • the second region contains gallium oxide, gallium zinc oxide, or the like as its main component; that is, the second region can be referred to as a region containing Ga as its main component.
  • In a material composition of a CAC-OS in an In—Ga—Zn oxide that contains In, Ga, Zn, and O, regions containing Ga as a main component are observed in part of the CAC-OS and regions containing In as a main component are observed in part thereof. These regions are randomly present to form a mosaic pattern.
  • the CAC-OS has a structure in which metal elements are unevenly distributed.
  • the CAC-OS can be formed by a sputtering method under a condition where a substrate is not heated, for example.
  • any one or more selected from an inert gas (typically, argon), an oxygen gas, and a nitrogen gas are used as a deposition gas.
  • the ratio of the flow rate of an oxygen gas to the total flow rate of the deposition gas at the time of deposition is preferably as low as possible; for example, it is preferably higher than or equal to 0% and less than 30%, further preferably higher than or equal to 0% and less than or equal to 10%.
  • the CAC-OS in the In—Ga—Zn oxide has a structure in which the region containing In as its main component (the first region) and the region containing Ga as its main component (the second region) are unevenly distributed and mixed.
  • the first region has a higher conductivity than the second region.
  • the first region contributes to the conductivity of the metal oxide. Accordingly, when the first regions are distributed in a metal oxide like a cloud, high field-effect mobility (μ) can be achieved.
  • the second region has a higher insulating property than the first region. In other words, when the second regions are distributed in a metal oxide, leakage current can be inhibited.
  • the CAC-OS can have a switching function (On/Off function). That is, the CAC-OS has a conducting function in part of the material and has an insulating function in another part of the material; as a whole, the CAC-OS has a function of a semiconductor. Separation of the conducting function and the insulating function can maximize each function. Accordingly, when the CAC-OS is used for a transistor, high on-state current (Ion), high field-effect mobility (μ), and excellent switching operation can be achieved.
  • a transistor using the CAC-OS has high reliability.
  • the CAC-OS is most suitable for a variety of semiconductor devices such as display devices.
  • An oxide semiconductor has various structures with different properties. Two or more kinds among the amorphous oxide semiconductor, the polycrystalline oxide semiconductor, the a-like OS, the CAC-OS, the nc-OS, and the CAAC-OS may be included in an oxide semiconductor of one embodiment of the present invention.
  • when the above oxide semiconductor is used for a transistor, a transistor with high field-effect mobility can be achieved. In addition, a transistor having high reliability can be achieved.
  • an oxide semiconductor having a low carrier concentration is preferably used in a transistor.
  • the carrier concentration of an oxide semiconductor is lower than or equal to 1×10^17 cm^-3, preferably lower than or equal to 1×10^15 cm^-3, further preferably lower than or equal to 1×10^13 cm^-3, still further preferably lower than or equal to 1×10^11 cm^-3, yet further preferably lower than 1×10^10 cm^-3, and higher than or equal to 1×10^-9 cm^-3.
  • the impurity concentration in the oxide semiconductor film is reduced so that the density of defect states can be reduced.
  • a state with a low impurity concentration and a low density of defect states is referred to as a highly purified intrinsic or substantially highly purified intrinsic state.
  • an oxide semiconductor having a low carrier concentration may be referred to as a highly purified intrinsic or substantially highly purified intrinsic oxide semiconductor.
  • a highly purified intrinsic or substantially highly purified intrinsic oxide semiconductor film has a low density of defect states and thus has a low density of trap states in some cases.
  • in order to obtain a highly purified intrinsic or substantially highly purified intrinsic state, reducing the impurity concentration in an oxide semiconductor is effective.
  • in order to reduce the impurity concentration in the oxide semiconductor, it is preferable that the impurity concentration in an adjacent film be also reduced.
  • impurities include hydrogen, nitrogen, an alkali metal, an alkaline earth metal, iron, nickel, and silicon.
  • the concentration of silicon or carbon in the oxide semiconductor and the concentration of silicon or carbon in the vicinity of an interface with the oxide semiconductor are each set lower than or equal to 2×10^18 atoms/cm^3, preferably lower than or equal to 2×10^17 atoms/cm^3.
  • when the oxide semiconductor contains an alkali metal or an alkaline earth metal, defect states are formed and carriers are generated in some cases.
  • a transistor using an oxide semiconductor that contains an alkali metal or an alkaline earth metal is likely to have normally-on characteristics.
  • the concentration of an alkali metal or an alkaline earth metal in the oxide semiconductor, which is obtained by SIMS, is set lower than or equal to 1×10^18 atoms/cm^3, preferably lower than or equal to 2×10^16 atoms/cm^3.
  • when the oxide semiconductor contains nitrogen, the oxide semiconductor easily becomes n-type by generation of electrons serving as carriers and an increase in carrier concentration.
  • a transistor using an oxide semiconductor containing nitrogen as a semiconductor is likely to have normally-on characteristics.
  • the concentration of nitrogen in the oxide semiconductor, which is obtained by SIMS, is set lower than 5×10^19 atoms/cm^3, preferably lower than or equal to 5×10^18 atoms/cm^3, further preferably lower than or equal to 1×10^18 atoms/cm^3, still further preferably lower than or equal to 5×10^17 atoms/cm^3.
  • Hydrogen contained in the oxide semiconductor reacts with oxygen bonded to a metal atom to be water, and thus forms an oxygen vacancy in some cases. Entry of hydrogen into the oxygen vacancy generates an electron serving as a carrier in some cases. Furthermore, bonding of part of hydrogen to oxygen bonded to a metal atom causes generation of an electron serving as a carrier in some cases. Thus, a transistor using an oxide semiconductor containing hydrogen is likely to have normally-on characteristics. Accordingly, hydrogen in the oxide semiconductor is preferably reduced as much as possible.
  • the hydrogen concentration in the oxide semiconductor, which is obtained by SIMS, is set lower than 1×10^20 atoms/cm^3, preferably lower than 1×10^19 atoms/cm^3, further preferably lower than 5×10^18 atoms/cm^3, still further preferably lower than 1×10^18 atoms/cm^3.
  • An electronic device of one embodiment of the present invention can perform image capturing and detect touch operation (contact or approach) in a display portion.
  • the electronic device can have improved functionality and convenience, for example.
  • Examples of the electronic devices of embodiments of the present invention include a digital camera, a digital video camera, a digital photo frame, a mobile phone, a portable game console, a portable information terminal, and an audio reproducing device, in addition to electronic devices with a relatively large screen, such as a television device, a desktop or laptop personal computer, a monitor of a computer or the like, digital signage, and a large game machine such as a pachinko machine.
  • the electronic device of one embodiment of the present invention may include a sensor (a sensor having a function of measuring force, displacement, position, speed, acceleration, angular velocity, rotational frequency, distance, light, liquid, magnetism, temperature, a chemical substance, sound, time, hardness, electric field, current, voltage, electric power, radiation, flow rate, humidity, gradient, oscillation, a smell, or infrared rays).
  • the electronic device of one embodiment of the present invention can have a variety of functions.
  • the electronic device can have a function of displaying a variety of data (a still image, a moving image, a text image, and the like) on the display portion, a touch panel function, a function of displaying a calendar, date, time, and the like, a function of executing a variety of software (programs), a wireless communication function, and a function of reading out a program or data stored in a recording medium.
  • An electronic device 6500 illustrated in FIG. 23 A is a portable information terminal that can be used as a smartphone.
  • the electronic device 6500 includes a housing 6501 , a display portion 6502 , a power button 6503 , buttons 6504 , a speaker 6505 , a microphone 6506 , a camera 6507 , a light source 6508 , and the like.
  • the display portion 6502 has a touch panel function.
  • the display device described in Embodiment 2 can be used in the display portion 6502 .
  • FIG. 23 B is a schematic cross-sectional view including an end portion of the housing 6501 on the microphone 6506 side.
  • a protection member 6510 having a light-transmitting property is provided on a display surface side of the housing 6501 , and a display panel 6511 , an optical member 6512 , a touch sensor panel 6513 , a printed circuit board 6517 , a battery 6518 , and the like are provided in a space surrounded by the housing 6501 and the protection member 6510 .
  • the display panel 6511 , the optical member 6512 , and the touch sensor panel 6513 are fixed to the protection member 6510 with an adhesive layer (not illustrated).
  • Part of the display panel 6511 is folded back in a region outside the display portion 6502 , and an FPC 6515 is connected to the part that is folded back.
  • An IC 6516 is mounted on the FPC 6515 .
  • the FPC 6515 is connected to a terminal provided on the printed circuit board 6517 .
  • a flexible display of one embodiment of the present invention can be used as the display panel 6511 .
  • an extremely lightweight electronic device can be provided. Since the display panel 6511 is extremely thin, the battery 6518 with high capacity can be mounted with the thickness of the electronic device controlled. An electronic device with a narrow frame can be obtained when part of the display panel 6511 is folded back so that the portion connected to the FPC 6515 is positioned on the rear side of a pixel portion.
  • Using the display device described in Embodiment 2 as the display panel 6511 allows image capturing on the display portion 6502 .
  • an image of a fingerprint is captured by the display panel 6511 ; thus, fingerprint authentication can be performed.
  • the display portion 6502 can have a touch panel function.
  • a touch panel of any of various types such as a capacitive type, a resistive type, a surface acoustic wave type, an infrared type, an optical type, and a pressure-sensitive type can be used for the touch sensor panel 6513.
  • the display panel 6511 may function as a touch sensor; in such a case, the touch sensor panel 6513 is not necessarily provided.
  • FIG. 24 A illustrates an example of a television device.
  • a display portion 7000 is incorporated in a housing 7101 .
  • a structure in which the housing 7101 is supported by a stand 7103 is illustrated.
  • the display device described in Embodiment 2 can be used in the display portion 7000 .
  • Operation of the television device 7100 illustrated in FIG. 24 A can be performed with an operation switch provided in the housing 7101 or a separate remote controller 7111 .
  • the display portion 7000 may include a touch sensor, and the television device 7100 may be operated by touch on the display portion 7000 with a finger or the like.
  • the remote controller 7111 may be provided with a display portion for displaying data output from the remote controller 7111 . With operation keys or a touch panel provided in the remote controller 7111 , channels and volume can be controlled and videos displayed on the display portion 7000 can be controlled.
  • the television device 7100 has a structure in which a receiver, a modem, and the like are provided.
  • a general television broadcast can be received with the receiver.
  • when the television device is connected to a communication network with or without wires via the modem, one-way (from a transmitter to a receiver) or two-way (between a transmitter and a receiver or between receivers, for example) data communication can be performed.
  • FIG. 24 B illustrates an example of a laptop personal computer.
  • a laptop personal computer 7200 includes a housing 7211 , a keyboard 7212 , a pointing device 7213 , an external connection port 7214 , and the like.
  • the display portion 7000 is incorporated in the housing 7211.
  • the display device described in Embodiment 2 can be used in the display portion 7000 .
  • FIG. 24 C and FIG. 24 D illustrate examples of digital signage.
  • Digital signage 7300 illustrated in FIG. 24 C includes a housing 7301 , the display portion 7000 , a speaker 7303 , and the like. Furthermore, the digital signage 7300 can include an LED lamp, an operation key (including a power switch or an operation switch), a connection terminal, a variety of sensors, a microphone, and the like.
  • FIG. 24 D illustrates digital signage 7400 attached to a cylindrical pillar 7401.
  • the digital signage 7400 includes the display portion 7000 provided along a curved surface of the pillar 7401 .
  • the display device described in Embodiment 2 can be used in the display portion 7000 .
  • a larger area of the display portion 7000 can increase the amount of data that can be provided at a time.
  • the larger display portion 7000 attracts more attention, so that the effectiveness of the advertisement can be increased, for example.
  • the use of a touch panel in the display portion 7000 is preferable because, in addition to display of a still image or a moving image on the display portion 7000, intuitive operation by a user is possible. Moreover, for an application for providing information such as route information or traffic information, usability can be enhanced by intuitive operation.
  • the digital signage 7300 or the digital signage 7400 can work with an information terminal 7311 or an information terminal 7411 such as a user's smartphone through wireless communication.
  • information of an advertisement displayed on the display portion 7000 can be displayed on a screen of the information terminal 7311 or the information terminal 7411 .
  • display on the display portion 7000 can be switched.
  • the digital signage 7300 or the digital signage 7400 can execute a game with the use of the screen of the information terminal 7311 or the information terminal 7411 as an operation means (controller).
  • an unspecified number of users can join in and enjoy the game concurrently.
  • Electronic devices illustrated in FIG. 25 A to FIG. 25 F include a housing 9000 , a display portion 9001 , a speaker 9003 , an operation key 9005 (including a power switch or an operation switch), a connection terminal 9006 , a sensor 9007 (a sensor having a function of measuring force, displacement, position, speed, acceleration, angular velocity, rotational frequency, distance, light, liquid, magnetism, temperature, a chemical substance, sound, time, hardness, electric field, current, voltage, electric power, radiation, flow rate, humidity, gradient, oscillation, a smell, or infrared rays), a microphone 9008 , and the like.
  • the electronic devices illustrated in FIG. 25 A to FIG. 25 F have a variety of functions.
  • the electronic devices can have a function of displaying a variety of data (a still image, a moving image, a text image, and the like) on the display portion, a touch panel function, a function of displaying a calendar, date, time, and the like, a function of controlling processing with the use of a variety of software (programs), a wireless communication function, and a function of reading out and processing a program or data stored in a recording medium.
  • the functions of the electronic devices are not limited thereto, and the electronic devices can have a variety of functions.
  • the electronic devices may each include a plurality of display portions.
  • the electronic devices may each include a camera or the like and have a function of taking a still image or a moving image and storing the taken image in a recording medium (an external recording medium or a recording medium incorporated in the camera), a function of displaying the taken image on the display portion, or the like.
  • The details of the electronic devices illustrated in FIG. 25 A to FIG. 25 F are described below.
  • FIG. 25 A is a perspective view illustrating a portable information terminal 9101 .
  • the portable information terminal 9101 can be used as a smartphone.
  • the portable information terminal 9101 may be provided with the speaker 9003 , the connection terminal 9006 , the sensor 9007 , or the like.
  • the portable information terminal 9101 can display characters or image data on its plurality of surfaces.
  • FIG. 25 A illustrates an example where three icons 9050 are displayed. Information 9051 indicated by dashed rectangles can be displayed on another surface of the display portion 9001 .
  • Examples of the information 9051 include notification of reception of an e-mail, SNS, or an incoming call, the title and sender of an e-mail, SNS, or the like, the date, the time, remaining battery, and the reception strength of an antenna.
  • the icon 9050 or the like may be displayed in the position where the information 9051 is displayed.
  • FIG. 25 B is a perspective view illustrating a portable information terminal 9102 .
  • the portable information terminal 9102 has a function of displaying information on three or more surfaces of the display portion 9001 .
  • information 9052 , information 9053 , and information 9054 are displayed on different surfaces.
  • a user can check the information 9053 displayed in a position that can be observed from above the portable information terminal 9102, with the portable information terminal 9102 put in a breast pocket of his/her clothes. The user can see the display without taking out the portable information terminal 9102 from the pocket and decide whether to answer the call, for example.
  • FIG. 25 C is a perspective view illustrating a watch-type portable information terminal 9200 .
  • the display portion 9001 is provided with a curved display surface, and display can be performed along the curved display surface.
  • Mutual communication between the portable information terminal 9200 and, for example, a headset capable of wireless communication enables hands-free calling.
  • With the connection terminal 9006, the portable information terminal 9200 can perform mutual data transmission with another information terminal and charging. Note that the charging operation may be performed by wireless power feeding.
  • FIG. 25 D to FIG. 25 F are perspective views illustrating a foldable portable information terminal 9201 .
  • FIG. 25 D is a perspective view of an opened state of the portable information terminal 9201
  • FIG. 25 F is a perspective view of a folded state thereof
  • FIG. 25 E is a perspective view of a state in the middle of change from one of FIG. 25 D and FIG. 25 F to the other.
  • the portable information terminal 9201 is highly portable in the folded state and is highly browsable in the opened state because of a seamless large display region.
  • the display portion 9001 of the portable information terminal 9201 is supported by three housings 9000 joined by hinges 9055 .
  • the display portion 9001 can be folded with a radius of curvature greater than or equal to 0.1 mm and less than or equal to 150 mm.
  • A1-A5: coordinate
  • B1-B5: coordinate
  • 10, 20: device
  • 11: control portion
  • 12: display portion
  • 21, 22, 23: detection portion
  • 50: arrow
  • 100, 100a-100k: object

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device capable of three-dimensional operation is provided. An electronic device capable of performing various types of actuation by simple operation is provided. A display device includes a control portion, a display portion, and a detection portion. The display portion includes a screen displaying an image, and the detection portion has a function of obtaining positional information of a detection target that is in contact with the screen or approaches the screen from above and outputting the information to the control portion. The control portion has a function of executing first processing based on first operation, a function of executing second processing when second operation is successively performed after the first operation, and a function of executing third processing when third operation is successively performed after the second operation. The first operation is operation in which two pointed positions are detected on the screen, the second operation is operation in which the two pointed positions move to reduce the distance therebetween, and the third operation is operation in which the two pointed positions move in the normal direction with respect to the screen from the state where they are in contact with the screen.

Description

    TECHNICAL FIELD
  • One embodiment of the present invention relates to an electronic device. One embodiment of the present invention relates to a display device. One embodiment of the present invention relates to a program.
  • Note that one embodiment of the present invention is not limited to the above technical field. Examples of the technical field of one embodiment of the present invention include a semiconductor device, a display device, a light-emitting apparatus, a power storage device, a memory device, an electronic device, a lighting device, an input device (e.g., a touch sensor), an input/output device (e.g., a touch panel), a driving method thereof, and a manufacturing method thereof.
  • BACKGROUND ART
  • Most information terminal devices, for example, mobile phones such as smartphones and tablet information terminals, are provided with a function of executing various types of processing by simple operation. For example, a display includes a touch sensor that detects an object in contact with the display, and various movements are performed by a fingertip or the like touching a surface of the display, whereby it is easy to move the position of an object displayed on the display or zoom in/out on the object, for example.
  • Patent Document 1 discloses an electronic device including a touch sensor.
  • REFERENCE Patent Document
    • [Patent Document 1] Japanese Published Patent Application No. 2015-127951
    SUMMARY OF THE INVENTION Problems to be Solved by the Invention
  • In touch sensors widely used in information terminal devices, operation is limited to planar operation on the display.
  • An object of one embodiment of the present invention is to provide an electronic device capable of three-dimensional operation. Another object of one embodiment of the present invention is to provide an electronic device capable of performing various types of actuation by simple operation. Another object of one embodiment of the present invention is to provide an electronic device capable of intuitive operation. Another object of one embodiment of the present invention is to provide a novel electronic device.
  • Note that the description of these objects does not preclude the existence of other objects. One embodiment of the present invention does not need to achieve all the objects. Other objects can be derived from the description of the specification, the drawings, and the claims.
  • Means for Solving the Problems
  • One embodiment of the present invention is a display device including a control portion, a display portion, and a detection portion. The display portion includes a screen displaying an image. The detection portion has a function of obtaining information about contact on the screen or positional information of a detection target approaching the screen in the normal direction and outputting the information to the control portion. The control portion has a function of executing first processing when first operation is performed, a function of executing second processing when second operation is successively performed after the first operation, and a function of executing third processing when third operation is successively performed after the second operation. The first operation is operation in which two pointed positions in contact with the screen are detected, the second operation is operation in which the two pointed positions move to reduce the distance therebetween, and the third operation is operation in which the two pointed positions move in the normal direction with respect to the screen from the state where they are in contact with the screen.
  • The first processing is processing by which a selection region in the screen is determined, the second processing is processing by which an object positioned in the selection region is selected, and the third processing is processing by which the object is picked up.
  • In the above, the control portion can further have a function of executing fourth processing when fourth operation is performed after the third operation. The fourth operation is operation in which the two pointed positions come in contact with the screen. In the above, the control portion can further have a function of executing fifth processing when fifth operation is performed after the third operation. The fifth operation is operation in which the two pointed positions move to the height from the screen that exceeds a threshold value. In the above, the control portion can further have a function of executing sixth processing when sixth operation is performed after the third operation. The sixth operation is operation in which the two pointed positions move to make the distance therebetween large in the state where the height of the two pointed positions from the screen is less than the threshold value and the two pointed positions are not in contact with the screen.
  • In the above, furthermore, the control portion preferably has a function of executing seventh processing when seventh operation is successively performed after the third operation. The seventh operation is operation in which the two pointed positions move in a region where the height of the two pointed positions from the screen is less than the threshold value and the two pointed positions are not in contact with the screen.
  • The fourth processing is processing by which the selection of the object in the screen is canceled at the two pointed positions in contact with the screen. The fifth processing is processing by which the selection of the object is canceled at a two-dimensional position in the screen of the time when the height of the two pointed positions from the screen exceeds the threshold value or at the two pointed positions in contact with the screen in the third operation. The sixth processing is processing by which the selection of the object is canceled at a two-dimensional position in the screen of the time when the two pointed positions move to make the distance therebetween large or at the two pointed positions in contact with the screen in the third operation.
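  • As an illustrative aid only (not part of this disclosure), the correspondence between the first to seventh operations and the first to seventh processing described above can be summarized as a simple dispatch table; the operation and processing names below are placeholder identifiers chosen for readability, not terms used in the claims.

```python
# Hypothetical mapping from detected operations to the processing executed by
# the control portion, following the order described above. All identifiers
# are illustrative assumptions.
OPERATION_TO_PROCESSING = {
    "two_contacts_detected":             "determine_selection_region",  # first
    "contacts_move_closer":              "select_object_in_region",     # second
    "lift_in_normal_direction":          "pick_up_object",              # third
    "contacts_touch_screen_again":       "cancel_selection_put_down",   # fourth
    "height_exceeds_threshold":          "cancel_selection_and_drop",   # fifth
    "contacts_separate_below_threshold": "cancel_selection_and_drop",   # sixth
    "move_while_held_below_threshold":   "move_picked_up_object",       # seventh
}
```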
  • In the above, the display portion includes a light-emitting element. The detection portion includes a photoelectric conversion element. The light-emitting element and the photoelectric conversion element are preferably provided on the same plane. The detection portion preferably includes a touch sensor of a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an electromagnetic type, or an optical type.
  • Another embodiment of the present invention is a display module including the above display device and a connector or an integrated circuit.
  • Another embodiment of the present invention is an electronic device including the above display module and at least one of an antenna, a battery, a housing, a camera, a speaker, a microphone, and an operation button.
  • Effect of the Invention
  • According to one embodiment of the present invention, an electronic device capable of detecting a three-dimensional movement can be provided. Alternatively, an electronic device capable of executing various types of processing by simple operation can be provided.
  • Alternatively, an electronic device capable of intuitive operation can be provided. Alternatively, a novel electronic device can be provided.
  • Note that the description of these effects does not preclude the existence of other effects. One embodiment of the present invention does not need to have all the effects. Other effects can be derived from the description of the specification, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A and FIG. 1B are diagrams each illustrating a structure example of a device.
  • FIG. 2A to FIG. 2C are diagrams illustrating movements of a finger.
  • FIG. 3A to FIG. 3C are diagrams illustrating methods for selecting an object.
  • FIG. 4A to FIG. 4C are diagrams illustrating methods for selecting an object.
  • FIG. 5A to FIG. 5C are diagrams illustrating detection of approach.
  • FIG. 6A and FIG. 6B are diagrams illustrating selection of an object.
  • FIG. 7A and FIG. 7B are diagrams illustrating moves of an object.
  • FIG. 8A and FIG. 8B are diagrams illustrating moves of an object.
  • FIG. 9A and FIG. 9B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 10A and FIG. 10B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 11A to FIG. 11C are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 12A and FIG. 12B are diagrams each illustrating an example of an application that can be applied to an electronic device.
  • FIG. 13A, FIG. 13B, and FIG. 13D are cross-sectional views each illustrating an example of a display device. FIG. 13C and FIG. 13E are diagrams each illustrating an example of an image captured by the display device. FIG. 13F to FIG. 13H are top views each illustrating an example of a pixel.
  • FIG. 14A is a cross-sectional view of a structure example of a display device. FIG. 14B to FIG. 14D are top views each illustrating an example of a pixel.
  • FIG. 15A is a cross-sectional view illustrating a structure example of a display device. FIG. 15B to FIG. 15I are top views each illustrating an example of a pixel.
  • FIG. 16A and FIG. 16B are diagrams each illustrating a structure example of a display device.
  • FIG. 17A to FIG. 17G are diagrams illustrating structure examples of display devices.
  • FIG. 18A to FIG. 18C are diagrams each illustrating a structure example of a display device.
  • FIG. 19A to FIG. 19C are diagrams each illustrating a structure example of a display device.
  • FIG. 20A and FIG. 20B are diagrams each illustrating a structure example of a display device.
  • FIG. 21 is a diagram illustrating a structure example of a display device.
  • FIG. 22A is a diagram illustrating a structure example of a display device. FIG. 22B and FIG. 22C are diagrams illustrating structure examples of transistors.
  • FIG. 23A and FIG. 23B are diagrams illustrating an example of an electronic device.
  • FIG. 24A to FIG. 24D are diagrams illustrating examples of electronic devices.
  • FIG. 25A to FIG. 25F are diagrams illustrating examples of electronic devices.
  • MODE FOR CARRYING OUT THE INVENTION
  • Embodiments are described in detail with reference to the drawings. Note that the present invention is not limited to the following description, and it will be readily appreciated by those skilled in the art that modes and details of the present invention can be modified in various ways without departing from the spirit and scope of the present invention. Thus, the present invention should not be construed as being limited to the description in the following embodiments.
  • Note that in structures of the invention described below, the same portions or portions having similar functions are denoted by the same reference numerals in different drawings, and the description thereof is not repeated. Furthermore, the same hatch pattern is used for the portions having similar functions, and the portions are not especially denoted by reference numerals in some cases.
  • The position, size, range, or the like of each component illustrated in drawings does not represent the actual position, size, range, or the like in some cases for easy understanding. Therefore, the disclosed invention is not necessarily limited to the position, size, range, or the like disclosed in the drawings.
  • Note that the term “film” and the term “layer” can be interchanged with each other depending on the case or circumstances. For example, the term “conductive layer” can be replaced with the term “conductive film”. As another example, the term “insulating film” can be replaced with the term “insulating layer”.
  • Embodiment 1
  • In this embodiment, a structure example and an actuation method of an electronic device of one embodiment of the present invention are described with reference to FIG. 1A to FIG. 12B.
  • Note that in a block diagram attached to this specification, components are classified according to their functions and shown as independent blocks; however, it is practically difficult to completely separate the components according to their functions, and one component may have a plurality of functions. Alternatively, a plurality of components may achieve one function.
  • The electronic device of one embodiment of the present invention can detect contact and approach of a detection target with/to a screen. That is, positional information (X,Y) that is a coordinate parallel to the screen and positional information (Z) that represents the height from the screen can each be detected. Accordingly, three-dimensional operation is possible; for example, an object displayed on a display can be displayed as if it is moved three-dimensionally.
  • [Structure Example of Electronic Device]
  • FIG. 1A illustrates a block diagram of a device 10 of one embodiment of the present invention. The device 10 includes a control portion 11 and a display portion 12. The display portion 12 includes a detection portion 21. The device 10 can be used as an electronic device such as an information terminal device, for example.
  • The display portion 12 has a function of displaying an image and a function of detecting contact and approach of a detection target with/to a screen. Here, contact means the state where the detection target is in contact with the screen, and approach means the state where the detection target is not in contact with the screen but is positioned near and above the screen in a sensing range of a sensor. Here, an example where the display portion 12 includes the detection portion 21 is illustrated. The detection portion 21 is a portion having, out of the above functions of the display portion 12, the function of detecting contact and approach of a detection target with/to a screen. The display portion 12 can also be referred to as a touch panel. For example, a display device described in detail in Embodiment 2 can be used for the display portion 12. In this manner, the device 10 can detect two kinds of information, i.e., contact and approach of the detection target with/to the screen with one detection portion 21, which is preferable because component cost and manufacturing cost of the device 10 can be reduced.
  • The detection portion 21 has a function of outputting, to the control portion 11, two-dimensional positional information (X,Y) on the screen about the detection target whose contact has been detected and three-dimensional positional information (X,Y,Z) on the screen about the detection target whose approach has been detected. Note that Z represents the distance (height) in the normal direction with respect to a detection surface (screen). The origin (reference point) of the positional information (X,Y) on the screen is at a given position, e.g., a corner or the center of the screen. In addition, the origin (reference point) of the coordinate Z is on the surface of the screen; that is, the height of 0 is the reference point.
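  • As a minimal sketch only (an assumption, not a structure prescribed by this disclosure), the information output by the detection portion can be modeled as a small record in which a contact event carries the two-dimensional position (X, Y) and an approach event additionally carries the height Z measured from the screen surface; all class and field names below are illustrative.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PointedPosition:
    """One pointed position reported to the control portion.

    x, y: coordinates on the screen plane; the origin (reference point)
          may be a corner or the center of the screen.
    z:    height in the normal direction with respect to the screen;
          0 corresponds to contact with the screen surface, and None
          indicates that only a two-dimensional (contact) reading exists.
    """
    x: float
    y: float
    z: Optional[float] = 0.0

    @property
    def in_contact(self) -> bool:
        # Contact is reported when the height is (treated as) zero.
        return self.z is not None and self.z <= 0.0
```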
  • Although FIG. 1A illustrates an example where the display portion 12 includes the detection portion 21, they may be provided separately. That is, the screen and an operation portion can be separated. In this case, examples of the detection portion 21 include a touch pad that does not have an image display function.
  • The control portion 11 can function as, for example, a central processing unit (CPU). The control portion 11 interprets and executes instructions from various programs with a processor to process various kinds of data and control programs. Furthermore, for example, the control portion 11 can control a movement of the object in the screen, a display change, and the like by processing a signal from the detection portion 21.
  • For the detection portion 21, a touch sensor capable of position detection in a contact and noncontact state can be used. For example, a touch sensor with various types such as a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an infrared type, an electromagnetic type, or an optical type can be used.
  • FIG. 1B illustrates a block diagram of a device 20 of one embodiment of the present invention. The device 20 includes the control portion 11 and the display portion 12. The display portion 12 includes a detection portion 22 and a detection portion 23. The device 20 can be used as an electronic device such as an information terminal device, for example.
  • The display portion 12 has a function of displaying an image and a function of detecting contact and approach of a detection target with/to a screen. Here, an example where the display portion 12 includes the detection portion 22 and the detection portion 23 is illustrated. The detection portion 22 is a portion having, out of the above functions of the display portion 12, the function of detecting contact of a detection target with a screen. Furthermore, the detection portion 23 is a portion having, out of the above functions of the display portion 12, the function of detecting approach of a detection target to a screen. The display portion 12 can also be referred to as a touch panel. For example, a display device described in detail in Embodiment 2 can be used for the display portion 12. In this manner, the device 20 includes two detection portions, the detection portion 22 that detects contact of the detection target with the screen and the detection portion 23 that detects approach of the detection target to the screen; whereby the contact and approach can be each detected with high accuracy and more accurate operation is possible, which is preferable.
  • The detection portion 22 has a function of obtaining two-dimensional positional information (X,Y) on the screen about the detection target that is in contact with the screen and outputting the information to the control portion 11. Furthermore, the detection portion 23 has a function of obtaining three-dimensional positional information (X,Y,Z) about the detection target that approaches the screen and outputting the information to the control portion 11. Note that Z represents the distance in the normal direction with respect to a detection surface (screen).
  • Although FIG. 1B illustrates an example where the display portion 12 includes the detection portion 22 and the detection portion 23, they may be provided separately. That is, the screen and an operation portion can be separated. In this case, examples of the detection portion 22 and the detection portion 23 include a touch pad that does not have an image display function.
  • For example, the control portion 11 can control a movement of the object in the screen, a display change, and the like by processing signals from the detection portion 22 and the detection portion 23.
  • For each of the detection portion 22 and the detection portion 23, a touch sensor capable of position detection in a contact and noncontact state may be used. For example, a touch sensor with various types such as a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an infrared type, an electromagnetic type, or an optical type can be used.
  • [Actuation Examples of Device 10 and Device 20]
  • Actuation examples of the device 10 and the device 20 are described below with reference to FIG. 2A to FIG. 8B. The device 10 can select an object (e.g., an icon) displayed on the screen by detecting contact and approach of a detection target by the detection portion 21, and move the selected object to a given position in the screen. The device 20 can select an object (e.g., an icon) displayed on the screen by detecting contact of a detection target by the detection portion 22 and detecting approach of the detection target by the detection portion 23, and move the selected object to a given position in the screen. Specifically, the object in the screen can be selected, and actuation of pinching, picking up, moving, and putting down the selected object can be executed.
  • In this embodiment, the actuation such as pinching, picking up, moving, and putting down the object refers to display processing in the screen of the display portion 12. For example, “pinching an object” refers to processing of displaying the object as if the object is pinched, “picking up” refers to processing of displaying the object as if the object is picked up, “moving” refers to processing of displaying the object as if the object moves in the screen, and “putting down” refers to processing of displaying the picked-up object as if the object is put down to the screen from above the screen.
  • [Pinching]
  • First, “pinching” actuation using two fingers is described. Here, an example where an index finger and a thumb are used as the two fingers is described; however, any two fingers can be used.
  • Furthermore, the device can sense coordinates of fingertips of two fingers, and the coordinates are each referred to as a pointed position in some cases. For example, in the case where the fingertips are in contact with the screen, the coordinates of the contact portions correspond to the pointed positions. In addition, in the case where the fingertips are not in contact with the screen, the coordinates of points at which the fingertips and the screen are the closest to each other or the coordinates of positions where the intensities of detecting the fingertips peak can be the pointed positions.
  • First, as illustrated in FIG. 2A, part of a fingertip of an index finger and part of a fingertip of a thumb are made to be in contact with a coordinate A1 and a coordinate B1 on the screen, respectively. Next, as illustrated in FIG. 2B, in the state where the fingertips of the two fingers are in contact with the screen, the fingertips are moved to positions of a coordinate A2 and a coordinate B2 such that the fingertips are close to each other. Although the coordinate A2 and the coordinate B2 remain apart from each other in FIG. 2B, the fingers may be moved to be in contact with each other as illustrated in FIG. 2C. In this case, the coordinate A2 and the coordinate B2 are in close contact with each other. Alternatively, the part of the fingertip of the index finger and the part of the fingertip of the thumb may be in contact with the coordinate A2 and the coordinate B2 from the beginning, respectively, without being in contact with the coordinate A1 and the coordinate B1. The above is the pinching actuation.
  • Note that the series of actuation illustrated in FIG. 2A to FIG. 2C cannot be distinguished from what is called pinch-in in some cases. Therefore, in the case where processing linked to pinch-in (e.g., zooming out on a screen) is separately set, it is preferable that when pinching actuation is performed, the input of pinch-in be temporarily inactivated. For example, an icon image linked to processing of temporarily turning the pinch-in function on or off may be displayed on the screen. Alternatively, the actuation illustrated in FIG. 2A to FIG. 2C may be distinguished from pinch-in by a movement of a fingertip made after the fingertips are held in the state of FIG. 2A for a certain time (also referred to as a long-tap), for example.
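  • One possible way to realize the disambiguation suggested above is to treat a two-point touch that is held for longer than a certain time (a long-tap) as the start of pinching actuation and to treat any other closing gesture as ordinary pinch-in; the sketch below is only an assumed implementation of this idea, and the hold time and all names are illustrative.

```python
from typing import Optional

PINCH_HOLD_SECONDS = 0.5  # assumed long-tap duration that switches gesture modes

class PinchDisambiguator:
    """Distinguishes pinching actuation (object pick-up) from ordinary pinch-in."""

    def __init__(self) -> None:
        self._two_point_start: Optional[float] = None

    def on_two_contacts(self, now: float) -> None:
        # Record the time at which two pointed positions were first detected.
        if self._two_point_start is None:
            self._two_point_start = now

    def on_distance_decreasing(self, now: float) -> str:
        # If the two contacts were held (long-tapped) before closing in,
        # interpret the gesture as pinching; otherwise as pinch-in (zoom out).
        if self._two_point_start is None:
            return "pinch_in"
        held = now - self._two_point_start
        return "pinching" if held >= PINCH_HOLD_SECONDS else "pinch_in"

    def reset(self) -> None:
        self._two_point_start = None
```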
  • [Selection of Object]
  • Here, a method for selecting an object is described. In FIG. 3A, a plurality of objects 100 displayed on the display portion 12 are illustrated as rectangles with rounded corners. A dashed-dotted rectangular frame is a rectangle in which the coordinate A1 and the coordinate B1 are diagonally opposite, and the object 100 at least partly included therein is selected. In FIG. 3A, a selected object is indicated by a solid line, and a non-selected object is indicated by a dotted line.
  • As illustrated in FIG. 3B, the object 100 completely included in the rectangle in which the coordinate A1 and the coordinate B1 are diagonally opposite may be selected. In FIG. 3B, the object 100 overlapping with the dashed-dotted line is not selected.
  • As illustrated in FIG. 3C, the object 100 overlapping with any of two lines which are a line connecting the coordinate A1 and the coordinate A2 and a line connecting the coordinate B1 and the coordinate B2 may be selected. That is, an object that is positioned on a path corresponding to a movement of a finger may be selected. In FIG. 3C, the object 100 that overlaps with an arrow connecting the coordinate A1 and the coordinate A2 and the object 100 that overlaps with an arrow connecting the coordinate B1 and the coordinate B2 are selected.
  • As illustrated in FIG. 4A, the object 100 at least partly included in a rectangle in which the coordinate A2 and the coordinate B2 after pinching actuation are diagonally opposite may be selected. In FIG. 4A, two objects 100 are selected. In such a manner, the area of a rectangle becomes small, and thus the objects to be selected can be narrowed down.
  • As illustrated in FIG. 4B, the object 100 completely included in the rectangle in which the coordinate A2 and the coordinate B2 after pinching actuation are diagonally opposite may be selected. In FIG. 4B, one object 100 is selected. In such a manner, the area of a rectangle becomes smaller, and thus an intended object can be selected accurately.
  • When the coordinate A2 and the coordinate B2 are in close contact with each other as illustrated in FIG. 4C, e.g., when an index finger and a thumb are in contact with each other, only the object 100 overlapping with both the coordinate A2 and the coordinate B2 may be selected. Accordingly, an intended object can be selected accurately. Alternatively, an intended object even with a small size can be selected accurately. Furthermore, also when the part of the index finger and the part of the thumb, without passing on the coordinate A1 and the coordinate B1, are in contact with the coordinate A2 and the coordinate B2 from the beginning, an object can be selected as in FIG. 4A to FIG. 4C. The above is the description of the method for selecting an object.
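  • The selection rules illustrated in FIG. 3A to FIG. 4C can be expressed as simple hit tests on a rectangle in which the two pointed positions are diagonally opposite; the sketch below assumes each object exposes an axis-aligned bounding box and shows the partly-included, completely-included, and both-points-overlap variants. It is an illustrative reading of the figures, not a normative implementation.

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple

Point = Tuple[float, float]

@dataclass
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float  # x0 <= x1 and y0 <= y1 are assumed

    def intersects(self, other: "Rect") -> bool:
        return not (self.x1 < other.x0 or other.x1 < self.x0 or
                    self.y1 < other.y0 or other.y1 < self.y0)

    def contains_rect(self, other: "Rect") -> bool:
        return (self.x0 <= other.x0 and other.x1 <= self.x1 and
                self.y0 <= other.y0 and other.y1 <= self.y1)

    def contains_point(self, p: Point) -> bool:
        return self.x0 <= p[0] <= self.x1 and self.y0 <= p[1] <= self.y1

def selection_rect(a: Point, b: Point) -> Rect:
    # Rectangle in which the pointed positions A and B are diagonally opposite.
    return Rect(min(a[0], b[0]), min(a[1], b[1]), max(a[0], b[0]), max(a[1], b[1]))

# Each object below is assumed to expose a `bounds` attribute of type Rect.
def select_partly_included(objects: Iterable, a: Point, b: Point) -> List:
    r = selection_rect(a, b)  # FIG. 3A / FIG. 4A
    return [o for o in objects if r.intersects(o.bounds)]

def select_completely_included(objects: Iterable, a: Point, b: Point) -> List:
    r = selection_rect(a, b)  # FIG. 3B / FIG. 4B
    return [o for o in objects if r.contains_rect(o.bounds)]

def select_under_both_points(objects: Iterable, a: Point, b: Point) -> List:
    # FIG. 4C: only an object overlapping both pointed positions is selected.
    return [o for o in objects
            if o.bounds.contains_point(a) and o.bounds.contains_point(b)]
```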
  • [Picking Up]
  • “Picking up” actuation after pinching is described. FIG. 5A to FIG. 5C are schematic cross-sectional views seen in the direction indicated by an arrow 50 in FIG. 2B. FIG. 5A is a diagram illustrating a state of pinching after the index finger in contact with the coordinate A1 and the thumb in contact with the coordinate B1 are moved to the positions of the coordinate A2 and the coordinate B2 such that the fingers are close to each other, and an object is selected.
  • Next, as illustrated in FIG. 5B, a movement in which the part of the index finger and the part of the thumb having been in contact with the screen are lifted up (in the normal direction) is performed. At this time, the part of the index finger and the part of the thumb become apart from the screen, whereby an object is displayed as if it is picked up. As the display of the picking up actuation, the pinched object is displayed as if it is lifted up (floats) in the screen. Furthermore, as the display of the picking up actuation, other display may be set. For example, as the display of the picking up actuation, the object may be displayed in a different color or a smaller size. Furthermore, the object may be displayed in a different shape.
  • The operation in which the part of the index finger and the part of the thumb become apart from the screen from the state where they are in contact with the screen can be detected when a detection position of a contact sensor (the detection portion 21 or the detection portion 22) in the screen disappears, for example. In the case where the device 10 or the device 20 can obtain three-dimensional positional information over the screen, it is preferable that the height of a detection target regarded as being in contact (referred to as a lower-limit threshold value Th1) be set in advance, and the detection target be regarded as becoming apart from the screen when the lower-limit threshold value Th1 is exceeded. More specifically, when the height H of the part of the index finger and the part of the thumb from the screen surface exceeds the lower-limit threshold value Th1, the object may be displayed as if it is picked up. In such a manner, when the lower-limit threshold value Th1 is set, unintentional picking up of the object can be reduced, which is preferable. That is, when the height H of the part of the index finger and the part of the thumb from the screen surface is more than or equal to the threshold value Th1 and less than or equal to a threshold value Th2, the object is displayed as if it is pinched and lifted up. The threshold value Th2 represents the upper detection limit in the Z direction.
  • As illustrated in FIG. 5C, when the height H of the part of the index finger and the part of the thumb from the screen surface exceeds the threshold value Th2, the object is displayed as if it is dropped. At this time, the object stays at the height H it had just before being dropped, and without rising any higher, the object is displayed as if it is dropped from that height H. Therefore, when the height H of the part of the index finger and the part of the thumb from the screen surface does not exceed the threshold value Th2, the display in which the object is in a picked-up state is maintained. Furthermore, the object can be moved on the screen while the state where the object is pinched is maintained.
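  • The behavior around the lower-limit threshold value Th1 and the upper detection limit Th2 described above can be summarized as a small height classifier; the numerical values of Th1 and Th2 below are assumptions for illustration, since the disclosure does not fix them.

```python
TH1 = 2.0   # assumed lower-limit height regarded as leaving the screen
TH2 = 30.0  # assumed upper detection limit in the Z direction

def classify_height(h: float) -> str:
    """Classify the height H of the pinching fingertips above the screen.

    h <= TH1:        treated as contact; the object stays pinched on the screen
    TH1 < h <= TH2:  the object is displayed as if it is picked up (lifted)
    h > TH2:         the object is displayed as if it is dropped from the
                     height it had just before exceeding TH2
    """
    if h <= TH1:
        return "contact"
    if h <= TH2:
        return "picked_up"
    return "dropped"
```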
  • [Putting Down (Dropping)]
  • Next, “putting down (dropping)” actuation is described. With a movement in which the part of the index finger and the part of the thumb are put down to be in contact with the screen as in FIG. 5A from a state of picking up the object, the object is displayed as if it is put down. Furthermore, when a movement is performed in which the part of the index finger and the part of the thumb become apart from each other as in FIG. 5B from a state of picking up to the height H, the object may be displayed as if it is dropped. In the case where the picked up object 100 is dropped when the height H exceeds the threshold value Th2, the object 100 may be displayed as if it is put down to the XY point of the time when the height H exceeds the threshold value Th2. Furthermore, when the picked up object 100 is dropped, it may be put down not to the XY point of the position but to the XY point where the object is originally picked up (A2 and B2). The above is the description of the actuation of picking up and putting down the object.
  • [Selection Cancellation of Object]
  • The method for canceling selection of an object is described. When the part of the index finger and the part of the thumb are put down to be in contact with the screen as in FIG. 5A so as to put down the object, the selection of the object is canceled. Furthermore, in the case where the object is picked up and the height H exceeds the threshold value Th2 or the case where the fingers pinching the object become apart from each other at the height H, the object is dropped and the selection of the object is canceled. That is, putting down and dropping the object makes it possible to cancel the selection of the object. The above is the description of the method for canceling selection of an object.
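  • Putting down, dropping, and the accompanying cancellation of selection can be combined into one small routine; the sketch below is an assumed reading of the behavior described above, and the choice between landing at the current XY point and returning to the original pick-up point is expressed as a policy flag because the text allows either behavior.

```python
def release_object(obj, fingers_xy, pickup_xy, height, separated,
                   th2=30.0, drop_to_origin=False):
    """Decide where a picked-up object lands and cancel its selection.

    obj:        the picked-up object (assumed to have .position and .selected)
    fingers_xy: current XY point of the two pointed positions
    pickup_xy:  XY point where the object was originally picked up (A2, B2)
    height:     current height H of the fingertips above the screen
    separated:  True if the two fingertips moved apart while above the screen
    """
    if height > th2 or separated:
        # Dropped: land either at the current XY point or back at the original
        # pick-up point, depending on the chosen policy.
        obj.position = pickup_xy if drop_to_origin else fingers_xy
    else:
        # Put down gently at the point where the fingers touch the screen.
        obj.position = fingers_xy
    obj.selected = False  # putting down or dropping cancels the selection
    return obj
```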
  • Next, an example of the series of actuation, 1) selecting an object, 2) picking up the object, 3) moving the object, and 4) putting down the object, is described with reference to FIG. 6A to FIG. 8B. Here, as the method for selecting the object, the method for selecting an intended object accurately shown in FIG. 4C is used. FIG. 6A to FIG. 8B are perspective views of the display portion 12 in the device 10 or the device 20.
  • As illustrated in FIG. 6A, two fingertips (not illustrated) are made in contact with the coordinate A1 and the coordinate B1 having the object 100 therebetween on the display portion 12. Next, as illustrated in FIG. 6B, the balls of the fingers are moved close to each other while being in contact with the display portion 12, so that the two fingertips are moved to the coordinate A2 and the coordinate B2. The above movements can be detected by the detection portion 21 and the detection portion 22 in the device 10 and the device 20, respectively. Accordingly, the object 100 can be selected, i.e., the object 100 can be pinched.
  • Next, as illustrated in FIG. 7A, a movement is performed in which the object 100 is picked up from the position of the coordinate A2 and the coordinate B2 to the position of a coordinate A3 and a coordinate B3 with the balls of the two fingers kept in contact with each other. This movement is detected by the detection portion 21 and the detection portion 23 in the device 10 and the device 20, respectively. Therefore, the object 100 can be picked up. Here, when the height to which the object 100 is picked up is H, the height H is more than the threshold value Th1 (not illustrated) and less than the threshold value Th2. When the height H exceeds the threshold value Th2, the picked-up object becomes apart from the two fingers and is dropped. Next, as illustrated in FIG. 7B, the two fingers are moved, with their balls kept in contact with each other, from the position of the coordinate A3 and the coordinate B3 to the position of a coordinate A4 and a coordinate B4. This movement is detected by the detection portion 21 and the detection portion 23 in the device 10 and the device 20, respectively. Therefore, the object 100 can be moved. Although the object is moved linearly in FIG. 7B, the movement is not limited thereto. The object 100 may be swung up and down, or left and right. However, when the height H of the pinched object 100 from the screen surface exceeds the threshold value Th2, the object is dropped.
  • Next, as illustrated in FIG. 8A, the two fingers are put down, with their balls kept in contact with each other, from the position of the coordinate A4 and the coordinate B4 to the position of a coordinate A5 and a coordinate B5 so as to be in contact with the surface of the display portion 12. This movement is detected by the detection portion 21 and the detection portion 22 in the device 10 and the device 20, respectively. Therefore, the object 100 can be put down. Furthermore, as illustrated in FIG. 8B, when the two fingers become apart from each other before being put down from the position of the coordinate A4 and the coordinate B4 to the position of the coordinate A5 and the coordinate B5, the object 100 is dropped. The movement in which the fingers become apart from each other at the height of the coordinate A4 and the coordinate B4 can be detected by the detection portion 21 and the detection portion 23 in the device 10 and the device 20, respectively. Therefore, the object 100 is dropped. As described above, the device 10 and the device 20 are each an electronic device capable of object operation with an intuitive movement of a finger in order to hold, lift up, move, and put down an object on a screen, for example.
  • Specific Application Examples
  • Examples of a specific application that can be applied to the electronic device of one embodiment of the present invention are described below.
  • Note that although examples of operation using a finger are described below, operation using a detection target other than a finger can be performed. As a detection target other than a finger, a writing material such as a stylus pen, a brush, a glass pen, or a quill pen can be used, for example. In the examples below, operation may be performed using a finger and the above detection target other than a finger or using two detection targets other than a finger. Alternatively, a detection target having two or more detection portions can be used. For example, equipment in which the distance between two tips changes, such as tweezers, scissors, or chopsticks, can be used for operation.
  • FIG. 9A illustrates an example where an object 100 a such as an icon displayed on the screen of the display portion 12 is moved to a given position. A user selects the object 100 a on the screen with the above pinching actuation and picks up the object. Then, the user can move the object 100 a such as an icon in the screen by putting down or dropping the object to a given position.
  • FIG. 9B illustrates an example where a destination, a starting point, or the like is specified in a map application. A pin-shaped object 100 b drawn by a solid line is the object displayed on the screen, and the pin-shaped object 100 b drawn by a dotted line is the picked up object. A user can accurately select the pin-shaped object 100 b displayed on the screen with the pinching actuation and intuitively put down the object to a given position on the screen.
  • Therefore, a point intended by the user can be easily set. In FIG. 9B, the pin-shaped object 100 b on the lower right of the screen is picked up and put down to the destination. Since the device 10 and the device 20 each have a function of detecting contact of a detection target, a setting of the destination, the starting point, or the like can be changed by bringing a finger into contact with the pin-shaped object 100 b.
  • FIG. 10A illustrates an example where a page is turned in an e-book reader application. With the pinching actuation, a user can perform page-turning actuation more naturally, as if he or she were turning a page of a real book. As illustrated in FIG. 10A, the user can turn up an object 100 c that is part of the page with actuation in which an edge portion of the screen is picked up. Furthermore, the page can be turned by actuation in which the fingers holding the picked-up page are moved toward the opposite page and are either separated from each other at that position or brought into contact with the page.
  • FIG. 10B illustrates an example where an object position is changed back and forth in editing software such as document creation software or presentation manuscript creation software. In the example of FIG. 10B, a circular object 100 d positioned behind a triangular object and a quadrangular object is moved to the foreground. In the case of a device detecting only contact, a plurality of contact actuations are needed in order to move an object positioned on the back side to the foreground; however, when the above pinching actuation is applied, the object position can be easily changed back and forth. When the circular object 100 d is picked up and then put down at the same position, the position of the circular object 100 d is changed only back and forth. Furthermore, in the case where, after the circular object 100 d is picked up, the fingers are moved and the object is put down, not only can the position of the circular object 100 d be changed back and forth, but the object can also be moved to a given position in the screen.
  • Examples of a game application to which the pinching actuation is applied are illustrated in FIG. 11A to FIG. 12A.
  • FIG. 11A is an example where the pinching actuation is applied to a plant growing game. In the game, a user can perform operations necessary for growing the plant, such as pulling out a weed object 100 e, giving the plant a water object 100 g, or sowing a seed object 100 f, with the pinching actuation. In FIG. 11A, containers corresponding to the weeds (Weeds), water (Water), and seeds (Seeds) are illustrated. Since a movement close to a real movement can be employed in the game, the user can enjoy the game more intuitively.
  • FIG. 11B and FIG. 11C are each an example where the pinching actuation is applied to a game to interact with an animal. As illustrated in FIG. 11B, a user can wave a stick-like toy object 100 i at an animal object 100 h or roll a ball-like toy object 100 j toward the animal, for example. With the pinching actuation, the user can move the object as intended and thus can find more pleasure in the game. Furthermore, as illustrated in FIG. 11C, display of the animal object 100 h being pinched and swung left and right is possible. The animal object 100 h is swung by the movement of the user's fingers, and the user can feel relaxed.
  • FIG. 12A is an example where the pinching actuation is applied to a game to pull out sticks stacked on each other, in which scroll and pinching actuation are combined. A given stacked surface is selected by scrolling, and a stick object 100 j to be pulled out is selected by pinching. After that, the stick object 100 j is picked up and then can be pulled out. In the game, the speed of picking up, the overall balance, and the like are processed; when a certain condition is exceeded, the stacked sticks collapse.
  • FIG. 12B illustrates an example where the pinching actuation is applied to application switching in an electronic device such as a smartphone or a tablet. When picking up is performed at a given position in the display portion 12, a list of the applications in activation in the electronic device is displayed. The fingers are turned while the picked-up state is held, whereby the active applications are selected one by one in order. A solid line represents a selected application object 100 k. When putting down is performed in this state, the selected application can be activated on the screen.
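  • The following is a minimal sketch, in Python, of how the rotation of the fingers while the picked-up state is held can be mapped to the selection of one of the active applications; the angle step and the function name are illustrative assumptions.

```python
ANGLE_STEP = 30.0  # [degrees] rotation that advances the selection by one application (assumed)


def selected_application(applications, angle_now, angle_at_pickup):
    """Return the application selected by turning the fingers in the picked-up state.

    applications:    list of the applications in activation
    angle_now:       current angle [degrees] of the line connecting the two fingertips
    angle_at_pickup: that angle at the moment the picking up was detected
    """
    if not applications:
        return None
    steps = int((angle_now - angle_at_pickup) // ANGLE_STEP)
    return applications[steps % len(applications)]


# Example: turning the fingers by 65 degrees selects the third application in the list.
print(selected_application(["Mail", "Browser", "Camera"], angle_now=65.0, angle_at_pickup=0.0))
```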
  • Other than the above, a remotely-connected electronic device to which the pinching actuation is applied can be used for remote treatment in the medical field. The device 10 and the device 20 each have a function of detecting contact with a screen; thus, when the screen on which an affected part is displayed is touched, an area to be treated can be selected, for example. Furthermore, in combination with the pinching actuation, movements such as pinching, picking up, and cutting the treatment area can be performed remotely. In the case of remote treatment, a robot arm can perform the actual treatment, for example.
  • Note that the examples of the applications described here can be written as programs, for example. For example, a program in which the processing method, detection method, operation method, actuation method, display method, or the like that is described above as an example and executed by the device 10 and the like is written can be stored in a non-transitory storage medium and can be read and executed by an arithmetic device or the like included in the control portion 11 of the device 10. That is, a program that makes hardware execute the processing method, detection method, operation method, actuation method, display method, or the like described above as an example and a non-transitory storage medium storing the program are embodiments of the present invention.
  • This embodiment can be combined with the other embodiments as appropriate. In addition, in this specification, in the case where a plurality of structure examples are described in one embodiment, the structure examples can be combined as appropriate.
  • Embodiment 2
  • In this embodiment, a light-emitting and light-receiving apparatus of one embodiment of the present invention is described. A display device exemplified below can be favorably used for a light-emitting and light-receiving portion of the electronic device described in Embodiment 1.
  • A light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus of one embodiment of the present invention includes a light-receiving element (also referred to as a light-receiving device) and a light-emitting element (also referred to as a light-emitting device). The light-emitting and light-receiving portion has a function of displaying an image with the use of the light-emitting element. Furthermore, the light-emitting and light-receiving portion has one or both of a function of capturing an image with the use of the light-receiving element and a sensing function. Thus, the light-emitting and light-receiving apparatus of one embodiment of the present invention can be expressed as a display device, and the light-emitting and light-receiving portion can be expressed as a display portion.
  • Alternatively, the light-emitting and light-receiving apparatus of one embodiment of the present invention may have a structure including a light-emitting and light-receiving element (also referred to as a light-emitting and light-receiving device) and a light-emitting element.
  • First, a light-emitting and light-receiving apparatus including a light-receiving element and a light-emitting element is described.
  • The light-emitting and light-receiving apparatus of one embodiment of the present invention includes a light-receiving element and a light-emitting element in a light-emitting and light-receiving portion. In the light-emitting and light-receiving apparatus of one embodiment of the present invention, the light-emitting elements are arranged in a matrix in the light-emitting and light-receiving portion, and an image can be displayed on the light-emitting and light-receiving portion. Furthermore, the light-receiving elements are arranged in a matrix in the light-emitting and light-receiving portion, and the light-emitting and light-receiving portion has one or both of an image capturing function and a sensing function. The light-emitting and light-receiving portion can be used as an image sensor, a touch sensor, or the like. That is, light is detected in the light-emitting and light-receiving portion, whereby an image can be captured. In addition, touch operation of an object (e.g., a finger or a pen) can be detected. Furthermore, in the light-emitting and light-receiving apparatus of one embodiment of the present invention, the light-emitting elements can be used as a light source of the sensor. Accordingly, a light-receiving portion and a light source do not need to be provided separately from the light-emitting and light-receiving apparatus; hence, the number of components of an electronic device can be reduced.
  • In the light-emitting and light-receiving apparatus of one embodiment of the present invention, when an object reflects (or scatters) light emitted from the light-emitting element included in the light-emitting and light-receiving portion, the light-receiving element can detect the reflected light (or the scattered light); thus, image capturing and touch operation detection are possible even in a dark place.
  • The light-emitting element included in the light-emitting and light-receiving apparatus of one embodiment of the present invention functions as a display element (also referred to as a display device).
  • As the light-emitting element, an EL element (also referred to as an EL device) such as an OLED (Organic Light Emitting Diode) or a QLED (Quantum-dot Light Emitting Diode) is preferably used. Examples of a light-emitting substance contained in the EL element include a substance exhibiting fluorescence (a fluorescent material), a substance exhibiting phosphorescence (a phosphorescent material), an inorganic compound (such as a quantum dot material), and a substance exhibiting thermally activated delayed fluorescence (a thermally activated delayed fluorescence (TADF) material). Alternatively, an LED such as a micro-LED (Light Emitting Diode) can be used as the light-emitting element.
  • The light-emitting and light-receiving apparatus of one embodiment of the present invention has a function of detecting light with the use of a light-receiving element.
  • When the light-receiving element is used as an image sensor, the light-emitting and light-receiving apparatus can capture an image using the light-receiving element. For example, the light-emitting and light-receiving apparatus can be used as a scanner.
  • An electronic device including the light-emitting and light-receiving apparatus of one embodiment of the present invention can obtain data related to biological information such as a fingerprint or a palm print by using a function of an image sensor. That is, a biometric authentication sensor can be incorporated in the light-emitting and light-receiving apparatus. When the light-emitting and light-receiving apparatus incorporates a biometric authentication sensor, the number of components of an electronic device can be reduced as compared to the case where a biometric authentication sensor is provided separately from the light-emitting and light-receiving apparatus; thus, the size and weight of the electronic device can be reduced.
  • When the light-receiving element is used as the touch sensor, the light-emitting and light-receiving apparatus can detect touch operation of an object with the use of the light-receiving element.
  • As the light-receiving element, a pn photodiode or a pin photodiode can be used, for example. The light-receiving element functions as a photoelectric conversion element (also referred to as a photoelectric conversion device) that detects light entering the light-receiving element and generates electric charge. The amount of electric charge generated from the light-receiving element depends on the amount of light entering the light-receiving element.
  • It is particularly preferable to use an organic photodiode including a layer containing an organic compound as the light-receiving element. An organic photodiode, which is easily made thin, lightweight, and large in area and has a high degree of freedom for shape and design, can be used in a variety of devices.
  • In the light-emitting and light-receiving apparatus of one embodiment of the present invention, organic EL elements (also referred to as organic EL devices) are used as the light-emitting elements, and organic photodiodes are used as the light-receiving elements. The organic EL elements and the organic photodiodes can be formed over one substrate. Thus, the organic photodiodes can be incorporated in the display device including the organic EL elements.
  • In the case where all the layers of the organic EL elements and the organic photodiodes are formed separately, the number of deposition steps becomes extremely large. However, a large number of layers of the organic photodiodes can have a structure in common with the organic EL elements; thus, concurrently depositing the layers that can have a common structure can inhibit an increase in the number of deposition steps.
  • For example, one of a pair of electrodes (a common electrode) can be a layer shared by the light-receiving element and the light-emitting element. For example, at least one of a hole-injection layer, a hole-transport layer, an electron-transport layer, and an electron-injection layer is preferably a layer shared by the light-receiving element and the light-emitting element. As another example, the light-receiving element and the light-emitting element can have the same structure except that the light-receiving element includes an active layer and the light-emitting element includes a light-emitting layer. In other words, the light-receiving element can be manufactured by only replacing the light-emitting layer of the light-emitting element with an active layer. When the light-receiving element and the light-emitting element include common layers in such a manner, the number of deposition steps and the number of masks can be reduced, whereby the number of manufacturing steps and the manufacturing cost of the light-emitting and light-receiving apparatus can be reduced. Furthermore, the light-emitting and light-receiving apparatus including the light-receiving element can be manufactured using an existing manufacturing apparatus and an existing manufacturing method for the display device.
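  • The common-layer structure described above can be illustrated with a simple sketch. The following Python snippet, which is only an illustrative assumption and not a description of an actual process recipe, lists the layer stacks of the light-emitting element and the light-receiving element described in this embodiment and counts how many layers can be deposited in shared steps.

```python
# Layer stacks listed from the pixel electrode side; only the light-emitting layer
# and the active layer differ between the two elements in this example.
LIGHT_EMITTING_STACK = [
    "pixel electrode", "hole-injection layer", "hole-transport layer",
    "light-emitting layer", "electron-transport layer", "electron-injection layer",
    "common electrode",
]
LIGHT_RECEIVING_STACK = [
    "pixel electrode", "hole-injection layer", "hole-transport layer",
    "active layer", "electron-transport layer", "electron-injection layer",
    "common electrode",
]

shared = [e for e, r in zip(LIGHT_EMITTING_STACK, LIGHT_RECEIVING_STACK) if e == r]
separate = [(e, r) for e, r in zip(LIGHT_EMITTING_STACK, LIGHT_RECEIVING_STACK) if e != r]

print(len(shared), "layers can be shared:", shared)
print(len(separate), "layer is formed separately:", separate)
# -> 6 layers can be shared; 1 layer (light-emitting layer / active layer) is formed separately.
```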
  • Note that a layer shared by the light-receiving element and the light-emitting element might have functions different in the light-receiving element and the light-emitting element. In this specification, the name of a component is based on its function in the light-emitting element. For example, a hole-injection layer functions as a hole-injection layer in the light-emitting element and functions as a hole-transport layer in the light-receiving element. Similarly, an electron-injection layer functions as an electron-injection layer in the light-emitting element and functions as an electron-transport layer in the light-receiving element. A layer shared by the light-receiving element and the light-emitting element may have the same functions in the light-receiving element and the light-emitting element. A hole-transport layer functions as a hole-transport layer in both of the light-emitting element and the light-receiving element, and an electron-transport layer functions as an electron-transport layer in both of the light-emitting element and the light-receiving element.
  • Next, a light-emitting and light-receiving apparatus including a light-emitting and light-receiving element and a light-emitting element is described. Note that functions, behavior, effects, and the like similar to those described above are not described in some cases.
  • In the light-emitting and light-receiving apparatus of one embodiment of the present invention, a subpixel exhibiting any color includes a light-emitting and light-receiving element instead of a light-emitting element, and subpixels exhibiting the other colors each include a light-emitting element. The light-emitting and light-receiving element has both a function of emitting light (a light-emitting function) and a function of receiving light (a light-receiving function). For example, in the case where a pixel includes three subpixels of a red subpixel, a green subpixel, and a blue subpixel, at least one of the subpixels includes a light-emitting and light-receiving element, and the other subpixels each include a light-emitting element. Thus, the light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus of one embodiment of the present invention has a function of displaying an image using both a light-emitting and light-receiving element and a light-emitting element.
  • The light-emitting and light-receiving element functions as both a light-emitting element and a light-receiving element, whereby the pixel can have a light-receiving function without an increase in the number of subpixels included in the pixel. Thus, the light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus can be provided with one or both of an image capturing function and a sensing function while keeping the aperture ratio of the pixel (aperture ratio of each subpixel) and the resolution of the light-emitting and light-receiving apparatus. Accordingly, in the light-emitting and light-receiving apparatus of one embodiment of the present invention, the aperture ratio of the pixel can be increased more and the resolution can be increased more easily than in a light-emitting and light-receiving apparatus provided with a subpixel including a light-receiving element separately from a subpixel including a light-emitting element.
  • In the light-emitting and light-receiving portion of the light-emitting and light-receiving apparatus of one embodiment of the present invention, the light-emitting and light-receiving elements and the light-emitting elements are arranged in a matrix, and an image can be displayed on the light-emitting and light-receiving portion. The light-emitting and light-receiving portion can be used as an image sensor and a touch sensor. In the light-emitting and light-receiving apparatus of one embodiment of the present invention, the light-emitting elements can be used as a light source of the sensor. Thus, image capturing and touch operation detection are possible even in a dark place.
  • The light-emitting and light-receiving element can be manufactured by combining an organic EL element and an organic photodiode. For example, by adding an active layer of an organic photodiode to a layered structure of an organic EL element, the light-emitting and light-receiving element can be manufactured. Furthermore, in the light-emitting and light-receiving element formed of a combination of an organic EL element and an organic photodiode, layers that can be shared with the organic EL element can be deposited in the same deposition steps, which can inhibit an increase in the number of deposition steps.
  • For example, one of a pair of electrodes (a common electrode) can be a layer shared by the light-emitting and light-receiving element and the light-emitting element. For example, at least one of a hole-injection layer, a hole-transport layer, an electron-transport layer, and an electron-injection layer is preferably a layer shared by the light-emitting and light-receiving element and the light-emitting element. As another example, the light-emitting and light-receiving element and the light-emitting element can have the same structure except for the presence or absence of an active layer of the light-receiving element. In other words, the light-emitting and light-receiving element can be manufactured by only adding the active layer of the light-receiving element to the light-emitting element. When the light-emitting and light-receiving element and the light-emitting element include common layers in such a manner, the number of deposition steps and the number of masks can be reduced, thereby reducing the number of manufacturing steps and the manufacturing cost of the light-emitting and light-receiving apparatus. Furthermore, the light-emitting and light-receiving apparatus including the light-emitting and light-receiving element can be manufactured using an existing manufacturing apparatus and an existing manufacturing method for the display device.
  • Note that a layer included in the light-emitting and light-receiving element may have a different function between the case where the light-emitting and light-receiving element functions as a light-receiving element and the case where the light-emitting and light-receiving element functions as a light-emitting element. In this specification, the name of a component is based on its function in the case where the light-emitting and light-receiving element functions as a light-emitting element.
  • The light-emitting and light-receiving apparatus of this embodiment has a function of displaying an image with the use of a light-emitting element and a light-emitting and light-receiving element. That is, the light-emitting element and the light-emitting and light-receiving element function as display elements.
  • The light-emitting and light-receiving apparatus of this embodiment has a function of detecting light with the use of a light-emitting and light-receiving element. The light-emitting and light-receiving element can detect light having a shorter wavelength than light emitted by the light-emitting and light-receiving element itself.
  • When the light-emitting and light-receiving element is used as an image sensor, the light-emitting and light-receiving apparatus of this embodiment can capture an image using the light-emitting and light-receiving element. When the light-emitting and light-receiving element is used as the touch sensor, the light-emitting and light-receiving apparatus of this embodiment can detect touch operation of an object with the use of the light-emitting and light-receiving element.
  • The light-emitting and light-receiving element functions as a photoelectric conversion element. The light-emitting and light-receiving element can be manufactured by adding an active layer of the light-receiving element to the above-described structure of the light-emitting element. For the light-emitting and light-receiving element, an active layer of a pn photodiode or a pin photodiode can be used, for example.
  • It is particularly preferable to use, for the light-emitting and light-receiving element, an active layer of an organic photodiode including a layer containing an organic compound. An organic photodiode, which is easily made thin, lightweight, and large in area and has a high degree of freedom for shape and design, can be used in a variety of devices.
  • The display device that is an example of the light-emitting and light-receiving apparatus of one embodiment of the present invention is specifically described below with reference to drawings.
  • [Structure Example 1 of Display Device]
  • [Structure Example 1-1]
  • FIG. 13A is a schematic view of a display panel 200. The display panel 200 includes a substrate 201, a substrate 202, a light-receiving element 212, a light-emitting element 211R, a light-emitting element 211G, a light-emitting element 211B, a functional layer 203, and the like.
  • The light-emitting element 211R, the light-emitting element 211G, the light-emitting element 211B, and the light-receiving element 212 are provided between the substrate 201 and the substrate 202. The light-emitting element 211R, the light-emitting element 211G, and the light-emitting element 211B emit red (R) light, green (G) light, and blue (B) light, respectively. Note that in the following description, the term “light-emitting element 211” may be used when the light-emitting element 211R, the light-emitting element 211G, and the light-emitting element 211B are not distinguished from each other.
  • The display panel 200 includes a plurality of pixels arranged in a matrix. One pixel includes one or more subpixels. One subpixel includes one light-emitting element. For example, the pixel can have a structure including three subpixels (e.g., three colors of R, G, and B or three colors of yellow (Y), cyan (C), and magenta (M)) or four subpixels (e.g., four colors of R, G, B, and white (W) or four colors of R, G, B, and Y). The pixel further includes the light-receiving element 212. The light-receiving element 212 may be provided in all the pixels or may be provided in some of the pixels. In addition, one pixel may include a plurality of light-receiving elements 212.
  • FIG. 13A illustrates a finger 220 touching a surface of the substrate 202. Part of light emitted by the light-emitting element 211G is reflected at a contact portion of the substrate 202 and the finger 220. In the case where part of the reflected light is incident on the light-receiving element 212, the contact of the finger 220 with the substrate 202 can be detected. That is, the display panel 200 can function as a touch panel.
  • The functional layer 203 includes a circuit for driving the light-emitting element 211R, the light-emitting element 211G, and the light-emitting element 211B and a circuit for driving the light-receiving element 212. The functional layer 203 is provided with a switch, a transistor, a capacitor, a wiring, and the like. Note that in the case where the light-emitting element 211R, the light-emitting element 211G, the light-emitting element 211B, and the light-receiving element 212 are driven by a passive-matrix method, a structure not provided with a switch and a transistor may be employed.
  • The display panel 200 preferably has a function of detecting a fingerprint of the finger 220. FIG. 13B schematically illustrates an enlarged view of the contact portion in a state where the finger 220 touches the substrate 202. FIG. 13B illustrates the light-emitting elements 211 and the light-receiving elements 212 that are alternately arranged.
  • The fingerprint of the finger 220 is formed of depressions and projections. Therefore, as illustrated in FIG. 13B, the projections of the fingerprint touch the substrate 202.
  • Reflection of light from a surface or an interface is categorized into regular reflection and diffuse reflection. Regularly reflected light is highly directional light with an angle of reflection equal to the angle of incidence. Diffusely reflected light has low directionality and low angular dependence of intensity. As for regular reflection and diffuse reflection, diffuse reflection components are dominant in the light reflected from the surface of the finger 220. Meanwhile, regular reflection components are dominant in the light reflected from the interface between the substrate 202 and the air.
  • The intensity of light that is reflected from contact surfaces or non-contact surfaces between the finger 220 and the substrate 202 and is incident on the light-receiving elements 212 positioned directly below the contact surfaces or the non-contact surfaces is the sum of intensities of regularly reflected light and diffusely reflected light. As described above, regularly reflected light (indicated by solid arrows) is dominant near the depressions of the finger 220, where the finger 220 is not in contact with the substrate 202; whereas diffusely reflected light (indicated by dashed arrows) from the finger 220 is dominant near the projections of the finger 220, where the finger 220 is in contact with the substrate 202. Thus, the intensity of light received by the light-receiving element 212 positioned directly below the depression is higher than the intensity of light received by the light-receiving element 212 positioned directly below the projection. Accordingly, a fingerprint image of the finger 220 can be captured.
  • In the case where an arrangement interval between the light-receiving elements 212 is smaller than a distance between two projections of a fingerprint, preferably smaller than a distance between a depression and a projection adjacent to each other, a clear fingerprint image can be obtained. The distance between a depression and a projection of a human fingerprint is approximately 200 μm; thus, the arrangement interval between the light-receiving elements 212 is, for example, less than or equal to 400 μm, preferably less than or equal to 200 μm, further preferably less than or equal to 150 μm, still further preferably less than or equal to 100 μm, yet still further preferably less than or equal to 50 μm, and greater than or equal to 1 μm, preferably greater than or equal to 10 μm, further preferably greater than or equal to 20 μm.
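  • As a worked example of the above condition, the following Python sketch checks whether a given arrangement interval between the light-receiving elements 212 is small enough to resolve the depressions and projections of a fingerprint; the function names and the classification of the readings are illustrative assumptions.

```python
PROJECTION_TO_PROJECTION_UM = 400.0  # approximate distance between two projections of a fingerprint
DEPRESSION_TO_PROJECTION_UM = 200.0  # approximate distance between a depression and a projection


def interval_quality(interval_um: float) -> str:
    """Judge an arrangement interval between light-receiving elements against the ranges above."""
    if not (1.0 <= interval_um <= PROJECTION_TO_PROJECTION_UM):
        return "outside the described range"
    if interval_um <= DEPRESSION_TO_PROJECTION_UM:
        return "clear fingerprint image can be obtained"
    return "fingerprint image can be captured"


def ridge_map(readings, threshold):
    """Classify each light-receiving element reading as facing a depression or a projection.

    Regularly reflected light is dominant below a depression, so a higher reading
    corresponds to a depression and a lower reading to a projection.
    """
    return ["depression" if value > threshold else "projection" for value in readings]


print(interval_quality(50.0))    # clear fingerprint image can be obtained
print(interval_quality(300.0))   # fingerprint image can be captured
print(ridge_map([0.9, 0.2, 0.8, 0.1], threshold=0.5))
```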
  • FIG. 13C illustrates an example of a fingerprint image captured by the display panel 200. In an image-capturing range 223 in FIG. 13C, the outline of the finger 220 is indicated by a dashed line and the outline of a contact portion 221 is indicated by a dashed-dotted line. In the contact portion 221, a high-contrast image of a fingerprint 222 can be captured owing to a difference in the amount of light incident on the light-receiving elements 212.
  • The display panel 200 can also function as a touch panel or a pen tablet. FIG. 13D illustrates a state where a tip of a stylus 225 slides in a direction indicated with a dashed arrow while the tip of the stylus 225 touches the substrate 202.
  • As illustrated in FIG. 13D, when diffusely reflected light that is diffused at the contact surface of the tip of the stylus 225 and the substrate 202 is incident on the light-receiving element 212 that overlaps with the contact surface, the position of the tip of the stylus 225 can be detected with high accuracy.
  • FIG. 13E illustrates an example of a path 226 of the stylus 225 that is detected by the display panel 200. The display panel 200 can detect the position of a detection target, such as the stylus 225, with high positional accuracy, so that high-definition drawing can be performed using a drawing application or the like. Unlike the case of using a capacitive touch sensor, an electromagnetic induction touch pen, or the like, the display panel 200 can detect even the position of a highly insulating object to be detected; therefore, the material of the tip portion of the stylus 225 is not limited, and a variety of writing materials (e.g., a brush, a glass pen, and a quill pen) can be used.
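  • A minimal sketch of how the position of the tip of the stylus 225 could be estimated from the readings of the light-receiving elements 212 is shown below; the use of an intensity-weighted centroid here is an illustrative assumption rather than a description of a specific detection circuit.

```python
def stylus_position(readings):
    """Estimate the contact position of a stylus tip from light-receiving element readings.

    readings: iterable of (x, y, intensity), where (x, y) is the position of a
    light-receiving element and intensity is the detected diffusely reflected light.
    Returns the intensity-weighted centroid, which can locate the tip with finer
    accuracy than the arrangement interval of the elements.
    """
    readings = list(readings)
    total = sum(intensity for _, _, intensity in readings)
    if total == 0:
        return None  # no diffusely reflected light detected, i.e. no contact
    x = sum(px * intensity for px, _, intensity in readings) / total
    y = sum(py * intensity for _, py, intensity in readings) / total
    return (x, y)


# Example: the tip is located between two elements, closer to the brighter one.
print(stylus_position([(0.0, 0.0, 0.2), (0.1, 0.0, 0.8)]))  # (0.08, 0.0)
```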
  • Here, FIG. 13F to FIG. 13H illustrate examples of a pixel that can be used in the display panel 200.
  • The pixels illustrated in FIG. 13F and FIG. 13G each include the light-emitting element 211R for red (R), the light-emitting element 211G for green (G), the light-emitting element 211B for blue (B), and the light-receiving element 212. The pixels each include a pixel circuit for driving the light-emitting element 211R, the light-emitting element 211G, the light-emitting element 211B, and the light-receiving element 212.
  • FIG. 13F illustrates an example in which three light-emitting elements and one light-receiving element are provided in a matrix of 2×2. FIG. 13G illustrates an example in which three light-emitting elements are arranged in one line and one laterally long light-receiving element 212 is provided below the three light-emitting elements.
  • The pixel illustrated in FIG. 13H is an example including a light-emitting element 211W for white (W). Here, four light-emitting elements are arranged in one line and the light-receiving element 212 is provided below the four light-emitting elements.
  • Note that the pixel structure is not limited to the above structure, and a variety of arrangement methods can be employed.
  • [Structure Example 1-2]
  • An example of a structure including light-emitting elements emitting visible light, a light-emitting element emitting infrared light, and a light-receiving element is described below.
  • A display panel 200A illustrated in FIG. 14A includes a light-emitting element 211IR in addition to the components of the display panel 200 illustrated in FIG. 13A as an example. The light-emitting element 211IR is a light-emitting element emitting infrared light IR. Moreover, in that case, an element capable of receiving at least the infrared light IR emitted by the light-emitting element 211IR is preferably used as the light-receiving element 212. As the light-receiving element 212, an element capable of receiving visible light and infrared light is further preferably used.
  • As illustrated in FIG. 14A, when the finger 220 touches the substrate 202, the infrared light IR emitted from the light-emitting element 211IR is reflected by the finger 220 and part of reflected light is incident on the light-receiving element 212, so that the positional information of the finger 220 can be obtained.
  • FIG. 14B to FIG. 14D illustrate examples of a pixel that can be used in the display panel 200A.
  • FIG. 14B illustrates an example in which three light-emitting elements are arranged in one line and the light-emitting element 211IR and the light-receiving element 212 are arranged below the three light-emitting elements in a horizontal direction. FIG. 14C illustrates an example in which four light-emitting elements including the light-emitting element 211IR are arranged in one line and the light-receiving element 212 is provided below the four light-emitting elements.
  • FIG. 14D illustrates an example in which three light-emitting elements and the light-receiving element 212 are arranged around the light-emitting element 211IR at the center.
  • Note that in the pixels illustrated in FIG. 14B to FIG. 14D, the positions of the light-emitting elements can be interchanged, or the positions of a light-emitting element and the light-receiving element can be interchanged.
  • [Structure Example 1-3]
  • An example of a structure including a light-emitting element emitting visible light and a light-emitting and light-receiving element emitting and receiving visible light is described below.
  • A display panel 200B illustrated in FIG. 15A includes the light-emitting element 211B, the light-emitting element 211G, and a light-emitting and light-receiving element 213R. The light-emitting and light-receiving element 213R has a function of a light-emitting element that emits red (R) light, and a function of a photoelectric conversion element that receives visible light. FIG. 15A illustrates an example in which the light-emitting and light-receiving element 213R receives green (G) light emitted by the light-emitting element 211G. Note that the light-emitting and light-receiving element 213R may receive blue (B) light emitted by the light-emitting element 211B. Alternatively, the light-emitting and light-receiving element 213R may receive both green light and blue light.
  • For example, the light-emitting and light-receiving element 213R preferably receives light having a shorter wavelength than light emitted from itself. Alternatively, the light-emitting and light-receiving element 213R may receive light (e.g., infrared light) having a longer wavelength than light emitted from itself. The light-emitting and light-receiving element 213R may receive light having approximately the same wavelength as light emitted from itself; however, in that case, the light-emitting and light-receiving element 213R also receives light emitted from itself, whereby its emission efficiency might be decreased. Therefore, the peak of the emission spectrum and the peak of the absorption spectrum of the light-emitting and light-receiving element 213R preferably overlap as little as possible.
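  • The preference that the emission spectrum and the absorption spectrum of the light-emitting and light-receiving element overlap as little as possible can be evaluated with a simple calculation. The following Python sketch, using assumed spectra sampled at a few wavelengths, computes a normalized overlap value; the sampling points and values are hypothetical.

```python
def spectral_overlap(emission, absorption):
    """Normalized overlap between an emission spectrum and an absorption spectrum.

    emission, absorption: dictionaries mapping wavelength [nm] to normalized
    intensity or absorbance, assumed to be sampled at the same wavelengths.
    A value close to 0 means the element hardly absorbs its own emission.
    """
    wavelengths = set(emission) & set(absorption)
    overlap = sum(min(emission[w], absorption[w]) for w in wavelengths)
    total = sum(emission.values())
    return overlap / total if total else 0.0


# Example: a red-emitting element (emission peak near 620 nm) that mainly absorbs
# green light (absorption peak near 530 nm); the overlap is small.
emission = {530: 0.05, 570: 0.2, 620: 1.0, 670: 0.4}
absorption = {530: 1.0, 570: 0.5, 620: 0.05, 670: 0.0}
print(round(spectral_overlap(emission, absorption), 2))  # 0.18
```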
  • Here, light emitted by the light-emitting and light-receiving element is not limited to red light. Furthermore, the light emitted by the light-emitting elements is not limited to the combination of green light and blue light. For example, the light-emitting and light-receiving element can be an element that emits green or blue light and receives light having a different wavelength from light emitted from itself.
  • The light-emitting and light-receiving element 213R serves as both a light-emitting element and a light-receiving element as described above, whereby the number of elements provided in one pixel can be reduced. Thus, higher definition, a higher aperture ratio, higher resolution, and the like can be easily achieved.
  • FIG. 15B to FIG. 15I illustrate examples of a pixel that can be used in the display panel 200B.
  • FIG. 15B illustrates an example in which the light-emitting and light-receiving element 213R, the light-emitting element 211G, and the light-emitting element 211B are arranged in one column. FIG. 15C illustrates an example in which the light-emitting element 211G and the light-emitting element 211B are alternately arranged in the vertical direction and the light-emitting and light-receiving element 213R is provided alongside the light-emitting elements.
  • FIG. 15D illustrates an example in which three light-emitting elements (the light-emitting element 211G, the light-emitting element 211B, and a light-emitting element 211X) and one light-emitting and light-receiving element are arranged in a matrix of 2×2. The light-emitting element 211X is an element that emits light of a color other than R, G, and B. The light of a color other than R, G, and B can be white (W) light, yellow (Y) light, cyan (C) light, magenta (M) light, infrared light (IR), ultraviolet light (UV), or the like. In the case where the light-emitting element 211X emits infrared light, the light-emitting and light-receiving element preferably has a function of detecting infrared light or a function of detecting both visible light and infrared light. The wavelength of light detected by the light-emitting and light-receiving element can be determined depending on the application of a sensor.
  • FIG. 15E illustrates two pixels. A region that includes three elements and is enclosed by a dotted line corresponds to one pixel. Each of the pixels includes the light-emitting element 211G, the light-emitting element 211B, and the light-emitting and light-receiving element 213R. In the left pixel in FIG. 15E, the light-emitting element 211G is provided in the same row as the light-emitting and light-receiving element 213R, and the light-emitting element 211B is provided in the same column as the light-emitting and light-receiving element 213R. In the right pixel in FIG. 15E, the light-emitting element 211G is provided in the same row as the light-emitting and light-receiving element 213R, and the light-emitting element 211B is provided in the same column as the light-emitting element 211G. In the pixel layout in FIG. 15E, the light-emitting and light-receiving element 213R, the light-emitting element 211G, and the light-emitting element 211B are repeatedly arranged in both the odd-numbered row and the even-numbered row, and in each column, the elements arranged in the odd-numbered row and the even-numbered row emit light of different colors.
  • FIG. 15F illustrates four pixels which employ PenTile arrangement; two adjacent pixels have different combinations of light-emitting elements or light-emitting and light-receiving elements that emit light of two different colors. FIG. 15F illustrates the top-surface shapes of the light-emitting elements or light-emitting and light-receiving elements.
  • The upper left pixel and the lower right pixel in FIG. 15F each include the light-emitting and light-receiving element 213R and the light-emitting element 211G. The upper right pixel and the lower left pixel each include the light-emitting element 211G and the light-emitting element 211B. That is, in the example illustrated in FIG. 15F, the light-emitting element 211G is provided in each pixel.
  • The top-surface shape of the light-emitting elements and the light-emitting and light-receiving elements is not particularly limited and can be a circular shape, an elliptical shape, a polygonal shape, a polygonal shape with rounded corners, or the like. FIG. 15F and the like illustrate examples in which the top-surface shapes of the light-emitting elements and the light-emitting and light-receiving elements are each a square tilted at approximately 45° (a diamond shape). Note that the top-surface shape of the light-emitting elements and the light-emitting and light-receiving elements may vary depending on the color thereof, or the light-emitting elements and the light-emitting and light-receiving elements of some colors or every color may have the same top-surface shape.
  • The sizes of light-emitting regions (or light-emitting and light-receiving regions) of the light-emitting elements and the light-emitting and light-receiving elements may vary depending on the color thereof, or the light-emitting elements and the light-emitting and light-receiving elements of some colors or every color may have light-emitting regions of the same size. For example, in FIG. 15F, the light-emitting region of the light-emitting element 211G provided in each pixel may have a smaller area than the light-emitting region (or the light-emitting and light-receiving region) of the other elements.
  • FIG. 15G is a modification example of the pixel arrangement of FIG. 15F. Specifically, the structure of FIG. 15G is obtained by rotating the structure of FIG. 15F by 45°. Although one pixel is regarded as including two elements in FIG. 15F, one pixel can be regarded as being formed of four elements as illustrated in FIG. 15G.
  • FIG. 15H is a modification example of the pixel arrangement of FIG. 15F. The upper left pixel and the lower right pixel in FIG. 15H each include the light-emitting and light-receiving element 213R and the light-emitting element 211G. The upper right pixel and the lower left pixel each include the light-emitting and light-receiving element 213R and the light-emitting element 211B. That is, in the example illustrated in FIG. 15H, the light-emitting and light-receiving element 213R is provided in each pixel. The structure illustrated in FIG. 15H achieves higher-resolution image capturing than the structure illustrated in FIG. 15F because of having the light-emitting and light-receiving element 213R in each pixel. Thus, the accuracy of biometric authentication can be increased, for example.
  • FIG. 15I shows a modification example of the pixel arrangement in FIG. 15H, obtained by rotating the pixel arrangement in FIG. 15H by 45°.
  • In FIG. 15I, one pixel is described as being formed of four elements (two light-emitting elements and two light-emitting and light-receiving elements). One pixel including a plurality of light-emitting and light-receiving elements having a light-receiving function allows high-resolution image capturing. Accordingly, the accuracy of biometric authentication can be increased. For example, the resolution of image capturing can be the square root of 2 times the resolution of display.
  • A display device that employs the structure illustrated in FIG. 15H or FIG. 15I includes p (p is an integer greater than or equal to 2) first light-emitting elements, q (q is an integer greater than or equal to 2) second light-emitting elements, and r (r is an integer greater than p and q) light-emitting and light-receiving elements. As for p and r, r=2p is satisfied. As for p, q, and r, r=p+q is satisfied. Either the first light-emitting elements or the second light-emitting elements emit green light, and the other light-emitting elements emit blue light. The light-emitting and light-receiving elements emit red light and have a light-receiving function.
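  • The relations among p, q, and r described above, and the factor between the image-capturing resolution and the display resolution mentioned for FIG. 15I, can be checked with simple arithmetic. The following Python sketch is only an illustrative check; the numerical values used in the example are hypothetical.

```python
import math


def check_counts(p: int, q: int) -> bool:
    """Check the relations given above: r = p + q and r = 2p (which implies q = p)."""
    r = p + q
    return r == 2 * p


def capture_resolution(display_resolution_ppi: float,
                       receiving_elements_per_pixel: int = 2) -> float:
    """Linear image-capturing resolution for the arrangement of FIG. 15I.

    Two light-emitting and light-receiving elements per pixel double the areal
    density of sensing points, which multiplies the linear resolution by sqrt(2).
    """
    return display_resolution_ppi * math.sqrt(receiving_elements_per_pixel)


print(check_counts(p=100, q=100))            # True: r = 200 = 2p = p + q
print(round(capture_resolution(300.0), 1))   # 424.3 ppi of image capturing for a 300 ppi display
```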
  • In the case where touch operation is detected with the light-emitting and light-receiving elements, for example, it is preferable that light emitted from a light source be hard for a user to recognize. Since blue light has lower visibility than green light, light-emitting elements that emit blue light are preferably used as a light source. Accordingly, the light-emitting and light-receiving elements preferably have a function of receiving blue light. Note that without limitation to the above, light-emitting elements used as a light source can be selected as appropriate depending on the sensitivity of the light-emitting and light-receiving elements.
  • As described above, the display device of this embodiment can employ any of various types of pixel arrangements.
  • [Device Structure]
  • Next, detailed structures of the light-emitting element, the light-receiving element, and the light-emitting and light-receiving element which can be used in the display device of one embodiment of the present invention are described.
  • The display device of one embodiment of the present invention can have any of the following structures: a top-emission structure in which light is emitted in a direction opposite to the substrate where the light-emitting elements are formed, a bottom-emission structure in which light is emitted toward the substrate where the light-emitting elements are formed, and a dual-emission structure in which light is emitted toward both surfaces.
  • In this embodiment, a top-emission display device is described as an example.
  • In this specification and the like, unless otherwise specified, in describing a structure including a plurality of components (e.g., light-emitting elements or light-emitting layers), alphabets are not added when a common part for the components is described. For example, when a common part of a light-emitting layer 283R, a light-emitting layer 283G, and the like is described, the light-emitting layers are simply referred to as a light-emitting layer 283, in some cases.
  • A display device 280A illustrated in FIG. 16A includes a light-receiving element 270PD, a light-emitting element 270R that emits red (R) light, a light-emitting element 270G that emits green (G) light, and a light-emitting element 270B that emits blue (B) light.
  • Each of the light-emitting elements includes a pixel electrode 271, a hole-injection layer 281, a hole-transport layer 282, a light-emitting layer, an electron-transport layer 284, an electron-injection layer 285, and a common electrode 275, which are stacked in this order. The light-emitting element 270R includes the light-emitting layer 283R, the light-emitting element 270G includes the light-emitting layer 283G, and the light-emitting element 270B includes a light-emitting layer 283B. The light-emitting layer 283R includes a light-emitting substance that emits red light, the light-emitting layer 283G includes a light-emitting substance that emits green light, and the light-emitting layer 283B includes a light-emitting substance that emits blue light.
  • The light-emitting elements are electroluminescent elements that emit light to the common electrode 275 side by voltage application between the pixel electrodes 271 and the common electrode 275.
  • The light-receiving element 270PD includes the pixel electrode 271, the hole-injection layer 281, the hole-transport layer 282, an active layer 273, the electron-transport layer 284, the electron-injection layer 285, and the common electrode 275, which are stacked in this order.
  • The light-receiving element 270PD is a photoelectric conversion element that receives light entering from the outside of the display device 280A and converts it into an electric signal.
  • In the description made in this embodiment, the pixel electrode 271 functions as an anode and the common electrode 275 functions as a cathode in both of the light-emitting element and the light-receiving element. In other words, when the light-receiving element is driven by application of reverse bias between the pixel electrode 271 and the common electrode 275, light incident on the light-receiving element can be detected and charge can be generated and extracted as current.
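  • The relation between the amount of incident light and the generated charge can be illustrated with simple arithmetic. The following Python sketch models the photocurrent of the reverse-biased light-receiving element as the product of an assumed responsivity and the incident optical power; the numerical values (responsivity, element area, irradiance, and integration time) are hypothetical and are not measured characteristics.

```python
def photocurrent(irradiance_w_per_cm2: float, area_cm2: float,
                 responsivity_a_per_w: float) -> float:
    """Photocurrent of the light-receiving element under reverse bias.

    The amount of generated charge depends on the amount of incident light, so the
    current is modeled simply as responsivity x incident optical power.
    """
    return responsivity_a_per_w * irradiance_w_per_cm2 * area_cm2


def integrated_charge(current_a: float, integration_time_s: float) -> float:
    """Charge accumulated during one detection period (e.g., one frame)."""
    return current_a * integration_time_s


# Example with assumed values: 0.2 A/W responsivity, a 50 um x 50 um element,
# 0.1 mW/cm2 of reflected light, and a 16.7 ms integration time.
current = photocurrent(irradiance_w_per_cm2=1e-4, area_cm2=(50e-4) ** 2,
                       responsivity_a_per_w=0.2)
print(integrated_charge(current, integration_time_s=16.7e-3))  # about 8.4e-12 C
```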
  • In the display device of this embodiment, an organic compound is used for the active layer 273 of the light-receiving element 270PD. In the light-receiving element 270PD, the layers other than the active layer 273 can have structures in common with the layers in the light-emitting elements. Therefore, the light-receiving element 270PD can be formed concurrently with the formation of the light-emitting elements only by adding a step of depositing the active layer 273 in the manufacturing process of the light-emitting elements. The light-emitting elements and the light-receiving element 270PD can be formed over one substrate. Accordingly, the light-receiving element 270PD can be incorporated into the display device without a significant increase in the number of manufacturing steps.
  • The display device 280A is an example in which the light-receiving element 270PD and the light-emitting elements have a common structure except that the active layer 273 of the light-receiving element 270PD and the light-emitting layers 283 of the light-emitting elements are separately formed. Note that the structures of the light-receiving element 270PD and the light-emitting elements are not limited thereto. The light-receiving element 270PD and the light-emitting elements may include separately formed layers other than the active layer 273 and the light-emitting layers 283. The light-receiving element 270PD and the light-emitting elements preferably include at least one layer used in common (common layer). Thus, the light-receiving element 270PD can be incorporated into the display device without a significant increase in the number of manufacturing steps.
  • A conductive film that transmits visible light is used as the electrode through which light is extracted, which is either the pixel electrode 271 or the common electrode 275. A conductive film that reflects visible light is preferably used as the electrode through which light is not extracted.
  • The light-emitting elements included in the display device of this embodiment preferably employ a micro optical resonator (microcavity) structure. Thus, one of the pair of electrodes of the light-emitting elements is preferably an electrode having properties of transmitting and reflecting visible light (a semi-transmissive and semi-reflective electrode), and the other is preferably an electrode having a property of reflecting visible light (a reflective electrode). When the light-emitting elements have a microcavity structure, light obtained from the light-emitting layers can be resonated between both of the electrodes, whereby light emitted from the light-emitting elements can be intensified.
  • Note that the semi-transmissive and semi-reflective electrode can have a stacked-layer structure of a reflective electrode and an electrode having a property of transmitting visible light (also referred to as a transparent electrode).
  • The transparent electrode has a light transmittance higher than or equal to 40%. For example, an electrode having a visible light (light with a wavelength greater than or equal to 400 nm and less than 750 nm) transmittance higher than or equal to 40% is preferably used in the light-emitting elements. The semi-transmissive and semi-reflective electrode has a visible light reflectance of higher than or equal to 10% and lower than or equal to 95%, preferably higher than or equal to 30% and lower than or equal to 80%. The reflective electrode has a visible light reflectance of higher than or equal to 40% and lower than or equal to 100%, preferably higher than or equal to 70% and lower than or equal to 100%. These electrodes preferably have a resistivity lower than or equal to 1×10−2 Ωcm. Note that in the case where any of the light-emitting elements emits near-infrared light (light with a wavelength greater than or equal to 750 nm and less than or equal to 1300 nm), the near-infrared light transmittance and reflectance of these electrodes preferably satisfy the above-described numerical ranges of the visible light transmittance and reflectance.
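  • The numerical ranges above can be summarized as a simple check. The following Python sketch tests whether given transmittance, reflectance, and resistivity values of an electrode fall within the ranges described in this embodiment; the function name and the example values are illustrative assumptions.

```python
def electrode_within_ranges(kind: str, transmittance: float = None,
                            reflectance: float = None,
                            resistivity_ohm_cm: float = None) -> bool:
    """Check an electrode against the numerical ranges given above.

    kind: "transparent", "semi-transmissive", or "reflective".
    transmittance and reflectance are given as fractions from 0 to 1.
    """
    ok = resistivity_ohm_cm is None or resistivity_ohm_cm <= 1e-2
    if kind == "transparent":
        ok = ok and transmittance is not None and transmittance >= 0.40
    elif kind == "semi-transmissive":
        ok = ok and reflectance is not None and 0.10 <= reflectance <= 0.95
    elif kind == "reflective":
        ok = ok and reflectance is not None and 0.40 <= reflectance <= 1.00
    return ok


print(electrode_within_ranges("semi-transmissive", reflectance=0.50,
                              resistivity_ohm_cm=5e-3))        # True
print(electrode_within_ranges("reflective", reflectance=0.30))  # False: reflectance too low
```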
  • The light-emitting element includes at least the light-emitting layer 283. The light-emitting element may further include, as a layer other than the light-emitting layer 283, a layer containing a substance with a high hole-injection property, a substance with a high hole-transport property, a hole-blocking material, a substance with a high electron-transport property, a substance with a high electron-injection property, an electron-blocking material, a substance with a bipolar property (a substance with a high electron- and hole-transport property), or the like.
  • For example, the light-emitting elements and the light-receiving element can share at least one of the hole-injection layer, the hole-transport layer, the electron-transport layer, and the electron-injection layer. Furthermore, at least one of the hole-injection layer, the hole-transport layer, the electron-transport layer, and the electron-injection layer can be separately formed for the light-emitting elements and the light-receiving element.
  • The hole-injection layer is a layer injecting holes from an anode to the hole-transport layer, and a layer containing a material with a high hole-injection property. As the material with a high hole-injection property, an aromatic amine compound or a composite material containing a hole-transport material and an acceptor material (electron-accepting material) can be used.
  • In the light-emitting element, the hole-transport layer is a layer transporting holes, which are injected from the anode by the hole-injection layer, to the light-emitting layer. In the light-receiving element, the hole-transport layer is a layer transporting holes, which are generated in the active layer on the basis of incident light, to the anode. The hole-transport layer is a layer containing a hole-transport material. As the hole-transport material, a substance having a hole mobility greater than or equal to 1×10−6 cm2/Vs is preferable. Note that other substances can also be used as long as they have a property of transporting more holes than electrons. As the hole-transport material, materials having a high hole-transport property, such as a π-electron rich heteroaromatic compound (e.g., a carbazole derivative, a thiophene derivative, and a furan derivative) and an aromatic amine (a compound having an aromatic amine skeleton), are preferable.
  • In the light-emitting element, the electron-transport layer is a layer transporting electrons, which are injected from the cathode by the electron-injection layer, to the light-emitting layer. In the light-receiving element, the electron-transport layer is a layer transporting electrons, which are generated in the active layer on the basis of incident light, to the cathode. The electron-transport layer is a layer containing an electron-transport material. As the electron-transport material, a substance having an electron mobility greater than or equal to 1×10−6 cm2/Vs is preferable. Note that other substances can also be used as long as they have a property of transporting more electrons than holes. As the electron-transport material, it is possible to use a material having a high electron-transport property, such as a metal complex having a quinoline skeleton, a metal complex having a benzoquinoline skeleton, a metal complex having an oxazole skeleton, a metal complex having a thiazole skeleton, an oxadiazole derivative, a triazole derivative, an imidazole derivative, an oxazole derivative, a thiazole derivative, a phenanthroline derivative, a quinoline derivative having a quinoline ligand, a benzoquinoline derivative, a quinoxaline derivative, a dibenzoquinoxaline derivative, a pyridine derivative, a bipyridine derivative, a pyrimidine derivative, or a π-electron deficient heteroaromatic compound such as a nitrogen-containing heteroaromatic compound.
  • The electron-injection layer is a layer injecting electrons from a cathode to the electron-transport layer, and a layer containing a material with a high electron-injection property. As the material with a high electron-injection property, an alkali metal, an alkaline earth metal, or a compound thereof can be used. As the material with a high electron-injection property, a composite material containing an electron-transport material and a donor material (electron-donating material) can also be used.
  • The light-emitting layer 283 is a layer including a light-emitting substance. The light-emitting layer 283 can include one or more kinds of light-emitting substances. As the light-emitting substance, a substance that exhibits an emission color of blue, purple, bluish purple, green, yellowish green, yellow, orange, red, or the like is appropriately used. As the light-emitting substance, a substance that emits near-infrared light can also be used.
  • Examples of the light-emitting substance include a fluorescent material, a phosphorescent material, a TADF material, and a quantum dot material.
  • Examples of the fluorescent material include a pyrene derivative, an anthracene derivative, a triphenylene derivative, a fluorene derivative, a carbazole derivative, a dibenzothiophene derivative, a dibenzofuran derivative, a dibenzoquinoxaline derivative, a quinoxaline derivative, a pyridine derivative, a pyrimidine derivative, a phenanthrene derivative, and a naphthalene derivative.
  • Examples of the phosphorescent material include an organometallic complex (particularly an iridium complex) having a 4H-triazole skeleton, a 1H-triazole skeleton, an imidazole skeleton, a pyrimidine skeleton, a pyrazine skeleton, or a pyridine skeleton; an organometallic complex (particularly an iridium complex) having a phenylpyridine derivative including an electron-withdrawing group as a ligand; a platinum complex; and a rare earth metal complex.
  • The light-emitting layer 283 may include one or more kinds of organic compounds (e.g., a host material and an assist material) in addition to the light-emitting substance (a guest material). As one or more kinds of organic compounds, one or both of the hole-transport material and the electron-transport material can be used. Alternatively, as one or more kinds of organic compounds, a bipolar material or a TADF material may be used.
  • The light-emitting layer 283 preferably includes a phosphorescent material and a combination of a hole-transport material and an electron-transport material that easily forms an exciplex. With such a structure, light emission can be efficiently obtained by ExTET (Exciplex-Triplet Energy Transfer), which is energy transfer from an exciplex to a light-emitting substance (a phosphorescent material). When a combination of materials is selected so as to form an exciplex that exhibits light emission whose wavelength overlaps the wavelength of a lowest-energy-side absorption band of the light-emitting substance, energy can be transferred smoothly and light emission can be obtained efficiently. With this structure, high efficiency, low-voltage driving, and a long lifetime of the light-emitting element can be achieved at the same time.
  • In the combination of materials for forming an exciplex, the HOMO level (highest occupied molecular orbital level) of the hole-transport material is preferably higher than or equal to the HOMO level of the electron-transport material. The LUMO level (lowest unoccupied molecular orbital level) of the hole-transport material is preferably higher than or equal to the LUMO level of the electron-transport material. The LUMO levels and the HOMO levels of the materials can be derived from the electrochemical characteristics (reduction potentials and oxidation potentials) of the materials that are measured by cyclic voltammetry (CV).
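  • As an illustration of the derivation mentioned above, the minimal sketch below converts CV oxidation and reduction onset potentials into approximate HOMO and LUMO levels and checks the preferred level relations between the hole-transport material and the electron-transport material. The ferrocene reference, the −4.8 eV vacuum-level offset, and the example potentials are assumptions introduced for illustration and are not taken from this document.

    # Minimal sketch (assumptions noted above): estimating HOMO/LUMO levels from
    # cyclic voltammetry (CV) onset potentials referenced to ferrocene/ferrocenium,
    # then checking the preferred level relations for exciplex formation.

    FC_VACUUM_OFFSET_EV = 4.8  # assumed Fc/Fc+ level below the vacuum level

    def homo_from_oxidation(e_ox_onset_v):
        """Approximate HOMO level (eV) from the oxidation onset potential (V)."""
        return -(e_ox_onset_v + FC_VACUUM_OFFSET_EV)

    def lumo_from_reduction(e_red_onset_v):
        """Approximate LUMO level (eV) from the reduction onset potential (V)."""
        return -(e_red_onset_v + FC_VACUUM_OFFSET_EV)

    # Hypothetical example values for a hole-transport material (HTM) and an
    # electron-transport material (ETM).
    homo_htm = homo_from_oxidation(0.45)   # about -5.25 eV
    lumo_htm = lumo_from_reduction(-2.60)  # about -2.20 eV
    homo_etm = homo_from_oxidation(1.10)   # about -5.90 eV
    lumo_etm = lumo_from_reduction(-2.00)  # about -2.80 eV

    # Preferred relations stated above: HOMO(HTM) >= HOMO(ETM) and LUMO(HTM) >= LUMO(ETM).
    print("HOMO condition satisfied:", homo_htm >= homo_etm)
    print("LUMO condition satisfied:", lumo_htm >= lumo_etm)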
  • Note that the formation of an exciplex can be confirmed by a phenomenon in which the emission spectrum of a mixed film in which the hole-transport material and the electron-transport material are mixed is shifted to the longer wavelength side than the emission spectrum of each of the materials (or has another peak on the longer wavelength side), observed by comparison of the emission spectrum of the hole-transport material, the emission spectrum of the electron-transport material, and the emission spectrum of the mixed film of these materials, for example.
  • Alternatively, the formation of an exciplex can be confirmed by a difference in transient response, such as a phenomenon in which the transient photoluminescence (PL) lifetime of the mixed film has longer lifetime components or has a larger proportion of delayed components than that of each of the materials, observed by comparison of the transient PL of the hole-transport material, the transient PL of the electron-transport material, and the transient PL of the mixed film of these materials. The transient PL can be rephrased as transient electroluminescence (EL). That is, the formation of an exciplex can also be confirmed by a difference in transient response observed by comparison of the transient EL of the hole-transport material, the transient EL of the electron-transport material, and the transient EL of the mixed film of these materials.
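  • The spectral and transient comparisons described above can be written as simple numerical checks. In the minimal sketch below, the emission spectra, decay curves, and the 1 microsecond cutoff separating prompt and delayed components are hypothetical placeholders, not measured data from this document.

    import numpy as np

    # Minimal sketch with hypothetical data: check 1 compares emission peak
    # wavelengths of the neat films and the mixed film; check 2 compares the
    # proportion of delayed components in transient PL decay curves.
    wavelengths = np.linspace(400, 800, 401)  # nm

    def gaussian(center, width):
        return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

    def peak_wavelength(spectrum):
        return float(wavelengths[np.argmax(spectrum)])

    pl_htm = gaussian(430, 25)   # hypothetical neat hole-transport material film
    pl_etm = gaussian(460, 25)   # hypothetical neat electron-transport material film
    pl_mix = gaussian(540, 35)   # hypothetical mixed film

    # Check 1: the mixed film emits at a longer wavelength than either neat film.
    print("Mixed-film emission red-shifted:",
          peak_wavelength(pl_mix) > max(peak_wavelength(pl_htm), peak_wavelength(pl_etm)))

    # Check 2: the mixed film has a larger proportion of delayed components.
    t = np.linspace(0, 10, 1000)  # microseconds

    def delayed_fraction(counts, cutoff_us=1.0):
        counts = np.asarray(counts, dtype=float)
        return counts[t > cutoff_us].sum() / counts.sum()

    decay_neat = np.exp(-t / 0.02)                                # prompt-only decay
    decay_mix = 0.7 * np.exp(-t / 0.02) + 0.3 * np.exp(-t / 2.0)  # with a delayed part
    print("Larger delayed fraction in mixed film:",
          delayed_fraction(decay_mix) > delayed_fraction(decay_neat))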
  • The active layer 273 includes a semiconductor. Examples of the semiconductor include an inorganic semiconductor such as silicon and an organic semiconductor including an organic compound. This embodiment shows an example in which an organic semiconductor is used as the semiconductor included in the active layer 273. The use of an organic semiconductor is preferable because the light-emitting layer 283 and the active layer 273 can be formed by the same method (e.g., a vacuum evaporation method) and thus the same manufacturing apparatus can be used.
  • Examples of an n-type semiconductor material contained in the active layer 273 are electron-accepting organic semiconductor materials such as fullerene (e.g., C60 and C70) and a fullerene derivative. Fullerene has a soccer ball-like shape, which is energetically stable. Both the HOMO level and the LUMO level of fullerene are deep (low). Having a deep LUMO level, fullerene has an extremely high electron-accepting property (acceptor property). When π-electron conjugation (resonance) spreads in a plane as in benzene, the electron-donating property (donor property) usually increases. Although π-electrons widely spread in fullerene having a spherical shape, its electron-accepting property is high. The high electron-accepting property efficiently causes rapid charge separation and is useful for a light-receiving element. Both C60 and C70 have a wide absorption band in the visible light region, and C70 is especially preferable because of having a larger π-electron conjugation system and a wider absorption band in the long wavelength region than C60.
  • Examples of the n-type semiconductor material include a metal complex having a quinoline skeleton, a metal complex having a benzoquinoline skeleton, a metal complex having an oxazole skeleton, a metal complex having a thiazole skeleton, an oxadiazole derivative, a triazole derivative, an imidazole derivative, an oxazole derivative, a thiazole derivative, a phenanthroline derivative, a quinoline derivative, a benzoquinoline derivative, a quinoxaline derivative, a dibenzoquinoxaline derivative, a pyridine derivative, a bipyridine derivative, a pyrimidine derivative, a naphthalene derivative, an anthracene derivative, a coumarin derivative, a rhodamine derivative, a triazine derivative, and a quinone derivative.
  • Examples of a p-type semiconductor material contained in the active layer 273 include electron-donating organic semiconductor materials such as copper(II) phthalocyanine (CuPc), tetraphenyldibenzoperiflanthene (DBP), zinc phthalocyanine (ZnPc), tin phthalocyanine (SnPc), and quinacridone.
  • Examples of a p-type semiconductor material include a carbazole derivative, a thiophene derivative, a furan derivative, and a compound having an aromatic amine skeleton. Other examples of the p-type semiconductor material include a naphthalene derivative, an anthracene derivative, a pyrene derivative, a triphenylene derivative, a fluorene derivative, a pyrrole derivative, a benzofuran derivative, a benzothiophene derivative, an indole derivative, a dibenzofuran derivative, a dibenzothiophene derivative, an indolocarbazole derivative, a porphyrin derivative, a phthalocyanine derivative, a naphthalocyanine derivative, a quinacridone derivative, a polyphenylene vinylene derivative, a polyparaphenylene derivative, a polyfluorene derivative, a polyvinylcarbazole derivative, and a polythiophene derivative.
  • The HOMO level of the electron-donating organic semiconductor material is preferably shallower (higher) than the HOMO level of the electron-accepting organic semiconductor material. The LUMO level of the electron-donating organic semiconductor material is preferably shallower (higher) than the LUMO level of the electron-accepting organic semiconductor material.
  • Fullerene having a spherical shape is preferably used as the electron-accepting organic semiconductor material, and an organic semiconductor material having a substantially planar shape is preferably used as the electron-donating organic semiconductor material. Molecules of similar shapes tend to aggregate, and aggregated molecules of similar kinds, which have molecular orbital energy levels close to each other, can improve the carrier-transport property.
  • For example, the active layer 273 is preferably formed by co-evaporation of an n-type semiconductor and a p-type semiconductor. Alternatively, the active layer 273 may be formed by stacking an n-type semiconductor and a p-type semiconductor.
  • Either a low molecular compound or a high molecular compound can be used for the light-emitting element and the light-receiving element, and an inorganic compound may also be contained. Each of the layers included in the light-emitting element and the light-receiving element can be formed by an evaporation method (including a vacuum evaporation method), a transfer method, a printing method, an inkjet method, a coating method, or the like.
  • A display device 280B illustrated in FIG. 16B is different from the display device 280A in that the light-receiving element 270PD and the light-emitting element 270R have the same structure.
  • The light-receiving element 270PD and the light-emitting element 270R share the active layer 273 and the light-emitting layer 283R.
  • Here, it is preferable that the light-receiving element 270PD have a structure in common with the light-emitting element that emits light with a wavelength longer than that of the light desired to be detected. For example, the light-receiving element 270PD having a structure in which blue light is detected can have a structure which is similar to that of one or both of the light-emitting element 270R and the light-emitting element 270G. For example, the light-receiving element 270PD having a structure in which green light is detected can have a structure similar to that of the light-emitting element 270R.
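  • As a simple way to visualize the rule above, the sketch below picks, for a given detection wavelength, the light-emitting elements whose emission wavelength is longer and whose structure could therefore be shared. The peak wavelengths are rough, hypothetical values used only for illustration.

    # Minimal sketch of the sharing rule described above: a light-receiving
    # element can share a structure with a light-emitting element that emits
    # light with a wavelength longer than the light to be detected.
    emitter_peaks_nm = {"270B": 460, "270G": 530, "270R": 620}  # hypothetical peaks

    def candidate_shared_structures(detection_wavelength_nm):
        return [name for name, peak in emitter_peaks_nm.items()
                if peak > detection_wavelength_nm]

    print(candidate_shared_structures(460))  # detect blue  -> ['270G', '270R']
    print(candidate_shared_structures(530))  # detect green -> ['270R']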
  • When the light-receiving element 270PD and the light-emitting element 270R have a common structure, the number of deposition steps and the number of masks can be smaller than those for the structure in which the light-receiving element 270PD and the light-emitting element 270R include separately formed layers. As a result, the number of manufacturing steps and the manufacturing cost of the display device can be reduced.
  • When the light-receiving element 270PD and the light-emitting element 270R have a common structure, a margin for misalignment can be narrower than that for the structure in which the light-receiving element 270PD and the light-emitting element 270R include separately formed layers. Accordingly, the aperture ratio of a pixel can be increased, so that the light extraction efficiency of the display device can be increased. This can extend the life of the light-emitting element. Furthermore, the display device can exhibit a high luminance. Moreover, the resolution of the display device can also be increased.
  • The light-emitting layer 283R includes a light-emitting material that emits red light. The active layer 273 includes an organic compound that absorbs light with a wavelength shorter than that of red light (e.g., one or both of green light and blue light). The active layer 273 preferably includes an organic compound that does not easily absorb red light and that absorbs light with a wavelength shorter than that of red light. In this way, red light can be efficiently extracted from the light-emitting element 270R, and the light-receiving element 270PD can detect light with a wavelength shorter than that of red light with high accuracy.
  • Although the light-emitting element 270R and the light-receiving element 270PD have the same structure in an example of the display device 280B, the light-emitting element 270R and the light-receiving element 270PD may include optical adjustment layers with different thicknesses.
  • A display device 280C illustrated in FIG. 17A and FIG. 17B includes a light-emitting and light-receiving element 270SR that emits red (R) light and has a light-receiving function, the light-emitting element 270G, and the light-emitting element 270B. The above description of the display device 280A and the like can be referred to for the structures of the light-emitting element 270G and the light-emitting element 270B.
  • The light-emitting and light-receiving element 270SR includes the pixel electrode 271, the hole-injection layer 281, the hole-transport layer 282, the active layer 273, the light-emitting layer 283R, the electron-transport layer 284, the electron-injection layer 285, and the common electrode 275, which are stacked in this order. The light-emitting and light-receiving element 270SR has the same structure as the light-emitting element 270R and the light-receiving element 270PD in the display device 280B.
  • FIG. 17A shows a case where the light-emitting and light-receiving element 270SR functions as a light-emitting element. In the example of FIG. 17A, the light-emitting element 270B emits blue light, the light-emitting element 270G emits green light, and the light-emitting and light-receiving element 270SR emits red light.
  • FIG. 17B illustrates a case where the light-emitting and light-receiving element 270SR functions as a light-receiving element. In FIG. 17B, the light-emitting and light-receiving element 270SR detects blue light emitted by the light-emitting element 270B and green light emitted by the light-emitting element 270G.
  • The light-emitting element 270B, the light-emitting element 270G, and the light-emitting and light-receiving element 270SR each include the pixel electrode 271 and the common electrode 275. In this embodiment, the case where the pixel electrode 271 functions as an anode and the common electrode 275 functions as a cathode is described as an example. When the light-emitting and light-receiving element 270SR is driven by application of reverse bias between the pixel electrode 271 and the common electrode 275, light incident on the light-emitting and light-receiving element 270SR can be detected and charge can be generated and extracted as current.
  • Note that it can be said that the light-emitting and light-receiving element 270SR has a structure in which the active layer 273 is added to the light-emitting element. That is, the light-emitting and light-receiving element 270SR can be formed concurrently with the formation of the light-emitting element only by adding a step of depositing the active layer 273 in the manufacturing process of the light-emitting element. The light-emitting element and the light-emitting and light-receiving element can be formed over one substrate. Thus, the display portion can be provided with one or both of an image capturing function and a sensing function without a significant increase in the number of manufacturing steps.
  • The stacking order of the light-emitting layer 283R and the active layer 273 is not limited. FIG. 17A and FIG. 17B each illustrate an example in which the active layer 273 is provided over the hole-transport layer 282, and the light-emitting layer 283R is provided over the active layer 273. The stacking order of the light-emitting layer 283R and the active layer 273 may be reversed.
  • The light-emitting and light-receiving element may exclude at least one layer of the hole-injection layer 281, the hole-transport layer 282, the electron-transport layer 284, and the electron-injection layer 285. Furthermore, the light-emitting and light-receiving element may include another functional layer such as a hole-blocking layer or an electron-blocking layer.
  • In the light-emitting and light-receiving element, a conductive film that transmits visible light is used as the electrode through which light is extracted. A conductive film that reflects visible light is preferably used as the electrode through which light is not extracted.
  • The functions and materials of the layers constituting the light-emitting and light-receiving element are similar to those of the layers constituting the light-emitting elements and the light-receiving element and are not described in detail.
  • FIG. 17C to FIG. 17G illustrate examples of layered structures of light-emitting and light-receiving elements.
  • The light-emitting and light-receiving element illustrated in FIG. 17C includes a first electrode 277, the hole-injection layer 281, the hole-transport layer 282, the light-emitting layer 283R, the active layer 273, the electron-transport layer 284, the electron-injection layer 285, and a second electrode 278.
  • FIG. 17C illustrates an example in which the light-emitting layer 283R is provided over the hole-transport layer 282, and the active layer 273 is stacked over the light-emitting layer 283R.
  • As illustrated in FIG. 17A to FIG. 17C, the active layer 273 and the light-emitting layer 283R may be in contact with each other.
  • A buffer layer is preferably provided between the active layer 273 and the light-emitting layer 283R. In this case, the buffer layer preferably has a hole-transport property and an electron-transport property. For example, a substance with a bipolar property is preferably used for the buffer layer. Alternatively, as the buffer layer, at least one layer of a hole-injection layer, a hole-transport layer, an electron-transport layer, an electron-injection layer, a hole-blocking layer, an electron-blocking layer, and the like can be used. FIG. 17D illustrates an example in which the hole-transport layer 282 is used as the buffer layer.
  • The buffer layer provided between the active layer 273 and the light-emitting layer 283R can inhibit transfer of excitation energy from the light-emitting layer 283R to the active layer 273. Furthermore, the buffer layer can also be used to adjust the optical path length (cavity length) of the microcavity structure. Thus, high emission efficiency can be obtained from a light-emitting and light-receiving element including the buffer layer between the active layer 273 and the light-emitting layer 283R.
  • FIG. 17E illustrates an example of a stacked-layer structure in which a hole-transport layer 282-1, the active layer 273, a hole-transport layer 282-2, and the light-emitting layer 283R are stacked in this order over the hole-injection layer 281. The hole-transport layer 282-2 functions as a buffer layer. The hole-transport layer 282-1 and the hole-transport layer 282-2 may include the same material or different materials. Instead of the hole-transport layer 282-2, any of the above layers that can be used as the buffer layer may be used. The positions of the active layer 273 and the light-emitting layer 283R may be interchanged.
  • The light-emitting and light-receiving element illustrated in FIG. 17F is different from the light-emitting and light-receiving element illustrated in FIG. 17A in not including the hole-transport layer 282. In this manner, the light-emitting and light-receiving element may exclude at least one layer of the hole-injection layer 281, the hole-transport layer 282, the electron-transport layer 284, and the electron-injection layer 285. Furthermore, the light-emitting and light-receiving element may include another functional layer such as a hole-blocking layer or an electron-blocking layer.
  • The light-emitting and light-receiving element illustrated in FIG. 17G is different from the light-emitting and light-receiving element illustrated in FIG. 17A in including a layer 289 serving as both a light-emitting layer and an active layer instead of including the active layer 273 and the light-emitting layer 283R.
  • As the layer serving as both a light-emitting layer and an active layer, a layer containing three materials which are an n-type semiconductor that can be used for the active layer 273, a p-type semiconductor that can be used for the active layer 273, and a light-emitting substance that can be used for the light-emitting layer 283R can be used, for example.
  • Note that an absorption band on the lowest energy side of an absorption spectrum of a mixed material of the n-type semiconductor and the p-type semiconductor and a maximum peak of an emission spectrum (PL spectrum) of the light-emitting substance preferably do not overlap each other and are further preferably positioned sufficiently apart from each other.
  • [Structure Example 2 of Display Device]
  • A detailed structure of the display device of one embodiment of the present invention will be described below. Here, in particular, an example of the display device including light-receiving elements and light-emitting elements will be described.
  • [Structure Example 2-1]
  • FIG. 18A illustrates a cross-sectional view of a display device 300A. The display device 300A includes a substrate 351, a substrate 352, a light-receiving element 310, and a light-emitting element 390.
  • The light-emitting element 390 includes a pixel electrode 391, a buffer layer 312, a light-emitting layer 393, a buffer layer 314, and a common electrode 315, which are stacked in this order. The buffer layer 312 can include one or both of a hole-injection layer and a hole-transport layer. The light-emitting layer 393 includes an organic compound. The buffer layer 314 can include one or both of an electron-injection layer and an electron-transport layer. The light-emitting element 390 has a function of emitting visible light 321. Note that the display device 300A may also include a light-emitting element having a function of emitting infrared light.
  • The light-receiving element 310 includes a pixel electrode 311, the buffer layer 312, an active layer 313, the buffer layer 314, and the common electrode 315, which are stacked in this order. The active layer 313 includes an organic compound. The light-receiving element 310 has a function of detecting visible light. Note that the light-receiving element 310 may also have a function of detecting infrared light.
  • The buffer layer 312, the buffer layer 314, and the common electrode 315 are common layers shared by the light-emitting element 390 and the light-receiving element 310 and provided across them. The buffer layer 312, the buffer layer 314, and the common electrode 315 each include a portion overlapping with the active layer 313 and the pixel electrode 311, a portion overlapping with the light-emitting layer 393 and the pixel electrode 391, and a portion overlapping with none of them.
  • This embodiment is described assuming that the pixel electrode functions as an anode and the common electrode 315 functions as a cathode in both of the light-emitting element 390 and the light-receiving element 310. In other words, the light-receiving element 310 is driven by application of reverse bias between the pixel electrode 311 and the common electrode 315, so that light incident on the light-receiving element 310 can be detected and charge can be generated and extracted as current in the display device 300A.
  • The pixel electrode 311, the pixel electrode 391, the buffer layer 312, the active layer 313, the buffer layer 314, the light-emitting layer 393, and the common electrode 315 may each have a single-layer structure or a stacked-layer structure.
  • The pixel electrode 311 and the pixel electrode 391 are each positioned over an insulating layer 414. The pixel electrodes can be formed using the same material in the same step. An end portion of the pixel electrode 311 and an end portion of the pixel electrode 391 are covered with a partition 416. Two adjacent pixel electrodes are electrically insulated (electrically isolated) from each other by the partition 416.
  • An organic insulating film is suitable for the partition 416. Examples of materials that can be used for the organic insulating film include an acrylic resin, a polyimide resin, an epoxy resin, a polyamide resin, a polyimide-amide resin, a siloxane resin, a benzocyclobutene-based resin, a phenol resin, and precursors of these resins. The partition 416 is a layer that transmits visible light. A partition that blocks visible light may be provided instead of the partition 416.
  • The common electrode 315 is a layer shared by the light-receiving element 310 and the light-emitting element 390.
  • The material, thickness, and the like of the pair of electrodes can be the same between the light-receiving element 310 and the light-emitting element 390. Accordingly, the manufacturing cost of the display device can be reduced, and the manufacturing process of the display device can be simplified.
  • The display device 300A includes the light-receiving element 310, the light-emitting element 390, a transistor 331, a transistor 332, and the like between a pair of substrates (the substrate 351 and the substrate 352).
  • In the light-receiving element 310, the buffer layer 312, the active layer 313, and the buffer layer 314, which are positioned between the pixel electrode 311 and the common electrode 315, can each be referred to as an organic layer (a layer including an organic compound). The pixel electrode 311 preferably has a function of reflecting visible light. The common electrode 315 has a function of transmitting visible light. Note that in the case where the light-receiving element 310 is configured to detect infrared light, the common electrode 315 has a function of transmitting infrared light. Furthermore, the pixel electrode 311 preferably has a function of reflecting infrared light.
  • The light-receiving element 310 has a function of detecting light. Specifically, the light-receiving element 310 is a photoelectric conversion element that receives light 322 incident from the outside of the display device 300A and converts it into an electric signal. The light 322 can also be expressed as light that is emitted from the light-emitting element 390 and then reflected by an object. The light 322 may be incident on the light-receiving element 310 through a lens or the like provided in the display device 300A.
  • In the light-emitting element 390, the buffer layer 312, the light-emitting layer 393, and the buffer layer 314, which are positioned between the pixel electrode 391 and the common electrode 315, can be collectively referred to as an EL layer. The EL layer includes at least the light-emitting layer 393. As described above, the pixel electrode 391 preferably has a function of reflecting visible light. The common electrode 315 has a function of transmitting visible light. Note that in the case where the display device 300A includes a light-emitting element that emits infrared light, the common electrode 315 has a function of transmitting infrared light.
  • Furthermore, the pixel electrode 391 preferably has a function of reflecting infrared light.
  • The light-emitting elements included in the display device of this embodiment preferably employ a micro optical resonator (microcavity) structure. The light-emitting element 390 may include an optical adjustment layer between the pixel electrode 391 and the common electrode 315. The use of the micro resonator structure enables light of a specific color to be intensified and extracted from each of the light-emitting elements.
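  • As general background that is not stated explicitly in this document, the wavelength intensified by such a microcavity structure is commonly approximated by a Fabry-Perot resonance condition between the reflective pixel electrode and the light-extraction-side electrode (often made semi-transmissive in a microcavity), which is why the thickness of the optical adjustment layer is chosen for each emission color:

    \[ 2\sum_i n_i d_i + \frac{\phi_{\mathrm{total}}}{2\pi}\,\lambda \approx m\lambda, \qquad m = 1, 2, 3, \ldots \]

    Here, n_i and d_i are the refractive index and thickness of each layer between the two electrodes, \phi_{\mathrm{total}} is the sum of the phase shifts on reflection at the two electrodes, and \lambda is the wavelength to be intensified.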
  • The light-emitting element 390 has a function of emitting visible light. Specifically, the light-emitting element 390 is an electroluminescent element that emits light (here, the visible light 321) to the substrate 352 side when voltage is applied between the pixel electrode 391 and the common electrode 315.
  • The pixel electrode 311 included in the light-receiving element 310 is electrically connected to a source or a drain of the transistor 331 through an opening provided in the insulating layer 414. The pixel electrode 391 included in the light-emitting element 390 is electrically connected to a source or a drain of the transistor 332 through an opening provided in the insulating layer 414.
  • The transistor 331 and the transistor 332 are on and in contact with the same layer (the substrate 351 in FIG. 18A).
  • At least part of a circuit electrically connected to the light-receiving element 310 and a circuit electrically connected to the light-emitting element 390 are preferably formed using the same material in the same step. In that case, the thickness of the display device can be reduced compared with the case where the two circuits are separately formed, resulting in simplification of the manufacturing process.
  • The light-receiving element 310 and the light-emitting element 390 are each preferably covered with a protective layer 395. In FIG. 18A, the protective layer 395 is provided on and in contact with the common electrode 315. Providing the protective layer 395 can inhibit entry of impurities such as water into the light-receiving element 310 and the light-emitting element 390, so that the reliability of the light-receiving element 310 and the light-emitting element 390 can be increased. The protective layer 395 and the substrate 352 are bonded to each other with an adhesive layer 342.
  • A light-blocking layer 358 is provided on the surface of the substrate 352 on the substrate 351 side. The light-blocking layer 358 has openings in a position overlapping with the light-emitting element 390 and in a position overlapping with the light-receiving element 310.
  • Here, the light-receiving element 310 detects light that is emitted from the light-emitting element 390 and then reflected by an object. However, in some cases, light emitted from the light-emitting element 390 is reflected inside the display device 300A and is incident on the light-receiving element 310 without being reflected by an object. The light-blocking layer 358 can reduce the influence of such stray light. For example, in the case where the light-blocking layer 358 is not provided, light 323 emitted from the light-emitting element 390 is reflected by the substrate 352 and reflected light 324 is incident on the light-receiving element 310 in some cases. Providing the light-blocking layer 358 can inhibit the reflected light 324 from being incident on the light-receiving element 310. Consequently, noise can be reduced, and the sensitivity of a sensor using the light-receiving element 310 can be increased.
  • For the light-blocking layer 358, a material that blocks light emitted from the light-emitting element can be used. The light-blocking layer 358 preferably absorbs visible light. As the light-blocking layer 358, a black matrix can be formed using a metal material or a resin material containing pigment (e.g., carbon black) or dye, for example. The light-blocking layer 358 may have a stacked-layer structure of a red color filter, a green color filter, and a blue color filter.
  • [Structure Example 2-2]
  • A display device 300B illustrated in FIG. 18B differs from the display device 300A mainly in including a lens 349.
  • The lens 349 is provided on a surface of the substrate 352 on the substrate 351 side. The light 322 from the outside is incident on the light-receiving element 310 through the lens 349. For each of the lens 349 and the substrate 352, a material having a high visible-light-transmitting property is preferably used.
  • When light is incident on the light-receiving element 310 through the lens 349, the range of light incident on the light-receiving element 310 can be narrowed. Thus, overlap of imaging ranges between a plurality of light-receiving elements 310 can be inhibited, whereby a clear image with little blurring can be captured.
  • In addition, the lens 349 can condense incident light. Accordingly, the amount of light to be incident on the light-receiving element 310 can be increased. This can increase the photoelectric conversion efficiency of the light-receiving element 310.
  • [Structure Example 2-3]
  • A display device 300C illustrated in FIG. 18C differs from the display device 300A in the shape of the light-blocking layer 358.
  • The light-blocking layer 358 is provided so that an opening portion overlapping with the light-receiving element 310 is positioned on an inner side of the light-receiving region of the light-receiving element 310 in a plan view. The smaller the diameter of the opening portion overlapping with the light-receiving element 310 of the light-blocking layer 358 is, the narrower the range of light incident on the light-receiving element 310 becomes. Thus, overlap of imaging ranges between a plurality of light-receiving elements 310 can be inhibited, whereby a clear image with little blurring can be captured.
  • For example, the area of the opening portion of the light-blocking layer 358 can be less than or equal to 80%, less than or equal to 70%, less than or equal to 60%, less than or equal to 50%, or less than or equal to 40% and greater than or equal to 1%, greater than or equal to 5%, or greater than or equal to 10% of the area of the light-receiving region of the light-receiving element 310. A clearer image can be obtained as the area of the opening portion of the light-blocking layer 358 becomes smaller. In contrast, when the area of the opening portion is too small, the amount of light reaching the light-receiving element 310 might be reduced to reduce light sensitivity. Therefore, the area of the opening is preferably set within the above-described range. The above upper limits and lower limits can be combined freely. Furthermore, the light-receiving region of the light-receiving element 310 can be referred to as the opening portion of the partition 416.
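  • As a simple numerical illustration of combining the limits above, the sketch below checks a hypothetical opening against one such combination (greater than or equal to 10% and less than or equal to 50%); the dimensions are placeholders, not values from this document.

    # Minimal sketch: check whether the opening-portion area of the light-blocking
    # layer falls within a chosen combination of the limits described above.
    import math

    light_receiving_region_area_um2 = 20.0 * 20.0   # assumed 20 um x 20 um region
    opening_area_um2 = math.pi * (5.0 ** 2)          # assumed circular opening, r = 5 um

    ratio = opening_area_um2 / light_receiving_region_area_um2
    print(f"opening / light-receiving region = {ratio:.1%}")  # about 19.6%
    print("within 10%-50% range:", 0.10 <= ratio <= 0.50)     # True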
  • Note that the center of the opening portion of the light-blocking layer 358 overlapping with the light-receiving element 310 may be shifted from the center of the light-receiving region of the light-receiving element 310 in a plan view. Moreover, a structure in which the opening portion of the light-blocking layer 358 does not overlap with the light-receiving region of the light-receiving element 310 in a plan view may be employed. Thus, only oblique light that has passed through the opening portion of the light-blocking layer 358 can be received by the light-receiving element 310. Accordingly, the range of light incident on the light-receiving element 310 can be limited more effectively, so that a clear image can be captured.
  • [Structure Example 2-4]
  • A display device 300D illustrated in FIG. 19A differs from the display device 300A mainly in that the buffer layer 312 is not a common layer.
  • The light-receiving element 310 includes the pixel electrode 311, the buffer layer 312, the active layer 313, the buffer layer 314, and the common electrode 315. The light-emitting element 390 includes the pixel electrode 391, a buffer layer 392, the light-emitting layer 393, the buffer layer 314, and the common electrode 315. Each of the active layer 313, the buffer layer 312, the light-emitting layer 393, and the buffer layer 392 has an island-shaped top surface.
  • The buffer layer 312 and the buffer layer 392 may contain different materials or the same material.
  • As described above, when the buffer layers are formed separately in the light-emitting element 390 and the light-receiving element 310, the degree of freedom for selecting materials of the buffer layers included in the light-emitting element 390 and the light-receiving element 310 can be increased, which facilitates optimization. In addition, the buffer layer 314 and the common electrode 315 are common layers, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • [Structure Example 2-5]
  • A display device 300E illustrated in FIG. 19B differs from the display device 300A mainly in that the buffer layer 314 is not a common layer.
  • The light-receiving element 310 includes the pixel electrode 311, the buffer layer 312, the active layer 313, the buffer layer 314, and the common electrode 315. The light-emitting element 390 includes the pixel electrode 391, the buffer layer 312, the light-emitting layer 393, a buffer layer 394, and the common electrode 315. Each of the active layer 313, the buffer layer 314, the light-emitting layer 393, and the buffer layer 394 has an island-shaped top surface.
  • The buffer layer 314 and the buffer layer 394 may include different materials or the same material.
  • As described above, when the buffer layers are formed separately in the light-emitting element 390 and the light-receiving element 310, the degree of freedom for selecting materials of the buffer layers included in the light-emitting element 390 and the light-receiving element 310 can be increased, which facilitates optimization. In addition, the buffer layer 312 and the common electrode 315 are common layers, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • [Structure Example 2-6]
  • A display device 300F illustrated in FIG. 19C differs from the display device 300A mainly in that the buffer layer 312 and the buffer layer 314 are not common layers.
  • The light-receiving element 310 includes the pixel electrode 311, the buffer layer 312, the active layer 313, the buffer layer 314, and the common electrode 315. The light-emitting element 390 includes the pixel electrode 391, the buffer layer 392, the light-emitting layer 393, the buffer layer 394, and the common electrode 315. Each of the buffer layer 312, the active layer 313, the buffer layer 314, the buffer layer 392, the light-emitting layer 393, and the buffer layer 394 has an island-shaped top surface.
  • As described above, when the buffer layers are formed separately in the light-emitting element 390 and the light-receiving element 310, the degree of freedom for selecting materials of the buffer layers included in the light-emitting element 390 and the light-receiving element 310 can be increased, which facilitates optimization. In addition, the common electrode 315 is a common layer, whereby the manufacturing process can be simplified and manufacturing cost can be reduced as compared to the case where the light-emitting element 390 and the light-receiving element 310 are manufactured separately.
  • [Structure Example 3 of Display Device]
  • A more detailed structure of the display device of one embodiment of the present invention will be described below. Here, in particular, an example of the display device including light-emitting and light-receiving elements and light-emitting elements will be described.
  • Note that in the description below, the above description is referred to for portions similar to those described above and the description of the portions is omitted in some cases.
  • [Structure Example 3-1]
  • FIG. 20A illustrates a cross-sectional view of a display device 300G. The display device 300G includes a light-emitting and light-receiving element 390SR, a light-emitting element 390G, and a light-emitting element 390B.
  • The light-emitting and light-receiving element 390SR has a function of a light-emitting element that emits red light 321R, and a function of a photoelectric conversion element that receives the light 322. The light-emitting element 390G can emit green light 321G. The light-emitting element 390B can emit blue light 321B.
  • The light-emitting and light-receiving element 390SR includes the pixel electrode 311, the buffer layer 312, the active layer 313, a light-emitting layer 393R, the buffer layer 314, and the common electrode 315. The light-emitting element 390G includes a pixel electrode 391G, the buffer layer 312, a light-emitting layer 393G, the buffer layer 314, and the common electrode 315. The light-emitting element 390B includes a pixel electrode 391B, the buffer layer 312, a light-emitting layer 393B, the buffer layer 314, and the common electrode 315.
  • The buffer layer 312, the buffer layer 314, and the common electrode 315 are common layers shared by the light-emitting and light-receiving element 390SR, the light-emitting element 390G, and the light-emitting element 390B and provided across them. Each of the active layer 313, the light-emitting layer 393R, the light-emitting layer 393G, and the light-emitting layer 393B has an island-shaped top surface. Note that although the stack body including the active layer 313 and the light-emitting layer 393R, the light-emitting layer 393G, and the light-emitting layer 393B are provided separately from one another in the example illustrated in FIG. 20A, adjacent two of them may include a region where the two overlap with each other.
  • Note that as in the case of the display device 300D, the display device 300E, or the display device 300F, the display device 300G can have a structure in which one or both of the buffer layer 312 and the buffer layer 314 are not used as common layers.
  • The pixel electrode 311 is electrically connected to one of the source and the drain of the transistor 331. The pixel electrode 391G is electrically connected to one of a source and a drain of a transistor 332G. The pixel electrode 391B is electrically connected to one of a source and a drain of a transistor 332B.
  • With such a structure, a display device with higher resolution can be achieved.
  • [Structure Example 3-2]
  • A display device 300H illustrated in FIG. 20B differs from the display device 300G mainly in the structure of the light-emitting and light-receiving element 390SR.
  • The light-emitting and light-receiving element 390SR includes a light-emitting and light-receiving layer 318R instead of the active layer 313 and the light-emitting layer 393R.
  • The light-emitting and light-receiving layer 318R is a layer that has both a function of a light-emitting layer and a function of an active layer. For example, a layer including the above-described light-emitting substance, an n-type semiconductor, and a p-type semiconductor can be used.
  • With such a structure, the manufacturing process can be simplified, facilitating cost reduction.
  • [Structure Example 4 of Display Device]
  • A more specific structure of the display device of one embodiment of the present invention will be described below.
  • FIG. 21 illustrates a perspective view of a display device 400, and FIG. 22A illustrates a cross-sectional view of the display device 400.
  • In the display device 400, a substrate 353 and a substrate 354 are bonded to each other. In FIG. 21, the substrate 354 is denoted by a dashed line.
  • The display device 400 includes a display portion 362, a circuit 364, a wiring 365, and the like. FIG. 21 illustrates an example in which the display device 400 is provided with an IC (integrated circuit) 373 and an FPC 372. Thus, the structure illustrated in FIG. 21 can also be regarded as a display module including the display device 400, the IC, and the FPC.
  • As the circuit 364, for example, a scan line driver circuit can be used.
  • The wiring 365 has a function of supplying a signal and power to the display portion 362 and the circuit 364. The signal and power are input to the wiring 365 from the outside through the FPC 372 or input to the wiring 365 from the IC 373.
  • FIG. 21 illustrates an example in which the IC 373 is provided over the substrate 353 by a COG (Chip On Glass) method, a COF (Chip On Film) method, or the like. An IC including a scan line driver circuit, a signal line driver circuit, or the like can be used as the IC 373, for example. Note that the display device 400 and the display module are not necessarily provided with an IC. The IC may be mounted on the FPC by a COF method or the like.
  • FIG. 22A illustrates an example of cross-sections of part of a region including the FPC 372, part of a region including the circuit 364, part of a region including the display portion 362, and part of a region including an end portion of the display device 400 illustrated in FIG. 21.
  • The display device 400 illustrated in FIG. 22A includes a transistor 408, a transistor 409, a transistor 410, the light-emitting element 390, the light-receiving element 310, and the like between the substrate 353 and the substrate 354.
  • The substrate 354 and the protective layer 395 are bonded to each other with the adhesive layer 342, and a solid sealing structure is used for the display device 400.
  • The substrate 353 and an insulating layer 412 are bonded to each other with an adhesive layer 355.
  • In a method for manufacturing the display device 400, first, a formation substrate provided with the insulating layer 412, the transistors, the light-receiving element 310, the light-emitting element 390, and the like is bonded to the substrate 354 provided with the light-blocking layer 358 and the like with the adhesive layer 342. Then, with the use of the adhesive layer 355, the substrate 353 is attached to a surface exposed by separation of the formation substrate, whereby the components formed over the formation substrate are transferred onto the substrate 353. The substrate 353 and the substrate 354 preferably have flexibility. This can increase the flexibility of the display device 400.
  • The light-emitting element 390 has a stacked-layer structure in which the pixel electrode 391, the buffer layer 312, the light-emitting layer 393, the buffer layer 314, and the common electrode 315 are stacked in this order from the insulating layer 414 side. The pixel electrode 391 is electrically connected to one of a source and a drain of the transistor 408 through an opening provided in the insulating layer 414. The transistor 408 has a function of controlling a current flowing through the light-emitting element 390.
  • The light-receiving element 310 has a stacked-layer structure in which the pixel electrode 311, the buffer layer 312, the active layer 313, the buffer layer 314, and the common electrode 315 are stacked in this order from the insulating layer 414 side. The pixel electrode 311 is connected to one of a source and a drain of the transistor 409 through an opening provided in the insulating layer 414. The transistor 409 has a function of controlling transfer of charge accumulated in the light-receiving element 310.
  • Light emitted by the light-emitting element 390 is emitted toward the substrate 354 side. Light is incident on the light-receiving element 310 through the substrate 354 and the adhesive layer 342. For the substrate 354, a material having a high visible-light-transmitting property is preferably used.
  • The pixel electrode 311 and the pixel electrode 391 can be formed using the same material in the same step. The buffer layer 312, the buffer layer 314, and the common electrode 315 are shared by the light-receiving element 310 and the light-emitting element 390. The light-receiving element 310 and the light-emitting element 390 can have common components except the active layer 313 and the light-emitting layer 393. Thus, the light-receiving element 310 can be incorporated in the display device 400 without a significant increase in the number of manufacturing steps.
  • The light-blocking layer 358 is provided on a surface of the substrate 354 on the substrate 353 side. The light-blocking layer 358 includes openings in a position overlapping with the light-emitting element 390 and in a position overlapping with the light-receiving element 310.
  • Providing the light-blocking layer 358 can control the range where the light-receiving element 310 detects light. As described above, it is preferable to control light to be incident on the light-receiving element 310 by adjusting the position and area of the opening of the light-blocking layer provided in the position overlapping with the light-receiving element 310. Furthermore, with the light-blocking layer 358, light can be inhibited from being incident on the light-receiving element 310 directly from the light-emitting element 390 without being reflected by an object. Hence, a sensor with less noise and high sensitivity can be obtained.
  • An end portion of the pixel electrode 311 and an end portion of the pixel electrode 391 are each covered with the partition 416. The pixel electrode 311 and the pixel electrode 391 each include a material that reflects visible light, and the common electrode 315 includes a material that transmits visible light.
  • A region where part of the active layer 313 overlaps with part of the light-emitting layer 393 is included in the example illustrated in FIG. 22A. The portion where the active layer 313 overlaps with the light-emitting layer 393 preferably overlaps with the light-blocking layer 358 and the partition 416.
  • The transistor 408, the transistor 409, and the transistor 410 are formed over the substrate 353. These transistors can be formed using the same materials in the same steps.
  • The insulating layer 412, an insulating layer 411, an insulating layer 425, an insulating layer 415, an insulating layer 418, and the insulating layer 414 are provided in this order over the substrate 353 with the adhesive layer 355 therebetween. Each of the insulating layer 411 and the insulating layer 425 partially functions as a gate insulating layer for the transistors. The insulating layer 415 and the insulating layer 418 are provided to cover the transistors. The insulating layer 414 is provided to cover the transistors and has a function of a planarization layer. Note that there is no limitation on the number of gate insulating layers and the number of insulating layers covering the transistors, and each insulating layer may have either a single layer or two or more layers.
  • A material into which impurities such as water or hydrogen do not easily diffuse is preferably used for at least one of the insulating layers that cover the transistors. This allows the insulating layer to serve as a barrier layer. Such a structure can effectively inhibit diffusion of impurities into the transistors from the outside and increase the reliability of the display device.
  • An inorganic insulating film is preferably used as each of the insulating layer 411, the insulating layer 412, the insulating layer 425, the insulating layer 415, and the insulating layer 418. As the inorganic insulating film, a silicon nitride film, a silicon oxynitride film, a silicon oxide film, a silicon nitride oxide film, an aluminum oxide film, or an aluminum nitride film can be used, for example. A hafnium oxide film, a hafnium oxynitride film, a hafnium nitride oxide film, an yttrium oxide film, a zirconium oxide film, a gallium oxide film, a tantalum oxide film, a magnesium oxide film, a lanthanum oxide film, a cerium oxide film, a neodymium oxide film, or the like may be used. A stack including two or more of the above insulating films may also be used.
  • Here, an organic insulating film often has a lower barrier property than an inorganic insulating film. Therefore, the organic insulating film preferably has an opening in the vicinity of an end portion of the display device 400. In a region 428 illustrated in FIG. 22A, an opening is formed in the insulating layer 414. This can inhibit entry of impurities from the end portion of the display device 400 through the organic insulating film. Alternatively, the organic insulating film may be formed so that an end portion of the organic insulating film is positioned on the inner side compared to the end portion of the display device 400, to prevent the organic insulating film from being exposed at the end portion of the display device 400.
  • In the region 428 in the vicinity of the end portion of the display device 400, the insulating layer 418 and the protective layer 395 are preferably in contact with each other through the opening in the insulating layer 414. In particular, the inorganic insulating film included in the insulating layer 418 and the inorganic insulating film included in the protective layer 395 are preferably in contact with each other. Thus, entry of impurities into the display portion 362 from the outside through an organic insulating film can be inhibited, and the reliability of the display device 400 can be increased.
  • An organic insulating film is suitable for the insulating layer 414 functioning as a planarization layer. Examples of materials that can be used for the organic insulating film include an acrylic resin, a polyimide resin, an epoxy resin, a polyamide resin, a polyimide-amide resin, a siloxane resin, a benzocyclobutene-based resin, a phenol resin, and precursors of these resins.
  • Providing the protective layer 395 covering the light-emitting element 390 and the light-receiving element 310 can inhibit impurities such as water from entering the light-emitting element 390 and the light-receiving element 310 and increase the reliability of the light-emitting element 390 and the light-receiving element 310.
  • The protective layer 395 may have a single-layer structure or a stacked-layer structure. For example, the protective layer 395 may have a stacked-layer structure of an organic insulating film and an inorganic insulating film. In that case, an end portion of the inorganic insulating film preferably extends beyond an end portion of the organic insulating film.
  • FIG. 22B is a cross-sectional view of a transistor 401 a that can be used as the transistor 408, the transistor 409, and the transistor 410.
  • The transistor 401 a is provided over the insulating layer 412 (not illustrated) and includes a conductive layer 421 functioning as a first gate, the insulating layer 411 functioning as a first gate insulating layer, a semiconductor layer 431, the insulating layer 425 functioning as a second gate insulating layer, and a conductive layer 423 functioning as a second gate. The insulating layer 411 is positioned between the conductive layer 421 and the semiconductor layer 431. The insulating layer 425 is positioned between the conductive layer 423 and the semiconductor layer 431.
  • The semiconductor layer 431 includes a region 431 i and a pair of regions 431 n. The region 431 i functions as a channel formation region. One of the pair of regions 431 n serves as a source and the other thereof serves as a drain. The regions 431 n have higher carrier concentration and higher conductivity than the region 431 i. The conductive layer 422 a and the conductive layer 422 b are connected to the regions 431 n through openings provided in the insulating layer 418 and the insulating layer 415.
  • FIG. 22C is a cross-sectional view of a transistor 401 b that can be used as the transistor 408, the transistor 409, and the transistor 410. In the example illustrated in FIG. 22C, the insulating layer 415 is not provided. In the transistor 401 b, the insulating layer 425 is processed in the same manner as the conductive layer 423, and the insulating layer 418 is in contact with the regions 431 n.
  • Note that there is no particular limitation on the structure of the transistors included in the display device of this embodiment. For example, a planar transistor, a staggered transistor, or an inverted staggered transistor can be used. A top-gate or a bottom-gate transistor structure may be employed. Alternatively, gates may be provided above and below a semiconductor layer in which a channel is formed.
  • The structure in which the semiconductor layer where a channel is formed is provided between two gates is used for the transistor 408, the transistor 409, and the transistor 410. The two gates may be connected to each other and supplied with the same signal to drive the transistor. Alternatively, a potential for controlling the threshold voltage may be supplied to one of the two gates and a potential for driving may be supplied to the other to control the threshold voltage of the transistor.
  • There is no particular limitation on the crystallinity of a semiconductor material used for the transistors; any of an amorphous semiconductor, a single crystal semiconductor, and a semiconductor having crystallinity (a microcrystalline semiconductor, a polycrystalline semiconductor, or a semiconductor partly including crystal regions) may be used. A semiconductor having crystallinity is preferably used, in which case deterioration of the transistor characteristics can be suppressed.
  • The semiconductor layer of the transistor preferably includes a metal oxide (also referred to as an oxide semiconductor). Alternatively, the semiconductor layer of the transistor may include silicon. Examples of silicon include amorphous silicon and crystalline silicon (e.g., low-temperature polysilicon or single crystal silicon).
  • The semiconductor layer preferably includes indium, M (M is one or more kinds selected from gallium, aluminum, silicon, boron, yttrium, tin, copper, vanadium, beryllium, titanium, iron, nickel, germanium, zirconium, molybdenum, lanthanum, cerium, neodymium, hafnium, tantalum, tungsten, and magnesium), and zinc, for example. In particular, M is preferably one or more kinds selected from aluminum, gallium, yttrium, and tin.
  • It is particularly preferable to use an oxide containing indium (In), gallium (Ga), and zinc (Zn) (also referred to as IGZO) for the semiconductor layer.
  • When the semiconductor layer is an In-M-Zn oxide, the atomic ratio of In is preferably greater than or equal to the atomic ratio of M in the In-M-Zn oxide. Examples of the atomic ratio of the metal elements in such an In-M-Zn oxide include In:M:Zn=1:1:1 or a composition in the neighborhood thereof, In:M:Zn=1:1:1.2 or a composition in the neighborhood thereof, In:M:Zn=2:1:3 or a composition in the neighborhood thereof, In:M:Zn=3:1:2 or a composition in the neighborhood thereof, In:M:Zn=4:2:3 or a composition in the neighborhood thereof, In:M:Zn=4:2:4.1 or a composition in the neighborhood thereof, In:M:Zn=5:1:3 or a composition in the neighborhood thereof, In:M:Zn=5:1:6 or a composition in the neighborhood thereof, In:M:Zn=5:1:7 or a composition in the neighborhood thereof, In:M:Zn=5:1:8 or a composition in the neighborhood thereof, In:M:Zn=6:1:6 or a composition in the neighborhood thereof, and In:M:Zn=5:2:5 or a composition in the neighborhood thereof. Note that a composition in the neighborhood includes the range of ±30% of a desired atomic ratio.
  • For example, when the atomic ratio is described as In:Ga:Zn=4:2:3 or a composition in the neighborhood thereof, the case is included where the atomic ratio of Ga is greater than or equal to 1 and less than or equal to 3 and the atomic ratio of Zn is greater than or equal to 2 and less than or equal to 4 with the atomic ratio of In being 4. When the atomic ratio is described as In:Ga:Zn=5:1:6 or a composition in the neighborhood thereof, the case is included where the atomic ratio of Ga is greater than 0.1 and less than or equal to 2 and the atomic ratio of Zn is greater than or equal to 5 and less than or equal to 7 with the atomic ratio of In being 5. When the atomic ratio is described as In:Ga:Zn=1:1:1 or a composition in the neighborhood thereof, the case is included where the atomic ratio of Ga is greater than 0.1 and less than or equal to 2 and the atomic ratio of Zn is greater than 0.1 and less than or equal to 2 with the atomic ratio of In being 1.
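  • The "composition in the neighborhood" ranges quoted above can be read as a simple numerical check. The sketch below is only an illustration of that reading and is not part of the disclosure; the function name, the data layout, and the treatment of the range boundaries are assumptions made for the example.

```python
# Illustrative check of the explicit In:Ga:Zn "neighborhood" ranges quoted above.
# Boundary handling (inclusive vs. strictly greater than) is simplified.
NEIGHBORHOOD_RANGES = {
    (4, 2, 3): {"In": (4, 4), "Ga": (1, 3), "Zn": (2, 4)},
    (5, 1, 6): {"In": (5, 5), "Ga": (0.1, 2), "Zn": (5, 7)},
    (1, 1, 1): {"In": (1, 1), "Ga": (0.1, 2), "Zn": (0.1, 2)},
}

def in_neighborhood(nominal, measured):
    """Return True if a measured {"In": x, "Ga": y, "Zn": z} atomic ratio falls
    within the stated neighborhood of the nominal In:Ga:Zn composition."""
    ranges = NEIGHBORHOOD_RANGES[nominal]
    return all(lo <= measured[element] <= hi for element, (lo, hi) in ranges.items())

# A film measured at In:Ga:Zn = 4:2.5:3.5 is in the neighborhood of 4:2:3.
print(in_neighborhood((4, 2, 3), {"In": 4, "Ga": 2.5, "Zn": 3.5}))  # True
print(in_neighborhood((4, 2, 3), {"In": 4, "Ga": 3.5, "Zn": 3.5}))  # False
```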
  • The transistor 410 included in the circuit 364 and the transistor 408 and the transistor 409 included in the display portion 362 may have the same structure or different structures. A plurality of transistors included in the circuit 364 may have the same structure or two or more kinds of structures. Similarly, a plurality of transistors included in the display portion 362 may have the same structure or two or more kinds of structures.
  • A connection portion 404 is provided in a region of the substrate 353 that does not overlap with the substrate 354. In the connection portion 404, the wiring 365 is electrically connected to the FPC 372 through a conductive layer 366 and a connection layer 442. The conductive layer 366 obtained by processing the same conductive film as the pixel electrode 311 and the pixel electrode 391 is exposed on a top surface of the connection portion 404. Thus, the connection portion 404 and the FPC 372 can be electrically connected to each other through the connection layer 442.
  • A variety of optical members can be arranged on the outer side of the substrate 354. Examples of the optical members include a polarizing plate, a retardation plate, a light diffusion layer (a diffusion film or the like), an anti-reflective layer, and a light-condensing film. Furthermore, an antistatic film preventing the attachment of dust, a water-repellent film inhibiting the attachment of stains, a hard coat film inhibiting scratches caused by use, a shock absorption layer, or the like may be placed on the outer side of the substrate 354.
  • When a flexible material is used for the substrate 353 and the substrate 354, the flexibility of the display device can be increased. The material is not limited thereto, and glass, quartz, ceramic, sapphire, resin, or the like can be used for each of the substrate 353 and the substrate 354.
  • As the adhesive layer, a variety of curable adhesives, e.g., a photocurable adhesive such as an ultraviolet curable adhesive, a reactive curable adhesive, a thermosetting adhesive, and an anaerobic adhesive can be used. Examples of these adhesives include an epoxy resin, an acrylic resin, a silicone resin, a phenol resin, a polyimide resin, an imide resin, a PVC (polyvinyl chloride) resin, a PVB (polyvinyl butyral) resin, and an EVA (ethylene vinyl acetate) resin. In particular, a material with low moisture permeability, such as an epoxy resin, is preferred. Alternatively, a two-component resin may be used. An adhesive sheet or the like may be used.
  • As the connection layer, an anisotropic conductive film (ACF), an anisotropic conductive paste (ACP), or the like can be used.
  • Examples of materials that can be used for a gate, a source, and a drain of a transistor and conductive layers such as a variety of wirings and electrodes included in a display device include metals such as aluminum, titanium, chromium, nickel, copper, yttrium, zirconium, molybdenum, silver, tantalum, or tungsten, and an alloy containing any of these metals as its main component. A film containing any of these materials can be used in a single layer or as a stacked-layer structure.
  • As a light-transmitting conductive material, a conductive oxide such as indium oxide, indium tin oxide, indium zinc oxide, zinc oxide, or zinc oxide containing gallium, or graphene can be used. Alternatively, a metal material such as gold, silver, platinum, magnesium, nickel, tungsten, chromium, molybdenum, iron, cobalt, copper, palladium, or titanium, or an alloy material containing the metal material can be used. Further alternatively, a nitride of the metal material (e.g., titanium nitride) or the like may be used. Note that in the case of using the metal material or the alloy material (or the nitride thereof), the thickness is preferably set small enough to be able to transmit light. A stacked-layer film of any of the above materials can be used as a conductive layer. For example, a stacked-layer film of indium tin oxide and an alloy of silver and magnesium, or the like is preferably used for increased conductivity. These materials can also be used for conductive layers such as a variety of wirings and electrodes that constitute a display device, or conductive layers (conductive layers functioning as a pixel electrode or a common electrode) included in a light-emitting element and a light-receiving element (or a light-emitting and light-receiving element).
  • As an insulating material that can be used for each insulating layer, for example, a resin such as an acrylic resin or an epoxy resin, and an inorganic insulating material such as silicon oxide, silicon oxynitride, silicon nitride oxide, silicon nitride, or aluminum oxide can be given.
  • At least part of this embodiment can be implemented in combination with the other embodiments described in this specification as appropriate.
  • Embodiment 3
  • In this embodiment, a metal oxide (also referred to as an oxide semiconductor) that can be used in the OS transistor described in the above embodiment is described.
  • The metal oxide preferably contains at least indium or zinc. In particular, indium and zinc are preferably contained. In addition, aluminum, gallium, yttrium, tin, or the like is preferably contained. Furthermore, one or more kinds selected from boron, silicon, titanium, iron, nickel, germanium, zirconium, molybdenum, lanthanum, cerium, neodymium, hafnium, tantalum, tungsten, magnesium, cobalt, and the like may be contained.
  • The metal oxide can be formed by a sputtering method, a chemical vapor deposition (CVD) method such as a metal organic chemical vapor deposition (MOCVD) method, an atomic layer deposition (ALD) method, or the like.
  • <Classification of Crystal Structure>
  • Amorphous (including a completely amorphous structure), CAAC (c-axis-aligned crystalline), nc (nanocrystalline), CAC (cloud-aligned composite), single-crystal, and polycrystalline (polycrystal) structures can be given as examples of a crystal structure of an oxide semiconductor.
  • Note that a crystal structure of a film or a substrate can be evaluated with an X-ray diffraction (XRD) spectrum. For example, evaluation is possible using an XRD spectrum which is obtained by GIXD (Grazing-Incidence XRD) measurement. Note that a GIXD method is also referred to as a thin film method or a Seemann-Bohlin method.
  • For example, the XRD spectrum of the quartz glass substrate shows a peak with a substantially bilaterally symmetrical shape. On the other hand, the peak of the XRD spectrum of the IGZO film having a crystal structure has a bilaterally asymmetrical shape. The asymmetrical peak of the XRD spectrum clearly shows the existence of crystal in the film or the substrate. In other words, the crystal structure of the film or the substrate cannot be regarded as “amorphous” unless it has a bilaterally symmetrical peak in the XRD spectrum.
  • A crystal structure of a film or a substrate can also be evaluated with a diffraction pattern obtained by a nanobeam electron diffraction (NBED) method (such a pattern is also referred to as a nanobeam electron diffraction pattern). For example, a halo pattern is observed in the diffraction pattern of the quartz glass substrate, which indicates that the quartz glass substrate is in an amorphous state. Furthermore, not a halo pattern but a spot-like pattern is observed in the diffraction pattern of the IGZO film deposited at room temperature. Thus, it is suggested that the IGZO film deposited at room temperature is in an intermediate state, which is neither a crystal state nor an amorphous state, and it cannot be concluded that the IGZO film is in an amorphous state.
  • <<Structure of Oxide Semiconductor>>
  • Oxide semiconductors might be classified in a manner different from the above-described one when classified in terms of the structure. Oxide semiconductors are classified into a single crystal oxide semiconductor and a non-single-crystal oxide semiconductor, for example. Examples of the non-single-crystal oxide semiconductor include the above-described CAAC-OS and nc-OS. Other examples of the non-single-crystal oxide semiconductor include a polycrystalline oxide semiconductor, an amorphous-like oxide semiconductor (a-like OS), and an amorphous oxide semiconductor.
  • Here, the above-described CAAC-OS, nc-OS, and a-like OS are described in detail.
  • [CAAC-OS]
  • The CAAC-OS is an oxide semiconductor that has a plurality of crystal regions each of which has c-axis alignment in a particular direction. Note that the particular direction refers to the film thickness direction of a CAAC-OS film, the normal direction of the surface where the CAAC-OS film is formed, or the normal direction of the surface of the CAAC-OS film. The crystal region refers to a region having a periodic atomic arrangement. When an atomic arrangement is regarded as a lattice arrangement, the crystal region also refers to a region with a uniform lattice arrangement. The CAAC-OS has a region where a plurality of crystal regions are connected in the a-b plane direction, and the region has distortion in some cases. Note that distortion refers to a portion where the direction of a lattice arrangement changes between a region with a uniform lattice arrangement and another region with a uniform lattice arrangement in a region where a plurality of crystal regions are connected. That is, the CAAC-OS is an oxide semiconductor having c-axis alignment and having no clear alignment in the a-b plane direction.
  • Note that each of the plurality of crystal regions is formed of one or more fine crystals (crystals each of which has a maximum diameter of less than 10 nm). In the case where the crystal region is formed of one fine crystal, the maximum diameter of the crystal region is less than 10 nm. In the case where the crystal region is formed of a large number of fine crystals, the size of the crystal region may be approximately several tens of nanometers.
  • In the case of an In-M-Zn oxide (the element M is one or more kinds selected from aluminum, gallium, yttrium, tin, titanium, and the like), the CAAC-OS tends to have a layered crystal structure (also referred to as a layered structure) in which a layer containing indium (In) and oxygen (hereinafter, an In layer) and a layer containing the element M, zinc (Zn), and oxygen (hereinafter, an (M,Zn) layer) are stacked. Indium and the element M can be replaced with each other. Therefore, indium may be contained in the (M,Zn) layer. In addition, the element M may be contained in the In layer. Note that Zn may be contained in the In layer. Such a layered structure is observed as a lattice image in a high-resolution TEM (Transmission Electron Microscope) image, for example.
  • When the CAAC-OS film is subjected to structural analysis by Out-of-plane XRD measurement with an XRD apparatus using θ/2θ scanning, for example, a peak indicating c-axis alignment is detected at 2θ of 31° or around 31°. Note that the position of the peak indicating c-axis alignment (the value of 2θ) may change depending on the kind, composition, or the like of the metal element contained in the CAAC-OS.
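  • As a rough illustration of how the θ/2θ result above might be screened, the following sketch flags a peak near 2θ = 31° as suggesting c-axis alignment. The ±1° window, the function name, and the input format are assumptions for illustration only; as noted above, the actual peak position shifts with the composition of the CAAC-OS.

```python
def suggests_c_axis_alignment(two_theta_peaks_deg, center=31.0, window=1.0):
    """Return True if any out-of-plane XRD peak lies at 2-theta of about 31 degrees.

    `two_theta_peaks_deg` is a list of detected peak positions in degrees.
    The +/-1 degree window is an assumed tolerance, not a value from the description.
    """
    return any(abs(peak - center) <= window for peak in two_theta_peaks_deg)

print(suggests_c_axis_alignment([30.8, 56.2]))  # True
print(suggests_c_axis_alignment([33.9]))        # False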
  • For example, a plurality of bright spots are observed in the electron diffraction pattern of the CAAC-OS film. Note that one spot and another spot are observed point-symmetrically with a spot of the incident electron beam passing through a sample (also referred to as a direct spot) as the symmetric center.
  • When the crystal region is observed from the particular direction, a lattice arrangement in the crystal region is basically a hexagonal lattice arrangement; however, a unit lattice is not always a regular hexagon and is a non-regular hexagon in some cases. A pentagonal lattice arrangement, a heptagonal lattice arrangement, and the like are included in the distortion in some cases. Note that a clear crystal grain boundary (grain boundary) cannot be observed even in the vicinity of the distortion in the CAAC-OS. That is, formation of a crystal grain boundary is inhibited by the distortion of lattice arrangement. This is probably because the CAAC-OS can tolerate distortion owing to a low density of arrangement of oxygen atoms in the a-b plane direction, an interatomic bond distance changed by substitution of a metal atom, and the like.
  • Note that a crystal structure in which a clear crystal grain boundary is observed is what is called polycrystal. It is highly probable that the crystal grain boundary becomes a recombination center and captures carriers and thus decreases the on-state current and field-effect mobility of a transistor, for example. Thus, the CAAC-OS in which no clear crystal grain boundary is observed is one of crystalline oxides having a crystal structure suitable for a semiconductor layer of a transistor. Note that Zn is preferably contained to form the CAAC-OS. For example, an In—Zn oxide and an In—Ga—Zn oxide are suitable because they can inhibit generation of a crystal grain boundary as compared with an In oxide.
  • The CAAC-OS is an oxide semiconductor with high crystallinity in which no clear crystal grain boundary is observed. Thus, in the CAAC-OS, a reduction in electron mobility due to the crystal grain boundary is unlikely to occur. Moreover, since the crystallinity of an oxide semiconductor might be decreased by entry of impurities, formation of defects, or the like, the CAAC-OS can be regarded as an oxide semiconductor that has small amounts of impurities and defects (e.g., oxygen vacancies). Thus, an oxide semiconductor including the CAAC-OS is physically stable. Therefore, the oxide semiconductor including the CAAC-OS is resistant to heat and has high reliability. In addition, the CAAC-OS is stable with respect to high temperature in the manufacturing process (what is called thermal budget). Accordingly, the use of the CAAC-OS for the OS transistor can extend the degree of freedom of the manufacturing process.
  • [nc-OS]
  • In the nc-OS, a microscopic region (e.g., a region with a size greater than or equal to 1 nm and less than or equal to 10 nm, in particular, a region with a size greater than or equal to 1 nm and less than or equal to 3 nm) has a periodic atomic arrangement. In other words, the nc-OS includes a fine crystal. Note that the size of the fine crystal is, for example, greater than or equal to 1 nm and less than or equal to 10 nm, particularly greater than or equal to 1 nm and less than or equal to 3 nm; thus, the fine crystal is also referred to as a nanocrystal. Furthermore, there is no regularity of crystal orientation between different nanocrystals in the nc-OS. Thus, the orientation in the whole film is not observed. Accordingly, the nc-OS cannot be distinguished from an a-like OS or an amorphous oxide semiconductor by some analysis methods. For example, when an nc-OS film is subjected to structural analysis by Out-of-plane XRD measurement with an XRD apparatus using θ/2θ scanning, a peak indicating crystallinity is not detected. Furthermore, a diffraction pattern like a halo pattern is observed when the nc-OS film is subjected to electron diffraction (also referred to as selected-area electron diffraction) using an electron beam with a probe diameter larger than the diameter of a nanocrystal (e.g., larger than or equal to 50 nm). Meanwhile, in some cases, a plurality of spots in a ring-like region with a direct spot as the center are observed in a nanobeam electron diffraction pattern of the nc-OS film obtained using an electron beam with a probe diameter nearly equal to or smaller than the diameter of a nanocrystal (e.g., 1 nm or larger and 30 nm or smaller).
  • [A-Like OS]
  • The a-like OS is an oxide semiconductor having a structure between those of the nc-OS and the amorphous oxide semiconductor. The a-like OS contains a void or a low-density region. That is, the a-like OS has lower crystallinity than the nc-OS and the CAAC-OS. Moreover, the a-like OS has a higher hydrogen concentration in the film than the nc-OS and the CAAC-OS.
  • <<Composition of Oxide Semiconductor>>
  • Next, the above-described CAC-OS is described in detail. Note that the CAC-OS relates to the material composition.
  • [CAC-OS]
  • The CAC-OS refers to one composition of a material in which elements constituting a metal oxide are unevenly distributed with a size greater than or equal to 0.5 nm and less than or equal to 10 nm, preferably greater than or equal to 1 nm and less than or equal to 3 nm, or a similar size, for example. Note that a state in which one or more metal elements are unevenly distributed and regions including the metal element(s) are mixed with a size greater than or equal to 0.5 nm and less than or equal to 10 nm, preferably greater than or equal to 1 nm and less than or equal to 3 nm, or a similar size in a metal oxide is hereinafter referred to as a mosaic pattern or a patch-like pattern.
  • In addition, the CAC-OS has a composition in which materials are separated into a first region and a second region to form a mosaic pattern, and the first regions are distributed in the film (this composition is hereinafter also referred to as a cloud-like composition). That is, the CAC-OS is a composite metal oxide having a composition in which the first regions and the second regions are mixed.
  • Note that the atomic ratios of In, Ga, and Zn to the metal elements contained in the CAC-OS in an In—Ga—Zn oxide are denoted by [In], [Ga], and [Zn], respectively. For example, the first region in the CAC-OS in the In—Ga—Zn oxide has [In] higher than that in the composition of the CAC-OS film. Moreover, the second region has [Ga] higher than that in the composition of the CAC-OS film. For example, the first region has higher [In] and lower [Ga] than the second region. Moreover, the second region has higher [Ga] and lower [In] than the first region.
  • Specifically, the first region contains indium oxide, indium zinc oxide, or the like as its main component. The second region contains gallium oxide, gallium zinc oxide, or the like as its main component. That is, the first region can be referred to as a region containing In as its main component. The second region can be referred to as a region containing Ga as its main component.
  • Note that a clear boundary between the first region and the second region cannot be observed in some cases.
  • In a material composition of a CAC-OS in an In—Ga—Zn oxide that contains In, Ga, Zn, and O, regions containing Ga as a main component are observed in part of the CAC-OS and regions containing In as a main component are observed in part thereof. These regions are randomly present to form a mosaic pattern. Thus, it is suggested that the CAC-OS has a structure in which metal elements are unevenly distributed.
  • The CAC-OS can be formed by a sputtering method under a condition where a substrate is not heated, for example. Moreover, in the case of forming the CAC-OS by a sputtering method, any one or more selected from an inert gas (typically, argon), an oxygen gas, and a nitrogen gas are used as a deposition gas. The ratio of the flow rate of the oxygen gas to the total flow rate of the deposition gas at the time of deposition is preferably as low as possible; for example, it is preferably higher than or equal to 0% and less than 30%, further preferably higher than or equal to 0% and less than or equal to 10%.
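  • The deposition-gas condition above amounts to keeping the oxygen fraction of the total flow low. A minimal sketch of that check follows; the function names and the sccm example values are assumptions for illustration and do not come from the disclosure.

```python
def oxygen_flow_fraction(o2_flow_sccm, total_flow_sccm):
    """Fraction of the total deposition-gas flow that is oxygen."""
    return o2_flow_sccm / total_flow_sccm

def meets_cac_flow_condition(o2_flow_sccm, total_flow_sccm, limit=0.30):
    """True when the oxygen flow ratio is in the stated range (>= 0% and < 30%).

    The further preferred condition (<= 10%) is approximated by passing limit=0.10.
    """
    fraction = oxygen_flow_fraction(o2_flow_sccm, total_flow_sccm)
    return 0.0 <= fraction < limit

# Example: 5 sccm O2 out of 50 sccm total (Ar + O2) gives a 10% oxygen flow ratio.
print(meets_cac_flow_condition(5, 50))   # True
print(meets_cac_flow_condition(20, 50))  # False (40%)
```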
  • For example, energy dispersive X-ray spectroscopy (EDX) is used to obtain EDX mapping, and according to the EDX mapping, the CAC-OS in the In—Ga—Zn oxide has a structure in which the region containing In as its main component (the first region) and the region containing Ga as its main component (the second region) are unevenly distributed and mixed.
  • Here, the first region has a higher conductivity than the second region. In other words, when carriers flow through the first region, the conductivity of a metal oxide is exhibited. Accordingly, when the first regions are distributed in a metal oxide like a cloud, high field-effect mobility (μ) can be achieved.
  • The second region has a higher insulating property than the first region. In other words, when the second regions are distributed in a metal oxide, leakage current can be inhibited.
  • Thus, in the case where a CAC-OS is used for a transistor, by the complementary action of the conductivity due to the first region and the insulating property due to the second region, the CAC-OS can have a switching function (On/Off function). That is, the CAC-OS has a conducting function in part of the material and has an insulating function in another part of the material; as a whole, the CAC-OS has a function of a semiconductor. Separation of the conducting function and the insulating function can maximize each function. Accordingly, when the CAC-OS is used for a transistor, high on-state current (Ion), high field-effect mobility (μ), and excellent switching operation can be achieved.
  • A transistor using the CAC-OS has high reliability. Thus, the CAC-OS is most suitable for a variety of semiconductor devices such as display devices.
  • An oxide semiconductor has various structures with different properties. Two or more kinds among the amorphous oxide semiconductor, the polycrystalline oxide semiconductor, the a-like OS, the CAC-OS, the nc-OS, and the CAAC-OS may be included in an oxide semiconductor of one embodiment of the present invention.
  • <Transistor Including Oxide Semiconductor>
  • Next, the case where the above oxide semiconductor is used for a transistor is described.
  • When the above oxide semiconductor is used for a transistor, a transistor with high field-effect mobility can be achieved. In addition, a transistor having high reliability can be achieved.
  • An oxide semiconductor having a low carrier concentration is preferably used in a transistor. For example, the carrier concentration of an oxide semiconductor is lower than or equal to 1×10¹⁷ cm⁻³, preferably lower than or equal to 1×10¹⁵ cm⁻³, further preferably lower than or equal to 1×10¹³ cm⁻³, still further preferably lower than or equal to 1×10¹¹ cm⁻³, yet further preferably lower than 1×10¹⁰ cm⁻³, and higher than or equal to 1×10⁻⁹ cm⁻³. In order to reduce the carrier concentration in an oxide semiconductor film, the impurity concentration in the oxide semiconductor film is reduced so that the density of defect states can be reduced. In this specification and the like, a state with a low impurity concentration and a low density of defect states is referred to as a highly purified intrinsic or substantially highly purified intrinsic state. Note that an oxide semiconductor having a low carrier concentration may be referred to as a highly purified intrinsic or substantially highly purified intrinsic oxide semiconductor.
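  • The carrier-concentration preferences listed above form a simple ladder of upper limits. A minimal sketch of placing a measured value on that ladder follows; the tier labels and boundary handling are assumptions made only for the example.

```python
def carrier_concentration_tier(n_cm3):
    """Place a carrier concentration (cm^-3) on the preference ladder described above.

    Boundary handling (< vs. <=) is simplified; the tier labels are illustrative only.
    """
    if n_cm3 < 1e-9 or n_cm3 > 1e17:
        return "outside the stated range"
    if n_cm3 < 1e10:
        return "yet further preferable"
    if n_cm3 <= 1e11:
        return "still further preferable"
    if n_cm3 <= 1e13:
        return "further preferable"
    if n_cm3 <= 1e15:
        return "preferable"
    return "within the stated upper limit"

print(carrier_concentration_tier(5e12))  # "further preferable"
```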
  • A highly purified intrinsic or substantially highly purified intrinsic oxide semiconductor film has a low density of defect states and thus has a low density of trap states in some cases.
  • Charge trapped by the trap states in the oxide semiconductor takes a long time to disappear and might behave like fixed charge. Thus, a transistor whose channel formation region is formed in an oxide semiconductor with a high density of trap states has unstable electrical characteristics in some cases.
  • Accordingly, in order to obtain stable electrical characteristics of a transistor, reducing the impurity concentration in an oxide semiconductor is effective. In order to reduce the impurity concentration in the oxide semiconductor, it is preferable that the impurity concentration in an adjacent film be also reduced. Examples of impurities include hydrogen, nitrogen, an alkali metal, an alkaline earth metal, iron, nickel, and silicon.
  • <Impurity>
  • Here, the influence of each impurity in the oxide semiconductor is described.
  • When silicon or carbon, each of which is a Group 14 element, is contained in the oxide semiconductor, defect states are formed in the oxide semiconductor. Thus, the concentration of silicon or carbon in the oxide semiconductor and the concentration of silicon or carbon in the vicinity of an interface with the oxide semiconductor (the concentration obtained by secondary ion mass spectrometry (SIMS)) are each set lower than or equal to 2×10¹⁸ atoms/cm³, preferably lower than or equal to 2×10¹⁷ atoms/cm³.
  • When the oxide semiconductor contains an alkali metal or an alkaline earth metal, defect states are formed and carriers are generated in some cases. Thus, a transistor using an oxide semiconductor that contains an alkali metal or an alkaline earth metal is likely to have normally-on characteristics. Accordingly, the concentration of an alkali metal or an alkaline earth metal in the oxide semiconductor, which is obtained by SIMS, is set lower than or equal to 1×10¹⁸ atoms/cm³, preferably lower than or equal to 2×10¹⁶ atoms/cm³.
  • Furthermore, when the oxide semiconductor contains nitrogen, the oxide semiconductor easily becomes n-type by generation of electrons serving as carriers and an increase in carrier concentration. As a result, a transistor using an oxide semiconductor containing nitrogen as a semiconductor is likely to have normally-on characteristics. When nitrogen is contained in the oxide semiconductor, trap states are sometimes formed. This might make the electrical characteristics of the transistor unstable. Therefore, the concentration of nitrogen in the oxide semiconductor, which is obtained by SIMS, is set lower than 5×10¹⁹ atoms/cm³, preferably lower than or equal to 5×10¹⁸ atoms/cm³, further preferably lower than or equal to 1×10¹⁸ atoms/cm³, still further preferably lower than or equal to 5×10¹⁷ atoms/cm³.
  • Hydrogen contained in the oxide semiconductor reacts with oxygen bonded to a metal atom to form water, and thus forms an oxygen vacancy in some cases. Entry of hydrogen into the oxygen vacancy generates an electron serving as a carrier in some cases. Furthermore, bonding of part of hydrogen to oxygen bonded to a metal atom causes generation of an electron serving as a carrier in some cases. Thus, a transistor using an oxide semiconductor containing hydrogen is likely to have normally-on characteristics. Accordingly, hydrogen in the oxide semiconductor is preferably reduced as much as possible. Specifically, the hydrogen concentration in the oxide semiconductor, which is obtained by SIMS, is set lower than 1×10²⁰ atoms/cm³, preferably lower than 1×10¹⁹ atoms/cm³, further preferably lower than 5×10¹⁸ atoms/cm³, still further preferably lower than 1×10¹⁸ atoms/cm³.
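  • The impurity limits above can be collected into a single screening step over a SIMS profile, as sketched below. The dictionary keys, the function name, and the example values are assumptions for illustration; only the least strict ("set lower than …") limit for each impurity is modeled here.

```python
# Upper limits in atoms/cm^3, taken from the description above (least strict values).
IMPURITY_LIMITS = {
    "silicon_or_carbon": 2e18,
    "alkali_or_alkaline_earth_metal": 1e18,
    "nitrogen": 5e19,
    "hydrogen": 1e20,
}

def impurities_over_limit(sims_profile):
    """Return the impurities whose SIMS concentration is not below the stated limit.

    `sims_profile` maps the keys above to concentrations in atoms/cm^3.
    Inclusive vs. strict boundaries are simplified for this sketch.
    """
    return [name for name, value in sims_profile.items()
            if value >= IMPURITY_LIMITS[name]]

profile = {"silicon_or_carbon": 1e17,
           "alkali_or_alkaline_earth_metal": 5e15,
           "nitrogen": 8e19,
           "hydrogen": 5e18}
print(impurities_over_limit(profile))  # ['nitrogen']
```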
  • When an oxide semiconductor with sufficiently reduced impurities is used for the channel formation region of the transistor, stable electrical characteristics can be given.
  • This embodiment can be combined with the other embodiments as appropriate.
  • Embodiment 4
  • In this embodiment, electronic devices of embodiments of the present invention are described with reference to FIG. 23A to FIG. 25F.
  • An electronic device of one embodiment of the present invention can perform image capturing and detect touch operation (contact or approach) in a display portion. Thus, the electronic device can have improved functionality and convenience, for example.
  • Examples of the electronic devices of embodiments of the present invention include a digital camera, a digital video camera, a digital photo frame, a mobile phone, a portable game console, a portable information terminal, and an audio reproducing device, in addition to electronic devices with a relatively large screen, such as a television device, a desktop or laptop personal computer, a monitor of a computer or the like, digital signage, and a large game machine such as a pachinko machine.
  • The electronic device of one embodiment of the present invention may include a sensor (a sensor having a function of measuring force, displacement, position, speed, acceleration, angular velocity, rotational frequency, distance, light, liquid, magnetism, temperature, a chemical substance, sound, time, hardness, electric field, current, voltage, electric power, radiation, flow rate, humidity, gradient, oscillation, a smell, or infrared rays).
  • The electronic device of one embodiment of the present invention can have a variety of functions. For example, the electronic device can have a function of displaying a variety of data (a still image, a moving image, a text image, and the like) on the display portion, a touch panel function, a function of displaying a calendar, date, time, and the like, a function of executing a variety of software (programs), a wireless communication function, and a function of reading out a program or data stored in a recording medium.
  • An electronic device 6500 illustrated in FIG. 23A is a portable information terminal that can be used as a smartphone.
  • The electronic device 6500 includes a housing 6501, a display portion 6502, a power button 6503, buttons 6504, a speaker 6505, a microphone 6506, a camera 6507, a light source 6508, and the like. The display portion 6502 has a touch panel function.
  • The display device described in Embodiment 2 can be used in the display portion 6502.
  • FIG. 23B is a schematic cross-sectional view including an end portion of the housing 6501 on the microphone 6506 side.
  • A protection member 6510 having a light-transmitting property is provided on a display surface side of the housing 6501, and a display panel 6511, an optical member 6512, a touch sensor panel 6513, a printed circuit board 6517, a battery 6518, and the like are provided in a space surrounded by the housing 6501 and the protection member 6510.
  • The display panel 6511, the optical member 6512, and the touch sensor panel 6513 are fixed to the protection member 6510 with an adhesive layer (not illustrated).
  • Part of the display panel 6511 is folded back in a region outside the display portion 6502, and an FPC 6515 is connected to the part that is folded back. An IC 6516 is mounted on the FPC 6515. The FPC 6515 is connected to a terminal provided on the printed circuit board 6517.
  • A flexible display of one embodiment of the present invention can be used as the display panel 6511. Thus, an extremely lightweight electronic device can be provided. Since the display panel 6511 is extremely thin, the battery 6518 with high capacity can be mounted while the electronic device is kept thin. An electronic device with a narrow frame can be obtained when part of the display panel 6511 is folded back so that the portion connected to the FPC 6515 is positioned on the rear side of a pixel portion.
  • Using the display device described in Embodiment 2 as the display panel 6511 allows image capturing on the display portion 6502. For example, an image of a fingerprint is captured by the display panel 6511; thus, fingerprint authentication can be performed.
  • By further including the touch sensor panel 6513, the display portion 6502 can have a touch panel function. A variety of types such as a capacitive type, a resistive type, a surface acoustic wave type, an infrared type, an optical type, and a pressure-sensitive type can be used for the touch sensor panel 6513. Alternatively, the display panel 6511 may function as a touch sensor; in such a case, the touch sensor panel 6513 is not necessarily provided.
  • FIG. 24A illustrates an example of a television device. In a television device 7100, a display portion 7000 is incorporated in a housing 7101. Here, a structure in which the housing 7101 is supported by a stand 7103 is illustrated.
  • The display device described in Embodiment 2 can be used in the display portion 7000.
  • Operation of the television device 7100 illustrated in FIG. 24A can be performed with an operation switch provided in the housing 7101 or a separate remote controller 7111. Alternatively, the display portion 7000 may include a touch sensor, and the television device 7100 may be operated by touch on the display portion 7000 with a finger or the like. The remote controller 7111 may be provided with a display portion for displaying data output from the remote controller 7111. With operation keys or a touch panel provided in the remote controller 7111, channels and volume can be controlled and videos displayed on the display portion 7000 can be controlled.
  • Note that the television device 7100 has a structure in which a receiver, a modem, and the like are provided. A general television broadcast can be received with the receiver. When the television device is connected to a communication network with or without wires via the modem, one-way (from a transmitter to a receiver) or two-way (between a transmitter and a receiver or between receivers, for example) data communication can be performed.
  • FIG. 24B illustrates an example of a laptop personal computer. A laptop personal computer 7200 includes a housing 7211, a keyboard 7212, a pointing device 7213, an external connection port 7214, and the like. In the housing 7211, the display portion 7000 is incorporated.
  • The display device described in Embodiment 2 can be used in the display portion 7000.
  • FIG. 24C and FIG. 24D illustrate examples of digital signage.
  • Digital signage 7300 illustrated in FIG. 24C includes a housing 7301, the display portion 7000, a speaker 7303, and the like. Furthermore, the digital signage 7300 can include an LED lamp, an operation key (including a power switch or an operation switch), a connection terminal, a variety of sensors, a microphone, and the like.
  • FIG. 24D is digital signage 7400 attached to a cylindrical pillar 7401. The digital signage 7400 includes the display portion 7000 provided along a curved surface of the pillar 7401.
  • In FIG. 24C and FIG. 24D, the display device described in Embodiment 2 can be used in the display portion 7000.
  • A larger area of the display portion 7000 can increase the amount of data that can be provided at a time. The larger display portion 7000 attracts more attention, so that the effectiveness of the advertisement can be increased, for example.
  • The use of a touch panel in the display portion 7000 is preferable because in addition to display of a still image or a moving image on the display portion 7000, intuitive operation by a user is possible. Moreover, for an application for providing information such as route information or traffic information, usability can be enhanced by intuitive operation.
  • As illustrated in FIG. 24C and FIG. 24D, it is preferable that the digital signage 7300 or the digital signage 7400 can work with an information terminal 7311 or an information terminal 7411 such as a user's smartphone through wireless communication. For example, information of an advertisement displayed on the display portion 7000 can be displayed on a screen of the information terminal 7311 or the information terminal 7411. By operation of the information terminal 7311 or the information terminal 7411, display on the display portion 7000 can be switched.
  • It is possible to make the digital signage 7300 or the digital signage 7400 execute a game with the use of the screen of the information terminal 7311 or the information terminal 7411 as an operation means (controller). Thus, an unspecified number of users can join in and enjoy the game concurrently.
  • Electronic devices illustrated in FIG. 25A to FIG. 25F include a housing 9000, a display portion 9001, a speaker 9003, an operation key 9005 (including a power switch or an operation switch), a connection terminal 9006, a sensor 9007 (a sensor having a function of measuring force, displacement, position, speed, acceleration, angular velocity, rotational frequency, distance, light, liquid, magnetism, temperature, a chemical substance, sound, time, hardness, electric field, current, voltage, electric power, radiation, flow rate, humidity, gradient, oscillation, a smell, or infrared rays), a microphone 9008, and the like.
  • The electronic devices illustrated in FIG. 25A to FIG. 25F have a variety of functions. For example, the electronic devices can have a function of displaying a variety of data (a still image, a moving image, a text image, and the like) on the display portion, a touch panel function, a function of displaying a calendar, date, time, and the like, a function of controlling processing with the use of a variety of software (programs), a wireless communication function, and a function of reading out and processing a program or data stored in a recording medium. Note that the functions of the electronic devices are not limited thereto, and the electronic devices can have a variety of functions. The electronic devices may each include a plurality of display portions. The electronic devices may each include a camera or the like and have a function of taking a still image or a moving image and storing the taken image in a recording medium (an external recording medium or a recording medium incorporated in the camera), a function of displaying the taken image on the display portion, or the like.
  • The details of the electronic devices illustrated in FIG. 25A to FIG. 25F are described below.
  • FIG. 25A is a perspective view illustrating a portable information terminal 9101. For example, the portable information terminal 9101 can be used as a smartphone. Note that the portable information terminal 9101 may be provided with the speaker 9003, the connection terminal 9006, the sensor 9007, or the like. The portable information terminal 9101 can display characters or image data on its plurality of surfaces. FIG. 25A illustrates an example where three icons 9050 are displayed. Information 9051 indicated by dashed rectangles can be displayed on another surface of the display portion 9001. Examples of the information 9051 include notification of reception of an e-mail, SNS, or an incoming call, the title and sender of an e-mail, SNS, or the like, the date, the time, remaining battery, and the reception strength of an antenna. Alternatively, the icon 9050 or the like may be displayed in the position where the information 9051 is displayed.
  • FIG. 25B is a perspective view illustrating a portable information terminal 9102. The portable information terminal 9102 has a function of displaying information on three or more surfaces of the display portion 9001. Here, an example in which information 9052, information 9053, and information 9054 are displayed on different surfaces is illustrated. For example, a user can check the information 9053 displayed in a position that can be observed from above the portable information terminal 9102, with the portable information terminal 9102 put in a breast pocket of his/her clothes. The user can see the display without taking out the portable information terminal 9102 from the pocket and decide whether to answer the call, for example.
  • FIG. 25C is a perspective view illustrating a watch-type portable information terminal 9200. The display portion 9001 has a curved display surface, and display can be performed along the curved surface. Mutual communication between the portable information terminal 9200 and, for example, a headset capable of wireless communication enables hands-free calling. With the connection terminal 9006, the portable information terminal 9200 can perform mutual data transmission with another information terminal and charging. Note that the charging operation may be performed by wireless power feeding.
  • FIG. 25D to FIG. 25F are perspective views illustrating a foldable portable information terminal 9201. FIG. 25D is a perspective view of an opened state of the portable information terminal 9201, FIG. 25F is a perspective view of a folded state thereof, and FIG. 25E is a perspective view of a state in the middle of change from one of FIG. 25D and FIG. 25F to the other. The portable information terminal 9201 is highly portable in the folded state and is highly browsable in the opened state because of a seamless large display region. The display portion 9001 of the portable information terminal 9201 is supported by three housings 9000 joined by hinges 9055. For example, the display portion 9001 can be folded with a radius of curvature greater than or equal to 0.1 mm and less than or equal to 150 mm.
  • This embodiment can be combined with the other embodiments as appropriate.
  • REFERENCE NUMERALS
  • A1-A5: coordinate, B1-B5: coordinate, 10 and 20: device, 11: control portion, 12: display portion, 21, 22, and 23: detection portion, 50: arrow, 100 and 100a-100k: object

Claims (15)

1. A display device comprising a control portion, a display portion, and a detection portion,
wherein the display portion comprises a screen displaying an image,
wherein the detection portion is configured to obtain information about contact on the screen or positional information of a detection target approaching the screen in a normal direction and output the information to the control portion,
wherein the control portion is configured to execute a first processing when a first operation is performed, a second processing when a second operation is successively performed after the first operation, and a third processing when a third operation is successively performed after the second operation,
wherein the first operation is operation in which two pointed positions in contact with the screen are detected,
wherein the second operation is operation in which the two pointed positions move to reduce a distance between the two pointed positions, and
wherein the third operation is operation in which the two pointed positions move in the normal direction with respect to the screen from a state where the two pointed positions are in contact with the screen.
2. The display device according to claim 1,
wherein the first processing is processing by which a selection region in the screen is determined,
wherein the second processing is processing by which an object positioned in the selection region is selected, and
wherein the third processing is processing by which the object is displayed as if it is lifted up in the normal direction in the screen.
3. The display device according to claim 1,
wherein the control portion is further configured to execute a fourth processing when a fourth operation is performed after the third operation, and
wherein the fourth operation is operation in which the two pointed positions come in contact with the screen.
4. The display device according to claim 1,
wherein the control portion is further configured to execute a fifth processing when a fifth operation is performed after the third operation, and
wherein the fifth operation is operation in which the two pointed positions move to a height from the screen that exceeds a threshold value.
5. The display device according to claim 1,
wherein the control portion is further configured to execute a sixth processing when a sixth operation is performed after the third operation, and
wherein the sixth operation is operation in which the two pointed positions move to make a distance between the two pointed positions large in a state where a height of the two pointed positions from the screen is less than a threshold value and the two pointed positions are not in contact with the screen.
6. The display device according to claim 1,
wherein the control portion is further configured to execute a seventh processing when a seventh operation is successively performed after the third operation, and
wherein the seventh operation is operation in which the two pointed positions move in a region where the height of the two pointed positions from the screen is less than the threshold value and the two pointed positions are not in contact with the screen.
7. The display device according to claim 3,
wherein the fourth processing is processing by which selection of an object in the screen is canceled at the two pointed positions in contact with the screen.
8. The display device according to claim 4,
wherein the fifth processing is processing by which selection of an object is canceled at a two-dimensional position in the screen of a time when the height of the two pointed positions from the screen exceeds the threshold value.
9. The display device according to claim 4,
wherein the fifth processing is processing by which selection of an object is canceled at the two pointed positions in contact with the screen in the third operation.
10. The display device according to claim 5,
wherein the sixth processing is processing by which selection of an object is canceled at a two-dimensional position in the screen of a time when the two pointed positions move to make the distance between the two pointed positions large.
11. The display device according to claim 5,
wherein the sixth processing is processing by which selection of an object is canceled at the two pointed positions in contact with the screen in the third operation.
12. The display device according to claim 1,
wherein the display portion comprises a light-emitting element,
wherein the detection portion comprises a photoelectric conversion element, and
wherein the light-emitting element and the photoelectric conversion element are provided on a same plane.
13. The display device according to claim 1,
wherein the detection portion comprises a touch sensor with a capacitive type, a surface acoustic wave type, a resistive type, an ultrasonic type, an electromagnetic type, or an optical type.
14. A display module comprising the display device according to claim 1 and a connector or an integrated circuit.
15. An electronic device comprising:
the display module according to claim 14; and
at least one of an antenna, a battery, a housing, a camera, a speaker, a microphone, and an operation button.
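
As a plain-language companion to claim 1, the sketch below models the first to third operations as a small state machine over two tracked pointed positions. The class, the event representation, and the contact/height handling are invented for illustration and are not part of the claimed subject matter.

```python
from dataclasses import dataclass

@dataclass
class PointedPosition:
    x: float
    y: float
    z: float  # height above the screen in the normal direction; 0 means contact

def planar_distance(p, q):
    return ((p.x - q.x) ** 2 + (p.y - q.y) ** 2) ** 0.5

class GestureTracker:
    """Minimal sketch of claim 1: (1) two positions touch the screen,
    (2) they move to reduce their mutual distance, (3) they move away from
    the screen in the normal direction. Each transition would trigger the
    corresponding first/second/third processing in the control portion."""

    def __init__(self):
        self.state = "idle"
        self.previous_pair = None

    def update(self, p1, p2):
        both_touching = p1.z == 0 and p2.z == 0
        if self.state == "idle" and both_touching:
            self.state = "first_operation"
        elif self.state == "first_operation" and both_touching and self.previous_pair:
            if planar_distance(p1, p2) < planar_distance(*self.previous_pair):
                self.state = "second_operation"
        elif self.state == "second_operation" and p1.z > 0 and p2.z > 0:
            self.state = "third_operation"
        self.previous_pair = (p1, p2)
        return self.state

tracker = GestureTracker()
tracker.update(PointedPosition(10, 10, 0), PointedPosition(40, 10, 0))   # first_operation
tracker.update(PointedPosition(15, 10, 0), PointedPosition(35, 10, 0))   # second_operation
print(tracker.update(PointedPosition(15, 10, 3), PointedPosition(35, 10, 3)))  # third_operation
```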
US18/246,367 2020-10-02 2021-09-17 Ferroelectric device and semiconductor device Pending US20230393727A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020-167868 2020-10-02
JP2020167868 2020-10-02
PCT/IB2021/058476 WO2022069988A1 (en) 2020-10-02 2021-09-17 Display device, display module, and electronic apparatus

Publications (1)

Publication Number Publication Date
US20230393727A1 true US20230393727A1 (en) 2023-12-07

Family

ID=80951394

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/246,367 Pending US20230393727A1 (en) 2020-10-02 2021-09-17 Ferroelectric device and semiconductor device

Country Status (5)

Country Link
US (1) US20230393727A1 (en)
JP (1) JPWO2022069988A1 (en)
KR (1) KR20230075490A (en)
CN (1) CN116324678A (en)
WO (1) WO2022069988A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3876711B2 (en) * 2001-12-26 2007-02-07 ソニー株式会社 Display device and manufacturing method of display device
US9225810B2 (en) * 2012-07-03 2015-12-29 Sony Corporation Terminal device, information processing method, program, and storage medium
KR20150051278A (en) * 2013-11-01 2015-05-12 삼성전자주식회사 Object moving method and electronic device implementing the same
KR102239367B1 (en) 2013-11-27 2021-04-09 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Touch panel

Also Published As

Publication number Publication date
WO2022069988A1 (en) 2022-04-07
CN116324678A (en) 2023-06-23
JPWO2022069988A1 (en) 2022-04-07
KR20230075490A (en) 2023-05-31


Legal Events

Date Code Title Description
AS Assignment

Owner name: SEMICONDUCTOR ENERGY LABORATORY CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEINO, FUMIYASU;ISHITANI, TETSUJI;SIGNING DATES FROM 20230304 TO 20230307;REEL/FRAME:063075/0094

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION