CN115877943A - Electronic device and control method thereof - Google Patents


Info

Publication number
CN115877943A
CN115877943A
Authority
CN
China
Prior art keywords
electronic device
antenna array
unit
user
sensing unit
Prior art date
Legal status
Pending
Application number
CN202111150487.8A
Other languages
Chinese (zh)
Inventor
许广成
李朦朦
陈峰文
丁根明
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202111150487.8A
Publication of CN115877943A

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

An embodiment of the present disclosure provides an electronic device and a control method thereof. The electronic device includes a body; a display unit located in the body and adapted to display a picture toward at least one predetermined orientation; a non-visual sensing unit located at a predetermined position of the body for sensing an object in the at least one predetermined orientation and providing sensing information; and a processing unit configured to receive the sensing information and control the electronic device and/or an external device according to the sensing information. By integrating a non-visual sensing unit such as a radar into the electronic device, non-visual sensing of objects can be realized without an external radar module and without occupying excessive internal space, so that the layout of the electronic device is more compact and reasonable. The electronic device effectively avoids the risk of privacy disclosure and enables better human-computer interaction, thereby improving both the safety and the experience of a user using the electronic device.

Description

Electronic device and control method thereof
Technical Field
Embodiments of the present disclosure relate generally to electronic devices, and more particularly, to an electronic device that has a display unit and can be disposed at a predetermined position, and a control method thereof.
Background
Whether in a home or office setting, it is common today to find electronic devices that have a display unit and are placed at a predetermined location. These electronic devices include, but are not limited to, stand-alone computer displays, all-in-one machines, televisions, and display devices such as smart screens or enterprise smart screens. They provide audio-visual effects that other electronic devices cannot replace, and they occupy an indispensable place in life and work.
With the development of technology, the electronic devices with display units mentioned above usually have intelligent functions: for example, they can be controlled by voice, or they can recognize the actions and identities of users standing in front of them and display specific pictures accordingly. Recognizing user actions and identities is typically accomplished with visual sensors, which risks revealing the user's privacy.
Disclosure of Invention
To avoid privacy disclosure risks, embodiments of the present disclosure provide an electronic device employing a non-visual sensing unit for object detection and recognition and a control method thereof.
In a first aspect of the present disclosure, an electronic device is provided. The electronic device includes a body; a display unit located in the body for presenting a picture towards at least one predetermined orientation; a non-visual sensing unit located in a predetermined position of the body for sensing an object in the at least one predetermined orientation and providing sensing information; and a processing unit configured to receive the sensing information and control the electronic device and/or an external device according to the sensing information.
In the electronic device according to embodiments of the present disclosure, a non-visual sensing unit such as a radar is integrated, so non-visual sensing of objects can be realized without an external add-on module and without occupying excessive internal space, making the layout of the electronic device more compact and reasonable. In addition, because detection and identification of objects are completed by the non-visual sensing unit, the risk of privacy leakage is effectively avoided while human-computer interaction is better realized, improving both the safety and the experience of a user using the electronic device.
In one implementation, the non-visual sensing unit includes an antenna array for transmitting and receiving beams toward the at least one predetermined orientation to sense the at least one predetermined oriented object; and a sensing system processor for processing the received beam to determine the sensing information. In this way, the processing amount of the processing unit can be reduced, and the recognition rate can be improved.
In one implementation, the non-visual sensing unit includes a radar. In this way, detection and identification of objects can be achieved in a cost-effective manner.
In one implementation, the electronic device further comprises a movable member movably arranged at a side of the body facing away from the display unit. In response to a control signal of the processing unit, the movable member moves between a first exposed position, in which a first portion of the movable member protrudes from the body, and a concealed position, in which the movable member is concealed by the body. The movable member may make the arrangement of the non-visual sensing unit more flexible.
In one implementation, the antenna array of the non-visual sensing unit is located in the first portion of the movable member. Because the non-visual sensing unit is disposed on the movable member, it is not limited by dimensions of the electronic device such as the bezel; this facilitates upgrading existing electronic devices and thus reduces the cost of such upgrades.
In one implementation, the movable member is further adapted to move from the first exposed position to a second exposed position, in which both the first and second portions of the movable member protrude from the body.
In one implementation, the electronic device further includes a camera in the second portion of the movable member. By arranging the non-visual sensing unit and the camera in the first and second portions of the movable member, respectively, detection and identification of objects can be accomplished without the risk of privacy leakage, while the camera remains available when needed for tasks such as video conferencing, further improving the user experience.
In one implementation, a first circuit board is disposed in the movable member, and the camera and the non-visual sensing unit are located on the first circuit board. This makes the arrangement of electronics in the movable member more compact, which facilitates assembly and maintenance of the movable member.
In one implementation, the body comprises a plurality of bezels, and at least one part of the display unit is arranged in a space defined by the bezels; the antenna array of the non-visual sensing unit is disposed in at least one bezel or between the at least one bezel and the display unit. In this way, the non-visual sensing unit can complete detection and identification of objects without being perceived by the user, improving the user experience.
In one implementation, a flexible second circuit board is disposed in the body; the second circuit board includes a first portion extending perpendicular to the display unit and a second portion extending parallel to the display unit. The non-visual sensing unit is located on the second circuit board, with at least part of the antenna array at an end of the first portion and the sensing system processor at the second portion. This arrangement makes the non-visual sensing unit suitable for any appropriate electronic device; even an ultra-thin electronic device can have the antenna array portion of the non-visual sensing unit arranged at the bezel, improving the object detection and recognition rate and the product applicability.
In one implementation, at least the antenna array of the non-visual sensing unit is at least partially integrated in the display unit. In this way, the large size of the display unit can be exploited, and the two-dimensional antenna array is not constrained by a narrow bezel, further improving the resolution and concealment of detection.
In one implementation, the antenna array includes a two-dimensional antenna array. In this way, the detection and recognition accuracy and efficiency of the non-visual sensing unit can be further improved, thereby improving the user experience.
According to a second aspect of the present disclosure, there is provided a control method for an electronic device according to the first aspect. The method includes acquiring the sensing information from the non-visual sensing unit; processing the sensing information to determine whether an object enters a predetermined area in the at least one orientation; in response to determining that an object enters the predetermined area, determining an attribute of the object from the sensing information; and controlling the electronic device according to the determined attribute of the object. In this way, the non-visual sensing unit can be used to detect and identify objects and thus control the electronic device, improving the human-computer interaction experience while effectively avoiding disclosure of user privacy.
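The control loop of this second aspect can be sketched as follows. This is an illustrative Python sketch only; the class names, the rectangular area model, and the dictionary-based sensing information are assumptions for illustration, not part of the patent.

```python
from enum import Enum, auto

class ObjectAttribute(Enum):
    """Possible attributes of a sensed object (illustrative)."""
    USER = auto()
    OTHER = auto()

class RectArea:
    """Hypothetical predetermined area in front of the display, in metres."""
    def __init__(self, width, depth):
        self.width, self.depth = width, depth
    def __contains__(self, pos):
        x, y = pos
        return abs(x) <= self.width / 2 and 0 <= y <= self.depth

def classify(info):
    # Placeholder classifier; a real device would run this on the
    # sensing system processor or the AI computing unit.
    return ObjectAttribute.USER if info.get("is_person") else ObjectAttribute.OTHER

def control_step(info, area, actions):
    """One pass of the claimed method: detect entry, classify, then act."""
    pos = info.get("position")
    if pos is None or pos not in area:
        return None                # no object in the predetermined area
    attr = classify(info)          # determine the attribute of the object
    actions.append(attr)           # stand-in for controlling the device
    return attr
```

A caller would invoke `control_step` each time fresh sensing information arrives from the non-visual sensing unit.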
In one implementation, the method further includes, in response to the determined attribute of the object being a user, extracting biometric information of the user from the sensing information; and controlling the electronic device and/or the external device according to the extracted biometric information. By acquiring biometric information, information such as the state of the user can be learned more quickly, so that human-computer interaction can be better realized.
In one implementation, the extracted biometric information includes at least one of: respiration and heartbeat information of the user, motion trajectory information, body posture information, and gesture control information.
In one implementation, in response to the determined attribute of the object being a user, an identity of the user is further determined from the sensing information; identity information related to the determined identity is acquired; and the electronic device and/or the external device is controlled according to the acquired identity information. Targeted control related to the user's identity can further improve the user experience.
In one implementation, the acquired identity information includes personal information corresponding to the identity, stored records of use of the electronic device, health information, and gesture record information.
In one implementation, the method further comprises determining, from the sensing information, the time at which the object left the predetermined area; and controlling the electronic device to enter a standby state in response to the elapsed time exceeding a predetermined threshold. In this way, the power consumption of the electronic device can be further reduced.
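The absence-timeout logic above can be sketched minimally as follows; the class name, the boolean presence flag, and the caller-supplied clock are illustrative assumptions, not the patent's implementation.

```python
class StandbyController:
    """Tracks when the sensed object left the predetermined area and
    switches to standby once the absence exceeds a threshold."""
    def __init__(self, threshold_s):
        self.threshold_s = threshold_s
        self.left_at = None
        self.standby = False

    def update(self, object_present, now_s):
        if object_present:
            self.left_at = None        # object back in the area: reset
            self.standby = False
        elif self.left_at is None:
            self.left_at = now_s       # record the time the object left
        elif now_s - self.left_at > self.threshold_s:
            self.standby = True        # absence exceeded threshold
        return self.standby
```

Each new batch of sensing information would drive one `update` call.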
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements, in which:
Fig. 1 shows a perspective schematic view of an electronic device according to an embodiment of the disclosure;
Fig. 2 shows a schematic diagram of a circuit board of an electronic device according to an embodiment of the disclosure;
Fig. 3 shows a schematic perspective view of an electronic device with a movable member moved to a first exposed position, according to an embodiment of the present disclosure;
Fig. 4 shows a perspective schematic view of an electronic device with a movable member moved to a second exposed position, in accordance with an embodiment of the present disclosure;
Fig. 5 shows an exploded view of a movable member of an electronic device according to an embodiment of the disclosure;
Fig. 6 shows a perspective schematic view of an electronic device showing an antenna array arranged in a bezel, in accordance with an embodiment of the present disclosure;
Fig. 7 illustrates a partial side cross-sectional view of an electronic device according to an embodiment of the disclosure;
Fig. 8 shows a schematic perspective view of a second circuit board used by an electronic device for arranging an antenna array on a bezel, in accordance with an embodiment of the present disclosure;
Fig. 9 shows a perspective schematic view of an electronic device according to an embodiment of the present disclosure, wherein an antenna array is shown integrated in a display unit; and
Fig. 10 shows a flow diagram for controlling an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the term "include" and its derivatives should be interpreted as being inclusive, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions may also appear below.
The electronic device 100 of the present disclosure generally refers to an electronic device 100 that has a display unit 102 and is generally fixed at some predetermined location. These electronic devices 100 are typically not portable or intended to be carried, and are fixedly or movably disposed at a predetermined location. Viewed from the outside, such an electronic device 100 generally presents the display unit 102 and a housing accommodating the display unit 102. For example, the electronic device 100 of the present disclosure includes, but is not limited to, a stand-alone computer display, an all-in-one machine, a television, a smart screen, an enterprise smart screen, a door entry system having a display unit 102, and the like. The electronic device 100 according to embodiments of the present disclosure will be described below mainly taking a smart television as an example. It should be understood that the description applies equally to other electronic devices 100, which will not be described separately below.
With the development of science and technology, smart televisions are increasingly popular. For example, some smart televisions can recognize user actions or identities and perform predetermined responses according to what is recognized, realizing human-computer interaction. Human-computer interaction improves the degree of intelligence of the electronic device 100 and the experience of people using it. Currently, in order to recognize user actions or identities, a visual sensor such as a camera is usually provided on an electronic device 100 such as a smart television. The visual sensor is generally oriented to view one or more directions (e.g., the direction the display unit 102 faces) for detection and identification of objects. Besides object detection and recognition, the camera can of course also be used for video calls and the like.
However, the use of a visual sensor such as a camera for object detection and recognition poses a privacy-disclosure risk. This affects to some extent the user experience or even sales volume of such electronic devices 100.
To avoid the risk of privacy disclosure, there are conventional solutions that implement person detection and identification through a personal device such as a mobile phone, watch, or bracelet. In these solutions, after it is detected, through the Global Positioning System, the BeiDou satellite positioning system, Wi-Fi, Bluetooth, Near-Field Communication (NFC), or the like, that a personal device associated with the electronic device 100 is in its vicinity, or after the personal device sends a control instruction to the electronic device 100, the electronic device 100 may respond so as to complete the human-computer interaction.
However, since the personal device usually needs to be carried for detection to work, this solution does not support multi-person detection or detection of objects other than the device carrier. Furthermore, personal devices such as watches or wristbands generally have small screens and small battery capacities, and suffer detection blind spots: detection cannot be completed, for example, while the device is charging or when it is not being carried for other reasons. In addition, for solutions that rely on positioning, the positioning and recognition accuracy is often low indoors. Achieving the required indoor accuracy may require pre-deploying Ultra-Wideband (UWB) base stations and multi-port repeater (hub) networking in the indoor environment, which is often impractical for individual homes.
To avoid privacy disclosure risks, another conventional scheme uses an external radar module. Similar to a conventional external camera connected to the electronic device 100 through a Universal Serial Bus (USB) interface or Wi-Fi, the external radar module is connected to the electronic device 100 through a wired or wireless connection. However, since the external radar module is not part of the electronic device 100, when mounted on the electronic device 100 it is obtrusive in appearance and easily bumped, and the resulting changes in position and angle then impair detection.
In addition, due to differences in the size and configuration of external radar modules, their limited functions, and other factors, an electronic device 100 using an external radar module often needs a compatibility redesign, which makes such modules inconvenient to use. For ease of use, a universal standard protocol would need to be developed for electronic devices 100 and external radar modules, and no product following such a universal protocol exists at present. The external radar module also shares the common drawbacks of add-on equipment: when wired, it occupies interfaces such as the USB interface of the electronic device 100 and constrains placement; when wireless, the experience suffers from transmission delay, signal interference, and similar factors. There is therefore a need for an electronic device 100 that can employ non-visual means to detect and identify objects without risking privacy disclosure.
Embodiments of the present disclosure provide an electronic device 100 that may solve, or at least partially solve, the above and other potential problems of conventional electronic devices 100. An electronic device 100 according to an embodiment of the present disclosure will now be described with reference to the drawings. Fig. 1 illustrates a perspective view of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 mainly includes a body 101, a display unit 102, a non-visual sensing unit 103, and a processing unit 105. The body 101 is generally arranged or placed at a predetermined position. The predetermined position here refers to an object on which the body 101 can be fixedly or movably disposed, including but not limited to a table, a wall W, a ceiling, or the like. Fig. 1 shows an exemplary embodiment in which the body 101 is arranged on a wall W. The body 101 may carry a bracket, a housing, and other components required to fix the electronic device 100 at the predetermined position.
The display unit 102 is located in the body 101, for example, in a housing fixed to the body 101, and can present a picture toward at least one orientation. The orientation here may refer to a sector area within a predetermined angular range centered on the direction the display unit 102 faces (i.e., the direction perpendicular to its surface). For example, in some embodiments, the predetermined angular range may be 0° to 180°. With the development of transparent display technology, the display unit 102 may also be a transparent display unit; in that case, it may display pictures toward the two predetermined orientations it faces (forward and backward). Embodiments of the present disclosure will be described below mainly taking the display unit 102 displaying a picture toward one orientation as an example; other cases are similar and will not be repeated separately below.
The processing unit 105 may be a System on Chip (SoC). In a narrow sense, the processing unit 105 integrates the key components of an information system on a single chip; in a broad sense, it is a micro-miniature system. For example, the processing unit 105 may include a system-on-chip control logic module, a microprocessor/microcontroller (CPU) core module, a digital signal processor (DSP) module, an embedded memory module, an interface module for communicating with the outside, an analog front-end module including an ADC/DAC, a power supply and power-management module, user-defined logic, and a micro-electro-mechanical module. Further, the processing unit 105 may have basic software modules embedded therein, loadable user software, or the like.
In addition, in some embodiments, the processing unit 105 may integrate a coprocessor 1051 and an Artificial Intelligence (AI) computing unit. The coprocessor 1051 may be dedicated to exchanging data with certain components such as the non-visual sensing unit 103, managing and controlling the non-visual sensing unit 103 and thereby enabling low-power operation while the electronic device is in a sleep state. The AI computing unit may be used to process the data returned by the non-visual sensing unit 103 to complete detection and identification of objects. This improves detection and recognition efficiency and reduces the computation load on other parts of the processing unit 105, thereby improving response speed.
The non-visual sensing unit 103 refers to a unit that senses an object in at least one orientation by non-visual means and provides sensing information. The non-visual sensing unit 103 is located at a predetermined position in the body 101 and may, in some embodiments, be a radar. Radar detects objects using the relationship between a transmitted beam and its reflection, so no visual sensing device is required to sense objects. The waves used by radar may be sound waves, ultrasonic waves, electromagnetic waves (including microwaves), infrared waves, and so on; although all of these can be used in radar technology, different waves are chosen according to the detection purpose and the detected object. Microwaves are electromagnetic waves, and their use in radar is one of the important applications of microwave properties. By signal form, radar can be divided into pulse radar, continuous-wave radar, pulse compression radar, frequency-agile radar, and the like; by frequency band, into over-the-horizon radar, microwave radar, ultrasonic radar, millimeter-wave radar, laser radar, and the like. Embodiments of the present disclosure may use any suitable radar as the non-visual sensing unit 103; different radars may be selected according to cost and detection-range requirements. For example, in some embodiments, the radar may be a millimeter-wave radar or an ultrasonic radar, enabling detection and identification of objects at a lower cost.
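The basic ranging relationship underlying all of these radar types can be written as R = v·t/2: the beam travels to the object and back, so the one-way range is half the round-trip path. A minimal sketch (the constants and function name are illustrative, and free-space propagation is assumed):

```python
SPEED_OF_LIGHT_M_S = 299_792_458   # electromagnetic beam (microwave, mmWave)
SPEED_OF_SOUND_M_S = 343.0         # approx. speed of sound in air at 20 degrees C

def range_from_round_trip(delay_s, wave_speed_m_s=SPEED_OF_LIGHT_M_S):
    """Distance to the reflecting object from the round-trip delay of a pulse.

    The transmitted beam travels to the object and back, so the one-way
    range is half the total path: R = v * t / 2.
    """
    return wave_speed_m_s * delay_s / 2.0
```

The same formula explains why an ultrasonic radar, with its far slower wave speed, trades range for cost.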
In an embodiment where radar is used as the non-visual sensing unit 103, the non-visual sensing unit 103 is integrated into the electronic device 100, unlike the conventional solution using an external radar module. Fig. 2 shows a schematic diagram of a circuit board of the electronic device 100 according to an embodiment of the disclosure. As shown in fig. 2, the electronic device 100 has the processing unit 105, which controls the individual sub-devices, and the non-visual sensing unit 103 integrated on a circuit board. The circuit board shown in fig. 2 is only schematic; in practice it may comprise a plurality of sub-circuit boards, for example, a sensing-unit circuit board for the non-visual sensing unit 103 and a main circuit board 109 for components such as the processing unit 105. Likewise, the electronic components of the non-visual sensing unit 103 itself may be disposed on one circuit board or on several, as explained further later. These circuit boards are given power, control, and data connections through appropriate circuitry such as flexible circuit boards. In some embodiments, the processing unit 105 and the non-visual sensing unit 103 may be disposed on the same circuit board.
As shown in fig. 2, an audio output interface 1091 for connection to a speaker unit, an audio input interface 1092 for connection to a microphone, a display interface 1093 for connection to the display unit 102, a network interface 1094, and the like may be provided on the main circuit board 109. The audio output interface 1091 and the audio input interface 1092 may be implemented as a single interface. Of course, it should be understood that the arrangement illustrated in fig. 2 is only an example and is not intended to limit the scope of the present disclosure; any other suitable arrangement is possible. For example, in some embodiments, one or more of the interfaces described above may be disposed on circuit boards other than the main circuit board 109.
Furthermore, in some alternative embodiments, the electronic device 100 may comprise multiple parts. The parts may be physically separated but connected by stable wired or wireless connections such as Wi-Fi or Bluetooth, and the components mentioned above are distributed among them: for example, some parts include the display unit 102, and some include the non-visual sensing unit 103. There may be a plurality of parts including the non-visual sensing unit 103, fixedly disposed at multiple locations in a room to further improve the accuracy of detection and recognition. Hereinafter, the concepts of the present disclosure will be described mainly taking the example in which all components of the electronic device 100 are integrated in one physical unit; other cases are similar and will not be described separately below.
In some embodiments, the non-visual sensing unit 103 may include an antenna array 1031 to increase its sensing range, resolution, and accuracy. The antenna array 1031 is formed by feeding and spatially arranging, according to certain requirements and rules, two or more antenna elements operating at the same or different frequencies. The antenna array 1031 may be located at any suitable location of the electronic device 100 (e.g., the bezel or the movable member 104, as explained further below) for transmitting and/or receiving beams toward a predetermined orientation to sense an object in that orientation. For example, in the case of a smart television, the antenna array 1031 may transmit and/or receive beams toward the at least one orientation the display unit 102 faces in order to detect objects in that orientation, as shown in fig. 3. To complete detection and identification of objects, the antenna array 1031 of the non-visual sensing unit 103 typically must not be blocked by metal components. How to arrange the non-visual sensing unit 103, and in particular the antenna array 1031, is therefore a challenge when integrating the non-visual sensing unit 103 into the electronic device 100, as explained further later.
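An antenna array senses in a chosen orientation by applying a progressive phase shift across its elements. For a uniform linear array with element spacing d and wavelength lambda, steering the main beam to angle theta off broadside requires a per-element shift of 2*pi*d*sin(theta)/lambda. This is the standard phased-array relation, not a detail specified by the patent; the function name and parameters below are illustrative.

```python
import math

def element_phases_deg(n_elements, spacing_m, steer_angle_deg, freq_hz):
    """Per-element phase (degrees) for a uniform linear array so the
    main beam points at steer_angle_deg off broadside.

    Standard phased-array relation: dphi = 2*pi*d*sin(theta)/lambda.
    """
    wavelength = 299_792_458 / freq_hz
    dphi = 2 * math.pi * spacing_m * math.sin(math.radians(steer_angle_deg)) / wavelength
    return [math.degrees(i * dphi) % 360 for i in range(n_elements)]
```

For half-wavelength spacing, steering 30 degrees off broadside yields a 90-degree step between adjacent elements.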
In addition, in some embodiments, the non-visual sensing unit 103 may also include a transceiver and a sensing system processor 1032, or other electronics necessary to implement its functions, including but not limited to a flash memory unit, a power management unit, and a micro control unit. In some embodiments, similar to the processing unit 105, the sensing system processor 1032 may also be a system on chip integrating the required modules, and the transceiver may be integrated in the sensing system processor 1032. A transceiver generally includes a transmitter and a receiver. The transmitter is the radio device that provides the high-power radio-frequency signal to the radar; by modulation scheme, transmitters can be classified into continuous-wave and pulse transmitters. The transmitter consists of a master radio-frequency oscillator and a pulse modulator. The radio-frequency energy generated by the transmitter is fed through the feeder system to the antenna array 1031 and radiated into the air as the beam formed by the array. The main task of the receiver is to amplify the echo received by the antenna array 1031 without distortion and to suppress unwanted waves; that is, the receiver amplifies, transforms, and otherwise preprocesses the radar echo. The receiver may be a superheterodyne receiver supplemented with various anti-interference circuits. The receiver output signal may be sent to the sensing system processor 1032 for further processing to provide the sensing information.
By integrating the non-visual sensing unit 103 in the electronic device 100, on the one hand, the non-visual sensing unit 103 can be used for object detection and identification; since its output data is at most a sparse point cloud and does not include sensitive information such as the user's face or posture, the risk of privacy disclosure associated with visual sensing devices is avoided, which is more readily accepted by users and thus improves the user experience. On the other hand, a non-visual sensing unit 103 such as a radar integrated in the electronic device 100 is no longer obtrusive in appearance, thereby improving the aesthetics of the electronic device 100. Aesthetic appearance is an important factor when users select an electronic device 100. In addition, integrating the non-visual sensing unit 103 into the electronic device 100 avoids detection errors caused by factors such as accidental touching, making detection and recognition more accurate and faster. Furthermore, the connection between the non-visual sensing unit 103 and the processing unit 105 is more reliable and does not occupy an additional data interface, making it more convenient to use.
The non-visual sensing unit 103 may be arranged in any suitable location of the electronic device 100. For example, in some embodiments, the electronic device 100 may also include a movable member 104, as shown in fig. 3. The movable member 104 is a member movably disposed on the body 101 of the electronic device 100. In order not to obstruct the display unit 102, the movable member 104 may be located at a side of the body 101 facing away from the display unit 102. The movable member 104 is movable between at least one exposed position and a concealed position in response to a control signal of the processing unit 105. In the concealed position, as shown in fig. 1, the movable member 104 is retracted into the body 101 or is shielded by the body 101 at the rear side facing away from the display unit 102. Figs. 3 and 4 show schematic views of the movable member 104 moved to two exposed positions, respectively. As shown in fig. 3, in the first exposed position, the movable member 104 protrudes from the body 101 by a first height, so that a first portion of the movable member 104 protrudes from the body 101. As shown in fig. 4, in the second exposed position, the movable member 104 protrudes from the body 101 by a second height, so that both the first portion and a second portion of the movable member 104 protrude from the body 101.
The signal by which the processing unit 105 controls movement of the movable member 104 between the exposed and concealed positions may be generated under any suitable circumstances. For example, in some embodiments, a signal is generated to move the movable member 104 from the concealed position to the first exposed position in response to a user pressing a button on the electronic device 100, thereby causing the corresponding movement of the movable member 104. In some embodiments, the processing unit 105 may also generate signals to control the movable member 104 automatically or periodically, as needed and as configured. For example, the processing unit 105 may move the movable member 104 from the concealed position to the first exposed position in response to a signal from the audio input interface 1092 (e.g., a voice control signal from a user), and control the movable member 104 to return to the concealed position if the non-visual sensing unit 103 does not sense the presence of an object. It should be understood that the above-mentioned ways in which the processing unit 105 controls the movable member 104 are not exhaustive; any other suitable way is possible and will not be described separately below.
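The control decisions described above can be sketched as a small state transition. The event names, position constants, and function signature are illustrative placeholders, not terms from this disclosure:

```python
# Illustrative sketch of the position-control decisions for movable
# member 104. Event names and positions are hypothetical placeholders.
HIDDEN, FIRST_EXPOSED, SECOND_EXPOSED = 0, 1, 2


def next_position(current: int, event: str, object_sensed: bool) -> int:
    """Map a trigger event to a target position for the movable member."""
    if event in ("button_press", "voice_command"):
        return FIRST_EXPOSED          # expose only the antenna-array portion
    if event == "video_call":
        return SECOND_EXPOSED         # also expose the camera portion
    if event == "idle_check" and not object_sensed:
        return HIDDEN                 # retract when nothing is sensed
    return current                    # otherwise stay where we are
```

A voice command thus raises the member only to the first height, keeping the camera concealed unless a video call explicitly requires the second height.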
In addition, the movable member 104 is shown in fig. 3 as being located on the top bezel of the electronic device 100. It should be understood that this is illustrative only and is not intended to limit the scope of the present disclosure; any other suitable arrangement or location is possible. For example, in some embodiments, the movable member 104 may also be located on a bottom bezel or a side bezel of the electronic device 100. Further, in some alternative embodiments, instead of sliding, the movable member 104 may switch between the exposed and concealed positions by rotating about a rotation axis. The concepts of the present disclosure will be described below primarily with the example in which the movable member 104 slides out of the top bezel of the electronic device 100 between the exposed and concealed positions, as shown in figs. 3 and 4. Other arrangement positions or movement manners are similar and will not be described in detail below.
To facilitate sensing by the non-visual sensing unit 103, the antenna array 1031 of the non-visual sensing unit 103 may be arranged in the first portion of the movable member 104, while a camera 106 may be provided in the second portion. The camera 106 is a device for capturing images. Accordingly, a corresponding video input interface 1095 may be provided on the circuit board to couple with the camera 106. Since the camera 106 is located in the second portion of the movable member 104, when the user only needs object detection and recognition, the movable member 104 only needs to be raised to the first height to expose the first portion, enabling the non-visual sensing unit 103 to complete object detection and recognition. At this time, the second portion is still shielded by the body 101, and therefore there is no risk of privacy leakage. If the user requires the camera 106 to complete a task such as a video conference, the movable member 104 need only be raised to the second height by the processing unit 105 to expose the second portion of the movable member 104.
In some embodiments, the non-visual sensing unit 103 and the camera 106 may be located on one circuit board, i.e., on the first circuit board 107 of the movable member 104, as shown in fig. 5. Fig. 5 shows a simplified exploded schematic view of the movable member 104. As shown in fig. 5, the movable member 104 may include a bottom case 1041 and a panel 1042. The first circuit board 107 is located in the bottom case 1041 and covered by the panel 1042. The first circuit board 107 and the panel 1042 may be fixed in the bottom case 1041 by appropriate fasteners or fastening means. In addition to the antenna array 1031 of the aforementioned non-visual sensing unit 103, the first circuit board 107 may further carry the sensing system processor 1032 of the non-visual sensing unit 103 and the like, as well as the camera 106. That is, the components of the non-visual sensing unit 103 and the camera 106 may be located on the first circuit board 107. The panel 1042 has a transparent portion or opening 1043 that exposes the lens of the camera 106. The first circuit board 107 may be connected, for power, data, and control, by suitable circuitry to the aforementioned main circuit board 109 on which the processing unit 105 is arranged.
Furthermore, the antenna array 1031 may comprise a plurality of elements. Each element comprises at least one or more radiating portions and a feeding portion. An enclosing frame may be arranged between the elements to improve inter-element isolation. The elements of the antenna array 1031 may share a backplane, an antenna carrier board, and a cover plate serving as an outer skin. The radiating portion in each element transmits and receives beams toward a predetermined direction. In addition, in some embodiments, the sensing system processor may also perform amplitude and phase control on each element to achieve two-dimensional scanning of the radiation pattern, thereby improving the detection and identification accuracy for the object.
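The per-element amplitude and phase control mentioned above can be illustrated with a classical uniform-linear-array steering computation. This is a generic phased-array sketch under half-wavelength-spacing assumptions, not the specific algorithm of the sensing system processor 1032:

```python
# Generic phased-array steering sketch for a uniform linear array.
# Half-wavelength spacing (d/lambda = 0.5) is an illustrative assumption.
import cmath
import math


def steering_weights(n: int, theta_deg: float, d_over_lambda: float = 0.5):
    """Per-element phase weights steering the main beam to theta (degrees
    from broadside)."""
    phase = 2 * math.pi * d_over_lambda * math.sin(math.radians(theta_deg))
    return [cmath.exp(-1j * m * phase) for m in range(n)]


def array_factor(weights, theta_deg: float, d_over_lambda: float = 0.5) -> float:
    """Magnitude of the weighted array response in direction theta."""
    phase = 2 * math.pi * d_over_lambda * math.sin(math.radians(theta_deg))
    return abs(sum(w * cmath.exp(1j * m * phase) for m, w in enumerate(weights)))
```

With weights steered to 30°, the eight elements add coherently in that direction (array factor 8) and much more weakly elsewhere, which is the mechanism behind electronically scanning the radiation pattern without moving parts.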
In some embodiments, the substrate of the antenna array 1031 may be an FR-4 laminate, a metal-base laminate, or a high-frequency/high-speed substrate made of a material such as Megtron 7N (M7N) or polytetrafluoroethylene (PTFE) to improve the quality of signal transmission. In addition, the antenna array 1031 may be implemented as an on-board antenna or a waveguide antenna.
Of course, it should be understood that the examples of the movable member 104 and the first circuit board 107 shown in fig. 5 are merely illustrative and are not intended to limit the scope of the present disclosure; any other suitable arrangement or configuration is possible. For example, in some alternative embodiments, the non-visual sensing unit 103 and the camera 106 may be located on different circuit boards. In addition, in some alternative embodiments, a microphone module may also be integrated on the first circuit board 107 to form an integrated module combining multiple functions such as sound pickup, image capture, and non-visual sensing, thereby further improving the integration level of the system.
Arranging the non-visual sensing unit 103 on the movable member 104 is not constrained by dimensions of the electronic device 100 such as the bezel, and thus facilitates upgrading existing electronic devices 100 at reduced cost. Of course, it should be understood that placing the non-visual sensing unit 103 on the movable member 104 is merely illustrative and is not intended to limit the scope of the present disclosure; any other suitable arrangement and location is possible. For example, in some alternative embodiments, the antenna array 1031 of the non-visual sensing unit 103 may be located on a bezel 1011 of the body 101.
It was mentioned above that the body 101 may have a plurality of bezels 1011, such as a top bezel, a bottom bezel, and side bezels, and the display unit 102 is located in a space defined by the plurality of bezels 1011. In some embodiments, the antenna array 1031 of the non-visual sensing unit 103 may be located in at least one of the bezels, or between at least one of the bezels and the display unit 102, along the direction in which that bezel extends. Fig. 6 shows an exemplary embodiment where the antenna array 1031 is located at the top bezel of the body 101. In some embodiments, the antenna array 1031 may alternatively or additionally be located in a side bezel and/or the bottom bezel. The concepts of the present disclosure will be described below primarily with the antenna array 1031 located on the top bezel as an example. Other arrangement positions and manners are similar and will not be described in detail below.
To limit signal loss of the non-visual sensing unit 103 and the like, in the case where the antenna array 1031 is located on a bezel, the other portions of the non-visual sensing unit 103 (e.g., the sensing system processor 1032, etc.) may also preferably be located on the same circuit board as the antenna array 1031. Specifically, in some embodiments, a bendable second circuit board 108 is also provided in the body 101 of the electronic device 100. The second circuit board 108 may include a first portion extending in a direction perpendicular to the display unit 102 and a second portion extending parallel to the display unit 102, as shown in fig. 7. That is, in some embodiments, the first portion and the second portion do not lie in the same plane, and their planes of extension are perpendicular to one another. The non-visual sensing unit 103 may be located on the second circuit board 108, as shown in figs. 7 and 8. In some embodiments, the second circuit board 108 may be made bendable by using a flexible board or a rigid-flex board. In this manner, the antenna array 1031 may employ an end-fire antenna array: at least a portion of the antenna array 1031 is located at an end of the first portion of the second circuit board 108, while other portions, such as the transceiver and the sensing system processor 1032, may be located on the second portion of the second circuit board 108. With this arrangement, even if the electronic device 100 is thin, the antenna array 1031 of the non-visual sensing unit 103 can be positioned on the bezel without affecting the sensing of the non-visual sensing unit 103. This arrangement also makes the internal spatial layout of the electronic device 100 more reasonable. Of course, it should be understood that, as required by the internal layout, the first and second portions of the second circuit board 108 may also lie in the same plane, or their planes may form other non-zero angles.
For example, in some embodiments, the second circuit board 108 may also take the form of a shaped circuit board, such as an L-shaped board, in which the first and second portions lie in the same plane.
In some embodiments, the antenna array 1031 may include a plurality of elements, each having a radiating portion and a feeding portion, as shown in fig. 8, to achieve an end-fire pattern toward at least one predetermined orientation. Each element may employ a tapered-slot (Vivaldi) antenna, a broadband dipole antenna, or the like. In addition, in order to improve the performance of the antenna array 1031 and utilize the space in the bezel or between the bezel and the display unit 102, a director unit 1033 may be disposed in the radiation direction of each element to complement its radiating portion. The different radiating portions are controlled in amplitude and phase by, for example, the sensing system processor 1032, so as to realize beam-scanning control along the bezel direction. Arranging the antenna array 1031 in the bezel or between the bezel and the display unit 102 does not affect the appearance of the electronic device 100, thereby facilitating the design of the electronic device 100.
In addition to providing the antenna array 1031 between the bezel and the display unit 102 or on the movable member 104, in some alternative embodiments the antenna array 1031 of the non-visual sensing unit 103 may be integrated in the display unit 102, as shown in fig. 9. For example, the antenna array 1031 may be implemented in the form of an on-screen antenna. At a predetermined position of the display unit 102 where the antenna array 1031 needs to be arranged, the light-emitting units of the display unit 102 may be sparsely arranged, so that a metal mesh can be inserted into the gaps between the sparsely arranged light-emitting units to thereby form the antenna array 1031. The signal transmission lines between the antenna array 1031 and the sensing system processor 1032 may also be implemented on-screen or by coupled feeding. Integrating the antenna array 1031 with the display unit 102 makes maximum use of the large area of the display unit 102 and facilitates realization of a two-dimensional antenna array 1031, so that the detection accuracy and recognition rate can be further improved. In addition, this approach offers better concealment and is essentially imperceptible to the user, thereby improving the user experience.
Various embodiments of integrating the non-visual sensing unit 103 in the electronic device 100 are described above in connection with the figures. In this way, the non-visual sensing unit 103 can be used for detection and identification of objects, avoiding the risk of privacy leakage caused by the use of visual sensing devices, which significantly improves the user experience. In addition, integrating a non-visual sensing unit 103, such as a radar, into the electronic device 100 makes the electronic device 100 no longer obtrusive in appearance, thereby improving the aesthetic appearance of the electronic device 100 and, thus, the user experience.
Embodiments of the present disclosure also provide a method of controlling the electronic device 100 based on the integrated non-visual sensing unit 103. The method will be described below in conjunction with fig. 10. The method may be performed by the processing unit 105 of the electronic device 100 to enable control of the electronic device 100. In some embodiments, to further reduce power consumption, the processing unit 105 may include a co-processor 1051 to process signals associated with the non-visual sensing unit 103. In this way, low-power detection and identification of objects may be achieved while the electronic device 100 is in a sleep or standby state.
Fig. 10 shows a flowchart of the control method. As shown in fig. 10, at block 610, the processing unit 105 first acquires sensing information from the non-visual sensing unit 103. As mentioned above, the sensing information is provided by the sensing system processor 1032 of the non-visual sensing unit 103. At block 620, the processing unit 105 processes the received sensing information to determine whether an object has entered a predetermined area in at least one predetermined orientation. The predetermined area may be the entire area in a predetermined orientation of the electronic device 100 or a designated local area. In some embodiments, the acts in blocks 610 and 620 may be performed at intervals of a predetermined period while the electronic device 100 is in a standby state. In some alternative embodiments, the acts in blocks 610 and 620 may also be performed in response to receiving a signal from another interface, such as the audio input interface 1092. For example, if, during the predetermined period in the standby state (before the next scheduled detection), an input signal is received from the audio input interface 1092 indicating that an object may be near the electronic device 100, the sensing information is acquired at that time and analyzed to determine whether an object is present in the predetermined area. In this way, object detection becomes more sensitive and faster, thereby improving the user experience.
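The polling behavior of blocks 610 and 620 — periodic checks while in standby, plus an immediate check when an audio trigger arrives — can be sketched as follows; the function and parameter names are illustrative assumptions, not part of this disclosure:

```python
# Illustrative sketch of the standby polling decision for blocks 610/620.
def should_poll(now: float, last_poll: float, period: float,
                audio_triggered: bool) -> bool:
    """Poll the non-visual sensing unit if the polling period has elapsed,
    or immediately if the audio input interface reported activity."""
    return audio_triggered or (now - last_poll) >= period
```

The audio path short-circuits the timer, which is what makes detection "more sensitive and faster" than purely periodic polling.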
If it is determined from the sensing information that an object has entered the predetermined area, then at block 630 an attribute of the object is determined from the sensing information by an algorithm. Determining the attribute of the object may be achieved by any suitable algorithm. For example, the processing unit 105 may determine from the sensing information whether the object is a swaying curtain, a fan, a pet, a user, or the like. Next, at block 640, the processing unit 105 controls the electronic device 100 according to the determined attribute of the object. For example, if the algorithm determines that the object is merely a fluttering curtain, a turning fan, or a moving pet, no action is taken and the action indicated by block 610 continues. Alternatively or additionally, the processing unit 105 may also cause the electronic device 100 to control an external device according to the determined attribute of the object. The external device here refers to another electronic device connected to the electronic device 100 by wire or wirelessly, such as via Bluetooth, NFC, or Wi-Fi.
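The attribute-based dispatch of blocks 630 and 640 can be sketched as a simple mapping; the attribute labels and action names below are illustrative placeholders, not terms from this disclosure:

```python
# Illustrative dispatch on the classified attribute of a detected object.
# Labels and actions are hypothetical placeholders.
def handle_detection(attribute: str) -> str:
    """Choose an action for block 640 from the attribute found in block 630."""
    if attribute in ("curtain", "fan", "pet"):
        return "ignore"               # keep polling as in block 610
    if attribute == "user":
        return "wake_and_identify"    # proceed to identity/biometric steps
    return "ignore"                   # unknown objects are treated as benign
```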
For example, in some embodiments, the electronic device 100 may be connected to another electronic device such as an air conditioner via a wireless connection. If it is determined from the attribute of the object that a person is at a certain position, the processing unit 105 may power on the electronic device 100 and control the air conditioner to blow air toward that position, and the air outlet direction of the air conditioner can be adjusted as the user moves, so as to improve the user experience. Of course, it should be understood that the foregoing embodiment of controlling an air conditioner by the electronic device is only illustrative and is not intended to limit the scope of the present disclosure. Any other suitable external device may be controlled by the electronic device 100 to thereby further enhance the user experience.
For example, in some embodiments, the processing unit 105 may also control a speaker unit according to the sensing information sent by the non-visual sensing unit 103. For example, if the processing unit 105 determines from the sensing information that user A has moved from a first position to a second position, the speaker unit may be controlled to gradually shift the audio output direction from the first position to the second position, so that the sound follows the user's movement, further improving the user experience.
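The gradual shift of the audio output direction can be sketched as a stepwise interpolation toward the user's sensed position. The one-dimensional position model and step size are illustrative simplifications, not the actual speaker-steering mechanism:

```python
# Illustrative sketch: step the audio output direction toward the user's
# sensed position, rather than jumping there at once.
def pan_toward(current: float, target: float, step: float) -> float:
    """Move `current` toward `target` by at most `step` per update."""
    if abs(target - current) <= step:
        return target
    return current + step if target > current else current - step
```

Calling this once per sensing update yields the "gradual" behavior: the output direction converges on the user's new position over several updates instead of snapping abruptly.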
In some embodiments, if it is determined from the sensing information that the attribute of the object is a user, biometric information of the user may further be extracted from the sensing information. Biometric information in this context refers to information related to a person's biological characteristics, which may include, but is not limited to, the user's breathing and heartbeat information, motion trajectory information, body posture information, mobile phone control information, and the like. For example, if the sensing information indicates a sudden change in the user's body posture, for example, the user falls suddenly, the processing unit 105 may issue a relevant alarm or reminder according to the biometric information to obtain the user's response. If the user's response indicates that emergency assistance is required, the processing unit 105 may assist the user by calling other personnel, raising an alarm, or the like. In this way, the user's experience and safety in using the electronic device 100 can be further improved.
It should be understood that the above-mentioned embodiment concerning biometric information indicating a sudden fall of the user is only illustrative and does not limit the scope of the present disclosure. Any other suitable biometric information is also possible. For example, in some embodiments, the biometric information may include breathing and heartbeat information. After the attribute of the object is determined to be a user, the user's breathing and heartbeat information can be detected, processed, and recorded over a period of time, so as to identify the user's current health characteristics, such as breathing and heartbeat rates. In addition, the processing unit 105 may further determine whether the breathing and heartbeat are regular, stable, or abnormal, and perform operations such as reporting, reminding, and issuing corresponding warnings according to the user's settings.
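A minimal sketch of the regularity check described above, assuming the breathing or heart rate has already been extracted from the radar echo as a series of per-interval values; the thresholds are illustrative placeholders, not clinical values from this disclosure:

```python
# Illustrative regularity check on an extracted rate series (e.g. bpm).
# Thresholds are hypothetical, not clinical or disclosed values.
def is_rate_abnormal(samples, low: float, high: float,
                     spread_limit: float = 20.0) -> bool:
    """Flag a series whose mean lies outside [low, high], or whose
    spread (max - min) suggests irregularity."""
    mean = sum(samples) / len(samples)
    spread = max(samples) - min(samples)
    return not (low <= mean <= high) or spread > spread_limit
```

A flagged series would then drive the reporting or warning operations mentioned above, subject to the user's settings.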
In addition, the biometric information may also include body state information and the like. For example, in some embodiments, the processing unit 105 may issue a sedentary reminder to user A, or perform sleep detection and reminding, according to data related to the body state information. When the user plays a game using the electronic device 100 or controls the electronic device 100, the processing unit 105 may analyze the user's gestures according to the sensing information transmitted by the non-visual sensing unit 103 and control the electronic device 100 or the game according to the function the user has assigned to each gesture.
In some embodiments, alternatively or additionally, if it is determined from the sensed information that the property of the object is a user, the identity of the user may also be confirmed from the sensed information. Identity information relating to the identity may then be obtained from the identity of the user. The identity information may be stored in a memory of the electronic device 100, or may be stored in a database on the network side, for example, by obtaining identity information related to the identity in a network-connected manner. Such identity information may include, but is not limited to, personal information corresponding to the identity of the user, recorded information using the electronic device 100, health information, gesture recorded information, and the like. In this way, more targeted human-computer interaction can be achieved, thereby further improving user experience.
For example, suppose it is detected that the object entering the predetermined area is a user and the identity of the user is determined to be user A. The processing unit 105 then retrieves the relevant information of user A from the memory. For example, assume that user A last used the electronic device 100 to watch movie F up to the 20-minute mark. Based on the acquired information, the processing unit 105 may display the picture of movie F on the display unit 102 and set the progress bar at the 20-minute position, making it convenient for the user to continue watching and thereby improving the user experience. In addition, the processing unit 105 may also display information related to user A at a suitable position on the display unit 102 to realize information push and interaction.
In some embodiments, if it is determined from the sensing information that the user is not a registered user but a new user, the processing unit 105 may control the electronic device 100 to display information such as a registration page, or control the electronic device 100 to autonomously establish a user profile, learn the user's usage habits, viewing progress, and other relevant information, and record them in the memory for subsequent use.
In some embodiments, if the user leaves the predetermined area, or leaves the predetermined orientation of the display screen of the display unit 102, without turning off the electronic device 100, the processing unit 105 determines from the sensing information the time for which the person has been away from the predetermined area. This may be achieved by the processing unit 105 starting a timer when the sensing information provided by the non-visual sensing unit 103 indicates that the user of the relevant identity is no longer in the predetermined area. When the time exceeds a predetermined threshold, the electronic device 100 may be controlled to enter a standby state. The predetermined time threshold may be related to a user setting. For example, if user A has set the electronic device 100 to enter the standby state 5 minutes after the user leaves the predetermined area, the electronic device 100 enters the standby state once the processing unit 105 determines from the sensing information that user A has been away from the predetermined area for more than 5 minutes. This control approach is more conducive to improving the user experience.
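The absence timer described above can be sketched as follows; the class and method names are illustrative assumptions, not terms from this disclosure:

```python
# Illustrative absence timer: signal standby once the identified user has
# been away from the predetermined area longer than the configured threshold.
class StandbyTimer:
    def __init__(self, threshold_s: float):
        self.threshold_s = threshold_s   # e.g. 300 s for a 5-minute setting
        self.absent_since = None         # None while the user is present

    def update(self, user_present: bool, now: float) -> bool:
        """Feed one sensing result; return True when standby should begin."""
        if user_present:
            self.absent_since = None     # presence resets the timer
            return False
        if self.absent_since is None:
            self.absent_since = now      # start timing the absence
        return (now - self.absent_since) >= self.threshold_s
```

Each new sensing result feeds `update`; any reappearance of the user resets the timer, matching the behavior where returning within the threshold keeps the device active.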
Furthermore, it should be understood that the above-mentioned ways in which the processing unit 105 controls the electronic device 100 according to the sensing information are not exhaustive, and any other suitable way may exist to achieve richer and more engaging human-computer interaction. In addition, the processing unit 105 can provide real-time information to guardians and the like according to the sensing information, so as to help ensure the safety and health of family members.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (12)

1. An electronic device, comprising:
a body (101);
a display unit (102) located in the body (101) for presenting a picture towards at least one predetermined orientation;
a non-visual sensing unit (103) located in a predetermined position of the body (101) for sensing an object in the at least one predetermined orientation and providing sensing information; and
a processing unit (105) configured to receive the sensing information and to control the electronic device and/or an external device in accordance with the sensing information.
2. The electronic device of claim 1, wherein the non-visual sensing unit (103) comprises:
an antenna array (1031) for transmitting and receiving beams towards the at least one predetermined orientation to sense objects in the at least one predetermined orientation; and
a sensing system processor (1032) for processing the received beam to determine the sensing information.
3. The electronic device of claim 1, wherein the non-visual sensing unit (103) comprises a radar.
4. The electronic device of claim 2 or 3, further comprising:
a movable member (104) movably arranged at a side of the body (101) facing away from the display unit (102) for moving, in response to a control signal of the processing unit (105), between a first exposed position, in which a first portion of the movable member (104) protrudes from the body (101), and a concealed position, in which the movable member (104) is concealed by the body (101).
5. The electronic device of claim 4, wherein the antenna array (1031) of the non-visual sensing unit (103) is located in the first portion of the movable component (104).
6. The electronic device of claim 4, wherein the movable part (104) is further adapted to move from the first exposed position to a second exposed position in which the first and second portions of the movable part (104) protrude from the body (101).
7. The electronic device of claim 6, further comprising:
a camera (106) located in the second portion of the movable member (104).
8. The electronic device of claim 7, the movable component (104) having a first circuit board (107) disposed therein, the camera (106) and the non-visual sensing unit (103) being located on the first circuit board (107).
9. The electronic device of claim 2 or 3, wherein the body (101) comprises a plurality of rims (1011), at least a portion of the display unit (102) being disposed in a space defined by the plurality of rims (1011);
the antenna array (1031) of the non-visual sensing unit (103) is disposed in the at least one bezel or between the at least one bezel and the display unit (102).
10. The electronic device according to claim 9, wherein a bendable second circuit board (1032) is provided in the body (101), the second circuit board (1032) comprising a first portion extending perpendicular to the display unit (102) and a second portion extending parallel to the display unit (102),
the non-visual sensing unit (103) is located on the second circuit board (1032), at least part of the antenna array (1031) is located at an end of the first portion, and the sensing system processor (1032) is located at the second portion.
11. The electronic device according to claim 2, wherein the antenna array (1031) of the non-visual sensing unit (103) is at least partially integrated in the display unit (102).
12. The electronic device of claim 11, wherein the antenna array (1031) comprises a two-dimensional antenna array.
CN202111150487.8A 2021-09-29 2021-09-29 Electronic device and control method thereof Pending CN115877943A (en)

Publications (1)

Publication Number Publication Date
CN115877943A true CN115877943A (en) 2023-03-31

Family

ID=85756051

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202111150487.8A (Pending, published as CN115877943A) | Electronic device and control method thereof | 2021-09-29 | 2021-09-29

Country Status (1)

Country | Link
CN (1) | CN115877943A (en)

Similar Documents

Publication Publication Date Title
US11039048B2 (en) Doorbell camera
US11212427B2 (en) Doorbell camera
US10347985B2 (en) Antenna device and electronic device including the same
CN108700645B (en) Systems, methods, and devices for utilizing radar with smart devices
KR102481939B1 (en) Wireless power transmitting device, electronic device for wirelessly receiving power and operation method thereof
KR102460543B1 (en) Electronic device including electronic pen and method for recognizing insertion of the electronic pen therein
KR102359786B1 (en) Antenna and electronic device comprising thereof
KR20160120495A (en) Noise shielding device with heat sink structure and electronic device having it
US11671683B2 (en) Doorbell camera
WO2021013136A1 (en) Apparatus control method, and electronic apparatus
US11632521B2 (en) Audio/video electronic device
US20220113395A1 (en) Method for providing service related to electronic device by forming zone, and device therefor
CN109029252B (en) Object detection method, object detection device, storage medium, and electronic apparatus
US20230319724A1 (en) Electronic device and operation method therefor
CN115877943A (en) Electronic device and control method thereof
KR20210090520A (en) Antenna and electronic device having the same
KR20220063862A (en) An electronic apparatus and a method of operating the electronic apparatus
US20230171493A1 (en) Electronic device for autofocusing and method of operating the same
EP4395350A1 (en) Electronic device for autofocusing and method for operating same
US11997370B2 (en) Doorbell camera
US20240148330A1 (en) Electronic device for measuring biometric signal and method of operating the same
US20230232106A1 (en) Image capturing method using wireless communication and electronic device supporting same
US20220391022A1 (en) System and method for human-device radar-enabled interface
US20230280874A1 (en) Electronic device comprising flexible display, and method for operating same
KR20240072945A (en) Water supply device and providing method water supply data to the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination