CN114764272A - Terminal, terminal control method, terminal control device, and storage medium - Google Patents


Info

Publication number
CN114764272A
CN114764272A (application number CN202110001256.4A)
Authority
CN
China
Prior art keywords
image acquisition
terminal
acquisition component
component
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110001256.4A
Other languages
Chinese (zh)
Inventor
陈沭
豆子飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110001256.4A priority Critical patent/CN114764272A/en
Publication of CN114764272A publication Critical patent/CN114764272A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0022Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • G06K17/0025Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations arrangements or provisions for transferring data to distant stations, e.g. from a sensing device the arrangement consisting of a wireless interrogation device in combination with a device for optically marking the record carrier

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a terminal, a terminal control method, a terminal control device, and a storage medium. The terminal includes a first image acquisition component, a second image acquisition component, and a first processing component in a running state, where the first processing component has a first interface and a second interface. The first processing component is configured to control at least one of the first image acquisition component and the second image acquisition component to be in an on state. A terminal control method, applied to the terminal, includes: controlling the first image acquisition component and/or the second image acquisition component to be in an on state; and if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing, triggering the terminal to process the target event in response to that image. With the terminal and the terminal control method provided by the disclosure, the terminal can actively sense the user's needs from the acquired images and respond quickly, improving the user experience.

Description

Terminal, terminal control method, terminal control device, and storage medium
Technical Field
The present disclosure relates to the field of terminal processing technologies, and in particular, to a terminal, a terminal control method, a terminal control apparatus, and a storage medium.
Background
With the development of terminal technology, terminals have become increasingly capable of responding to user operations. However, before a terminal can respond, the user must actively perform the corresponding touch operation on it. In practice, this makes operating the terminal cumbersome and degrades the user experience.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a terminal, a terminal control method, a terminal control apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, a terminal is provided. The terminal includes a first image acquisition component, a second image acquisition component, and a first processing component in a running state, where the first processing component has a first interface and a second interface, the first interface is connected to the first image acquisition component, and the second interface is connected to the second image acquisition component. The first processing component is configured to control at least one of the first image acquisition component and the second image acquisition component to be in an on state.
In an embodiment, the first processing component is an application-specific integrated circuit (ASIC).
In another embodiment, the operating power consumption of the first processing component is less than or equal to a first power consumption threshold.
In another embodiment, the terminal further includes a second processing component connected to the first processing component, where the second processing component is triggered to operate by the first processing component when the first image acquisition component and/or the second image acquisition component acquires an image.
According to a second aspect of the embodiments of the present disclosure, there is provided a terminal control method applied to a terminal, where the terminal includes a first image acquisition component, a second image acquisition component, and a first processing component in a running state, the first processing component has a first interface and a second interface, the first interface is connected to the first image acquisition component, and the second interface is connected to the second image acquisition component. The terminal control method includes: controlling, by the first processing component, at least one of the first image acquisition component and the second image acquisition component to be in an on state; and if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to process a target event, triggering processing of the target event.
In one embodiment, triggering processing of the target event includes: triggering the terminal to run a first application, where the first application is configured to respond to the target image acquired by the first image acquisition component or the second image acquisition component and execute the target event corresponding to the target image.
In another embodiment, the terminal control method further includes: determining the working mode of the terminal, where the working mode includes a screen-off mode, a privacy mode, or a browsing mode; and, according to the working mode, keeping the first image acquisition component and/or the second image acquisition component in an on state.
In another embodiment, if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing, triggering processing of the target event includes: if the first image acquisition component and/or the second image acquisition component acquires a two-dimensional code image, triggering the terminal to recognize the two-dimensional code; or, if the first image acquisition component and/or the second image acquisition component acquires an image containing a specified gesture, triggering the terminal to recognize the specified gesture and execute the function operation corresponding to it; or, if the first image acquisition component and/or the second image acquisition component acquires an image containing at least one face, triggering the terminal to identify the face direction in the target image or detect the number of faces in it.
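The three-way dispatch in this embodiment can be sketched as follows. This is a hypothetical Python illustration, not part of the disclosure: the detection labels and handler names are invented for clarity.

```python
# Hypothetical sketch of the trigger dispatch above: map what the always-on
# cameras detected in a frame to the target event the terminal processes.
# Labels and handler names are illustrative, not taken from the disclosure.

def dispatch_target_event(detection: str) -> str:
    """Return the target event for a detected image category."""
    handlers = {
        "two_dimensional_code": "recognize_two_dimensional_code",
        "specified_gesture": "recognize_gesture_and_run_mapped_function",
        "face": "identify_face_direction_or_count_faces",
    }
    # A frame that matches no trigger category causes no response.
    return handlers.get(detection, "no_event")
```

Either camera's detection can feed such a dispatch; when both cameras detect a target image, the detailed description notes that priority-based or parallel handling depends on the terminal's settings.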
According to a third aspect of the embodiments of the present disclosure, there is provided a terminal control device applied to a terminal, where the terminal includes a first image acquisition component, a second image acquisition component, and a first processing component in a running state, the first processing component has a first interface and a second interface, the first interface is connected to the first image acquisition component, and the second interface is connected to the second image acquisition component. The terminal control device includes: a control unit, configured to control, through the first processing component, at least one of the first image acquisition component and the second image acquisition component to be in an on state; and a triggering unit, configured to trigger processing of a target event if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to process the target event.
In an embodiment, the triggering unit triggers processing of the target event by triggering the terminal to call and run a first application, where the first application is configured to respond to the target image acquired by the first image acquisition component or the second image acquisition component and execute the target event corresponding to the target image.
In another embodiment, the terminal control device further includes a determining unit, configured to determine the working mode of the terminal, where the working mode includes a screen-off mode, a privacy mode, or a browsing mode. The control unit is further configured to keep the first image acquisition component and/or the second image acquisition component in an on state according to the working mode.
In another embodiment, the triggering unit triggers processing of the target event, according to the target image acquired by the first image acquisition component and/or the second image acquisition component for triggering the terminal to perform target event processing, as follows: if the first image acquisition component and/or the second image acquisition component acquires a two-dimensional code image, the terminal is triggered to recognize the two-dimensional code; or, if the first image acquisition component and/or the second image acquisition component acquires an image containing a specified gesture, the terminal is triggered to recognize the specified gesture and execute the function operation corresponding to it; or, if the first image acquisition component and/or the second image acquisition component acquires an image containing at least one face, the terminal is triggered to identify the face direction in the target image or detect the number of faces in it.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a terminal control device, including: a memory configured to store instructions; and a processor configured to call the instructions stored in the memory to execute any one of the terminal control methods described above.
According to a fifth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored therein instructions which, when executed by a processor, perform any one of the terminal control methods described above.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: with the terminal provided by the disclosure, the first image acquisition component and the second image acquisition component can remain in an on state at all times based on the first processing component in a running state, so that the terminal, while in use, can respond according to the acquired images. The terminal can therefore actively execute the corresponding function without excessive user operation, making it more convenient and quicker to use and improving the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of a terminal structure according to an exemplary embodiment.
Fig. 2 is a schematic diagram illustrating a configuration of an image acquisition component according to an exemplary embodiment.
Fig. 3 is a schematic diagram illustrating another terminal structure according to an example embodiment.
Fig. 4 is a schematic diagram illustrating a further terminal structure according to an example embodiment.
Fig. 5 is a flowchart illustrating a terminal control method according to an exemplary embodiment.
Fig. 6 is a flow chart illustrating another terminal control method according to an example embodiment.
Fig. 7 is a diagram illustrating a terminal control interaction, according to an example embodiment.
Fig. 8 is a diagram illustrating a control flow of a terminal according to an example embodiment.
Fig. 9 is a schematic diagram illustrating another terminal control flow according to an example embodiment.
Fig. 10 is a block diagram illustrating a terminal control apparatus according to an example embodiment.
Fig. 11 is a block diagram illustrating another terminal control device according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the related art, when executing a target function, a terminal responds according to the user's selection, based on the user's active control. The terminal is thus always in a state of passively receiving instructions.
In view of this, the present disclosure provides a terminal that, through a first processing component running inside it, can keep a first image acquisition component and a second image acquisition component in an on state over long periods, and can promptly trigger the corresponding response through image acquisition, so that the terminal actively senses the user's needs. User operations are thereby reduced and the user experience is improved.
In one example, the category of terminals may include mobile terminals, such as: mobile phones, tablets, notebooks, etc. In another example, the structure of the terminal may include: a dual-screen terminal, a folding screen terminal, a full-screen terminal, etc.
Fig. 1 is a schematic diagram illustrating a terminal structure according to an exemplary embodiment. As shown in fig. 1, the terminal 100 includes: a first image acquisition component 101, a second image acquisition component 102 and a first processing component 103.
The first processing component 103 has a first interface and a second interface; the first interface is connected to the first image acquisition component 101, and the second interface is connected to the second image acquisition component 102. The first processing component 103 is configured to control at least one of the first image acquisition component 101 and the second image acquisition component 102 to be in an on state.
In the embodiment of the present disclosure, the first processing component 103 is a processing component that remains in a running state and has processing capability. It includes a first interface connected to the first image acquisition component 101 and a second interface connected to the second image acquisition component 102. Through these connections, the first processing component 103 can control at least one of the two image acquisition components to be in an on state. When a trigger event occurs, the function to be triggered in response can then be determined from the images acquired by the first image acquisition component 101 and/or the second image acquisition component 102, so that the terminal 100 responds quickly, actively senses the operation the user needs, provides convenience to the user, and improves the user experience. The first image acquisition component 101 or the second image acquisition component 102 may include a camera; under the control of the first processing component 103, an AON (always-on) camera is obtained.
In an example, the internal configuration of the first image acquisition component or the second image acquisition component may be as shown in fig. 2. Fig. 2 is a schematic diagram illustrating a configuration of an image acquisition component according to an exemplary embodiment. The first image acquisition component or the second image acquisition component includes a sensor. Using the photoelectric conversion function of a photoelectric device, the sensor converts the light image on its light-sensing surface into an electrical signal proportional to that image; that is, light reflected from the object is captured by the sensor, producing the acquired image. The sensor's pixel size, motion detection, analog-to-digital converter (ADC), and exposure functions determine the captured image. For example, an 8-bit ADC, i.e., a monolithic integrated A/D converter with 8-bit binary output, may be used; its conversion time can be within 50 nanoseconds (ns), completing the analog-to-digital conversion quickly so that images are acquired in time. The pixel size may be limited to 16 x 12 pixels. The sensor transmits the acquired image to the first processing component 103 through a general-purpose I/O (GPIO) pin or the I2C control interface, and the first processing component 103 determines whether the acquired image corresponds to a trigger event, and thus whether to raise an interrupt to perform the corresponding response. In one example, the first processing component 103 may be an image signal processor (ISP), and the first image acquisition component 101 and the second image acquisition component 102 may be connected to the first processing component 103 as shown in fig. 3. Fig. 3 is a schematic diagram illustrating another terminal structure according to an example embodiment.
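As a rough illustration of the low-power path just described, the following Python sketch models a 16 x 12 frame being screened inside the first processing component before any interrupt is raised. The brightness-threshold classifier is a stand-in assumption; a real ASIC would run motion or pattern detection.

```python
# Minimal sketch, under assumed numbers from the passage above: the sensor
# delivers tiny 16x12-pixel frames, and the first processing component only
# raises an interrupt when a frame looks like a trigger image. The mean-
# brightness test below is a hypothetical stand-in for the real classifier.

FRAME_W, FRAME_H = 16, 12  # reduced always-on resolution quoted in the text

def frame_matches_trigger(frame: list[list[int]], threshold: int = 128) -> bool:
    """Crude stand-in for the first processing component's classifier:
    treat a frame as a candidate trigger image if its mean brightness
    clears a threshold."""
    assert len(frame) == FRAME_H and all(len(row) == FRAME_W for row in frame)
    mean = sum(sum(row) for row in frame) / (FRAME_W * FRAME_H)
    return mean >= threshold

def maybe_interrupt(frame) -> bool:
    # Only a matching frame wakes the rest of the system; everything else
    # is discarded inside the low-power component to save energy.
    return frame_matches_trigger(frame)
```

The design point is that frames are small and screened locally, so the always-on path costs little power and only genuine trigger images propagate further.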
In an embodiment, the first processing component 103 may be an application-specific integrated circuit (ASIC). When an ASIC is used as the first processing component 103, the chip can be designed to the user's requirements at manufacturing time, so that the first processing component 103 can execute multiple functions; the resulting integrated circuit can process images while determining whether a trigger event is currently occurring, in order to control the on state of the first image acquisition component 101 and the second image acquisition component 102. Compared with a general-purpose integrated chip, an ASIC is smaller, consumes less power, and offers higher reliability, performance, and confidentiality at lower cost, so it can be placed in a terminal without encroaching on the space occupied by other components.
In another embodiment, the user's usage needs may arise at any time, and therefore the first processing component 103 needs to be kept in a running state over long periods, so that when the user needs image acquisition, the on states of the first image acquisition component 101 and the second image acquisition component 102 can be controlled in time. To avoid affecting the operation of other components, a processing component whose operating power consumption is less than or equal to a first power consumption threshold may be selected as the first processing component 103, to ensure that it can be used normally. For example, a processing component that operates at 10 watts (W) or less may be employed as the first processing component 103.
In another embodiment, the first processing component also has a bypass mode, in which the mode of the first image acquisition component or of the second image acquisition component can be switched automatically, ensuring that the normal shooting function remains available while the first processing component acquires external images.
Fig. 4 is a schematic diagram illustrating a structure of another terminal according to an exemplary embodiment. As shown in fig. 4, the terminal 100 further includes: a second processing component 104.
The second processing component 104 is connected to the first processing component 103 and is triggered to operate by it when the first image acquisition component 101 and/or the second image acquisition component 102 acquires an image.
In the disclosed embodiment, the terminal 100 further includes a second processing component 104 connected to the first processing component 103. The second processing component 104 may include an application processor (AP). When the first processing component 103 determines, from the images acquired by the first image acquisition component 101 and/or the second image acquisition component 102, that a trigger event has occurred, it sends the trigger event information to the second processing component 104, which then calls and starts the corresponding application, enabling the terminal to perceive autonomously.
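The division of labor between the two processing components can be sketched as below. All class and application names are hypothetical; the point illustrated is that only trigger events cross from the always-on component to the application processor.

```python
# Illustrative sketch of the two-stage control path described above: the
# always-on first processing component screens frames, and only on a trigger
# does it forward the event to the application processor (AP), which then
# calls and starts the corresponding application. Names are hypothetical.

class ApplicationProcessor:
    def __init__(self):
        self.launched = []

    def on_trigger(self, event):
        # The AP maps the trigger event to an application and starts it.
        app = {"two_dimensional_code": "scanner_app",
               "specified_gesture": "gesture_service"}.get(event)
        if app:
            self.launched.append(app)

class FirstProcessingComponent:
    def __init__(self, ap):
        self.ap = ap

    def handle_frame(self, detected=None):
        # Frames with no trigger content never leave the low-power component.
        if detected is not None:
            self.ap.on_trigger(detected)
```

Keeping the AP idle until a trigger arrives is what lets the always-on cameras run without the power cost of the main processor.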
In one implementation scenario, the trigger event may include any one or more of: using the terminal while lying on one's side, performing a gesture operation, detecting that the number of faces exceeds or falls below a specified threshold, waking up the screen, unlocking the screen, and detecting a two-dimensional code.
Based on the same inventive concept, the disclosure also provides a terminal control method. The terminal control method can be applied to any one of the terminals in the above embodiments.
Fig. 5 is a flowchart illustrating a terminal control method according to an exemplary embodiment, and as shown in fig. 5, the terminal control method includes the following steps S11 through S12.
In step S11, at least one of the first image acquisition component and the second image acquisition component is controlled by the first processing component to be in an on state.
In the embodiment of the disclosure, in order to discover the user's usage intention in time, the first processing component in the terminal controls at least one of the first image acquisition component and the second image acquisition component to be in an on state, so that whether to trigger a corresponding response by the terminal can be determined from the images acquired by whichever components are on.
In step S12, if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing, processing of the target event is triggered.
In the embodiment of the disclosure, when the first image acquisition component or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing, the terminal triggers and processes the target event corresponding to that image, thereby controlling the execution of the corresponding function. In one implementation scenario, if a target image of a user watching the screen is acquired and corresponds to the target event of waking up the screen, the terminal responds according to that event, triggers its processing, and executes the corresponding function. For example, triggering processing of the wake-screen target event may include triggering the terminal to automatically light up the screen.
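The wake-screen example above can be reduced to a small, hedged sketch; the gaze label is assumed to come from some face-direction classifier that the disclosure does not specify.

```python
# Hypothetical sketch of the wake-on-gaze example: if the captured target
# image is classified as "user watching the screen", the terminal lights
# the screen. The label and classifier are assumptions for illustration.

def handle_target_image(label, screen_on):
    """Return the new screen state after processing one target image."""
    if label == "user_watching_screen" and not screen_on:
        return True  # trigger the wake-screen target event: light the screen
    return screen_on  # any other image leaves the screen state unchanged
```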
If both the first image acquisition component and the second image acquisition component detect a target image for triggering the terminal to perform target event processing, the terminal may be triggered to respond based on the priority of the triggered target events, or the two may be processed in parallel, depending on the terminal's actual settings.
Through this embodiment, whether an image is a target image for triggering the terminal to perform target event processing can be determined from the images acquired by the first image acquisition component and/or the second image acquisition component in the on state. When triggered, the terminal can then respond according to the triggered target event and autonomously execute the operation the user needs, reducing user interaction and improving the user experience.
In an implementation scenario, if the first image acquisition component and/or the second image acquisition component acquires a two-dimensional code image, the terminal is triggered to recognize the two-dimensional code. Or, if the target image acquired by the first image acquisition component and/or the second image acquisition component is an image containing a specified gesture, the terminal is triggered to recognize the specified gesture and execute the function operation corresponding to it. Or, if the target image acquired by the first image acquisition component and/or the second image acquisition component is an image containing at least one face, the first processing component is triggered to process the target event as identifying the face direction in the target image or detecting the number of faces in it.
In an embodiment, triggering the terminal's response may include triggering the terminal to run a first application, where the first application is configured to respond to the target image acquired by the first image acquisition component or the second image acquisition component for triggering the terminal to perform target event processing. When such a target image is acquired, the terminal can automatically run the first application according to it. For example, when the second image acquisition component acquires an image containing a two-dimensional code, the terminal can directly call an application capable of parsing the two-dimensional code, so that the user obtains the information in the code without first opening the application manually. This way of obtaining information is convenient and fast, improving the user experience.
Fig. 6 is a flowchart illustrating another terminal control method according to an exemplary embodiment, and as shown in fig. 6, the terminal control method includes the following steps S21 to S24.
In step S21, at least one of the first image acquisition component and the second image acquisition component is controlled by the first processing component to be in an on state.
In step S22, if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing, processing of the target event is triggered.
In step S23, the operation mode of the terminal is determined.
In the embodiment of the disclosure, in order to avoid useless power consumption, the on or off state of the first image acquisition component and the second image acquisition component can be determined according to the working mode of the terminal; this saves power while helping to ensure the accuracy of the terminal's active sensing. The working modes may include a screen-off mode, a privacy mode, or a browsing mode. The screen-off mode may cover both the off-screen (black screen) and wake-up (bright screen) states. The privacy mode may include launching privacy-class applications that involve personal privacy, such as short messages, mails, instant messaging applications, or financial management applications. The browsing mode may include launching applications such as video players or reading software.
In step S24, the first image acquisition component and/or the second image acquisition component is kept in an on state according to the working mode.
In the embodiment of the present disclosure, different acquisition strategies are set for different working modes; in each strategy, the on or off state of the first image acquisition component and the second image acquisition component may differ. The terminal then determines the corresponding acquisition strategy according to its working mode, keeps the first image acquisition component and/or the second image acquisition component in an on state, and turns off any image acquisition component that should be off, thereby saving power.
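A per-mode acquisition strategy can be represented as a lookup from working mode to camera on/off assignments. The sketch below is illustrative only: the patent leaves the concrete per-mode assignments to the implementer, so the particular on/off values here are assumptions.

```python
# Illustrative acquisition strategies keyed by working mode. The exact
# on/off assignments below are assumptions for illustration; the patent
# does not fix them.
ACQUISITION_STRATEGY = {
    "screen_off": {"front": True, "rear": False},
    "privacy":    {"front": True, "rear": True},
    "browsing":   {"front": True, "rear": True},
}

def apply_strategy(mode):
    """Return which image acquisition components stay in the on state."""
    strategy = ACQUISITION_STRATEGY[mode]
    return [camera for camera, keep_on in strategy.items() if keep_on]
```

Any camera absent from the returned list would be turned off, which is how the strategy saves power.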
Through the above embodiment, the terminal can trigger the corresponding response, in combination with the determined working mode, according to the target image acquired by the first image acquisition component and/or the second image acquisition component in the on state for triggering target event processing. This simplifies user operation while saving power.
In an implementation scenario, the first processing component determines whether the images acquired by the first image acquisition component and the second image acquisition component are target images for triggering the terminal to perform target event processing. In the terminal control interaction diagram shown in fig. 7, the first processing component is an application-specific integrated chip. When the chip determines that an image acquired by the first or second image acquisition component is such a target image, it sends the output information in the target image to the application processor, triggering the application processor to respond and run the corresponding first application. While the first application runs, the first processing component adjusts the on or off state of the first image acquisition component and the second image acquisition component according to the working mode of the terminal, keeping the first image acquisition component and/or the second image acquisition component in an on state.
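The hand-off from the application-specific integrated chip to the application processor can be sketched as follows. The class name, the classifier callable, and the processor interface are hypothetical stand-ins, not the patent's actual components.

```python
# Minimal sketch of the ASIC-to-application-processor hand-off described
# above; `classify` and `application_processor` are hypothetical stand-ins.
class FirstProcessingComponent:
    """Stands in for the application-specific integrated chip."""

    def __init__(self, classify, application_processor):
        self.classify = classify        # image -> target-event kind, or None
        self.ap = application_processor # receives the output information

    def on_image(self, image):
        kind = self.classify(image)
        if kind is not None:
            # Only target images wake the application processor.
            self.ap(kind)
            return True
        return False
```

Non-target images never reach the application processor, which is what keeps the always-on sensing path low-power.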
In another implementation scenario, the first image acquisition component is a front camera of the terminal and the second image acquisition component is a rear camera. Table 1 shows one acquisition strategy. As shown in table 1, the working modes of the terminal may be pre-divided into four working modes (working mode 1 through working mode 4), and for each mode a different acquisition strategy and the information of the target image that triggers target event processing in that mode are determined. After the working mode is determined, the terminal can be controlled based on the acquisition strategy corresponding to that mode. In one example, the acquisition strategy may further include a time limit restricting the frequency of the information output by the first processor, so as to improve the accuracy of the terminal's trigger responses based on active sensing. In another example, the acquisition strategy may also limit the power-consumption budget in order to save the power consumption of the terminal.
TABLE 1 (the acquisition strategies for working modes 1-4; rendered as an image in the original publication)
In yet another implementation scenario, the types of working modes may include a standard mode, a standby mode, and a high-performance mode. As shown in table 2, the frequency at which the front and rear cameras acquire images may differ between these modes; table 2 shows another acquisition strategy. In the standby mode, if no face is acquired, the front camera performs refresh scanning at a low frame rate (Frames Per Second, fps) and the rear camera is not refreshed or is turned off. If a face is acquired, the front camera refreshes at 1, 2, or 3 fps. In the high-performance mode, refreshing may be performed at 10 fps when the information to be output by the first processor is the face direction, and at 1 fps when the number of faces is being detected.
TABLE 2 (the per-mode camera refresh rates; rendered as an image in the original publication)
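The frame-rate selection described above can be sketched as a small function. The 10 fps and 1 fps high-performance values and the 1-3 fps face-detected range follow the text; the pre-detection standby rate and the standard-mode baseline are assumptions, since the original values appear only in the table image.

```python
# Frame-rate selection sketched from the described behavior. Values not
# stated in the text (the pre-detection standby rate and the standard-mode
# baseline) are assumed placeholders.
def refresh_fps(mode, face_detected=False, output="face_direction"):
    """Return the front camera's refresh rate in frames per second."""
    if mode == "standby":
        # Low-power scan before a face appears; 1-3 fps once one is found.
        return 2 if face_detected else 1
    if mode == "high_performance":
        # Face direction needs a higher rate than a mere face count.
        return 10 if output == "face_direction" else 1
    return 1  # standard mode: assumed baseline
```

The point of the strategy is that the rate rises only when the terminal actually needs finer-grained sensing, keeping average power low.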
In yet another implementation scenario, the control flow of the terminal may be as shown in fig. 8, a schematic diagram of a terminal control flow. In the screen-off mode, while the terminal is dormant the front camera is kept on and the rear camera is off; when the user picks up the terminal or the front camera detects a face, the first processor turns the front camera off and keeps the rear camera on. After unlocking, the front camera stays off and the rear camera stays on. In the privacy mode, for example when viewing a short message, the first processor keeps both the front and rear cameras on; when the face-count information output by the first processor is 1, the front camera is turned off, and when it is 2, the front camera is turned on. When the short-message interface is closed, the front camera is turned off and the rear camera is kept on. In the browsing mode, both the front and rear cameras are kept on. In the screen-off mode, the screen locks after 15 seconds (s) without operation; after 5 s without operation, both the front and rear cameras are turned on. If a face is detected or the terminal is operated, the front camera is turned off. If there is still no operation for 15 s, the terminal locks the screen and keeps the front and rear cameras on.
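The idle-time behavior in the screen-off flow (5 s enables the cameras, 15 s locks the screen) can be sketched as a threshold check. The action names are hypothetical labels; only the 5 s and 15 s thresholds come from the text.

```python
# Sketch of the idle timing in the screen-off flow described above:
# after 5 s without operation both cameras turn on to watch for a face
# or a pick-up; after 15 s without operation the screen locks.
def idle_action(seconds_idle):
    """Return the terminal's action for a given idle duration."""
    if seconds_idle >= 15:
        return "lock_screen"
    if seconds_idle >= 5:
        return "enable_both_cameras"
    return "no_change"
```

Checking the longer threshold first keeps the two windows from overlapping.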
In yet another implementation scenario, fig. 9 is a schematic diagram of another terminal control flow. When the rear camera acquires a two-dimensional code image, the first processor determines that it is a target image for triggering the terminal to perform target event processing and triggers the terminal to respond: the phone is woken up, the camera is turned on to scan, and the two-dimensional code is recognized. If recognition succeeds, the terminal enters a first application capable of parsing the two-dimensional code; if the code cannot be recognized, it is acquired again. When the terminal is already unlocked, it enters the first application directly. The user thus obtains the parsed result of the two-dimensional code directly, with less manual operation and a better user experience.
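The recognize-or-reacquire loop of fig. 9 can be sketched as follows. `decode`, `launch_first_app`, and the retry bound are hypothetical stand-ins; the patent does not specify a retry limit.

```python
# Hedged sketch of the two-dimensional-code flow in fig. 9. `decode` and
# `launch_first_app` are hypothetical stand-ins for the real components,
# and `max_tries` is an assumed bound on re-acquisition.
def scan_until_recognized(frames, decode, launch_first_app, max_tries=3):
    """Re-acquire until a code decodes, then enter the first application."""
    for image in frames[:max_tries]:
        payload = decode(image)
        if payload is not None:
            launch_first_app(payload)
            return payload
    return None  # give up after max_tries re-acquisitions
```

In the successful case the user lands directly in the application that parses the code, matching the "less manual operation" claim in the text.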
In the present disclosure, any of the above terminals and terminal control methods may use, but is not limited to, any of the following implementation scenarios of active sensing: 1. When the user lies on their side and does not wish the screen to rotate, the orientation of the user's face is detected so that the screen does not rotate. 2. When it is inconvenient for the user to touch the screen directly, the user can operate it with contactless hand gestures. 3. When the user is viewing private information and another person is detected watching the screen, the information can be encrypted automatically. 4. When the user looks at the screen, the screen lights up automatically. 5. When the terminal detects that the user has not watched the screen for a period of time, the screen turns off automatically. 6. When the rear camera detects a two-dimensional code, the automatically acquired information can prompt the user to enter the corresponding first application.
Based on the same concept, the embodiment of the present disclosure further provides a terminal control device, which is applied to any one of the terminals provided by the present disclosure.
It is to be understood that, in order to implement the above functions, the terminal control apparatus provided in the embodiments of the present disclosure includes hardware structures and/or software modules corresponding to each function. In combination with the units and algorithm steps of the examples disclosed herein, the disclosed embodiments can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality differently for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Fig. 10 is a block diagram illustrating a terminal control apparatus according to an exemplary embodiment. Referring to fig. 10, the terminal control device 200 includes a control unit 201 and a trigger unit 202.
A control unit 201, configured to control at least one of the first image capturing component and the second image capturing component to be in an on state through the first processing component.
The triggering unit 202 is configured to trigger processing of the target event if the first image acquisition component and/or the second image acquisition component acquires a target image for triggering the terminal to perform target event processing.
In one embodiment, the triggering unit triggers processing of the target event in the following manner: the terminal is triggered to run a first application, and the first application is used for responding to the target image acquired by the first image acquisition component or the second image acquisition component and executing the target event corresponding to the target image.
In another embodiment, the terminal control device 200 further includes: a determining unit 203, configured to determine a working mode of the terminal, where the working mode includes a screen-off mode, a privacy mode, or a browsing mode. The control unit is further configured to keep the first image acquisition component and/or the second image acquisition component in an on state according to the working mode.
In another embodiment, the triggering unit 202 triggers the processing target event according to the target image acquired by the first image acquisition component and/or the second image acquisition component for triggering the terminal to perform the target event processing in the following manner: if the first image acquisition component and/or the second image acquisition component acquire the two-dimensional code image, the terminal is triggered to identify the two-dimensional code; or if the first image acquisition component and/or the second image acquisition component acquire an image with a designated gesture, triggering the terminal to identify the designated gesture and executing a function operation corresponding to the designated gesture; or, if the first image acquisition component and/or the second image acquisition component acquire an image with at least one face, the terminal is triggered to identify the face direction in the target image or detect the number of faces in the target image.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 11 is a block diagram illustrating a terminal control device 300 according to an example embodiment. For example, the terminal control device 300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 11, the terminal control apparatus 300 may include one or more of the following components: a processing component 302, a memory 304, a power component 306, a multimedia component 308, an audio component 310, an input/output (I/O) interface 312, a sensor component 314, and a communication component 316.
The processing component 302 generally controls the overall operation of the terminal control device 300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 302 may include one or more processors 320 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 302 can include one or more modules that facilitate interaction between the processing component 302 and other components. For example, the processing component 302 may include a multimedia module to facilitate interaction between the multimedia component 308 and the processing component 302.
The memory 304 is configured to store various types of data to support operations at the terminal control device 300. Examples of such data include instructions for any application or method operating on the terminal control device 300, contact data, phonebook data, messages, pictures, videos, and the like. The memory 304 may be implemented by any type or combination of volatile and non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 306 provides power to the various components of the terminal control device 300. The power components 306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal control device 300.
The multimedia component 308 includes a screen providing an output interface between the terminal control device 300 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 308 includes a front camera and/or a rear camera. When the terminal control device 300 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 310 is configured to output and/or input audio signals. For example, the audio component 310 includes a Microphone (MIC) configured to receive an external audio signal when the terminal control device 300 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 304 or transmitted via the communication component 316. In some embodiments, audio component 310 also includes a speaker for outputting audio signals.
The I/O interface 312 provides an interface between the processing component 302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 314 includes one or more sensors for providing various aspects of state assessment for the terminal control device 300. For example, the sensor assembly 314 may detect an open/closed state of the terminal control device 300, the relative positioning of components such as the display and keypad of the terminal control device 300, a change in position of the terminal control device 300 or one of its components, the presence or absence of user contact with the terminal control device 300, the orientation or acceleration/deceleration of the terminal control device 300, and a change in its temperature. The sensor assembly 314 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 316 is configured to facilitate wired or wireless communication between the terminal control apparatus 300 and other devices. The terminal control device 300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 316 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 316 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal control device 300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 304 comprising instructions, executable by the processor 320 of the terminal control device 300 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is further understood that the use of "a plurality" in this disclosure means two or more, and other terms are analogous. "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another, and do not indicate a particular order or degree of importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that, unless otherwise specified, "connected" includes direct connections between the two without the presence of other elements, as well as indirect connections between the two with the presence of other elements.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. A terminal, characterized in that the terminal comprises a first image acquisition component, a second image acquisition component and a first processing component in an operational state, wherein,
the first processing component is provided with a first interface and a second interface, the first interface is connected with the first image acquisition component, and the second interface is connected with the second image acquisition component; the first processing component is used for controlling at least one of the first image acquisition component and the second image acquisition component to be in an opening state.
2. A terminal according to claim 1, characterized in that the first processing component is an application specific integrated chip.
3. A terminal according to claim 1 or 2, characterized in that the operating power consumption of the first processing means is less than or equal to a first power consumption threshold.
4. The terminal of claim 1, further comprising:
a second processing component connected with the first processing component, wherein the second processing component is triggered to operate through the first processing component when the first image acquisition component and/or the second image acquisition component acquires an image.
5. A terminal control method is applied to a terminal, the terminal comprises a first image acquisition component, a second image acquisition component and a first processing component in a running state, the first processing component is provided with a first interface and a second interface, the first interface is connected with the first image acquisition component, the second interface is connected with the second image acquisition component, and the terminal control method comprises the following steps:
controlling at least one of the first image acquisition component and the second image acquisition component to be in an open state through the first processing component;
and if the first image acquisition component and/or the second image acquisition component acquire a target image for triggering the terminal to process the target event, triggering to process the target event.
6. The terminal control method according to claim 5, wherein triggering the processing of the target event comprises:
and triggering the terminal to run a first application, wherein the first application is used for responding to the target image acquired by the first image acquisition component or the second image acquisition component and executing the target event corresponding to the target image.
7. The terminal control method according to claim 5 or 6, characterized in that the terminal control method further comprises:
determining a working mode of the terminal, wherein the working mode comprises a screen-off mode, a privacy mode, or a browsing mode;
and according to the working mode, keeping the first image acquisition component and/or the second image acquisition component in an opening state.
8. The terminal control method according to claim 5, wherein triggering processing of the target event if the first image capturing component and/or the second image capturing component captures a target image for triggering the terminal to perform target event processing comprises:
if the first image acquisition component and/or the second image acquisition component acquire a two-dimensional code image, triggering the terminal to identify the two-dimensional code; or
If the first image acquisition component and/or the second image acquisition component acquire images with specified gestures, triggering the terminal to identify the specified gestures and executing functional operations corresponding to the specified gestures; or
And if the first image acquisition component and/or the second image acquisition component acquire an image with at least one face, triggering the terminal to identify the face direction in the target image or detect the number of faces in the target image.
9. A terminal control device, applied to a terminal, the terminal comprising a first image acquisition component, a second image acquisition component, and a first processing component in a running state, wherein the first processing component has a first interface and a second interface, the first interface is connected with the first image acquisition component, and the second interface is connected with the second image acquisition component; the terminal control device comprising:
the control unit is used for controlling at least one of the first image acquisition component and the second image acquisition component to be in an opening state through the first processing component;
and the triggering unit is used for triggering the processing of the target event if the first image acquisition component and/or the second image acquisition component acquire the target image for triggering the terminal to process the target event.
10. The terminal control device according to claim 9, wherein the triggering unit triggers processing of the target event in the following manner:
and triggering the terminal to run a first application, wherein the first application is used for responding to the target image acquired by the first image acquisition component or the second image acquisition component and executing the target event corresponding to the target image.
11. The terminal control device according to claim 9 or 10, characterized by further comprising:
the terminal comprises a determining unit, a judging unit and a processing unit, wherein the determining unit is used for determining a working mode of the terminal, and the working mode comprises a screen turning mode, a privacy mode or a browsing mode;
the control unit is further configured to keep the first image acquisition component and/or the second image acquisition component in an open state according to the working mode.
12. The terminal control device according to claim 9, wherein the triggering unit triggers processing of the target event according to the target image acquired by the first image acquisition component and/or the second image acquisition component for triggering the terminal to perform target event processing in the following manner:
if the first image acquisition component and/or the second image acquisition component acquire a two-dimensional code image, triggering the terminal to identify the two-dimensional code; or
If the first image acquisition component and/or the second image acquisition component acquire images with specified gestures, triggering the terminal to identify the specified gestures and executing functional operations corresponding to the specified gestures; or
And if the first image acquisition component and/or the second image acquisition component acquire an image with at least one face, triggering the terminal to identify the face direction in the target image or detect the number of faces in the target image.
13. A terminal control device, characterized in that the terminal control device comprises:
a memory to store instructions; and
a processor for invoking the memory-stored instructions to perform the terminal control method according to any one of claims 5-8.
14. A computer-readable storage medium in which instructions are stored, which when executed by a processor, perform a terminal control method according to any one of claims 5 to 8.
CN202110001256.4A 2021-01-04 2021-01-04 Terminal, terminal control method, terminal control device, and storage medium Pending CN114764272A (en)

Publications (1)

Publication Number Publication Date
CN114764272A (en) 2022-07-19

Family

ID=82363691


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination