WO2019223493A1 - Object recognition method and mobile terminal - Google Patents

Object recognition method and mobile terminal

Info

Publication number
WO2019223493A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
mobile terminal
operating state
state
scanned object
Prior art date
Application number
PCT/CN2019/084500
Other languages
English (en)
Chinese (zh)
Inventor
李浩荣
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Priority to JP2020564722A: JP7221305B2 (ja)
Priority to EP19807463.5A: EP3816768A4 (fr)
Priority to KR1020207036650A: KR102497444B1 (ko)
Publication of WO2019223493A1 (fr)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/3287Power saving characterised by the action undertaken by switching off individual functional units in the computer system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3223Realising banking transactions through M-devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/326Payment applications installed on the mobile devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/30Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/327Short range or proximity payments by means of M-devices
    • G06Q20/3276Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being read by the M-device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions
    • G06Q20/40145Biometric identity checks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/65Control of camera operation in relation to power supply
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • the present disclosure relates to the field of communication technologies, and in particular, to an object recognition method and a mobile terminal.
  • users need to perform a series of operations when performing functions such as face recognition or code scanning payment.
  • The following takes code scanning payment by the user as an example.
  • the user first needs to control the mobile terminal to turn on the screen, then unlock the mobile terminal, then open an application in the mobile terminal, and enable the code scanning function in the opened application to perform code scanning payment.
  • Embodiments of the present disclosure provide an object recognition method and a mobile terminal to solve the problem of tedious operations when the mobile terminal recognizes a scanned object.
  • an embodiment of the present disclosure provides an object recognition method, including:
  • When it is determined, according to the first collection result, that the scanned object meets a preset condition, the camera is controlled to enter a second operating state, the scanned object is collected by the camera in the second operating state to obtain a second collection result, the scanned object is identified according to the second collection result, the mobile terminal is controlled to light up the screen, and the recognition result corresponding to the scanned object is displayed;
  • the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
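The claimed two-state flow can be sketched as a small state machine. The following is a hypothetical Python sketch; the class, the power figures, and the placeholder predicates are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch of the claimed two-state scanning flow.
# Power figures and all names are illustrative only.

LOW_POWER_MW = 60     # assumed first-operating-state power budget
HIGH_POWER_MW = 300   # assumed second-operating-state power budget

class Camera:
    def __init__(self):
        self.power_mw = LOW_POWER_MW  # starts in the first operating state

    def capture(self):
        # Coarse frame in the first state, detailed frame in the second.
        return "coarse_frame" if self.power_mw == LOW_POWER_MW else "detailed_frame"

    def enter_second_state(self):
        self.power_mw = HIGH_POWER_MW

def meets_preset_condition(frame):
    # Placeholder for the on-camera check (QR outline, face features, ...).
    return frame == "coarse_frame"

def recognize(frame):
    # Placeholder for full recognition on the high-power capture.
    return f"recognition_result({frame})"

def scan_while_screen_off(camera):
    first_result = camera.capture()              # first collection result
    if not meets_preset_condition(first_result):
        return None                              # stay in low power, no wake-up
    camera.enter_second_state()                  # switch to second operating state
    second_result = camera.capture()             # second collection result
    return recognize(second_result)              # result shown on the lit screen
```

Only the condition check runs at low power; the expensive capture and recognition happen after the camera has been promoted to the second operating state.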
  • an embodiment of the present disclosure further provides a mobile terminal, where the mobile terminal has a camera, and the mobile terminal includes:
  • an acquisition module, configured to collect a scanned object through the camera in a first operating state when the mobile terminal is in an off-screen state, to obtain a first collection result;
  • a display module, configured to: control the camera to enter a second operating state when it is determined according to the first collection result that the scanned object meets a preset condition; collect the scanned object through the camera in the second operating state to obtain a second collection result; identify the scanned object according to the second collection result; control the mobile terminal to light up the screen; and display the recognition result corresponding to the scanned object;
  • the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • an embodiment of the present disclosure further provides a mobile terminal, including: a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the steps in the object recognition method described above are implemented.
  • an embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the steps in the object recognition method described above are implemented.
  • In the embodiments of the present disclosure, when the mobile terminal is in an off-screen state, a scanned object is collected by the camera in a first operating state to obtain a first collection result; when it is determined according to the first collection result that the scanned object meets a preset condition, the camera is controlled to enter a second operating state, the scanned object is collected by the camera in the second operating state to obtain a second collection result, the scanned object is identified according to the second collection result, the mobile terminal is controlled to light up the screen, and the recognition result corresponding to the scanned object is displayed; wherein the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • the mobile terminal can control the camera to be in the first running state when the screen is turned off, so that the camera can collect scanned objects at any time without the need for the user to manually open the application, which is convenient for users to operate and can improve scanning efficiency.
  • FIG. 1 is one of the flowcharts of the object recognition method provided by the embodiment of the present disclosure
  • FIG. 3 is one of the structural diagrams of a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 4 is a second structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 5 is a structural diagram of a display module in a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 6 is a third structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of an object recognition method according to an embodiment of the present disclosure. As shown in FIG. 1, the method includes the following steps:
  • Step 101 When the mobile terminal is in a screen-off state, collect a scan object through the camera in a first running state to obtain a first acquisition result.
  • the off-screen state can be understood as a state in which the mobile terminal is running but no information is displayed on the screen, for example, the state of the screen after the mobile terminal is locked and nothing is displayed. In this step, when the mobile terminal is in an off-screen state, the camera is on and operates in the first operating state.
  • the first operating state may be a state in which the operating power of the camera is less than a preset power value, for example, a low power consumption operating state in which the operating power is less than 50 mW, 60 mW, or 100 mW.
  • the user can manipulate the mobile terminal to control the camera to remain on in the first operating state, or the mobile terminal can automatically turn on the camera and run it in the first operating state at power-on, that is, the camera stays on whether the screen is lit or off.
  • the camera can also be controlled to be turned on only when the screen is off, and turned off when the screen is on, which can further save power consumption.
  • the first operating state may also be a state in which the number of active pixels is smaller than a preset pixel value. Since fewer pixels are active in the first operating state, the power consumption of the camera can be reduced.
  • the first operating state is a state in which some pixel units of the camera are turned on
  • the second operating state is a state in which all pixel units of the camera are turned on.
  • the mobile terminal may use an existing camera and control a part of its pixel units to be in a normally-on state in the first operating state, that is, on whether the screen of the mobile terminal is off or lit, where one pixel unit may correspond to one pixel.
  • For example, in the first operating state, 300,000 pixels are evenly selected to work, while the remaining pixels do not work.
  • Another example is to turn on a 5-megapixel camera in a low-resolution, low-power-consumption mode, such as 160 × 120, to control the camera to operate in the first operating state.
  • the power of the camera when only some of the pixel units work is less than the preset power value.
  • the camera can use the turned-on part of the pixel units to collect the scanned object at any time, which saves power consumption, and the existing camera can be reused without providing a separate low-power-consumption camera.
  • when the camera is in the first operating state, the mobile terminal may be in a bright-screen or off-screen state. In this way, the camera can collect scanned objects at any time without requiring the user to open an application for collection, which is convenient for the user.
  • the above second operating state is to turn on all the pixel units of the camera.
  • all the pixel units are controlled to work, so that a clearer image can be obtained and the recognition rate can be improved.
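One way to read the "evenly selected" subset of pixel units above is strided subsampling of the sensor grid. A hypothetical sketch follows; the 300,000-pixel target comes from the example above, while the sensor resolution and the stride arithmetic are our illustration:

```python
import math

def even_pixel_subset(width, height, target_active):
    """Pick an (approximately) even grid of active pixel units.

    Returns the sampling stride and the number of pixel units that
    would be driven in the low-power first operating state.
    """
    total = width * height
    # Choose a stride so roughly `target_active` evenly spaced pixels stay on.
    stride = max(1, int(math.sqrt(total / target_active)))
    active = math.ceil(width / stride) * math.ceil(height / stride)
    return stride, active

# e.g. an assumed 8 MP sensor (3264 x 2448) throttled to ~300,000 active pixels
stride, active = even_pixel_subset(3264, 2448, 300_000)
```

With these assumed numbers, every 5th pixel in each direction stays on, leaving about 320,000 of the ~8 million pixel units active, in line with the "some pixel units on" first state versus the "all pixel units on" second state.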
  • Step 102 When it is determined according to the first collection result that the scanned object meets a preset condition, control the camera to enter a second operating state, collect the scanned object through the camera in the second operating state to obtain a second collection result, identify the scanned object according to the second collection result, control the mobile terminal to light up the screen, and display the recognition result corresponding to the scanned object; wherein the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • the mobile terminal may store a recognition algorithm of the scanned object in advance, for example, a face algorithm, a two-dimensional code recognition algorithm, a gesture recognition algorithm, etc.
  • the recognition algorithm of the scanned object may be stored in the camera. In this way, after the camera collects the scanned object, the camera can initially determine, based on the first collection result and the characteristics of the scanned object, whether the scanned object meets the preset conditions, so that when it does, the mobile terminal is awakened to further collect and identify the scanned object.
  • the preset condition may be a condition set and stored in advance by the mobile terminal. If the preset condition is met, it can be understood that the scanning object is a preset type, and the preset type may include a two-dimensional code, a gesture, and a human face. In this way, the mobile terminal can further identify only the scanning objects that meet the preset conditions.
  • For example, the camera integrates a two-dimensional code recognition algorithm: after the camera collects an image, it determines whether the image has the characteristics of a two-dimensional code. Another example is the integration of a face recognition algorithm in the camera: after the camera collects an image, it judges whether the collected image has the characteristics of a face image.
  • the camera enters a second operating state, and the scanning object is collected by the camera in the second operating state. Since the operating power in the second operating state is higher than the operating power in the first operating state, the camera can obtain a clearer image in the second operating state.
  • when the mobile terminal recognizes the scanned object based on the second collection result collected by the camera in the second operating state, on the one hand, this prevents the camera from misidentifying the scanned object in the first operating state; on the other hand, the specific content of the scanned object can be identified more accurately.
  • because the power of the camera in the first operating state is relatively low, power consumption can be saved by collecting and identifying scanned objects through the camera in the first operating state.
  • the scanned objects that meet the preset conditions are further collected to prevent misidentification, and the specific content or specific information of the scanned object, that is, the recognition result, can be obtained more accurately, so that the mobile terminal displays the recognition result on the screen. For example, when the camera recognizes that the scanned object is a two-dimensional code, it can further identify that the specific content of the two-dimensional code is a payment two-dimensional code, thereby displaying a payment interface on the screen.
  • if the camera recognizes that the scanned object does not meet the preset conditions, the scanned object is not further identified, which saves power consumption.
  • the mobile terminal has a rear camera, and a two-dimensional code recognition algorithm is integrated in the rear camera.
  • the rear camera of the mobile terminal is in a normally-on and low-power-consumption operating state, that is, a first operating state.
  • the low-power-consumption operating state is a state in which only a part of the pixel units of the camera are turned on and the operating power is lower than a preset power value.
  • the normally-on state is a state in which the camera remains on regardless of whether the screen of the mobile terminal is lit or off.
  • the user can point the rear camera of the mobile terminal at the payment QR code. Since the rear camera is always on in the low-power-consumption state, the camera collects an image containing the two-dimensional code and further determines whether the image contains a two-dimensional code.
  • the mobile terminal controls all the pixel units of the camera to be turned on, and the camera enters a second operating state. The mobile terminal collects the two-dimensional code image through all the pixel units of the camera, and further recognizes the information in the two-dimensional code, and displays the payment interface on the screen according to the information in the two-dimensional code.
  • in the whole process, the user does not need to perform any interactive operation to scan the code, and the payment is fast.
  • the two-dimensional code is collected and identified by the camera in the first operating state, which can reduce power consumption.
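The on-camera "does this image contain a two-dimensional code" check in the walkthrough above can be approximated by the classic QR finder-pattern test: along a scanline crossing a finder pattern, the dark/light run lengths appear in roughly a 1:1:3:1:1 ratio. A toy Python sketch follows; the ratio comes from the QR Code symbology itself, while the names, the tolerance, and the binary input are our illustration, not the patent's algorithm:

```python
def run_lengths(scanline):
    """Collapse a binary scanline (1 = dark module) into [value, length] runs."""
    runs = []
    for px in scanline:
        if runs and runs[-1][0] == px:
            runs[-1][1] += 1
        else:
            runs.append([px, 1])
    return runs

def has_finder_ratio(scanline, tolerance=0.5):
    """True if five consecutive runs, starting on a dark run, match the
    1:1:3:1:1 QR finder-pattern ratio within the given tolerance."""
    runs = run_lengths(scanline)
    expected = [1, 1, 3, 1, 1]
    for i in range(len(runs) - 4):
        window = runs[i:i + 5]
        if window[0][0] != 1:          # a finder pattern starts on dark
            continue
        unit = sum(r[1] for r in window) / 7.0   # 1+1+3+1+1 = 7 modules
        if all(abs(r[1] - e * unit) <= tolerance * unit
               for r, e in zip(window, expected)):
            return True
    return False

# A scanline crossing a finder pattern: dark, light, 3x dark, light, dark.
qr_like = [0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0]
plain = [0, 0, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0]
```

On a real sensor such a test would run over many scanlines of the coarse first-state frame; only when it fires would the terminal pay for the full-resolution second-state capture and a complete QR decode.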
  • the above object recognition method may be applied to a mobile terminal having a camera, such as a mobile phone, a tablet computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), or a wearable device.
  • In the embodiments of the present disclosure, when the mobile terminal is in an off-screen state, a scanned object is collected by the camera in a first operating state to obtain a first collection result; when it is determined according to the first collection result that the scanned object meets a preset condition, the camera is controlled to enter a second operating state, the scanned object is collected by the camera in the second operating state to obtain a second collection result, the scanned object is identified according to the second collection result, the mobile terminal is controlled to light up the screen, and the recognition result corresponding to the scanned object is displayed; wherein the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • the mobile terminal can control the camera to be in the first running state when the screen is off, so that the camera can collect scanned objects at any time without the need for the user to manually open an application, which is convenient for user operations and can improve scanning efficiency. And based on the scanning object collected by the camera in the second operating state for recognition, the accuracy of recognition can be improved and errors can be reduced.
  • the main difference between this embodiment and the foregoing embodiment lies in that the scanned object is collected by the camera in the second operating state only when the camera in the first operating state judges that the characteristics of the scanned object match the preset features.
  • FIG. 2 is a flowchart of an object recognition method according to an embodiment of the present disclosure. As shown in FIG. 2, the method includes the following steps:
  • Step 201 When the mobile terminal is in an off-screen state, collect a scan object through the camera in a first running state to obtain a first acquisition result.
  • For the specific implementation of this step, refer to step 101. To avoid repetition, details are not described herein again.
  • when the mobile terminal is in a screen-off state, before the scanned object is collected by the camera in the first operating state, the method further includes: turning on the camera, and controlling the camera to enter the first operating state.
  • the user can turn on the camera by operating the mobile terminal, or the mobile terminal can automatically turn on the camera when it is turned on.
  • the mobile terminal can set the switching control of the camera in the first operating state and the second operating state.
  • the user can operate the switching control to control the camera in the first operating state.
  • the user can flexibly operate according to the application scenario and the state of the mobile terminal.
  • the mobile terminal can also automatically control the camera to be in the first operating state when the camera is turned on, to save power consumption of the mobile terminal and make it convenient for the user to manipulate the mobile terminal to scan the scanned object. It should be noted that this implementation manner can also be applied to the embodiment corresponding to FIG. 1 and achieve the same beneficial effects.
  • turning on the camera includes turning on the camera when the mobile terminal is turned on or off.
  • the mobile terminal can turn on the camera when it is turned on, and control the camera to be always on when the screen is off and on, so that the user can use the camera of the mobile terminal to scan at any time.
  • the mobile terminal can also turn on the camera when it is detected that the screen of the mobile terminal is turned off, and turn off the camera when the screen is turned on, which can save power consumption and facilitate users to scan when the mobile terminal is turned off. It should be noted that this implementation manner can also be applied to the embodiment corresponding to FIG. 1 and achieve the same beneficial effects.
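The two turn-on policies described above — camera always on, versus on only while the screen is off — amount to a small listener on screen-state changes. A hypothetical Python sketch (the class and event representation are ours, not an Android or iOS API):

```python
class CameraPowerPolicy:
    """Two policies from the embodiments: keep the low-power camera
    always on, or keep it on only while the screen is off."""

    def __init__(self, only_when_screen_off):
        self.only_when_screen_off = only_when_screen_off
        # The always-on policy starts with the camera enabled at power-on.
        self.camera_on = not only_when_screen_off

    def on_screen_event(self, screen_is_on):
        # Screen-off policy: enable at screen-off, release at screen-on
        # to save power; the always-on policy ignores screen events.
        if self.only_when_screen_off:
            self.camera_on = not screen_is_on
```

The screen-off-only variant trades availability for power: the camera sensor is released the moment the screen lights up, matching the embodiment that "turns off the camera when the screen is turned on."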
  • the method further includes: when it is determined through the camera that the scanned object meets a preset condition, the mobile terminal outputs prompt information.
  • when the camera recognizes that the scanned object meets a preset condition, the mobile terminal outputs prompt information, which may specifically be a prompt through vibration or voice output.
  • Step 202 Identify the characteristics of the scanned object through the camera according to the first collection result.
  • a recognition algorithm for a scanned object may be integrated in the camera, for example, a face recognition algorithm, a gesture recognition algorithm, a two-dimensional code recognition algorithm, and the like.
  • the features of the scanned object can be identified according to the first acquisition result.
  • the features of the two-dimensional code may include contour features and the features of the anchor points of the two-dimensional code; the features of the face image may include facial features and distribution positions, and so on.
  • the camera can identify the characteristics of the scanned object, thereby determining whether the scanned object is a preset type of scanned object.
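As a toy illustration of this pre-classification step, the dispatcher below maps a set of detected features to a preset scan type. The feature keys (`finder_patterns`, `quad_contour`, facial-feature names) are invented for the example and are not taken from the disclosure:

```python
def classify_scanned_object(features: dict) -> str:
    """Map detected features to a preset scan type.

    A QR code is hinted at by its square outline plus (usually three)
    finder/anchor patterns; a face by the presence of facial features.
    A real recognizer would inspect image geometry, not dict keys.
    """
    if features.get("finder_patterns", 0) >= 3 and features.get("quad_contour", False):
        return "qr_code"
    if {"eyes", "nose", "mouth"} <= features.keys():
        return "face"
    return "unknown"
```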
  • Step 203: In a case where the camera determines that the features of the scanned object match preset features, control the camera to enter a second operating state, wherein the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • After the camera recognizes the features of the scanned object, it can compare them with preset features. If the similarity between the features of the scanned object and the preset features is greater than a preset similarity value, for example greater than 80%, the features of the scanned object match the preset features, that is, the scanned object meets the preset condition, and the mobile terminal controls the camera to enter the second operating state.
  • In this way, the scanned object is initially identified by the camera in the first operating state. Because the camera runs at low power in the first operating state, the power consumption of the mobile terminal is reduced and operation is convenient for the user.
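The "greater than 80%" comparison in step 203 can be sketched as a similarity test against a preset feature vector. Cosine similarity is one plausible choice of measure; the disclosure only requires that some similarity value exceed the preset threshold, so treat this as an assumed implementation:

```python
def matches_preset(features, preset, threshold=0.8):
    """Return True when the cosine similarity between the scanned object's
    feature vector and the preset feature vector exceeds the threshold
    (0.8 mirrors the 'greater than 80%' example in the description)."""
    dot = sum(a * b for a, b in zip(features, preset))
    norm_f = sum(a * a for a in features) ** 0.5
    norm_p = sum(b * b for b in preset) ** 0.5
    if norm_f == 0.0 or norm_p == 0.0:
        return False  # an empty/zero feature vector can never match
    return dot / (norm_f * norm_p) > threshold
```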
  • Step 204: Collect the scanned object through the camera in the second operating state to obtain a second acquisition result, identify the scanned object according to the second acquisition result, control the mobile terminal to light up the screen, and display the recognition result corresponding to the scanned object.
  • For a specific implementation of this step, refer to the related description in step 102; details are not described herein again.
  • In this way, when the mobile terminal is in the off-screen state, the user can quickly complete the scanning operation without unlocking the mobile terminal, which is convenient to operate; collecting the scanned object through the camera in the first operating state saves energy of the mobile terminal; and recognizing the scanned object based on the camera in the second operating state improves the accuracy of recognizing the scanned object.
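Steps 201 to 204 can be strung together as one pipeline. The `camera`, `recognizer`, and `screen` interfaces below are assumptions made for the sketch; the disclosure does not define such objects:

```python
def scan_pipeline(camera, recognizer, screen):
    """Two-stage scan: a cheap pre-scan in the first operating state, then a
    full-power capture, recognition, and screen wake-up only when the
    preset condition is met."""
    first = camera.capture(full_power=False)   # step 201: first collection result
    if not recognizer.meets_preset_condition(first):
        return None                            # stay dark, keep saving power
    second = camera.capture(full_power=True)   # steps 203-204: second state
    result = recognizer.recognize(second)
    screen.light_up(result)                    # light the screen, show result
    return result
```

Note that the screen is only lit on a successful match, which is exactly what lets the terminal keep scanning continuously without draining the battery.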
  • FIG. 3 is a structural diagram of a mobile terminal according to an embodiment of the present disclosure, where the mobile terminal has a camera.
  • the mobile terminal 300 includes a collection module 301 and a display module 302.
  • a collection module 301 configured to collect a scan object through the camera in a first running state when the mobile terminal is in an off-screen state to obtain a first collection result
  • a display module 302, configured to control the camera to enter a second operating state when it is determined according to the first acquisition result that the scanned object meets a preset condition, collect the scanned object through the camera in the second operating state to obtain a second acquisition result, identify the scanned object according to the second acquisition result, control the mobile terminal to light up the screen, and display the recognition result corresponding to the scanned object;
  • the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • the mobile terminal further includes:
  • the enabling module 303 is configured to enable the camera and control the camera to enter the first operating state.
  • the enabling module 303 is specifically configured to enable the camera when the mobile terminal is turned on or off.
  • the first operating state is a state where some pixel units of the camera are turned on
  • the second operating state is a state where all pixel units of the camera are turned on.
  • the display module 302 includes:
  • a recognition sub-module 3021 configured to identify a feature of the scanned object through the camera according to the first collection result
  • a state switching sub-module 3022 is configured to control the camera to enter a second operating state when the camera determines that the characteristics of the scanned object match preset features.
  • the mobile terminal 300 can implement various processes implemented by the mobile terminal in the method embodiments corresponding to FIG. 1 to FIG. 2. To avoid repetition, details are not described herein again.
  • The mobile terminal 300 can control the camera to be in the first operating state when the screen is off, so that the camera can collect scanned objects at any time without requiring the user to manually open an application program, which is convenient for the user and improves scanning efficiency.
  • the recognition based on the scanned object collected by the camera in the second operating state can improve the recognition accuracy and reduce errors.
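One way to picture the "some pixel units on" first state versus the "all pixel units on" second state is a skip-mode readout that samples only every n-th pixel. This is an analogy only, since the patent does not specify how the partial pixel units are selected:

```python
def subsample_frame(frame, step=2):
    """Stand-in for the first operating state: read only every `step`-th
    row and column, i.e. roughly a fraction 1/step**2 of the pixel units.
    `frame` is a list of rows of pixel values; step=1 reads everything,
    which corresponds to the second operating state."""
    return [row[::step] for row in frame[::step]]
```

A coarse frame like this is enough for the cheap pre-classification of step 202, while the full-resolution frame is reserved for the final recognition of step 204.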
  • the mobile terminal 600 includes, but is not limited to, a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611.
  • Those skilled in the art can understand that the mobile terminal may include more or fewer components than shown in the figure, combine some components, or use a different arrangement of components.
  • the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted mobile terminal, a wearable device, and a pedometer.
  • The processor 610 is configured to: when the mobile terminal is in an off-screen state, collect a scanned object through the camera in a first operating state to obtain a first acquisition result; and when it is determined according to the first acquisition result that the scanned object meets a preset condition, control the camera to enter a second operating state, collect the scanned object through the camera in the second operating state to obtain a second acquisition result, identify the scanned object according to the second acquisition result, control the mobile terminal to light up the screen, and display the recognition result corresponding to the scanned object; wherein the operating power of the camera in the second operating state is greater than the operating power of the camera in the first operating state.
  • the mobile terminal can control the camera to be in the first running state when the screen is off, so that the camera can collect scanned objects at any time without the need for the user to manually open an application, which is convenient for user operations and can improve scanning efficiency.
  • the processor 610 is further configured to turn on the camera and control the camera to enter the first operating state.
  • The turning on of the camera performed by the processor 610 includes turning on the camera when the mobile terminal is powered on, or turning on the camera when the screen is turned off.
  • the first operating state is a state where some pixel units of the camera are turned on
  • the second operating state is a state where all pixel units of the camera are turned on.
  • The processor 610 executes the step of controlling the camera to enter a second operating state when it is determined according to the first collection result that the scanned object meets a preset condition, which includes: identifying the features of the scanned object through the camera according to the first collection result; and controlling the camera to enter the second operating state when the camera determines that the features of the scanned object match preset features.
  • The radio frequency unit 601 may be used to receive and send signals during the transmission and reception of information or during a call; specifically, after receiving downlink data from a base station, it hands the data to the processor 610 for processing, and it sends uplink data to the base station.
  • the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 601 can also communicate with a network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 602, such as helping users to send and receive email, browse web pages, and access streaming media.
  • the audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into audio signals and output them as sound. Moreover, the audio output unit 603 may also provide audio output (for example, a call signal receiving sound, a message receiving sound, etc.) related to a specific function performed by the mobile terminal 600.
  • the audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 604 is used for receiving audio or video signals.
  • the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042.
  • The graphics processor 6041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames may be displayed on a display unit 606.
  • the image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602.
  • the microphone 6042 can receive sound, and can process such sound into audio data.
  • In a telephone call mode, the processed audio data can be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601.
  • the mobile terminal 600 further includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 6061 and/or the backlight when the mobile terminal 600 moves close to the ear.
  • The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary.
  • The sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described here again.
  • the display unit 606 is configured to display information input by the user or information provided to the user.
  • the display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 607 may be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the mobile terminal.
  • the user input unit 607 includes a touch panel 6071 and other input devices 6072.
  • The touch panel 6071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or another suitable object or accessory).
  • the touch panel 6071 may include a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 610, and receives and executes the commands sent by the processor 610.
  • The touch panel 6071 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 607 may further include other input devices 6072.
  • other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, and details are not described herein again.
  • the touch panel 6071 may be overlaid on the display panel 6061.
  • When the touch panel 6071 detects a touch operation on or near it, it transmits the operation to the processor 610 to determine the type of the touch event, and the processor 610 then provides a corresponding visual output on the display panel 6061 according to the type of the touch event.
  • Although the touch panel 6071 and the display panel 6061 are described here as two independent components that implement the input and output functions of the mobile terminal, in some embodiments the touch panel 6071 and the display panel 6061 can be integrated to implement the input and output functions of the mobile terminal; this is not specifically limited here.
  • the interface unit 608 is an interface through which an external device is connected to the mobile terminal 600.
  • The external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
  • The interface unit 608 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the mobile terminal 600, or may be used to transfer data between the mobile terminal 600 and an external device.
  • the memory 609 can be used to store software programs and various data.
  • The memory 609 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • The memory 609 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 610 is the control center of the mobile terminal and connects various parts of the entire mobile terminal through various interfaces and lines; by running or executing the software programs and/or modules stored in the memory 609 and calling the data stored in the memory 609, it performs the various functions of the mobile terminal and processes data, thereby monitoring the mobile terminal as a whole.
  • The processor 610 may include one or more processing units; optionally, the processor 610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 610.
  • the mobile terminal 600 may further include a power source 611 (such as a battery) for supplying power to various components.
  • Optionally, the power source 611 may be logically connected to the processor 610 through a power management system, thereby implementing functions such as charging management, discharging management, and power consumption management through the power management system.
  • In addition, the mobile terminal 600 may include some functional modules that are not shown, and details are not described herein again.
  • An embodiment of the present disclosure further provides a mobile terminal, including a processor 610, a memory 609, and a computer program stored on the memory 609 and executable on the processor 610, where the computer program, when executed by the processor 610, implements the processes of the foregoing object recognition method embodiments and can achieve the same technical effects; to avoid repetition, details are not described herein again.
  • An embodiment of the present disclosure further provides a computer-readable storage medium.
  • a computer program is stored on the computer-readable storage medium.
  • When the computer program is executed by a processor, the processes of the foregoing object recognition method embodiments are implemented, and the same technical effects can be achieved; to avoid repetition, details are not described herein again.
  • The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Accounting & Taxation (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Toxicology (AREA)
  • Finance (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computing Systems (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Power Sources (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to an object recognition method and a mobile terminal. The method comprises the steps of: when a mobile terminal is in an off-screen state, collecting a scanned object by means of a camera in a first operating state so as to obtain a first collection result; and when it is determined from the first collection result that the scanned object meets a preset condition, controlling the camera to enter a second operating state, collecting the scanned object by means of the camera in the second operating state so as to obtain a second collection result, recognizing the scanned object according to the second collection result, controlling a screen of the mobile terminal to light up, and displaying a recognition result corresponding to the scanned object.
PCT/CN2019/084500 2018-05-22 2019-04-26 Object recognition method and mobile terminal WO2019223493A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020564722A JP7221305B2 (ja) 2018-05-22 2019-04-26 Object recognition method and mobile terminal
EP19807463.5A EP3816768A4 (fr) 2018-05-22 2019-04-26 Object recognition method and mobile terminal
KR1020207036650A KR102497444B1 (ko) 2018-05-22 2019-04-26 Object recognition method and mobile terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810494006.7A CN110517034A (zh) 2018-05-22 2018-05-22 Object recognition method and mobile terminal
CN201810494006.7 2018-05-22

Publications (1)

Publication Number Publication Date
WO2019223493A1 true WO2019223493A1 (fr) 2019-11-28

Family

ID=68616528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/084500 WO2019223493A1 (fr) 2018-05-22 2019-04-26 Procédé de reconnaissance d'objets et terminal mobile

Country Status (5)

Country Link
EP (1) EP3816768A4 (fr)
JP (1) JP7221305B2 (fr)
KR (1) KR102497444B1 (fr)
CN (1) CN110517034A (fr)
WO (1) WO2019223493A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114127686A (zh) * 2020-04-27 2022-03-01 Huawei Technologies Co., Ltd. Method, apparatus, and terminal for opening an application
CN114860178A (zh) * 2021-01-18 2022-08-05 Huawei Technologies Co., Ltd. Screen projection method and electronic device
CN115705567B (zh) * 2021-08-06 2024-04-19 Honor Device Co., Ltd. Payment method and related apparatus
WO2023120060A1 (fr) * 2021-12-23 2023-06-29 Sony Semiconductor Solutions Corporation Electronic device, control method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105759935A (zh) * 2016-01-29 2016-07-13 Huawei Technologies Co., Ltd. Terminal control method and terminal
CN107944325A (zh) * 2017-11-23 2018-04-20 Vivo Mobile Communication Co., Ltd. Code scanning method, code scanning apparatus, and mobile terminal
CN108989668A (zh) * 2018-06-29 2018-12-11 Vivo Mobile Communication Co., Ltd. Camera operating method and mobile terminal
CN109151180A (zh) * 2018-07-27 2019-01-04 Vivo Mobile Communication Co., Ltd. Object recognition method and mobile terminal

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7302089B1 (en) * 2004-04-29 2007-11-27 National Semiconductor Corporation Autonomous optical wake-up intelligent sensor circuit
JP5066133B2 (ja) * 2009-05-20 2012-11-07 Hitachi, Ltd. Information recording device and power saving method
KR101652110B1 (ko) * 2009-12-03 2016-08-29 LG Electronics Inc. Power control method for a device controllable by a user's gesture
CN102148922B (zh) * 2010-02-08 2013-01-16 Lenovo (Beijing) Co., Ltd. Electronic device, image collection apparatus, and image collection control method
CN104375628B (zh) * 2013-08-16 2018-08-07 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
JP6289184B2 (ja) * 2014-03-14 2018-03-07 Olympus Corporation Image recognition device and image recognition method
CN104331149B (zh) * 2014-09-29 2018-08-10 Lenovo (Beijing) Co., Ltd. Control method, apparatus, and electronic device
CN104410785B (zh) * 2014-11-17 2019-01-15 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
CN105807888B (zh) * 2014-12-31 2019-11-26 Lenovo (Beijing) Co., Ltd. Electronic device and information processing method
KR102304693B1 (ko) * 2015-02-12 2021-09-28 Samsung Electronics Co., Ltd. Camera system control method, electronic device, and storage medium
CN105117256A (zh) * 2015-08-31 2015-12-02 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
TW201743241A (zh) * 2016-06-01 2017-12-16 PixArt Imaging Inc. Portable electronic device and operating method thereof
US10627887B2 (en) * 2016-07-01 2020-04-21 Microsoft Technology Licensing, Llc Face detection circuit


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3816768A4 *

Also Published As

Publication number Publication date
EP3816768A1 (fr) 2021-05-05
JP2021524203A (ja) 2021-09-09
CN110517034A (zh) 2019-11-29
KR20210013127A (ko) 2021-02-03
JP7221305B2 (ja) 2023-02-13
KR102497444B1 (ko) 2023-02-07
EP3816768A4 (fr) 2021-06-23

Similar Documents

Publication Publication Date Title
WO2020020063A1 (fr) Object identification method and mobile terminal
US11429248B2 (en) Unread message prompt method and mobile terminal
WO2019196707A1 (fr) Mobile terminal control method and mobile terminal
CN108459797B (zh) Folding screen control method and mobile terminal
WO2020156111A1 (fr) Terminal and interface display method
CN109078319B (zh) Game interface display method and terminal
CN107742072B (zh) Face recognition method and mobile terminal
WO2019223493A1 (fr) Object recognition method and mobile terminal
CN109523253B (zh) Payment method and apparatus
CN109960813A (zh) Translation method, mobile terminal, and computer-readable storage medium
CN109241775B (zh) Privacy protection method and terminal
CN107832110A (zh) Information processing method and mobile terminal
WO2019206077A1 (fr) Video call processing method and mobile terminal
WO2020259091A1 (fr) Screen content display method and terminal
CN110096203B (zh) Screenshot method and mobile terminal
CN109544172B (zh) Display method and terminal device
CN108881721B (zh) Display method and terminal
JP2023518548A (ja) Detection result output method, electronic device, and medium
CN110519443B (zh) Screen lighting method and mobile terminal
CN109740312B (zh) Application control method and terminal device
CN108270928B (zh) Voice recognition method and mobile terminal
CN107809515B (zh) Display control method and mobile terminal
CN109660750B (zh) Video call method and terminal
CN109819331B (zh) Video call method, apparatus, and mobile terminal
CN108108608B (zh) Mobile terminal control method and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19807463

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020564722

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019807463

Country of ref document: EP

Effective date: 20201203

ENP Entry into the national phase

Ref document number: 20207036650

Country of ref document: KR

Kind code of ref document: A