WO2023287054A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2023287054A1
WO2023287054A1 PCT/KR2022/008955 KR2022008955W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
projector
window
processor
imaging device
Prior art date
Application number
PCT/KR2022/008955
Other languages
English (en)
Korean (ko)
Inventor
박재성
박이훈
이정흠
최낙원
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Publication of WO2023287054A1 publication Critical patent/WO2023287054A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/14Payment architectures specially adapted for billing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/18Payment architectures involving self-service terminals [SST], vending machines, kiosks or multimedia terminals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Definitions

  • the disclosed invention relates to an ordering and payment device used in a drive-through based store. Unlike the conventional technology, in which the window of the vehicle must be lowered to place an order and present a payment method (card, cash), it relates to a non-face-to-face kiosk device for a drive-through based store in which ordering and payment are performed through content projected onto the vehicle window by a projector.
  • a drive-through refers to a service that allows a vehicle occupant to purchase products from a store while riding in the vehicle, and is abbreviated as DT.
  • Drive-through stores have been widely applied to fast food restaurants in the United States, and recently, franchise companies such as Starbucks and McDonald's are operating drive-through stores in Korea.
  • a drive-through based store is divided into an ordering area, a payment area, and a product delivery area (pickup area).
  • an entry vehicle enters the ordering area and the ordering step can be performed.
  • an ordering device for a user's order and payment is installed in the ordering area, and the ordering device may communicate with a POS terminal inside a drive-through based store or a separate server.
  • the entry vehicle moves to a payment area along an access road.
  • the clerk uses a payment terminal of the drive-through based store to settle the amount due, receiving a payment method such as a credit card, cash, or coupon from an occupant of the entry vehicle.
  • the ordering device may include a means for payment, such as a card slot or a cash slot, and in this case, the payment step may be omitted, thereby reducing overall usage time of the occupant.
  • Payment for ordered products can cause the same inconvenience as described above; moreover, in the COVID situation, there are many inconvenient factors, such as having to open the vehicle window and hand a payment method (credit card, cash, mobile phone payment) to a store employee in order to pay for the order.
  • the driver of the vehicle is usually the one who orders and pays, but there is no way for the vehicle's passengers to select menus while viewing the kiosk screen together, let alone for two or more passengers to each view a menu and place their own orders.
  • the disclosed invention provides an ordering device suited to the non-face-to-face COVID era, in which the driver and passengers of a vehicle entering a drive-through store are given a kiosk screen through the front, rear, or side windows or the sunroof of the vehicle so that they can order products together. To this end, the purpose is to project content onto the front, rear, or side windows or sunroof of the vehicle from projectors located outside the vehicle, and to provide a user-friendly interface that allows two or more passengers to order products and pay for them individually or all at once.
  • an electronic device according to an embodiment includes an imaging device; a communication unit that communicates with a server; a projector; and a processor that controls the projector to project content including a plurality of menus onto a window of the vehicle based on the vehicle being photographed through the imaging device, determines a menu corresponding to a user gesture among the plurality of menus based on the user gesture for the content being photographed through the imaging device, and controls the communication unit to transmit information on the determined menu to the server.
  • the projector may set the projection angle by projecting the content a plurality of times at various angles and checking feedback of the projected content.
  • the projector may project content including a pattern image onto the vehicle at the set projection angle, and the processor may detect the window area of the vehicle according to differences in transmittance and reflectance.
  • the processor may control the imaging device to photograph the vehicle at the set projection angle, recognize the sun visor pattern displayed on the front window of the vehicle, and detect the area below the recognized sun visor pattern as the front window area.
  • the processor may detect the slope and the degree of outer rounding of the detected window area, and apply inverse compensation in the direction opposite to the geometric distortion based on the detected slope and degree of outer rounding before projecting the content onto the window.
  • the processor may detect the slope and the degree of outer rounding of the detected window area, and reduce the content and project it onto an area whose geometric distortion is equal to or less than a preset reference value.
  • the processor may determine whether the vehicle entered normally by controlling the projector to project a pattern image on the floor of the driving path of the vehicle and controlling the imaging device to capture the pattern image.
  • the projector may project a focus adjustment pattern image onto the vehicle, and the processor may photograph the pattern image through the imaging device, determine whether the focus is set normally, and control the projector to adjust the focus.
  • the processor may recognize an omega shape inside the vehicle through the imaging device, detect the location of the user's head based on the recognized omega shape, and control the focus of the projector accordingly.
  • the processor may recognize omega shapes inside the vehicle through the imaging device and, based on a plurality of omega shapes being recognized, control the projector to project a plurality of kiosk screens onto a window of the vehicle.
  • the processor may determine that the user has selected the menu based on detecting, through the imaging device, that the user holds a tool for selecting the menu in front of the menu for a predetermined time.
  • the processor controls the projector to project a payment screen onto a window of the vehicle based on the information on the determined menu being transmitted to the server, and the user can proceed with payment through the payment screen.
  • the payment may proceed by the user photographing a marker displayed on the payment screen with a mobile device, or by sequentially drawing the 9 dots displayed on the payment screen on the mobile device.
  • a control method of an electronic device according to an embodiment may include: projecting, by a projector, a pattern image on the floor of a driving path of a vehicle; determining whether the vehicle entered normally by photographing the pattern image with an imaging device; projecting content including a plurality of menus onto a window of the vehicle; determining a menu corresponding to a user gesture among the plurality of menus when the user gesture for the content is captured through the imaging device; and transmitting information on the determined menu to a server.
  • Projecting the content including a plurality of menus onto the window of the vehicle may further include setting a projection angle by moving the projector, emitting a projection beam at each preset angle, and checking feedback of the emitted projection beam.
  • Projecting the content including a plurality of menus onto the window of the vehicle may further include projecting a pattern onto the vehicle, photographing the reflected pattern with the imaging device, and detecting the window area of the vehicle according to differences in transmittance and reflectance.
  • Projecting the content including a plurality of menus onto the window of the vehicle may further include detecting the slope and degree of outer rounding of the detected window area, and applying inverse compensation in the direction opposite to the geometric distortion based on the detected slope and degree of outer rounding.
  • Projecting the content including a plurality of menus onto the window of the vehicle may further include projecting a focus control pattern onto the vehicle for focus adjustment, photographing the pattern through the imaging device, determining whether the focus is set normally, and adjusting the focus.
  • Determining the menu corresponding to the user gesture may further include determining that the user has selected the menu based on detecting, through the imaging device, that the user holds a tool for selecting the menu in front of the menu for a predetermined time.
  • Transmitting the information on the determined menu to the server may further include projecting a payment screen onto a window of the vehicle, and the user proceeding with payment through the payment screen.
  • users can order and pay completely non-face-to-face inside the vehicle, without opening the vehicle window and speaking to a clerk or a kiosk screen located outside.
  • it is possible to provide a kiosk with improved convenience and UX, for example allowing two or more vehicle passengers to place separate orders and pay individually while each looking at their own kiosk screen, or to enter only the product orders separately and pay all at once with one person's card.
  • FIG. 1 is an exemplary diagram illustrating an arrangement structure of electronic devices when an entry vehicle enters an access road of a drive-through based store according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram of input values for generating a radio channel according to an embodiment of the present invention.
  • FIG. 3 is a diagram of a process of generating a radio channel by a processor according to an embodiment of the present invention.
  • FIG. 4 is a diagram showing a control block diagram of an electronic device according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram for inducing entry of an entry vehicle and confirming whether the entry vehicle has entered a normal position, according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram in which an electronic device according to an embodiment of the present disclosure distinguishes between a vehicle with and without a sunroof and projects content.
  • FIGS. 7 and 8 are views in which the projector 130 determines a window onto which to project content according to a predetermined angle and sets an optimal projection angle, according to an embodiment of the disclosed invention.
  • FIG. 9 is a diagram in which an electronic device according to an embodiment of the present disclosure distinguishes the window area from other areas such as the frame area.
  • FIG. 10 is a diagram explaining that a projector projects content for one person or for two people according to an embodiment of the present disclosure.
  • FIG. 11 is a diagram showing that a projector projects content onto a window and a user selects a menu according to an embodiment of the present invention.
  • FIG. 12 is a view from the inside of a vehicle when a projector projects content according to an embodiment of the present disclosure.
  • FIG. 13 is a diagram explaining that electronic devices hand over order information to next electronic devices according to an embodiment of the present disclosure.
  • FIG. 14 is a general flow chart of placing an order by an electronic device according to an embodiment of the present disclosure.
  • FIG. 15 is a flowchart illustrating projection of a plurality of contents when boarding of a passenger is confirmed by an electronic device according to an embodiment of the present disclosure.
  • the terms first and second used herein may be used to describe various components, but the components are not limited by these terms, which are used only for the purpose of distinguishing one component from another.
  • a first element may be termed a second element, and similarly, the second element may also be termed a first element.
  • the term "and/or" includes any combination of a plurality of related listed items or any of a plurality of related listed items.
  • the term "unit" may mean a unit that processes at least one function or operation.
  • the above term may refer to at least one piece of hardware such as a field-programmable gate array (FPGA) or an application specific integrated circuit (ASIC), at least one piece of software stored in a memory, or at least one process handled by the processor 120.
  • the reference numerals attached to each step are used to identify the steps; they do not indicate an order, and each step may be performed in an order different from the stated order unless a specific order is clearly described in context.
  • the disclosed embodiments may be implemented in the form of a recording medium storing instructions executable by a computer. Instructions may be stored in the form of program codes, and when executed by the processor 120, may perform operations of the disclosed embodiments.
  • the recording medium may be implemented as a computer-readable recording medium.
  • Computer-readable recording media include all types of recording media in which instructions that can be decoded by a computer are stored. For example, there may be read only memory (ROM), random access memory (RAM), magnetic tape, magnetic disk, flash memory, optical data storage device, and the like.
  • the projector 130 described below is a device for enlarging and showing an image (a still image or video) on a projection screen or similar white flat surface, and is usually a tool used to present the same information to multiple viewers. It may be a device that receives signals from various video devices such as TVs, VCRs, DVD players, PCs, and camcorders and displays enlarged images on a screen through lenses.
  • the imaging device 110 described below may include any means capable of obtaining an image by photographing or sensing an object.
  • it may be a camera with a built-in image sensor capable of taking pictures or videos of objects.
  • it may be a radar technology that generates a point cloud image using radio frequencies, such as UWB (Ultra-WideBand) and FMCW (Frequency-Modulated Continuous Wave).
  • in the following description, it is assumed that the imaging device 110 is a camera.
  • FIG. 1 is an exemplary diagram showing the arrangement structure of electronic devices 100-1, 100-2, and 100-3 when entry vehicles 200-1, 200-2, and 200-3 enter an access road of a drive-through based store according to an embodiment of the present disclosure.
  • the electronic device 100 is a device for ordering and payment used in a drive-thru (DT) based store, and may be arranged along the access road as shown in FIG. 1.
  • a plurality of electronic devices 100 may be configured, and they may unidirectionally or bidirectionally communicate with a vehicle 200 entering a movement path.
  • a radio channel may be used, and a method of using the radio channel will be described in detail with reference to FIGS. 2 and 3 .
  • FIG. 2 is a diagram related to input values for generating a radio channel according to an embodiment of the disclosed invention
  • FIG. 3 is a diagram related to a process of generating a radio channel by a processor according to an embodiment of the disclosed invention.
  • when the vehicle 200 enters a DT point, the processor 120 may guide, through a speaker or the like, the occupant of the vehicle 200 to tune in to a radio channel.
  • the processor 120 may automatically set the radio channel to a specific frequency with the consent of the driver of the vehicle 200, and guidance may then be provided through a speaker inside the vehicle 200.
  • the radio channel may be determined based on information obtained through vehicle 200 recognition when the vehicle 200 enters.
  • the electronic device 100 may select a non-overlapping radio channel by encoding characteristic parameters such as vehicle model, year, color, and plate number, and using the code combination as a seed for generating a random frequency, as in the sketch below.
  • the selected radio channel may provide a communication means for one-way or two-way communication between users in the vehicle 200 and store personnel.
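  • As an illustration of the channel-selection idea above, the following Python sketch (not taken from the patent; the attribute names, FM band, and 0.2 MHz spacing are assumptions) encodes vehicle attributes into a seed and picks a non-overlapping frequency.

```python
# Illustrative sketch: derive a per-vehicle radio channel from recognized
# vehicle attributes, as described above. Band and spacing are assumptions.
import hashlib
import random

FM_BAND_KHZ = range(87_500, 108_000, 200)  # assumed selectable FM carriers, 0.2 MHz steps

def pick_radio_channel(model: str, year: str, color: str, plate: str,
                       channels_in_use: set[int]) -> int:
    """Encode vehicle attributes, seed a PRNG, and pick a non-overlapping channel (kHz)."""
    code = f"{model}|{year}|{color}|{plate}".encode("utf-8")
    seed = int.from_bytes(hashlib.sha256(code).digest()[:8], "big")
    rng = random.Random(seed)
    candidates = [f for f in FM_BAND_KHZ if f not in channels_in_use]
    return rng.choice(candidates)

# Example: the same vehicle always maps to the same channel, different vehicles
# (very likely) to different ones, and channels already in use are skipped.
print(pick_radio_channel("sedan-x", "2021", "white", "12GA3456", channels_in_use={89_100}))
```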
  • FIG. 4 is a diagram showing a control block diagram of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 may include a projector 130, a communication unit 140, and a processor 120 that controls the communication unit 140 to communicate with the server 300.
  • the projector 130 may include a projection module 131 configured to emit a projection beam and an imaging device 110 capable of checking feedback of the projected projection beam or detecting a user's motion.
  • the projector 130 may project parking/stopping guide lines for a user who wants to use the window of the vehicle 200 as a screen.
  • the projector 130 may project content onto the window of the vehicle 200, and the infrared imaging device 110 may be used to find an optimal projection area or angle.
  • the projector 130 may project content onto the window of the vehicle 200, and the infrared imaging device 110 may be used to detect the overall tilt and rounding degree of the window of the vehicle 200.
  • the projector 130 may be used to provide a kiosk screen to the detected window area.
  • the communication unit 140 may include a wired communication module 141 provided to enable wired communication and a wireless communication module 142 provided to enable wireless communication.
  • the communication unit 140 communicates with the server 300 under the control of the processor 120, and transmits order-related information and payment-related information to the server 300, or receives such information from the server 300 so that the projector 130 can display it to the user.
  • the server 300 communicates with the communication unit 140 to pass order information and payment information to the seller's POS or to display information received from the POS to the user through the projector 130 .
  • the processor 120 may control the projector 130 to provide parking/stopping guide lines for a user who wants to use the window of the vehicle 200 as a screen, to project content onto the window of the vehicle 200 in order to find an optimal projection area or angle, or to project content onto the window of the vehicle 200 in order to detect the overall inclination and degree of rounding of the window.
  • the projector 130 can be used to provide a kiosk screen to the detected window area.
  • the processor 120 controls the communication unit 140 so that order information and payment information are transmitted to the server 300 through communication between the communication unit 140 and the server 300, and order information and payment information received from the server 300 by the communication unit 140 can be displayed to the user.
  • a gesture input during operation of the processor 120 may mean the user selecting a menu while the projector 130 is projecting a plurality of menus that the user can select.
  • as for the gesture input, when the finger, back of the hand, or other tool of the driver or a passenger stays within a specific area of the menu selection screen projected on the window of the vehicle 200 for a certain period of time, a procedure for confirming whether to add the corresponding menu to the shopping cart is performed. If it is not the intended menu, selecting 'No' allows menu selection to continue; when all menu selections are completed, the user can select the order button in the menu selection area.
  • the projector 130 displays a specific marker at a specific location of the window, together with a user interface (UI) guiding the user to photograph the marker there with a mobile phone.
  • a new authentication method can be provided through pairing: after the projector 130 displays 9 dots on the screen together with arrows connecting the dots in order, authentication can be performed by the user drawing along the dots on the paired mobile phone (see the sketch below). This is similar to the lock screen release method used on mobile phones and mobile devices, but differs from the existing method in that the server 300 provides the number sequence pattern and the user approves it by a drawing motion.
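  • A minimal sketch of the 9-dot confirmation flow described above, assuming a server-side component that issues a random dot order and checks the sequence drawn on the paired phone; the function names and sequence length are illustrative.

```python
# Sketch of the 9-dot authentication: the server issues a random dot order
# (shown as arrows on the window) and verifies the sequence drawn on the phone.
import secrets

def issue_dot_pattern(length: int = 4) -> list[int]:
    """Pick a random, non-repeating sequence over the 9 dots (numbered 0..8)."""
    dots = list(range(9))
    secrets.SystemRandom().shuffle(dots)
    return dots[:length]

def verify_drawn_sequence(issued: list[int], drawn: list[int]) -> bool:
    """Approve payment only if the user traced the dots in the issued order."""
    return drawn == issued

pattern = issue_dot_pattern()                   # e.g. [6, 2, 7, 0]
print(verify_drawn_sequence(pattern, pattern))  # True when the drawing matches
```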
  • FIG. 5 is a diagram in which the electronic device 100 guides entry of the entry vehicle 200 and checks whether the entry vehicle 200 has entered the normal position, according to an embodiment of the present disclosure.
  • the projector 130 may project a specific pattern (e.g., a marking such as a square box indicating a parking position) on the floor of the driving path of the vehicle 200 to mark the location where the vehicle 200 should stop. Thereafter, the projector 130 guides the vehicle 200 to move to the specific location, and the imaging device 110 recognizes that the vehicle has arrived at the correct area, thereby guiding the vehicle 200 to park or stop.
  • the projector 130 may project a total of 4 specific patterns comprising 8 dots on the DT point movement path and guide the vehicle 200 to stop on the corresponding patterns.
  • the processor 120 may determine whether the vehicle 200 has normally entered the DT point movement path by analyzing the number of obscured patterns and the degree of obscuration (a sketch follows after this passage).
  • the processor 120 may also calculate the size or position of the vehicle 200 and perform an operation for detecting the window area onto which the kiosk screen is to be projected.
  • the type or number of patterns projected by the projector 130 is not limited thereto, and other methods in which the projector 130 projects a pattern and the entry and size of the vehicle 200 are confirmed from changes in the pattern may also be included.
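  • The occlusion check can be sketched as follows, assuming the camera frame and the floor coordinates of the projected dots are available; the brightness threshold and the "at least 6 of 8 dots covered" rule are illustrative assumptions, not values from the patent.

```python
# Rough sketch of the occlusion check: dots projected on the lane floor
# disappear (or dim) in the camera image where the vehicle covers them.
import numpy as np

def occluded_dots(frame_gray: np.ndarray, dot_xy: list[tuple[int, int]],
                  bright_thresh: int = 120, win: int = 5) -> list[bool]:
    """Return True for each projected dot that is no longer visible in the camera frame."""
    flags = []
    for x, y in dot_xy:
        patch = frame_gray[max(y - win, 0):y + win, max(x - win, 0):x + win]
        flags.append(patch.mean() < bright_thresh)   # dark patch -> dot covered by the car
    return flags

def entered_normally(flags: list[bool], required: int = 6) -> bool:
    """Example rule: the vehicle is considered correctly positioned when at
    least `required` of the 8 projected dots are covered."""
    return sum(flags) >= required
```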
  • FIG. 6 is a diagram in which the electronic device 100 distinguishes a vehicle 200 with a sunroof from a vehicle 200 without a sunroof according to an embodiment of the present disclosure and projects content.
  • the projector 130 may guide the vehicle 200 by displaying two stop guide lines on the floor using LED flashing, a separate pattern, or the like.
  • the projector 130 displays two guide lines, a dotted line and a solid line: to order on the kiosk screen through the sunroof, the user is guided to align the front of the vehicle 200 with the dotted line in front, and to use the front window, the user is guided to align the front of the vehicle 200 with the solid line behind it, thereby letting the user choose which window to use for the kiosk.
  • FIGS. 7 and 8 are views in which the projector 130 determines a window onto which to project content according to a predetermined angle and sets an optimal projection angle, according to an embodiment of the disclosed invention.
  • the imaging device 110 measures the amount of projection, the amount of reflection, and the like according to the angle between the window of the vehicle 200 and the projection beam, so that an optimal projection area or angle can be found. For example, in order to determine the optimal beam projection angle and area for the front window (first projection area) or sunroof (second projection area) of the vehicle 200, the projector 130 located on a two-dimensional plane projects the beam at each angle (θ1, θ2, θ3) while moving in the longitudinal direction (x-y direction) of the vehicle 200, and the processor 120 repeatedly checks the feedback to find the optimal projection angle (θ_optimum) for the first projection area.
  • likewise, in order to determine the optimal beam projection angle and area for the left side window (third projection area) or the right side window (fourth projection area) of the vehicle 200, the projector 130 projects the beam at each angle (θ1, θ2, θ3) while moving in the width direction (u-v direction) of the vehicle 200, and the processor 120 checks the feedback to find the optimal projection angle. A sketch of this sweep appears below.
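  • The angle sweep can be summarized as below; `project_at` and `capture` stand in for projector and camera drivers, and the feedback score is an assumed heuristic rather than the patent's exact criterion.

```python
# Hedged sketch of the angle search: project at each preset angle, grade the
# camera feedback, and keep the best-scoring angle (theta_optimum).
import numpy as np

def feedback_score(captured: np.ndarray) -> float:
    """Assumed quality metric: bright and uniform reflections score higher."""
    return float(captured.mean()) - float(captured.std())

def find_optimal_angle(project_at, capture, angles_deg=(30, 45, 60)) -> float:
    """Sweep theta_1..theta_n, check feedback for each, and return the best angle."""
    scores = {}
    for theta in angles_deg:
        project_at(theta)                 # aim the projector and emit the test beam
        scores[theta] = feedback_score(capture())
    return max(scores, key=scores.get)
```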
  • the processor may go through a process of generating a model by learning in advance from various vehicle images, verifying the generated model, and then using this model to infer window tilt information of the vehicle entering the DT.
  • the AI technology for effectively determining the preset angle may be a deep learning technology, which is a method of machine learning.
  • Machine learning can be learning the ability of machines to stochastically and statistically analyze big data to find and classify the most valuable data.
  • Deep learning is an artificial neural network model modeled after the human brain neural network, and can be a method of supplementing the limitations of machine learning that cannot identify various variables included in existing data.
  • Determination of the angle may be performed by data-driven image analysis.
  • Data-driven methods can train a model over a dataset with a large number of images and labels.
  • the trained machine learning model can take a new image as input and predict the label of the image.
  • a new image input to predict a label may be collected from the web or may be collected during the operation of the disclosed invention.
  • FIG. 9 is a diagram in which the electronic device 100 according to an embodiment of the present disclosure distinguishes the window area from other areas such as the frame area.
  • the projector 130 may project a specific pattern onto the window and frame areas of the vehicle 200.
  • the processor 120 may calculate the difference between transmittance and reflectance by capturing the pattern reflected by the vehicle 200 with the (RGB, infrared, etc.) imaging device 110 built into the projector 130. Since the pattern in the area reflected by the frame of the vehicle 200 inevitably differs from that in the area reflected by the window, the window area of the vehicle 200 can be detected from the difference in the reflected pattern (see the sketch below). In addition, after the pattern projected by the projector 130 is photographed by the imaging device 110, geometric distortion analysis of the pattern in the photographed image may be performed so that the tilt or size of the window can be detected as well.
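  • A minimal sketch of the window detection, assuming one frame captured with the pattern off and one with it on: the body and frame reflect the pattern strongly while the glass transmits it, so the low-difference region approximates the window. The threshold and morphology parameters are illustrative.

```python
# Detect the window region from the reflectance difference of a projected pattern.
import cv2
import numpy as np

def detect_window_mask(frame_pattern_off: np.ndarray, frame_pattern_on: np.ndarray) -> np.ndarray:
    """Return a binary mask of the (likely) window area."""
    off = cv2.cvtColor(frame_pattern_off, cv2.COLOR_BGR2GRAY)
    on = cv2.cvtColor(frame_pattern_on, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(on, off)                  # strong where the pattern reflected back
    _, reflective = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    window = cv2.bitwise_not(reflective)         # glass transmits the pattern -> weak difference
    window = cv2.morphologyEx(window, cv2.MORPH_OPEN, np.ones((15, 15), np.uint8))
    return window
```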
  • when a window of the vehicle 200 is detected, the electronic device 100 may use the sun visor pattern printed on the window of the vehicle 200; that is, when the front window of the vehicle 200 is photographed and analyzed with the (RGB, infrared, etc.) imaging device 110 built into the projector 130, the sun visor pattern of the vehicle 200 may be detected, and the area below the corresponding sun visor pattern may be detected as the front window of the vehicle 200.
  • the window area can be detected in the same way.
  • a method of displaying a kiosk screen without distortion on the detected window of the vehicle 200 based on the detection of the window area of the vehicle 200 by the electronic device 100 will be described in detail.
  • the imaging device 110 may capture the window of the vehicle 200 and detect the overall inclination of the window of the vehicle 200 and the degree of outer rounding of the window.
  • the processor 120 may apply inverse compensation to the displayed image in a direction opposite to the detected geometric distortion so that the kiosk screen is visible without distortion from the perspective of the occupant of the vehicle 200 .
  • the projector 130 may project the kiosk screen with inverse compensation applied so that it is displayed without distortion, or may reduce the kiosk screen and display it in an area where geometric distortion is minimal; a homography-based sketch follows below.
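  • A hedged sketch of the inverse compensation, assuming the distortion is modeled as a homography estimated from projected versus observed pattern points; pre-warping the kiosk image with the inverse homography cancels the distortion seen by the occupant.

```python
# Inverse-compensation sketch: fit the projector-to-view distortion as a
# homography from pattern correspondences, then pre-warp with its inverse.
import cv2
import numpy as np

def estimate_distortion_H(pattern_pts_projected: np.ndarray,
                          pattern_pts_observed: np.ndarray) -> np.ndarray:
    """3x3 homography from projector pixel space to observed (viewer/camera) space,
    fitted from four or more corresponding pattern points."""
    H, _ = cv2.findHomography(pattern_pts_projected, pattern_pts_observed)
    return H

def prewarp_kiosk_image(kiosk_img: np.ndarray, H: np.ndarray,
                        out_size: tuple[int, int]) -> np.ndarray:
    """Apply the inverse of the observed distortion so the projected kiosk
    screen looks rectangular on the tilted, rounded window."""
    return cv2.warpPerspective(kiosk_img, np.linalg.inv(H), out_size)
```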
  • the processor 120 may perform additional correction in consideration of the fact that the distance from the projector 130 to the projected window varies depending on the height of the vehicle, and the size of the image varies accordingly.
  • the projector 130 may project content after performing scaling and aspect ratio adjustments for the detected window size.
  • Scaling adjustment can adjust the size of the screen through zoom-in and zoom-out functions.
  • Aspect ratio adjustment can display the image at its detected size, resize the image by maximizing its height, its width, or both, or maximize its size to fit the window while maintaining the original aspect ratio.
  • after the keystone is adjusted automatically, it can be confirmed by the user or adjusted manually by the user. Accordingly, the projector 130 may provide a kiosk screen with optimal keystone correction.
  • the projector 130 may display a specific focus adjustment pattern on the area of the window of the vehicle 200 where the kiosk screen will be provided, or on another area. Thereafter, the processor 120 may determine whether the focus is set normally using the imaging device 110 and then adjust the focus automatically (a sharpness-based sketch follows below). After the automatic adjustment, the focus can be confirmed by the user or adjusted manually by the user. Accordingly, the user can be provided with the optimal focus for viewing the kiosk screen.
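  • The automatic focus step can be sketched as a sharpness sweep, with `set_focus` and `capture` standing in for projector and camera drivers; using Laplacian variance as the sharpness measure is an assumption, not a detail from the patent.

```python
# Autofocus sketch: project the focus pattern, capture it, score sharpness,
# and keep the focus step with the highest score.
import cv2

def sharpness(gray) -> float:
    """Variance of the Laplacian; higher means the captured pattern is sharper."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(set_focus, capture, focus_steps=range(0, 101, 10)) -> int:
    best_step, best_score = 0, -1.0
    for step in focus_steps:
        set_focus(step)
        score = sharpness(cv2.cvtColor(capture(), cv2.COLOR_BGR2GRAY))
        if score > best_score:
            best_step, best_score = step, score
    set_focus(best_step)      # leave the projector at the sharpest setting
    return best_step
```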
  • a process of guiding removal of foreign substances from the window of the vehicle 200 may be added.
  • the processor 120 may use a sensor that may be built into the projector 130 (such as a CCD or illuminance sensor), or the imaging device 110, to check whether the amount of light output to the window or an image of a specific pattern is displayed normally. If the processor 120 determines that there is a foreign substance, it may guide the user to wipe the corresponding part of the window with the wiper of the vehicle 200, or request approval from the driver through communication with the vehicle 200 and, upon approval, operate the wiper automatically.
  • the projector 130 identifies the brightness of the projected image or pattern, and adjusts the brightness when it is determined that the brightness is insufficient or excessive.
  • the position of the eyes or the head of the occupant of the vehicle 200 may be recognized to determine the position of the kiosk screen optimized for the user.
  • the processor 120 may detect the position of the user's eyes or head by utilizing the omega shape recognition technology through image recognition, and provide an optimal kiosk screen considering the projection target environment such as the inclination of the screen.
  • the omega shape recognition technology is a head tracking algorithm that detects a human head using the imaging device 110: when the front of a person is captured by the imaging device 110, except in special cases, the round shape of the head and the straight shape of the shoulders combine to form an omega (Ω) shape. It therefore refers to a technology that filters the image through image processing and counts the number of human heads according to the number of omega shapes; a rough stand-in sketch follows below.
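  • The patent does not spell out the omega-shape detector itself; as a rough stand-in, the sketch below counts head-and-shoulder silhouettes with OpenCV's bundled upper-body Haar cascade. The cascade choice and detection parameters are assumptions.

```python
# Stand-in for omega-shape head counting: detect head/shoulder ("omega-like")
# silhouettes with OpenCV's upper-body cascade and count them.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_upperbody.xml")

def count_occupants(frame_bgr) -> int:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    bodies = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4,
                                      minSize=(60, 60))
    return len(bodies)   # number of detected shapes -> number of kiosk screens
```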
  • recognition of the omega shape may have a low detection rate due to the shapes of various occupants in the vehicle.
  • AI technology can be used to increase the detection rate by learning various boarding pattern images.
  • the AI technology for increasing the detection rate of the omega shape may be deep learning technology, which is a method of machine learning.
  • Machine learning can be learning the ability of machines to stochastically and statistically analyze big data to find and classify the most valuable data.
  • Deep learning is an artificial neural network model modeled after the human brain neural network, and can be a method of supplementing the limitations of machine learning that cannot identify various variables included in existing data.
  • the detection of the omega shape according to an embodiment may be data-driven image analysis. Data-driven methods can train a model on a dataset with a large number of images and labels.
  • the trained machine learning model can take a new image as input and predict the label of the image.
  • a new image input to predict a label may be collected from the web or may be collected during the operation of the disclosed invention.
  • the KNN (k-nearest neighbor) method predicts the label of an input by taking, in the test or prediction step, the k stored data points closest to the input and choosing the label that appears most frequently among them (see the sketch below).
  • the detection rate can be increased by determining the most frequently observed shape as the omega shape when the user visits the DT store.
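  • A minimal, hand-rolled k-nearest-neighbor prediction matching the description above: find the k stored samples closest to the input and return the label that occurs most often among them.

```python
# Plain KNN prediction over feature vectors, as described above.
from collections import Counter
import numpy as np

def knn_predict(train_x: np.ndarray, train_y: list, query: np.ndarray, k: int = 5):
    """train_x: (N, D) feature vectors; train_y: N labels; query: (D,) feature vector."""
    dists = np.linalg.norm(train_x - query, axis=1)   # distance to every stored sample
    nearest = np.argsort(dists)[:k]                   # indices of the k closest samples
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]                 # most frequent label wins
```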
  • FIG. 10 is a diagram explaining that the projector 130 projects content for one person or for two people according to an embodiment of the present disclosure.
  • the processor 120 may automatically adjust the kiosk area in consideration of whether or not a passenger is on board, so that two or more people can use one kiosk screen at the same time.
  • the projector 130 may provide user interface (UI) buttons on both sides of a single kiosk screen for product search and ordering according to the user's selection, or may provide two kiosk screens, one per occupant, so that the driver and passenger of the vehicle 200 can each use their own.
  • the projector gives the user a choice of how the menu is displayed, so that, according to the user's selection, only one person may view and select the menu.
  • kiosk content for one person or for two people may be provided according to the user's selection, but may also be provided automatically by the processor 120 when two people are detected.
  • whether or not a passenger is on board can be checked by counting the number of heads using the omega shape recognition technology described above. A simple pane-splitting sketch follows below.
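  • A simple layout sketch for splitting the detected window area into one kiosk pane per detected occupant; the side-by-side split and the two-person cap are assumptions for illustration.

```python
# Split the detected window rectangle into one kiosk pane per detected occupant.
def kiosk_panes(window_rect: tuple[int, int, int, int], occupants: int) -> list[tuple[int, int, int, int]]:
    """window_rect and returned panes are (x, y, width, height) in projector pixels."""
    x, y, w, h = window_rect
    n = max(1, min(occupants, 2))          # one shared screen, or one per person up to two
    pane_w = w // n
    return [(x + i * pane_w, y, pane_w, h) for i in range(n)]

print(kiosk_panes((100, 200, 800, 400), occupants=2))
# [(100, 200, 400, 400), (500, 200, 400, 400)]
```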
  • FIG. 11 is a diagram showing that the projector 130 projects content onto a window and the user selects a menu according to an embodiment of the present invention.
  • when selecting a menu from the content projected on the window of the vehicle 200, a vehicle occupant may select a product and proceed with order payment using a hand or another tool.
  • the imaging device 110 of the disclosed invention may include an infrared (IR) imaging device 110 or an RGB imaging device 110 .
  • the processor may notify the user that a certain product has been added to the cart or ordered in the ordering system according to the detected shape or gesture of the user's hand, and it may be determined that the user has selected a menu when the imaging device detects that the user holds a tool for selecting the menu in front of the menu for a predetermined period of time.
  • the color of the selected menu may be inverted or the density of the text may be inverted.
  • the screen may also be switched to a screen other than the shopping cart or menu selection screen.
  • in an image obtained by the infrared (IR) imaging device 110, an object located at a short distance has a brighter value due to the characteristics of the infrared sensor.
  • the processor 120 may set a value corresponding to 0.1 of the maximum brightness value as a threshold point based on the brightness value, and may extract an area exceeding the threshold point as an approximate hand area.
  • the processor 120 may then apply a second threshold, a value corresponding to 0.3 of the maximum brightness value, to separate the brighter hand part from the relatively dark forearm region. At this point, only the hand part can be identified by excluding parts of the extracted boundary from the hand area; a sketch of this two-threshold segmentation follows below.
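  • The two-threshold hand segmentation can be sketched directly from the 0.1 and 0.3 factors given above; the input is assumed to be a single-channel IR frame, and wrist-boundary cleanup is simplified.

```python
# Two-threshold IR segmentation: close objects appear brighter, so the hand
# is the brightest region and the forearm falls between the two thresholds.
import numpy as np

def segment_hand(ir_frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return (hand_mask, forearm_mask) from a single-channel IR frame."""
    peak = float(ir_frame.max())
    near = ir_frame > 0.1 * peak          # rough hand region: anything close to the window
    bright = ir_frame > 0.3 * peak        # second threshold: the brighter hand part only
    hand = near & bright
    forearm = near & ~bright              # relatively dark pixels left over are the forearm
    return hand, forearm
```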
  • the optical flow which is the apparent velocity distribution on the image caused by the relative motion of the observer and the object, can be utilized.
  • the processor 120 may calculate the difference in optical flow between a finger and the palm to identify the current motion as moving, clicking, or stopping (a sketch follows below).
  • touch recognition can be implemented by applying the same algorithm not only when the user performs a touch with a hand but also when using a tool, and a menu can be selected and an order and payment made according to the touch recognition.
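  • A hedged sketch of the click test: dense optical flow between consecutive frames is compared inside assumed finger and palm masks, and a finger that moves while the palm stays still is read as a click. The flow parameters and threshold are illustrative.

```python
# Classify a frame pair as click / moving / stopped from finger-vs-palm optical flow.
import cv2
import numpy as np

def classify_motion(prev_gray, curr_gray, finger_mask, palm_mask,
                    move_thresh: float = 1.5) -> str:
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    speed = np.linalg.norm(flow, axis=2)           # per-pixel flow magnitude
    finger_speed = speed[finger_mask].mean()
    palm_speed = speed[palm_mask].mean()
    if finger_speed > move_thresh and palm_speed <= move_thresh:
        return "click"                              # finger moves, palm stays -> click
    if finger_speed > move_thresh and palm_speed > move_thresh:
        return "moving"                             # whole hand moves -> pointer motion
    return "stopped"
```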
  • a user who selects a menu may be in a space separated from the projector 130 or from the projection surface of the projector 130. That is, according to the electronic device 100 of an embodiment, unlike the prior art, the direction in which the projector 130 projects content may differ from the direction of the user's line of sight.
  • when the window of the vehicle 200 cannot be used as a screen, for example because a parking violation ticket is attached to the front window or snow is piled on it, an exception handling method using the user's portable device is provided separately.
  • the projector 130 searches for nearby mobile devices and sends a pairing request to the user's closest portable device, and the user accepts the request to complete pairing.
  • the user can then navigate the menu or make selections without looking at the mobile phone screen, simply by pointing the mobile phone at the screen displayed on the window or shaking it up, down, left, or right.
  • the corresponding function can be provided, and its accuracy improved, by utilizing the acceleration sensor, gravity sensor, or UWB technology of the portable device.
  • FIG. 12 is a view from the inside of the vehicle 200 when the projector 130 projects content according to an embodiment of the present disclosure.
  • since the dashboard has low reflectance and a low refractive index, it can provide a clearer screen than projecting onto the window, so the kiosk screen can be moved there according to the user's selection, which increases convenience.
  • FIG. 13 is a diagram explaining that electronic devices 100-1, 100-2, 100-3, 100-4, 100-5, 100-6, and 100-7 according to an embodiment of the present disclosure hand over order information to the next electronic device.
  • a function is provided for displaying order information, or for transferring the playback position of advertisements and other content viewed while waiting, to the display on the next section of the driving route; by recognizing the number of the vehicle 200 through the imaging device 110 built into the electronic device 100 and confirming which vehicle 200 is in front of which electronic device 100 on the driving route, seamless content enjoyment can be provided (a small session-handoff sketch follows below).
  • instead of recognizing the number of the vehicle 200, the unique shape of some part of the vehicle 200 (hood shape, etc.) may be recognized, the recognition information used temporarily only at the DT store, and the information deleted immediately when the vehicle leaves the DT store, as a countermeasure against exposure of personal information.
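  • The handoff between devices can be sketched as a small session store keyed by a recognized vehicle identifier; the class and field names are illustrative, and `forget` reflects the immediate deletion mentioned above.

```python
# Session-handoff sketch: each device keys order/content state by a recognized
# vehicle identifier, passes it to the next device, and deletes it on exit.
from dataclasses import dataclass, field

@dataclass
class Session:
    order_items: list = field(default_factory=list)
    content_position_s: float = 0.0      # e.g. where the waiting-time advertisement paused

class HandoffStore:
    def __init__(self):
        self._sessions: dict[str, Session] = {}

    def get(self, vehicle_key: str) -> Session:
        return self._sessions.setdefault(vehicle_key, Session())

    def hand_over(self, vehicle_key: str, next_store: "HandoffStore") -> None:
        session = self._sessions.pop(vehicle_key, None)
        if session is not None:
            next_store._sessions[vehicle_key] = session   # next device continues the order/content

    def forget(self, vehicle_key: str) -> None:
        self._sessions.pop(vehicle_key, None)   # delete immediately when leaving the DT store
```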
  • FIG. 14 is a general flow chart of ordering by the electronic device 100 according to an embodiment of the disclosed subject matter.
  • the projector 130 may project content including a pattern image on the floor of the DT access road (2010), and the imaging device 110 may detect whether the pattern image changes (2020).
  • the type of window to be projected among the front, side, and sunroof may be determined based on the change (2030). At this time, determining the window may be a user's choice or may be automatically made.
  • a window area and a non-window area may be distinguished and detected (2040).
  • content is repeatedly projected toward the vehicle 200 to detect the window area, and the window is detected by checking the feedback (2050).
  • content is then projected onto the detected window (2060).
  • when the user selects a menu displayed in the content (2070), information on the menu is transmitted to the server 300 (2080), and content related to payment information is projected onto the window (2090).
  • the operation ends when the user selects a payment method and completes payment (2100); this has the effect of enabling the user to order and pay completely non-face-to-face inside the vehicle 200, without opening the window of the vehicle 200 and without having to speak to a clerk or use a kiosk screen located outside.
  • FIG. 15 is a flowchart illustrating projection of a plurality of contents when boarding of a passenger is confirmed by the electronic device 100 according to an embodiment of the present disclosure.
  • the projector 130 may project content including a pattern image on the floor of the DT access road (3010), and the imaging device 110 may detect whether the pattern image changes (3020).
  • the type of window to be projected among the front, side, and sunroof may be determined based on the change (3030). At this time, determining the window may be a user's choice or may be automatically made.
  • a window area and a non-window area may be distinguished and detected (3040).
  • content is repeatedly projected toward the vehicle 200 to detect the window area, and the window is detected by checking the feedback (3050).
  • the user selects a menu displayed in the content (3090), information about the menu is transmitted to the server 300 (3100), and content related to payment information is projected onto the window (3110). After that, the operation ends when the user selects a payment method and completes payment (3120).
  • the passenger can also order and pay through the kiosk screen, increasing convenience.
  • in the control method of the electronic device 100 according to an embodiment, the projector 130 projects a pattern image on the floor of the driving path of the vehicle 200, the imaging device 110 captures the pattern image so that it can be determined whether the vehicle 200 has entered normally, and content including a plurality of menus may be projected through the window of the vehicle 200.
  • the method may further include determining a menu corresponding to a user gesture among the plurality of menus and transmitting information on the determined menu to the server 300.
  • further description of the control method of the electronic device 100 is omitted because it overlaps with the embodiment of the electronic device 100 described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Human Computer Interaction (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Economics (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to an embodiment, an electronic device may comprise: an imaging device; a communication unit that communicates with a server; a projector; and a processor that: based on an image of a vehicle being captured through the imaging device, controls the projector to project content including a plurality of menus onto a window of the vehicle; based on an image of a user gesture on the content being captured through the imaging device, determines a menu corresponding to the user gesture among the plurality of menus; and controls the communication unit to transmit information on the determined menu to the server.
PCT/KR2022/008955 2021-07-16 2022-06-23 Dispositif électronique et son procédé de commande WO2023287054A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210093643A KR20230012861A (ko) 2021-07-16 2021-07-16 전자 장치 및 전자 장치의 제어 방법
KR10-2021-0093643 2021-07-16

Publications (1)

Publication Number Publication Date
WO2023287054A1 true WO2023287054A1 (fr) 2023-01-19

Family

ID=84919506

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/008955 WO2023287054A1 (fr) 2021-07-16 2022-06-23 Dispositif électronique et son procédé de commande

Country Status (2)

Country Link
KR (1) KR20230012861A (fr)
WO (1) WO2023287054A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080099287A (ko) * 2006-02-10 2008-11-12 쓰리엠 이노베이티브 프로퍼티즈 캄파니 다수의 무선 통신 채널을 사용하는 퀵 서비스 레스토랑의 주문 접수 시스템
KR20190079394A (ko) * 2017-12-27 2019-07-05 (주) 엔시스텍 차량 인식 카메라 및 차량 인식 시스템
US20190287159A1 (en) * 2016-05-05 2019-09-19 Conduent Business Services, Llc System and method for lane merge sequencing in drive-thru restaurant applications
KR20190142216A (ko) * 2019-06-07 2019-12-26 케이에스아이 주식회사 마킹의 딥러닝 인식을 이용한 차량 검지 시스템
KR20200055532A (ko) * 2018-11-13 2020-05-21 효성티앤에스 주식회사 드라이브 스루용 금융자동화기기에 적용되는 가상 키보드 모듈 및 그 모듈이 구비된 금융자동화기기

Also Published As

Publication number Publication date
KR20230012861A (ko) 2023-01-26

Similar Documents

Publication Publication Date Title
KR101107441B1 (ko) 진입 통제 시스템 및 차량 액세스 통제 방법
US5956122A (en) Iris recognition apparatus and method
US10937028B2 (en) Store system, method of controlling the store system, computer program for executing the method, and checkout device
CN100447661C (zh) 投影显示设备
US5912721A (en) Gaze detection apparatus and its method as well as information display apparatus
WO2014106977A1 (fr) Afficheur facial et procédé pour le commander
WO2021095916A1 (fr) Système de suivi pouvant suivre le trajet de déplacement d'un objet
CA2707993A1 (fr) Systeme d'interaction pour interaction entre un ecran et un objet indicateur
KR20070109713A (ko) 이동식 불법 주정차 단속 시스템
KR101968203B1 (ko) 주차관제시스템
KR20100027406A (ko) 주차 관리 시스템 및 주차 관리 방법
CN108076265A (zh) 处理装置以及摄像装置
WO2019221494A1 (fr) Dispositif électronique permettant de réaliser une authentification biométrique et son procédé de fonctionnement
CN109842790A (zh) 影像信息显示方法与显示器
EP3746923A1 (fr) Dispositif électronique permettant de réaliser une authentification biométrique et son procédé de fonctionnement
KR101596363B1 (ko) 얼굴 인식에 의한 출입관리장치 및 방법
WO2023287054A1 (fr) Dispositif électronique et son procédé de commande
JP7047891B1 (ja) エレベータシステム
WO2023153812A1 (fr) Dispositif électronique de détection d'objet et son procédé de commande
WO2019216673A1 (fr) Système et procédé de guidage d'objet pour corps mobile sans pilote
JP7182442B2 (ja) 指示特定システム
JP7099564B1 (ja) エレベータシステム
KR200435771Y1 (ko) 이동식 불법 주정차 단속 시스템
CN118279850A (zh) 一种交互方法、装置、电子设备及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22842316

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22842316

Country of ref document: EP

Kind code of ref document: A1