WO2016062187A1 - Matching method and intelligent interactive experience system (匹配方法及智能互动体验系统) - Google Patents

Matching method and intelligent interactive experience system (匹配方法及智能互动体验系统)

Info

Publication number
WO2016062187A1
WO2016062187A1 (PCT/CN2015/090862)
Authority
WO
WIPO (PCT)
Prior art keywords
experience
data
smart terminal
user
time period
Prior art date
Application number
PCT/CN2015/090862
Other languages
English (en)
French (fr)
Inventor
杨军
Original Assignee
杭州大穿越旅游策划有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州大穿越旅游策划有限公司
Publication of WO2016062187A1 publication Critical patent/WO2016062187A1/zh

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40 Support for services or applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/55 Push-based network services

Definitions

  • the invention relates to a matching method and an intelligent interactive experience system.
  • the technical problem to be solved by the present invention is to provide a matching method and an intelligent interactive experience system for the above-mentioned problems, which can accurately match the experience data generated by the user in the experience activity with the smart terminal carried by the user.
  • the technical solution adopted by the present invention is: a matching method, which is characterized in that it comprises:
  • the service terminal matches the experience data with the smart terminal that is within the experience range of the experience device corresponding to that experience data during the period in which the data is generated;
  • the smart terminal is carried by the user and is installed with a mobile client capable of acquiring the current location of the smart terminal in real time.
  • the client and the service terminal constitute a master-slave architecture to implement interaction.
  • obtaining the experience data of a user who is performing an experience activity includes:
  • after the user enters the experience device to perform the experience activity and triggers the sensor on the experience device, the experience device starts to acquire the experience data of the user who is performing the experience activity.
  • the service terminal includes:
  • the detection server is connected to the experience device, and is configured to detect and acquire new experience data in the experience device in real time, and extract the device number of the experience device corresponding to each experience data and the original acquisition time of the experience data;
  • the remote server is connected to the detection server via LAN or WiFi and receives and stores the experience data transmitted by the detection server, together with the device number of the corresponding experience device and the original acquisition time; it also forms a master-slave architecture with the mobile client installed on the smart terminal and matches the experience data with the smart terminal that is within the experience range of the corresponding experience device during the period in which the experience data is generated.
  • matching the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period includes:
  • the remote server obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device corresponding to the location of the smart terminal carried by the user during the time period t1-t2;
  • the remote server, according to the device number of the experience device corresponding to the experience data and the original acquisition time of the experience data, matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal located within the experience range of the experience device numbered A during the time period t1-t2.
  • the remote server obtaining the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device corresponding to the location of the smart terminal carried by the user during the time period t1-t2, includes:
  • the detecting server acquires a start time t1 and an end time t2 at which the sensor on the experience device is triggered, and transmits the same to the remote server;
  • the smart terminal carried by the user feeds back its location coordinates during the time period t1-t2 to the remote server; the remote server matches this feedback against the pre-stored coordinates of each experience device's experience range and obtains the device number A of the experience device corresponding to the location of the smart terminal during t1-t2.
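The patent gives no pseudocode for this step; purely as an illustration, here is a minimal Python sketch of this first matching variant. The record layout, the rectangular experience ranges and all function and field names are assumptions made for the example, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ExperienceRecord:
    device_no: str      # device number extracted by the detection server
    acquired_at: float  # original acquisition time (Unix timestamp)
    payload: bytes      # photo, video, graffiti work or tag data

def locate_terminal_device(samples, device_ranges):
    """Map the terminal's position feedback during t1-t2 to a device number A.

    samples: list of (timestamp, x, y) fed back by the mobile client.
    device_ranges: {device_no: (x_min, x_max, y_min, y_max)} pre-stored on
    the remote server (a rectangular experience range is assumed here).
    """
    for _, x, y in samples:
        for device_no, (x_min, x_max, y_min, y_max) in device_ranges.items():
            if x_min <= x <= x_max and y_min <= y <= y_max:
                return device_no
    return None

def match_records_to_terminal(records, terminal_id, device_no, t1, t2):
    """Pair records acquired by device_no within [t1, t2] with the terminal."""
    return [(terminal_id, r) for r in records
            if r.device_no == device_no and t1 <= r.acquired_at <= t2]
```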
  • matching the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period includes:
  • the remote server obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device to which the triggered sensor belongs;
  • the remote server determines whether any smart terminal is located within the experience range of the experience device numbered A during the time period t1-t2; if so, it matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal located within the experience range of the experience device numbered A during t1-t2.
  • the remote server determining whether any smart terminal is located within the experience range of the experience device numbered A during the time period t1-t2 includes:
  • the remote server acquires the location coordinates, during the time period t1-t2, of each smart terminal with which it has established the master-slave architecture and matches them against the pre-stored coordinates of the experience range of the device numbered A; a successful match indicates that a smart terminal was located within the experience range of the experience device numbered A during t1-t2.
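A complementary sketch of this second variant, in which the device number A is already known from the triggered sensor and the remote server scans every terminal of the master-slave architecture; the sample format and the rectangular range are the same assumptions as in the previous sketch.

```python
def terminals_in_device_range(terminal_samples, device_range, t1, t2):
    """Return the terminals whose reported coordinates fall inside the
    experience range of the triggered device at some moment within [t1, t2].

    terminal_samples: {terminal_id: [(timestamp, x, y), ...]} collected from
    every smart terminal that has established the master-slave architecture.
    device_range: (x_min, x_max, y_min, y_max) pre-stored for device A.
    """
    x_min, x_max, y_min, y_max = device_range
    return [terminal_id
            for terminal_id, samples in terminal_samples.items()
            if any(t1 <= t <= t2 and x_min <= x <= x_max and y_min <= y <= y_max
                   for t, x, y in samples)]
```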
  • the experience device is a fortune-stick blessing device (求签祈福设备) comprising electronic tags and a reader that is connected to the detection server and has a sensing function; after an electronic tag dropped during the user's stick drawing enters the sensing range of the reader, the reader sends an instruction to the electronic tag and reads the data stored in it as experience data.
  • the experience device is a graffiti wall connected to the detection server and capable of sensing and recording graffiti tracks; when the user creates graffiti on the wall, the wall senses and records the track and forms a graffiti work in computer-readable form as experience data.
  • the experience device includes a play item located in a scenic area and a photographing device that is connected to the detection server and has an infrared sensing function; the photographing device is arranged facing the play item, and after the user enters its sensing range it photographs the user to obtain photos and/or videos of the user experiencing the play item as experience data.
  • the play items include slides, speed-descent rides, zip lines, cycling tracks, inclined-rail cable cars, plank roads, cableways, boat-bow rides, and skiing.
  • the experience range of the experience device is determined by the coordinates of the location where the experience device is located, and the device number of each experience device and its coordinate value are pre-stored in the remote server.
  • An intelligent interactive experience system comprising: an intelligent terminal, an experience device, and a service terminal, wherein
  • the smart terminal is carried by the user and is installed with a mobile client capable of acquiring the current location of the smart terminal in real time, and the client and the service terminal constitute a master-slave architecture to implement interaction;
  • the service terminal receives the experience data transmitted by the experience device in real time, matches the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period, and pushes the experience data to the corresponding smart terminal according to the matching result.
  • the service terminal includes:
  • the detection server is connected to the experience device, and is configured to detect and acquire new experience data in the experience device in real time, and extract the device number of the experience device corresponding to each experience data and the original acquisition time of the experience data;
  • the remote server is connected to the detection server via LAN or WiFi and receives and stores the experience data transmitted by the detection server, together with the device number of the corresponding experience device and the original acquisition time; it also forms a master-slave architecture with the mobile client installed on the smart terminal, matches the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period, and pushes the experience data to the corresponding smart terminal according to the matching result.
  • matching the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period includes:
  • the remote server obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device corresponding to the location of the smart terminal carried by the user during the time period t1-t2;
  • the remote server, according to the device number of the experience device corresponding to the experience data and the original acquisition time of the experience data, matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal located within the experience range of the experience device numbered A during the time period t1-t2.
  • the remote server obtaining the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device corresponding to the location of the smart terminal carried by the user during the time period t1-t2, includes:
  • the detecting server acquires a start time t1 and an end time t2 at which the sensor on the experience device is triggered, and transmits the same to the remote server;
  • the smart terminal carried by the user feeds back its location coordinates during the time period t1-t2 to the remote server; the remote server matches this feedback against the pre-stored coordinates of each experience device's experience range and obtains the device number A of the experience device corresponding to the location of the smart terminal during t1-t2.
  • matching the experience data with the smart terminal that is within the experience range of the corresponding experience device during the data generation period includes:
  • the remote server obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device to which the triggered sensor belongs;
  • the remote server determines whether any smart terminal is located within the experience range of the experience device numbered A during the time period t1-t2; if so, it matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal located within the experience range of the experience device numbered A during t1-t2.
  • the remote server determining whether any smart terminal is located within the experience range of the experience device numbered A during the time period t1-t2 includes:
  • the remote server acquires the location coordinates, during the time period t1-t2, of each smart terminal with which it has established the master-slave architecture and matches them against the pre-stored coordinates of the experience range of the device numbered A; a successful match indicates that a smart terminal was located within the experience range of the experience device numbered A during t1-t2.
  • the experience device is a fortune-stick blessing device comprising electronic tags and a reader that is connected to the detection server and has a sensing function; after an electronic tag dropped during the user's stick drawing enters the sensing range of the reader, the reader sends an instruction to the electronic tag and reads the data stored in it as experience data.
  • the experience device is a graffiti wall connected to the detection server and capable of sensing and recording graffiti tracks; when the user creates graffiti on the wall, the wall senses and records the track and forms a graffiti work in computer-readable form as experience data.
  • the experience device includes a play item located in a scenic area and a photographing device that is connected to the detection server and has an infrared sensing function; the photographing device is arranged facing the play item, and after the user enters its sensing range it photographs the user to obtain photos and/or videos of the user experiencing the play item as experience data.
  • the play items include slides, speed-descent rides, zip lines, cycling tracks, inclined-rail cable cars, plank roads, cableways, boat-bow rides, and skiing.
  • the beneficial effects of the present invention are as follows:
  • 1. the present invention matches the smart terminal carried by the user that is within the experience range of the corresponding experience device during the data generation period with that experience data, ensuring that the user's experience data (files in computer-readable form such as photos and videos) is sent only to the smart terminal the user carries, thus avoiding any mismatch between the experience data and the user.
  • 2. the whole process is fully automated, saving time and labor costs and achieving high efficiency.
  • Fig. 1 is a schematic structural diagram of Embodiment 1.
  • Fig. 2 is a schematic structural diagram of Embodiment 2.
  • Fig. 3 is a schematic structural diagram of Embodiment 3.
  • Fig. 4 is a schematic structural diagram of Embodiment 4.
  • Fig. 5 is a schematic structural diagram of Embodiment 5.
  • Fig. 6 is a schematic structural diagram of Embodiment 6.
  • Fig. 7 is a schematic structural diagram of Embodiment 7.
  • Fig. 8 is a schematic structural diagram of Embodiment 8.
  • Fig. 9 is a schematic structural diagram of Embodiment 9.
  • Fig. 10 is a schematic structural diagram of Embodiment 10.
  • Fig. 11 is a schematic structural diagram of Embodiment 11.
  • after the user enters the experience device to perform the experience activity and triggers the sensor on the experience device, the experience device starts to acquire the experience data of the user who is performing the experience activity;
  • the detection server 1 is connected to the experience device, and is configured to detect and acquire new experience data in the experience device in real time, and extract the device number of the experience device corresponding to each experience data and the original acquisition time of the experience data;
  • the remote server 2 is connected to the detection server 1 via LAN or WiFi and receives and stores the experience data transmitted by the detection server 1, together with the device number of the corresponding experience device and the original acquisition time; it also forms a master-slave architecture with the mobile client installed on the smart terminal 7 and matches the experience data with the smart terminal 7 that is within the experience range of the corresponding experience device during the data generation period;
  • the smart terminal 7 is carried by the user and is installed with a mobile client capable of acquiring the current location of the smart terminal 7 in real time.
  • the client and the remote server 2 constitute a master-slave architecture to implement interaction.
  • in this example, matching the experience data with the smart terminal 7 that is within the experience range of the corresponding experience device during the data generation period includes:
  • the remote server 2 obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device corresponding to the location of the smart terminal 7 carried by the user during the time period t1-t2;
  • the specific implementation is: the detection server 1 acquires the start time t1 and the end time t2 at which the sensor on the experience device is triggered and transmits them to the remote server 2; the smart terminal 7 carried by the user feeds back its location coordinates during t1-t2 to the remote server 2, and the remote server 2 matches this feedback against the pre-stored coordinates of each experience device's experience range to obtain the device number A of the experience device corresponding to the location of the smart terminal 7 during t1-t2;
  • the remote server 2, according to the device number of the experience device corresponding to the experience data and the original acquisition time of the experience data, matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal 7 located within the experience range of the experience device numbered A during t1-t2.
  • the matching of the experience data with the smart terminal 7 that is within the experience range of the corresponding experience device during the data generation period may also be implemented as follows:
  • the remote server 2 obtains the start time point t1 and the end time point t2 at which the user triggers the sensor on the experience device, and the device number A of the experience device to which the triggered sensor belongs;
  • the remote server 2 determines whether any smart terminal is located within the experience range of the experience device numbered A during the time period t1-t2 (specifically: the remote server acquires the location coordinates, during t1-t2, of each smart terminal with which it has established the master-slave architecture and matches them against the pre-stored coordinates of the experience range of the device numbered A; a successful match indicates that a smart terminal was located within that range during t1-t2); if so, it matches the experience data whose original acquisition time falls within t1-t2 and which was acquired by the experience device numbered A with the smart terminal located within the experience range of the experience device numbered A during t1-t2.
  • the experience device may be a fortune-stick blessing device comprising electronic tags and a reader that is connected to the detection server 1 and has a sensing function; after an electronic tag dropped during the user's stick drawing enters the sensing range of the reader, the reader sends an instruction to the electronic tag and reads the data stored in it as experience data.
  • the experience device may also be a graffiti wall connected to the detection server 1 and capable of sensing and recording graffiti tracks; when the user creates graffiti on the wall, the wall senses and records the track and forms a graffiti work in computer-readable form as experience data.
  • the experience device may also be based on a play item in a scenic area, with a photographing device that is connected to the detection server 1 and has an infrared sensing function arranged facing the play item; after the user enters the sensing range of the photographing device, it photographs the user to obtain photos and/or videos of the user experiencing the play item as experience data.
  • the play items include slides, speed-descent rides, zip lines, cycling tracks, inclined-rail cable cars, plank roads, cableways, boat-bow rides, and skiing.
  • the experience range of the experience device is determined by the coordinates of the location where the experience device is located, and the device number of each experience device and its coordinate value are pre-stored in the remote server 2.
  • the smart terminal 7 is a smart product such as a smart phone, a tablet computer, or a smart wearable device.
  • Embodiment 1: As shown in Fig. 1, the intelligent interactive experience system of this embodiment is based on a slide ride in a scenic area (its experience device comprises the photographing device and the scenic-area play item) and includes: the smart terminal 7, the photographing device, the detection server 1 and the remote server 2, where
  • the smart terminal 7 is carried by the user and is installed with a mobile client capable of acquiring the current location of the smart terminal 7 in real time.
  • the client and the remote server 2 constitute a master-slave architecture to implement interaction;
  • the photographing device is arranged for the play item of the scenic spot, and the shooting range is located in the middle section of the play area in the play item, and is used for photographing the user entering the shooting range to obtain photos and/or videos;
  • the detecting server 1 is connected to the photographing device for detecting and acquiring newly taken photos and/or videos in the photographing device in real time, and extracting the device number of the photographing device corresponding to each photo and/or video and the shooting time.
  • in this example, an Advantech 610L is used as the detection server 1;
  • the remote server 2 is connected to the detection server 1 through lan or wifi, and is used for receiving and storing the photos and/or videos transmitted by the detection server 1, and the basic information such as the device number and the shooting time of the corresponding shooting device;
  • on the other hand, the remote server 2 forms a master-slave architecture with the mobile client installed on the smart terminal 7 and, according to the shooting time of each photo and/or video and the device number of the photographing device, matches the photo and/or video with the smart terminal 7 that was within the shooting range of the photographing device with that device number during the shooting time; according to the matching result, the photo thumbnails generated from the received files and/or the transcoded, compressed videos are pushed to the corresponding smart terminal 7;
  • in this example, the remote server 2 consists of an application server 2-1, a database server 2-2 and a web server 2-3 that are connected to one another, all of which are IBM Netfinity 3000 machines. The application server 2-1 forms a master-slave architecture with the mobile client installed on the smart terminal 7; on the one hand it compresses the received photos into thumbnails so that the user can browse them smoothly on the smart terminal 7 and transcodes the received videos (into a format the smart terminal can play) and compresses them; on the other hand, according to the shooting time of each photo and/or video and the device number of the photographing device, it matches the photo and/or video with the smart terminal 7 that was within the shooting range of the camera with that device number during the shooting time. The database server 2-2 stores the unprocessed photos and/or videos as well as the thumbnails and transcoded videos produced by the application server 2-1. The web server 2-3 pushes the photo thumbnails and/or transcoded videos generated from the received files to the corresponding smart terminal 7 according to the matching result, and also provides a web site that users can log into from a PC, notebook, tablet or mobile phone at home or in the office to query and download photos and videos.
  • the user can register an account through the mobile client on the smart terminal 7 as an identity marker and, after logging in, can view historical photos and/or videos.
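As a rough illustration of the post-processing attributed here to the application server 2-1, the following Python sketch generates a thumbnail and transcodes a video. The use of Pillow and of an ffmpeg command line, as well as the chosen formats and quality settings, are assumptions; the patent does not name any particular tools.

```python
import subprocess
from pathlib import Path
from PIL import Image  # assumed image library; the patent does not specify one

def make_thumbnail(photo_path: Path, out_dir: Path, max_px: int = 320) -> Path:
    """Compress a received photo into a thumbnail for smooth mobile browsing."""
    out = out_dir / f"{photo_path.stem}_thumb.jpg"
    with Image.open(photo_path) as img:
        img.thumbnail((max_px, max_px))
        img.convert("RGB").save(out, "JPEG", quality=80)
    return out

def transcode_video(video_path: Path, out_dir: Path) -> Path:
    """Transcode a received video into an H.264 MP4 the smart terminal can play."""
    out = out_dir / f"{video_path.stem}_mobile.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(video_path),
         "-vcodec", "libx264", "-crf", "28", "-preset", "fast", str(out)],
        check=True)
    return out
```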
  • the photographing device includes an outdoor infrared-triggered still camera 3 and an outdoor infrared-triggered video camera 4 whose infrared sensing ranges coincide and constitute the shooting range (when the user enters this infrared sensing range, both cameras start shooting, i.e. the moment the user enters the shooting range is the moment the infrared sensing is triggered); the device numbers of the still camera 3 and the video camera 4 and the coordinate values of the shooting range are pre-stored in the remote server 2. A USB data signal amplifier 5 is connected to the output of each of the still camera 3 and the video camera 4, and the other end of each USB data signal amplifier 5 is connected to the detection server 1 through a USB data signal receiver 6. The USB data signal amplifier 5 and the USB data signal receiver 6 are connected by a Cat 5e network cable; the embedded transceiver circuit prevents loss of the photo and video information automatically captured by the infrared-triggered cameras, and amplifying the USB signal allows wired extension over the Cat 5e cable with a maximum span of 80-120 metres.
  • in this example, matching the photo and/or video, according to its shooting time and the device number of the photographing device, with the smart terminal 7 that was within the shooting range of the camera with that device number during the shooting time is implemented as follows:
  • first, the remote server 2 obtains the start time point t1 at which the user and the smart terminal 7 carried by the user enter the shooting range of the photographing device, the time point t2 at which they leave the shooting range, and the device number A of the photographing device corresponding to the location of the smart terminal 7 during t1-t2. Specifically: when the smart terminal 7 carried by the user enters and leaves the shooting range, it sends an arrival flag and a leaving flag to the remote server 2 respectively, and the remote server 2 records the time points t1 and t2 of arriving at and leaving the shooting range (alternatively, the detection server acquires the start time t1 and the end time t2 at which the infrared sensor on the photographing device is triggered and transmits them to the remote server); the smart terminal 7 feeds back its location coordinates during t1-t2 to the remote server 2, which matches this feedback against the pre-stored coordinates of each camera's shooting range to obtain the device number A of the photographing device corresponding to the location of the smart terminal 7 during t1-t2;
  • then, the remote server 2, according to the shooting time of each photo and/or video and the device number of the photographing device, matches the photos and/or videos taken during t1-t2 by the photographing device with device number A with the smart terminal 7 located within the shooting range of that device during t1-t2.
  • in this example, sending the arrival flag and the leaving flag to the remote server 2 when the smart terminal 7 carried by the user enters and leaves the shooting range is implemented as follows:
  • the mobile client installed on the smart terminal 7 acquires the current position coordinates of the smart terminal 7 in real time and matches them against the pre-stored coordinates of each camera's shooting range;
  • when the smart terminal 7 enters the coordinate range of a shooting range, the arrival flag is sent to the remote server 2;
  • when the smart terminal 7 leaves that coordinate range, the leaving flag is sent to the remote server 2.
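In practice the mobile client would be an Android or iOS app; the following Python sketch only illustrates the enter/leave logic described above. The reporting endpoint, the JSON field names and the use of the requests library are assumptions.

```python
import requests  # assumed transport; the patent only says the client reports to the remote server

REMOTE_SERVER = "http://remote-server.example/flags"  # placeholder URL

def in_range(x, y, rng):
    x_min, x_max, y_min, y_max = rng
    return x_min <= x <= x_max and y_min <= y <= y_max

def report_transitions(terminal_id, positions, shooting_ranges):
    """Send an arrival flag on entering, and a leaving flag on exiting,
    any pre-stored shooting range.

    positions: iterable of (timestamp, x, y) from the client's positioning.
    shooting_ranges: {device_no: (x_min, x_max, y_min, y_max)} cached locally.
    """
    inside = {}  # device_no -> currently inside the range?
    for t, x, y in positions:
        for device_no, rng in shooting_ranges.items():
            now_in = in_range(x, y, rng)
            if now_in and not inside.get(device_no):
                requests.post(REMOTE_SERVER, json={
                    "terminal": terminal_id, "device": device_no,
                    "flag": "arrive", "time": t})
            elif not now_in and inside.get(device_no):
                requests.post(REMOTE_SERVER, json={
                    "terminal": terminal_id, "device": device_no,
                    "flag": "leave", "time": t})
            inside[device_no] = now_in
```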
  • matching the photo and/or video with the smart terminal 7 that was within the shooting range of the camera with the aforementioned device number during the shooting time may also be implemented as follows:
  • the remote server 2 obtains the start time point t1 at which the user and the smart terminal 7 carried by the user enter the shooting range of the photographing device and the time point t2 at which they leave it (i.e. the start time point t1 and the end time point t2 at which the infrared sensors on the outdoor infrared-triggered still camera and video camera are triggered), together with the device number A of the photographing device;
  • the remote server 2 determines whether any smart terminal was located within the shooting range of the device numbered A during t1-t2 (specifically: the remote server acquires the position coordinates, during t1-t2, of each smart terminal with which it has established the master-slave architecture and matches them against the pre-stored coordinates of the shooting range of the device numbered A; a successful match indicates that a smart terminal was within that shooting range during t1-t2); if so, it matches the photos and/or videos taken during t1-t2 by the camera with device number A with the smart terminal 7 located within the shooting range of that device during t1-t2.
  • to further ensure that the photos and/or videos pushed to the smart terminal 7 carried by the user really belong to that user, this embodiment adds the following step: the user views the received photo thumbnails and/or transcoded videos through the mobile client on the smart terminal 7 and, if the group of photos and/or videos turns out not to be of the user, sends a resend command to the remote server 2; after receiving the resend command, the remote server 2 pushes the three groups of photos and/or videos before and the three groups after that group to the smart terminal 7.
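A small sketch of this resend behaviour, assuming the remote server keeps each device's photo/video groups in chronological order; the function and parameter names are illustrative only.

```python
def resend_window(groups, mismatched_index, radius=3):
    """Given the chronologically ordered photo/video groups of one device,
    return the `radius` groups before and after the group the user rejected,
    so the correct set is very likely included in the re-push.
    """
    start = max(0, mismatched_index - radius)
    end = min(len(groups), mismatched_index + radius + 1)
    return groups[start:mismatched_index] + groups[mismatched_index + 1:end]
```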
  • the detection server 1 periodically scans the physical storage space used for photos and/or videos on the outdoor infrared-triggered still camera 3, the outdoor infrared-triggered video camera 4 and the detection server 1 itself; once a set storage size limit is reached, it automatically deletes historical files so that newly taken photos and/or videos can always be stored.
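The cleanup rule might look like the following sketch on the detection server side; the flat directory layout and the numeric limit are assumptions, as the patent only states that a storage size limit is set.

```python
from pathlib import Path

def enforce_storage_limit(media_dir: Path, limit_bytes: int) -> None:
    """Delete the oldest photo/video files until the directory fits within
    the limit, so newly captured files can always be stored."""
    files = sorted(media_dir.glob("*"), key=lambda p: p.stat().st_mtime)
    total = sum(p.stat().st_size for p in files)
    for p in files:
        if total <= limit_bytes:
            break
        total -= p.stat().st_size
        p.unlink()
```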
  • Embodiment 2: As shown in Fig. 2, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a speed-descent ride in the scenic area, with the photographing device arranged facing the middle section of the descent.
  • Embodiment 3: As shown in Fig. 3, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a zip-line ride in the scenic area, with the photographing device arranged facing the middle section of the zip line.
  • Embodiment 4: As shown in Fig. 4, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a cycling track in the scenic area, with the photographing device arranged facing the middle section of the track.
  • Embodiment 5: As shown in Fig. 5, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on an inclined-rail cable car in the scenic area, with the photographing device arranged facing the middle section of the cable car route.
  • Embodiment 6: As shown in Fig. 6, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a plank road in the scenic area, with the photographing device arranged facing the middle section of the plank road.
  • Embodiment 7: As shown in Fig. 7, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a cableway ride in the scenic area, with the photographing device arranged facing the middle section of the cableway.
  • Embodiment 8: As shown in Fig. 8, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a boat-bow attraction in the scenic area, with the photographing device arranged facing the middle section of the attraction's path.
  • Embodiment 9: As shown in Fig. 9, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that it is based on a skiing attraction in the scenic area, with the photographing device arranged facing the middle section of the ski run.
  • Embodiment 10: As shown in Fig. 10, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that the experience device of this embodiment is a fortune-stick blessing device comprising electronic tags and a reader that is connected to the detection server 1 (through the USB data signal amplifier 5, the USB data signal receiver 6 and a Cat 5e network cable) and has a sensing function; after an electronic tag dropped during the user's stick drawing enters the sensing range of the reader, the reader sends an instruction to the electronic tag and reads the data stored in it as experience data. After detecting the experience data, the detection server 1 transmits it to the remote server 2; the remote server 2 decodes the experience data and, following the same principle as in Embodiment 1, pushes the decoded experience data to the smart terminal carried by the corresponding user, who thus receives the fortune slip in electronic form.
  • Embodiment 11: As shown in Fig. 11, the intelligent interactive experience system of this embodiment is basically the same as Embodiment 1, except that the experience device of this embodiment is a graffiti wall connected to the detection server 1 (through the USB data signal amplifier 5, the USB data signal receiver 6 and a Cat 5e network cable) and capable of sensing and recording graffiti tracks (implemented with pressure sensors); when the user creates graffiti on the wall, the wall senses and records the track and forms a graffiti work in computer-readable form as experience data (either the final picture after the graffiti is finished or an animation of the drawing process). After obtaining the experience data, the detection server 1 transmits it to the remote server 2, and the remote server 2 pushes it to the smart terminal carried by the corresponding user following the same principle as in Embodiment 1.
  • the above embodiments are all based on a system in which one remote server 2 corresponds to several photographing devices and several detection servers 1 (the photographing devices and detection servers correspond one to one, i.e. each photographing device has its own detection server). When one remote server 2 corresponds to a single photographing device and a single detection server, matching only needs the shooting time of the photo and/or video: the photo and/or video is matched with the smart terminals that were within the shooting range of that single photographing device at the shooting time.
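For this single-camera case, matching degenerates to a pure time-window check; a minimal sketch follows, reusing the record shape assumed in the earlier sketch.

```python
def match_single_device(records, terminal_windows):
    """One remote server, one camera: a record belongs to every terminal whose
    stay window [t1, t2] inside the camera's shooting range covers the record's
    shooting time. terminal_windows: {terminal_id: (t1, t2)}."""
    result = {terminal_id: [] for terminal_id in terminal_windows}
    for record in records:
        for terminal_id, (t1, t2) in terminal_windows.items():
            if t1 <= record.acquired_at <= t2:
                result[terminal_id].append(record)
    return result
```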

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The present invention relates to a matching method and an intelligent interactive experience system. The technical problem to be solved by the invention is to provide a matching method and an intelligent interactive experience system that can accurately match the experience data generated by a user during an experience activity with the smart terminal carried by that user. The technical solution to this problem is as follows: the matching method comprises obtaining the experience data of a user who is performing an experience activity; the service terminal matches the smart terminal that is within the experience range of the experience device corresponding to the experience data during the period in which the data is generated with that experience data; the smart terminal is carried by the user and has installed on it a mobile client that can acquire the current location of the smart terminal in real time, and this client and the service terminal correspond to each other to form a master-slave architecture for interaction.

Description

匹配方法及智能互动体验系统 技术领域
本发明涉及一种匹配方法及智能互动体验系统。
背景技术
随着社会的进步,科技的发展,微信朋友圈、微博等交流软件的日渐火爆,智能互动体验越来越受到年轻一代消费者的青睐,他们通常会把体验活动中的一些体验数据(照片、视频等计算机可读形式的文件)分享给亲友,但是如何将体验活动中的体验数据准确的传输至对应用户的智能终端上,是一个亟待解决的重要问题。
发明内容
本发明要解决的技术问题是:针对上述存在的问题提供一种匹配方法及智能互动体验系统,能够将用户在体验活动中产生的体验数据与该用户随身携带的智能终端准确匹配。
本发明所采用的技术方案是:匹配方法,其特征在于,包括:
获取正在进行体验活动的用户的体验数据;
服务终端将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配;
所述智能终端由用户随身携带,安装有能够实时获取智能终端当前位置的移动客户端,该客户端与服务终端相对应构成主从式架构以实现交互。
所述获取正在进行体验活动的用户的体验数据,包括:
用户进入体验设备进行体验活动并触发体验设备上的感应器后,体验设备即开始获取正在进行体验活动的用户的体验数据。
所述服务终端包括:
侦测服务器,与体验设备相连,用于实时侦测并获取体验设备中新的体验数据,并提取出各体验数据所对应体验设备的设备编号及体验数据的原始获取时间;
远程服务器,一方面通过lan或wifi与侦测服务器相连,用于接收和存储侦 测服务器传输过来的体验数据,以及对应体验设备的设备编号和原始获取时间;另一方面与所述智能终端上安装的移动客户端相对应构成主从式架构,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配。
所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A;
远程服务器根据体验数据所对应体验设备的设备编号及体验数据的原始获取时间,将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
所述远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A,包括:
侦测服务器获取体验设备上感应器被触发的起始时间t1和结束时间t2,并将其传输至远程服务器;
用户随身携带的智能终端将t1-t2时间段内其所处位置坐标反馈给远程服务器,远程服务器将该反馈与所预存各体验设备体验范围的坐标进行匹配,得到该智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A。
所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及被触发感应器对应体验设备的设备编号A;
远程服务器判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,若有,则将原始获取时间位于t1-t2时间段内、且由编号为A的 体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
所述远程服务器判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,包括:
远程服务器获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的体验设备体验范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终端位于设备编号为A的体验设备体验范围内。
所述体验设备为一求签祈福设备,包括电子标签,以及与侦测服务器相连且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。
所述体验设备为与侦测服务器相连、且带有感应和记录涂鸦轨迹功能的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹,并形成计算机可读形式的涂鸦作品作为体验数据。
所述体验设备包括位于景区的游玩项目,以及与侦测服务器相连且具有红外感应功能的拍摄装置,该拍照装置正对游玩项目布置,用户进入拍摄装置感应范围内后,拍摄装置对其进行拍摄以获取用户在游玩项目上体验时的照片和/或视频作为体验数据。
所述游玩项目包括滑道运动、速降运动、溜索运动、骑行道运动、斜轨缆车运动、栈道运动、索道运动、船头运动、滑雪运动。
所述体验设备的体验范围由体验设备所处位置的坐标确定,且各体验设备的设备编号及其坐标值预存于远程服务器中。
智能互动体验系统,其特征在于,包括:智能终端、体验设备和服务终端,其中,
智能终端,用户随身携带,安装有能够实时获取智能终端当前位置的移动客户端,该客户端与服务终端相对应构成主从式架构以实现交互;
体验设备,内设感应器,供用户进行体验活动,并在感应器被触发时获取 用户进行体验活动时的体验数据;
服务终端,实时接收体验设备传输过来的体验数据,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,并根据匹配结果将体验数据推送至对应的智能终端上。
所述服务终端包括:
侦测服务器,与体验设备相连,用于实时侦测并获取体验设备中新的体验数据,并提取出各体验数据所对应体验设备的设备编号及体验数据的原始获取时间;
远程服务器,一方面通过lan或wifi与侦测服务器相连,用于接收和存储侦测服务器传输过来的体验数据,以及对应体验设备的设备编号和原始获取时间;另一方面与所述智能终端上安装的移动客户端相对应构成主从式架构,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,并根据匹配结果将体验数据推送至对应的智能终端上。
所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A;
远程服务器根据体验数据所对应体验设备的设备编号及体验数据的原始获取时间,将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
所述远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A,包括:
侦测服务器获取体验设备上感应器被触发的起始时间t1和结束时间t2,并将其传输至远程服务器;
用户随身携带的智能终端将t1-t2时间段内其所处位置坐标反馈给远程服务器,远程服务器将该反馈与所预存各体验设备体验范围的坐标进行匹配,得到该智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A。
所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
远程服务器获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及被触发感应器对应体验设备的设备编号A;
远程服务器判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,若有,则将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
所述远程服务器判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,包括:
远程服务器获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的体验设备体验范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终端位于设备编号为A的体验设备体验范围内。
所述体验设备为一求签祈福设备,包括电子标签,以及与侦测服务器相连且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。
所述体验设备为与侦测服务器相连、且带有感应和记录涂鸦轨迹功能的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹,并形成计算机可读形式的涂鸦作品作为体验数据。
所述体验设备包括位于景区的游玩项目,以及与侦测服务器相连且具有红外感应功能的拍摄装置,该拍照装置正对游玩项目布置,用户进入拍摄装置感应范围内后,拍摄装置对其进行拍摄以获取用户在游玩项目上体验时的照片和/或视频作为体验数据。
所述游玩项目包括滑道运动、速降运动、溜索运动、骑行道运动、斜轨缆车运动、栈道运动、索道运动、船头运动、滑雪运动。
本发明的有益效果是:1、本发明将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端(用户随身携带)与该体验数据匹配,确保用户的体验数据(照片、视频等计算机可读形式的文件)仅发送至自己随身携带的智能终端上,从而避免了体验数据与用户本人不一致的现象。2、整个过程全自动化,节省时间成本和人力成本、效率高。
附图说明
图1是实施例1的结构简图。
图2是实施例2的结构简图。
图3是实施例3的结构简图。
图4是实施例4的结构简图。
图5是实施例5的结构简图。
图6是实施例6的结构简图。
图7是实施例7的结构简图。
图8是实施例8的结构简图。
图9是实施例9的结构简图。
图10是实施例10的结构简图。
图11是实施例11的结构简图。
具体实施方式
本实施例匹配方法包括:
用户进入体验设备进行体验活动并触发体验设备上的感应器后,体验设备即开始获取正在进行体验活动的用户的体验数据;
侦测服务器1,与体验设备相连,用于实时侦测并获取体验设备中新的体验数据,并提取出各体验数据所对应体验设备的设备编号及体验数据的原始获取时间;
远程服务器2,一方面通过lan或wifi与侦测服务器1相连,用于接收和存 储侦测服务器1传输过来的体验数据,以及对应体验设备的设备编号和原始获取时间;另一方面与所述智能终端7上安装的移动客户端相对应构成主从式架构,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端7与该体验数据匹配;
所述智能终端7由用户随身携带,安装有能够实时获取智能终端7当前位置的移动客户端,该客户端与远程服务器2相对应构成主从式架构以实现交互。
本例中,所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端7与该体验数据匹配,包括:
远程服务器2获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端7在t1-t2时间段内所处位置所对应体验设备的设备编号A;具体实现方式为:侦测服务器1获取体验设备上感应器被触发的起始时间t1和结束时间t2,并将其传输至远程服务器2;用户随身携带的智能终端7将t1-t2时间段内其所处位置坐标反馈给远程服务器2,远程服务器2将该反馈与所预存各体验设备体验范围的坐标进行匹配,得到该智能终端7在t1-t2时间段内所处位置所对应体验设备的设备编号A;
远程服务器2根据体验数据所对应体验设备的设备编号及体验数据的原始获取时间,将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端7进行匹配。
所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端7与该体验数据匹配的具体实现方式还可以是:
远程服务器2获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及被触发感应器对应体验设备的设备编号A;
远程服务器2判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内(具体为:远程服务器获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的体验设备体验范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终 端位于设备编号为A的体验设备体验范围内),若有,则将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
所述体验设备可以为一求签祈福设备,包括电子标签,以及与侦测服务器1相连且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。所述体验设备也可以为与侦测服务器1相连、且带有感应和记录涂鸦轨迹功能的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹,并形成计算机可读形式的涂鸦作品作为体验数据。所述体验设备还可以基于景区的游玩项目,正对游玩项目布置与侦测服务器1相连且具有红外感应功能的拍摄装置,用户进入拍摄装置感应范围内后,拍摄装置对其进行拍摄以获取用户在游玩项目上体验时的照片和/或视频作为体验数据。
所述游玩项目包括滑道运动、速降运动、溜索运动、骑行道运动、斜轨缆车运动、栈道运动、索道运动、船头运动、滑雪运动。
所述体验设备的体验范围由体验设备所处位置的坐标确定,且各体验设备的设备编号及其坐标值预存于远程服务器2中。
所述智能终端7为智能手机、平板电脑、智能穿戴设备等智能产品。
以下各实施例均以前述匹配方法为基础实现。
实施例1:如图1所示,本实施例智能互动体验系统,基于景区的滑道运动(其体验设备包括拍摄装置和景区游玩项目),包括:智能终端7、拍摄装置、侦测服务器1和远程服务器2,其中,
智能终端7,用户随身携带,安装有能够实时获取智能终端7当前位置的移动客户端,该客户端与远程服务器2相对应构成主从式架构以实现交互;
拍摄装置,正对景区的游玩项目布置,其拍摄范围位于游玩项目中游玩区的中段,用于对进入其拍摄范围内的用户进行拍摄以获取照片和/或视频;
侦测服务器1,与拍摄装置相连,用于实时侦测并获取拍摄装置中新拍摄的照片和/或视频,并提取出各照片和/或视频所对应拍摄装置的设备编号和拍摄时 间;本例中,所述侦测服务器1选用研华610L;
远程服务器2,一方面通过lan或wifi与侦测服务器1相连,用于接收和存储侦测服务器1传输过来的照片和/或视频,以及对应拍摄装置的设备编号和拍摄时间等基础信息;另一方面与所述智能终端7上安装的移动客户端相对应构成主从式架构,根据照片和/或视频的拍摄时间及拍摄装置设备编号,将照片和/或视频与该拍摄时间内正处于具有前述设备编号的拍摄装置拍摄范围内的智能终端7相匹配,并根据该匹配结果将依据所接收文件生成的照片缩略图和/或转码压缩后的视频推送至对应的智能终端7上;
本例中,所述远程服务器2由两两相连的应用服务器2-1、数据库服务器2-2和Web服务器2-3组成,三者均选用IBM Netfinity 3000;其中应用服务器2-1与所述智能终端7上安装的移动客户端相对应构成主从式架构,一方面用于对接收的照片进行压缩生成对应的缩略图使用户通过智能终端7浏览时更加流畅,对接收的视频进行转码(以符合智能终端能够播放的格式)压缩(使用户通过智能终端浏览时更加流畅),另一方面根据照片和/或视频的拍摄时间及拍摄装置设备编号,将照片和/或视频与该拍摄时间内正处于具有前述设备编号的拍摄装置拍摄范围内的智能终端7相匹配;数据库服务器2-2用于存储未经处理的照片和/或视频,以及应用服务器2-1处理后的照片缩略图和/或转码压缩后的视频;Web服务器2-3一方面根据匹配结果将依据所接收文件生成的照片缩略图和/或转码压缩后的视频推送至对应的智能终端7上,另一方面提供用户在家里或办公场所通过PC、笔记本、平板电脑或手机登录WEB网站系统,进行查询和下载照片、视频的功能。用户可以通过智能终端7上的移动客户端注册用户,作为身份标示,登录后可以查看历史照片和/或视频。
所述拍摄装置包括野外红外感应照相机3和野外红外感应摄像机4,两者的红外感应范围重合(用户进入该红外感应范围内,野外红外感应照相机和野外红外感应摄像机即开始拍摄,即用户进入拍摄装置拍摄范围的起始时间就是红外感应触发的起始时间),构成所述拍摄范围;所述野外红外感应照相机3、野外红外感应摄像机4的设备编号和拍摄范围坐标值预存在远程服务器2中。所 述野外红外感应照相机3和野外红外感应摄像机4的输出端各连接有一USB数据信号放大器5,各USB数据信号放大器5另一端经USB数据信号接收器6与侦测服务器1相连,所述USB数据信号放大器5与USB数据信号接收器6之间通过超五类网线连接,嵌入式电路的收发器可以防止红外感应自动拍摄相机传输照片和视频的信息丢失,把USB信号进行放大通过超五类网线进行可最大长跨度80-120米有线延长传输。
本例中,所述根据照片和/或视频的拍摄时间及拍摄装置设备编号,将照片和/或视频与该拍摄时间内正处于具有前述设备编号的拍摄装置拍摄范围内的智能终端7相匹配的具体实现方式为:
首先,远程服务器2获取用户及其随身携带智能终端7进入拍摄装置拍摄范围的起始时间点t1、离开拍摄装置拍摄范围的时间点t2,以及该智能终端7在t1-t2时间段内所处位置所对应拍摄装置的设备编号A;其具体实现方式为:用户随身携带的智能终端7进入拍摄装置拍摄范围和离开拍摄装置拍摄范围时,分别向远程服务器2发送到达标志和离开标志,同时远程服务器2分别记录下到达和离开拍摄范围的时间点t1和t2(或者侦测服务器获取拍摄装置上红外感应器被触发的起始时间t1和结束时间t2,并将其传输至远程服务器);用户随身携带的智能终端7将t1-t2时间段内其所处位置坐标反馈给远程服务器2,远程服务器2将该反馈与所预存各拍摄装置拍摄范围的坐标进行匹配,得到该智能终端7在t1-t2时间段内所处位置所对应拍摄装置的设备编号A;
然后,远程服务器2根据照片和/或视频的拍摄时间及拍摄装置设备编号,将拍摄于t1-t2时间段内、且由设备编号为A的拍摄装置拍摄的照片和/或视频,与t1-t2时间段内位于设备编号为A的拍摄装置拍摄范围内的智能终端7进行匹配。
本例中,所述用户随身携带的智能终端7进入拍摄装置拍摄范围和离开拍摄装置拍摄范围时,分别向远程服务器2发送到达标志和离开标志的具体实现方式为:
安装于智能终端7上的移动客户端实时获取智能终端7当前位置坐标,并 与预存的各拍摄装置拍摄范围的坐标进行匹配;
当智能终端7进入拍摄装置拍摄范围的坐标范围内时,向远程服务器2发送到达标志;
当智能终端7离开拍摄装置拍摄范围的坐标范围内时,向远程服务器2发送离开标志。
所述根据照片和/或视频的拍摄时间及拍摄装置设备编号,将照片和/或视频与该拍摄时间内正处于具有前述设备编号的拍摄装置拍摄范围内的智能终端7相匹配的具体实现方式还可以是:
远程服务器2获取用户及其随身携带智能终端7进入拍摄装置拍摄范围的起始时间点t1、离开拍摄装置拍摄范围的时间点t2(即触发野外红外感应照相机和野外红外感应摄像机上红外感应器的起始时间点t1和结束时间点t2),以及该拍摄装置的设备编号A;
远程服务器2判断t1-t2时间段内是否有智能终端位于设备编号为A的拍摄装置拍摄范围内(具体为:远程服务器获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的拍摄装置拍摄范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终端位于设备编号为A的拍摄装置拍摄范围内),若有,则将拍摄于t1-t2时间段内、且由设备编号为A的拍摄装置拍摄的照片和/或视频,与t1-t2时间段内位于设备编号为A的拍摄装置拍摄范围内的智能终端7进行匹配。
为了进一步确保向用户随身携带智能终端7所推送照片和/或视频与用户本人的一致性,本实施例增加以下步骤:用户通过智能终端7上的移动客户端查看接收到的照片缩略图和/或转码压缩后的视频,若发现该组照片和/或视频并非本人信息,则向远程服务器2发送重发指令;远程服务器2接收到该重发指令后,向该智能终端7推送前后各三组照片和/或视频。
所述侦测服务器1定期对野外红外感应照相机3、野外红外感应摄像机4及侦测服务器1上存放照片和/或视频的物理空间进行扫描,到达一定得存储大小限制后,自动对历史文件进行删除,以确保新拍摄的照片和/或视频能够正常存 储。
实施例2:如图2所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的速降运动中,拍摄装置正对速降运动的中段布置。
实施例3:如图3所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的溜索运动中,拍摄装置正对溜索运动的中段布置。
实施例4:如图4所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的骑行道中,拍摄装置正对骑行道中段布置。
实施例5:如图5所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的斜轨缆车项目中,拍摄装置正对斜轨缆车的中段布置。
实施例6:如图6所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的栈道项目中,拍摄装置正对栈道的中段布置。
实施例7:如图7所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的索道运动中,拍摄装置正对索道运动的中段布置。
实施例8:如图8所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的船头项目中,拍摄装置正对船头项目路径的中段布置。
实施例9:如图9所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例系统基于景区的滑雪项目中,拍摄装置正对滑雪项目路径的中段布置。
实施例10:如图10所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例体验设备为一求签祈福设备,它包括电子标签,以及与 侦测服务器1相连(通过USB数据信号放大器5、USB数据信号接收器6及超五类网线相连)且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。侦测服务器1获取该体验数据后,将其传输至远程服务器2,远程服务器2对该体验数据进行解码处理后,依据与实施例1相同的原理,将解码处理后的体验数据推送至对应用户随身携带的智能终端上,获得电子形式的签。
实施例11:如图11所示,本实施例智能互动体验系统与实施例1基本相同,其区别在于,本实施例体验设备为与侦测服务器1相连(通过USB数据信号放大器5、USB数据信号接收器6及超五类网线相连)、且带有感应和记录涂鸦轨迹功能(通过压力传感器实现)的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹,并形成计算机可读形式的涂鸦作品作为体验数据(可以是涂鸦完成后的最终图片,也可以是涂鸦绘画过程的动画)。侦测服务器1获取该体验数据后,将其传输至远程服务器2,远程服务器2依据与实施例1相同的原理,将该体验数据推送至对应用户随身携带的智能终端上。
上述实施例都是基于一个远程服务器2对应多个拍摄装置和多个侦测服务器1(拍摄装置与侦测服务器是一一对应的,即一个拍摄装置对应一个侦测服务器)的系统,当一个远程服务器2对应一个拍摄装置和一个侦测服务器时,匹配时,只需要根据照片和/或视频的拍摄时间,将照片和/或视频与该拍摄时间内正处于这唯一一个拍摄装置拍摄范围内的智能终端相匹配即可。

Claims (22)

  1. 一种匹配方法,其特征在于,包括:
    获取正在进行体验活动的用户的体验数据;
    服务终端将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配;
    所述智能终端由用户随身携带,安装有能够实时获取智能终端当前位置的移动客户端,该客户端与服务终端相对应构成主从式架构以实现交互。
  2. 根据权利要求1所述的匹配方法,其特征在于,所述获取正在进行体验活动的用户的体验数据,包括:
    用户进入体验设备进行体验活动并触发体验设备上的感应器后,体验设备即开始获取正在进行体验活动的用户的体验数据。
  3. 根据权利要求2所述的匹配方法,其特征在于,所述服务终端包括:
    侦测服务器(1),与体验设备相连,用于实时侦测并获取体验设备中新的体验数据,并提取出各体验数据所对应体验设备的设备编号及体验数据的原始获取时间;
    远程服务器(2),一方面通过lan或wifi与侦测服务器(1)相连,用于接收和存储侦测服务器(1)传输过来的体验数据,以及对应体验设备的设备编号和原始获取时间;另一方面与所述智能终端上安装的移动客户端相对应构成主从式架构,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配。
  4. 根据权利要求3所述的匹配方法,其特征在于,所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
    远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A;
    远程服务器(2)根据体验数据所对应体验设备的设备编号及体验数据的原 始获取时间,将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
  5. 根据权利要求4所述的匹配方法,其特征在于,所述远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A,包括:
    侦测服务器(1)获取体验设备上感应器被触发的起始时间t1和结束时间t2,并将其传输至远程服务器(2);
    用户随身携带的智能终端将t1-t2时间段内其所处位置坐标反馈给远程服务器(2),远程服务器(2)将该反馈与所预存各体验设备体验范围的坐标进行匹配,得到该智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A。
  6. 根据权利要求3所述的匹配方法,其特征在于,所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
    远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及被触发感应器对应体验设备的设备编号A;
    远程服务器(2)判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,若有,则将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
  7. 根据权利要求6所述的匹配方法,其特征在于,所述远程服务器(2)判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,包括:
    远程服务器(2)获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的体验设备体验范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终端位于设备编号为A的体验设备体验范围内。
  8. 根据权利要求3-7任意一项所述的匹配方法,其特征在于:所述体验设备为一求签祈福设备,包括电子标签,以及与侦测服务器(1)相连且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。
  9. 根据权利要求3-7任意一项所述的匹配方法,其特征在于:所述体验设备为与侦测服务器(1)相连、且带有感应和记录涂鸦轨迹功能的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹,并形成计算机可读形式的涂鸦作品作为体验数据。
  10. 根据权利要求3-7任意一项所述的匹配方法,其特征在于:所述体验设备包括位于景区的游玩项目,以及与侦测服务器(1)相连且具有红外感应功能的拍摄装置,该拍照装置正对游玩项目布置,用户进入拍摄装置感应范围内后,拍摄装置对其进行拍摄以获取用户在游玩项目上体验时的照片和/或视频作为体验数据。
  11. 根据权利要求10所述的匹配方法,其特征在于:所述游玩项目包括滑道运动、速降运动、溜索运动、骑行道运动、斜轨缆车运动、栈道运动、索道运动、船头运动、滑雪运动。
  12. 根据权利要求3-7任意一项所述的匹配方法,其特征在于:所述体验设备的体验范围由体验设备所处位置的坐标确定,且各体验设备的设备编号及其坐标值预存于远程服务器(2)中。
  13. 一种智能互动体验系统,其特征在于,包括:智能终端、体验设备和服务终端,其中,
    智能终端,用户随身携带,安装有能够实时获取智能终端当前位置的移动客户端,该客户端与服务终端相对应构成主从式架构以实现交互;
    体验设备,内设感应器,供用户进行体验活动,并在感应器被触发时获取用户进行体验活动时的体验数据;
    服务终端,实时接收体验设备传输过来的体验数据,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹 配,并根据匹配结果将体验数据推送至对应的智能终端上。
  14. 根据权利要求13所述的智能互动体验系统,其特征在于,所述服务终端包括:
    侦测服务器(1),与体验设备相连,用于实时侦测并获取体验设备中新的体验数据,并提取出各体验数据所对应体验设备的设备编号及体验数据的原始获取时间;
    远程服务器(2),一方面通过lan或wifi与侦测服务器(1)相连,用于接收和存储侦测服务器(1)传输过来的体验数据,以及对应体验设备的设备编号和原始获取时间;另一方面与所述智能终端上安装的移动客户端相对应构成主从式架构,将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,并根据匹配结果将体验数据推送至对应的智能终端上。
  15. 根据权利要求14所述的智能互动体验系统,其特征在于:所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
    远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A;
    远程服务器(2)根据体验数据所对应体验设备的设备编号及体验数据的原始获取时间,将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
  16. 根据权利要求15所述的智能互动体验系统,其特征在于,所述远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及该用户所携带智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A,包括:
    侦测服务器(1)获取体验设备上感应器被触发的起始时间t1和结束时间t2, 并将其传输至远程服务器(2);
    用户随身携带的智能终端将t1-t2时间段内其所处位置坐标反馈给远程服务器(2),远程服务器(2)将该反馈与所预存各体验设备体验范围的坐标进行匹配,得到该智能终端在t1-t2时间段内所处位置所对应体验设备的设备编号A。
  17. 根据权利要求14所述的智能互动体验系统,其特征在于,所述将体验数据产生时间段内正处于该体验数据所对应体验设备体验范围内的智能终端与该体验数据匹配,包括:
    远程服务器(2)获取用户触发体验设备上感应器的起始时间点t1和结束时间点t2,以及被触发感应器对应体验设备的设备编号A;
    远程服务器(2)判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,若有,则将原始获取时间位于t1-t2时间段内、且由编号为A的体验设备获取的体验数据,与t1-t2时间段内位于设备编号为A的体验设备体验范围内的智能终端进行匹配。
  18. 根据权利要求17所述的智能互动体验系统,其特征在于,所述远程服务器(2)判断t1-t2时间段内是否有智能终端位于设备编号为A的体验设备体验范围内,包括:
    远程服务器(2)获取t1-t2时间段内、与其建立起所述主从式架构的各智能终端所处的位置坐标,并将其与所预存的设备编号为A的体验设备体验范围的坐标进行匹配,若匹配成功,则表示t1-t2时间段内有智能终端位于设备编号为A的体验设备体验范围内。
  19. 根据权利要求14-18任意一项所述的智能互动体验系统,其特征在于:所述体验设备为一求签祈福设备,包括电子标签,以及与侦测服务器(1)相连且具有感应功能的阅读器,用户求签跌落的电子标签进入阅读器的感应范围后,阅读器向电子标签发送指令,读取电子标签内的数据信息作为体验数据。
  20. 根据权利要求14-18任意一项所述的智能互动体验系统,其特征在于:所述体验设备为与侦测服务器(1)相连、且带有感应和记录涂鸦轨迹功能的涂鸦墙,用户在涂鸦墙上进行涂鸦创作时,涂鸦墙能够感应并记录下涂鸦轨迹, 并形成计算机可读形式的涂鸦作品作为体验数据。
  21. 根据权利要求14-18任意一项所述的智能互动体验系统,其特征在于:所述体验设备包括位于景区的游玩项目,以及与侦测服务器(1)相连且具有红外感应功能的拍摄装置,该拍照装置正对游玩项目布置,用户进入拍摄装置感应范围内后,拍摄装置对其进行拍摄以获取用户在游玩项目上体验时的照片和/或视频作为体验数据。
  22. 根据权利要求21所述的智能互动体验系统,其特征在于:所述游玩项目包括滑道运动、速降运动、溜索运动、骑行道运动、斜轨缆车运动、栈道运动、索道运动、船头运动、滑雪运动。
PCT/CN2015/090862 2014-10-24 2015-09-28 匹配方法及智能互动体验系统 WO2016062187A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410579648.9A CN104410663B (zh) 2014-10-24 2014-10-24 匹配方法及智能互动体验系统
CN201410579648.9 2014-10-24

Publications (1)

Publication Number Publication Date
WO2016062187A1 true WO2016062187A1 (zh) 2016-04-28

Family

ID=52648261

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/090862 WO2016062187A1 (zh) 2014-10-24 2015-09-28 匹配方法及智能互动体验系统

Country Status (2)

Country Link
CN (1) CN104410663B (zh)
WO (1) WO2016062187A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114915647A (zh) * 2021-01-28 2022-08-16 复旦大学 基于微服务的前沿装备远程互动体验系统

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104301777B (zh) * 2014-10-24 2017-10-03 杭州自拍秀科技有限公司 自动获取影像并即时推送至对应智能终端的系统和方法
CN104410663B (zh) * 2014-10-24 2018-07-17 杭州自拍秀科技有限公司 匹配方法及智能互动体验系统
WO2017121403A1 (zh) * 2016-01-15 2017-07-20 杨军 一种数据传输系统及方法
CN105516911B (zh) * 2016-01-15 2019-07-23 杭州大穿越旅游策划有限公司 一种数据传输方法
CN106162552B (zh) * 2016-05-26 2021-08-27 杭州大穿越旅游策划有限公司 一种地图系统及基于该地图系统的交互方法
CN106453470B (zh) * 2016-05-26 2022-05-10 杭州大穿越旅游策划有限公司 一种交互系统及方法
CN106453537B (zh) * 2016-09-30 2024-02-02 杭州大穿越旅游策划有限公司 匹配方法、智能互动体验系统及智能交互系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011107821A1 (en) * 2010-03-05 2011-09-09 Telefonaktiebolaget L M Ericsson (Publ) Node capabilities detection method and system
CN203800960U (zh) * 2014-03-21 2014-08-27 杨荔菲 基于绳索的户外运动拍照系统
CN104410663A (zh) * 2014-10-24 2015-03-11 杭州大穿越旅游策划有限公司 匹配方法及智能互动体验系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103595779B (zh) * 2013-11-01 2017-02-08 深圳市中视典数字科技有限公司 一种实现群体同一时刻拍照及照片分享的方法
CN103888533B (zh) * 2014-03-21 2018-06-05 杭州自拍秀科技有限公司 基于绳索的户外运动拍照系统及照片获取方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011107821A1 (en) * 2010-03-05 2011-09-09 Telefonaktiebolaget L M Ericsson (Publ) Node capabilities detection method and system
CN203800960U (zh) * 2014-03-21 2014-08-27 杨荔菲 基于绳索的户外运动拍照系统
CN104410663A (zh) * 2014-10-24 2015-03-11 杭州大穿越旅游策划有限公司 匹配方法及智能互动体验系统

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114915647A (zh) * 2021-01-28 2022-08-16 复旦大学 基于微服务的前沿装备远程互动体验系统
CN114915647B (zh) * 2021-01-28 2023-08-29 复旦大学 基于微服务的前沿装备远程互动体验系统

Also Published As

Publication number Publication date
CN104410663A (zh) 2015-03-11
CN104410663B (zh) 2018-07-17

Similar Documents

Publication Publication Date Title
WO2016062187A1 (zh) 匹配方法及智能互动体验系统
WO2016062185A1 (zh) 自动获取影像并即时推送至对应智能终端的系统和方法
US10701448B2 (en) Video delivery method for delivering videos captured from a plurality of viewpoints, video reception method, server, and terminal device
US9788065B2 (en) Methods and devices for providing a video
CN204392413U (zh) 拍摄系统
US9942455B2 (en) Timing system and method with integrated participant event image capture management services
US10356183B2 (en) Method for sharing photographed images between users
US20130307988A1 (en) Rfid tag read triggered image and video capture event timing method
CN103189864A (zh) 用于确定个人的共享好友的方法、设备和计算机程序产品
JP5947044B2 (ja) 画像収集システム、画像撮像装置、画像記憶装置
WO2018059536A1 (zh) 匹配方法、智能互动体验系统及智能交互系统
CN106105231A (zh) 在移动设备上标识媒体
CN112805722A (zh) 减少面部识别中的误报的方法和装置
CN107220856B (zh) 一种移动消费群组识别的系统及方法
CN104426937A (zh) 通过移动电话和云计算实现对其他设备所记录内容的定位
TWI547883B (zh) 影像擷取方法及系統,及相關電腦程式產品
JP6115113B2 (ja) 所定領域管理システム、所定領域管理方法、及びプログラム
CN201860379U (zh) 便携式无线网络高清摄录设备
TW201401070A (zh) 資料傳輸系統和電子裝置
CN112437332B (zh) 一种目标多媒体信息的播放方法和装置
JP2019083532A (ja) 画像処理システム、画像処理方法および画像処理プログラム
US20190253371A1 (en) Systems and methods for sharing captured visual content
US20170048654A1 (en) Information processing apparatus, information processing method, and program
CN105827724A (zh) 一种智能识别并反馈信息的方法及系统
WO2017121403A1 (zh) 一种数据传输系统及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15852932

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 04/09/2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15852932

Country of ref document: EP

Kind code of ref document: A1