Implementing a video queue based on proximity sensors

My company is looking to create a touchless experience for one of our clients; here is the description:

Our plan for this project is to have 5 plinths that resemble obelisks made of translucent white acrylic (“milk plex”), each housing a proximity sensor and an LED strip. These would connect to one or more USB controller modules, which would then connect to a single Intel NUC mini PC running IntuiFace.

A visitor to our client’s trade show exhibit would approach 1 of 5 branded plinths, each representing 1 of 5 product categories. The LED strips inside the plinth will be triggered by the sensor at 90 cm to turn on at a low intensity and become brightest once the visitor is within 15 cm of the sensor. At this point, a video for that product will play on a large LED tile wall (supplied as a rental asset by our company).
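
To give a sense of the intended ramp, here is a minimal C# sketch of the distance-to-brightness mapping. The 90 cm / 15 cm thresholds are the ones from the brief above; the 0–255 output scale and the “low intensity” floor are just placeholder assumptions:

```csharp
// Map a proximity reading (cm) to an LED brightness level.
// The 90 cm / 15 cm thresholds come from the brief; the 0-255 output
// range and the "low intensity" floor of 40 are illustrative assumptions.
public static class PlinthBrightness
{
    const double FarThresholdCm  = 90.0; // sensor first reacts here
    const double NearThresholdCm = 15.0; // full brightness at or below this
    const byte   LowIntensity    = 40;   // assumed "low intensity" level
    const byte   MaxIntensity    = 255;  // assumed full-brightness level

    public static byte FromDistance(double distanceCm)
    {
        if (distanceCm > FarThresholdCm)   return 0;            // nobody close enough: dark
        if (distanceCm <= NearThresholdCm) return MaxIntensity; // within 15 cm: brightest

        // Linear ramp between the two thresholds.
        double t = (FarThresholdCm - distanceCm) / (FarThresholdCm - NearThresholdCm);
        return (byte)(LowIntensity + t * (MaxIntensity - LowIntensity));
    }
}
```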

Once the video is finished, the LED wall will return to a looping “HOME” video and the plinth will go dark. If another visitor approaches a different plinth than the “active” one, its LEDs will pulse as if waiting in queue. The proximity sensor will queue that plinth’s video, which waits until the active video is complete, and its LED strip will change from a pulse to a solid glow once its video begins to play.
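
Just to make the per-plinth behaviour concrete, it boils down to a three-state LED machine (dark, pulsing while queued, solid while its video plays). The names in this sketch are purely illustrative and don’t map to anything inside IntuiFace:

```csharp
// Illustrative sketch of the per-plinth LED states described above.
// None of these names map to IntuiFace internals; they only model the behaviour.
public enum PlinthLedState { Dark, Pulsing, Solid }

public class Plinth
{
    public int SensorId { get; }
    public PlinthLedState LedState { get; private set; } = PlinthLedState.Dark;

    public Plinth(int sensorId) => SensorId = sensorId;

    // Visitor approaches while another plinth's video is playing: wait in queue.
    public void EnterQueue() => LedState = PlinthLedState.Pulsing;

    // This plinth's video starts on the LED wall: glow solid.
    public void StartPlayback() => LedState = PlinthLedState.Solid;

    // Video finished (or visitor left): go dark again.
    public void Reset() => LedState = PlinthLedState.Dark;
}
```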

The proximity sensor integration will be easy enough, but I was wondering whether IntuiFace supports the ‘waiting video’ queue behavior described above. Left to my own druthers, I would simply write a queue implementation in C#, but I am trying to find out if there are IntuiFace internals that already support this functionality.

Is there some internal support for this, or a pre-scripted solution? If not, what are your recommendations on how to go about the queue implementation?

Hi Anthony,

You could do this with either the Nexmosphere sensors or the Phidgets sensors; they both do the same thing, but it’s down to user preference.

To trigger content with the Phidgets sensor, you can do one of two things: either create a trigger for when the values change on a global variable and then play the video, or create an If statement on the Phidgets 8-8-8-8 interface asset.

With the Nexmosphere sensors there are three versions, but, for example, using the small sensor with a range output of 10-1 (1 being closest to the sensor), you create a trigger on the asset itself to say that if the address/command is received, then play the video.

But depending on your scenario, you could say that whoever is at sensor 1 first wins: stop the detection from sensors 2, 3 and 4, and then, once the range on sensor 1 goes back above a chosen value, whatever value you choose to trigger the content can be triggered on any of the other sensors again.
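
If it helps to picture that outside of Intuiface, the “first sensor wins” idea is roughly the sketch below, using the 10-1 range scale (1 = closest). The trigger/release thresholds and the class name are just assumptions for illustration:

```csharp
using System.Collections.Generic;

// Rough sketch of "first sensor wins" arbitration over range readings.
// Range values follow the 10-1 scale (1 = closest); the trigger/release
// thresholds and naming are illustrative assumptions, not any sensor API.
public class SensorArbiter
{
    const int TriggerRange = 3;  // assumed: at or below this, a visitor is "at" the plinth
    const int ReleaseRange = 8;  // assumed: above this, the visitor has walked away

    int? _activeSensor;          // sensor currently holding the lock, if any
    readonly Dictionary<int, int> _latestRange = new Dictionary<int, int>();

    // Returns the sensor id that should trigger content, or null if none.
    public int? OnRangeReading(int sensorId, int range)
    {
        _latestRange[sensorId] = range;

        // Release the lock once the active visitor moves far enough away.
        if (_activeSensor == sensorId && range > ReleaseRange)
            _activeSensor = null;

        // While a sensor holds the lock, ignore detections from the others.
        if (_activeSensor != null)
            return null;

        if (range <= TriggerRange)
        {
            _activeSensor = sensorId;   // first one there wins
            return sensorId;
        }
        return null;
    }
}
```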

I have built something like this before with the Nexmosphere sensors, 10 sensors in total, so you can imagine the If statements needed and the amount of testing.

I hope this helps. If not, have a look at the documentation in the support KB, or give @Seb a shout, unless he has anything else to add to what I’ve said above.

Kind Regards

Louie

Hi @amesa and thanks @Promultis for your insights :wink:

Regarding the sensors, I won’t add much to what Louie said. Either with Nexmosphere or Phidgets, you can handle the sensor side pretty easily. Nexmosphere will give you the LED output control as well.

Regarding your queueing algorithm, I can see a few ways to build it in Intuiface. I think I’d give the Excel solution a try first and see if it can do the job:

  • When a sensor is triggered:
    • Add a row in Excel with the necessary info (sensor number, video to play, LED to control, …)
    • If no video is playing yet:
      • Play the video in Row 1 and update its LEDs
      • Use a Global Variable to store the “a video is playing” information (use it as a boolean)
  • When a video ends playing:
    • Look at Row 1 in Excel and reset its associated LEDs
    • Delete Row 1
    • If the Excel number of rows > 0:
      • Play the video listed in Row 1 and update its LEDs
    • If the Excel number of rows = 0:
      • Restore the initial status
      • Set “a video is playing” to false

Or something like that :slight_smile:
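
If you do end up scripting it yourself in C# instead, the same steps translate roughly to the sketch below, with an in-memory list standing in for the Excel sheet (Row 1 = index 0). The type and member names are illustrative, not Intuiface or Excel API:

```csharp
using System.Collections.Generic;

// In-memory translation of the Excel-row queue above: each "row" holds the
// sensor number, the video to play and the LED strip to control.
// Class and member names are illustrative, not Intuiface or Excel API.
public record QueueRow(int SensorId, string VideoPath, int LedChannel);

public class ExcelStyleQueue
{
    readonly List<QueueRow> _rows = new();            // stands in for the Excel sheet
    public bool VideoIsPlaying { get; private set; }  // the "a video is playing" global

    // "When a sensor is triggered": add a row; start playback if nothing is active.
    public QueueRow? OnSensorTriggered(QueueRow row)
    {
        _rows.Add(row);
        if (VideoIsPlaying) return null;     // someone else is active: just wait in line
        VideoIsPlaying = true;
        return _rows[0];                     // play Row 1 and light its LEDs
    }

    // "When a video ends playing": drop Row 1 and return the next row, if any.
    public QueueRow? OnVideoEnded()
    {
        if (_rows.Count > 0) _rows.RemoveAt(0); // reset Row 1's LEDs, then delete it
        if (_rows.Count > 0) return _rows[0];   // more visitors waiting: play the next one
        VideoIsPlaying = false;                 // queue empty: back to the HOME loop
        return null;
    }
}
```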

Thank you so much for all of your feedback! I implemented the queue as an ObservableCollection with dequeue and enqueue methods in C#, though the Excel implementation certainly gives me something to keep in mind in the event we are dealing with data that clients need to update in the IntuiFace directory themselves.
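
For anyone landing on this thread later, the gist of that queue looks something like the sketch below (simplified, not the actual project code; the VideoRequest type is just a placeholder):

```csharp
using System.Collections.ObjectModel;

// Minimal sketch of an ObservableCollection-backed FIFO queue with
// Enqueue/Dequeue, along the lines described above. Not the actual
// project code; the VideoRequest type is an illustrative assumption.
public record VideoRequest(int PlinthId, string VideoPath);

public class VideoQueue
{
    // ObservableCollection raises CollectionChanged, which the UI or the
    // LED-control layer can subscribe to (e.g. to start/stop pulsing).
    public ObservableCollection<VideoRequest> Items { get; } = new();

    public void Enqueue(VideoRequest request) => Items.Add(request);

    // Returns the next request, or null when the queue is empty
    // (at which point the wall falls back to the HOME loop).
    public VideoRequest? Dequeue()
    {
        if (Items.Count == 0) return null;
        var next = Items[0];
        Items.RemoveAt(0);
        return next;
    }
}
```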

This sounds so cool! Do you have any video to demonstrate how it came out? :smiley: