Integration with IFTTT so apps could be connected with Alexa commands and more

I wish Intuilab’s software was integrated to work with IF THIS THEN THAT (IFTTT) statements. There would be many possibilities, like using Amazon Alexa commands to move to a scene in a project, and so much more.

Hi @raythorsky,

We have some ideas to make IFTTT & IntuiFace compatible using their Maker webhooks. I’m tagging Paolo @tosolini since I know he’s also interested in this.
Stay tuned, we’ll come back to you when we have a clearer vision of what we can do to make this happen.



Hi Seb, I am also interested. Please let me know of any further developments.


I just found a way to do it, it is not ideal but it works.

IF new email on Gmail (you can specify a query) THEN whatever (example: send an SMS)…

Hi Marcelo, yes nice job, I tried something like that too but I can’t figure out how to send an action to my Intuilab project from Amazon’s Alexa specifically. I have some developers looking into it and will post to the community if we find a way.


One solution that just came to my mind: IF Alexa (trigger) THEN use the IFTTT service called Webhooks, which lets you send an HTTP request to IntuiFace.

The new version of IF has a property called External Source, where you can receive an HTTP request with parameters that are needed to raise a trigger.

Hope it helps

The solution we are investigating is effectively using the Maker webhooks. The “issue” is that you need to be able to call that URL from the web to target a specific player, which you can’t do today unless your player machine is directly addressable via an IP address plus port redirection.

We have some architecture ideas, based on a server that would listen to those web calls and to which a player would register to “receive some notifications”. We’ll investigate that in the coming weeks, depending on internal bandwidth.

Hi @Seb @marcelo @tosolini

I am happy to hear you are working on a solution for this, as this would be a very “COOL” feature to be able to freely talk to the Intuilab project (without having to touch a button on the application). I think many Intuilab developers would also like this feature, since it adds a whole new way to interact with the application.

It’s been about 30 days since your response saying “We’ll investigate that in the coming weeks depending on internal bandwidth,” so I wanted to check back in and see if there are any updates.

Thanks, Ray

@raythorsky I don’t have any news on my end about the IFTTT concept.
@Seb Have you released anything new along these lines?

Hi @raythorsky and Paolo @tosolini,

We did some investigations and have a pretty clear idea of what we’ll do, if it’s doable. The big question is the “when,” considering summer vacations and all the things already on the short-term roadmap.
I should know more about a first step at the end of this week, then around the middle of August.


Hi Seb, I wanted to check back and see if there is any update on Intuilab connecting to Amazon Alexa.



We made some progress and have a functional proof of concept. Some other tasks came in with higher priorities, so this feature is postponed for the moment.

The info I can give you is that this feature could be built by anyone outside of the IntuiFace team, provided they have some development skills.
The architecture will be something like this:

  • a Node.JS server that will
    • listen to HTTP requests coming from the web (ex, IFTTT, Zapier, Alexa, …)
    • host a websocket server to communicate with IntuiFace Players in the field
  • a JavaScript Interface Asset, to be added to the experiences that should be reachable from the web. This IA will act as a websocket client and receive “commands” from the Node.JS server.
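To illustrate the server side of that architecture, here is a hedged sketch (class and method names are hypothetical, not IntuiFace's actual implementation) of the registry that bridges incoming web calls to registered Players:

```javascript
// Hypothetical server-side registry: maps a player id to the live
// websocket "send" function for that player's Interface Asset.
class PlayerRegistry {
  constructor() {
    this.players = new Map();
  }

  // Called when a Player's IA opens its websocket connection.
  register(playerId, send) {
    this.players.set(playerId, send);
  }

  // Called when that websocket closes.
  unregister(playerId) {
    this.players.delete(playerId);
  }

  // Called by the HTTP layer when IFTTT / Zapier / Alexa hits the server.
  // Returns false if the target player is unknown or offline.
  dispatch(playerId, command) {
    const send = this.players.get(playerId);
    if (!send) return false;
    send(JSON.stringify(command));
    return true;
  }
}
```

The wiring around it (not shown) would be: an HTTP endpoint, e.g. a `POST /players/<playerId>/commands` route, calls `registry.dispatch(playerId, body)`, while the websocket server (the `ws` npm package is one common choice) calls `register`/`unregister` on connection open and close.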

The “complex” part on our side is to create the one architecture that works for all our users, including authentication & security matters.

It would actually take less time for a developer to create this architecture for their own needs.



Hi Seb,

Any chance there’s an example of the needed JS IA one could work from?


Not yet, but we’re getting close :slight_smile:

Regarding the IA side, since it is written in JavaScript, you’ll be able to see the code inside when we release it.
As I said before, the mechanism relies on a websocket with the following process:

  • When loaded, the IA opens a websocket with a Node.JS server. The server registers this connection and keeps it alive.
  • When the server receives an external command (via a web API), such as an IFTTT “Make a web request” action, the server sends this information to the IA through the websocket.
  • When the IA receives some information from the websocket, it raises a trigger to the experience.
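The IA side of that process could look something like the sketch below (message format and the `raiseTrigger` name are assumptions for illustration, not the actual Interface Asset API):

```javascript
// Hypothetical IA-side handler: turn a websocket message into an
// experience trigger. `raiseTrigger` stands in for whatever the real
// Interface Asset API uses to raise triggers.
function handleServerMessage(raw, raiseTrigger) {
  let msg;
  try {
    msg = JSON.parse(raw);
  } catch (err) {
    return false; // ignore malformed payloads
  }
  if (!msg || typeof msg.name !== "string") return false;
  raiseTrigger(msg.name, msg.params || {});
  return true;
}

// Wiring idea, assuming a browser-like WebSocket in the IA runtime:
// const socket = new WebSocket("ws://example-server:8080/?playerId=my-player");
// socket.onmessage = (event) => handleServerMessage(event.data, raiseTrigger);
```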

We are now working on the API keys mechanism to make sure that @tosolini won’t be able to send commands to @gordon_johnson devices :smiley:


I always wanted to hack somebody else using Intuiface and IFTTT. Your security feature will crash my dream, Seb :slight_smile:

Thanks for the update, Seb. Looking forward to the new release.

@tosolini Paolo, I’ll actually have an IFTTT-based demo at the ISE trade show without using this new “web trigger” mechanism :slight_smile:

I’ll present this to you at our booth 8-P270, but it uses an iPad that calls an IFTTT query that changes an Airtable base used on a passive signage wall screen, plus a remote action.


Dear Community, please check out our latest announcement here. You’ll be pleased…