Unifying Intuiface and Unity: The Creation of Intui3D Gallery

Welcome to the Intui3D Gallery, an exploration of the synergistic potential of Unity and Intuiface. This prototype showcases an integrated experience combining the interactive user interface of Intuiface with the dynamic 3D environment of Unity. With a simple selection from the Intuiface carousel, Unity directs the camera to the chosen artifact, offering flexible object rotation and viewpoint adjustment. Our gallery stands out for its global reach: each artifact is a 3D scan created by individuals around the world using the Polycam iOS app, bringing a diverse array of cultural heritage into a single digital space.

How it Works:

Our prototype operates through two separate applications: Intuiface running in the foreground and Unity running in the background. The Intuiface experience features a vertical asset flow sourced from an Excel database, which includes the artifact name, the person who scanned it, the location, and two URLs that serve as remote actions.

These URLs are integral to our prototype’s operation. The first, the Pin Remote Action, is a convenient trick that lets us show geographic locations on a 3D world map using a command URL built dynamically in Excel.
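The post doesn’t show the formula itself, but the gist is that Excel concatenates each row’s data onto a base action URL, so every row carries a ready-to-fire command. A sketch of such a formula, with a hypothetical base URL and cell reference:

```
="http://example-action-base/pin?location=" & ENCODEURL(C2)
```

Here C2 would hold the artifact’s location, and Excel’s ENCODEURL escapes spaces and special characters so the resulting URL stays valid.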

The second, the Unity Remote Action, sends a command to the Unity app running in the background.
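Concretely, such a remote-action URL might look like the line below; the host, port, path, and parameter are illustrative guesses, and the Unity-side handling is sketched further down:

```
http://localhost:8080/focus?artifact=etruscan-vase
```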

[Unity remote control demo]

Thanks to Intuiface’s transparent background support, users can interact with both Intuiface and Unity simultaneously, creating a smooth and immersive user experience.

The Unity app showcases a 3D gallery featuring artifacts captured with iPhone devices. Rotating the artifacts, adjusting viewpoints, transitioning smoothly between them, and applying lighting and particle effects are all handled within Unity. The challenge was to manage this 3D space from the Intuiface experience.
The integration between Unity and Intuiface is achieved by calling a URL from Intuiface and reading that URL in Unity using HttpListener. Each action is assigned its own unique URL, so Unity executes the intended function based on the URL it receives. This lets us keep Intuiface and Unity in sync and trigger the right actions as the user scrolls through the asset flow.
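The downloadable build doesn’t include source, but a minimal Unity-side sketch of this pattern could look like the following. The port, URL paths, and method names are assumptions for illustration, not the actual project code:

```csharp
using System;
using System.Collections.Concurrent;
using System.Net;
using System.Threading;
using UnityEngine;

// Listens for Intuiface's remote-action URLs and maps each one to a gallery action.
public class IntuifaceBridge : MonoBehaviour
{
    private HttpListener _listener;
    private Thread _listenThread;
    // HttpListener work happens off the main thread, and Unity's API must be
    // called from the main thread, so completed requests are queued for Update().
    private readonly ConcurrentQueue<Action> _pending = new ConcurrentQueue<Action>();

    void Start()
    {
        _listener = new HttpListener();
        _listener.Prefixes.Add("http://localhost:8080/"); // this binding triggers the firewall prompt
        _listener.Start();
        _listenThread = new Thread(Listen) { IsBackground = true };
        _listenThread.Start();
    }

    void Listen()
    {
        while (_listener.IsListening)
        {
            HttpListenerContext ctx;
            try { ctx = _listener.GetContext(); }    // blocks until a request arrives
            catch (HttpListenerException) { break; } // listener was stopped
            string path = ctx.Request.Url.AbsolutePath;
            string artifact = ctx.Request.QueryString["artifact"];
            ctx.Response.StatusCode = 200;
            ctx.Response.Close();
            // One unique URL per action, as described above.
            switch (path)
            {
                case "/focus": _pending.Enqueue(() => FocusOn(artifact)); break;
                case "/reset": _pending.Enqueue(ResetView); break;
            }
        }
    }

    void Update()
    {
        while (_pending.TryDequeue(out var action)) action();
    }

    void FocusOn(string artifact) { /* move the camera to the named artifact */ }
    void ResetView() { /* return to the gallery overview */ }

    void OnDestroy() => _listener?.Stop();
}
```

The firewall warning mentioned in the setup steps below is presumably this listener binding to its port.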

Try it Out:

We invite you to download and experience the Intui3D Gallery on your own PC.

  • Download and unzip the Unity app (50 MB).
  • Download and unzip the Intuiface XP (10 MB).
  • Launch the Unity file, Museum.exe. Please note you might receive a security warning from the Windows Firewall. Simply select “Allow.” The app will launch in full-screen mode.
  • Run the Intuiface experience, scroll through the asset flow, rotate artifacts, and adjust your viewpoint.
  • Tap the experience title to reset the gallery view, or the “X” button to quit the Unity app.

Notable Points:

  • This prototype builds on our previous Tosoville demo, but is more efficient because it removes the need for middleware.
  • Asking @Seb for confirmation, but I’m pretty sure this demo is compatible only with Player for Windows, not Player NextGen.

We’d love to hear from you: What potential use cases do you envision for this kind of integration?


You did it again, @tosolini! Once again, I love your mash-ups!

And you’re right: at the moment, only our Player for Windows has a transparent background capability that lets touch events pass through it.
But…
If you were able to export your Unity app as a web app… you might be able to host it in our web browser asset / HTML Frame asset, then use something like WebSockets to communicate between the two environments (Intuiface & Unity).
Just saying… :wink:
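For anyone tempted by that route, a bare-bones Unity-side client could look something like this untested sketch. The endpoint and message format are made up, and note that a WebGL export can’t use System.Net.WebSockets directly; you’d need a JavaScript interop plugin or a WebGL-capable WebSocket library:

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Connects to a WebSocket endpoint and forwards each text message
// (e.g. "focus:vase" or "reset") to a handler.
public static class IntuifaceSocket
{
    public static async Task ListenAsync(Uri endpoint, Action<string> onMessage)
    {
        using (var ws = new ClientWebSocket())
        {
            await ws.ConnectAsync(endpoint, CancellationToken.None);
            var buffer = new byte[4096];
            while (ws.State == WebSocketState.Open)
            {
                var result = await ws.ReceiveAsync(
                    new ArraySegment<byte>(buffer), CancellationToken.None);
                if (result.MessageType == WebSocketMessageType.Close) break;
                onMessage(Encoding.UTF8.GetString(buffer, 0, result.Count));
            }
        }
    }
}
```

Something still has to play the server role between the two environments (a small local relay, for instance); that plumbing is left open here.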


@seb you always lure me into exploring the next frontier (in this case Unity on the web + WebSockets) :slight_smile: Thanks for the tip. I’ll consider it for a future mash-up.

This is a fantastic pairing of two powerful applications. As someone who has recently become addicted to VR, I am starting to learn Unity because of its integration with the particular platform I am active on, as well as its ease of workflow with Blender. I am excited to see the possibilities of combining my new personal hobby with my professional knowledge of using Intuiface to create corporate interactives. Bravo, Paolo!


Thanks @cullenb. Glad to hear this demo may open up new creative explorations for you.

@cullenb Time for you to start playing with Player NextGen, hosting Intuiface Experiences on the web and playing them in a web browser in your VR environment :wink:


Very cool! I bet @geoff is already thinking how to take a future Intuiface event into the metaverse. I’m wondering if these environments (Virbela, Spatial, etc.) allow avatars to fully engage with live web sites (e.g. Player NextGen experience) or have some limits.

@tosolini, the costs have to come way down for the metaverse to be a viable option for events. As it is, most people won’t want a headset strapped to their head for any length of time, so it’ll be more of a 2D event space on their computer screen. The novelty would attract some folks, and networking becomes possible, but currently, the value-add is dubious.

As you can see from those photos, we played with it a couple of years ago. We’ll keep our eyes on it!


We tried Virbela several years ago indeed (with my Lenovo Windows Mixed Reality headset) and, more recently, EngageVR with our Quest 2s.
Engage was probably out of budget for most of us (though, last time I checked, they had more public pricing info), but I did attend a live VR concert they organized recently. The experience was “fun”, but the bandwidth required / data to download was huge (for me and my “nomad connectivity”), and the interaction with other users was… chaotic at best…

Regarding PLN, the limit remains how precise your avatar can be when “touching” a virtual 2D screen in a 3D environment. Back to big-button XPs :smiley:
