OK, I’ve been using Intuiface for years. Surprisingly, I can still get confused by some basic things, so I’m going to post this publicly because I think others may have the same challenge.
Let’s talk about the keyboard. Here are my questions:
1. What is the difference between the keyboard in the Experience properties and the keyboard in the Scene properties?
2. Why are there built-in triggers to move the Experience keyboard, but no triggers to move the Scene keyboard?
3. Why does the keyboard behave so unpredictably when different text inputs in the same scene need the keyboard in different places? I’ve built many XPs, and I often see the keyboard pop up in the wrong location, at the wrong size, etc., but then I run the XP again and it works just fine.
Maybe this is just wishful thinking, but wouldn’t it be nice if each text input had its own keyboard and properties? That way, multiple people could enter data on the same screen at the same time. Or, at minimum, put keyboard properties in the text input item itself, so that whenever that input goes into edit mode, the keyboard uses the size/position set for that text input, and if another text input is touched, the keyboard follows the size/position set by that one.
Maybe there are some OS limitations behind my requested features. But the keyboard is like a printer: to make it work, you need to double-check everything, sing to it softly, and tell it that it’s safe to play nicely.