I have noticed that Xcode is capable of running UIKit code for IBDesignables without launching the simulator at all. It seems to be using the IBDesignablesAgentCocoaTouch tool, which is a macOS app. How can something like that be implemented?
iOS apps compiled for x86_64 are actually normal macOS binaries, just linked against different frameworks. The simulator provides the runtime needed for all those frameworks to function in a manner similar to how they do on iOS hardware. The IBDesignablesAgentCocoaTouch daemon spins up enough of the iOS subsystem to be able to take a snapshot of your UI every time you make a change in Xcode.
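To give a concrete idea of what the agent actually executes, here is a minimal @IBDesignable view (the class and property names are purely illustrative). Interface Builder compiles code like this and asks the agent to render it, calling prepareForInterfaceBuilder() only for the canvas preview:

```swift
import UIKit

// A typical IBDesignable view: this is the kind of UIKit code that
// IBDesignablesAgentCocoaTouch builds and renders for the Xcode canvas
// without booting a simulator device. All names here are illustrative.
@IBDesignable
final class BadgeView: UIView {

    @IBInspectable var badgeColor: UIColor = .systemRed {
        didSet { backgroundColor = badgeColor }
    }

    @IBInspectable var cornerRadius: CGFloat = 8 {
        didSet { layer.cornerRadius = cornerRadius }
    }

    // Called only when rendering inside Interface Builder, so placeholder
    // data can be substituted for anything the live app fetches at runtime.
    override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        backgroundColor = badgeColor
        layer.cornerRadius = cornerRadius
    }
}
```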
Theoretically, it is possible to spin up the iOS subsystem, partially or fully, in a different manner than the simulator does, but doing so most likely infringes the developer license agreement you accepted when installing Xcode, and it would certainly not be legal to distribute. Legal or not, it would be quite an undertaking, and very likely to break with each change to the system. A look at how much the simulator runtime has changed over recent years gives a feel for how hard this would be for non-Apple developers.
If you need to run non-UI code for testing, for instance, there are far easier ways to achieve what you need, such as creating a macOS target, including all your non-UI code in it, and stubbing out the absolute minimum of UI code needed for testing; a minimal sketch follows.
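For example, one way to structure this (all names here are hypothetical) is to keep the shared logic free of UIKit imports and guard the thin UI layer with canImport(UIKit), so the same sources compile in both the iOS app target and a macOS test target:

```swift
import Foundation

#if canImport(UIKit)
import UIKit
#endif

// Hypothetical non-UI type shared between the iOS app target
// and a macOS test/tool target. It depends only on Foundation.
struct InvoiceFormatter {
    func formattedTotal(for amounts: [Decimal]) -> String {
        let total = amounts.reduce(Decimal(0), +)
        return "Total: \(total)"
    }
}

// Only the thin UI layer is compiled for iOS; the macOS target
// never sees UIKit at all.
#if canImport(UIKit)
final class InvoiceLabelFactory {
    func makeLabel(for amounts: [Decimal]) -> UILabel {
        let label = UILabel()
        label.text = InvoiceFormatter().formattedTotal(for: amounts)
        return label
    }
}
#endif
```

The macOS target can then exercise InvoiceFormatter in ordinary XCTest unit tests, with no simulator involved at all.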
If your goal is multiplatform development, writing code against iOS frameworks and then running it with Apple's iOS frameworks outside of the simulator is not legal (iOS frameworks are not normally distributed with macOS and require installing Xcode), and it is not a good idea either, as users expect applications to behave natively on the platform they are running on. iOS UI concepts are designed for touch input and would be out of place in a precision-pointer environment.