Performance of Desktop Apps running on virtualized servers

I have a Desktop App running on a Windows server, accessed via Remote Desktop. When the app runs on a non-virtualized server or a regular PC, performance is reasonably fast. However, on a virtualized Windows server, it becomes significantly slower.

Is there a way to force software rendering to bypass dependency on the virtualized GPU?

Any suggestions to improve the performance of a Xojo app in virtualized environments?

Allocate more vCPUs. Allocate more memory. Get a dedicated server.

Remember, most virtualization is on shared servers where there are multiple tenants. To ensure quality for everyone, you only get the resources you pay for.

Thanks @Greg_O.
In my case it is not a shared server; it is a dedicated one.
What I noticed is that the same server, non-virtualized versus virtualized, running the same application, shows a very significant difference in performance. I want to know whether there is any special Xojo compilation setting, statement, or optimization that could help performance on virtualized servers.

Hello,
Is the GUI required? Minimizing the window makes it run much faster.

As early as possible (e.g. App.Opening):
System.EnvironmentVariable("XOJO_D2D_SOFTWARE_RENDERING") = "True"

Or set this environment variable globally on that system, so that it is already set and available before you launch the Xojo-built application.
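For reference, a minimal sketch of where that assignment could live, assuming a Xojo Desktop project built for Windows; the variable name is the one given above, and the conditional-compilation guard is just one way to keep the line from running on other targets:

' App.Opening event handler (sketch)
#If TargetWindows Then
  ' Ask Direct2D to fall back to software rendering before any window
  ' is created, so drawing no longer depends on the virtualized GPU.
  System.EnvironmentVariable("XOJO_D2D_SOFTWARE_RENDERING") = "True"
#EndIf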
