I have a Desktop App running on a Windows server, accessed via Remote Desktop. When the app runs on a non-virtualized server or a regular PC, performance is reasonably fast. However, on a virtualized Windows server, it becomes significantly slower.
Is there a way to force software rendering to bypass dependency on the virtualized GPU?
Any suggestions to improve the performance of a Xojo app in virtualized environments?
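As far as I know there is no single Xojo build switch that forces software rendering, but if the slowdown is mostly in drawing, one workaround is to detect the Remote Desktop session and dial back the expensive drawing while it is active. A minimal, untested sketch, assuming a Windows desktop build (the method name RunningInRemoteSession is just an illustration; GetSystemMetrics and SM_REMOTESESSION are standard Win32):

Function RunningInRemoteSession() As Boolean
  ' True when the app is running inside a Remote Desktop session (Windows only).
  #If TargetWindows Then
    Declare Function GetSystemMetrics Lib "User32" (nIndex As Int32) As Int32
    Const SM_REMOTESESSION = &h1000
    Return GetSystemMetrics(SM_REMOTESESSION) <> 0
  #Else
    Return False
  #EndIf
End Function

The idea is not to change how Xojo renders but to render less: skip gradients, shadows, and animated repaints whenever this returns True.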
Allocate more vCPUs. Allocate more memory. Get a dedicated server.
Remember, most virtualization is on shared servers where there are multiple tenants. To ensure quality for everyone, you only get the resources you pay for.
Thanks @Greg_O.
In my case it is not a shared server; it is a dedicated one.
What I noticed is that the same server, non-virtualized versus virtualized and running the same application, shows a very significant performance difference, and I want to know if there is any special Xojo compilation setting, statement, or optimization that could help performance on virtualized servers.
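As far as I know there is no compilation setting aimed specifically at virtualized hosts; the 64-bit Optimization Level in the build settings mainly affects compiled code speed, not drawing over RDP. What usually matters more is keeping Paint work cheap and double-buffered. A rough sketch of a Canvas.Paint handler, reusing the hypothetical RunningInRemoteSession method from the sketch above (DrawDetailedChart stands in for whatever expensive drawing the app does):

Sub Paint(g As Graphics, areas() As Rect)
  ' Set Me.DoubleBuffer = True (Inspector or Opening event) to cut flicker on Windows/RDP.
  If RunningInRemoteSession Then
    ' Cheap path for remote or virtualized sessions: flat fill, no gradients or effects.
    g.DrawingColor = &cFFFFFF
    g.FillRectangle(0, 0, g.Width, g.Height)
  Else
    ' Full-quality drawing when running locally.
    DrawDetailedChart(g) ' hypothetical helper containing the expensive drawing
  End If
End Sub

It is also worth checking that the app is not calling Refresh or Invalidate from a fast Timer, since every repaint has to travel over the RDP channel.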