Testing with VMs vs Real Hardware

Been a while since I posted a question here…

With most of my apps, I like to make sure they run on Windows, macOS, and Linux. I usually test on VMs first and then on real hardware, and so far I haven't run into any issues.

Now I know it's always a good idea to make sure the software runs fine on real hardware, especially if it's going to be a commercial product. I have Macs and Windows systems, but I'm selling the one Linux box I have and trying to decide if my VMs will be good enough.

What’s everyone else doing?

I develop under macOS, but have VirtualBox VMs for Linux and Windows 11, plus a real Raspberry Pi.

I test Windows and Linux on Parallels, but I had a problem last year when the VPN software I needed was x86 Windows only, not ARM Windows, so I bought a Lenovo laptop for final testing.

In hindsight, I should have used the macOS version of the VPN software, and Parallels Windows would have worked!

The Lenovo PC is gathering dust, except for the odd game or two…

I find it really depends on what you're doing and which hardware architectures you're targeting.

For example, if you develop on an ARM Mac but target Windows x86, certain operations can behave differently on the ARM version of Windows, so a VM doesn't reflect the native experience. I've experienced this especially when doing things that use GPU acceleration or otherwise tap into hardware. I've actually had better results running a macOS VM on Windows than the other way around. But of course then you can't test macOS ARM-specific behavior either.

My advice would be that if you're simply building standard desktop apps that don't do anything fancy, you'll probably be fine with VMs. But for anything involving advanced math, hardware access, or other special circumstances (such as shelling out to different command-line utilities based on architecture), you probably should invest in the hardware for testing. This is why I still keep an Intel Mac around too.
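To make the "shelling out based on architecture" case concrete, here's a minimal Python sketch of picking an architecture-specific build of a bundled helper tool at runtime. The `helpers/...` directory layout is hypothetical; adapt it to however your app actually ships its binaries.

```python
import platform

def tool_path(base_name: str) -> str:
    """Return the path to the right build of a bundled CLI helper.

    The 'helpers/<arch>/' layout is illustrative, not a standard.
    """
    arch = platform.machine().lower()
    if arch in ("arm64", "aarch64"):
        return f"helpers/arm64/{base_name}"
    elif arch in ("x86_64", "amd64"):
        return f"helpers/x86_64/{base_name}"
    raise RuntimeError(f"unsupported architecture: {arch}")
```

Note this is exactly the kind of branch a VM can mislead you about: under emulation, `platform.machine()` reports the guest's architecture, which may not match what your customer's native hardware reports.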

I'd second this. When I replaced my MacBook with an ARM machine, I set up a Windows ARM VM and was concerned about testing. So I also bought a Dell laptop to test on real Intel hardware.
I have to say, complete waste of money. It's been used for a total of about 12 hours, and now lies dusty in a drawer. The ARM machine (for me) is fast and reliable, and so far no issues re: Intel/ARM hardware vs. VM.

Wipe it and sell/give it to a college student.


Every now and then I get a bug that can’t be resolved via a VM so it’s nice to have a real Intel Windows laptop to fall back to. But 99% of the time a VM works just fine.

Of the bugs that can't be fixed in a VM, it's usually down to Windows OS features that the VM hides, or at least manages somewhat. I'm talking about canvas drawing issues: Windows handles double buffering differently than the Mac does, so the VM can hide some of them. And then anything hardware related.

Lots of great feedback! I actually have a few Intel-based Macs, an M1 Mac mini, and an M1 MacBook Air. The system I develop on mostly is my 2017 iMac with 64 GB of RAM, a 2 TB system drive, and a lot of external storage. That's where I run my VMs. I do have Windows 11 ARM running in Parallels on my M1 MacBook Air.

So far, I've been able to target all of these systems without any real issues. I have used If Targetxxx on occasion to ensure my software works properly on each system, and that has worked great!
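For readers outside that ecosystem: the per-target checks mentioned above branch on which OS the app is built for or running on. A rough runtime analogue in Python, using `platform.system()`, might look like this; the directory paths are purely illustrative.

```python
import platform

def data_directory(app_name: str) -> str:
    """Pick a per-OS data directory at runtime.

    Rough analogue of per-target branching; the paths below are
    examples, not recommendations.
    """
    system = platform.system()
    if system == "Windows":
        return rf"C:\ProgramData\{app_name}"
    elif system == "Darwin":  # macOS
        return f"/Library/Application Support/{app_name}"
    else:  # assume Linux or another Unix
        return f"/var/lib/{app_name}"
```

The trade-off is the same as in the thread: a VM exercises each branch, but only real hardware proves the chosen path behaves correctly on that OS.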

I have an Intel NUC running Windows 10 for now. I just like to have my software run on the three OSs. To be honest, no one has needed a Linux version yet, but I figure as soon as I drop support, someone will ask… LOL.

In any case, my Linux box is a bulky medium tower. If I can't sell it cheap, I'll probably donate it. With so many flavors of Linux out there, I can't test them all. I'm thinking, as little as I'd use the Linux hardware, the VMs make more sense. If it ever got down to really needing Linux hardware for testing, I'd probably just get a mini PC for that purpose.

Thanks for all the feedback!