EU Cyber Resilience Act?

Greetings, Xojo folks.

I am primarily an electronic instrument developer. I am working on a new model of a currently sold instrument. This instrument is a battery-powered “data logging accelerometer” that has been used worldwide to measure the sway behavior of trees. The instrument is highly configurable over an intermittent USB/serial connection. With the new model, I plan to offer a desktop GUI to simplify management of the instrument. Said GUI will be done in Xojo. The GUI will NOT be filtered through any app store but will be downloadable by instrument users from a web site that I nominally control. The first GUI release will probably be for Windows (which I am ill-equipped to cope with!) but later releases will add macOS and Linux.

It appears that the EU’s Cyber Resilience Act (CRA) will come into effect fairly soon, and it appears to have provisions for non-web desktop applications and even the firmware in the instrument. Focusing on the Xojo part of it, what are folks doing to address these coming CRA requirements? In particular, there seem to be requirements that the end user be able to verify that the app they are about to use is “valid” (i.e., not altered). There are other requirements about bug fixes, free updates, and such, which I see more as procedural. But the validity issue is more technical, and there may be several other similar requirements. How do YOU plan on addressing these things?

Many thanks
Jim Wagner
Oregon Research Electronics

If your code is signed, it’s already protected against alteration.

Thanks, I was wondering about that.

Jim

Still looking for comments about Xojo and the CRA broadly. Any plans, strategies, discussions, etc., about meeting (or not meeting) the CRA?

Thanks
Jim

Let’s talk about this for a minute.

On macOS at least, if your bundle is signed, your code and any other code items it loads (such as workers) are certified. The resources, however, are not. So any pictures, sounds, text files, icons, etc. are not part of the code signature and therefore could be modified without your consent, without breaking the code signature. This would include the ability to deface your app or make it look like it was created by an entirely different entity.

Also, you’ll need to use the codesign hardened-runtime options to protect the app from runtime modifications, such as viruses and malware changing the code while it runs.

If you want to prevent resource modifications, you could create a SHA-256 hash of each file and store the hashes in an easily readable JSON file. Then create a SHA-256 or SHA-512 hash of that JSON file and produce an RSA signature of the hash, which gets embedded into your app, along with the public key, using a series of Chr() calls so they can’t be easily found.
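A sketch of that manifest scheme, shown in Python purely for illustration (Xojo’s Crypto module offers equivalent SHA-256 and RSA calls). The function names and directory layout are my own placeholders, and the RSA signing of the manifest digest is only indicated in comments, not implemented:

```python
import hashlib
import json
from pathlib import Path

def build_manifest(resource_dir: str) -> str:
    """Hash every resource file and return a deterministic JSON manifest."""
    hashes = {}
    for path in sorted(Path(resource_dir).rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(resource_dir))
            hashes[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return json.dumps(hashes, sort_keys=True)

def manifest_digest(manifest_json: str) -> str:
    """Digest of the manifest itself. In a shipping app you would RSA-sign
    this value at build time and embed the signature and public key."""
    return hashlib.sha256(manifest_json.encode("utf-8")).hexdigest()

def resources_intact(resource_dir: str, expected_digest: str) -> bool:
    """At launch: rebuild the manifest and compare digests. A real check
    would instead verify the embedded RSA signature over this digest."""
    return manifest_digest(build_manifest(resource_dir)) == expected_digest
```

The `sort_keys` and sorted-path traversal matter: the manifest must serialize identically at build time and at launch, or the digests will never match.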

That is all to say that you are trying to make it extremely difficult for someone to modify your non-code resources, with the understanding that, given enough time, resources, and effort, nothing is impossible to break. But making it extremely difficult does push some actors on to the next target, depending on how valuable your target actually is.

Thanks, Greg

My (possibly questionable) reading of CRA references suggests that the rules treat not just the app as the target of concern, but also other apps on the machine, the machine itself, the cloud, and the internet generally. And there are many other provisions beyond install-file integrity; I simply mentioned that since it is an obvious starting point. The CRA does seem to reference an installation check mechanism referred to as an “SBOM”? I cannot yet find what this implies.

Jim
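For context on the SBOM question above: SBOM stands for Software Bill of Materials, a machine-readable inventory of the components (including third-party libraries) that ship in a product; CycloneDX and SPDX are the two common formats. A minimal sketch that emits a CycloneDX-style document, where every name and version below is a made-up placeholder, not real project data:

```python
import json

# Minimal CycloneDX-style SBOM. "InstrumentGUI" and "ExampleSerialLib"
# are hypothetical placeholders for an app and one of its dependencies.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {"type": "application", "name": "InstrumentGUI", "version": "1.0.0"},
        {"type": "library", "name": "ExampleSerialLib", "version": "2.3.1"},
    ],
}
print(json.dumps(sbom, indent=2))
```

In practice the component list would be generated from your build, not written by hand, and the resulting file would ship alongside (or inside) the installer so users and auditors can see exactly what the product contains.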

Thank you! Jim
