Setting mouse cursor with a declare?

This may not be possible, but I’m wondering if there is a declare sub that can set the mouse cursor to a custom image. I cannot use the Xojo method because apparently the cursor they set is not a macOS standard cursor. The issue is I need to be able to take a small screenshot of the screen under my custom cursor (without showing the custom cursor in the screenshot). When I set the cursor using Xojo’s method, the cursor is included in the screenshot, even when I set the screenshot parameter to not include the mouse cursor.
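
(The Xojo route I’m referring to is something along these lines, where the picture and the hotspot values are just placeholders:)

[code]
// Build a custom cursor from a Picture (hotspot at 16,16) and assign it to the window.
// magnifierPic stands in for an image added to the project.
Dim magnifierCursor As New MouseCursor(magnifierPic, 16, 16)
Self.MouseCursor = magnifierCursor
[/code]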

I’m hoping there is a Mac-specific way to set the mouse cursor so that it will be considered a “standard” system cursor and adhere to the screenshot settings.

Any help is much appreciated.

Have you tried setting the Xojo custom cursor to nil or System.Cursors.InvisibleCursor prior to taking the screenshot with the parameter to hide the mouse?

Yes, and this is a good suggestion. However, the cursor is a magnifier that is sampling the color under the mouse, and this code is called in a timer that runs a few times a second, so the cursor flickers each time it is hidden and re-shown, which makes using it as a magnifier to pick a color useless.

system.cursors.hide

Is what I use to hide the cursor when displaying the custom color picker widget in HDRtist (http://machdr.com).

I use a window (which follows the cursor) to show an enlarged area and the current highlighted color.

When the selection is complete or the process is cancelled, I call system.cursors.show to show it again.

Some notes.

  1. You can get an eye dropper control for FREE, by using select color and clicking on the eye dropper icon. It’s cumbersome, yet for handling the color selection, efficient.
  2. There are issues with some custom eye dropper controls, as some of the techniques for grabbing values from the screen are blocked on Catalina. For my own solution, because it’s designed to allow the user to grab a sample from the image they’re editing, I convert the mouse location to a coordinate on the image preview and sample there, completely avoiding any screen shot mechanism.

[quote=475354:@Sam Rowlands]system.cursors.hide
Is what I use to hide the cursor when displaying the custom color picker widget in HDRtist (http://machdr.com).

I use a window (which follows the cursor) to show an enlarged area and the current highlighted color.

When the selection is complete or the process is cancelled, I call system.cursors.show to show it again.

Some notes.

  1. You can get an eye dropper control for FREE, by using select color and clicking on the eye dropper icon. It’s cumbersome, yet for handling the color selection, efficient.
  2. There are issues with some custom eye dropper controls, as some of the techniques for grabbing values from the screen are blocked on Catalina. For my own solution, because it’s designed to allow the user to grab a sample from the image they’re editing, I convert the mouse location to a coordinate on the image preview and sample there, completely avoiding any screen shot mechanism.[/quote]

The problem with that is that it hides my custom magnifying glass cursor. I need a way to treat the cursor as a “layer” on the screen. It needs to always be shown, but be hidden from the screenshot, which is exactly what macOS does with NSCursors. I’m looking into writing my own declare with macoslib to set the cursor, but haven’t had much luck so far.

How are you taking the screenshot?
screencapture has params to include / exclude the cursor:

server:~ npalardy$ screencapture -help
screencapture: illegal option -- h
usage: screencapture [-icMPmwsWxSCUtoa] [files]
  -c         force screen capture to go to the clipboard
  -b         capture Touch Bar - non-interactive modes only
  -C         capture the cursor as well as the screen. only in non-interactive modes
  -d         display errors to the user graphically
  -i         capture screen interactively, by selection or window
               control key - causes screen shot to go to clipboard
               space key   - toggle between mouse selection and
                             window selection modes
               escape key  - cancels interactive screen shot
  -m         only capture the main monitor, undefined if -i is set
  -D<display> screen capture or record from the display specified. -D 1 is main display, -D 2 secondary, etc.
  -o         in window capture mode, do not capture the shadow of the window
  -p         screen capture will use the default settings for capture. The files argument will be ignored
  -M         screen capture output will go to a new Mail message
  -P         screen capture output will open in Preview or QuickTime Player if video
  -I         screen capture output will open in Messages
  -B<bundleid> screen capture output will open in app with bundleid
  -s         only allow mouse selection mode
  -S         in window capture mode, capture the screen not the window
  -J<style>  sets the starting of interfactive capture
               selection       - captures screen in selection mode
               window          - captures screen in window mode
               video           - records screen in selection mode
  -t<format> image format to create, default is png (other options include pdf, jpg, tiff and other formats)
  -T<seconds> take the picture after a delay of <seconds>, default is 5
  -w         only allow window selection mode
  -W         start interaction in window selection mode
  -x         do not play sounds
  -a         do not include windows attached to selected windows
  -r         do not add dpi meta data to image
  -l<windowid> capture this windowsid
  -R<x,y,w,h> capture screen rect
  -v        capture video recording of the screen
  -V<seconds> limits video capture to specified seconds
  -A<id>    captures audio during a video recording using default input. Optional specify the id of the audio source
  -k        show clicks in video recording mode
  -U        Show interactive toolbar in interactive mode
  -u        present UI after screencapture is complete. files passed to command line will be ignored
  files   where to save the screen capture, 1 file per screen

I’m not using the -C command, and it correctly ignores the cursor when it’s the standard pointer. As soon as I change it (either to a custom cursor I set, or even to one of the other cursors like the finger pointer), the cursor is included in the screenshot. It seems that whatever Xojo is doing to set the cursor isn’t “standard”. Other Mac apps with custom cursors correctly omit the cursor from screenshots.
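
For reference, the call is roughly along these lines, run from a Shell in the timer (the rect, format and output path are just placeholders):

[code]
Dim sh As New Shell
// -x: no sound, -tpng: output format, -R: capture only a small rect around the cursor.
// No -C flag, so the cursor should be excluded from the capture.
sh.Execute "screencapture -x -tpng -R100,100,32,32 /tmp/sample.png"
[/code]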

pretty sure they just call into the API NSCursor already exposes for all those “standard” cursors like the finger pointing etc
https://developer.apple.com/documentation/appkit/nscursor?language=objc

there doesn’t seem to be anything you can set on the cursor itself that says “make this not visible to screen shots”

maybe “setHiddenUntilMouseMoves:”, but that’s about the only one I can see on the cursor itself that might have any effect
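
For what it’s worth, calling those NSCursor class methods from Xojo is only a handful of declares. A rough, untested sketch (the Lib names and the Boolean/Ptr type mappings here are assumptions):

[code]
// Equivalent of [[NSCursor pointingHandCursor] set] and [NSCursor setHiddenUntilMouseMoves:YES]
Declare Function NSClassFromString Lib "Foundation" (clsName As CFStringRef) As Ptr
Declare Function pointingHandCursor Lib "AppKit" Selector "pointingHandCursor" (cls As Ptr) As Ptr
Declare Sub setCursor Lib "AppKit" Selector "set" (cursor As Ptr)
Declare Sub setHiddenUntilMouseMoves Lib "AppKit" Selector "setHiddenUntilMouseMoves:" (cls As Ptr, flag As Boolean)

Dim nsCursorClass As Ptr = NSClassFromString("NSCursor")
setCursor(pointingHandCursor(nsCursorClass)) // make one of the standard cursors current
setHiddenUntilMouseMoves(nsCursorClass, True) // hide the cursor until the mouse moves
[/code]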

[quote=475462:@Norman Palardy]pretty sure they just call into the API NSCursor already exposes for all those “standard” cursors like the finger pointing etc
https://developer.apple.com/documentation/appkit/nscursor?language=objc

there doesn’t seem to be anything you can set on the cursor itself that says “make this not visible to screen shots”

maybe “setHiddenUntilMouseMoves:”, but that’s about the only one I can see on the cursor itself that might have any effect[/quote]

I think once I make a custom cursor via an image, then it’s no longer possible to hide that cursor in the screenshot. Which is strange, because other apps that use custom cursors (like Photoshop) seem to be able to hide the cursor in the screenshot.

It just may not be possible to do what I’m wanting.

I use a window to display what I want to display.

Doing your own cursor doesn’t look that hard… Famous last words.
https://developer.apple.com/documentation/appkit/nscursor/1524612-initwithimage?language=objc

You’ll need an NSImage and the hotspot; I assume they mean the point in the cursor image which is the mouse location.

Have you tried manually hiding the cursor, taking a screenshot and then showing it again?

[quote=475506:@Sam Rowlands]I use a window to display what I want to display.

Doing your own cursor doesn’t look that hard… Famous last words.
https://developer.apple.com/documentation/appkit/nscursor/1524612-initwithimage?language=objc
You’ll need an NSImage and the hotspot; I assume they mean the point in the cursor image which is the mouse location.

Have you tried manually hiding the cursor, taking a screenshot and then showing it again?[/quote]

Yes, and that works to hide the cursor from the screenshot. But I’m taking a screenshot a few times a second in a timer while the screen is being sampled, and that causes the magnifying cursor to flicker and be very difficult to use, so unfortunately it’s not an option I can use.

At this point I think you should stop for a moment and seriously consider your options.

The problems that I can see are as follows.
  1. Using a commandline tool to take the screenshot, which I assume is saving to disk, is wasteful and eats into the user’s write allowance on SSDs, not to mention the time involved in writing and reading.
  2. The built-in Apple method requires the user to learn to click on select color and then the eyedropper; but the rest is handled for you and is allowed on Catalina. I’ll agree it’s a pretty ugly workaround and that’s why I don’t use it myself.
  3. If you only need sampling from your own windows, there’s API for that, which would speed up your solution and in theory shouldn’t cross Catalina’s privacy rules.
  4. If you need sampling of the screen, i.e. content you don’t control, there’s API for that. You’ll need to research which API is permitted under Catalina (if any).
  5. If you really see absolutely no way to do this without using the commandline functions, then take a screenshot, load it into your application and throw up a full screen window, and sample from there. You’ll have full control over your cursor (real or fake), you only deduct one write from the SSD allowance, and it should be smooth without flickering.
  6. Make sure you are at least testing this with Catalina; as it’s gone privacy mad, your app may not even be able to capture the screen at all. I know that some devs have had to pull their apps or functionality from the App Store because of this, although I’ll confess that I never investigated which methods are no longer allowed.

This of course is just my thoughts and possible solutions.

[quote=475518:@Sam Rowlands]At this point I think you should stop for a moment and seriously consider your options.

The problems that I can see are as follows.

  1. Using a commandline tool to take the screenshot, which I assume is saving to disk, is wasteful and eats into the user’s write allowance on SSDs, not to mention the time involved in writing and reading.
  2. The built-in Apple method requires the user to learn to click on select color and then the eyedropper; but the rest is handled for you and is allowed on Catalina. I’ll agree it’s a pretty ugly workaround and that’s why I don’t use it myself.
  3. If you only need sampling from your own windows, there’s API for that, which would speed up your solution and in theory shouldn’t cross Catalina’s privacy rules.
  4. If you need sampling of the screen, i.e. content you don’t control, there’s API for that. You’ll need to research which API is permitted under Catalina (if any).
  5. If you really see absolutely no way to do this without using the commandline functions, then take a screenshot, load it into your application and throw up a full screen window, and sample from there. You’ll have full control over your cursor (real or fake), you only deduct one write from the SSD allowance, and it should be smooth without flickering.
  6. Make sure you are at least testing this with Catalina; as it’s gone privacy mad, your app may not even be able to capture the screen at all. I know that some devs have had to pull their apps or functionality from the App Store because of this, although I’ll confess that I never investigated which methods are no longer allowed.

This of course is just my thoughts and possible solutions.[/quote]

Thanks for your thoughtful response, Sam.

  1. The command line approach saves a single file to the temporary cache folder. Each save overwrites the previous file, and the file is only about 150KB. Once the sampling is complete, the file is immediately erased. So no worries about wasted space. It does read and write to the SSD while sampling, but that theoretically only happens for a few seconds whenever the user wants that feature. I think this is fine.

2, 3, 4. I have to sample the entire screen. The app is a color picker that lets you grab a color from anywhere on the screen, including outside of the app. I can control the cursor just fine outside of my app. Everything about this approach works great. I can sample the full screen, the magnifier looks perfect, and everything behaves as intended.

  5. I cannot load the screenshot into my application, because I need to provide “live” sampling, meaning that if the user hovers over a button, or the Dock, or tabs to a different window, the color picker should always pull the color of the active control or the changing screen dynamically. I can get away with this by only sampling a small section of the screen just around the cursor, and doing that repeatedly as the mouse moves until the sampling is complete. It all works very well and is incredibly fast (while also requiring almost no CPU/RAM for the app).

  6. I have this running fine on Catalina and all is working well.

The only issue is, when I screenshot the area around the cursor, if the cursor is my custom magnifier, then the screenshot is basically just the magnification cursor, not the true screen underneath. Which is why I need to omit the cursor from the screenshot (without hiding it and showing it, which causes the cursor to flicker). I know there is a way to take a screenshot of the screen without the cursor, even a custom cursor, as the screen is treated as a layer. Each window is its own layer, and the cursor is its own layer. I just cannot figure out how to take a screenshot that omits the cursor when Xojo has set it to a custom cursor. It is honestly quite frustrating.

Just to clarify: my concern is not about space; it’s that SSDs do wear out over time. Some of them even have a fixed write limit; once that’s exceeded, the SSD starts misbehaving. There are alternative methods for capturing screenshots that will not only save these writes, but will also be faster/smoother.

It’s just me being pedantic.

So regarding the final issue, you can directly create and set an NSCursor using declares; perhaps that is what you should try next. If you own the MBS plugin, I am pretty certain that it already contains these declares (so you don’t have to write them yourself).
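
The declares themselves aren’t huge. A rough, untested sketch: it assumes an NSPoint Structure (X and Y As Double) has been added to the project, and the Lib names, image path and hotspot values are placeholders/assumptions:

[code]
// Equivalent of [[NSCursor alloc] initWithImage:hotSpot:] followed by [cursor set]
Declare Function NSClassFromString Lib "Foundation" (clsName As CFStringRef) As Ptr
Declare Function allocObject Lib "Foundation" Selector "alloc" (cls As Ptr) As Ptr
Declare Function initWithContentsOfFile Lib "AppKit" Selector "initWithContentsOfFile:" (img As Ptr, path As CFStringRef) As Ptr
Declare Function initWithImageHotSpot Lib "AppKit" Selector "initWithImage:hotSpot:" (cursor As Ptr, image As Ptr, hotSpot As NSPoint) As Ptr
Declare Sub setCursor Lib "AppKit" Selector "set" (cursor As Ptr)

// Load the magnifier image as an NSImage
Dim magnifier As Ptr = initWithContentsOfFile(allocObject(NSClassFromString("NSImage")), "/path/to/magnifier.png")

// The hotspot is the point within the image that tracks the actual mouse location
Dim hotSpot As NSPoint
hotSpot.X = 16
hotSpot.Y = 16

Dim customCursor As Ptr = initWithImageHotSpot(allocObject(NSClassFromString("NSCursor")), magnifier, hotSpot)
setCursor(customCursor)
[/code]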

Failing that, I would encourage you to consider one of the alternative solutions. Yes, it isn’t what you want, but sometimes these compromises have to be made. There’s a feature request with Apple for separating the eyedropper from the Color Picker, but it’s old and to be honest I don’t expect it to ever be addressed, which is a shame because so many developers end up having to re-invent the wheel. Apple’s is great, but how you use it is not so convenient.

Yes, that’s exactly what I’m trying to do (hence the title of this post). I’ve been looking over the AppKit API documentation but haven’t had any luck getting it to work properly in Xojo. Just trying to see if anyone has any experience with this or suggestions for how to do so.

Regarding the alternative screen shot methods, I’m not sure what they would be to save writes to the SSD. The only other option is to save the screenshot to the clipboard, but that is destructive to whatever the user already has on their clipboard.

The code posted at the following link (Obj-C) is an example of how to do a screen capture using the API, which results in a CGImage. From here there are multiple ways you can sample the values. Probably the easiest is to draw the CGImage into a Xojo picture and use RGBSurface to read the values. Otherwise you could create an NSBitmapImageRep and sample that. You could be really geeky and grab the memory location of the pixels from the CGImageRef and read the bytes.

This is going to be exponentially faster/smoother, while avoiding any unnecessary wear on the SSD.

*Disclaimer: This is not my code.

https://github.com/wentingliu/ScreenPicker/blob/master/ScreenPicker/ScreenPickerWindow.m

Edit: Re-read the code, it’s really kinda cool as it grabs the contents below the Magnifier window, nice!
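
Translating the core of it into Xojo declares looks roughly like this (an untested sketch, 64-bit only; it assumes a CGRect Structure with X, Y, Width and Height As Double in the project, and the 32 x 32 rect is just a placeholder):

[code]
// Capture a small rect of the screen as a CGImage. CGWindowListCreateImage only composites
// windows, so the cursor (standard or custom) is not part of the result.
Const kCGWindowListOptionOnScreenOnly = 1
Const kCGNullWindowID = 0
Const kCGWindowImageDefault = 0

Declare Function CGWindowListCreateImage Lib "CoreGraphics" (bounds As CGRect, listOption As UInt32, windowID As UInt32, imageOption As UInt32) As Ptr
Declare Sub CGImageRelease Lib "CoreGraphics" (image As Ptr)
Declare Function NSClassFromString Lib "Foundation" (clsName As CFStringRef) As Ptr
Declare Function allocObject Lib "Foundation" Selector "alloc" (cls As Ptr) As Ptr
Declare Function initWithCGImage Lib "AppKit" Selector "initWithCGImage:" (rep As Ptr, cgImage As Ptr) As Ptr
Declare Function colorAtXY Lib "AppKit" Selector "colorAtX:y:" (rep As Ptr, x As Integer, y As Integer) As Ptr
Declare Function redComponent Lib "AppKit" Selector "redComponent" (nsColor As Ptr) As Double
Declare Function greenComponent Lib "AppKit" Selector "greenComponent" (nsColor As Ptr) As Double
Declare Function blueComponent Lib "AppKit" Selector "blueComponent" (nsColor As Ptr) As Double

// 32 x 32 points centred on the current mouse position (global screen coordinates)
Dim r As CGRect
r.X = System.MouseX - 16
r.Y = System.MouseY - 16
r.Width = 32
r.Height = 32

Dim cgImage As Ptr = CGWindowListCreateImage(r, kCGWindowListOptionOnScreenOnly, kCGNullWindowID, kCGWindowImageDefault)

// Wrap it in an NSBitmapImageRep and sample a pixel near the centre
// (on Retina the rep has more pixels than the 32-point rect; kept simple here)
Dim rep As Ptr = initWithCGImage(allocObject(NSClassFromString("NSBitmapImageRep")), cgImage)
Dim nsColor As Ptr = colorAtXY(rep, 16, 16)
Dim sampled As Color = RGB(redComponent(nsColor) * 255.0, greenComponent(nsColor) * 255.0, blueComponent(nsColor) * 255.0)

CGImageRelease(cgImage) // the image rep and NSColor are left to the autorelease pool
[/code]

No file on disk is involved, so you can run it as often as the timer fires and drive the magnifier and swatch straight from the sampled Color.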

[quote=475599:@Sam Rowlands]The code posted at the following link (Obj-C) is an example of how to do a screen capture using the API, which results in a CGImage. From here there are multiple ways you can sample the values. Probably the easiest is to draw the CGImage into a Xojo picture and use RGBSurface to read the values. Otherwise you could create an NSBitmapImageRep and sample that. You could be really geeky and grab the memory location of the pixels from the CGImageRef and read the bytes.

This is going to be exponentially faster/smoother, while avoiding any unnecessary wear on the SSD.

*Disclaimer: This is not my code.

https://github.com/wentingliu/ScreenPicker/blob/master/ScreenPicker/ScreenPickerWindow.m

Edit: Re-read the code, it’s really kinda cool as it grabs the contents below the Magnifier window, nice![/quote]

This code does look promising, thank you. Unfortunately I’m not 100% sure how to implement it in Xojo. I have limited experience with this. The app is simple enough that I may just rewrite it in Swift UI, but since it’s already done in Xojo I was hoping to not have to do that. There is quite a bit of code around storing color palettes and generating custom swatches based on a color picked that would be tedious to rewrite.