Back when Xojo was RealBasic, I used to have some routines that allowed me to look at points in non-RB windows.
For example, I used to be able to display MouseX and MouseY in any window I had clicked on to bring to the front, together with the colour of the pixel at that point. I could also examine the colours of the pixels along a line of points from X1,Y1 to X2,Y2 (points relative to the clicked-on window’s 0,0).
I can’t remember how I did it and I can’t find the old app. Maybe I had to use Monkeybread routines? Can anybody help?
One way to achieve this would be to grab a screenshot using MBS plugins (I know you can, but I’m not sure with which one), use System.MouseX/Y to determine the global mouse position, and examine the pixel from the screenshot.
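As a rough illustration of that idea, here is a minimal sketch, assuming the MBS plugin function ScreenshotRectMBS is available (I believe it is in the Util plugin, but check the MBS docs): grab a tiny screenshot at the global mouse position and read the pixel out of it.

```
' Sketch, assuming the MBS ScreenshotRectMBS function is available.
' Grabs a 1x1 screenshot at the global mouse position and reads its colour.
Dim mx As Integer = System.MouseX
Dim my As Integer = System.MouseY
Dim p As Picture = ScreenshotRectMBS(mx, my, 1, 1)
If p <> Nil Then
  Dim c As Color = p.RGBSurface.Pixel(0, 0)
  ' c.Red, c.Green, c.Blue now hold the colour under the cursor
End If
```

On modern macOS this will only return real colours once the app has been granted the screen-recording permission.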
One thing to note is that on modern macOS this will require the screen-recording permission. Even the app I use for this, Couleurs, requires that permission.
Screen-recording tools that can separate every item on screen out into layers suggest to me that there are APIs allowing really flexible examination of the screen, but I would guess these would need a lot of declares to use in Xojo.
I think my routines may well have used Monkeybread but looking at the current MBS library, I can’t find anything that would do what I used to do.
Basically, I just had one browser window on the screen and I knew approximately where on the browser window the image of an object would appear but not its exact size or orientation. I don’t think there was time to do screen snapshots, certainly not time to use a third party app.
I would keep loop-scanning a block of pixels in a region until some pixels of a particular colour appeared, then automatically home in on the image of the object in order to analyse and identify it. It was a bit like knowing a black-on-white slogan is going to appear, needing to locate it without delay when it does, and then running a kind of OCR routine to interpret the slogan text.
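The polling step described above could be sketched like this, assuming ScreenshotRectMBS from the MBS plugins; the function name FindColourInRegion and its shape are hypothetical, and in practice the caller would invoke it repeatedly from a Timer:

```
' Hypothetical sketch: scan a screen region for a pixel of targetColour,
' assuming the MBS ScreenshotRectMBS function is available.
' Returns True and sets foundX/foundY (global coordinates) on a match.
Function FindColourInRegion(x As Integer, y As Integer, w As Integer, h As Integer, _
  targetColour As Color, ByRef foundX As Integer, ByRef foundY As Integer) As Boolean
  Dim p As Picture = ScreenshotRectMBS(x, y, w, h)
  If p = Nil Then Return False
  Dim surf As RGBSurface = p.RGBSurface
  For py As Integer = 0 To h - 1
    For px As Integer = 0 To w - 1
      If surf.Pixel(px, py) = targetColour Then
        foundX = x + px
        foundY = y + py
        Return True
      End If
    Next
  Next
  Return False ' not found this pass; keep polling
End Function
```

Once a match is found, the same picture can be examined further to home in on the object.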
I don’t understand your comments about declares and flexibility. I wasn’t changing anything just identifying and analysing characteristic pixels in a selection area.
> Dim p As Picture
> p = ScreenshotRectMBS(left, top, width, height) ' all four parameters are Integers
I have a feeling this is what I used to do. I think it works in whole-screen coordinates, so you have to place the window you are interested in somewhere you can work out the left and top as offsets from the screen’s 0,0, then work with the picture p.
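If I understand that correctly, the offsets can come straight from the window itself, since a Xojo window’s Left and Top give its content area’s position in global screen coordinates. A sketch, again assuming ScreenshotRectMBS:

```
' Sketch: capture just the area covered by a window,
' assuming the MBS ScreenshotRectMBS function is available.
' Window.Left/Top are the content area's global screen coordinates,
' so they can be passed straight through as the offsets.
Dim p As Picture = ScreenshotRectMBS(Self.Left, Self.Top, Self.Width, Self.Height)
' Pixel (x, y) in p should then correspond to point (x, y)
' relative to the window's own 0,0.
```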
Back then, RealBasic had a System.Pixel function, which is certainly what you used.
Perhaps 10 years ago, macOS started to “secure” that API, which then returned random colours unless the passed point was owned by your app.
So RB/RS/Xojo (I don’t remember which one) finally removed System.Pixel, as it had become useless.
How is that consistent with what both Sam and Tim posted? Apparently it can be done without issue using declares, so there was no need to remove System.Pixel, or at least there is nothing stopping Xojo Inc from restoring it now.
IIRC, with both of the Apple-approved approaches, you don’t ask for the colour at a point: the OS presents the user with a colour-sampler UI or cursor, and when the user clicks you get a colour, with no idea where it came from.
Either way could easily be integrated into Xojo, you know if they…