Pattern recognition through iPhone camera

I have an idea for a simple (in concept) app that would require the identification of patterns through the iPhone’s camera, but I have no idea where to start with that. Any ideas?

Look at Apple's CoreML and Vision APIs; those may be helpful.

I agree with Derk… MBS has a CoreML Plugin so you could start playing around with it on a Mac then move it over to iOS with declares… You could also look at Google’s AutoML cloud service (… and/or come to my XDC session on AI.

Because this is a Xojo, not Swift, forum, for people who want to use Xojo, not Swift. :stuck_out_tongue:

Edit: Err, this was in response to something that has now been deleted. Oh well.

Ah crap, deleted my post instead of quoting it. For those who are curious, it was links to CoreML being super easy in Xcode.

To which I'll repeat: "This is one of those times to use the right tool for the job." Xojo is more work to implement this, and ignoring that fact misleads people who are deciding which tools to use.

What kind of patterns? How complex?

Dots on dominoes.

Funny, I was thinking about doing an app that did that (count dots on dominoes) a few months ago*. Got sidelined with other stuff. Hopefully you find a solution.

I’m not familiar with what iOS has but I’ve done some work with OpenCV for robotics. In that environment it would tell you how many circle-like objects are in the picture.

*It actually came about after losing a round of Mexican Train dominoes rather badly; counting up the 20 dominoes I had left at the end of the turn was so bad I just estimated. No one called me on it.

@Kem Tekinay Xojo declares are very powerful; iOSKit, for example, is full of useful stuff. The only thing is, it's hard to find any typo or bug in the Ptr/declares.

Other than that, you have a lot of cool stuff in iOS to play with.

I’m comfortable with declares, thanks for the suggestion.

One method could be to detect the rectangles in the image with CoreImage, then look for dots on those identified rectangles.

I only looked for about 10 minutes, but CoreML's Vision classes appear to require preemptive threading, so they can't be used in Xojo at the moment without hacks. The lack of preemptive threading support is going to prevent Xojo for iOS from accessing the new features Apple keeps introducing. There's a Feedback case for this that really needs more attention; I'll post it later when I'm back at my laptop.

Well darn.

Well, those Feedback cases are more than a decade old, and the official Xojo answer for them is: "That is dangerous; we made the right decision not to have preemptive threading support. Just use helper console apps." So maybe Tim Parnell was right in this case.

<> is for preemptive threads as a general feature in the language. As Ivan mentions, it's 9.5 years old, so sadly it's probably not something to hope for.

<> is for a pragma that would cause the compiler to allow callbacks from non-Xojo threads by dispatching them to the main thread.

There might be a better way to do this. I’m sure Joe or someone at Xojo can come up with something better. But we desperately need it. Without support for preemptive threads, or at least allowing threads to run Xojo code when they aren’t “owned” by Xojo, the promise of using declares to extend Xojo’s iOS functionality is decreasing with every single iOS version. Those of us in the community trying to write libraries of declares to improve the language are stuck because there really isn’t anything we can do.
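The main-thread-dispatch idea behind that pragma can be sketched in a few lines of Python (purely for illustration; Xojo can't express this today, and every name here is invented). A framework thread never calls back into user code directly; it enqueues the call, and the main thread drains the queue.

```python
import queue
import threading

main_queue: queue.Queue = queue.Queue()
handled = []

def on_recognition_done(result: str) -> None:
    # Under the proposed pragma, the compiler would guarantee this runs
    # on the main (Xojo-owned) thread, even though the framework fired
    # the callback from its own thread.
    handled.append((threading.current_thread().name, result))

def framework_worker() -> None:
    # Stands in for a non-Xojo thread, e.g. a Vision completion handler:
    # instead of calling back directly, it enqueues the call.
    main_queue.put(lambda: on_recognition_done("3 pips"))

t = threading.Thread(target=framework_worker, name="vision-thread")
t.start()
t.join()

# The main thread drains the queue, so the callback runs where it's safe.
while not main_queue.empty():
    main_queue.get()()

print(handled)
```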

I believe Joe is no longer with Xojo.

I did not realize that, thanks.

A proposal for another way to handle async callbacks safely. Please read the case and add your support.

Being able to better handle async callbacks is really needed in the long run.
For concurrency, I think a better model would be something like goroutines. The paradigm shifts to sharing by communicating instead of communicating by sharing. Xojo's runtime heritage, alas, isn't helping here.
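For the flavor of that "share by communicating" style, here's a rough Python sketch (a queue playing the role of a Go channel; the sentinel convention is my own shorthand for a closed channel). The threads never touch shared mutable state; values only move through the channel.

```python
import queue
import threading

channel: queue.Queue = queue.Queue()

def producer(values) -> None:
    for v in values:
        channel.put(v)    # send a value, don't share a variable
    channel.put(None)     # sentinel standing in for "channel closed"

threading.Thread(target=producer, args=([1, 2, 3, 4],)).start()

# The consumer receives until the channel is "closed"; no locks needed
# because nothing is shared, only passed.
total = 0
while (v := channel.get()) is not None:
    total += v
print(total)
```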