Optimising Loading Large Images to Canvas

Hello, I am in the process of making an application for OS X and Windows in which the user needs to load images that are typically >500 MB, around 28,000x7,000 pixels. For starters, if I recall correctly, REALbasic has a limit of 32,000 pixels in either dimension, so we won't focus on images beyond that.

The general idea is that the image is loaded onto a canvas, and the user can scroll and zoom around and click on points of interest, generating a circle at the mousedown point and a tally recorded in a listbox.
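For reference, the click handling is roughly along these lines (a simplified, untested sketch; the array and listbox names are just placeholders, not my actual code):

[code]' Rough sketch of the Canvas.MouseDown handler (placeholder names).
Function MouseDown(X As Integer, Y As Integer) As Boolean
  ' Convert the canvas click back into image coordinates using the current scroll/zoom.
  Dim imgX As Integer = (X - App.XScroll) / Zoom
  Dim imgY As Integer = (Y - App.YScroll) / Zoom
  
  ' Remember the point of interest and tally it in the listbox.
  PointsX.Append imgX
  PointsY.Append imgY
  CountList.AddRow Str(PointsX.Ubound + 1), Str(imgX), Str(imgY)
  
  Me.Invalidate ' repaint so the Paint event draws the new circle
  Return True
End Function[/code]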

My application is complete and works; however, it needs to be optimised to run on older, slower machines. Testing on a 2.1 GHz Core 2 Duo MacBook with 4 GB of RAM, the application cannot draw a 70 MB photo to the canvas and crashes. On a custom-built PC, I can load images upwards of 500 MB.

Currently I am loading the picture into a global Picture property to keep it in RAM for fast access. Scrolling and zooming are handled in the Canvas.Paint event with g.DrawPicture, which I found easier to manage. Should I think about using the Canvas.Backdrop instead?

My main question is: if I am storing the picture in a property and then draw it to a canvas, does the canvas only store the bytes of the image that fall within its bounds? If not, does the entire image get loaded into the canvas, meaning I am storing the same image twice over: once in the global property and once in the canvas?

In general, if anyone has any tips or pointers as to how I’d go about efficiently loading a very large image into a Canvas for user editing, that’d be awesome!

(On a side note, if I try to export a large picture (70 MB, 28,000x7,000) using SaveAsTIFF/SaveAsJPEG etc., the program crashes even if I have purged all of its memory first. Is the SaveAs function limited by the API it calls or by the machine? I have tried running it on a current-gen, fully built gaming PC and it still crashes until I scale the resolution down to ~20,000x6,000.)

Thanks!

Can you post the relevant sections of the crash log? That is, the crashing thread and the memory section.

BTW, a 28,000 x 7,000 image takes about 748 MB of memory in your program (28,000 x 7,000 pixels x 4 bytes per pixel), even if it's only 70 MB on disk.

You’re only drawing the subsection of the picture that fills the canvas, right?

I should have said “You’re only drawing the subsection of the picture that fills the currently visible area of the canvas, right?”

Hello, everyone, thank you for your thoughts and help. I have a feeling Daniel has pointed out my mistake.

Currently I am using the following code to draw to the canvas:

[code]g.DrawPicture App.OriginalPic, App.XScroll, App.YScroll, App.OriginalPic.Width*Zoom, App.OriginalPic.Height*Zoom, 0, 0, App.OriginalPic.Width, App.OriginalPic.Height[/code]

I have a feeling I am drawing the entire image and simply shifting its 0,0 point to scroll, rather than taking a rectangular subsection of the visible portion. The debugger showed the Canvas.Backdrop image dimensions were equal to those of the actual picture, so far too big.

It looks like you are drawing the entire image. DestX / DestY should always be 0, and DestWidth / DestHeight should always match the canvas size. Use the Source parameters to grab a subsection of the image at your scroll point. You will have to compute those values based on the zoom.
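Something along these lines in the Paint event (an untested sketch; it assumes XScroll/YScroll now hold the top-left corner of the visible region in image coordinates rather than a drawing offset):

[code]' Untested sketch: draw only the visible part of the picture.
' Assumes App.XScroll / App.YScroll are the top-left of the visible region
' in image pixels, and Zoom is the magnification factor.
Sub Paint(g As Graphics)
  Dim srcW As Integer = g.Width / Zoom    ' image pixels that fit across the canvas
  Dim srcH As Integer = g.Height / Zoom
  
  ' Destination is always the full canvas; only the source rectangle moves.
  g.DrawPicture App.OriginalPic, 0, 0, g.Width, g.Height, _
    App.XScroll, App.YScroll, srcW, srcH
End Sub[/code]

You will also want to clamp XScroll/YScroll so the source rectangle stays within the picture bounds.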

Shouldn’t Backdrop be nil? You’re not setting it, too, are you?

Sorry, I meant the Graphics property of the canvas. As you pointed out above, the RAM required is a function of bits per colour, so loading large-resolution photos will be limited by hardware.

Out of interest, I made a new application consisting of a global Picture property, a push button that opens a FolderItem as an image and stores it in the property, and a button to 'purge' the Picture property by setting it to Nil. I wanted to see how much memory would be used when loading images (monitoring with Activity Monitor).

[code]Dim f As FolderItem

f = GetOpenFolderItem("special/any")

If f <> Nil Then
  App.SavedPicture = f.OpenAsVectorPicture
End If

f = Nil[/code]
The above loads the picture. Another button simply runs "App.SavedPicture = Nil". (Is there a better/proper way to clear a variable?)

My test photo was 28,346 x 7,370, so at 32 bits per pixel that's ~796 MB of required RAM.

I opened the image in a number of graphics programmes; all of them used around 800 MB of physical memory.

The test application idled at ~7 MB. Opening the test photo spiked it to 2.21 GB, where it remained until the Purge button was pressed, setting SavedPicture to Nil; the programme then idled at 670 MB. I found the RAM used to load the photo quite high. To confirm: does RB load an image into a memory block as a bitmapped image? The RAM usage was double my calculation and double what Preview, GIMP, CS, and others used. Is this just down to the built-in RB Picture functions being memory intensive? I am also confused by the programme idling at such high RAM consumption after SavedPicture was cleared. My guess is that setting the Picture property to Nil is not the proper way to clear it from memory?

Btw, I just upgraded to the buggy Mavericks, and am running the latest release of Xojo.

Sorry for the double post. As a follow-up, the unexpected memory usage and the failure to release memory after clearing the Picture variable are a function of compiling to Cocoa with Xojo. I compiled the exact same code as above for Carbon and had no problems: the programme took ~800 MB of physical memory to load and hold the Picture, I was even able to display it on a Canvas, and when I cleared the Picture the programme dropped back down to using 5 MB.

So that explains the problem; now I need a way to work around it. The Cocoa build uses excessive memory for all pictures loaded, regardless of resolution, and fails to fully release its memory when all references have been cleared. As I am only developing the OS X version on one machine, I do not know if it behaves differently on a different version of OS X. I will run memory checks on the Windows 7 version later today out of interest.

I will have a play and see whether the culprit is the Picture class itself or FolderItem.OpenAsPicture. I guess one workaround might be to open the image as a BinaryStream, read the data directly into a MemoryBlock, and then write the colour data pixel by pixel to an RGBSurface? Does anyone have any good suggestions?
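Before going that far, a cheaper experiment might be to copy the decoded image into a plain bitmap Picture and drop the original straight away, in case the retained memory belongs to the loader rather than the pixel data (untested sketch; note that both copies exist briefly, so peak usage roughly doubles):

[code]' Untested experiment: re-home the decoded pixels in a plain bitmap Picture,
' then release the loader's copy immediately.
Dim loaded As Picture = f.OpenAsPicture
If loaded <> Nil Then
  Dim plain As New Picture(loaded.Width, loaded.Height, 32)
  plain.Graphics.DrawPicture loaded, 0, 0
  loaded = Nil              ' drop whatever the loader is holding on to
  App.SavedPicture = plain  ' keep only the plain bitmap
End If[/code]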

Thanks for the help so far!

Is this anything to do with the new alpha channel method pushing the depth to 32-bit instead of 24-bit?
Might you save memory by pushing the picture into a new picture of a lower color depth (no good if you are doing an image editor, I guess)?

Does your picture have a mask in Xojo? (that seems to require another image of the same size and depth)

I just did a test in Cocoa opening a 25k x 10k PNG and saw a spike to 2.3 GB of memory used, which then dropped to about 900 MB. When I nilled the picture, the memory dropped back to 12 MB… Maybe it has to do with the workings of OpenAsVectorPicture?

[quote=88337:@Jeff Tullin]Is this anything to do with the new alpha channel method pushing the depth to 32-bit instead of 24-bit?
Might you save memory by pushing the picture into a new picture of a lower color depth (no good if you are doing an image editor, I guess)?

Does your picture have a mask in Xojo? (that seems to require another image of the same size and depth)[/quote]
The alpha channel would definitely require more memory. Unfortunately, as you pointed out, for image-editing purposes I would like to leave it in, but it's a good way to save some memory should the need arise.

Hey, thanks for checking it out. What code did you use to open the picture, if you don’t mind?

[code]App.SavedPicture = Picture.Open(f)[/code]

Thank you, I tried this route. Under Cocoa I am still spiking to 2.2 GB and holding; when nilled, it idles at 600 MB. I am wondering if approaching my physical RAM limit (4 GB) is the issue? I would normally just recommend the user have sufficient RAM to open large files; however, this application is targeted at a lab using old iMacs with 4 GB of RAM. I could press them to crop their images, but that is inconvenient for processing, and the programme's whole purpose is to remove their current inconveniences. Such is life.

[quote=88159:@Peter B]Sorry, I meant the graphics property of the canvas. As you pointed out above, the RAM required is a function of bits per colour, so loading large resolution photos will be limited by hardware.
[/quote]
More likely you're limited by the amount of memory a single 32-bit process can address, which is < 4 GB at best and usually less than that in practice.

What format is your source image?

I would report this memory leak to Xojo, through Feedback.

There are alternatives to try, such as NSImage or CGImage. MacOSLib and the MBS plugin contain code for both of these while the Retina Kit uses NSImage.

I don’t know what Xojo uses underneath to load images, but it could be a lazy loading API (like CoreImage) where the OS doesn’t load the picture until you draw it and doesn’t release it until… Well until it decides to do so. This was very confusing to me when I first started working with CoreImage (I didn’t know it was lazy loading then).

[quote=88370:@Sam Rowlands]What format is your source image?

I would report this memory leak to Xojo, through Feedback.

There are alternatives to try, such as NSImage or CGImage. MacOSLib and the MBS plugin contain code for both of these while the Retina Kit uses NSImage.

I don’t know what Xojo uses underneath to load images, but it could be a lazy loading API (like CoreImage) where the OS doesn’t load the picture until you draw it and doesn’t release it until… Well until it decides to do so. This was very confusing to me when I first started working with CoreImage (I didn’t know it was lazy loading then).[/quote]
Okay, I had a play with image formats. The end result is that it has something to do with progressive-display JPEGs in Cocoa Xojo builds. All image formats work just fine, including JPEGs, so long as they aren't both large resolution and progressive.

Only large-resolution pictures caused a problem when converted to JPEG with progressive display (e.g. opening a 28,000x7,000 progressive JPEG leaked memory, but a 20,000x4,500 progressive JPEG did not).

I tried converting from multiple formats using three different programmes; this didn't seem to make a difference. So: Cocoa Xojo builds leak memory for me when opening large progressive-display JPEGs. Again, I ran into no problems when testing the Carbon build.

I did not run CGImage through all the debug tests with the various images, but it too leaked memory when opening my original progressive JPEG. I did not try NSImage. I wouldn't put much stock in that, though, as I'm no expert at calling external APIs.

I wonder if this is in any way related to the weird bug we ran into where images converted from one format to uncompressed BMPs would end up mangled. That was specifically an OS X 10.9 and Haswell chipset bug. With a different setup you would not run into this.

Have you got another machine, or one with a different version of OS X, to try?

[quote=88402:@Norman Palardy]I wonder if this is in any way related to the weird bug we ran into where images converted from one format to uncompressed BMPs would end up mangled. That was specifically an OS X 10.9 and Haswell chipset bug. With a different setup you would not run into this.

Have you got another machine, or one with a different version of OS X, to try?[/quote]
Yeah, it may well be hardware-specific. This MacBook is a mid-2009 i386 Core 2 Duo running 10.9.2. I'll give it a run at work tomorrow on some of the other Macs.

For my purposes I'm not too worried, especially if it turns out to be architecture-specific. Either way, for its intended use, this application shouldn't be dealing with progressive JPEGs. I only found out because GIMP exports JPEGs with both 'Optimisation' and 'Progressive' on by default.
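If I ever do need to guard against it, one idea would be to sniff the JPEG header before loading and warn the user. A rough, untested sketch that just looks for the progressive SOF2 marker (&hFFC2) instead of the baseline SOF0 (&hFFC0):

[code]' Rough, untested sketch: returns True if the JPEG appears to use progressive encoding.
' Walks the JPEG marker segments looking for SOF2 (&hC2) rather than SOF0/SOF1 (&hC0/&hC1).
Function IsProgressiveJPEG(f As FolderItem) As Boolean
  Dim bs As BinaryStream = BinaryStream.Open(f, False)
  If bs = Nil Then Return False
  bs.LittleEndian = False           ' JPEG segment lengths are big-endian
  
  If bs.ReadUInt16 <> &hFFD8 Then   ' no SOI marker, so not a JPEG
    bs.Close
    Return False
  End If
  
  While Not bs.EOF
    If bs.ReadUInt8 <> &hFF Then Exit   ' lost sync with the marker stream
    
    Dim marker As Integer = bs.ReadUInt8
    While marker = &hFF And Not bs.EOF  ' skip fill bytes before the marker
      marker = bs.ReadUInt8
    Wend
    
    Select Case marker
    Case &hC0, &hC1                     ' baseline / extended sequential frame
      bs.Close
      Return False
    Case &hC2                           ' progressive frame
      bs.Close
      Return True
    Case &hD8, &hD9, &h01               ' standalone markers with no payload
      ' keep scanning
    Case &hD0 To &hD7                   ' restart markers, also no payload
      ' keep scanning
    Else
      Dim segLen As Integer = bs.ReadUInt16
      bs.Position = bs.Position + segLen - 2   ' skip the rest of this segment
    End Select
  Wend
  
  bs.Close
  Return False
End Function[/code]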