I have been struggling with what should be very simple.
I have a PNG image file with alpha, it is a square picture with a circle in the center, the top half of the circle is opaque white, the bottom half is pure BLUE opaque, the corners of the image (outside the circle) are clear.
I load it from disk and assign it to an ImageWell and it draws correctly.
But if I access the image itself (not the ImageWell) and read pixel values from its RGBSurface, here's what I see:
In the white area the color is &h00FFFFFF (which seems correct: opaque white).
In the blue area the color is &h000000FF (which seems correct: opaque blue).
In the clear area the color is &h00000000 (which seems totally wrong: it is saying opaque black).
I can also report that the image says Transparent = 0 and HasAlphaChannel = True.
Alpha Channel Support
The Color type has a read-only “Alpha” property. The alpha channel is the transparency of the color represented as an integer between 0 (opaque) and 255 (transparent).
So &h00000000 means OPAQUE BLACK, which is definitely not clear.
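For reference, the packed value is laid out as &hAARRGGBB, with the alpha in the high byte and 0 meaning opaque. A minimal Python sketch (the helper name and the shifts are mine, but they match the hex values quoted above) of decoding that layout:

```python
def decode_aarrggbb(value):
    """Split a packed &hAARRGGBB color into (alpha, r, g, b).

    In Xojo's convention, alpha 0 is opaque and 255 is transparent.
    """
    alpha = (value >> 24) & 0xFF
    r = (value >> 16) & 0xFF
    g = (value >> 8) & 0xFF
    b = value & 0xFF
    return alpha, r, g, b

# The three readings reported above:
print(decode_aarrggbb(0x00FFFFFF))  # (0, 255, 255, 255) -> opaque white
print(decode_aarrggbb(0x000000FF))  # (0, 0, 0, 255)     -> opaque blue
print(decode_aarrggbb(0x00000000))  # (0, 0, 0, 0)       -> opaque black, not clear
```

All three values carry an alpha byte of 00, which is why the clear corners read back as opaque black.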
I modified my image to have a circle of opaque black and checked the color returned there; it likewise returned &h00000000,
so it is drawing differently but returning the same value (in fact, nowhere in the image do I get an alpha value of anything other than 00).
Or am I missing something on that?
Thanks, I get that &c and &h will represent the values differently, but the core problem is that when I execute
c = theImage.RGBSurface.Pixel(x, y)
the value returned for pixels that draw as clear is exactly the same as for pixels that are opaque black.
I have now found that if I change a pixel to color.rgba(0,0,0,254) then the pixel reads as &hFE000000, but…
If I set it to color.rgba(0,0,0,255) then the pixel reads as &h00000000.
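To make the asymmetry concrete, here is a small Python sketch of what a lossless round-trip would look like under the &hAARRGGBB layout. The encode helper is hypothetical, but it shows that rgba(0,0,0,254) should pack to &hFE000000 (which matches what I read back) while rgba(0,0,0,255) should pack to &hFF000000, not &h00000000:

```python
def encode_aarrggbb(r, g, b, alpha):
    # Pack the channels into the &hAARRGGBB layout (alpha 0 = opaque).
    return (alpha << 24) | (r << 16) | (g << 8) | b

print(hex(encode_aarrggbb(0, 0, 0, 254)))  # 0xfe000000 -- matches what RGBSurface returns
print(hex(encode_aarrggbb(0, 0, 0, 255)))  # 0xff000000 -- but RGBSurface returns &h00000000
```

So the alpha byte survives the round trip for every value except 255, where it collapses to 0.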
I've gone through every imaginable manner of manipulating the image, but I NEVER read an alpha of 255 (fully transparent).
I really want to help you here, because something at the back of my brain is telling me that I dealt with this before. I just can’t remember the cause or how I fixed it.
One thing you can try is getting the mask of the picture and checking that.
The result is correct for points with an alpha of 0 (opaque) through 254 (almost totally transparent), BUT if the point is totally transparent (alpha of 255) then the returned value is 0 (opaque).
It is infuriating. I've tried every imaginable combination.
I've now fallen into this trap as well. Creating the picture with 2 or 3 parameters makes no difference: every transparent pixel of the original PNG file, read via RGBSurface.Pixel, returns an alpha of 0 (the same as non-transparent pixels), although the picture is drawn with the correct transparency by DrawPicture.
Curious that this hasn't been treated as a big problem for a year now.