I have a program that reads textured meshes (Wavefront files) and displays them in an OpenGLSurface control.
To do that, I have to transform the texture of the Wavefront file (an image) into a MemoryBlock that OpenGL can read, and I have noticed different behaviour between Windows and Linux (not yet tested on macOS).
Concretely, I have two options to read the texture/picture and transform it into a MemoryBlock.
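(The two options themselves aren't quoted here. Purely for illustration, and not the original code, below is a minimal sketch of one platform-independent route: skip the BMP encoding entirely and pack a BGR MemoryBlock straight from the Picture's RGBSurface. The helper name is made up.)

Function PictureToBGR(p As Picture) As MemoryBlock
  ' Hypothetical sketch: builds a tightly packed, bottom-up BGR block,
  ' so there is no header and no platform-dependent pixel format.
  ' (If RGBSurface is Nil for a picture loaded from disk, draw the
  ' picture into a new Picture first and read that instead.)
  Dim mb As New MemoryBlock(p.Width * p.Height * 3)
  Dim offset As Integer = 0
  For y As Integer = p.Height - 1 DownTo 0    ' OpenGL expects the bottom row first
    For x As Integer = 0 To p.Width - 1
      Dim c As Color = p.RGBSurface.Pixel(x, y)
      mb.UInt8Value(offset) = c.Blue
      mb.UInt8Value(offset + 1) = c.Green
      mb.UInt8Value(offset + 2) = c.Red
      offset = offset + 3
    Next
  Next
  Return mb
End Function

One caveat: because the rows are tightly packed, a width whose row size (width x 3) is not a multiple of 4 would need glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before the upload (the constant value is &hCF5 if the OpenGL module doesn't declare it). BMP data doesn't have this problem, since its rows are already padded to 4 bytes, which matches OpenGL's default alignment.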
OS sensitive? Maybe? I don't know how things work behind the scenes, but maybe different libs produce different binary bitmaps, e.g. using compression or choosing a different number of bits per pixel, with or without an alpha channel? I think you should decode the output to understand it instead of assuming a fixed format. Export the contents and use some tool to analyze and show the data structures of both.
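For example, something along these lines (a hypothetical helper, assuming the MemoryBlock starts with a standard BMP file header followed by a BITMAPINFOHEADER) would print the fields that matter for the comparison:

Sub DumpBMPHeader(mb As MemoryBlock)
  mb.LittleEndian = True                                        ' BMP header fields are little-endian
  System.DebugLog("Data offset: " + Str(mb.UInt32Value(10)))    ' where the pixel rows start
  System.DebugLog("Width:       " + Str(mb.Int32Value(18)))
  System.DebugLog("Height:      " + Str(mb.Int32Value(22)))
  System.DebugLog("Bits/pixel:  " + Str(mb.UInt16Value(28)))    ' 24 = BGR, 32 = BGRA
  System.DebugLog("Compression: " + Str(mb.UInt32Value(30)))    ' 0 = uncompressed
End Sub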
Hi Rick, thanks for the advice.
It is indeed likely a problem with the image format returned by the same function on different OSes:
it returns a MemoryBlock of Picture.Height x Picture.Width x 3 + 54 (header) bytes on Linux, and Picture.Height x Picture.Width x 4 + 54 bytes on Windows.
I suppose that on Windows, Formats.BMP provides Blue-Green-Red-Alpha data, whereas on Linux just the Blue-Green-Red info.
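Just to make the arithmetic concrete, assuming the call in question is Picture.GetData(Picture.Formats.BMP) (an assumption on my part) and that pic holds the loaded texture, the bytes per pixel fall straight out of those sizes:

Dim data As MemoryBlock = pic.GetData(Picture.Formats.BMP)
Dim bytesPerPixel As Integer = (data.Size - 54) \ (pic.Width * pic.Height)   ' 3 on Linux (BGR), 4 on Windows (BGRA)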
The strange thing is that on Linux, once I subtract the first 54 header bytes, the returned MemoryBlock can be used directly in OpenGL with a call like the one below, whereas the MemoryBlock returned on Windows is not structured in a way that the same call can use directly:
OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, Picture.Width, Picture.Height, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, memoryblock)
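For what it's worth, here is a sketch of how both layouts might be handled with a single code path (not the original code; the helper name is mine): read the pixel-data offset and bit depth from the BMP header instead of assuming 54 bytes and 3 bytes per pixel, then pick GL_BGR or GL_BGRA accordingly. I'm not sure the built-in OpenGL module declares GL_BGRA, so the literal value &h80E1 is used below.

Sub UploadBMPTexture(data As MemoryBlock)
  Const GL_BGRA = &h80E1                               ' GL_BGR is &h80E0; GL_BGRA may not be declared in the module

  data.LittleEndian = True
  Dim dataOffset As Integer = data.UInt32Value(10)     ' where the pixel rows really start
  Dim w As Integer = data.Int32Value(18)
  Dim h As Integer = data.Int32Value(22)               ' a negative value would mean top-down rows
  Dim bitsPerPixel As Integer = data.UInt16Value(28)   ' 24 on Linux, 32 on Windows, per the sizes above

  ' Copy just the pixel rows into their own MemoryBlock.
  Dim n As Integer = data.Size - dataOffset
  Dim pixels As New MemoryBlock(n)
  pixels.StringValue(0, n) = data.StringValue(dataOffset, n)

  If bitsPerPixel = 32 Then
    OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, w, h, 0, GL_BGRA, OpenGL.GL_UNSIGNED_BYTE, pixels)
  Else
    OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, w, h, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, pixels)
  End If
End Sub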