Picture.ToData: OS sensitive?

Hi all,

I have a program that reads textured meshes (Wavefront files) and displays them in an OpenGLSurface control.

To do that, I have to transform the texture of the Wavefront file (an image) into a MemoryBlock readable by OpenGL, and I have noticed different behaviour between Windows and Linux (not yet tested on macOS):

Concretely, I have two options to read the texture/picture and transform it into a MemoryBlock.

Option 1, by far the faster:

memoryblock=texture.ToData(Picture.Formats.bmp).RightB(texture.height * texture.width*3)

The RightB is needed to skip the BMP format header.
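(Side note: the 54-byte header only holds for a plain BITMAPFILEHEADER + BITMAPINFOHEADER file. An untested, more defensive variant of option 1 would read the pixel-data offset from the header itself: bfOffBits is a little-endian UInt32 at byte 10 of every BMP file and tells where the pixel data starts. Variable names below are just illustrative.)

Dim bmp As MemoryBlock = texture.ToData(Picture.Formats.bmp)
bmp.LittleEndian = True ' BMP header fields are little-endian
Dim pixelStart As Integer = bmp.UInt32Value(10) ' bfOffBits: start of pixel data
memoryblock = bmp.RightB(bmp.Size - pixelStart) ' pixel data only, header skipped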

Option 2

offset = 0
For y = texture.Height - 1 To 0 Step -1
  For x = 0 To texture.Width - 1
    surfCol = texture.RGBSurface.Pixel(x, y)
    memoryblock.Byte(offset) = surfCol.Blue
    memoryblock.Byte(offset + 1) = surfCol.Green
    memoryblock.Byte(offset + 2) = surfCol.Red
    offset = offset + 3
  Next x
Next y

In Linux the two options produce the same result, whereas in Windows they do not. Any suggestions?

Stefano

Little/Big endian?

Hi Jeff, I just checked; unfortunately it is not an endian issue.

I’m attaching the screenshots of the different results.

Screenshot 1 (rocks and vegetation) is produced by options 1 & 2 in Linux and by option 2 in Windows.

Screenshot 2 (psychedelic image) is produced by option 1 in Windows.


OS sensitive? Maybe. I don’t know how things work behind the scenes, but different libraries may produce different binary bitmaps, e.g. using compression or choosing a different number of bits per pixel, with or without an alpha channel. I think you should decode the output to understand it instead of assuming a fixed format. Export the contents and use some tool to analyze and show the data structures of both.
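For instance, one way to do that analysis right from Xojo (an untested sketch; the offsets are the standard BITMAPFILEHEADER/BITMAPINFOHEADER field positions) is to dump the relevant header fields of the ToData output on each OS and compare:

Dim mb As MemoryBlock = texture.ToData(Picture.Formats.bmp)
mb.LittleEndian = True ' BMP header fields are little-endian
System.DebugLog("bfOffBits (pixel data offset): " + Str(mb.UInt32Value(10)))
System.DebugLog("biWidth: " + Str(mb.Int32Value(18)))
System.DebugLog("biHeight: " + Str(mb.Int32Value(22)) + " (positive = bottom-up rows)")
System.DebugLog("biBitCount: " + Str(mb.UInt16Value(28)) + " (24 = BGR, 32 = BGRA)")
System.DebugLog("biCompression: " + Str(mb.UInt32Value(30)) + " (0 = BI_RGB, uncompressed)")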

Hi Rick, thanks for the advice.
It is indeed likely a problem of the image format returned by the same function on different OSes.

memoryblock = Picture.ToData(Picture.Formats.bmp)
returns a MemoryBlock of Picture.Height * Picture.Width * 3 + 54 (header) bytes in Linux and Picture.Height * Picture.Width * 4 + 54 bytes in Windows.
I suppose that in Windows, Formats.bmp provides Blue-Green-Red-Alpha, whereas in Linux it provides just the Blue-Green-Red info.
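(A quick arithmetic check of that supposition, as an untested sketch that ignores any row padding: divide the pixel-data size by the pixel count.)

Dim mb As MemoryBlock = texture.ToData(Picture.Formats.bmp)
' Comes out near 3 on Linux (BGR) and near 4 on Windows (BGRA)
Dim bytesPerPixel As Double = (mb.Size - 54) / (texture.Width * texture.Height)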

The strange thing is that in Linux, once I subtract the first 54 bytes, the returned MemoryBlock can be used directly in OpenGL like this:

OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 3, Picture.width, Picture.height, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, memoryblock)

whereas the MemoryBlock returned in Windows is not structured in a way that is directly usable in OpenGL, like this:
OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, Picture.width, Picture.height, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, memoryblock)

Do you have OpenGL.GL_BGRA or OpenGL.GL_RGBA options available for testing?

Yes, sorry, I tested with BGRA as well as BGR:

OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, Picture.width, Picture.height, 0, OpenGL.GL_BGRA, OpenGL.GL_UNSIGNED_BYTE, memoryblock)

You may need to create a normalizer function, something like:

memoryblock = ExtractBGR(texture.ToData(Picture.Formats.bmp)) // From BGRA

It would loop through the 32-bit pixels (4 bytes each), extract the OpenGL.GL_BGR-compatible 24 bits (3 bytes), and skip the last byte, discarding the alpha.
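A minimal sketch of such an ExtractBGR (untested; it assumes the input is the raw ToData output, i.e. a 54-byte header followed by tightly packed 32-bit BGRA pixels):

Function ExtractBGR(bmpData As MemoryBlock) As MemoryBlock
  Const kHeaderSize = 54 ' BITMAPFILEHEADER + BITMAPINFOHEADER
  Dim pixelCount As Integer = (bmpData.Size - kHeaderSize) \ 4
  Dim result As New MemoryBlock(pixelCount * 3)
  Dim src As Integer = kHeaderSize
  Dim dst As Integer = 0
  For i As Integer = 0 To pixelCount - 1
    result.Byte(dst) = bmpData.Byte(src) ' Blue
    result.Byte(dst + 1) = bmpData.Byte(src + 1) ' Green
    result.Byte(dst + 2) = bmpData.Byte(src + 2) ' Red
    src = src + 4 ' skip the 4th byte (alpha)
    dst = dst + 3
  Next
  Return result
End Function

One thing to keep in mind with 3-byte pixels: OpenGL's default unpack alignment of 4 can trip things up when texture.Width * 3 is not a multiple of 4.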

I’ve read that BMP with 32-bit depth uses a BGRA layout.

BMP also often stores its bitmap upside down (bottom row first).
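Whether a given file is stored bottom-up can also be read from the header: biHeight (a signed Int32 at byte 22) is positive for bottom-up rows and negative for top-down. An untested snippet:

Dim mb As MemoryBlock = texture.ToData(Picture.Formats.bmp)
mb.LittleEndian = True
Dim bottomUp As Boolean = (mb.Int32Value(22) > 0) ' biHeight > 0: rows stored bottom-up

Since OpenGL treats the first row of the data as the bottom row of the texture, the bottom-up layout (and the reversed y loop in option 2 above) generally lines up with what glTexImage2d expects.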

Solved, thanks to all

memoryblock = texture.ToData(Picture.Formats.bmp)

If memoryblock.Size > (texture.Height * texture.Width * 3 + 54) Then
  memoryblock = texture.ToData(Picture.Formats.bmp).RightB(texture.Height * texture.Width * 4)
  OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, Picture.width, Picture.height, 0, OpenGL.GL_BGRA, OpenGL.GL_UNSIGNED_BYTE, memoryblock)
Else
  memoryblock = texture.ToData(Picture.Formats.bmp).RightB(texture.Height * texture.Width * 3)
  OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 3, Picture.width, Picture.height, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, memoryblock)
End If

So setting it to OpenGL.GL_BGRA solves it. Good to know.


I guess it could be changed to:

memoryblock = texture.ToData(Picture.Formats.bmp)

If memoryblock.UInt16Value(28) = 32 Then // 32 bits per pixel: BGRA
  memoryblock = memoryblock.StringValue(54, memoryblock.Size - 54) // cut header
  OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 4, Picture.width, Picture.height, 0, OpenGL.GL_BGRA, OpenGL.GL_UNSIGNED_BYTE, memoryblock)
Else // 24 bits per pixel: BGR
  memoryblock = memoryblock.StringValue(54, memoryblock.Size - 54) // cut header
  OpenGL.glTexImage2d(OpenGL.GL_TEXTURE_2D, 0, 3, Picture.width, Picture.height, 0, OpenGL.GL_BGR, OpenGL.GL_UNSIGNED_BYTE, memoryblock)
End If

NB: a hex Color (&h00FF6432) has the alpha value in the leading byte (&h00 above).

We are talking about the BGRA layouts used internally in BMP images and OpenGL, but Xojo color constants are RGBA; what you are talking about is ARGB, and I don’t know where that is used in the Xojo context.

RGBA from the docs:

But… In memory, we see that Xojo stores it as BGRA


Note: both Intel and Apple Silicon use little-endian layouts.
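If anyone wants to see those byte orders empirically, here is an untested sketch (offsets assume an uncompressed BMP; the colour is the &hFF6432 example from above): fill a small Picture with a known colour, export it with ToData, and dump the first pixel's bytes.

Dim p As New Picture(4, 4)
p.Graphics.DrawingColor = &cFF6432 ' R = &hFF, G = &h64, B = &h32
p.Graphics.FillRectangle(0, 0, p.Width, p.Height)

Dim mb As MemoryBlock = p.ToData(Picture.Formats.bmp)
mb.LittleEndian = True
Dim pixelStart As Integer = mb.UInt32Value(10) ' bfOffBits
' Expected output: 32 64 FF -> Blue, Green, Red (then an alpha byte on 32-bit BMPs)
System.DebugLog(Hex(mb.Byte(pixelStart)) + " " + Hex(mb.Byte(pixelStart + 1)) + " " + Hex(mb.Byte(pixelStart + 2)))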