Displaying a 16-bit image in Xojo - possible?

I am using a frame grabber to capture images on a film scanner. The raw image is captured by the frame grabber and put in a memory block. I then pass this to the Einhugur RawBitmap class, which converts it to a picture for display in my app.

Everything works fine if the frame grabber is set to 8-bit monochrome capture. But if I set it to 16-bit monochrome, the resulting image is …not right.

With the camera set to Mono8 acquisition (confirmed in the manufacturer’s software and their debug logging tool), I use the following code in Xojo:

var grabbed as boolean = false

//get the frame buffer into the captureBuffer memory block
//This function looks at the camera's current mode and determines whether to 
//expect an 8 or 16 bit image in the frame buffer. 
grabbed = FrameGrabber.CXPGetFrameBuffer

if grabbed = true then 
  
  var rowbytes as integer
  var bpp as integer
  
  if FrameGrabber.frameBPP = 2 then //mono16
    bpp = 16
  elseif FrameGrabber.frameBPP = 1 then //mono8
    bpp = 8
  end if
  
  //Calculate bytes per row in the image
  rowbytes = (bpp * FrameGrabber.frameColumns) / 8
  
  //Use Einhugur plugin to convert raw image data to a picture
  var frame as new RawBitmap(FrameGrabber.captureBuffer, FrameGrabber.frameColumns, FrameGrabber.frameRows, rowbytes, RawBitmap.RawBitmapFormat.gg) 
  
  Var convertedFrame as Picture = RawBitmapConverter.ToPicture(frame,false)
  var displayedFrame as new picture(MainWindow.FrameDisplay.Width, MainWindow.FrameDisplay.Height)
  displayedFrame.Graphics.DrawPicture(convertedFrame, 0, 0, displayedFrame.Graphics.Width, displayedFrame.Graphics.Height, 0, 0, convertedFrame.Width, convertedFrame.Height)
  MainWindow.FrameDisplay.Backdrop = displayedFrame
  
end if

Resulting image - weird image but this is how it should look:

Now, if I set the camera to Mono16 and grab a frame in the manufacturer’s software, it looks exactly like the image above, only it’s from the 16-bit frame buffer. But if I change how RawBitmap interprets the data, using “gg” as the RawBitmapFormat when setting up the RawBitmap object, the result is this:

You can see there’s some image there. Obviously this is an issue with how the 16-bit data is being interpreted - but by Xojo? By RawBitmap? I’m not sure. For the purposes of the app itself, displaying the image at 8 bits is just fine, as long as I’m able to process the image behind the scenes at 16. What am I missing here?


Xojo is not doing anything wrong, since Xojo cannot handle 16-bit images at all.

However, my plugin can (the one you’re using).

So the question, of course, is where in the chain things go wrong. I don’t know for sure, but I am guessing your rowbytes calculation might be wrong?

Your comment in the code says 16-bit color, but then you feed gg into the color space, which would mean 16-bit gray… which again would mean totally different rowBytes and a different internal interpretation. Which one is it supposed to be?

Your comment in the code says 16-bit color

That’s a mistake in the comments. The camera’s mode is set either by a command you send through the DLL, or in the camera firmware using third-party software. I am verifying that it’s in the correct mode using both the third-party software and the frame grabber’s debug logger, which reports absolutely everything that happens related to the grabber, the camera, or the API.

frameBPP in my code holds the result of requesting the camera’s reported bits-per-pixel mode – not the actual bits per pixel. It’s a numerical value that maps to the different color or monochrome modes, which are determined by the kind of camera connected. For example:

frameBPP: 4 = bayer32
frameBPP: 2 = mono16
frameBPP: 1 = mono8

As this is a monochrome camera, the only two modes we will ever use are 1 and 2; I included 4 for illustration, but this is not a Bayer camera.
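Given the mapping above, the mode-to-bpp translation could be written as a Select Case, which also guards against an unexpected mode code (a sketch only; the frameBPP values are the ones reported by this particular frame-grabber API, and the bayer32 case is included purely for completeness):

Var bpp As Integer
Select Case FrameGrabber.frameBPP
Case 1 // mono8
  bpp = 8
Case 2 // mono16
  bpp = 16
Case 4 // bayer32 (never used with this monochrome camera)
  bpp = 32
Else
  // Unknown mode code - fail loudly rather than guess a rowbytes value
  Raise New UnsupportedOperationException
End Select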

My rowbytes calculation is the number of bits per pixel (16) times the number of pixels in a row (columns, as reported by the frame grabber), divided by 8 to get bytes per row. Is RawBitmap expecting something different?
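As a concrete check of that arithmetic (using a hypothetical 4096-pixel-wide frame; substitute the actual column count your grabber reports):

// Worked example with a hypothetical 4096-column frame.
// gg is 2 bytes per pixel, so rowbytes must be at least columns * 2.
Var columns As Integer = 4096 // hypothetical width
Var rowbytesMono8 As Integer = (8 * columns) / 8   // = 4096 (1 byte/pixel)
Var rowbytesMono16 As Integer = (16 * columns) / 8 // = 8192 (2 bytes/pixel)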

There is a chance, of course, that there is a bug somewhere; gg is an extremely little-used color space, so it’s probably not the most tested and battle-hardened from the usage of our users.

So, to figure out whether it’s the RawBitmapConverter that fails or something else, you could try just creating a new RawBitmap, maybe 200 x 200, with the gg color space, then do bitmap.FillRect to fill a portion of the image with some shade.

Then feed it to the RawBitmapConverter and see what you get. That way we have narrowed down a little bit what could be failing.

OK. I’m heading home for the night but will be spending tomorrow working on this, so I’ll try that out. I should also be able to do it with a 16-bit file generated by some software we use here - like a TIFF or something - as well as an image generated in code.

I actually see there is something wrong, the test cases of:

var bitmap as RawBitmap = new RawBitmap(nil, 200, 200, 0, RawBitmap.RawBitmapFormat.g)

bitmap.FillRect(30, 30, 50, 50, RGB(50, 50, 50))

var p as Picture = RawBitmapConverter.ToPicture(bitmap)

g.DrawPicture(p,0,0)

vs

var bitmap as RawBitmap = new RawBitmap(nil, 200, 200, 0, RawBitmap.RawBitmapFormat.gg)

bitmap.FillRect(30, 30, 50, 50, RGB(50, 50, 50))

var p as Picture = RawBitmapConverter.ToPicture(bitmap)

g.DrawPicture(p,0,0)

The latter code gives a bad result.

I will investigate it.

Thanks for looking into it!

TypeLib has now been updated to version 13.2.2, fixing the issue above. Basically, conversion from the RawBitmap gg and ggAA color spaces to Xojo Picture got fixed.

You can find the update on our website.


Awesome. Thanks! I’ll install and test this tomorrow when I’m back in the office.

I just had a chance to test this (with gg, since we’re not capturing an alpha channel), and it works perfectly. Thanks for the quick fix!
