So I’m heading down a rabbit hole at light speed now.
This started out as a CGI project. I’ve written them before with no problems, normally using PRINT to output my console application’s HTML response.
But this time I need to send binary data, and to keep it simple, I tried just returning a JPEG image as a test app. Guess what: you can’t, because PRINT and StandardOutputStream truncate the output at the first 0x00 byte. I’ve tried Variant and MemoryBlock, but can’t get past it.
So then I started looking (since I’m in an IIS Windows environment) at using DLL calls into Kernel32.dll to do my output. But in order to use WriteConsole you first have to get a handle to STD_OUTPUT. I found an old NUG article for doing this on a Mac using System.Framework, where the Mac handle is just 1. But here’s the Visual Basic code I found to reference what STD_OUTPUT_HANDLE is…
Public Const STD_OUTPUT_HANDLE = -11&
Now I’m confused. What is the trailing “&” for? Hex? And what is the negative sign for? I thought a DWORD was a 32-bit unsigned integer. In testing, -11 decimal crashes, and -17 decimal (treating the 11 as hex) does work, but the output isn’t STD_OUTPUT — it went somewhere else. Looking at a snippet of the win32.h header file is not helpful at all.
[code]#define FILE_ATTRIBUTE_READONLY   0x00000001
#define FILE_ATTRIBUTE_HIDDEN     0x00000002
#define FILE_ATTRIBUTE_SYSTEM     0x00000004
#define FILE_ATTRIBUTE_DIRECTORY  0x00000010
#define FILE_ATTRIBUTE_ARCHIVE    0x00000020
#define FILE_ATTRIBUTE_DEVICE     0x00000040
#define FILE_ATTRIBUTE_NORMAL     0x00000080
#define FILE_ATTRIBUTE_TEMPORARY  0x00000100
#define STD_INPUT_HANDLE          ((DWORD)-10)
#define STD_OUTPUT_HANDLE         ((DWORD)-11)
#define STD_ERROR_HANDLE          ((DWORD)-12)[/code]
More questions: what is ((DWORD)-11)? Is that taking -11 as hex and forcing it into a DWORD? Isn’t that 0xFFFFFFEF? So why not just notate it that way?
Anyone have a clue how to force binary console output, or how to resolve the value of STD_OUTPUT_HANDLE? Thanks.