I am working on a more complete emailMessage parser and have run into a situation that just does not make sense to me.
I sent a simple email from a Windows 8.1 machine via my Gmail account… when I receive it over a raw POP3 socket, the raw text looks (in part) like:
… but the text is clearly base64 and not 7bit…
So it seems that the message is not being properly constructed by the Windows client… but when I receive the same email in Mac Mail, it decodes properly.
I do not know a reliable way to determine if a string is base64… but it seems the Mac is figuring it out… or am I missing something?
Thanks for any insight,
base64 mostly ends with "=" or "=="…
for some more info, just look for that trailing string;
Thanks for the reply… but 2 things:
My understanding is that 7bit is basically the same as ASCII… characters 1–127… That string is base64… I checked it.
base64 string length must be a multiple of 4 bytes… with "=" used as padding when the raw data length is not a multiple of 3… So the encoded string can end with zero, one, or two "=" characters… meaning plenty of valid base64 has no "=" at the end at all… so I cannot depend on that…
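Those length and padding rules can be combined into a rough heuristic. Here is a minimal sketch in Python (the function name `looks_like_base64` is my own, not from any poster's code); note it is only a heuristic, since short ordinary words like "Test" also happen to be valid base64:

```python
import base64
import re

def looks_like_base64(s: str) -> bool:
    """Heuristic: base64 alphabet only, length a multiple of 4,
    at most two '=' of trailing padding, and a strict decode succeeds."""
    s = s.strip()
    # Padded base64 is always a multiple of 4 characters long.
    if not s or len(s) % 4 != 0:
        return False
    # Only A-Z, a-z, 0-9, '+', '/' plus up to two trailing '=' allowed.
    if not re.fullmatch(r"[A-Za-z0-9+/]+={0,2}", s):
        return False
    try:
        base64.b64decode(s, validate=True)
        return True
    except ValueError:
        return False
```

In practice you would apply this only when the header's declared Content-Transfer-Encoding disagrees with what the body looks like, rather than to every message.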
Well there is always this one:
Which indicates human-readable text (non-binary), so it's an ASCII string.
But I'm not exactly sure what the best options are. What I do know is that they exist, as there are more than enough email services/apps that handle this flawlessly.
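The "human readable / non-binary" check mentioned above can be written directly against the 7bit definition. A minimal sketch (my own helper name, not from the thread), treating tab, CR, and LF as the only acceptable control characters:

```python
def is_7bit_text(s: str) -> bool:
    """True if every character is printable ASCII (32-126)
    or common whitespace (tab, LF, CR) - i.e. plausible 7bit text."""
    return all((32 <= ord(c) <= 126) or c in "\t\n\r" for c in s)
```

A body that passes this test could still be base64 (base64 is itself 7bit-clean), which is exactly why the encoding header matters and why a lying header is so annoying.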
Sometimes email messages lie about their encoding: the header says UTF-8 while the HTML is Latin-something. This makes email parsing partly guesswork. But I haven't seen this particular fun before. Can you send me the complete raw mail privately? I'd like to check how my own email parser handles it.
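For the "header says UTF-8 but the bytes are Latin-something" case, one common defensive pattern is to try the declared charset first and fall back to Latin-1, which can decode any byte sequence. A hedged sketch (the function and parameter names are mine, not from any parser in this thread):

```python
def decode_body(raw: bytes, declared_charset: str = "utf-8") -> str:
    """Try the charset the header declared; if the bytes do not
    actually match it, fall back so the message stays readable."""
    for charset in (declared_charset, "utf-8", "latin-1"):
        try:
            return raw.decode(charset)
        except (UnicodeDecodeError, LookupError):
            continue
    # latin-1 above never fails, but keep a last-resort path anyway.
    return raw.decode("latin-1", errors="replace")
```

Latin-1 as the final fallback may mis-render some characters, but it never throws, which is usually the right trade-off when parsing mail from the wild.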