System.DebugLog output Web 2.0

I wanted to make sure I’m not inadvertently creating server log files or tying up system resources by leaving my debug entries in a deployed Web App.

Are there any cautions regarding System.DebugLog, or other ‘debugging’ code that could create a mess if left in for (beta) deployment?

If you want the debug logs only when you are debugging, you can put them inside a conditional compilation block:

#If DebugBuild Then
  System.DebugLog("your debug message here")
#EndIf

That way the release build won’t include any debug code.

For certain, anything long-term that I plan on leaving in the project for debugging will get a nice conditional wrapper (like your sample). But what are the ramifications of NOT taking these lines out? Does System.DebugLog (or any other statement) cause hosting servers to jam up with log entries?

According to the documentation, on Linux it prints to StdErr, which is not retained and vanishes when the application closes. It basically just tosses the message into the terminal if you launched your app from a terminal.

Great for troubleshooting builds on systems you can’t debug on.

I would expect that System.DebugLog is not part of a release build,
and if I need a real log I would use System.Log.

It will write to the system journal, the same as

System.Log(System.LogLevelDebug, "your message")

does in an actual build. It’s not stripped out.

You can review this on Linux using the terminal:

journalctl -b -n 200 -f

That is, if your system uses the journal; otherwise it may end up somewhere else.
The “-f” flag keeps following the log.

I bet you don’t want your customers to see all of that, so use #If DebugBuild…

Apparently I have a memory error. I could’ve sworn StdErr was ephemeral, tied to the process ID and only logged to /proc/PROCESS_ID/fd/2, which was removed when the process terminated and/or periodically.

I wish I had time to investigate and test. Thanks for clarifying!

Another option, if you are working with a database anyway, is to store your “logs” in a dedicated table. There are pros and cons to this: database accesses are usually very, very fast, but a plain file log is always (by nature) the fastest approach.

But I still often use a dedicated table in a database, because I can then access it for many customers and condense this information into a dashboard of my own very easily. I also like to keep some logging in a production release. Of course, this all depends on any GDPR requirements, the nature of your app, your customer, etc. For instance, I might log from places in my code that I believe will never be executed, just to be informed ASAP if it happens anyway.
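
A minimal sketch of such a helper, assuming a hypothetical app_log table (columns created_at, level, source, message) and an already-connected Database instance named db; none of these names come from the posts above:

' Sketch: write one log entry to a hypothetical "app_log" table.
' Assumes "db" is an already-connected Database and the table has
' columns created_at, level, source and message.
Sub WriteLog(db As Database, level As String, source As String, message As String)
  Try
    db.ExecuteSQL("INSERT INTO app_log (created_at, level, source, message) VALUES (?, ?, ?, ?)", _
      DateTime.Now.SQLDateTime, level, source, message)
  Catch err As DatabaseException
    ' Fall back to the system log so a failed insert is not silent.
    System.Log(System.LogLevelError, "WriteLog failed: " + err.Message)
  End Try
End Sub

Keeping the insert in one helper also makes it easy to wrap the call in #If DebugBuild later, or to switch the target.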

For deleting old entries, I’m creating a Linux console app that deletes old data in that table, run periodically via cron.
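
A sketch of what that cleanup could look like, again using the hypothetical app_log table and illustrative connection settings (the Run event would live in a Xojo console project; the cron line in the comment is only an example):

' Console app sketch: delete log rows older than 30 days.
' Example cron entry (runs every night at 03:00):
'   0 3 * * * /opt/myapp/LogCleaner
Function Run(args() As String) As Integer
  Var db As New PostgreSQLDatabase
  db.Host = "localhost"
  db.DatabaseName = "myapp"
  db.UserName = "myapp"
  db.Password = "secret" ' placeholder credentials
  Try
    db.Connect
    db.ExecuteSQL("DELETE FROM app_log WHERE created_at < ?", _
      DateTime.Now.SubtractInterval(0, 0, 30).SQLDateTime)
    Return 0
  Catch err As DatabaseException
    System.Log(System.LogLevelError, "Log cleanup failed: " + err.Message)
    Return 1
  End Try
End Function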

Sometimes it is helpful to know, ahead of your customer, what their next problem (or even better for us devs: their next requirement) will most likely be :wink:

Regarding file-space consumption and performance, I would not worry (until you run into an issue in ten years). Just look at the logs of Apache2 or Nginx after a standard installation. They write much more than you will probably ever produce with your few log entries, even if you keep them for eternity.

Depends on the system setup

The journal manages this system-wide on Linux. That’s way better, since system administrators can find problems much more easily.

I am NOT deleting system messages via cron; I’m deleting my own log entries in myLogTable in the database via a simple cron job calling a Xojo Linux console app.

As said, it always depends. In a corporate environment, system logs are definitely the way to go. If your customer has no administrator, or the tech department doesn’t maintain the servers of your web apps with care and love, other options may be valid as well. I have customers where I am doing both: writing to the system logs and, in parallel, to the database. I am pulling such messages from many customers into a dashboard of my own, and usually I see issues far ahead of the customer ;-).
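
As a sketch of that dual-logging pattern, reusing the hypothetical WriteLog helper from above (the level and source values are purely illustrative):

' Sketch: log to the system journal and, in parallel, to the database table.
Sub LogEverywhere(db As Database, source As String, message As String)
  System.Log(System.LogLevelNotice, message)
  WriteLog(db, "NOTICE", source, message)
End Sub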

Hi Jeannot, your detailed response is very much appreciated. I’m always trying to cut down on waste, even tiny log files. Memory leaks and an unnecessary buildup of files are things I try to avoid. It may not save the planet from global warming, but at least it will make my footprint on a server easier to manage.

Logging to a LogTable in SQL sounds like a great idea. :slight_smile:

It depends (as always). If you want, for instance, to monitor issues or errors in/with the database itself, then my “idea” is obviously the wrong approach :wink: - but in general it is a very fast and easy way to journal some debug information and the core bits of information you may want to keep in production too (for instance, who tried to log in unsuccessfully).