Compiled T-SQL vs SQL sent via Xojo

Hi Folks…

I’ve been gone a while, and I’m thinking about a project I want to design in Xojo to try it out. I downloaded and installed Xojo a while back, but this is the first time I’ve had the time to really play.
Here’s my DB question.

Isn’t it more manageable and faster to do operations in the db itself by sending SQL or T-SQL, etc., to the database and then just retrieving the results? For example, if I want to get some results for a report every day, I’d design the report in T-SQL, save it on the server, and then run it via a command sent from Xojo and retrieve the results, versus coming up with all the SQL in Xojo and sending it to the database each time? It’s been years, but back when I used classic ASP and VB, I would compile scripts on SQL Server, run those, save the results in a temp table or something, then open that in the front end… I hope that all makes sense… Sorry for the simpleton question, but it’s been sooooo long since I’ve coded due to bad health, and I want to do so again, from home, eventually writing some apps to sell that I think are needed…
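To make that concrete, here’s roughly the kind of thing I mean saved on the server (the table and column names are just made up for illustration):

CREATE PROCEDURE dbo.DailySalesReport
    @ReportDate date
AS
BEGIN
    SET NOCOUNT ON;
    -- The whole report lives on the server; the front end just calls it
    SELECT CustomerID,
           COUNT(*)   AS OrderCount,
           SUM(Total) AS TotalSales
    FROM dbo.Orders
    WHERE OrderDate = @ReportDate
    GROUP BY CustomerID;
END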
Again, thanks so much for all the help here… Everyone here is great…
Tim

I assume you are talking about creating stored procedures on the server (since you mention TSQL). Unless your queries/actions on the db are very complex and/or involve a massive amount of data, I don’t think you’ll see any difference in performance. As for manageability, doing it all in Xojo keeps the code in one place, which, IMO, makes future (i.e., foggy memory) maintenance easier. Another con with coding on the server is upgrade synchronization in a multiuser environment. But then that’s already an issue if you make structure changes to the db.

Jay… I am going to disagree… Let the DB do as much work as it can. No issue with upgrades… the server keeps its stored procedures in one place and as many users as needed share the same code (new instances, but same code)
Or at least that is how Oracle works…

I have to agree with Dave.
Always try to let the DB do the work.

Of course there will be times you have to do things at the application level, but the DB should be the first choice for operations.

I also agree with @Dave S and @brian franco.

At my day job, I spend at least 70% of my time writing T-SQL for a massive multi-user web environment, where 90% of the business logic of the applications is in Stored Procedures, User-Defined Functions, Views, Synonyms, etc. The boost in performance, the ease of system-wide updates, and the lessened burden on application building are indispensable.

Sure, Desktop apps are different: you can offload some processing effort to the client machine to ease the load at the server level. But in web apps you don’t want to rely on a browser to do that work for you, or even the application server, because web servers are just glorified file servers and not as efficient as a database server. But I digress.

At the moment I am more focused on local Desktop apps that aren’t talking to a server. But I have some projects in mind for that too, where I still plan to implement the majority of the business logic at the database level. This frees up the Desktop app to handle just the UI logic and other local work like exports, printing, etc.

Plus the Desktop app only has to call one line of T-SQL, something like:

EXEC dbo.MyStupidlyComplexAggregationRoutineAndGiveMeOnlyTheResults ?
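
In Xojo terms, the call could look something like this rough sketch using the classic framework (the connection details and the date value are placeholders, and in real code you’d bind the ? with a prepared statement rather than hard-coding a literal):

Dim db As New MSSQLServerDatabase
db.Host = "myserver" // placeholder connection details
db.DatabaseName = "MyDB"
db.UserName = "user"
db.Password = "pass"

If db.Connect Then
  // Run the stored procedure; the server does all the heavy lifting
  Dim rs As RecordSet = db.SQLSelect("EXEC dbo.MyStupidlyComplexAggregationRoutineAndGiveMeOnlyTheResults '2019-12-01'")
  If db.Error Then
    MsgBox(db.ErrorMessage)
  Else
    While Not rs.EOF
      // Read each row, e.g. rs.Field("TotalSales").StringValue
      rs.MoveNext
    Wend
  End If
End If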

That’s my vote, anyway. Hope that helps :slight_smile:

Dave and Brian,

I agree to let the db server do the “work”. I thought Tim’s question was along the lines of using a stored procedure or even a view on the server versus just sending the underlying SQL to the server. Only in certain cases will there be a perceivable performance difference.

Now if Tim meant only getting the raw data into Xojo and filtering/sorting it there, then I wholeheartedly agree that’s not optimal.

Dave, on upgrading I was referring to getting the db and app upgraded simultaneously in a multi-user environment. You have to come up with some mechanism to keep old apps from using the new db code, or new apps from using the old db code, until all apps and the db have been updated. It’s easier to manage that if all of the “code” is in the app. Now, if all users run the app from the same location, then it’s no problem. But if you have multi-platform apps that access the same db and db code, and/or remote users, then it’s more complicated. As far as one location for db code goes, replication throws a wrench into that (I know, as I’ve done it in the past).

Again… I tend to disagree. Any interaction with the database should be transparent. The only time an “old” version of a desktop app should have an issue is if a NEW stored procedure or function was added, and even then not so much, since it wouldn’t be calling something it didn’t know about. Any existing code should be able to discern who is calling if that caller requires a different “answer”.

I wrote tons of backend code for a web-based system that managed 600,000 doctors and hospitals and usually had a few hundred online users, some in-house, but most across a selection of our clients. And since we could not force them to upgrade operating systems, browsers, or apps, we had to make sure the back end “just worked”.
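
As a rough sketch of what I mean by discerning who is calling (all of these names are invented for illustration), a procedure can take the caller’s version and hand back the appropriate “answer”:

CREATE PROCEDURE dbo.GetCustomerSummary
    @ClientVersion int = 1  -- older clients never pass this, so they get the legacy shape
AS
BEGIN
    SET NOCOUNT ON;
    IF @ClientVersion >= 2
        SELECT CustomerID, Name, Region FROM dbo.Customers;  -- newer result shape
    ELSE
        SELECT CustomerID, Name FROM dbo.Customers;          -- legacy result shape
END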

There is no one easy, elegant way of handling such a thing unless you only provide web apps, but even then there are challenges (depending on the situation).

My strategy has always been to treat the database server as the “master”, in which case a “version” query needs to be performed by the Desktop client app either:
A.) - When client app starts up (and user logs in).
B.) - Periodically, like once an hour.
C.) - Before the user starts a long data entry process or other involved work.
D.) - Preceding each query.
E.) - As part of the results returned (trickier, because object binding may throw an error)
When the Database “version” value is greater than the client “version” (stored in a constant?), stop the Desktop app and tell the user an upgrade is required before allowing them to continue. Then either trigger the download/upgrade automatically, or make the user do it.
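
A minimal sketch of option A in Xojo, assuming a one-row dbo.SchemaVersion table on the server (the table, column, and constant names are all hypothetical):

Function SchemaIsCurrent(db As MSSQLServerDatabase) As Boolean
  Const kClientSchemaVersion = 42 // the version this build was written against
  Dim rs As RecordSet = db.SQLSelect("SELECT TOP 1 VersionNumber FROM dbo.SchemaVersion")
  If db.Error Or rs = Nil Or rs.EOF Then Return False // can't verify, so treat as out of date
  Return rs.Field("VersionNumber").IntegerValue <= kClientSchemaVersion
End Function

If that returns False at startup, the app shows the upgrade message and quits instead of proceeding.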

The above allows us to have minimal downtime and interruptions to the company as a whole. We don’t have as many proprietary Desktop apps as web apps (a ratio of probably 1:50), but even for web apps performing Ajax/JSON calls in the background, we provide this logic, which notifies the user and forces a page refresh so they get any necessary HTML/Javascript updates. I even had to come up with a process to load HTML fragments only when necessary, to help mitigate interruptions during updates.

This is what we do where I work. And because these are internal Intranet systems, we also generally notify our users ahead of time of the coming change(s) so people are not surprised.

[quote=467659:@Dave S]Again… I tend to disagree. Any interaction with the database should be transparent. The only time an “old” version of a desktop app should have an issue is if a NEW stored procedure or function was added, and even then not so much, since it wouldn’t be calling something it didn’t know about. Any existing code should be able to discern who is calling if that caller requires a different “answer”.

I wrote tons of backend code for a web-based system that managed 600,000 doctors and hospitals and usually had a few hundred online users, some in-house, but most across a selection of our clients. And since we could not force them to upgrade operating systems, browsers, or apps, we had to make sure the back end “just worked”.[/quote]
That’s an impressive achievement, Dave; I applaud your efforts. Sincerely.

Rant incoming… Our management would love for us to provide seamless system upgrades with no interruptions (like you describe), but working with the Microsoft platform’s suite of tools and application languages over the past 5 or so years has created dependency/versioning hell. Couple that with a scattered team of in-house developers, many of whom are not using the same language versions/features or tools (no, we have no in-house tool standards), and you have a real mess to juggle when it comes to certain kinds of upgrades. I’m still stuck on some tools and versions from 2010 for some projects.

Now if only management would just stop coming up with new requirements, maybe all of us developers could catch up to the same level…

Man, I love Xojo and just doing my own thing with it :slight_smile:

Once you start working on large-scale multiuser systems, this just is not a realistic possibility. You have to handle versioning correctly, and newer clients must be able to fall back to older versions of the database if needed. Once you release a version of the application into the wild, you have no control over which servers will be upgraded in time for the next shift or which clients will receive the new version in time. You need to handle this.

But this is not about tools, it’s about discipline. Before I got out of the game, the last company I worked for shipped an application to 39 banks around the world; no idea how many branch offices they had, but the clients ranged from XP to Windows 10. Once you checked code in, it was automatically checked out, built for each platform, and tested against 39 test suites, one per bank. You then got a report about broken builds and had to fix them immediately.

The only thing I hated about putting business logic into stored procedures was that you were just as locked into that vendor’s SP language, etc., as if you had written local SQL in the client.

A middle tier makes sense, but that may NOT be stored procs. In our case it was definitely an API the client code could talk to, which would then interact with the DB.
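
As a loose sketch of what I mean (names invented; Xojo defines classes and interfaces in the IDE, so this is written out flat for illustration), the client codes against an interface and never sees the vendor’s SP dialect:

// Client code depends only on this interface
Interface ReportService
  Function DailySales(reportDate As Date) As RecordSet
End Interface

// One implementation happens to call SQL Server stored procedures;
// swapping vendors means swapping this class, not the client code.
// mDb is assumed to be an MSSQLServerDatabase property of the class.
Class SQLServerReportService // Implements ReportService
  Function DailySales(reportDate As Date) As RecordSet
    Return mDb.SQLSelect("EXEC dbo.DailySalesReport '" + reportDate.SQLDate + "'")
  End Function
End Class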

[quote=467681:@Norman Palardy]The only thing I hated about putting business logic into stored procedures was that you were just as locked into that vendor’s SP language, etc., as if you had written local SQL in the client.[/quote]

Well, after thirty years I can say I never met a company of any size that switched database environments. Building up the environment takes time and money, and companies don’t want to do it more than once.

I have.
Done it twice, at companies of 1,500 - 3,000 employees and $100+ million in revenue.
Up to the point I did it the first time, I would probably have said much the same as you.
The second time, I was on the team literally because I had done it once and almost no one else had.
And we did split the business logic out from the vendor’s specific SP language.

N-Tier architectures often do this

So have I. We had systems across a few different engines, and I was working to consolidate them all under Oracle, mostly due to having acquired other companies that used different tools.

You can say that again. Our development VP is not a hands-on kind of person, nor does he much care about what any of us does.

I don’t know how many of us developers are working here world-wide (which says something about our VP), but I’ll be generous and say 200. It’s like the bad old days of the Wild West with how some people decide to build stuff. The only restriction is that it has to run on Windows and be properly licensed by the company; we have auditors for that part.

Thanks everyone for the replies. I really appreciate it.

I worked with an IT manager who always made sure we used SPs, Views, etc., and used the DB to do most of the heavy lifting. We basically just called SPs from the client, and the results were returned and sorted or filtered according to the client’s needs. The SPs also created Views for common reports, which limited touching of the original tables, so the users didn’t need direct access to them. He was very stingy with permissions on the DB, but for good reason, since much of the information was the public’s data and had to be protected.
We were using a 150-table SQL Server DB that was used both on the web, with classic ASP calling the SPs, and in-house with C++ or VB front ends.
Again, thanks everyone. Your knowledge is killer and I really appreciate it.