Multiple HandleSpecialURL performance optimisation

Hi,

We have a web app which uses HandleSpecialURL calls between versions. The data that needs to be sent across can be up to about 2 MB (although it is sometimes smaller).

I’ve created a console app to generate the data, because generation uses 100% CPU and can take 30–60 seconds.

Can I use App.DoEvents inside HandleSpecialURL?

I want to manage the number of concurrent calls to the console app (i.e. the number of HandleSpecialURL calls running at once), and also be able to start the console app and wait for its results without causing 100% CPU usage inside the web app itself.

The concurrency-checking code is:

d = new date
dim now as new date // must be initialised before the first loop test
while app.remoteAddresses.Count > app.processorCount and now.TotalSeconds - d.TotalSeconds < 20
  now = new date
  app.DoEvents(1000)
wend
if now.TotalSeconds - d.TotalSeconds >= 20 then // timeout
  app.remoteAddresses.Remove(rmt)
  return false
end if
I want to keep checking for up to 20 seconds, or until the number of console apps running is less than the number of processors.

The console code is:

d = new date
dim now as new date // must be initialised before the first loop test
while urlshell.IsRunning and urlshell.ErrorCode = 0 and now.TotalSeconds - d.TotalSeconds < 6000
  now = new date
  app.DoEvents(100)
wend
dim shellresult as string = urlshell.Result

This is the code I use, which is fine inside a normal thread, but is it OK inside HandleSpecialURL?

For a web app I would use DoEvents with a smaller number, like 10.

Or maybe it would be better to just run the shell asynchronously and wait for its Completed event?
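For comparison, a minimal sketch of the event-driven approach, assuming an asynchronous Shell (Mode 1) wired up with AddHandler; the handler name MyShellDone and the console app path are my own placeholders, not from the original post:

```
// Hypothetical sketch: run the shell asynchronously and react to its
// Completed event instead of polling IsRunning in a DoEvents loop.
dim urlshell as new Shell
urlshell.Mode = 1 // asynchronous
AddHandler urlshell.Completed, AddressOf MyShellDone
urlshell.Execute("/path/to/consoleapp")

Sub MyShellDone(sender as Shell)
  // fires once the console app has exited
  dim shellresult as string = sender.Result
  // hand the result back to the waiting request here
End Sub
```

Note the caveat: HandleSpecialURL has to return the response before it exits, so even with the Completed event you would still need some way to hold the request open until the result arrives.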

At one point there were 10 requests at once (which for a web server seems tiny), but launching 10 shells and waiting for each of them to finish takes too long. What I want to achieve is some kind of load balancing: the app knows how many requests are being handled, so I can cap the number of running shells while delaying any new requests.
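One way to sketch that cap is with a Semaphore sized to the processor count, keeping the same 20-second timeout pattern as the loop above; gShellSemaphore is an assumed name for a shared property created once at startup, e.g. gShellSemaphore = new Semaphore(app.processorCount):

```
// Hypothetical sketch: only app.processorCount shells may run at once.
// TrySignal is non-blocking, so we poll it with short DoEvents pauses.
dim d as new date
dim now as new date
while not gShellSemaphore.TrySignal and now.TotalSeconds - d.TotalSeconds < 20
  app.DoEvents(10)
  now = new date
wend
if now.TotalSeconds - d.TotalSeconds >= 20 then
  return false // timed out waiting for a free slot
end if
// ... launch the shell and wait for its result here ...
gShellSemaphore.Release // free the slot once the shell has finished
```

The advantage over counting remoteAddresses is that the semaphore counts running shells directly, so a request that is queued but not yet shelling out doesn't consume a slot.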

Using DoEvents(10) is a good idea though.

Lee