When running in the IDE (or built app for that matter) my app can crash with this error
[quote]My Application.debug(2210,0x112e235c0) malloc: *** error for object 0x10e8e8000: pointer being freed was not allocated
[/quote]
Any ideas on what could cause this? As it is a hard crash, I have no idea where in the code this is happening.
I’m not using declares, but I am using a MemoryBlock and writing to it with a Ptr, extending the MemoryBlock’s size when needed… It only happens when I write a lot of data.
Interesting … What I am doing is stress testing some code to do binary serialization of database recordsets from an SQLDatabase…
The SELECT statement is on a view with 36 fields that are a mix of Strings, Doubles and Integers, and the average binary size of a record is only 373 bytes. The view has almost half a million records, but I use a LIMIT clause in the SELECT…
I can process 70,000 records without a problem, but it crashes somewhere above that, and NOT at the same record every time… from the tests I’ve done, that can vary by almost 1,000 records. It seems to depend on how much else I have open… so something to do with RAM… But I have 40GB of RAM and Activity Monitor does not show any memory pressure…
As I can process at least 70,000 records without a problem and I am not running out of memory, is this more likely to be a framework issue, or an issue in my code? Given how many records I can process (and I can process more if I compress groups of them before I put them into the MemoryBlock as I go along – north of 170,000), it seems to me it’s likely a framework issue…
You may be right, but shouldn’t the framework throw an exception if I try to increase the MemoryBlock size beyond what it can handle for whatever reason, so I can deal with it gracefully?
The problem with using Feedback for this is that I don’t know if I can create a simple app to replicate it, and I can’t share this database.
Right now I’m not sure why this is happening, or if I can do anything about it…
I have my code written so that the maximum size the MemoryBlock can grow to can be set as a preference… But without knowing why it’s happening, I don’t know if that limit would be very different on different machines and/or OSes or OS versions! Never mind coding to avoid it!
Try nil’ing your old pointers before you let them go out of scope, and nil any that refer to the old MemoryBlock before the resize, as they will be useless afterward anyway.
I got rid of the crash by reassigning the MemoryBlock to the pointer after the resize…
BTW, with the default settings that happened after about 22,000 records… which is why I did not catch it earlier – I was testing with only 20,000 records!
The solution had nothing to do with nilling the pointer per se, but with reassigning the MemoryBlock to the pointer after a resize, as without that the pointer may now be pointing at the wrong location in memory!
[code]Dim P As Ptr, MB As New MemoryBlock(someSize)
P = MB
// Some time later you need to make the MemoryBlock bigger
MB.Size = SomeBiggerSize // this could move the whole block to a different memory location
P = MB // make sure the pointer is pointing at the right memory location!
[/code]
If you don’t do that you could wind up with the crash I got…
If you are using a BinaryStream, as I did in my original version, that gets taken care of under the hood… But if you are using a Ptr (for speed), you need to take care of it yourself.
I have not had to think at that level for a LONG time!
Maybe it’s an option to use a BinaryStream to write the data to the MemoryBlock. You don’t have to deal with pointers and the BinaryStream resizes the MB automatically. Even writing strings is much simpler.
I personally often initialize the MB with, for example, 4 MB, and after all the data is written I trim it down to its effective size using BinaryStream.BytePosition.
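That pattern might look something like this (a minimal sketch – the 4 MB starting size and the someInteger/someDouble/someString values are just illustrative placeholders):

[code]Dim MB As New MemoryBlock(4 * 1024 * 1024) // pre-size to 4 MB
Dim BS As New BinaryStream(MB)

// Write the serialized fields; the BinaryStream grows the
// MemoryBlock for you as needed, so no Ptr bookkeeping.
BS.WriteInt32(someInteger)
BS.WriteDouble(someDouble)
BS.WriteString(someString)

// Trim the block down to the bytes actually written
MB.Size = BS.BytePosition
BS.Close
[/code]

Since you never hold a raw Ptr into the block, there is nothing to invalidate when the resize moves it.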