FolderItem.Delete Weird result

Greetings,

I'm trying to debug an app that has issues in the part that uses FolderItem.

The code workflow is the following:

  1. It scans the pictures in a folder;
  2. It lists them in a list;
  3. The user selects items in the list;
  4. The code loops through the files and checks the selected files;
  5. It gets FolderItem.Length and stores it in a property;
  6. It uses FolderItem.CopyFileTo to copy the file into the new folder;
  7. It gets the new destination file location and saves it as a FolderItem;
  8. It gets FolderItem.Length for the copied file;
  9. If the original file's Length equals the copied file's Length, it deletes the original file.
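
For reference, steps 5–9 above could be sketched in Xojo roughly like this (hedged: SourceFile and DestFolder are placeholder FolderItems, not the poster's actual code):

[code]' Sketch of the copy-verify-delete step; SourceFile and DestFolder are assumed
Dim originalLength As Int64 = SourceFile.Length

SourceFile.CopyFileTo(DestFolder)
If SourceFile.LastErrorCode <> 0 Then
  ' Copy failed: keep the original and report the error
  Return
End If

Dim copied As FolderItem = DestFolder.Child(SourceFile.Name)
If copied <> Nil And copied.Exists And copied.Length = originalLength Then
  SourceFile.Delete
  If SourceFile.LastErrorCode <> 0 Then
    ' e.g. error 104: surface it instead of silently ignoring it
  End If
End If
[/code]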

So far all looks perfect, but:

If I copy the files one by one, sometimes it works and sometimes it doesn't. If I copy in bulk, meaning all the files, it always fails on the last file, where I get a 104 error even though nobody is touching that file.

Something weird as well: at the code level OriginalFile.Length = MovedFile.Length, so the logic lets me delete the file. But when I look into the folders after the file was copied, renamed, and deleted, I see in Finder that the file in the original folder had 2.7 MB while the moved file in the new folder has 579 KB. Any idea what happens on the way?

So far I have not managed to find out why the code fails on the last file, nor why the file size changes.

I will try to make a separate project and see whether the same problem happens there, or maybe I'm missing something.

Thanks in advance.

I'm not sure I would rely on FolderItem.Length, since that value could change if the blocking factors (allocation block sizes) of the source and destination are different, or for any number of other reasons.

Is there a reason you are not using proper error checking (LastErrorCode) to determine whether it succeeded?
The LangRef for CopyFileTo has a nice example.
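
Something along these lines (a sketch; f and destFolder stand in for your actual FolderItems):

[code]f.CopyFileTo(destFolder)
If f.LastErrorCode <> 0 Then
  MsgBox("Copy failed, error " + Str(f.LastErrorCode))
Else
  ' Safe to act on the copy
End If
[/code]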

Well, for the same reason I cannot rely on FolderItem.MoveFileTo: it might lose the file if an error occurs, so I prefer to copy the file first, then validate, then delete.

As for Length: why is it there if we cannot rely on it? So far, strangely, I get the same size every time at the copy step, but once the file is renamed, the size of the new file drops a lot. Why, I have no idea; I'm doing more tests now.

Same drive, same file system, same main directory, only two separate folders (original and copy), so I have no idea why there is that size difference.

As for LastErrorCode, I am using that; that's how I see the 104 error on the delete of the last picture. No idea why.

I tried checking whether the file is locked, unlocking it, and deleting it again, but no luck.

Anyway, thanks. I'll try a separate project to see whether this happens there as well and, if possible, how to avoid it. If I still cannot rely on FolderItem, then I will go with a shell and that's it.

Exactly my point

  1. Copy File
  2. Error?
  3. No… delete the original

How does that potentially lose the file?

And I think MoveFileTo actually does all that, as in: if there is an error, the original stays put.

Personally, I think you are over-thinking what is really a very simple process.

Well, the reason is different. A local copy works fine most of the time, but when doing a remote copy, like to a local server or to a remote server over VPN, there can be many causes of failure (permission errors, disconnections, and so on), so I need to make sure the file arrives intact and that I don't lose it or corrupt the data.

I'll have to dig more into this, but I guess that if the connection is cut, the file gets partially transferred, and I will have to either overwrite the file or delete it and start the transfer again, so I need some kind of control on this side.

Any idea on this?

Thanks again.

A file hash, then? If it matches, the file is validated; if not, delete the copy and retry.
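
A sketch of what that could look like with the built-in Crypto module (hedged: reading the whole file at once is fine for small files like these; for big files you would hash in chunks):

[code]Function FileHash(f As FolderItem) As String
  Dim bs As BinaryStream = BinaryStream.Open(f, False)
  Dim data As String = bs.Read(bs.Length)
  bs.Close
  Return EncodeHex(Crypto.SHA256(data))
End Function

' After the copy:
If FileHash(SourceFile) = FileHash(CopiedFile) Then
  SourceFile.Delete
Else
  CopiedFile.Delete ' bad copy: remove it and retry the transfer
End If
[/code]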

Any idea for copy progress, in terms of time and file size? In this case I have small files, 10 MB max, but with bigger files (say > 100 MB over a 10 Mbps VPN connection) transferring a full folder will take a while, so I need to inform the user that the app is still working and actually transferring the data.

Thanks.

Yeah… check the error code; that is what it is there for.

As for a progress bar… now you leave the realm of built-in functions and have to venture into other areas…
You have to move the file in chunks (packets, what have you), and the possibility for errors becomes much greater.
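
A rough sketch of a chunked copy with progress (hedged: 64 KB chunks, and UpdateProgress is a placeholder for whatever UI update you use; in practice this belongs in a Thread so the window stays responsive):

[code]Dim inStream As BinaryStream = BinaryStream.Open(SourceFile, False)
Dim outStream As BinaryStream = BinaryStream.Create(DestFile, True)
Dim total As Int64 = inStream.Length
Dim done As Int64

While Not inStream.EOF
  Dim chunk As String = inStream.Read(65536) ' copy 64 KB at a time
  outStream.Write(chunk)
  done = done + chunk.LenB
  UpdateProgress(done, total) ' placeholder: refresh a ProgressBar here
Wend

outStream.Close
inStream.Close
[/code]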

Why not look into FTP functions instead? They are very error tolerant.

Well, because of the way the app works and the way the users use the shared storage: they use AFP to mount the shared drives, so we have to deal with that and use only that.

Thanks again. I will dig more and see what happens. So far I still have no result on why the last file gets a 104 error, nor on why the new files are a lot smaller than the originals.

Why do you think the value returned by FolderItem.Length is the hard disk logical length?
(On OS X / macOS; I do not know about other OSes, but I suppose it is the same.)

Hard disk logical length: this is the size of the block read/written by the hard disk. This value is a multiple of 256 and depends on the size of the hard disk. If this value is 4096 bytes (on a small hard disk), your text file will take one block as long as the number of bytes in the text stays below 4096; add one byte and your file takes TWO blocks (for a single byte more…).

Here you give two values (579 KB and 2.7 MB), but you do not give us the context. In other words, these values have no meaning until you give us the size of the hard disk where they reside. Worse, you probably gave us the files' logical size on disk, not their real size in bytes (even if that number is 541 244 321 bytes).

BTW: to find out whether I am right or wrong, just select a RAW TEXT file, get the info about it (select it, Cmd-I), and read the size: you will see two values. The first one is in bytes; the second is between parentheses and uses KB, MB, GB, etc. (2.7 MB, for example).

Aurelian:
This may be a stupid list of questions…
Do you check whether the file is readable/writable?
Is the other volume a Linux volume?
Do you close the file write reference(s) once the file copy is over (if needed)?
Did you flush at write time?

Are you sure the image files you copy have only a data fork (in the old days, files could have a resource fork)?
Are you sure each “image file” you copy really is a file (not a bundle, nor a folder, nor… whatever)?

Are you sure the number of files and the number of times you go through the loop are the same?
(This is the well-known 0-based / 1-based problem.)

Last idea: why don't you copy your files on the same hard disk, using known source and target folders? Then the hard disk logical blocks will have the same size, and there cannot be permission problems (if your data are stored in the Documents folder, for example). In that case you cannot have two sizes for the same file (or you have virus troubles, or your hard disk is bad).

I know this is a lot of questions, but when you do not know what happens… one has to ask questions (and get answers).

BTW: can you make a bulk copy of a bunch of TXT files (raw files; the files can have any size, but a bunch of bytes will be enough)? RAW TEXT files only, no RTF or RTFD files.
The idea here is to be sure about the copy of known files.

Sorry for this long answer.

Ask if I was not crystal clear.

[quote=337616:@Emile Schwarz]Why do you think the value returned by FolderItem.Length is the hard disk logical length? […][/quote]
Hi Emile,

Well, the app is more like a photo manager in some parts. In the local version it copies the photos within the Documents folder: you have Documents -> DataFolder -> Import Photos, and they go to Documents -> DataFolder -> Client -> ClientID -> Photos.

In the server version I have two options: either a Mac server, where the data gets stored on a shared drive on the Mac, or a shared drive on a Linux machine with Avahi and the packages that make it behave like a Time Capsule.

All of those, client and server, can be in the same place or connected over a VPN, so imagine the speeds, the headache, and so on.

can you give me more details about [quote]Do you close the file write reference(s) once the file copy is over (if needed) ?
Did you flush at Write time ?
[/quote]

Thanks.

[quote=337619:@Aurelian Negrea]can you give me more details about

Do you close the file write reference(s) once the file copy is over (if needed) ?
Did you flush at Write time ?[/quote]

This depends on the way you copy the files. For example, you can use BinaryStream (check the Flush and Close entries for BinaryStream in the Language Reference).
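
Concretely, that advice means something like this (a sketch; DestFile and data are assumed to come from your own copy code):

[code]Dim out As BinaryStream = BinaryStream.Create(DestFile, True)
out.Write(data)
out.Flush ' push any buffered bytes out to disk
out.Close ' release the write reference to the file
' Only after Close is DestFile.Length meaningful
[/code]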

When a complex (far too complex) feature has troubles, going back to basics is a good way to find where the problem(s) are.

I had trouble with a project on Linux recently. Using IsWriteable / IsReadable removed those troubles. Check FolderItem.

Well, Emile, here is the fun part: it seems I found my shrinking-file issue. Even though the resolution is the same, according to the Finder file info the size is totally different.

[code]Dim PictureThumb As String   ' thumbnail file name (set elsewhere in the real code)
Dim pictureBuffer As Picture
Dim MovedPhoto As FolderItem ' the copied photo (set elsewhere in the real code)
Dim ThumbnailAI As FolderItem = DestFolder.Child(PictureThumb)

' Decodes the image, then re-encodes it as JPEG with the default quality
pictureBuffer = Picture.Open(MovedPhoto)
pictureBuffer.Save(ThumbnailAI, Picture.SaveAsJPEG)
[/code]

Here is the code.

And here are the Finder details.

Any ideas?

It seems that all this reduces the file quality by default, and with it the size of the file.

The color profile is different; the file size is different.

Copy the file from disk as a file; do not open it and then save it. If for some reason you want to view it, that is not a problem, but copy the file from disk.

You did not change the Xojo quality value, which can be different from the one used to create the file you just loaded.

http://documentation.xojo.com/index.php/Picture
search:
JPEG Quality Constants

Also: do not use a destructively compressed (lossy) file type like JPEG:

The resulting file takes less space on disk, but each time you open/save it you lose quality (just like making a Xerox copy of a Xerox copy of a Xerox copy…).

Also: if the running machine has an active virus, it can insert itself into the open/write process (it may…).

The Language Reference has an example of how to make a file copy here:
http://documentation.xojo.com/index.php/FolderItem

Watch the “The CopyFileOrFolder method is as follows:” paragraph.

Personally, I would use FolderItem.CopyFileTo (here: http://documentation.xojo.com/index.php/FolderItem.CopyFileTo).
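
To make that concrete, two hedged variants based on the names in your snippet: either copy the bytes untouched, or, if you really must re-save through a Picture, pass an explicit quality constant instead of the default:

[code]' Variant 1: byte-for-byte copy, no re-encoding, no quality loss
MovedPhoto.CopyFileTo(DestFolder)

' Variant 2: re-encode, but pin the JPEG quality explicitly
pictureBuffer = Picture.Open(MovedPhoto)
pictureBuffer.Save(ThumbnailAI, Picture.SaveAsJPEG, Picture.QualityMax)
[/code]

Note that Variant 2 is still lossy; only Variant 1 preserves the original bytes (and the file size).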

Is it clear now ?

Every time you save a JPEG image you are causing damage to the image. Because JPEG is a lossy format (it throws things away to get smaller files), I would not advise this as a file copy function.

If CopyFileTo doesn't cut it for you, you can use “ditto” via a Shell class, or alternatively create two binary streams and read from one file, write to the other.
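
The “ditto” route is nearly a one-liner via Shell (a sketch; macOS only, and src/dst are assumed FolderItems):

[code]Dim sh As New Shell
sh.Execute("ditto " + src.ShellPath + " " + dst.ShellPath)
If sh.ErrorCode <> 0 Then
  MsgBox("ditto failed: " + sh.Result)
End If
[/code]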

Sam:

The problem, as usual, is that we do not really know what the OP is doing. OPs (me included) never tell the whole story :wink: