We're talking about file systems, so to get this back on track in terms of things we can actually observe, WTH is a "data tree"? Do you mean "directory"? Directories that "go six or seven deep" are not even moderately unusual, nor is "massive amounts of data storage", whatever that means; and it probably isn't much, considering the small capacities of the SSDs you are likely using.
You have to understand how the underlying filesystem works. It's built on computer science concepts such as B-trees and linked lists. As files are added to and deleted from a directory, its entry table can develop "holes": deleted entries remain in place while new entries are appended to the end. Over time this makes directory entries quite large and reduces directory scanning performance, because every scan has to skip over all those stale entries. It's true that an SSD will make this faster in many cases, but an optimized directory entry (DE) will always be faster than a non-optimized one, regardless of the storage media.
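To make the "holes" argument concrete, here's a toy Python sketch of a directory table where deletion leaves tombstones behind and lookups have to step over them. The `DirectoryTable` class and its methods are invented for illustration; real filesystems (FAT, NTFS, ext4) differ considerably in their on-disk formats, and many reuse free slots rather than always appending.

```python
# Toy model of a directory entry table, illustrating how deleted
# entries leave "holes" that every scan must still skip over.
# Purely illustrative; not any real filesystem's on-disk format.

class DirectoryTable:
    def __init__(self):
        self.slots = []  # each slot: a filename, or None for a deleted "hole"

    def add(self, name):
        # Degenerate case modeled here: always append, never reuse holes,
        # so the table only ever grows.
        self.slots.append(name)

    def delete(self, name):
        # Deletion just marks the slot as unused (a tombstone).
        i = self.slots.index(name)
        self.slots[i] = None

    def lookup(self, name):
        # A linear scan steps over every tombstone on the way to a match.
        steps = 0
        for entry in self.slots:
            steps += 1
            if entry == name:
                return steps
        return None

    def compact(self):
        # What an "optimize" pass would do: drop the holes entirely.
        self.slots = [e for e in self.slots if e is not None]

d = DirectoryTable()
for i in range(1000):
    d.add(f"temp{i}.dat")
for i in range(999):
    d.delete(f"temp{i}.dat")
d.add("report.txt")

print(d.lookup("report.txt"))  # → 1001 slots scanned, mostly holes
d.compact()
print(d.lookup("report.txt"))  # → 2 slots scanned after compaction
```

The scan length collapses from 1001 entries to 2 once the holes are removed, which is the effect being claimed; whether that difference is observable on real hardware is a separate question.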
Still, my point is that defragmentation with a good defragmenter can achieve some benefits. I don't consider them worth the loss in write cycles caused by defragging an SSD, but they are there, in some circumstances. I don't advocate defragging an SSD, but at the same time it's not as ridiculous as you make it out to be.
I understand how file systems work; as a software developer for over 20 years, I've spent a lot of my time choosing data structures such as lists, trees, and graphs from libraries when possible and implementing them from scratch when necessary.

As I replied to Poppa Bear's enthusiastic adoption of your terminology, I wanted "to get this back on track in terms of things we can actually observe". We can't really observe the "data trees" and other OS data structures (not that Poppa Bear understood that's what you were referring to; after all, he talked about "massive amounts of data storage, with data trees that go six or seven deep", and I'm sure he meant files and directories), and everything you've talked about is conjecture based on some knowledge of how the OS works. That knowledge can be the basis for forming hypotheses and designing experiments, but it cannot be used to draw conclusions, which is what Poppa Bear did.

For all we know, the optimizations of OS data structures you mentioned, and I acknowledged as "interesting" but probably hard to measure, amount to microseconds in operations that take milliseconds or longer to complete; IOW, they could very well be negligible. I think this is very likely, given the relative speed of SSDs, the ridiculous speed of modern CPUs with gobs of memory at their disposal, the relatively small amount of data needed to keep track of files and directories compared to the amount of data they contain and what is done with it, and the scarcity of legitimate sources recommending defragmentation of SSDs.