Microsoft's Mark Russinovich hopes so

Aug 7, 2007 10:22 GMT

The new Windows Explorer in Windows Vista is not all it was cracked up to be, at least not when it comes to the built-in compression mechanism. This is nothing new in Vista: the default compression capabilities of Windows XP have survived into Microsoft's latest operating system. However, the feature's lack of evolution leaves something to be desired, as revealed by Mark Russinovich, who became a Microsoft Technical Fellow following the Redmond company's acquisition of Sysinternals. Russinovich now hopes Microsoft will deliver an improved compression engine in Windows 7, skipping Windows Vista Service Pack 1 altogether.

And it all started with an error message generated when using Explorer's Send To Compressed (zipped) Folder feature. The error message read "File not found or no read permission." Although the message itself made little sense, Russinovich traced the cause to a sharing violation: the antivirus from Computer Associates International was apparently keeping the files open for scanning while Windows Explorer was trying to access them. While the scenario did indeed sniff out a bug in the security solution, it did not explain Windows Explorer's compression shortcomings.
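A sharing violation of this kind is easy to reproduce. The following is a minimal sketch in Python (Windows only, via ctypes; the file name is hypothetical): a first handle opens a file with a share mode of zero, roughly the way a scanner holding a file exclusively might, and a second open attempt then fails with ERROR_SHARING_VIOLATION, the same condition Process Monitor reported.

import ctypes
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.CreateFileW.restype = wintypes.HANDLE
kernel32.CreateFileW.argtypes = (
    wintypes.LPCWSTR, wintypes.DWORD, wintypes.DWORD, wintypes.LPVOID,
    wintypes.DWORD, wintypes.DWORD, wintypes.HANDLE)
kernel32.CloseHandle.argtypes = (wintypes.HANDLE,)

GENERIC_READ = 0x80000000
CREATE_ALWAYS, OPEN_EXISTING = 2, 3
FILE_ATTRIBUTE_NORMAL = 0x80
INVALID_HANDLE = wintypes.HANDLE(-1).value
ERROR_SHARING_VIOLATION = 32

path = "locked_by_scanner.txt"  # hypothetical test file

# First open with dwShareMode = 0: no other open of this file is allowed
# while the handle stays open.
h1 = kernel32.CreateFileW(path, GENERIC_READ, 0, None,
                          CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, None)

# A second open of the same file now fails; GetLastError() returns 32,
# ERROR_SHARING_VIOLATION.
h2 = kernel32.CreateFileW(path, GENERIC_READ, 0, None,
                          OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, None)
if h2 == INVALID_HANDLE and ctypes.get_last_error() == ERROR_SHARING_VIOLATION:
    print("second open failed with a sharing violation, as expected")
else:
    kernel32.CloseHandle(h2)

kernel32.CloseHandle(h1)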

"Now I turned my attention back to the inefficiencies of Explorer's compression feature. I captured a Process Monitor trace of the compression of a single file and counted the associated operations. Just for this simple case, Explorer opened the target ZIP file 14 times, 12 of those before it had actually created the file and therefore with NOT FOUND results, and performed directory look ups of the target 19 times. It was also redundant with the source file, opening it 28 times and querying the file's basic properties 17 times," Russinovich explained.

He thus confirmed that Windows Explorer was also at fault, or rather the compression mechanism built into it. "Zipfldr.dll, the Explorer file compression DLL, was in most of the stack traces, meaning that the compression engine itself was ultimately responsible for the waste. Further, the number of repetitious operations explodes when you compress multiple files. There are clearly easy ways to improve the algorithm, so hopefully we'll see a more efficient compression engine in Windows 7," Russinovich added.