June 2017 · Intermediate to advanced · 532 pages · 12h 59m · English
There are a lot of tools that compress data in various ways. The most famous examples of archive formats are ZIP and RAR. Such tools try to reduce the size of files by reducing their internal redundancy.
Before compressing files into archives, a much simpler way to reduce disk usage is to delete duplicate files. In this recipe, we will implement a little tool that crawls a directory recursively. While crawling, it looks for files that have identical content. If it finds such files, it removes all duplicates but one. Each removed file is replaced with a symbolic link that points to the now-unique file. This saves space without any ...