Merge branch 'main' of github.com:kevinveenbirkenbach/duplicate-file-handler

This commit is contained in:
Kevin Veen-Birkenbach 2023-06-28 12:01:34 +02:00
commit c8e802548d


@@ -2,6 +2,8 @@
This repository contains two bash scripts for handling duplicate files in a directory and its subdirectories.
The scripts may need to be modified depending on the specific requirements of your system or the specific use case. They currently operate by comparing the MD5 hash of files to find duplicates, which is a common but not foolproof method.
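The MD5-based approach described above can be sketched in a few lines of bash. This is a minimal illustration, not the repository's actual script: the function name `list_md5_duplicates` is hypothetical, and it relies on GNU coreutils (`md5sum`, `uniq -w32 --all-repeated`).

```shell
#!/usr/bin/env bash
# Hedged sketch of the MD5-comparison idea (hypothetical helper, not the
# repository's list_duplicates.sh). Files sharing an MD5 hash are *likely*
# duplicates, but hash collisions are possible, as noted above.
list_md5_duplicates() {
  # Hash every regular file under $1, sort so equal hashes are adjacent,
  # then print only groups whose first 32 characters (the hash) repeat.
  find "$1" -type f -exec md5sum {} + \
    | sort \
    | uniq -w32 --all-repeated=separate
}
```

Usage: `list_md5_duplicates /path/to/dir` prints each group of same-hash files separated by blank lines; unique files produce no output.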
## Author
**Kevin Veen-Birkenbach**
@@ -10,6 +12,9 @@ This repository contains two bash scripts for handling duplicate files in a dire
This repository was created with the help of [OpenAI's ChatGPT](https://openai.com/research/chatgpt) (Link to the conversation).
## Setup
These scripts help you manage duplicate files in your directories. Before running them, make them executable with `chmod +x list_duplicates.sh delete_duplicates.sh`.
## Usage
### 1. List Duplicate Files
@@ -31,9 +36,3 @@ This repository was created with the help of [OpenAI's ChatGPT](https://openai.c
## License
This project is licensed under the terms of the [GNU Affero General Public License v3.0](https://www.gnu.org/licenses/agpl-3.0.de.html).
These scripts will help you manage duplicate files in your directories. Please make sure to adjust permissions on the scripts to be executable with `chmod +x list_duplicates.sh delete_duplicates.sh` before running.
The scripts may need to be modified depending on the specific requirements of your system or the specific use case. They currently operate by comparing the MD5 hash of files to find duplicates, which is a common but not foolproof method.
Please be aware that these scripts are provided as is, without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and non-infringement. In no event shall the authors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort or otherwise, arising from, out of or in connection with the scripts or the use or other dealings in the scripts.