Last updated Sept. 19, 2022
We have all ended up with duplicate files after copying or downloading the same file several times. A handful of duplicates can be deleted by hand, but locating and deleting many of them quickly becomes tedious. This Python script does the work for you: all you need to supply is the directory to search and a keyword to look for in file names. No external modules are required; the only import, os, ships with Python by default.
import os

duplicate_files = []
search_term = 'Copy'
filepath = "<Enter your file path here>"

def search():
    # Walk the directory tree and collect every file whose name contains the keyword
    for dirpath, dirs, files in os.walk(filepath):
        for filename in files:
            if search_term in filename:
                print(filename)
                # Store the full path so the file can be deleted from any subdirectory
                duplicate_files.append(os.path.join(dirpath, filename))
    print("Files Found:", len(duplicate_files))

def delete():
    for file in duplicate_files:
        os.remove(file)

search()
delete()
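A safe way to try the search-then-delete pattern is on a throwaway directory of dummy files. This is a minimal sketch; the file names below are made up for illustration:

```python
import os
import tempfile

# Create a temporary directory with some dummy files to search
with tempfile.TemporaryDirectory() as tmp:
    for name in ("report.txt", "report - Copy.txt", "report - Copy (2).txt"):
        open(os.path.join(tmp, name), "w").close()

    # Collect full paths of files whose name contains the keyword
    matches = []
    for dirpath, dirs, files in os.walk(tmp):
        for filename in files:
            if "Copy" in filename:
                matches.append(os.path.join(dirpath, filename))

    print("Files Found:", len(matches))  # two of the three names contain "Copy"
    for path in matches:
        os.remove(path)
```

Because the directory is temporary, nothing of value can be lost while experimenting.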
The keyword to be searched for is set in this line of the script above:
search_term = '<Enter the keyword to be searched for>'
It is worth noting that the keyword to be searched for is case-sensitive.
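If you would rather ignore case, one common approach is to lower-case both the keyword and the file name before comparing. A small sketch, using an illustrative list of file names:

```python
search_term = "copy"
filenames = ["Report - Copy.txt", "notes.txt", "data-COPY.csv"]

# Lower-case both sides so 'Copy', 'COPY', and 'copy' all match
matches = [f for f in filenames if search_term.lower() in f.lower()]
print(matches)  # ['Report - Copy.txt', 'data-COPY.csv']
```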
The script above is quite versatile and can do more than find duplicates: it can locate any files whose names contain a given word and let you manipulate them as you see fit. Feel free to customize it and have fun.
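One such customization is moving matched files into a backup folder instead of deleting them, so nothing is lost if the keyword matched too broadly. A sketch using shutil.move; the folder name duplicates_backup and the file name are assumptions for illustration:

```python
import os
import shutil
import tempfile

with tempfile.TemporaryDirectory() as root:
    backup_dir = os.path.join(root, "duplicates_backup")  # hypothetical destination
    os.makedirs(backup_dir, exist_ok=True)

    # Dummy file standing in for a found duplicate
    dup = os.path.join(root, "photo - Copy.jpg")
    open(dup, "w").close()

    # Move rather than delete, preserving the original file name
    shutil.move(dup, os.path.join(backup_dir, os.path.basename(dup)))
    moved = os.listdir(backup_dir)
    print(moved)  # ['photo - Copy.jpg']
```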