rupeshforu3
New Member
Messages: 13
Hi, I am Rupesh from India. I have examined a website that contains about 50,000 MP3 files, of which I want 11,000. I have already downloaded 8,000 of them, so I now want to download the remaining roughly 3,000 files from the same website and discard the rest.
I downloaded the files using an offline browser called Extreme Picture Finder. That application has an option called "skip if the destination file exists", and I plan to re-download the files with that option selected. The application also has options for scanning and spidering the website, all of which I have understood.
Previously, after downloading files with the offline browser, I copied them to another directory; the original directory structure was lost, and the files now sit scattered across other directories.
The 11,000 files I want total about 135 GB, of which I have already downloaded 93 GB. If I could obtain the website's directory structure together with the filenames only, without any file data, I could recreate the structure locally, keeping the file names and directory names identical to those on the website.
At present I have Windows 10 installed on my laptop. In PowerShell ISE (a command-line environment for Windows), issuing the command ls -r > filenames.txt gives me a list of filenames along with their directory names. Is there any command or tool that can obtain just the filenames and directory names of a directory on a website and store the output in a text file?
So please suggest a way to obtain the list of directory names, along with the filenames contained in those directories, and store that output in a text file. If possible, please also suggest how to recreate the website's directory structure locally, with the same filenames but with no content in the files.
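What I mean by the second request could look like this Python sketch: it reads a text listing of relative paths (such as one produced from the website) and recreates the same directory tree, creating each file as an empty zero-byte placeholder. The listing filename and destination folder are just example names:

```python
import os

def recreate_tree(listing_file, dest_root):
    # Read one relative path per line and rebuild the directory tree
    # under dest_root, creating each file empty (zero bytes) instead
    # of downloading its data.
    with open(listing_file, encoding="utf-8") as f:
        for line in f:
            rel = line.strip()
            if not rel:
                continue
            path = os.path.join(dest_root, rel)
            os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
            open(path, "a").close()  # create the file with no content

# Example usage (hypothetical names):
# recreate_tree("filenames.txt", r"C:\mirror")
```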
Regards,
Rupesh.