Split large wordlists into smaller files #1
It's not always efficient to wait through 15 million attempts against a single account, so I wrote a simple script to split a large wordlist into smaller files.

Syntax: ./splitfile.py <filename> <lines per file>

Example: ./splitfile.py rockyou.txt 50000
This will create as many files as needed, each containing no more than 50,000 words, named output-001, output-002, and so on (zero-padded so they sort correctly).

Code:
#!/usr/bin/env python3

import sys
import math

def make_filename(number, chunks):
    # Zero-pad the file number so the output files sort correctly,
    # e.g. output-001 ... output-287 when there are 287 chunks.
    width = len(str(chunks))
    return 'output-%0*d' % (width, number)

if len(sys.argv) != 3:
    sys.exit('Syntax error: ./splitfile.py <filename> <lines per file>')

filename = sys.argv[1]
if not sys.argv[2].isdigit() or int(sys.argv[2]) < 1:
    sys.exit('Chunk size must be a positive number')
chunk_size = int(sys.argv[2])

try:
    with open(filename) as f:
        lines = f.readlines()
except IOError:
    sys.exit('IOError: Unable to read file')

total = len(lines)
# Round up so a final partial chunk still gets its own file.
chunks = math.ceil(total / chunk_size)

for j in range(chunks):
    print('Writing file #%d of %d' % (j + 1, chunks))
    with open(make_filename(j + 1, chunks), 'w') as fout:
        fout.writelines(lines[j * chunk_size:(j + 1) * chunk_size])
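
Note: readlines() pulls the whole wordlist into memory, which can be slow for something the size of rockyou.txt. If that bothers you, here's a minimal streaming sketch of the same idea (same output-NN naming as above; argument checking is left out for brevity). It makes two passes over the file, one to count lines and one to write, so the padding width is known up front:

Code:
#!/usr/bin/env python3

import sys
import math

filename, chunk_size = sys.argv[1], int(sys.argv[2])

# First pass: count lines so we know how many chunks (and how much
# zero-padding) we need, without holding the file in memory.
with open(filename) as f:
    total = sum(1 for _ in f)
chunks = math.ceil(total / chunk_size)
width = len(str(chunks))

# Second pass: stream lines out, rolling over to a new output file
# every chunk_size lines.
fout = None
with open(filename) as f:
    for i, line in enumerate(f):
        if i % chunk_size == 0:
            if fout:
                fout.close()
            j = i // chunk_size + 1
            print('Writing file #%d of %d' % (j, chunks))
            fout = open('output-%0*d' % (width, j), 'w')
        fout.write(line)
if fout:
    fout.close()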

RE: Split large wordlists into smaller files #2
This is just what I was looking for! Thanks!!!
