Tags: django, multithreading, python-3.x, scapy, pcap

Process Several Pcap Files Simultaneously - Django


In essence, the following function, called by a user of the Django application I am developing, uses the Scapy library to process 80-odd fairly large pcaps in order to parse out their destination IP addresses.

I was wondering whether it would be possible to process several pcaps simultaneously, as the CPU is not being utilised to its full capacity, ideally using multi-threading.

from scapy.all import IP, rdpcap
from django.shortcuts import render

from .models import Pcaps, PcapsIps  # model import path assumed


def analyseall(request):
    allpcaps = Pcaps.objects.all()
    for individualpcap in allpcaps:
        strfilename = str(individualpcap.filename)
        print(strfilename)
        pcapuuid = individualpcap.uuid
        print(pcapuuid)
        packets = rdpcap(strfilename)
        print("hokay")
        for packet in packets:
            if packet.haslayer(IP):
                # print(packet[IP].src)
                # print(packet[IP].dst)
                dstofpacket = packet[IP].dst
                PcapsIps.objects.update_or_create(ip=dstofpacket, uuid=individualpcap)

    return render(request, 'about.html', {"list": list})

Solution

  • You can use the above answer (multiprocessing) and also improve Scapy's reading speed by using the PcapReader generator rather than rdpcap; a combined sketch follows the snippet below.

    from scapy.all import PcapReader  # streams packets instead of loading the whole file

    with PcapReader(filename) as fdesc:
        for pkt in fdesc:
            ...  # actions on the pkt
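
  • A minimal sketch combining the two ideas, assuming the Pcaps and PcapsIps models from the question; the yourapp.models import path and the destination_ips helper are illustrative, not from the original post. Heavy parsing like this is usually better offloaded to a background task than run inside a request.

    from multiprocessing import Pool

    from django.shortcuts import render
    from scapy.all import IP, PcapReader

    from yourapp.models import Pcaps, PcapsIps  # assumed import path; adjust to your app


    def destination_ips(filename):
        """Stream one pcap with PcapReader and return its destination IPs."""
        dsts = set()
        with PcapReader(filename) as fdesc:
            for pkt in fdesc:
                if pkt.haslayer(IP):
                    dsts.add(pkt[IP].dst)
        return filename, dsts


    def analyseall(request):
        # Map filename -> model instance so worker results can be matched back to a row.
        pcaps = {str(p.filename): p for p in Pcaps.objects.all()}

        # CPU-bound Scapy parsing runs in worker processes; ORM writes stay in the
        # request process so Django database connections are not shared across forks.
        with Pool() as pool:
            for filename, dsts in pool.imap_unordered(destination_ips, pcaps):
                individualpcap = pcaps[filename]
                for dstofpacket in dsts:
                    PcapsIps.objects.update_or_create(ip=dstofpacket, uuid=individualpcap)

        return render(request, 'about.html', {"list": list})  # context kept as in the question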