
How to speed up reading and writing binary file operation in python?


I have a binary file (400 MB) that I first want to read and then write to another file in Python. Currently what I'm doing is very basic:

file = request.FILES['file']  # a Django UploadedFile; a plain path string has no .chunks()
with open('temp', 'wb+') as dest:
    for chunk in file.chunks():
        dest.write(chunk)

This code is part of my Django app that I want to speed up. Is there a better way to speed up this operation?
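Since the source exposes a file-like `read()` interface (Django's `UploadedFile` does), one possible speedup is to stream it with `shutil.copyfileobj` and a larger buffer, which cuts the per-chunk Python loop overhead of the default 64 KB `chunks()` iteration. A minimal sketch; `save_upload` and the 1 MB buffer size are my own choices, not from the original post:

```python
import shutil

def save_upload(src, dest_path, chunk_size=1024 * 1024):
    """Stream a file-like object to disk in 1 MB chunks."""
    with open(dest_path, 'wb') as dest:
        shutil.copyfileobj(src, dest, chunk_size)
```

Whether this helps in practice depends on the disk; for a 400 MB file the copy is usually I/O-bound, so the chunk size matters more than the loop construct.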

Update: Alright, to make things a bit clearer: what I'm trying to do is load an in-memory video file (binary data) into OpenCV using cv2.VideoCapture(filename), which I think is not possible as of now. So I have to read the file from memory and write it to disk so that the OpenCV operation can be performed. Basically I'm trying to get the video duration for validation purposes.

    import cv2
    cap = cv2.VideoCapture(dest.name)
    fps = cap.get(cv2.CAP_PROP_FPS)  
    frame_count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    duration = frame_count/fps

Solution

  • Got it after suffering for hours, but if you are a newbie programmer I guess you can't escape that. I was trying to get the path of the uploaded file in request.FILES in my Django view so that I could pass it to OpenCV's VideoCapture. This is how I did it:

    file = request.FILES['filename'].file  # underlying temporary file on disk
    cap = cv2.VideoCapture(file.name)      # .name is the temp file's filesystem path
    fps = cap.get(cv2.CAP_PROP_FPS)
    frame_count = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    duration = frame_count / fps
    

    Update June/2020: This method only works if your file size is more than 2.5 MB, because by default Django only spills uploads larger than that to a temporary file on disk; smaller uploads are kept in memory and have no filesystem path. For files under 2.5 MB you need to add this to your settings.py file:

    FILE_UPLOAD_HANDLERS = [
        'django.core.files.uploadhandler.TemporaryFileUploadHandler',
    ]
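An alternative to switching the upload handlers globally is to fall back to a named temporary file only when the upload stayed in memory. A hedged sketch (`path_for_opencv` is my own helper name; `temporary_file_path()` is the method Django's on-disk `TemporaryUploadedFile` provides):

```python
import tempfile

def path_for_opencv(uploaded, suffix='.mp4'):
    """Return a filesystem path for an uploaded file.

    Large uploads already live on disk; small in-memory uploads are
    written to a named temporary file (the caller deletes it afterwards).
    """
    if hasattr(uploaded, 'temporary_file_path'):
        return uploaded.temporary_file_path()
    tmp = tempfile.NamedTemporaryFile(suffix=suffix, delete=False)
    for chunk in uploaded.chunks():
        tmp.write(chunk)
    tmp.close()
    return tmp.name
```

This keeps the default handlers (and their memory behavior for small files) while still giving OpenCV a real path in both cases.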