Tags: ubuntu-18.04, macos-catalina, samba, repo

Samba shared folder becomes unresponsive when updating a lot of files (using the `repo sync` command)


I have a code base on a Linux (Ubuntu) machine that is shared with other machines using Samba. When accessing the code base from macOS (10.15.5) and running `repo sync`, the command aborts after a few git repos are updated, with the following message:

fatal: Unable to read directory. Resource temporarily unavailable!

I am not sure why this is happening. Nothing works until I remount the Samba shared folder.

The logs do not show any errors. Any leads on what might be causing this?

  • The network is solid: just two devices connected with static IPs
  • I checked the ulimit on the Linux machine where the Samba server runs; it's set to a huge value
  • Reads and writes to shared files work fine, and copying a single large file works without issues as well. Somehow updating a lot of small files (which is what `repo sync` does) triggers the issue
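For anyone wanting to verify the same things, a quick sketch of the commands I used to inspect the limits on the server side (the `smbd` process name is the standard Samba daemon; adjust if yours differs):

```shell
# Per-shell soft limit on open file descriptors
ulimit -n

# System-wide maximum number of open files (Linux)
cat /proc/sys/fs/file-max

# Descriptors currently held by each running Samba daemon process
for pid in $(pidof smbd); do
    echo "smbd pid $pid: $(ls /proc/"$pid"/fd | wc -l) open fds"
done
```

If the descriptor count for `smbd` climbs toward one of these limits while `repo sync` runs, that points at an open-file exhaustion problem rather than a network one.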

Here's the relevant Samba config:

## Samba Version is 4.7.6-Ubuntu
[Global]
min protocol = SMB2
log level = 2
syslog = true
max log size = 1000

; Disabled this for debugging
; vfs objects = catia fruit streams_xattr

; fruit:metadata = stream
; fruit:model = MacSamba

; fruit:posix_rename = yes
; fruit:veto_appledouble = no
; fruit:wipe_intentionally_left_blank_rfork = yes
; fruit:delete_empty_adfiles = yes


[Extension_Project]
    path = /******
    valid users = *****

    guest ok = no
    read only = no
    writable = yes
    browseable = yes

Solution

There are three things that I did:

    1. The logs show no errors, but every time, they ended when the number of open files reached 16384. I updated this limit to 937730 (just an arbitrary number taken from one of the configs)

    2. When I ran `repo sync` again, more git repos were updated, but it would still stop at some point (there are 700+ git repos in total) with the error message "Too many open files". I raised the Linux open-files limit to a large number as well (https://askubuntu.com/questions/1049058/how-to-increase-max-open-files-limit-on-ubuntu-18-04)

    The issue was still not resolved, although more repos would be updated before the command quit on the client side. It turns out `repo` was overloading the server with far too many open files (a six-digit open-file count).

    3. Instead of running `repo sync` to sequentially update all 700+ repos at once, I use a shell script like the one below to update the repos in chunks

    repo sync **SPACE SEPARATED LIST OF 20 PROJECTS**
    sleep 100
    
    repo sync **SPACE SEPARATED LIST OF 20 PROJECTS**
    sleep 100
    

    This is not a real solution, but it works for now. Can someone help find a better one?
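The manual chunking above can be automated. A sketch, assuming `repo list -n` prints one project name per line (check with your repo version) and that 20 projects per batch with a pause in between is enough to let the server release its file handles:

```shell
#!/bin/sh
# Update the repos in batches instead of one big 'repo sync'.
# Assumption: 'repo list -n' prints one project name per line.
# xargs -n 20 invokes the inner shell with at most 20 names,
# which become "$@" inside the -c script ('batch' is just $0).
repo list -n | xargs -n 20 sh -c 'repo sync "$@" && sleep 100' batch
```

Adjust the batch size (20) and the pause (100 seconds) to whatever keeps your server's open-file count below its limit.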