I know this has been asked and discussed earlier, but I couldn't find the right workflow for this problem.
Let's say I'm working on a new project that I want to push to GitHub. After a few commits and pushes that worked well, I continue coding and editing, and at some point I add some big files larger than 100 MB to my project (without realizing that this will cause problems when I try to push in the next step).
So I do:
git add .
and after that I do:
git commit -m 'some commit message'
and finally:
git push
And now I am in trouble, because I get the remote error: Large files detected.
So what are my options here to 1. keep my project changes and my added files alive and 2. exclude the big files from future commits?
I have found the command to delete the last commit (where I added the big files among other things) with git revert …
, but this is not what I want, because it also deletes all the work from my working directory.
Thoughts
This achieves the same effect as Joe A's answer, except it's a lot simpler and, IMHO, safer for someone unfamiliar with the area, for the special case where the last commit was completely bad. You should definitely check his answer out, and upvote it (I did) if you liked mine, because it covers a larger scale of the problem. Think of this as a special case of his; if you have n > 1 bad commits, his answer is the way to go.
Solution
git reset --soft HEAD~
This will undo the last commit but leave everything it contained staged.
Then git rm the files that shouldn't have been committed, and re-commit.
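Put together, a minimal sketch of that (assuming the oversized file is called big_data.bin, a made-up name for illustration) would be:
git reset --soft HEAD~                   # undo the last commit, keep its changes staged
git rm -f big_data.bin                   # drop the file from the index and the working tree (-f because it is staged but not in HEAD)
git commit -m 'some commit message'      # re-commit everything else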
If you still want them to stick around locally, you might use the --cached variant of git rm and add the files to your .gitignore, or possibly look into git lfs.
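A sketch of that variant, again using the hypothetical big_data.bin, might look like:
git rm --cached big_data.bin             # remove from the index only; the file stays in the working tree
echo 'big_data.bin' >> .gitignore        # ignore it so git add . no longer picks it up
git add .gitignore
git commit -m 'some commit message'
If the file really has to be versioned, installing Git LFS and running git lfs track 'big_data.bin' before re-adding it is the usual route instead.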
Now you should be able to push.