In the past few years I have been working more as an architect than a developer, and I mostly work with teams from the other side of the sea. So usually, by the time I wake up and get to work, there are a bunch of updates in the project. And if the project is big enough, it can potentially involve dozens of repos - microservices for the win.
Now I do have notifications configured to my liking in the repos I'm interested in, but since I am still trying to be hands-on, I like to spend a few minutes running the code locally. (And in some cases I need to support the business in real time, in which case a running local version is really handy.)
TLDR: I need to keep a lot of repos updated, and I have found myself re-inventing the same 10 lines of bash script over and over again.
- Searches for .git folders under the "project structure" - hence the min/maxdepth limitation. If you want to search through all folders, you can get rid of them; or if you're using a flat structure, set the depth to 2, etc.
- Saves changes - if any - to the stash, and switches to the master branch - if not already there.
- Pulls the branch + prunes the remotely deleted ones - merged PRs for the win
- Then restores the earlier state (switches back to the original branch + pops the stash if needed)
- Now be careful here: if you get a merge conflict, you'll want to clean it up before issuing a pull in that repo again
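The 10 lines in question look something like the sketch below. Treat the specifics as assumptions: `update_repos` is a made-up name, the default root folder and the `-mindepth`/`-maxdepth` values need adjusting to your own project structure, and the default branch is hard-coded to master as described above.

```shell
#!/usr/bin/env bash
# update_repos <root> - walks the tree and updates every git repo it finds.
# The root default and the depth limits are assumptions; tune them to your
# own folder layout (for a flat structure, -maxdepth 2 is enough).
update_repos() {
  root="${1:-$HOME/projects}"
  find "$root" -mindepth 2 -maxdepth 3 -type d -name .git | while read -r gitdir; do
    repo="$(dirname "$gitdir")"
    echo "=== $repo ==="
    (
      cd "$repo" || exit 1

      # Remember the current branch, stash local changes if there are any
      branch="$(git rev-parse --abbrev-ref HEAD)"
      stashed=0
      if ! git diff --quiet || ! git diff --cached --quiet; then
        git stash push -q -m "auto-update" && stashed=1
      fi

      # Hop over to master if we are not already there
      if [ "$branch" != "master" ]; then git checkout -q master; fi

      # Pull the latest, pruning branches deleted on the remote (merged PRs)
      git pull -q --prune

      # Restore the earlier state: original branch back, pop the stash
      if [ "$branch" != "master" ]; then git checkout -q "$branch"; fi
      if [ "$stashed" -eq 1 ]; then git stash pop -q; fi
    )
  done
}
```

Run it as `update_repos ~/projects` (or whatever your root is). And as noted above: if a `stash pop` leaves a merge conflict behind, resolve it in that repo before the next run.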