If you change work environments once in a while, you know there is one thing you do quite a lot of when starting out: cloning repositories. Microservices, Git, and infrastructure as code, among other things, have led to a large number of repositories in almost every organization.
The last time I did this it got really tedious, mainly because I wanted to search the whole code base and that did not work with the local GitLab instance. Naturally I turned to the shell to give this a shot, and as it turns out it is a one-liner (line breaks added for your convenience ;)), mostly thanks to python-gitlab and the inherent awesomeness of the shell:
gitlab -o json -f ssh_url_to_repo,namespace,path_with_namespace project list --all \
| sed '/^[[:space:]]*$/d' \
| jq -c 'select( .namespace.kind == "group" )' \
| jq -r '.ssh_url_to_repo,.path_with_namespace' \
| xargs -P 10 -n 2 git clone
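The trailing xargs -n 2 is what pairs each SSH URL with its namespaced path, so every git clone invocation receives both the source and the target directory. A toy stand-in with echo (the URLs and paths below are made up, no GitLab needed) shows the pairing:

```shell
# Feed alternating url/path lines, exactly the shape jq -r emits above;
# xargs -n 2 hands them to the command two at a time.
printf '%s\n' \
  git@example.com:group/app.git group/app \
  git@example.com:group/lib.git group/lib \
| xargs -n 2 echo git clone
# prints:
#   git clone git@example.com:group/app.git group/app
#   git clone git@example.com:group/lib.git group/lib
```

With echo swapped back out for the real git clone, adding -P 10 simply runs up to ten of those clones in parallel.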
By the way, if you work with many repositories I recommend giving myrepos a shot. You register repositories with the tool once and can then update them in bulk. That again is a breeze if your shell is at hand:
find . -name .git -type d | xargs -n 1 dirname | xargs -n 1 mr register
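If you want to see what that pipeline would register before pointing it at your checkout directory, it is easy to dry-run against a scratch tree. This sketch uses a throwaway temp directory with fake repositories and substitutes echo for mr register, so it is entirely side-effect free:

```shell
# Build a scratch tree with two fake repositories (only the .git
# directory matters to the find expression).
tmp=$(mktemp -d)
mkdir -p "$tmp/repo1/.git" "$tmp/repo2/.git"

# Same find | dirname stages as above; echo stands in for mr register
# so we only print what would be registered.
find "$tmp" -name .git -type d \
| xargs -n 1 dirname \
| sort \
| xargs -n 1 echo mr register

rm -rf "$tmp"
```

Drop the echo (and the scratch tree) once the output looks right, and a plain mr update afterwards refreshes everything you registered.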
Bonus tip: if you still only use grep for searching, give ag a go. It is like googling your hard drive and very handy when dealing with a big codebase.