If you’re working with a dataset that is slow to download, you might want to save it to your Google Drive so that you can reload it faster when you come back to your Colab notebook. Here’s what worked for me:
import os
from google.colab import drive
# The first time you run this, it will prompt you to get a code
# from your Google account
drive.mount('/content/gdrive')
# Example from HCP dataset colab notebook
# If you adapt this to another dataset, post it in the replies!
# Note that this link may look different depending on whether you
# have an institutional or personal account
HCP_DIR = os.path.join("/content/gdrive", "My Drive", "colab_data", "hcp")
if not os.path.isdir(HCP_DIR):
    # makedirs (rather than mkdir) also creates the parent
    # colab_data folder if it doesn't exist yet
    os.makedirs(HCP_DIR)
Then you should not need to re-download the data the next time you open the notebook (make sure you save a copy of the notebook in your Drive!).
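To make the "download once, reuse later" step explicit, here is a small helper that only downloads a file when it isn't already in the cache directory. This is a sketch, not part of the HCP notebook: the `fetch_once` name, the URL, and the filename are all hypothetical, and in Colab you would point `dest_dir` at a folder under the mounted Drive path (like HCP_DIR above).

```python
import os
import urllib.request

def fetch_once(url, dest_dir, filename):
    """Download url into dest_dir only if the file isn't cached yet.

    Returns the local path to the (possibly pre-existing) file.
    """
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, filename)
    if not os.path.exists(dest):
        # Only hit the network when the cached copy is missing
        urllib.request.urlretrieve(url, dest)
    return dest

# Hypothetical usage in Colab, after drive.mount:
# path = fetch_once("https://example.org/hcp_subject_list.txt",
#                   HCP_DIR, "hcp_subject_list.txt")
```

On the second run the file already exists in your Drive folder, so the function returns immediately without re-downloading.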