Using DataLad with Google Cloud Storage

Hi All,

I am new to DataLad. I am trying to track version history and capture commit details for every person who makes changes to my DataLad dataset.

So far, I have been able to create a sibling of my local dataset pointing to a cloud storage bucket and to export the DataLad dataset to that GCS bucket/DataLad sibling.
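
For reference, this is roughly how I set up the sibling and the export. It is only a sketch: the remote and bucket names are placeholders, and the exact initremote options may differ from what I actually ran.

```
# create a git-annex S3-compatible special remote pointing at GCS
# ("gcs-sibling" and "my-datalad-bucket" are placeholder names)
git annex initremote gcs-sibling type=S3 host=storage.googleapis.com \
    encryption=none bucket=my-datalad-bucket exporttree=yes

# export the files of the current branch to the bucket
git annex export main --to gcs-sibling
```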

What I am trying to achieve here are the following points:

  1. Whenever files in my DataLad dataset change, the commit should capture the details of the user who made the change.
    Currently it records the git config identity (user.name/user.email) that I set when I installed Git. Is there a way to pass these values dynamically through DataLad when making a commit? (See the identity-override sketch after this list.)

  2. I don’t want my local disk to keep the full history of the files; I want only the metadata and version history, and I want those stored in a GCS bucket.
    Currently, I am able to push all the files/folders (except the .git folder, which contains the history) to the GCS sibling using the git-annex export command shown above. Is there a way to push the version history to the GCS bucket as well and get insights from there instead of storing everything locally?

  3. Also, most of the commands I am using are git-annex commands. Is there a DataLad API for the same operations? (See the DataLad-commands sketch after this list.)
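
For point 1, something along these lines is what I am hoping for. This is only a hypothetical sketch: the name/email values are placeholders, and I do not know whether DataLad actually honours these variables when it commits.

```
# hypothetical: override the commit identity for a single datalad save
# by setting Git's author/committer environment variables for that call
GIT_AUTHOR_NAME="Jane Doe" GIT_AUTHOR_EMAIL="jane@example.org" \
GIT_COMMITTER_NAME="Jane Doe" GIT_COMMITTER_EMAIL="jane@example.org" \
datalad save -m "update files changed by Jane"
```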
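
For point 3, these are the DataLad commands I have found so far that seem to correspond to my git-annex workflow; the sibling name is a placeholder, and I am not sure whether they cover the export-to-bucket case.

```
# save changes with a commit message (instead of git add / git commit)
datalad save -m "describe the change"

# list the configured siblings of the dataset
datalad siblings

# push content and history to a sibling -- unclear to me whether this
# works with an exporttree-style special remote like my GCS sibling
datalad push --to gcs-sibling
```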

Any insights will be helpful.