PICARD-1819: Distribute MB cache via IPFS to speed up metadata fetching


The overall lookup process is slow due to API rate limits, which exist for good reason. Their impact can be mitigated, however, by distributing the MB response cache using P2P techniques such as IPFS, instead of only caching locally using Qt tools.
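
For reference, here is a minimal sketch of what "caching locally using Qt tools" looks like with Qt's network stack (PyQt5 class names are from Qt's public API; the exact wiring inside Picard may differ):

    # Local-only HTTP cache using Qt's network stack (PyQt5).
    # The cache lives on one machine and is never shared with peers.
    from PyQt5.QtCore import QStandardPaths
    from PyQt5.QtNetwork import QNetworkAccessManager, QNetworkDiskCache

    manager = QNetworkAccessManager()
    cache = QNetworkDiskCache()
    cache.setCacheDirectory(
        QStandardPaths.writableLocation(QStandardPaths.CacheLocation))
    cache.setMaximumCacheSize(100 * 1024 * 1024)  # 100 MiB, arbitrary limit
    manager.setCache(cache)
    # Requests issued through `manager` now honor HTTP caching headers
    # and store responses on the local disk only.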

The P2P technique lets users share their cache with each other, while the MB server serves as the "super-seeder" (in torrent-land) / "pinner" (in IPFS-land), saving request responses as files in an IPFS folder. That folder can be accessed over the IPFS network using the same API path as the original server, only replacing the base URL with the IPFS hash of the folder (or with a gateway base URL plus that hash, if using a gateway). As an advantage over torrents, files in the IPFS folder can be updated in place while keeping their names, whereas a torrent would have to be re-created for the whole folder every time a file changed.
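
To make the base-URL swap concrete, here is a sketch assuming a hypothetical folder CID and the public ipfs.io gateway (the MB web-service path is real; the CID and gateway choice are placeholders):

    # Rewrite an MB API URL into its IPFS equivalent by swapping the
    # base URL for the folder hash (CID), keeping the API path intact.
    MB_BASE = "https://musicbrainz.org"
    FOLDER_CID = "QmExampleFolderHash"   # hypothetical CID of the cache folder
    GATEWAY = "https://ipfs.io"          # any public gateway works

    def to_ipfs_url(mb_url: str, via_gateway: bool = True) -> str:
        # Everything after the host is kept verbatim; a real deployment
        # would still need to encode query strings into file names, since
        # an IPFS folder only stores plain files.
        path = mb_url[len(MB_BASE):]     # e.g. "/ws/2/recording/<mbid>"
        if via_gateway:
            return f"{GATEWAY}/ipfs/{FOLDER_CID}{path}"
        return f"ipfs://{FOLDER_CID}{path}"  # native form for an IPFS client

    # to_ipfs_url("https://musicbrainz.org/ws/2/recording/<mbid>")
    # -> "https://ipfs.io/ipfs/QmExampleFolderHash/ws/2/recording/<mbid>"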

IPFS addresses/content can also be accessed without an IPFS client, using public gateways (list), which temporarily cache the contents of a file and serve it over HTTP/HTTPS. As a proof of concept, I cached a few MB response entries into an IPFS folder replicating the API path and accessed them through two gateways (link1, link2).

Even if an IPFS client can't be built into Picard to help host the cache, the gateways provide a good opportunity to offload requests, as sketched below.
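
A sketch of that offloading idea, using only Python's standard library and the hypothetical CID/gateway from the previous sketch: try the shared copy first, and fall back to the live MB API when the gateway misses or times out:

    # Gateway-first lookup with fallback to the live MB API.
    import urllib.request

    GATEWAY_BASE = "https://ipfs.io/ipfs/QmExampleFolderHash"  # hypothetical

    def fetch(api_path: str) -> bytes:
        try:
            # Served by whoever pinned/cached the folder, not by MB itself.
            with urllib.request.urlopen(GATEWAY_BASE + api_path, timeout=5) as resp:
                return resp.read()
        except OSError:  # URLError/HTTPError/timeouts are all OSError subclasses
            # Cache miss or gateway trouble: hit musicbrainz.org directly,
            # which still counts against the API rate limit.
            with urllib.request.urlopen("https://musicbrainz.org" + api_path) as resp:
                return resp.read()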

Based on the API statistics provided by Monin (@zas) in this link, only about 3% of the queries are unique (total unique / total), meaning that, in principle, up to ~97% of requests could be answered from a shared cache; this indicates real potential for distributed caching, even though the data is too limited to support further assumptions or estimates. It is hard to estimate on Picard's side without the unique-request counts of Picard users, but I'd assume the ratio is similar to that of the rest of the users.

Assignee: Unassigned
Reporter: Gabriel Ferreira (gabrielcarvfer)