They serve different purposes. PyTorch-BigGraph (BG) implements a set of algorithms for learning node embeddings (vector representations of each node in the graph) based on the edges (relations) present in a single graph.
dgl is a library for graph neural networks (GNNs). The algorithms present in BG can be implemented in dgl, albeit much less efficiently, but the reverse is not generally true.
More specifically, GNNs are a family of methods based on the "message passing" algorithm, where the embedding of each node is a function (parameterized by the model weights) of its neighborhood and of the edges that connect the node to it.
Additionally, GNNs target learning functions that generalize across multiple graphs, for example graphs of molecules whose properties you want to predict, not just a single graph.
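To make the message-passing idea concrete, here is a minimal NumPy sketch of one propagation step (illustrative only, not actual dgl code; the graph, the mean aggregation, and the weight matrix `W` are arbitrary choices for the example):

```python
import numpy as np

np.random.seed(0)

# Toy graph: 4 nodes, directed edges (src -> dst).
edges = [(0, 1), (1, 2), (2, 0), (3, 2)]
num_nodes = 4
h = np.random.rand(num_nodes, 8)   # initial node embeddings
W = np.random.rand(8, 8)           # hypothetical learnable weight matrix

def message_passing_step(h, edges, W):
    # Each node's new embedding is a function of its in-neighborhood:
    # here, the mean of transformed neighbor messages combined with itself.
    agg = np.zeros_like(h)
    deg = np.zeros((h.shape[0], 1))
    for src, dst in edges:
        agg[dst] += h[src] @ W     # message sent from src to dst
        deg[dst] += 1
    deg = np.maximum(deg, 1)       # avoid division by zero for isolated nodes
    return np.tanh(h + agg / deg)  # combine aggregated messages with self

h_new = message_passing_step(h, edges, W)
```

Real GNN layers (GCN, GraphSAGE, GAT, ...) differ mainly in how messages are computed and aggregated, but they all follow this same pattern.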
dgl can be compared to PyTorch Geometric. The former works on both TF 2.0 and PyTorch, while the latter is PyTorch-only.
Both are almost equivalent in features, although dgl has some institutional backing. PyTorch Geometric might feel a bit more lightweight to integrate into existing codebases.
(Not the parent author) I believe what makes HEY compelling to some email users is that it is opinionated about what an email workflow should look like, and that it fits the bill for many.
While a similar way of using email can be achieved in other mail clients [0], the simplicity of having such features by default is in itself a good reason.
Some features that are non-trivial to emulate in other email clients are:
- Grouping emails with different subjects, etc., into the same thread
They are also efficient at inference time. On GPUs the difference is noticeable only for sequences of length > 1024 (2048 for Reformer, since it adds some operations for hashing), thanks to the massive parallelism of GPUs amortizing the quadratic cost of the "usual" self-attention mechanism.
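The quadratic cost mentioned above comes from vanilla self-attention materializing an n x n score matrix. A minimal NumPy sketch (illustrative only: single head, no batching, no masking):

```python
import numpy as np

np.random.seed(0)

def self_attention(X):
    # X has shape (n, d); the scores matrix is (n, n), which is the
    # O(n^2) term in both compute and memory as sequence length grows.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                              # (n, n)
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X                                         # (n, d)

X = np.random.rand(8, 4)   # n = 8 tokens, d = 4 dims
out = self_attention(X)
```

Efficient-attention variants (Reformer, Linformer, Performer, ...) replace the dense (n, n) matrix with an approximation so the cost grows subquadratically, which only pays off once n is large enough to dominate the GPU's parallelism.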
I find remote code editing useful when I don't have the necessary resources on my machine, such as disk/memory/GPU, and I am still prototyping a solution, which happens a lot in ML/DL.
I agree that remote code editing is not strictly necessary, since you can perform such tasks with sshfs or a git push/pull workflow.
On the other hand, having the remote environment in your local IDE speeds up development, and you don't have to set up a full replica of your environment locally.
edit: ML/DL here stands for Machine Learning/Deep Learning
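For reference, the sshfs alternative mentioned above can be as simple as mounting the remote project directory and editing it with local tools (host and paths here are made-up examples):

```shell
# Mount a remote project directory locally over SSH.
mkdir -p ~/mnt/devbox
sshfs user@devbox:/home/user/project ~/mnt/devbox

# ... edit files under ~/mnt/devbox with your local editor/IDE ...

# Unmount when done (on macOS, use: umount ~/mnt/devbox).
fusermount -u ~/mnt/devbox
```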
I also tried SSHFS, but there are hiccups. I did not really debug this, but sometimes PyCharm just freezes for 5-10 seconds, which is extremely annoying and basically makes it unusable. Maybe too many files are opened at the same time, or maybe I have to increase the cache size. I don't really know...
Also, doing any Git action (git commit or so) over SSHFS is way too slow because it needs to go through the whole project directory. Maybe it's fine if the repo and project are smaller, but in this case a Git commit can easily take a minute (while locally it takes a few milliseconds).
A Git push/pull workflow would also be quite annoying when you are prototyping and just want to try out things.
Rclone is mainly for keeping files in sync and transferring them between multiple storage providers, for example copying a file to GDrive/Dropbox when it is added to S3 or locally.
Borg is for backups. It doesn't support any cloud storage provider directly.
One can use both of them jointly, creating borg backups and saving them to GDrive, for example.
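A sketch of that combined workflow (the repo path, archive name, and the `gdrive:` remote are examples; the remote would first be set up via `rclone config`):

```shell
# Create an encrypted borg repository and back up a directory into it.
borg init --encryption=repokey ~/borg-repo
borg create ~/borg-repo::'projects-{now}' ~/projects

# Sync the whole borg repository to a cloud remote with rclone.
rclone sync ~/borg-repo gdrive:borg-repo
```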
There is Password Store [0], although it is apparently no longer maintained (I'm currently still using it). The downside is that there's no auto-fill, meaning you have to copy and paste every login manually.
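The copy-and-paste workflow with Password Store looks like this (the GPG key ID and entry name are made-up examples):

```shell
# Initialize the store with your GPG key ID.
pass init "you@example.com"

# Add a login; pass prompts interactively for the secret.
pass insert example.com/login

# Copy the password to the clipboard instead of printing it.
pass -c example.com/login
```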
Actually there is one: a non-profit organization called AFS (afs.org). Its main purpose is to let adolescents (students between 14 and 17 years old) explore a new country while being hosted by a local family and studying in a local school. The participant makes a list of up to 10 preferred countries they'd like to go to, but can't choose the city; the latter is chosen depending on compatibility with the host families available in that country. The whole network of countries and host families is composed of volunteers (no economic benefit) all over the world who share the same goals: discovering and understanding new cultures. Disclaimer: I spent a year abroad with AFS and am currently a volunteer in my country.