Posts

Check out my posts

paper review: “Graph Attention Networks”

arXiv: https://arxiv.org/abs/1710.10903. Key points: the paper introduces the "graph attention network (GAT)", which is built from "graph attention layers". These layers bring the "self-attention" idea from Transformers to graph neural networks, and they work by calculating weights of a Read more…
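The excerpt cuts off at the weight calculation, so here is a minimal numpy sketch of a single-head graph attention layer as described in the GAT paper: each edge gets a score from a shared attention vector, scores are softmax-normalized over each node's neighbourhood, and features are aggregated with those weights. All names, shapes, and the toy graph below are my own illustration, not the paper's reference code.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # LeakyReLU nonlinearity used on the raw attention scores
    return np.where(x > 0, x, slope * x)

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    ex = np.exp(x)
    return ex / ex.sum(axis=axis, keepdims=True)

def gat_layer(H, A, W, a):
    """Single-head graph attention layer (illustrative sketch).

    H: (N, F)  input node features
    A: (N, N)  adjacency matrix with self-loops (nonzero = edge)
    W: (F, Fp) shared linear transform
    a: (2*Fp,) attention weight vector
    Returns (N, Fp) updated node features.
    """
    Z = H @ W                      # transform node features
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [z_i || z_j]); splitting `a` into halves
    # lets us compute all pairwise scores without concatenation
    src = Z @ a[:Fp]               # (N,) contribution of node i
    dst = Z @ a[Fp:]               # (N,) contribution of node j
    e = leaky_relu(src[:, None] + dst[None, :])
    # restrict attention to actual neighbours before normalizing
    e = np.where(A > 0, e, -np.inf)
    alpha = softmax(e, axis=1)     # attention coefficients per node
    return alpha @ Z               # weighted neighbourhood aggregation

# toy example: 3-node path graph with self-loops
rng = np.random.default_rng(0)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
H = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))
a = rng.normal(size=(4,))
out = gat_layer(H, A, W, a)
print(out.shape)  # (3, 2)
```

In the full model these layers are stacked with multi-head attention, but the per-layer weight computation is the part the excerpt refers to.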

Get in Touch

Contact me through LinkedIn