Graph Representation Learning Book
William L. Hamilton, McGill University.
The field of graph representation learning has grown at an incredible (and sometimes unwieldy) pace over the past seven years, transforming from a small subset of researchers working on a relatively niche topic to one of the fastest growing sub-areas of deep learning.
This book is my attempt to provide a brief but comprehensive introduction to graph representation learning, including methods for embedding graph data, graph neural networks, and deep generative models of graphs.
- Download the pre-publication PDF.
- Purchase the e-book or print edition here.
- Access the individual chapters (in pre-publication form) below.
Contents and Chapter Drafts
- Chapter 1: Introduction and Motivations [Draft. Updated September 2020.]
- Chapter 2: Background and Traditional Approaches [Draft. Updated September 2020.]
- Chapter 3: Neighborhood Reconstruction Methods [Draft. Updated September 2020.]
- Chapter 4: Multi-Relational Data and Knowledge Graphs [Draft. Updated September 2020.]
- Chapter 5: The Graph Neural Network Model [Draft. Updated September 2020.]
- Chapter 6: Graph Neural Networks in Practice [Draft. Updated September 2020.]
- Chapter 7: Theoretical Motivations [Draft. Updated September 2020.]
- Chapter 8: Traditional Graph Generation Approaches [Draft. Updated September 2020.]
- Chapter 9: Deep Generative Models [Draft. Updated September 2020.]
- Bibliography [Draft. Updated September 2020.]
Copyrights and Citation
This book is a pre-publication draft of the book that has been published by Morgan & Claypool. The publishers have generously agreed to allow the public hosting of the pre-publication draft, which does not include the publisher's formatting or revisions. The book should be cited as follows:
@article{hamilton2020grl,
  author={Hamilton, William L.},
  title={Graph Representation Learning},
  journal={Synthesis Lectures on Artificial Intelligence and Machine Learning},
  volume={14},
  number={3},
  pages={1-159},
  year={2020},
  doi={10.2200/S01045ED1V01Y202009AIM046},
  publisher={Morgan and Claypool}
}
All copyrights held by the author and publishers extend to the pre-publication drafts.
Feedback, typo corrections, and comments are welcome and should be sent to [email protected] with [GRL BOOK] in the subject line.
- DOI: 10.2200/S01045ED1V01Y202009AIM046
- Published in Synthesis Lectures on Artificial Intelligence and Machine Learning, 15 September 2020
Heterogeneous Graph Representation Learning and Applications
- © 2022
- Chuan Shi, Xiao Wang (School of Computer Science, Beijing University of Posts and Telecommunications, Beijing, China)
- Philip S. Yu (Department of Computer Science, University of Illinois at Chicago, Chicago, USA)
- Provides a comprehensive survey of heterogeneous graph representation learning
- Written by experts in the fields of data mining and machine learning
- Demonstrates effective applications of heterogeneous graphs
Part of the book series: Artificial Intelligence: Foundations, Theory, and Algorithms (AIFTA)
Keywords
- Heterogeneous graph
- heterogeneous information network
- social network analysis
- network embedding
- network representation
- data mining
- machine learning
- graph neural network
- deep neural network
- attention mechanism
Table of contents (11 chapters)
- Front Matter
- Introduction
- The State-of-the-Art of Heterogeneous Graph Representation
- Structure-Preserved Heterogeneous Graph Representation
- Attribute-Assisted Heterogeneous Graph Representation
- Dynamic Heterogeneous Graph Representation
- Emerging Topics of Heterogeneous Graph Representation
- Applications
  - Heterogeneous Graph Representation for Recommendation
  - Heterogeneous Graph Representation for Text Mining
  - Heterogeneous Graph Representation for Industry Application
- Platforms and Practice of Heterogeneous Graph Representation Learning
- Future Research Directions
Authors and Affiliations
- Chuan Shi, Xiao Wang
- Philip S. Yu
About the authors
Xiao Wang is an assistant professor in the School of Computer Science at Beijing University of Posts and Telecommunications. He was a postdoc in the Department of Computer Science and Technology at Tsinghua University. He received his Ph.D. from the School of Computer Science and Technology at Tianjin University and completed a joint-training Ph.D. at Washington University in St. Louis. His main research interests include data mining, machine learning, artificial intelligence, and big data analysis. He has published more than 50 refereed papers, including in top journals and conferences in data mining such as IEEE TKDE, KDD, AAAI, IJCAI, and WWW. He also serves as an SPC/PC member and reviewer for several high-level international conferences (e.g., KDD, AAAI, IJCAI) and journals (e.g., IEEE TKDE).
Philip S. Yu's main research interests include big data, data mining (especially graph/network mining), social networks, privacy-preserving data publishing, data streams, database systems, and Internet applications and technologies. He is a Distinguished Professor in the Department of Computer Science at UIC and also holds the Wexler Chair in Information Technology. Before joining UIC, he was with the IBM Thomas J. Watson Research Center, where he was manager of the Software Tools and Techniques department. Dr. Yu has published more than 1,300 papers in refereed journals and conferences with more than 133,000 citations and an h-index of 169. He holds or has applied for more than 300 US patents. Dr. Yu is a Fellow of the ACM and the IEEE. He is the recipient of the ACM SIGKDD 2016 Innovation Award and the IEEE Computer Society's 2013 Technical Achievement Award.
Bibliographic Information
Book Title: Heterogeneous Graph Representation Learning and Applications
Authors: Chuan Shi, Xiao Wang, Philip S. Yu
Series Title: Artificial Intelligence: Foundations, Theory, and Algorithms
DOI: https://doi.org/10.1007/978-981-16-6166-2
Publisher: Springer Singapore
eBook Packages: Computer Science, Computer Science (R0)
Copyright Information: The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
Hardcover ISBN: 978-981-16-6165-5 (published 31 January 2022)
Softcover ISBN: 978-981-16-6168-6 (published 01 February 2023)
eBook ISBN: 978-981-16-6166-2 (published 30 January 2022)
Series ISSN: 2365-3051
Series E-ISSN: 2365-306X
Edition Number: 1
Number of Pages: XX, 318
Number of Illustrations: 1 b/w illustration
Topics: Data Mining and Knowledge Discovery, Machine Learning, Data Structures and Information Theory, Artificial Intelligence