表示学习小组 (Representation Learning Group)

Surveys

  1. Representation Learning: A Review and New Perspectives. Yoshua Bengio, Aaron Courville, and Pascal Vincent. TPAMI 2013. paper
  2. 知识表示学习研究进展 (Knowledge Representation Learning: A Review). Zhiyuan Liu, Maosong Sun, Yankai Lin, Ruobing Xie. 计算机研究与发展 (Journal of Computer Research and Development) 2016. paper
  3. A Review of Relational Machine Learning for Knowledge Graphs. Maximilian Nickel, Kevin Murphy, Volker Tresp, Evgeniy Gabrilovich. Proceedings of the IEEE 2016. paper
  4. Knowledge Graph Embedding: A Survey of Approaches and Applications. Quan Wang, Zhendong Mao, Bin Wang, Li Guo. TKDE 2017. paper

Translation-based Models

  1. TransE: Translating Embeddings for Modeling Multi-relational Data. Antoine Bordes, Nicolas Usunier, Alberto Garcia-Duran, Jason Weston, Oksana Yakhnenko. NIPS 2013. paper
  2. TransH: Knowledge Graph Embedding by Translating on Hyperplanes. Zhen Wang, Jianwen Zhang, Jianlin Feng, Zheng Chen. AAAI 2014. paper
  3. TransR & CTransR: Learning Entity and Relation Embeddings for Knowledge Graph Completion. Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu. AAAI 2015. paper
  4. TransD: Knowledge Graph Embedding via Dynamic Mapping Matrix. Guoliang Ji, Shizhu He, Liheng Xu, Kang Liu, Jun Zhao. ACL 2015. paper
  5. TransA: An Adaptive Approach for Knowledge Graph Embedding. Han Xiao, Minlie Huang, Hao Yu, Xiaoyan Zhu. arXiv 2015. paper
  6. TranSparse: Knowledge Graph Completion with Adaptive Sparse Transfer Matrix. Guoliang Ji, Kang Liu, Shizhu He, Jun Zhao. AAAI 2016. paper
  7. TransG: A Generative Mixture Model for Knowledge Graph Embedding. Han Xiao, Minlie Huang, Xiaoyan Zhu. ACL 2016. paper
  8. KG2E: Learning to Represent Knowledge Graphs with Gaussian Embedding. Shizhu He, Kang Liu, Guoliang Ji, Jun Zhao. CIKM 2015. paper

Relation Path-based Models

  1. PTransE: Modeling Relation Paths for Representation Learning of Knowledge Bases. Yankai Lin, Zhiyuan Liu, Huanbo Luan, Maosong Sun, Siwei Rao, Song Liu. EMNLP 2015. paper
  2. Traversing Knowledge Graphs in Vector Space. Kelvin Guu, John Miller, Percy Liang. EMNLP 2015. paper
  3. Knowledge Graph Embedding with Hierarchical Relation Structure. Zhao Zhang, Fuzhen Zhuang, Meng Qu, Fen Lin, Qing He. ACL 2018. paper
  4. TransRHS: A Representation Learning Method for Knowledge Graphs with Relation Hierarchical Structure. Fuxiang Zhang, Xin Wang, Zhao Li, Jianxin Li. IJCAI 2020. paper

Entity Type-based Methods

  1. Type-Constrained Representation Learning in Knowledge Graphs. Denis Krompaß, Stephan Baier, Volker Tresp. ISWC 2015. paper
  2. TKRL: Representation Learning of Knowledge Graphs with Hierarchical Types. Ruobing Xie, Zhiyuan Liu, Maosong Sun. IJCAI 2016. paper code

Entities should have multiple representations under different types. TKRL is the first attempt to capture hierarchical type information, which is significant for KRL.
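
A minimal, hypothetical sketch of that idea (not the authors' released code linked above): each entity embedding is projected by matrices composed along its type path before a TransE-style translation check, so the same entity is represented differently under different types. All names, dimensions, and the composition choice are illustrative.

  import numpy as np

  dim = 20
  rng = np.random.default_rng(0)

  # toy embeddings; in TKRL these would be trained jointly
  entity_emb = {e: rng.normal(size=dim) for e in ["h", "t"]}
  relation_emb = {"r": rng.normal(size=dim)}
  # one projection matrix per (sub-)type in the hierarchy
  type_mat = {ty: rng.normal(size=(dim, dim)) for ty in ["person", "writer", "book"]}

  def hierarchical_projection(type_path):
      # compose the projection along a type path such as ["person", "writer"],
      # in the spirit of a recursive hierarchy encoder
      M = np.eye(dim)
      for ty in type_path:
          M = type_mat[ty] @ M
      return M

  def tkrl_score(h, r, t, head_types, tail_types):
      # lower is better: translation check on the type-projected entities
      Mh = hierarchical_projection(head_types)
      Mt = hierarchical_projection(tail_types)
      return np.linalg.norm(Mh @ entity_emb[h] + relation_emb[r] - Mt @ entity_emb[t])

  print(tkrl_score("h", "r", "t", ["person", "writer"], ["book"]))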

  3. KR-EAR: Knowledge Representation Learning with Entities, Attributes and Relations. Yankai Lin, Zhiyuan Liu, Maosong Sun. IJCAI 2016. paper code

Existing KG relations can be divided into attributes and relations, which exhibit rather distinct characteristics. KR-EAR is a KR model over entities, attributes, and relations that encodes the correlations between entity attributes.
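
A rough, hypothetical sketch of that separation, assuming a translation-style distance for relational triples and a softmax over candidate values for attribute triples; the paper's exact objectives differ, and all names below are invented for the example.

  import numpy as np

  dim = 20
  rng = np.random.default_rng(0)

  entity_emb = {e: rng.normal(size=dim) for e in ["Obama", "USA"]}
  relation_emb = {"president_of": rng.normal(size=dim)}   # entity-to-entity relations
  attribute_emb = {"gender": rng.normal(size=dim)}        # attributes with discrete values
  value_emb = {v: rng.normal(size=dim) for v in ["male", "female"]}

  def relation_score(h, r, t):
      # translation-style distance for relational triples (lower is better)
      return np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t])

  def attribute_prob(e, a, v):
      # attribute triples scored as a softmax over candidate values (higher is better)
      logits = {val: -np.linalg.norm(entity_emb[e] + attribute_emb[a] - value_emb[val])
                for val in value_emb}
      exps = {val: np.exp(s) for val, s in logits.items()}
      return exps[v] / sum(exps.values())

  print(relation_score("Obama", "president_of", "USA"))
  print(attribute_prob("Obama", "gender", "male"))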

  4. Differentiating Concepts and Instances for Knowledge Graph Embedding. Xin Lv, Lei Hou, Juanzi Li, Zhiyuan Liu. EMNLP 2018. paper code

TransC proposes a novel knowledge graph embedding model that differentiates concepts from instances. Specifically, TransC encodes each concept in the knowledge graph as a sphere and each instance as a vector in the same semantic space. The model also handles the transitivity of isA relations much better than previous models.
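
The geometric encoding can be illustrated in a few lines. The sketch below (illustrative values, not the released code linked above) treats instanceOf as sphere membership and subClassOf as sphere containment, which is what makes the transitivity of isA fall out naturally.

  import numpy as np

  dim = 10
  rng = np.random.default_rng(0)

  instances = {"Beijing": rng.normal(size=dim)}   # instance -> vector
  concepts = {                                    # concept -> (centre, radius)
      "City": (rng.normal(size=dim), 2.0),
      "Location": (rng.normal(size=dim), 4.0),
  }

  def instance_of_loss(i, c):
      # zero when the instance vector lies inside the concept sphere
      centre, radius = concepts[c]
      return max(0.0, np.linalg.norm(instances[i] - centre) - radius)

  def sub_class_of_loss(c1, c2):
      # zero when sphere c1 is fully contained in sphere c2; containment
      # composes, so isA transitivity comes for free
      (p1, m1), (p2, m2) = concepts[c1], concepts[c2]
      return max(0.0, np.linalg.norm(p1 - p2) + m1 - m2)

  print(instance_of_loss("Beijing", "City"))
  print(sub_class_of_loss("City", "Location"))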

  5. AutoETER: Automated Entity Type Representation for Knowledge Graph Embedding. Guanglin Niu, Bo Li, Yongfei Zhang, Shiliang Pu, Jingyang Li. EMNLP 2020. paper

Work on knowledge graph representation learning; the first author is Guanglin Niu from Beihang University. The paper extends the entity-relation-entity triples of a knowledge graph to type-level triples of the form entity type-relation-entity type and proposes AutoETER, a model that automatically learns vector representations of entity types. It provides theoretical proofs that the model can represent and infer symmetric, inverse, and transitive relations, and it also handles inference over complex 1-N, N-1, and N-N relations. In particular, AutoETER is a plug-and-play module that can be attached to any knowledge graph representation learning model to supply entity type representations and further improve the performance of the base model. Experimental results on four datasets demonstrate the effectiveness and superiority of AutoETER. The experiments also include a visualization analysis showing that the clusters formed by entity type representations are clearly better than those formed by entity representations, which confirms the usefulness of the type representations.
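
To illustrate the plug-in usage described above, here is a hypothetical sketch in which a type-compatibility term is added on top of a base KGE score (TransE serves only as a placeholder base model); the weighting scheme and the type-level scoring form are assumptions, not the paper's exact formulation.

  import numpy as np

  dim = 20
  rng = np.random.default_rng(0)

  # instance-level embeddings used by the base model
  entity_emb = {e: rng.normal(size=dim) for e in ["h", "t"]}
  relation_emb = {"r": rng.normal(size=dim)}
  # automatically learned type embeddings plus a relation-specific type translation
  type_emb = {e: rng.normal(size=dim) for e in ["h", "t"]}
  relation_type_emb = {"r": rng.normal(size=dim)}

  def base_score(h, r, t):
      # any base KGE score could sit here; TransE is just a placeholder
      return np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t])

  def type_score(h, r, t):
      # score of the type-level triple (type(h), r, type(t))
      return np.linalg.norm(type_emb[h] + relation_type_emb[r] - type_emb[t])

  def combined_score(h, r, t, lam=0.5):
      # lower is better; lam balances instance-level and type-level evidence
      return base_score(h, r, t) + lam * type_score(h, r, t)

  print(combined_score("h", "r", "t"))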