Learning a functional grammar of protein domains using natural language word embedding techniques

Proteins. 2020 Apr;88(4):616-624. doi: 10.1002/prot.25842. Epub 2019 Nov 25.

Abstract

In this paper, using Word2vec, a widely used natural language processing method, we demonstrate that protein domains may have a learnable implicit semantic "meaning" in the context of their functional contributions to the multi-domain proteins in which they are found. Word2vec is a family of models that produce semantically meaningful embeddings of words or tokens in a fixed-dimension vector space. In this work, we treat multi-domain proteins as "sentences" in which domain identifiers are tokens that may be considered "words." Using the full set of Pfam domain assignments from InterPro (Finn et al. 2017), we observe that the resulting embedding can be used to suggest putative GO assignments for Pfam (Finn et al. 2016) domains of unknown function.
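The protein-as-sentence encoding described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the protein accessions and Pfam assignments below are hypothetical placeholders, and the gensim call shown in comments is one common way such an embedding could be trained.

```python
# Sketch: representing multi-domain proteins as "sentences" whose
# "words" are Pfam domain identifiers, as described in the abstract.
# The proteins and domain assignments here are hypothetical examples.

# Each protein maps to the ordered list of its Pfam domain identifiers.
proteins = {
    "PROT_A": ["PF00069", "PF07714"],
    "PROT_B": ["PF00069", "PF00433"],
    "PROT_C": ["PF07714", "PF00433", "PF00069"],
}

# Word2vec treats each domain list as one training sentence.
sentences = list(proteins.values())

# With gensim installed, an embedding could then be trained roughly as:
#   from gensim.models import Word2Vec
#   model = Word2Vec(sentences, vector_size=100, window=5,
#                    min_count=1, sg=1)  # sg=1: skip-gram
#   model.wv.most_similar("PF00069")  # domains in similar contexts

# The "vocabulary" of this corpus is the set of distinct domain tokens.
vocabulary = sorted({domain for s in sentences for domain in s})
print(vocabulary)  # → ['PF00069', 'PF00433', 'PF07714']
```

Domains that co-occur in similar multi-domain contexts would then receive nearby vectors, which is what allows function (e.g., GO terms) to be suggested for domains of unknown function by proximity in the embedding space.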

Keywords: function prediction; machine learning; protein domains; semantic embedding; word2vec.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Databases, Protein
  • Datasets as Topic
  • Gene Ontology
  • Humans
  • Molecular Sequence Annotation / methods*
  • Natural Language Processing*
  • Protein Domains
  • Proteins / chemistry*
  • Proteins / physiology
  • Semantics*

Substances

  • Proteins