A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning

The authors propose a prompt-based KG foundation model that uses in-context learning, called KG-ICL. It enables universal reasoning over knowledge graphs by conditioning on query-related examples and facts. A tokenizer maps the entities and relations in prompt graphs to predefined tokens. The authors show that the method outperforms baselines and enables generalization and knowledge transfer across diverse KGs.
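To make the tokenizer idea concrete, below is a minimal Python sketch of one plausible scheme, assuming entities are mapped to tokens by their hop distance from the query entity and relations by whether they match the query relation. The token vocabularies and the function `tokenize_prompt_graph` are hypothetical illustrations, not the paper's actual implementation.

```python
from collections import deque

# Hypothetical token vocabularies; KG-ICL's actual token set may differ.
ENTITY_TOKENS = [f"[ENT_{d}]" for d in range(4)]  # entity tokens by hop distance
REL_QUERY_TOKEN = "[REL_QUERY]"                   # relation matching the query relation
REL_OTHER_TOKEN = "[REL_OTHER]"                   # any other relation

def tokenize_prompt_graph(triples, query_entity, query_relation):
    """Map entities/relations of a prompt graph to predefined tokens.

    Entities receive a token based on their (undirected) hop distance
    from the query entity; relations based on whether they equal the
    query relation. This is a sketch of the general idea only.
    """
    # Build an undirected adjacency list over the prompt graph.
    adj = {}
    for h, _, t in triples:
        adj.setdefault(h, set()).add(t)
        adj.setdefault(t, set()).add(h)

    # BFS from the query entity to compute hop distances.
    dist = {query_entity: 0}
    queue = deque([query_entity])
    while queue:
        node = queue.popleft()
        for nb in adj.get(node, ()):
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)

    # Unreachable entities fall back to the farthest-distance token.
    max_d = len(ENTITY_TOKENS) - 1
    ent_tok = {e: ENTITY_TOKENS[min(dist.get(e, max_d), max_d)] for e in adj}
    rel_tok = {r: (REL_QUERY_TOKEN if r == query_relation else REL_OTHER_TOKEN)
               for _, r, _ in triples}
    return ent_tok, rel_tok

# Toy prompt graph around the query (alice, works_at, ?).
graph = [("alice", "friend_of", "bob"), ("bob", "works_at", "acme")]
print(tokenize_prompt_graph(graph, "alice", "works_at"))
```

Because every KG is reduced to the same small, predefined token vocabulary, a model trained this way never depends on graph-specific entity or relation identifiers, which is what allows transfer across diverse KGs.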
