
Pre-Trained Embeddings for Enhancing Multi-Hop Reasoning

This is the official codebase of the paper Pre-Trained Embeddings For Enhancing Multi-Hop Reasoning.

Credits

These experiments build on the SalesForce MultiHopKG repository, which contains the code for the paper Multi-Hop Knowledge Graph Reasoning with Reward Shaping.

How to run

Pre-training the KGE model

To train only the KGE model (ComplEx or ConvE), run the following command:

./experiment-emb.sh configs/<dataset>-<emb_model>.sh --train <gpu-ID>
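For example, a concrete invocation might look like the following. The config file name is an assumption derived from the `<dataset>-<emb_model>` pattern above; check the configs directory for the exact names.

```shell
# Hypothetical example: pre-train ComplEx on FB15K-237 using GPU 0.
# The file configs/fb15K-237-complex.sh is assumed from the naming pattern above.
./experiment-emb.sh configs/fb15K-237-complex.sh --train 0
```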

Train MultiHopKG with reward shaping and pre-trained embeddings

To train MultiHopKG using the pre-trained embeddings from ConvE or ComplEx, run:

./experiment-rs.sh configs/<dataset>-<pre-trained_model>.sh --train <gpu-ID>

Train MultiHopKG without reward shaping but using pre-trained embeddings

To train MultiHopKG with pre-trained embeddings from ConvE or ComplEx but without reward shaping, run:

./experiment.sh configs/<dataset>.sh --train <gpu-ID>

To select which pre-trained model to use, open the config file associated with the dataset (e.g., configs/fb15K-237.sh for FB15K-237) and set the pretrained argument to conve or complex.
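As a sketch, the relevant line in the config file might look like this. The variable name pretrained comes from the text above; the exact file contents and quoting style are assumptions.

```shell
# In configs/fb15K-237.sh (contents assumed for illustration):
pretrained="complex"   # or "conve" to use the ConvE embeddings instead
```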

Freeze the pre-trained embeddings

By default, pre-trained embeddings are part of the learnable parameters of the model. To freeze the pre-trained embeddings and exclude them from the learnable parameters, add the argument --freeze at the end of your command.

Inference

To evaluate an already trained model, replace the --train flag with the --inference flag in the above commands. To save the search paths during inference, add the --save_beam_search_paths flag.
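Putting the inference flags together, an evaluation run might look like the following. The config file name is an assumption based on the patterns above.

```shell
# Hypothetical example: evaluate a model trained with reward shaping on GPU 0,
# saving the beam-search paths explored during inference.
./experiment-rs.sh configs/fb15K-237-complex.sh --inference 0 --save_beam_search_paths
```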

Citation

If you use this work, please cite our paper:

@inproceedings{drance2023pre,
  title={Pre-Trained Embeddings for Enhancing Multi-Hop Reasoning},
  author={Dranc{\'e}, Martin and Mougin, Fleur and Zemmari, Akka and Diallo, Gayo},
  booktitle={International Joint Conference on Artificial Intelligence 2023 Workshop on Knowledge-Based Compositional Generalization},
  year={2023}
}
