Syrine Krichene, Thomas Müller, Julian Eisenschlos, DoT: An efficient Double Transformer for NLP tasks with tables, Findings of ACL, 2021 Code

Julian Eisenschlos, Bhuwan Dhingra, Jannis Bulian, Benjamin Börschinger, Jordan Boyd-Graber, Fool Me Twice: Entailment from Wikipedia Gamification, Proceedings of NAACL, 2021 Code

Jonathan Herzig, Thomas Müller, Syrine Krichene, Julian Eisenschlos, Open Domain Question Answering over Tables via Dense Retrieval, Proceedings of NAACL, 2021 Code

Thomas Müller, Julian Eisenschlos, Syrine Krichene, TAPAS at SemEval-2021 Task 9: Reasoning over tables with intermediate pre-training, Proceedings of the 15th International Workshop on Semantic Evaluation, 2021 Code

Julian Eisenschlos, Syrine Krichene, Thomas Müller, Understanding tables with intermediate pre-training, Findings of EMNLP, 2020 Code Blog

Jonathan Herzig, Pawel Krzysztof Nowak, Thomas Müller, Francesco Piccinno, Julian Eisenschlos, TaPas: Weakly Supervised Table Parsing via Pre-training, Proceedings of ACL, 2020 Code Blog

Sebastian Prillo*, Julian Eisenschlos*, SoftSort: A Continuous Relaxation for the argsort Operator, Proceedings of ICML, 2020 Code

Julian Eisenschlos*, Sebastian Ruder*, Piotr Czapla*, Marcin Kardas*, Sylvain Gugger, Jeremy Howard, MultiFiT: Efficient Multi-lingual Language Model Fine-tuning, Proceedings of EMNLP, 2019 Code

Venky Iyer, Clayton Andrews, Omid Rouhani-Kalleh, Julian Eisenschlos, Evaluating likely accuracy of metadata received from social networking system users based on user characteristics, US Patent App. 14/742,639, 2016

Julian Eisenschlos, Venky Iyer, Predicting the Quality of New Contributors to a Crowdsourcing System, NeurIPS: Crowdsourcing and Machine Learning Workshop, 2014

Julian Eisenschlos, Mariano Suárez-Álvarez, 3-Calabi-Yau Algebras from Steiner Systems, Master's Thesis, 2013

[*] Equal contribution.