Publications
[09-2024] Hassan Soliman, Miloš Kravčík, Alexander Tobias Neumann, Yue Yin, Norbert Pengel, and Maike Haag. 2024. Scalable Mentoring Support with a Large Language Model Chatbot. Technology Enhanced Learning for Inclusive and Equitable Quality Education (ECTEL), September 16–20, 2024, Krems, Austria, 6 pages.
[09-2024] Hassan Soliman, Miloš Kravčík, Alexander Tobias Neumann, Yue Yin, Norbert Pengel, Maike Haag, and Heinz-Werner Wollersheim. 2024. Generative KI zur Lernbegleitung in den Bildungswissenschaften: Implementierung eines LLM-basierten Chatbots im Lehramtsstudium. 22. Fachtagung Bildungstechnologien (DELFI), September 9–11, 2024, Fulda, Germany, 7 pages.
[07-2024] Hassan Soliman, Miloš Kravčík, Nagasandeepa Basvoju, and Patrick Jähnichen. 2024. Using Large Language Models for Adaptive Dialogue Management in Digital Telephone Assistants. In Adjunct Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization (UMAP Adjunct ’24), July 1–4, 2024, Cagliari, Italy. ACM, New York, NY, USA, 12 pages.
[05-2022] Hassan Soliman, Heike Adel, Mohamed H. Gad-Elrab, Dragan Milchevski, and Jannik Strötgen. 2022. A Study on Entity Linking Across Domains: Which Data is Best for Fine-Tuning? In Proceedings of the 7th Workshop on Representation Learning for NLP (RepL4NLP), ACL, 184–190, Dublin, Ireland.
Preprints
[01-2021] Effective General-Domain Data Inclusion for Machine Translation by Vanilla Transformers.
Built and trained a Transformer from scratch on the German-English translation task of WMT'13.
Utilized a general-domain dataset of IWSLT'16 TED talks to improve the performance of the Transformer model, achieving a BLEU score of 25.8 (see the sketch below).
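A minimal sketch of this setup, assuming PyTorch and its built-in nn.Transformer; the vocabulary sizes, hyperparameters, and the placeholder batch standing in for the mixed WMT'13 + IWSLT'16 data are illustrative assumptions, not the preprint's actual configuration.
```python
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, D_MODEL, PAD_ID = 32000, 32000, 512, 0  # illustrative sizes

class VanillaTransformerMT(nn.Module):
    """Encoder-decoder Transformer for German->English translation.
    (Sinusoidal positional encodings are omitted here for brevity.)"""
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, D_MODEL, padding_idx=PAD_ID)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, D_MODEL, padding_idx=PAD_ID)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            dim_feedforward=2048, dropout=0.1, batch_first=True)
        self.out = nn.Linear(D_MODEL, TGT_VOCAB)

    def forward(self, src_ids, tgt_ids):
        # Causal mask so each target position only attends to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(self.src_emb(src_ids), self.tgt_emb(tgt_ids),
                                  tgt_mask=tgt_mask)
        return self.out(hidden)

# Placeholder batch standing in for tokenized sentence pairs drawn from a
# mix of in-domain (WMT'13) and general-domain (IWSLT'16 TED) data.
src = torch.randint(1, SRC_VOCAB, (8, 40))   # German token ids
tgt = torch.randint(1, TGT_VOCAB, (8, 38))   # English token ids

model = VanillaTransformerMT()
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD_ID)
logits = model(src, tgt[:, :-1])             # teacher forcing with shifted target
loss = loss_fn(logits.reshape(-1, TGT_VOCAB), tgt[:, 1:].reshape(-1))
loss.backward()                              # one illustrative training step
```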
[08-2019] Offensive Language Detection & Classification on Twitter.
Trained an SVM classifier to detect offensive tweets, selecting the final model through iterative experiments.
Achieved a binary accuracy of 74% in classifying offensive tweets, the highest score among all participating teams (see the sketch below).
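A minimal sketch, assuming scikit-learn, of an SVM-based pipeline of this kind; the inline example tweets, the TF-IDF features, and the LinearSVC choice are illustrative assumptions rather than the exact features and preprocessing explored in the experiments.
```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny inline dataset standing in for the annotated tweets (1 = offensive).
train_tweets = ["you are awful and stupid", "have a great day everyone",
                "nobody likes you, get lost", "congrats on the new job!"]
train_labels = [1, 0, 1, 0]

# TF-IDF word n-grams feeding a linear-kernel SVM.
clf = make_pipeline(TfidfVectorizer(lowercase=True, ngram_range=(1, 2)),
                    LinearSVC())
clf.fit(train_tweets, train_labels)

test_tweets = ["what a stupid idea", "thanks, that was really helpful"]
test_labels = [1, 0]
print("binary accuracy:", accuracy_score(test_labels, clf.predict(test_tweets)))
```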
[06-2019] Data Augmentation using Feature Generation for Volumetric Medical Images.
Proposed a learning framework combining U-Net and ACGAN for feature generation from volumetric medical images of two complex types of brain tumors.
Deployed a classifier pipeline to test and validate the quality of the generated features (see the sketch below).
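A minimal sketch, assuming PyTorch, of ACGAN-style class-conditional feature generation plus a downstream classifier probe for checking feature quality; the layer sizes, the feature dimensionality, and the omission of the U-Net backbone and the adversarial discriminator updates are simplifications, not the preprint's actual architecture.
```python
import torch
import torch.nn as nn

NUM_CLASSES, LATENT_DIM, FEAT_DIM = 2, 64, 128   # two brain-tumor types

class FeatureGenerator(nn.Module):
    """ACGAN-style generator: maps (noise, class label) to a synthetic feature vector."""
    def __init__(self):
        super().__init__()
        self.label_emb = nn.Embedding(NUM_CLASSES, LATENT_DIM)
        self.net = nn.Sequential(nn.Linear(2 * LATENT_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, FEAT_DIM))

    def forward(self, noise, labels):
        return self.net(torch.cat([noise, self.label_emb(labels)], dim=1))

class ProbeClassifier(nn.Module):
    """Validation probe: predicts the tumor class from a feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(FEAT_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, NUM_CLASSES))

    def forward(self, feats):
        return self.net(feats)

gen, probe = FeatureGenerator(), ProbeClassifier()
labels = torch.randint(0, NUM_CLASSES, (16,))
fake_feats = gen(torch.randn(16, LATENT_DIM), labels)

# Quality check: can the probe recover the intended class from the generated
# features alone? (Full ACGAN training would alternate generator and
# discriminator updates, which are omitted here.)
loss = nn.CrossEntropyLoss()(probe(fake_feats), labels)
loss.backward()
print("probe cross-entropy on generated features:", float(loss))
```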