Conference Proceedings

SETSCI - Volume 6(1) (2023)
BMYZ2023 - Cognitive Models and Artificial Intelligence Conference, Ankara, Türkiye, Oct 26, 2023

FlexiGPT: Engaging with Documents
Abdalrhman Alquaary1*, Numan Çelebi2
1Sakarya University, Sakarya, Türkiye
2Sakarya University, Sakarya, Türkiye
* Corresponding author: apoalquaary@gmail.com
Published Date: 2023-11-26   |   Pages: 81-85
https://doi.org/10.36287/setsci.6.1.029

ABSTRACT Large language models have had a profound impact across many domains in recent years and are now integrated into a wide range of applications. In line with this trend, the present study introduces an application that lets users converse with their digital files. The program combines state-of-the-art large language models with Retrieval Augmentation techniques to deliver an interactive and responsive user experience. Hugging Face, a widely used platform for machine learning models, serves as the primary repository for these language models. Through the application, users can choose from the extensive range of open-source large language models available on Hugging Face and can switch to newer models as they are released, keeping the application aligned with the latest language models and with evolving user needs. The program can also be run locally, provided sufficient hardware resources are available.
KEYWORDS LLM, Retrieval Augmentation, Hugging Face, Langchain, Local LLMs
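The pipeline described in the abstract (document ingestion, retrieval augmentation, and an open-source Hugging Face LLM orchestrated with LangChain) can be outlined as follows. This is an illustrative sketch, not the FlexiGPT source code (available at the GitHub link in the references); the model names, file path, and chunking parameters are assumptions.

```python
# Minimal sketch of retrieval-augmented chat over a local document with LangChain
# and an open-source Hugging Face model (illustrative, not the authors' code).
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import HuggingFacePipeline
from langchain.chains import RetrievalQA

# 1) Load the user's document and split it into overlapping chunks.
docs = PyPDFLoader("example.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# 2) Embed the chunks and index them in a local vector store.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
index = FAISS.from_documents(chunks, embeddings)

# 3) Load any open-source LLM from the Hugging Face Hub; swapping to a newer
#    model only requires changing the model_id string (choice here is illustrative).
llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Llama-2-7b-chat-hf",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 256},
)

# 4) Combine the retriever and the LLM into a retrieval-augmented QA chain.
qa = RetrievalQA.from_chain_type(llm=llm, retriever=index.as_retriever())
print(qa.run("What is the main conclusion of this document?"))
```

Because the model is loaded by its Hub identifier, running fully locally is a matter of hardware: the same code downloads and executes the chosen model on the user's machine.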
REFERENCES Web Access: LangChain. https://python.langchain.com/docs/get_started/introduction.

Web Access: LLM Leaderboard. https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard.

Web Access: Embedding Models LeaderBoard. https://huggingface.co/spaces/mteb/leaderboard.

Web Access: FlexiGPT. https://github.com/apoalquaary/FlexiGPT.

Vaswani, A., Shazeer, N., Parmar, N., et al. 2017. Attention Is All You Need. NIPS. arXiv:1706.03762v5.

Zheng, L., Chiang, W.L., Sheng, Y., et al. 2023. Judging LLM-as-a-judge with MT-Bench and Chatbot Arena. arXiv:2306.05685v2.

Brochier, R., Guille, A., Velcin, J. 2019. Global Vectors for Node Representations. arXiv:1902.11004v1.

Mikolov, T., Chen, K., Corrado, G., et al. 2013. Efficient Estimation of Word Representations in Vector Space. arXiv:1301.3781v3.

Hu, E., Shen, Y., Wallis, P., et al. 2021. LoRA: Low-Rank Adaptation of Large Language Models. arXiv:2106.09685v2.

Dettmers, T., Pagnoni, A., Holtzman, A., et al. 2023. QLoRA: Efficient Finetuning of Quantized LLMs. arXiv:2305.14314v1.

Touvron, H., Martin, L., Stone, K., et al. 2023. Llama 2: Open Foundation and Fine-Tuned Chat Models. arXiv:2307.09288v2.

Grootendorst, M. 2022. BERTopic: Neural topic modeling with a class-based TF-IDF procedure. arXiv:2203.05794v1.

