Control Systems and Computers, 2024, Issue 3 (307), pp. 53-59, Article 5.
https://doi.org/10.15407/csc.2024.03.053
UDC 514.18
V.Yu. LEVCHUK, Master’s degree, National University of Kyiv-Mohyla Academy, H. Skovorody str., 2, Kyiv, Ukraine, 04070, ORCID: https://orcid.org/0009-0001-6613-7478, pifagor6541@gmail.com
THE UNIVERSAL MODULE FOR INTEGRATION OF AN INTELLIGENT ASSISTANT INTO IOS APPLICATIONS
Current implementations of integrating intelligent assistants into mobile applications are investigated. Key shortcomings of the existing implementations are identified, and criteria for a universal intelligent assistant are formulated. A proprietary software module for integrating an intelligent assistant into an iOS application has been developed; it provides autonomy, minimal resource requirements, and simplifies the development process. A photo editor application was created to test the operation of the software module. The test results are presented and further development prospects are described.
Keywords: intelligent assistant, artificial intelligence, semantic search, natural language, model, machine learning, speech recognition, graphical interface.
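As a rough illustration of the kind of autonomous, on-device semantic matching referred to in the abstract and keywords, the sketch below maps a free-form user request to the closest known app command using Apple's NaturalLanguage framework. This is an assumption-laden example with a hypothetical photo-editor command set and an arbitrary distance threshold, not the module described in the article.

```swift
import NaturalLanguage

// Hypothetical command description for a photo editor; not the author's module API.
struct AssistantCommand {
    let phrase: String          // canonical description of the action
    let action: () -> Void      // app-side handler
}

// Minimal on-device "intent matcher": picks the command whose phrase
// is semantically closest to the user's request via sentence embeddings.
final class SemanticCommandMatcher {
    private let embedding: NLEmbedding
    private let commands: [AssistantCommand]

    init?(commands: [AssistantCommand]) {
        // Requires iOS 14+ / macOS 11+; returns nil if the embedding model is unavailable.
        guard let sentenceEmbedding = NLEmbedding.sentenceEmbedding(for: .english) else {
            return nil
        }
        self.embedding = sentenceEmbedding
        self.commands = commands
    }

    /// Returns the semantically closest command, or nil if nothing is close enough.
    /// The threshold is an illustrative assumption, not a tuned value.
    func match(_ query: String, maxDistance: Double = 1.0) -> AssistantCommand? {
        let scored = commands.map { command in
            (command, embedding.distance(between: query, and: command.phrase))
        }
        guard let best = scored.min(by: { $0.1 < $1.1 }), best.1 <= maxDistance else {
            return nil
        }
        return best.0
    }
}

// Usage: the assistant receives recognized speech or typed text
// and dispatches the nearest photo-editor action.
let matcher = SemanticCommandMatcher(commands: [
    AssistantCommand(phrase: "crop the photo") { print("Cropping the photo") },
    AssistantCommand(phrase: "apply a black and white filter") { print("Applying filter") },
    AssistantCommand(phrase: "increase the brightness") { print("Increasing brightness") },
])

matcher?.match("make the picture brighter")?.action()
```

A production assistant would combine such matching with speech recognition and a richer embedding model; this snippet only shows why the approach can stay fully on-device with minimal resources.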
Received 28.07.2024