How have bots adapted to the new GPT reality? Cases, tools and debriefs at Conversations’23


On December 8, the eighth edition of Conversations, a conference on conversational (and now generative) AI for business and developers, was held in Moscow. Most industry events this year were defined by ChatGPT, which set off the neural network boom around the world, and almost every talk at the conference was in one way or another related to the implementation, development or practical application of generative AI. Part of the conference was devoted to classic automation and the development of AI assistants in the new GPT reality.

Traditionally, Conversations opened with a talk by Kirill Petrov, co-founder of Just AI. In his presentation, Kirill collected the key generative AI trends for 2024, outlined the obstacles awaiting domestic IT companies, and highlighted a number of promising startups worth following in the near future. As one of the GenAI trends, Kirill noted the concentration of the most powerful LLMs in the hands of IT giants, available only from vendor clouds, while weaker models will go open source. In the near future we can also expect a large number of multimodal models that work with different types of content. And as companies integrate the new technologies, the way they work with knowledge and information will change, from searching internal knowledge bases to working with clients and site search.

Among the factors hindering the rapid development and spread of neural network models in Russia, Kirill singled out, first of all, sanctions and the country’s isolation from the most advanced models and from building large GPU clusters; these factors also hold back competition. In second place he put the lack of venture capital investment and, as a result, the small number of AI startups on our market. Regulatory and security questions around large language models also remain open: banks and corporations, traditionally faster and more progressive here than in Europe, are held back by security concerns, laws and policies on personal and sensitive data.

The GPU shortage was addressed by the speakers from Selectel, Yukhym Golovin and Vladyslav Kyrpinskyi: since the beginning of 2023, demand for GPUs at Selectel has tripled. The experts talked about the market situation, the infrastructure required to run LLMs, the challenges facing hardware suppliers, and the pain points of client companies.

The creators of GigaChat and YandexGPT shared practical cases of using the most powerful LLMs on the domestic market. The top business cases for both companies involved customer support, HR tasks, marketing and sales. Polina Hryshina from SberDevices shared insights from developing their own LLM and outlined the resources it took: training GigaChat required 1024 GPUs (enough power to supply the Luzhniki stadium with electricity for 4 months) and 7.5 PB of data.

The YandexGPT spokesman, Oleksiy Dolotov, in turn shared statistics on users of the neural network tools in Yandex products: more than 31 million people have used the brief retelling feature for YouTube videos, and all the tools combined have saved users a total of 57 years.

Representatives of SberDevices, YandexGPT, Tinkoff, and Alfa-Insurance gathered on stage to discuss technical issues related to the development and further training of LLMs. In the panel discussion “The Reverse Side of LLMs”, the experts covered the implementation and operation of open-source and proprietary large language models.

With the arrival of ChatGPT, the lives of many chatbots and voice assistants changed, and speakers from Tinkoff and Boto talked about how that transformation took place. Artem Bondar of Tinkoff shared the details of integrating ChatGPT into the bank’s support bot and how the team tackled the problem of neural network hallucinations (a general illustration of one such approach follows below). Anna Begiashvili of Boto talked about how the company quickly adapted to the new technologies and began integrating LLMs into all of its products, from onboarding and training bots to HR bots.
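The talk did not go into implementation details, so here is only a minimal, hypothetical sketch of one common way to curb hallucinations in a support bot: answer strictly from passages retrieved from the knowledge base and hand off to a human when nothing relevant is found. The `retrieve` and `call_llm` functions below are placeholder stubs, not Tinkoff’s actual components.

```python
# Illustrative sketch: ground the bot's answer in retrieved knowledge-base
# passages to reduce hallucinations. All names here are hypothetical stubs.
from typing import List


def retrieve(question: str, knowledge_base: List[str], top_k: int = 3) -> List[str]:
    # Naive keyword-overlap retrieval; a real system would use vector search.
    scored = [
        (sum(word in doc.lower() for word in question.lower().split()), doc)
        for doc in knowledge_base
    ]
    return [doc for score, doc in sorted(scored, reverse=True)[:top_k] if score > 0]


def call_llm(prompt: str) -> str:
    # Placeholder for a call to whichever LLM the bot actually uses.
    return "stubbed model answer"


def answer(question: str, knowledge_base: List[str]) -> str:
    context = retrieve(question, knowledge_base)
    if not context:
        # Nothing relevant found: escalate instead of letting the model guess.
        return "Passing the question to a human operator."
    prompt = (
        "Answer strictly from the context below. If the answer is not in the "
        "context, say you do not know.\n\n"
        "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"
    )
    return call_llm(prompt)


print(answer("How do I block my card?", ["To block a card, open the app and ..."]))
```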

Many conference participants noted in their talks that at the start of their GPT journey they ran into a lack of information backed by real implementation cases. A practical guide to integrating LLMs into business processes was shared by Maksym Bolotskikh of Yakov and Partners. Maksym listed the main steps for developing an implementation strategy, talked about use cases for piloting, assembling a team and training employees, and answered the question of how to scale generative AI further across an organization.

Timofii Barsov of Markswebb gave an analytical talk introducing a new rating system for chatbots and describing how it has evolved over the past few years. According to Timofii, a chatbot’s success today is 50% its ability to solve the client’s tasks, 45% its ability to hold a competent dialogue with a person, and 5% the convenience of the interface.
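As a quick illustration of how such a weighted rating works, here is a small sketch; the metric names and example values are assumptions, not Markswebb’s actual methodology.

```python
# Weighted chatbot score per the proportions quoted above:
# 50% task resolution, 45% dialogue quality, 5% interface convenience.
WEIGHTS = {"task_resolution": 0.50, "dialogue_quality": 0.45, "interface": 0.05}


def chatbot_score(metrics: dict[str, float]) -> float:
    """Each metric is a 0..1 value; the result is the 0..1 weighted total."""
    return sum(WEIGHTS[name] * metrics[name] for name in WEIGHTS)


print(chatbot_score({"task_resolution": 0.8, "dialogue_quality": 0.7, "interface": 0.9}))
# -> 0.76
```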

Many talks were devoted to the development of speech technologies. Yuliya Korotkova from Just AI talked about trends in speech synthesis and prosody control for creating lively, natural-sounding voices. The speaker from VKontakte, Vitaly Shutov, shared neural network processing methods for real-time applications: how streaming speech recognition, noise suppression and echo cancellation work, and how to restore lost audio packets (a simplified sketch of such a streaming pipeline is given below). Andriy Smolev from Yandex Cloud talked about how YaGPT is changing the usual speech analytics processes.
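For readers unfamiliar with real-time audio pipelines, the sketch below shows the general shape of such processing: audio arrives in small chunks, missing packets are concealed, each chunk is denoised, and a streaming recognizer updates a partial transcript. Every function here is a hypothetical stub, not VKontakte’s implementation.

```python
# Purely illustrative streaming speech pipeline with placeholder stages.
from typing import Iterable, List, Optional

CHUNK_MS = 20  # typical real-time frame size


def conceal_if_lost(chunk: Optional[bytes], previous: bytes) -> bytes:
    # Packet-loss concealment: if a chunk is missing, reuse the previous one
    # instead of breaking the audio stream.
    return chunk if chunk is not None else previous


def denoise(chunk: bytes) -> bytes:
    # Placeholder for noise suppression / echo cancellation.
    return chunk


def recognize_incremental(chunk: bytes, transcript: List[str]) -> List[str]:
    # Placeholder for a streaming ASR decoder that updates a partial
    # transcript as new audio arrives.
    transcript.append("<partial>")
    return transcript


def process_stream(chunks: Iterable[Optional[bytes]]) -> List[str]:
    transcript: List[str] = []
    previous = b"\x00" * CHUNK_MS
    for chunk in chunks:
        chunk = conceal_if_lost(chunk, previous)
        chunk = denoise(chunk)
        transcript = recognize_incremental(chunk, transcript)
        previous = chunk
    return transcript


# Simulated stream with one lost packet (None).
print(process_stream([b"a" * 20, None, b"b" * 20]))
```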

The next AI Conversations conference will be held in 2024. Follow the program on the project website.
