Volodymyr, Data Scientist and AI/ML Engineer at Intelliarts, outlines how GenAI and advanced AI agents are redefining content creation, customer service, and internal automation. He highlights the rise of RAG-powered chatbots delivering domain-specific, context-aware insights from custom data sources. Across industries, AI is driving innovation through predictive maintenance, computer vision, and machine learning in marketing and manufacturing. Volodymyr emphasizes aligning AI investments with long-term business goals and scaling solutions thoughtfully. He believes successful adoption depends on strategic integration, cross-functional collaboration, and building adaptable AI infrastructure.
What major AI trends do you foresee shaping the industries you work in over the next 3–5 years?
The biggest AI trend we’re already seeing is the use of Gen AI, with applications like content generation, customer service, market research and analysis, translation, etc. One area where Gen AI is making a powerful impact is in the development of AI-powered chatbots and smart agents. These systems are becoming more capable and context-aware, supporting everything from internal knowledge retrieval to complex task automation. A standout example is a custom AI agent that we developed to support expert assessments and internal audits. This agent uses Gen AI to structure complex, raw feedback from domain experts and transform it into clear, actionable insights for leadership and management teams.
Together with Gen AI, I believe AI will continue to transform traditional workflows. Examples include ML-powered lead scoring in marketing and insurance, and fault detection on production lines in manufacturing. Predictive maintenance is another critical trend, allowing businesses to reduce downtime and cut operational costs. In industries relying on visual data, computer vision continues to unlock efficiencies in quality control, safety monitoring, and process improvement. From personal experience, one of the strongest computer vision use cases is car damage detection in insurance.
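As a hedged illustration of the predictive-maintenance and fault-detection ideas, not a production setup, an unsupervised anomaly detector can flag sensor readings that drift away from normal operation. The sketch below uses scikit-learn's IsolationForest; the sensor features and values are assumptions.

```python
# Minimal predictive-maintenance sketch: flag anomalous sensor readings
# that may precede equipment failure. Feature names and values are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated healthy operation: vibration (mm/s) and bearing temperature (°C)
normal_readings = np.column_stack([
    rng.normal(2.0, 0.3, 500),   # vibration
    rng.normal(65.0, 2.0, 500),  # temperature
])

detector = IsolationForest(contamination=0.01, random_state=42).fit(normal_readings)

# New readings from the line; the last one drifts well out of range
new_readings = np.array([[2.1, 66.0], [1.9, 64.5], [4.8, 83.0]])
flags = detector.predict(new_readings)  # -1 = anomaly, 1 = normal

for (vibration, temperature), flag in zip(new_readings, flags):
    status = "schedule inspection" if flag == -1 else "ok"
    print(f"vibration={vibration:.1f} temp={temperature:.1f} -> {status}")
```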

What new AI use cases are gaining traction across your client base?
Again, I’d say LLMs are getting the most attention, but now combined with RAG (retrieval-augmented generation). This framework integrates the generative abilities of LLMs with the precision of information retrieval systems. Since RAG draws information from external custom sources, frequently the company’s own database, the resulting chatbots provide more reliable outputs. In one of our recent case studies, the Intelliarts team built a domain-specific chatbot using the GPT-4 model. Grounded in the NGO’s data, this AI agent could carry contextual conversations exclusively around gun safety, our customer’s key topic. It also mirrored the organization’s tone of voice and terminology.
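To make the RAG pattern concrete, here is a minimal sketch, not the chatbot from the case study: embed a small custom knowledge base, retrieve the most relevant snippets for a question, and ground the LLM's answer in them. It assumes the official `openai` Python client; the model names and sample snippets are illustrative.

```python
# Minimal RAG sketch: retrieve relevant snippets from a custom knowledge base,
# then ground the LLM's answer in them. All data and model names are examples.
import numpy as np
from openai import OpenAI

client = OpenAI()

knowledge_base = [
    "Store firearms unloaded and locked in a safe, separate from ammunition.",
    "Always treat every firearm as if it were loaded.",
    "Use a trigger lock or cable lock when a safe is not available.",
]

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

doc_vectors = embed(knowledge_base)

def answer(question: str, top_k: int = 2) -> str:
    q_vec = embed([question])[0]
    # Cosine similarity between the question and each knowledge-base snippet
    scores = doc_vectors @ q_vec / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n".join(knowledge_base[i] for i in np.argsort(scores)[-top_k:])
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Answer only from the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("How should I store a firearm at home?"))
```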
How are you helping clients future-proof their AI investments?
For one thing, we align any AI initiatives with our clients’ long-term business goals. It’s easy to get caught up in the hype around new models, but the real value comes when AI solutions are based on specific use cases that drive measurable results. In one case, a client considered using an LLM, but after evaluating the requirements, we recommended a classical machine learning (ML) approach instead. It was a better fit for this particular task, more cost-efficient, and easier to maintain.
For another thing, we build AI solutions that can scale, both from a technical and operational perspective. This way, we ensure our AI solutions can grow together with the company and adapt as new needs arise.
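To illustrate the LLM vs. classical ML point above, the sketch below shows what a "classical" alternative can look like: a TF-IDF plus logistic regression text classifier built with scikit-learn. It is a hypothetical example, not the client's actual solution; the ticket categories and training texts are assumptions.

```python
# Hypothetical classical-ML baseline (not the client's solution): route support
# tickets with TF-IDF features and logistic regression, cheap to run and maintain.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set; a real project would use labeled historical data
texts = [
    "Invoice amount is wrong, please correct the billing",
    "I was charged twice this month",
    "The app crashes when I open the dashboard",
    "Login page shows a 500 error",
]
labels = ["billing", "billing", "technical", "technical"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["Why was I charged twice for my subscription?"]))  # likely ['billing']
```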
What do you think will separate successful AI adopters from the rest?
The key differentiator is how strategically and thoughtfully businesses integrate AI. Some treat it as an experiment or a quick fix, when it’s better approached as a long-term capability. Starting with a pilot AI project is fine, and often wiser than going all in, but in the end it’s important to set clear objectives, build strong data foundations and infrastructure, and then scale up.
Another major factor is knowing when and how to use the right technology. As mentioned earlier with the LLM vs. classical ML example, sometimes the best solution isn’t the trendy one. This is where reaching out to AI experts helps most.
What’s the biggest challenge most companies face when trying to scale AI?
I’d say the biggest challenge is that scaling AI requires not just technical capability but also organizational readiness and a commitment to embedding AI into business processes. This means fostering cross-functional collaboration between data scientists, software engineers, domain experts, and business leaders so that AI solutions deliver real business value.
Take manufacturing, for example: implementing predictive maintenance goes beyond an ML task. It involves coordination between production teams, IT, and operations to integrate AI insights into daily workflows and adapt existing processes.
Are clients investing in internal innovation labs or AI centers of excellence?
Yes, many of our clients are investing in AI centers as part of their broader digital transformation strategies. The main purpose of these hubs is to centralize AI expertise, accelerate the adoption of successful use cases, and promote trustworthy AI. That said, the most effective ones aren’t isolated R&D labs. Instead, they’re deeply connected to business units, aiming to solve real business challenges and deliver measurable outcomes.