LLM series - Nerdio: At the coalface of AI product management 

This is a guest post for the Computer Weekly Developer Network written by Stefan Georgiev in his position as senior product manager at Nerdio.

Nerdio is known as a software provider for organisations that want to manage and cost-optimise cloud-native technologies, with Microsoft as its core focus.

The company is highly focused on IT team skills development for Microsoft Azure, Windows 365, Azure Virtual Desktop and Microsoft Intune, among other key technologies.

Georgiev writes in full as follows…

When we consider the question of which LLM to start with, as a company, we can only comment on the one we used, which was OpenAI.

In evaluating project decisions, the product team at Nerdio and I always consider the following factors: internal expertise, compatibility with our existing technology stack and the resources required to develop a Minimum Viable Product (MVP). 

For us, our specialisation is Azure Virtual Desktop (AVD) and we are a Microsoft partner, so (perhaps logically) the decision was OpenAI on Azure. This streamlined our decision-making process for the first two dimensions. Regarding the effort dimension, utilising a foundational model from OpenAI considerably reduced the time and cost involved in developing our MVP.

A fundamental shift is underway

People often ask whether working with LLMs requires a wider grounding in Machine Learning – it’s a fair question, so let me explain where we stand on this.

The answer is yes, a solid grounding in Data Science (DS) and Machine Learning (ML) can be helpful. 

In our product development efforts, expertise in DS and ML has proven beneficial in several ways. Firstly, there is a lot of information and hype in the DS/ML space, so our team’s expertise helps us filter out information that is irrelevant to our objectives. Secondly, DS/ML expertise is needed for designing solutions, evaluating trade-offs during implementation and assessing the quality of outcomes. Finally, Artificial Intelligence (AI), DS and ML are not just an emerging trend but a fundamental shift. These domains are swiftly becoming essential competencies for any IT organisation or department.

When people ask whether we should all be using closed source or open source LLMs and what the essential difference is, the answer really depends on the stack the software vendor is using. We are a Microsoft partner and we use Azure and OpenAI, but in general the choice will depend on the product or feature context.

Then, of course, there is the question of whether there are any innate challenges in using LLMs. Here we would say hallucination, non-repeatable results and staying grounded in the topic are the challenges we struggle with most in our LLM projects.

Identifying hallucinations

In our most recent project, we have seen instances where we can identify hallucinations and a lack of repeatable results. There are ways to mitigate these problems, though they make implementation more complex, delay time to market and complicate support.
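To illustrate the kind of mitigations in play, here is a minimal sketch of how repeatability and grounding are commonly addressed when calling a chat-completion endpoint: pin `temperature` to zero, pass a `seed` for best-effort reproducibility, and constrain the model to supplied context via the system prompt. This is a generic illustration, not Nerdio’s actual implementation; the model name, prompts and context are placeholders.

```python
def build_grounded_request(question: str, context: str) -> dict:
    """Assemble request parameters that favour repeatable, grounded answers.

    The model/deployment name below is an assumption for illustration only.
    """
    return {
        "model": "gpt-4o",  # hypothetical Azure OpenAI deployment name
        "temperature": 0,   # favour deterministic, repeatable outputs
        "seed": 42,         # best-effort reproducibility across calls
        "messages": [
            {
                "role": "system",
                "content": (
                    "Answer ONLY from the context below. If the answer is "
                    "not in the context, say you do not know.\n\n"
                    f"Context:\n{context}"
                ),
            },
            {"role": "user", "content": question},
        ],
    }

params = build_grounded_request(
    "What does the product support?",
    "Placeholder context retrieved from product documentation.",
)
# The actual API call (e.g. client.chat.completions.create(**params))
# is omitted here, since it requires credentials and a live endpoint.
```

Neither `temperature=0` nor `seed` guarantees identical outputs, and prompt-level grounding does not eliminate hallucination; they simply narrow the space of answers, which is why such mitigations still add testing and support overhead.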

Looking to the future, many people ask whether LLMs will become more task-specific and industry-specific – our stance on this is yes and no. 

We do see a trend among companies to create specialised ‘plugins’ for Large Language Models (LLMs) tailored for specific industries or verticals. This approach will likely involve incorporating industry-specific data and leveraging Subject Matter Expertise (SME). Recent announcements from Microsoft at the Ignite Conference for example show investments in tools to facilitate this. At the same time, there is a need for generic LLMs capable of addressing topics that span multiple industries or verticals, particularly for themes that defy straightforward categorisation.

Finally, let’s cover the question of whether we will ever be in a position where LLMs are fully integrated into enterprise software.

In the context of Nerdio and our products, the short answer is yes. The product team and I are investing one-third of our time and engineering bandwidth into integrating AI into our products. I am not sure who originally said this, but I think it answers these questions nicely – “AI will be like electricity, it will be everywhere and used for a lot of different purposes.”
