
Mphasis has been at the forefront of the AI revolution, offering services that include integrating LLMs across several businesses and their use cases. With its latest venture, Mphasis.AI, the company is poised to reshape the existing market and infuse AI into numerous business scenarios. The organization has already developed a range of customer-centric services and is primed to offer various solutions to its clients and the market in general.

Today, we have the privilege of interviewing Anup Nair, Managing Partner and Global CTO at Mphasis.AI. Anup’s remarkable journey has earned him recognition as one of the top 30 CTOs in India. In this interview, we delve into his inspiring story of innovation and leadership and explore insights and experiences from his career. So, let’s embark on this enlightening conversation.


Full Interview

1) Can you explain what Mphasis.AI is and your role in the organization?

Sure. So, Mphasis.AI is a business unit that we launched at the beginning of June. The sole purpose behind the unit is to bring AI solutions to the complex business problems our customers are facing and to help them in their journey of business transformation.

Furthermore, one can imagine Mphasis.AI as a platform that we and our partners use to provide services to our customers.

Talking about myself, I am the CTO for Mphasis.AI, and my primary objective is to make sure we bring technological innovation and the right kind of solutions to our customers.

2) What kind of AI services does Mphasis.AI provide to the customers?

We offer a full range of solutions. We start with AI blueprinting, where we talk to our customers to understand their requirements. We use evidence-based design, which is our IP, to understand and dissect issues in their business processes and blueprint a roadmap for them. We call this our advisory services.

From here, we go about helping them in executing the blueprint, which is what we call a factory model. In this, we implement an AI factory for our customers. The objective of this factory is to execute the blueprint and bring AI products to the market from the customer's perspective.

We do things like LLM application development, fine-tuning of LLMs, prompt engineering, conversation design, and many similar services that make the factory successful.

Outside of this, we also have some products we’ve always had. For example, we have a product called Winsure, an insurance policy administration platform. We have infused AI into the platform, and we offer that to our customers.

Similar to that, we have something called InfraGenie. It is an infrastructure service desk automation platform that sits on top of our partner ServiceNow. We and our partners together are reimagining this platform with an AI lens.

We intend to bring automation into IT operations for our customers, and then there are mortgage operations. We are the 11th largest mortgage originator in the United States; we operate through Digital Risk, a company we acquired a few years back. As part of that, we handle mortgage origination and servicing for some of the largest banks in the world.

With this service, we manage the servicing, and we have an entire suite of platforms that make it happen. So, we’re bringing artificial intelligence into this model in mortgage operations to accelerate the underwriting process, the mortgage origination process, and so on.

We will be offering a full range of services, i.e., end-to-end AI platforms and end-to-end solutions around our advisory services, LLM fine-tuning, etc. Additionally, we will be bringing product-based solutions to the market.

3) What exactly is generative AI according to you, and how is Mphasis.AI contributing to this technology?

It’s an important question, and it's necessary to bring clarity to it upfront. I have at least three customer-level conversations every week just to give senior executives a basic understanding of what generative AI is. There is so much material available on the web, and that is not always the right way to look at it.

In simple terms, generative AI is a subfield of deep learning, which in turn is a subfield of machine learning. Machine learning is the way AI models are created, through the many algorithms available to make AI happen; deep learning belongs to the field of machine learning, and generative AI is a subfield of deep learning.

The objective of generative AI is the generation of relevant, brand-new, original content based on what it learns from the patterns in the data you provide. It is primarily focused on content generation, unlike traditional models that primarily work on prediction and classification problems. It can have human-like conversations where you can do Q&A as you would with a human, and it will generate content for you. So that’s generative AI in its simplest layman form.

To further explain, if I go to a bank and ask for my account summary, a bot is typically in charge of answering it. Now, bots are very decision-driven: this is the question, hence this is the answer. However, if I apply generative AI in this situation, it becomes a conversation; it is more human-like, and it originates new content. So, if I ask for my account summary, it can proactively predict and give information beyond the summary to recommend and even cross-sell things.

Generative AI can do that through its models and the training it has undergone on data.
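To make the distinction concrete, here is a minimal, hypothetical sketch (not Mphasis code) contrasting a decision-driven bot, which maps a question to one of a fixed set of intents and returns a canned answer, with a generative model that produces new conversational text. The model names are illustrative placeholders from the Hugging Face hub, not what any bank actually runs.

```python
# Illustrative only: contrasts intent classification with text generation.
from transformers import pipeline

question = "Can I get my account summary?"

# 1) Decision-driven bot: classify the question into a fixed intent,
#    then return a pre-written answer.
intent_classifier = pipeline("zero-shot-classification",
                             model="facebook/bart-large-mnli")
intents = ["account summary", "card block", "loan enquiry"]
top_intent = intent_classifier(question, candidate_labels=intents)["labels"][0]
canned_answers = {
    "account summary": "Your account summary is available on the dashboard.",
    "card block": "Your card has been blocked.",
    "loan enquiry": "Please visit the loans page.",
}
print("Decision-driven bot:", canned_answers[top_intent])

# 2) Generative model: produce brand-new text instead of picking a canned
#    reply (a small open model stands in for a production LLM).
generator = pipeline("text-generation", model="gpt2")
reply = generator(f"Customer: {question}\nAssistant:", max_new_tokens=40)
print("Generative bot:", reply[0]["generated_text"])
```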

4) What models are being offered under the umbrella of Mphasis.AI?

Well, there is a way to look at it. Everything that we do is driven by the customer’s perspective. Every model we have provides a service for a certain kind of problem that our customers face.

To further explain, if it's not a customer's IP, we make it available through hyperscalers on the public market. On the other hand, if it is a customer's IP, we don’t make it available. If one of our customers does not want to take an off-the-shelf large language model and wants to build their own, we’ll build that large language model for them.

For example, there is a large specialty drug provider we’ve recently engaged with. We’re working with them to solve the problem of drug advisory, which is one of the services they provide to patients. They want to automate that advisory using generative AI.

The application we are building for them does not use a custom large language model. In this case, we’re using the GPTs of the world and not even fine-tuning large language models for them. So this is the first kind of customer.

The second kind of customer wants us to fine-tune some of these models as per their business context.

Furthermore, logistics is the second largest revenue generator for Mphasis. We use a Hugging Face model called Plentify for one of our largest logistics clients. It’s an open-source model, and we’re fine-tuning it to better predict HS codes, which are used for export and customs clearance activities when a package is shipped from one place to another. With the fine-tuned model, we get better prediction capabilities than GPT.

GPT is a much larger model than this specific business requirement needs. We’ve taken the Plentify model and fine-tuned it to make it happen.
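As an illustration of what such fine-tuning can look like, here is a minimal sketch that frames HS-code prediction as text classification with the Hugging Face transformers library. The base checkpoint, CSV files, and column names are hypothetical placeholders, not the client's actual model or data.

```python
# Hypothetical sketch: fine-tune an open-source checkpoint to map product
# descriptions to HS-code classes. Data files and names are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "distilbert-base-uncased"  # stand-in open-source checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)

# Expected columns: "description" (text) and "label" (integer HS-code class).
data = load_dataset("csv", data_files={"train": "hs_train.csv",
                                       "test": "hs_test.csv"})
num_labels = len(set(data["train"]["label"]))

def tokenize(batch):
    return tokenizer(batch["description"], truncation=True, padding="max_length")

data = data.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(base,
                                                           num_labels=num_labels)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="hs-code-model",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=data["train"],
    eval_dataset=data["test"],
)
trainer.train()  # the fine-tuned classifier can then be compared against a general LLM
```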

The third type of customer is rare; only large enterprises can afford to build these large language models themselves and build things on top of them.

There’s an upcoming trend of building small language models. A few customers are doing it, for example, Bloomberg, which uses BLOOM. This work is very resource-intensive; however, that is not the biggest problem. The bigger problem is collecting a large corpus of data; you need a massive volume of data to train these models.

We have 250 models out there on the AWS Marketplace, for example, classification and prediction models built from various sources. There are many open-source technologies that we have used to provide custom-built solutions. I don’t know if I answered your question, but that’s the best way to address this one.

To conclude this question, GPT is not the answer to everything. There are certain situations where you may need to use multiple models to actually build an application, and in that case, you need to orchestrate across these models. There has to be a layer of orchestration that sits above all this. We built that layer of orchestration, and that is how we address the situation.
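The details of Mphasis's orchestration layer aren't public, but here is a minimal, hypothetical sketch of the idea: a router that inspects each request and dispatches it to the most suitable model, so an application can combine a fine-tuned classifier with a general-purpose LLM. All names and routing rules below are illustrative assumptions.

```python
# Illustrative orchestration layer: route each request to the right model.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ModelHandler:
    name: str
    handler: Callable[[str], str]  # takes a prompt, returns a response

def route(prompt: str) -> str:
    """Toy routing rule; a real system might use a classifier model here."""
    return "hs_code_classifier" if "hs code" in prompt.lower() else "general_llm"

class Orchestrator:
    def __init__(self, models: Dict[str, ModelHandler]):
        self.models = models

    def run(self, prompt: str) -> str:
        chosen = self.models[route(prompt)]
        return f"[{chosen.name}] {chosen.handler(prompt)}"

# Each handler would wrap a real model or hosted API call in practice.
orchestrator = Orchestrator({
    "hs_code_classifier": ModelHandler("fine-tuned-classifier",
                                       lambda p: "Predicted HS code: 8471.30"),
    "general_llm": ModelHandler("hosted-llm",
                                lambda p: "Here is a generated answer..."),
})
print(orchestrator.run("What is the HS code for a laptop shipment?"))
print(orchestrator.run("Summarize this shipping policy."))
```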

5) Can you tell us about the different initiatives of Mphasis and its thought leadership in brief?

Well, thought leadership is a part of everything we do. We have initiatives like Sparkle, NextLabs, etc., primarily for thought leadership. One of the ways we can win is through differentiation, not by trying to be capable of doing everything.

Our competitors have a large capacity of people to offer. Our way of winning is through differentiation, by putting thought leadership into everything.

Talking about the platforms mentioned earlier, Sparkle does two things. First, it brings contextual innovation to our customers. We go to a customer and identify the problems they’re dealing with wherever research is needed. We invest in those problems ourselves and solve them for the customer, thereby freeing up their bandwidth and bringing new ideas into the situation. So, this is one part of Sparkle.

The other part is bringing industry-leading startup technologies to our customers. The objective is to talk to various emerging startups in different areas, not just AI. We do proofs of concept and pilots with them, and once we vet them out, we bring them to our customers, which helps solve some of their problems.

Talking about NextLabs, it is a very AI-focused lab that has been operating for almost eight to nine years, since the time cognitive AI became important. It is a lab of data scientists and data science engineers who work together to innovate on various use cases.

For example, it looks at problems out there in the industry and tries to solve them using those 250-plus models that I talked about. NextLabs continuously builds these products, and they are launched on hyperscaler marketplaces.

The majority of innovation that we did in generative AI was a product of research that was done in NextLabs. So, these are our two vehicles of innovation.

6) Can you tell us about the platforms Sparkle and NextLabs?

As discussed earlier, Sparkle has been created for customer-contextual innovation, while NextLabs is for AI. Contextual innovation is important because the majority of vendors out there provide blue-sky innovation; nobody tries to solve the customers' actual problems.

7) Is there any advice that you would like to give to all the entrepreneurs and startups that are trying to make it in the AI industry?

Well, there are startups that are working very precisely to solve customer problems. I would say the opportunity is humongous. Every business process is up for disruption because of generative AI; in fact, almost everything we see around us is applicable these days.

You can bring in transformative solutions using AI to solve business problems. However, many times, we have a hammer, and we try to look for a nail to hit.

We should not fall into that trap. Try to identify problems worth solving. Just because you have the tech does not mean you need to go and build a startup around it.

The biggest thing I’m seeing with startups is that they have the tech and are trying to apply it to any problem. I would look for business problems, where there are ample opportunities. As of now, every business is undergoing transformation, so the opportunity is huge. Simply find and solve that problem.

8) On a closing note, what are the future plans for Mphasis.AI, and what kind of impact can we see in the industries and economies around the globe?

We’ve already discussed that we do everything for our customers. We are trying to make our customers’ lives better. We service some of the top banks, top logistics companies, top finders, Fortune 500 companies, etc., in this world.

We do have a direct impact on their top and bottom lines, which in turn has an impact on the global economy, since they are among the top 500 companies in the world. From an AI perspective, if I take the Mphasis.AI lens and apply it to the solution archetypes we have been using for the last many years, we are in the process of infusing AI into those archetypes, or reimagining them, with a generative AI lens.

When you put generative AI into a business situation, that situation itself may no longer exist in the same form. Therefore, the business process you look at can be completely reimagined with generative AI. It’s not just adding generative AI to a process; it’s actually reimagining the process with the technology.

This thinking means that all the solutions we bring to the table for those solution archetypes need to be reimagined, as a roadmap. Over the next year, we are in the process of getting those archetypes reimagined and then bringing them to our customers.

From an innovation perspective, we are building a platform that we will soon roll out. It will carry transformative AI models and will make a real impact on the top and bottom lines of our customers.

It means there will be an impact, and it will be global. I mean, there are three perspectives right now. One is from an agility point of view. Businesses will become faster, and we will enable them to become faster.

Second is delivering great experiences to customers. With this, banks, healthcare companies, insurance companies, etc., can deliver better customer experiences.

Third, we can make it extremely cost-efficient. So, if we do these three things using Mphasis.AI over time, we would make this kind of an impact on the business and, therefore, on the world, obviously, in a responsible fashion.

That last phrase is very important, especially in this industry, i.e., a responsible fashion. This is especially the case when content is being pushed out to the masses. AI is in good hands, so there is no need to fear or be afraid of it.

Follow the incredible journey of Anup Nair and connect with him on LinkedIn. To stay notified of all our latest interviews, subscribe to MobileAppDaily. To read our existing MobileAppDaily interviews, click on the link provided.
