Cognizant Blog

Whether in banking, insurance or any other industry, the pressure to improve the customer experience is immense. How can an organization engage more closely with consumers and become more responsive to their needs? By using data strategically through AI and ML – and a solid data foundation is essential to making that work.

In most organizations, every part of the business is rapidly becoming a data-driven play. All aspects of the business generate information that can help develop and improve services – for those who manage to harvest it. Doing so is crucial to success in today's marketplace.

Insurers, for example, want to understand claims data, improve loss prevention, make sure customers are correctly insured, and learn from experience to strengthen the customer experience. Now, with the growing number of connected devices and vehicles, there are immense possibilities to learn from data and develop new, profitable business models.

Within banking, the needs likewise revolve around getting to know customers better. Among other things, banks want to understand customer life moments so they can service financial needs just in time, build new ecosystems with fintech partners, and improve risk management with predictive capabilities.

Get the foundation right

Yet the majority of organizations are not leveraging enough data to become truly digital and data-driven through AI and ML. What's stopping them? Typically, too many sources, siloed information, and a mixture of structured and unstructured data make it hard to extract real value from the data that already exists. Other success drivers may also be missing: executive support, the ability to retain and nurture talent, and attention to organizational change and human-computer collaboration.

Even though AI and ML are business enablers, we need to get a bit techie. Utilizing AI and ML technologies at scale requires a functional data foundation. This is commonly viewed as an infrastructure issue rather than a business matter, and so it may not be prioritized as one – but it should be. A solid data foundation can enable new revenue opportunities, grow existing business and improve your bottom line.

What you need is an engine that can capture any type of data, generate insights through analytical models, and integrate those insights with core processes in real time. The engine should also support processing data from live events. On top of that, you need the capability to scale your data-driven initiatives beyond the proof-of-concept stage, and to build collaboration around data within your business teams by giving them easy access to the data and to tools for experimenting with it.
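As a minimal sketch, the capture-score-integrate loop described above might look like this in miniature. Everything here is illustrative: the event fields, the trivial scoring rule standing in for a trained analytical model, and the callback representing a core business process.

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Event:
    customer_id: str
    kind: str          # e.g. "claim_filed", "card_payment" (illustrative)
    amount: float

def score(event: Event) -> float:
    # Stand-in for a trained analytical model; here a hand-written rule.
    return 1.0 if event.kind == "claim_filed" and event.amount > 10_000 else 0.1

def process(events: Iterable[Event], act: Callable[[Event, float], None]) -> None:
    # Capture each live event, score it, and hand the insight to a core process.
    for event in events:
        act(event, score(event))

flagged: list[str] = []
process(
    [Event("c1", "claim_filed", 25_000), Event("c2", "card_payment", 40.0)],
    lambda e, s: flagged.append(e.customer_id) if s > 0.5 else None,
)
print(flagged)  # ['c1']
```

In a real platform the event source would be a streaming system and the model a deployed ML artifact, but the shape of the loop – capture, score, act – is the same.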

Is a data lake the answer?

Traditional data platforms, such as relational databases, may not be able to meet many AI needs. They rely on rigid technology infrastructure to capture mostly internal data in predefined formats, and that has become insufficient. Identifying the right data infrastructure – big data platforms, data lakes and other modern data platforms – is therefore key to building the right data foundation.

From my experience, a data lake is a good option. Data lakes are low-cost data storage environments that use commodity hardware and an integrated stack of open-source data and analytics tools. Most cloud vendors offer strong commercial packages around them, including storage, compute, utilities, and data science workbenches and services.

A data lake can store vast amounts of structured and unstructured data for AI purposes and helps drive more value from existing data assets by combining, analyzing and using traditional and new types of data. It also helps democratize data by providing enterprise-wide access to information.
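To make the "combining traditional and new types of data" point concrete, here is a toy sketch that joins a structured policy table with unstructured claim notes, the way two such datasets might sit side by side in a lake. The data, fields and the derived feature are all invented for illustration.

```python
import csv
import io

# Structured policy data (CSV) and unstructured claim notes (free text),
# keyed by customer. Both are illustrative stand-ins for lake files.
policies_csv = "customer,premium\na,1200\nb,800\n"
notes = {
    "a": "water damage in kitchen, recurring issue",
    "b": "single windscreen chip",
}

combined = []
for row in csv.DictReader(io.StringIO(policies_csv)):
    note = notes.get(row["customer"], "")
    # Derive a simple analytical feature from the unstructured text.
    row["recurring"] = "recurring" in note
    combined.append(row)

print(combined)
```

Real pipelines would use NLP rather than a keyword test, but the value comes from the same move: enriching structured records with signals mined from unstructured data.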

Simply storing the data is not enough. You also need to work with it: prepare it, experiment with it, and finally make it AI-ready so your teams can build analytics products on top of it (e.g., predictive models). Roughly 25–30 different technology components (many of them open source) need to work in tandem. This is by no means an easy feat, and you need an experienced team that can scale your data pipelines in production.
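The prepare-then-model step above can be sketched end to end in a few lines. This is a deliberately tiny pipeline with invented records; the imputation rule and the "model" (flagging customers with above-average claim counts) are placeholders for real feature engineering and a trained predictive model.

```python
import statistics

# Hypothetical raw records as they might land in a lake: mixed types, gaps,
# and garbled values.
raw = [
    {"customer": "a", "claims": "2", "premium": "1200"},
    {"customer": "b", "claims": None, "premium": "800"},
    {"customer": "c", "claims": "5", "premium": "bad-value"},
]

def prepare(records):
    # Data preparation: coerce strings to numbers, impute missing or
    # unparseable values with 0.0 so downstream models get clean input.
    def to_float(value):
        try:
            return float(value)
        except (TypeError, ValueError):
            return 0.0
    return [
        {"customer": r["customer"],
         "claims": to_float(r.get("claims")),
         "premium": to_float(r.get("premium"))}
        for r in records
    ]

clean = prepare(raw)
# Toy "predictive model": flag customers whose claim count exceeds the mean.
mean_claims = statistics.mean(r["claims"] for r in clean)
at_risk = [r["customer"] for r in clean if r["claims"] > mean_claims]
print(at_risk)  # ['c']
```

The point is the shape of the work: most of the code is preparation, and the model sits on top of it – which is why the data foundation, not the model, is usually the hard part.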

Some advice on the way

Cognizant has been involved in several such data modernization projects – among them for an insurance company and a Nordic-Baltic banking group – as part of preparing to realize AI and ML initiatives. All in all, our experience from across the globe points to some distinct make-or-break factors:

  • Get the internal organization on board and put effort into democratizing the use of data with modern techniques. 
  • Be ambitious; pick business drivers and use cases that will make an impact rather than low-hanging fruit, which sometimes demands similar effort but doesn't deliver results sharp enough to enthuse the business.
  • Find the center of gravity and pinpoint accountability for building a platform that enables business teams to experiment and build data products on their own – otherwise, it won't happen. 
  • Go all the way. Building a good predictive model is just 10–15 percent of the work. Think about applying the insights in your business processes. This is not just a frontend integration issue but, more importantly, a change management one: business teams need to learn to work with machine intelligence. 
  • Make a plan to revisit key design and tech decisions so that you are continuously improving and making your foundation future-proof.

Businesses around the world have realized that AI will play a key role in disrupting industries globally. If you get the required foundational enablers in place quickly, you're likely to be among the winners in your industry. 


Cognizant Nordics

Our experts contribute exciting insights into what is going on within technology and innovation.


