iSPIRT works to transform India into a hub for new-generation software products by addressing crucial government policy, creating market catalysts, and growing the maturity of product entrepreneurs. Welcome to the Official Insights!
In this recent OpenHouse, Sagar Parikh discusses with Dr. Ravi Modani how democratizing credit through short-tenor, small-ticket loans can help finance Indian MSMEs, 99% of which are micro-enterprises. Dr. Modani shares his insights and invaluable guidance for navigating the complex world of B2B financing for MSMEs.
He also delves deep into the challenges MSMEs face in accessing financing, particularly in the realm of B2B transactions. Drawing from his extensive experience and research, he offers a fresh perspective on the traditional lending landscape and presents innovative solutions to empower MSMEs.
Key Insights from the Video:
The MSME Financing Dilemma: Dr. Modani highlights the significant hurdles that MSMEs encounter when seeking short-tenor and small-ticket loans. He emphasizes the need for a paradigm shift in lending practices to better serve the unique needs of these businesses.
A New Way of Financing for MSMEs: Dr. Modani advocates a pioneering financing approach for MSMEs built on short-tenor, small-ticket loans. Because these loans revolve throughout the year, lenders can disburse a higher volume of loans from the same book. Consequently, lenders can potentially amplify their AUM by up to 8 times, surpassing the typical 5-6 times AUM ratio associated with traditional lending practices.
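To see why short tenors amplify disbursement on the same book, here is a back-of-the-envelope sketch. The tenors and book size are hypothetical illustrations of the revolving effect, not figures from the session:

```python
# Illustrative sketch: how revolving short-tenor loans amplify annual
# disbursement relative to the same static loan book (AUM).
# All numbers are hypothetical.

def annual_disbursement(book_size: float, avg_tenor_months: float) -> float:
    """Total amount a lender can disburse in a year if the book
    fully revolves every `avg_tenor_months`."""
    cycles_per_year = 12 / avg_tenor_months
    return book_size * cycles_per_year

book = 100.0  # hypothetical loan book (AUM), in crores

# A 1.5-month average tenor revolves ~8 times a year,
# versus ~2 times for a 6-month book.
short_tenor = annual_disbursement(book, 1.5)   # 800.0, i.e. 8x the book
longer_tenor = annual_disbursement(book, 6.0)  # 200.0, i.e. 2x the book
print(short_tenor / book, longer_tenor / book)
```

The same capital, redeployed more often, is what drives the higher multiple.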
Comparing Financial Platforms: Dr. Modani provides a comprehensive comparison between TReDS and OCEN, offering insights into the advantages of leveraging public networks like OCEN for enhanced interoperability and accessibility.
The Power of Public Networks: Leveraging platforms like OCEN and GeM can significantly reduce operational costs for lenders, ultimately leading to lower lending costs and improved efficiency. Dr. Modani illustrates how these public networks can drive down the cost of lending, benefiting both lenders and borrowers alike.
The Time Sensitivity of MSME Financing: Dr. Modani underscores the time-critical nature of MSME lending and stresses the importance of streamlining the loan journey to ensure timely access to funds for businesses.
His illustrations and learnings help in navigating the complex world of MSME financing by embracing innovative approaches. He believes that leveraging public networks like OCEN opens multifold opportunities to lenders, helping them unlock new growth and success in today’s lending landscape.
In our most recent OpenHouse, we embark on an insightful exploration of the transformative landscape in MSME lending, featuring Bhavik Vasa, the Founder of GetVantage, and Sagar Parikh. The conversation delves into the potential of creating groundbreaking impact through interoperable networks, particularly focusing on OCEN. The discussion navigates the dynamic intersection of finance and technology, highlighting how inventive solutions are reshaping the lending panorama. Emphasizing the crucial role of interoperability, the dialogue underscores its significance in bridging the credit gap, propelling the MSME sector into a new era of unprecedented growth.
Key Takeaways:
Network Effects Unleashed: OCEN catalyzes network effects, narrowing the credit gap and expanding the market, fostering inclusivity and vibrancy.
Efficiency through Interoperability: Standardized protocols cut costs and efforts, providing high-quality data for lenders while empowering MSMEs with smoother access to loans.
Addressing Unmet Needs: Explore how interoperable networks bridge gaps in unsecured lending, catering to shorter tenures and smaller loan sizes.
Tech-Enabled Business Growth: Witness the role of unsecured lending in a tech-driven landscape, fostering a circular consumption economy for economic growth.
Personalized FinTech Solutions: Bhavik advocates for a borrower-centric approach, urging lenders to view lending through a tech and data-driven lens, benefiting both parties.
Collaboration Dynamics: Conclude with insights on how NBFCs and banks can coexist and collaborate, playing to their strengths for a more robust lending environment.
Ready to unlock the future of MSME lending? Join the conversation now!
Intermediaries and FinTechs have played an important role in the lending ecosystem, but the impact is mostly seen in consumer lending and not so much in MSME lending, especially for unsecured, small-ticket, and short-duration loans. What are the missing pieces in the lending process where advanced tech and a mindset shift, combined with a digital infrastructure like OCEN (Open Credit Enablement Network), can unlock this credit supply for MSMEs?
Recently, we hosted Lizzie Chapman in an insightful conversation with Sagar Parikh. She shared her views on where the intermediaries and FinTechs can further become a value add in a profitable manner by pushing the boundaries of technology.
Points discussed:
Digital infrastructure & its impact on the costs, penetration & process for lending eco-system
Unsecured MSME loans are not as well solved as unsecured consumer loans; cashflow lending addresses the concerns around unsecured lending to MSMEs.
DPI such as OCEN facilitating the availability, quality, aggregation of data for credit underwriting along with loan disbursement for MSMEs
Need for Intermediaries & Fintechs to harness technology to conceptualise innovative lending products, advanced ways of pricing and matching risks & address the opex challenges in collections & repayments
Investors tend to prefer businesses that touch the customer end to end. They should see that being part of the value chain can be as profitable as owning the value chain.
OCEN is creating dispute resolution mechanisms but intermediaries should also innovate for transparency and building trust with the customers so as to enable a safe, stable, secure growth in short term cashflow lending for MSME credit.
This is the 4th blog in a series describing the importance of DPI for AI, a privacy-preserving techno-legal framework for AI data collaboration. Readers are encouraged to first go over the earlier blogs for better understanding and continuity.
We are at the cusp of history with regard to how AI advancements are unfolding and the potential to build the man-machine society of the future economically, socially, and politically. There is a great opportunity to understand and deliver on potentially breakthrough business and societal use cases while developing and advancing foundational capabilities that can adapt to new ideas and challenges in the future. The major startups in Silicon Valley and Big Tech are focused first on bringing the advancements of AI to first-world problems – optimized and trained for their contexts. However, we know that first-world solutions may not work in the diverse and unstructured contexts of the rest of the world – and may not work even for all sections of the developed world.
Let’s address the elephant in the room – what are the critical ingredients an AI ecosystem needs to succeed? Data, an enabling regulatory framework, talent, compute, capital, and a large market. In this open house, we make the case that India excels in all these dimensions, making it a no-brainer – whether you are an investor, a researcher, an AI startup, or a product company – to come and build in India for your own success.
India has one of the most vibrant, diverse, and eager markets in the world, making it a treasure chest of diverse data at scale, which is vital for AI models. While much of this data happens to be proprietary, the DPI for AI data collaboration framework makes it available in an easy and privacy-preserving way to innovators in India. Literally no other country has such a scale and game plan for training data. One may ask: diversity and scale are indeed India’s strengths, but where is the data? Isn’t most of our data with US-based platforms? In this context, there are three types of data:
a. Public Data, b. Non-Personal Data (NPD), and c. Proprietary Datasets.
Let’s look at health. India has far more proprietary datasets than the US; they are just frozen in the current setup. Unfreezing them will give us a play in AI. This is exactly what DPI for AI is doing – in a privacy-preserving manner. In the US, health data platforms like those of Apple and Google are entering into agreements with big hospital chains to supplement the user health data that comes from wearables. How do we better that? Theirs is the US Big Tech-oriented approach – not exactly an ecosystem approach. Democratic unfreezing of the health data held by hospitals is the key today. DPI for AI would do that for all – small or big, developers or researchers! We have continental-scale data with more diversity than any other nation. We need a unique way to unlock it to enable the entire ecosystem, not just big corporations. If we can do that – and we think we can, via DPI for AI – we will have AI winners from India.
Combine this with India’s forward-looking regulatory thought process, which balances Regulation for AI and Regulation of AI in a unique way that encourages innovation without compromising on individual privacy or ignoring other potential harms of the technology. The diversity and scale of the Indian market act as a forcing function for innovators to think of robustness, safety, and efficiency from the very start – critical if innovations in AI are to actually deliver financial and societal benefits at scale. A large share of the engineers and scientists creating AI models or developing innovative applications around them are of Indian origin; given our demographic dividend, this is one of our strengths for decades to come. Capital and compute are clearly not our strong points, but capital follows opportunity. Given India’s position of strength on data, regulation, market, and talent, capital is finding its way to India!
So, what are you waiting for? India welcomes you with continental-scale data, a lightweight but safe regulatory regime, and talent like nowhere else – come build, invest, and innovate in India. India has done it in the past across various sectors, and it is strongly positioned to do it again in AI. Let’s do this together. We are just getting started, and, as always, are very eager for your feedback, suggestions, and participation in this journey!
This is the third in a series of blogs describing the structure and importance of Digital Public Infrastructure for Artificial Intelligence (DPI for AI), a privacy-preserving techno-legal framework for data and AI model building collaborations. Readers are encouraged to go over the first and second blogs for better understanding and continuity.
The techno-legal framework of DEPA, elaborated upon in the earlier blogs, provides the foundations. From multiple discussions and from history, it is clear that building and growing a vibrant AI economy that can create a product nation in India requires a regulatory framework. This regulatory structure serves as the legal partner to the technology and works hand in hand with it. Upon this reliable techno-legal foundation, the ecosystem and global product companies from India will materialize.
The ‘Data Empowerment and Protection Architecture’ (DEPA) worldview of ‘Regulation for AI’, rather than the more conventional ‘Regulation of AI’ espoused by the US, EU, and others, sets DEPA apart and drives India towards becoming an AI product nation with a global footprint.
How does one envisage the form and function of ‘Regulation for AI’? In this open house, we hold a dialog between the technology and legal sides of the approach to explain its significant facets.
In a nutshell, ‘Regulation for AI’ will:
specify the standards that AI models need to adhere to
define a lightweight but foolproof path for getting there, for startups as well as the big players
provide an environment that deals with many of the compliance and safety aspects ab initio
define ways to remove hurdles from the innovator’s path
In contrast, ‘Regulation of AI’ deals with what AI models cannot be and do, and the tests and conditions they have to pass depending on the risk classes they are placed into. This is akin to certification processes in fields such as pharma and transportation, which impose heavy cost burdens, especially on new innovators. For instance, many pharma companies that develop potentially good drug candidates run out of steam trying to meet clinical trial conditions. Very often they are unable to find a valid and sizeable sample population on which to test their products as part of the mandatory certification process.
The current standards in the new Regulation of AI in the US, EU and so on leave many aspects such as the risk model classification process undefined, leading to regulatory uncertainty. This also works against investment driven innovation and consequent growth of the ecosystem in multiple ways.
The path to value both for the economy and the users, lies in the power of the data being projected into the universe of applications. These applications will be powered by the AI models in addition to other algorithmic engines. The earlier blogs already addressed the need and the way for data to make their way into models.
For the models to exhibit their power, we must make sure they are reliable and widely used. This requires that the AI models be accessible and available and, most importantly, ‘do no harm’ when they are applied – whether through mistakes, misuse, or malfeasance. In addition, humans or their agents must not be allowed to harm markets and users through monopoly control of the AI models. Large-scale monopolistic control of models with global use and relevance can lead to situations that are beyond national or international legislation to control or curb.
In the DEPA model, this benign, and in most ways, benevolent environment is created by a concinnous combination of technology and legal principles. Having analyzed the technological aspects of data privacy in the earlier blogs in this chain, here we will talk about the regulations implemented via a Self-Regulatory Organization – the SRO.
Though not fully fleshed out, the SRO provides functions such as registration and the assignment of roles to participants – TDP (Training Data Provider), TDC (Training Data Consumer), and CCRP (Confidential Clean Room Provider). Many of these functions have been implemented in part to support the tech stack we have released for the CCR (Ref: DEPA Open House #1). This tech stack currently supports registration and allows interactions between participants to be mediated via electronic contracts (the technological counterpart of legal contracts).
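A minimal sketch of what registration and electronic-contract mediation could look like may help make this concrete. The role names (TDP, TDC, CCRP) come from the blog; the data structures and checks below are our own illustration, not the released tech stack:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of SRO-style registration and electronic contracts.
# Role names are from the DEPA blogs; everything else is illustrative.

ROLES = {"TDP", "TDC", "CCRP"}  # data provider, data consumer, clean room provider

@dataclass
class Registry:
    participants: dict = field(default_factory=dict)  # participant id -> role
    contracts: list = field(default_factory=list)     # transparently recorded agreements

    def register(self, participant_id: str, role: str) -> None:
        if role not in ROLES:
            raise ValueError(f"unknown role: {role}")
        self.participants[participant_id] = role

    def record_contract(self, provider_id: str, consumer_id: str, terms: str) -> dict:
        # An electronic contract is valid only between registered,
        # correctly-typed participants.
        assert self.participants.get(provider_id) == "TDP"
        assert self.participants.get(consumer_id) == "TDC"
        contract = {"provider": provider_id, "consumer": consumer_id, "terms": terms}
        self.contracts.append(contract)
        return contract

registry = Registry()
registry.register("hospital-a", "TDP")
registry.register("startup-b", "TDC")
registry.record_contract("hospital-a", "startup-b", "train-only, differentially-private")
```

The point of the sketch is the mediation pattern: every collaboration is anchored to registered identities and a recorded agreement, which is what makes the interactions auditable.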
The technology that validates the models through pre-deployment analysis based on complex adaptive system models is under development and is based on diverse research efforts across the world. This technology is designed for measuring the positive and negative impact of use of these models on societies at small and large scale and in short and long timescales.
‘Complex adaptive system models’ are dynamic models which can capture agents with their state information and the multiple feedback loops which determine the changes in the system at different scales, sometimes simultaneously. The large number of components and the many kinds of feedback loops with their dynamic nature are what make these models complex and adaptive. These models, while still in their infancy in many ways, are critical to the question of understanding the AI models’ impact on societies.
The SRO guides and supports the ecosystem players in building and deploying their models in a safe and secure way with lightweight regulatory ceilings so that large product companies in many fields like finance, healthcare, and education can grow and reach a happy consumer base. This is key to growing the ecosystem and connecting it to other parts of the India stack.
We envisage leveraging the current legal system – the different Acts (DPDP, IT Act, Copyright, etc.) and models of data protection through the CDO (Chief Data Office) and CGO (Grievance Office) in companies in India – in defining the SRO’s role and features further.
The regulatory model also looks at the question of data ownership and copyright issues, especially in the context of Generative AI. We require large foundation models independent of the ‘Big Tech’ to fight potential monopolies. These models should be reflective of the local diversity to serve as reliable engines in the context of India. We need these models built and deployed locally, to be able to play a role as a product nation without being subverted or subjugated in our cyberspace strategies.
To light up the AI sky with these many ‘fireflies’ in different parts of India, infrastructure for compute as well as market access is needed. The SRO creates policies that are not restrictive or protective but promote participation and value realization. The data players, compute providers, market creators, and users need to be able to play with each other in a safe space. Sufficient protection of copyright and creative invention will be provided via existing IP law to incentivize participation, while not restricting to the point of killing innovation – this is the balance that the SRO’s regulatory framework strives to reach.
Drawing upon ideas of risk-based categorization of models (such as in the EU AI Act) and regulatory models (including punitive and compensatory measures) proportional to them, the models in India Stack will be easily compatible with international standards, as well as a universal or global standard, should an organization such as a UN agency define it. This makes global market reach of AI models and products built in India, an easier target to achieve.
We conjecture that these different aspects of DEPA will release the data from its silos. AI models will proliferate with multiple players profiting from infrastructure, model building, and exporting them to the world. Many applications will be built which will be used both in India (as part of the India Stack) and the world. It is through these models and applications that the latent potential and knowledge in the vast stores of data in India will be realized.
In an era of evolving financial landscapes, the realm of lending is witnessing a significant shift—from the traditional collateral-based approach to the more contemporary cash flow lending model facilitated by OCEN (Open Credit Enablement Network). Recently, we hosted UGro Capital in an insightful conversation with Shachindra Nath, shedding light on this transformative paradigm in lending and delving into its profound implications.
Points discussed:
Transitioning from Collateral to Cash Flow Lending
OCEN plays a pivotal role in revolutionising MSME lending in India. This innovative open network is specifically designed to serve those new to credit, employing an omni-channel approach that democratises and simplifies access to lending.
Currently, the market lacks a scalable and profitable model for short-term, low-value MSME loans – a significant gap that OCEN has adeptly filled with its GeM-SAHAY pilots.
Amidst the confusion and excitement surrounding OCEN versus ONDC, and the broader impact of open networks in the lending sphere, this blog aims to provide clarity and insight. Let’s dive in and explore these transformative developments.
🔀 OCEN or ONDC: Which is better for short tenure MSME lending?
There’s much debate about which lending framework potential partners should explore. Rahul Mathur (Associate Director, InsuranceDekho) captures this perfectly in his tweet, presented as a checklist below:
🗣️ “Turns out, the focus in lending for ONDC v/s OCEN is very different (see the image below)
(1) 💰 Type of loan: Type 1 personal loan v/s Type 4 MSME loan
(2) 🔎 GTM: Online v/s Omni-channel (assisted)
(3) 🙇 Persona: Eligible for credit v/s New to credit
(4) 🌟 Objective: Bring credit to point of commerce v/s Democratize credit access
To summarize, there are some good reasons why ONDC has launched loans independently of the OCEN network.
Over time, OCEN will expand to include further lending use-cases & products. And, at that point, ONDC <> OCEN interoperability would make sense.”
Clearly, OCEN is the undisputed option for short-tenure, low-ticket-size lending to new-to-credit MSMEs. Over time, the lending use cases will be expanded to service more traditional forms of loans.
OCEN and ONDC, while both operating in the lending space, are tailored for very different use cases and audiences. While they may overlap in some cases, the larger ecosystem benefits from introduction of newer networks. In the end, it’s all about solving the most challenging problems 🙂
Let’s further understand how OCEN addresses the MSME lending problem in India.
📈 OCEN makes small ticket size lending a reality
OCEN’s primary goal is to make short-term lending profitable. Something which we’ve achieved in our pilots with the Government e-Marketplace, through the GeM-SAHAY app.
One of our volunteers explains the economics in this blog post: Evaluating the short term lending opportunity, where he shows how lenders can earn 2.2x higher revenue with the same capital by adopting the OCEN framework.
The significant 2.2x increase in revenue is attributed to the introduction of a crucial role known as the borrower’s agent. These agents not only reduce the cost of servicing a loan but also heighten accountability within the system.
Borrower’s Agents (BAs) assume a variety of roles traditionally outsourced by lenders: they function as data providers, collections agents, escrow account managers, and product providers.
By integrating these services and cohesively binding the network, BAs enable lenders to efficiently service low-cost loans even in remote areas. In performing these four key roles, the borrower’s agent emerges as the cornerstone of the open network, vital for its effective operation.
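The mechanism behind the revenue uplift can be sketched with back-of-the-envelope numbers: the same capital is redeployed every cycle, while the BA keeps per-loan servicing cost low. All rates and costs below are hypothetical illustrations, not the figures from the cited post:

```python
# Back-of-the-envelope comparison of lender revenue on the same capital:
# one long-tenor loan vs. revolving short-tenor loans serviced via a
# borrower's agent (BA). All rates and costs are hypothetical.

def long_tenor_revenue(capital: float, annual_rate: float, servicing_cost: float) -> float:
    # Capital deployed once for the whole year.
    return capital * annual_rate - servicing_cost

def short_tenor_revenue(capital: float, monthly_rate: float,
                        per_loan_cost: float, cycles: int = 12) -> float:
    # Same capital redeployed every month; the BA keeps the
    # per-loan servicing cost low.
    return cycles * (capital * monthly_rate - per_loan_cost)

capital = 1_000_000
long_rev = long_tenor_revenue(capital, annual_rate=0.18, servicing_cost=20_000)
short_rev = short_tenor_revenue(capital, monthly_rate=0.025, per_loan_cost=5_000)
print(short_rev / long_rev)  # > 1 under these assumptions
```

The multiple a lender actually earns depends on the rates and servicing costs achieved; the sketch only shows why redeployment plus cheap servicing compounds.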
The role of borrower’s agent has been discussed in depth in one of our open house sessions:
OCEN is changing the game by making even the tiniest loans worthwhile for both the lender and the borrower.
🌐 Efficacy of Open networks and streamlining the lending process
Some people we’ve spoken to worry that open networks will lead to the commodification of lending, which, in turn, would be bad for the overall market. However, this couldn’t be farther from the truth 🙂.
OCEN streamlines the lending process by introducing roles such as the borrower’s agent, KYC agents, and collection partners. These roles combine to create a bundle that lenders can easily integrate into their processes to start lending.
Newer and smaller lenders will benefit from the transparency and scale offered by open networks.
Closed network auctions, which are common today, see lenders bidding down for loans. However, their lack of transparency and scale often results in low profitability.
Open networks, on the other hand, provide scale and transparency that leads to low cost of servicing, more borrowers to choose from, and reliability in the system through a borrower’s agent.
Larger lenders benefit from the low cost of servicing a loan that comes with open networks
Larger lenders will benefit from open networks as they provide the technical chops of a borrower’s agent. BAs can help with KYC, collections, and other parts of servicing a loan while absorbing some of the costs.
We’ve seen such effects before with the introduction of Aadhaar and UPI, where KYC and collections became far cheaper, enabling large lenders to facilitate smaller-ticket loans.
In conclusion
Through OCEN, the potential to unlock a ~$300 billion credit market in India becomes a tangible reality. This is demonstrated by the increased revenue potential and the introduction of the borrower’s agent role, enhancing loan servicing efficiency and accountability.
Moreover, OCEN’s streamlined lending process benefits the entire market, by offering scalability and cost-effectiveness to both emerging and established lenders.
Thus, embracing OCEN is not just a choice but a strategic direction for expanding market possibilities and empowering both lenders and borrowers in the dynamic credit landscape of India.
This is the 2nd blog in a series describing the importance of DPI for AI, a privacy-preserving techno-legal framework for AI data collaboration. Readers are encouraged to first go over the 1st blog for better understanding and continuity.
What is unique about the techno-legal framework in DPI for AI is that it allows for data collaboration without compromising on data privacy. Now let’s put this in perspective of Indian enterprises and users. This framework can potentially revolutionize the entire ecosystem to slingshot India towards an AI product nation where we are not just using AI models developed within India but exporting the same. What is the biggest roadblock in this dream? In this open house (https://bit.ly/DEPA-2), we make a case that privileged access to data from Indian contexts is not only necessary to develop AI-based systems that are much more relatable to Indians but in fact, gives Indian innovators a distinct advantage over much larger and better funded big tech companies from the west.
Let’s get started. Clearly, there is a race to build larger and larger AI models these days trained on as much training data as possible. Most training data used in the models is publicly available on the web. Given that Indian enterprises are quite behind in this race, it is unlikely that we will catch up by simply following their footsteps. But what many folks outside of AI research circles often miss is that there has been credible research that shows that access to even relatively small amounts of contextual data can drastically reduce the data and compute requirements to achieve the same level of performance.
This sounds great, right? But (there is always a but!) much of this Indian context data is not in one place; it is hidden behind numerous government and corporate walls. What makes the situation worse is that most of these data silos sit with enterprises of a traditional nature, which are not the typical centers of innovation, at least for modern technologies like AI.

This is fertile ground for DPI for AI. The three core concepts of DPI for AI ensure that this data sitting in silos can be seamlessly (thanks to digital contracts) and democratically shared with innovators around India in a privacy-preserving manner (thanks to differential privacy). Innovators also do not need to worry one bit about the confidentiality of their IP (thanks to confidential computing). The techno-legal framework makes it super easy for anyone to abide by the privacy regulations without breaking a sweat, keeping them safe from future litigation as long as they follow the easy-to-follow guidelines provided in the framework. This is what we refer to as the unfreezing of data markets in this Open House.

This unfreezing is critical for our innovators to get easy access to contextual data, giving them a much-needed leg up against the Western onslaught in the field of AI. This is India’s moment to leapfrog in the field of AI, as we have done in so many domains (payments, identity, internet, etc.). Given the enormity of the goal and the need to get it right, we seek participation from folks with varied expertise and backgrounds. Please share your feedback here
In the last decade, we’ve seen an extraordinary explosion in the volume of data that we, as a species, generate. The possibilities that this data-driven era unlocks are mind-boggling. Large language models, trained on vast datasets, are already capable of performing a wide array of tasks, from text completion to image generation and understanding. The potential applications of AI, especially for societal problems, are limitless. However, lurking in the shadows are significant concerns: security and privacy, abuse and misinformation, fairness and bias.
These concerns have led to stringent data protection laws worldwide, such as the European Union’s General Data Protection Regulation (GDPR), California’s Consumer Privacy Act (CCPA), and the European AI Act. India has recently joined this global privacy protection movement with the Digital Personal Data Protection Act, 2023 (DPDP Act). These laws emphasize individuals’ right to privacy and the need for real-time, granular, and specific consent when sharing personal data.
In parallel with privacy laws, India has also adopted a techno-legal approach for data sharing, led by the Data Empowerment and Protection Architecture (DEPA). This new-age digital infrastructure introduces a streamlined and compliant approach to consent-driven data sharing.
Today, we are taking the next step in this journey by extending DEPA to support training of AI models in accordance with responsible AI principles. This new digital public infrastructure, which we call DEPA for Training, is designed to address critical scenarios such as detecting fraud using datasets from multiple banks, helping with tracking and diagnosis of diseases, all without compromising the privacy of data principals.
DEPA for Training is founded on three core concepts: digital contracts, confidential clean rooms, and differential privacy. Digital contracts, backed by transparent contract services, make it simpler for organizations to share datasets and collaborate by recording data sharing agreements transparently. Confidential clean rooms ensure data security and privacy by processing datasets and training models in hardware-protected secure environments. Differential privacy further fortifies this approach, allowing AI models to learn from data without risking individuals’ privacy. You can find more details on how these concepts come together to create an open and fair ecosystem at https://depa.world.
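To give a flavor of the differential-privacy leg, here is a minimal sketch of the classic Laplace mechanism for releasing a count. This is a standard textbook construction, not DEPA’s actual implementation:

```python
import math
import random

# Minimal Laplace-mechanism sketch: release a count with epsilon-DP noise.
# A textbook construction for illustration, not DEPA's implementation.

def laplace_noise(scale: float, rng: random.Random) -> float:
    # Inverse-CDF sampling from Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, rng: random.Random) -> float:
    sensitivity = 1.0  # adding/removing one person changes a count by at most 1
    return true_count + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(42)
print(private_count(100, epsilon=0.5, rng=rng))  # close to 100, with calibrated noise
```

Smaller epsilon means more noise and stronger privacy; the clean room would apply this kind of calibrated noise before any statistic or gradient leaves the protected environment.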
DEPA for Training represents the first step towards a more responsible and secure AI landscape, where data privacy and technological advancement can thrive side by side. We believe that collaboration and feedback from experts, stakeholders, and the wider community are essential in shaping the future of this approach. Please share your feedback here
📢Calling all lenders to understand how a short tenor loan can become both an effective and profitable business opportunity under OCEN 4.0. 🔑📈
If you are a lender looking for the next big opportunity in lending to the thousands of MSMEs currently unable to access loans, then don’t miss this introduction to OCEN 4.0. 💡
Here we deep-dive into how a new underwriting model enabled by OCEN 4.0 makes it viable and profitable to provide loans to MSMEs traditionally considered unfavourable candidates for loans given the associated high delinquency rates. Our OCEN pilots show, in some cases, it is even possible to create short tenor loans that are twice as profitable as long tenor loans. 🌟🚀🚀
📢Calling all loan agents keen to understand the OCEN 4.0 business opportunity. 🔑📈
OCEN 4.0 introduces a new and powerful role – the Borrower’s Agent (BA). If you are looking to play a pivotal role in the MSME lending ecosystem without lending from your own balance sheet, this new BA role may be what you want to understand really well. 💡
The BA role is critical to the OCEN story. In this session, we deep-dive into what this role entails, why it is the linchpin of the OCEN 4.0 model, how BAs enable lenders to go remote, and how much leverage the role wields. We also talk through how to get started, possible business models for BAs, and what to focus on to be a successful Borrower’s Agent. 🌟🚀🚀
📢👷🧑💻Calling all TSPs and participants eager to dive into OCEN 4.0 APIs.
If you want to understand the tech and the APIs and get started building for OCEN 4.0, our second open house on OCEN 4.0 is here for you! 💡
In this session, we do a deep-dive into the architecture, the loan journey across OCEN 4.0 components, and the APIs in the OCEN spec, and we share how you can build for one participant by mocking the APIs of the other. 📝🔑🧑💻
We’re thrilled to unveil OCEN 4.0, the latest advancement in our Open Credit Enablement Network protocol, revolutionizing cash flow-based MSME lending. 🌟
OCEN 4.0 represents a significant leap forward from our ongoing GeM SAHAY and GST SAHAY pilots. In this iteration, along with updated API specifications, we have also added the OCEN Registry, Product Network and rules, specialized participant roles and much more. All these features help us unlock cash-flow-based lending to match the scale, complexity and needs of Bharat. 🔑📈
Check out our introductory open house session on OCEN 4.0
🔍 More details? The API and documentation of OCEN 4.0 are publicly available at http://ocen.dev and will be updated with FAQs from the open house sessions.
🔮 What’s next? Yes, a lot is happening. We have more open house sessions coming out in the following weeks. We are also actively onboarding Wave 1 partners for OCEN 4.0.
❓Questions? Submit your questions here. 📩Contact? Reach the OCEN 4.0 team at [email protected]
Please note: This blog post is authored by our volunteers, Aravind R and Sagar Parikh.
Amidst the usual flurry of sensational headlines, you may have missed a quiet announcement a few weeks ago that marked a monumental shift: RBI became the first central bank globally to publish a common technology framework – including detailed APIs – for consent-driven data sharing across the entire financial sector (banking, insurance, securities, and investment).
This is a gamechanger for the industry.
Out of context, yet another circular with a good deal of jargon is an easy thing to gloss over. But it turns out this effort is actually a global first: although the UK, EU, Bank for International Settlements (BIS), Canada, and others have begun thoughtful public conversations around Open Banking (e.g. through that famous BIS report making the case, initiatives like PSD2, conferences, and various committees), India is one of the first nations in the world to actually make it a market reality by publishing detailed technical API standards — standards that are quickly being adopted by major banks and others across the financial sector in the country without a mandatory requirement from RBI. It’s not just the supposedly cutting-edge banks of Switzerland, the UK, or the US driving fintech innovation: the top leadership of our very own SBI, ICICI, IDFC First, Bajaj Finserv, Kotak, Axis, and other household names have recognised that this is the way forward for the industry, and they are breaking through new global frontiers by actually operationalising this powerful interoperable technology framework. Not only are they adopting the APIs, some are also starting to think through the new lending and advisory use cases and products made possible by the infrastructure. We think many new fintech startups should be considering doing the same.
Why do the APIs Matter?
The world is focusing heavily on data protection and privacy – and rightly so. Securing data with appropriate access controls and preventing unauthorised third-party sharing is critical to protecting individual privacy. But to a typical MSME, portability and control of their data is just as critical as data security, because it empowers them with access to a stream of new and tailored financial products and services. For instance, if an MSME owner could easily share trusted digital proof of their business’ regular historic GST payments or receivables invoices, a bank could offer regular small-ticket working capital loans based on demonstrated ability to repay (known as flow-based lending) rather than just loans backed by collateral. Data sharing can become a tool for individual empowerment and prosperity by enabling many such innovative new solutions.
Operationalising a seamless and secure means to share data across different types of financial institutions – banks, NBFCs, mutual funds, insurance companies, or brokers – requires a common technology framework for data sharing. The published APIs create interoperable public infrastructure (standard ‘rails’) to be used for consented data sharing across all types of financial institutions. This means that once a bank plugs into the network as an information provider, entities with new use cases can plug in as users of that data without individually integrating with each bank. Naturally, the system is designed such that data sharing occurs only with the data owner’s consent — to ensure that data is used primarily to empower the individual or small business. The MeiTY Consent Framework provides a machine-readable standard for obtaining consent to share data. This consent standard is based on an open standard, and consent under it is revocable, granular (referring to a specific set of data), auditable, and secure. Programmable consent of this form is the natural successor to the long terms-and-conditions legalese that apps typically rely on. RBI has also announced a new type of NBFC – the Account Aggregator (AA) – to serve as a consent dashboard for users, and seven new AAs already have in-principle licenses.
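To make those consent properties concrete (granular, time-bound, revocable, auditable), here is a toy sketch of what a consent artefact might look like. The field names and the validity check below are purely illustrative assumptions and do not follow the actual MeiTY artefact schema.

```python
# A hypothetical consent artefact capturing the properties the framework
# requires. Every field name here is illustrative, not the real schema.
consent_artefact = {
    "id": "cns-0001",                          # hypothetical identifier
    "data_consumer": "example-lender-nbfc",    # who receives the data
    "data_provider": "example-bank",           # who holds the data
    "purpose": "working-capital-underwriting", # why it is being shared
    "data_types": ["bank_statement_summary"],  # granular: a specific data set
    "valid_from": "2020-01-01",
    "valid_to": "2020-03-31",                  # time-bound, not open-ended
    "revoked": False,                          # user can revoke via the AA dashboard
    "signature": "<signed-jws>",               # makes the grant auditable
}

def is_active(artefact: dict, today: str) -> bool:
    """Sketch of the validity check a data provider might run before sharing."""
    return (not artefact["revoked"]
            and artefact["valid_from"] <= today <= artefact["valid_to"])
```

Because the artefact is machine-readable, both the provider and the Account Aggregator can enforce scope and expiry automatically, instead of relying on a user having clicked through pages of legalese.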
The Data Empowerment and Protection Architecture (DEPA) – in one image
In many other nations, market players have either not been able to come together to agree on a common technical standard for APIs, or have not been able to kick off its adoption across multiple competing banks at scale and speed. In countries like the US, data sharing was enabled only through proprietary rails – private companies took the initiative to design their own infrastructure for data sharing, which ends up restricting players like yourselves from innovating on top of that infrastructure to design new products and services that could benefit people.
What other kinds of innovative products and services could you build?
Think of the impact that access to the Google Maps APIs allowed: without them, we would never have seen startups like Uber or Airbnb come to life. Building these consented data sharing APIs as a public good allows an explosion of fintech innovation, in areas such as:
New types of tailored flow-based lending products that provide regular, sachet-sized loans to different target groups based on GST or other invoices (as described above).
New personal financial management apps which could help consumers make decisions on different financial institutions and products (savings, credit, insurance, etc.) based on historic data and future projections. This could also branch out into improved wealth management or Robo advisory.
Applications that allow individuals to share evidence of financial status (for instance, for a credit card or visa application) without sharing a complete, detailed bank statement history of every transaction.
…and many others, such as that germ of an idea that’s possibly started taking shape in your mind as you were reading.
In summary
This ecosystem is where UPI was in mid-2016: with firm, interdepartmental, long-term regulatory backing, and at the cusp of taking off operationally. UPI taught us that those who make a bet on the future, build and test early (PhonePe and Google were both at the first ever UPI hackathon!), and are agile enough to thrive in an evolving landscape end up reaping significant rewards. And just as with UPI, our financial sector regulators are to be lauded for thinking proactively, years ahead, by building the right public infrastructure for data sharing. RBI’s planning for this began back in 2015! They have now passed the innovation baton on to you — and we, for one, have ambitious expectations.
With warmest regards,
iSPIRT Foundation
I’m Pinging A Few Whatsapp Groups Now, What Else Should I Send Them To Read?
Access to formal credit continues to be one of the largest challenges faced by MSMEs in India due to the lack of verifiable data about their businesses. Digital payments data combined with GST data has the potential to unlock millions of SMEs and bring them into the formal system. India is going through a Cambrian explosion of data usage: monthly data consumption per smartphone in India is estimated to grow nearly five times, from 3.9 GB in 2017 to 18 GB by 2023, as per a report by Swedish telecom gear maker Ericsson.
As businesses and their processes get digitized, it provides us a unique opportunity to re-imagine credit products for MSMEs like never before.
In order to move from traditional asset-based lending to data-based lending, it is important to make the following design considerations:
Underwriting based on Data – Assess creditworthiness in real time based on the consented data provided by the user
Low-Value – Bringing down the cost of processing a loan using digital platforms like eKYC, eSign & UPI enables one to process sachet-sized loans
Smaller Tenures – Offer small tenures to reduce risk and thereby build better credit history of a customer
Customised Loan Offers – In the old world, loan products were designed to be one-size-fits-all; with data & better underwriting, lenders can create a “loan offer on the fly” for a borrower based on their needs
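The four considerations above can be sketched together in a few lines of Python: size a small, short-tenure loan from consented cash-flow data rather than collateral. The advance rate, volatility penalty, and tenure below are illustrative assumptions, not a prescribed underwriting policy.

```python
from statistics import mean

def sachet_loan_offer(monthly_gst_turnover: list,
                      advance_rate: float = 0.15,
                      tenure_days: int = 30) -> dict:
    """Sketch of an 'offer on the fly': a low-value, short-tenure loan sized
    from consented GST cash-flow data. Parameters are illustrative only."""
    avg_turnover = mean(monthly_gst_turnover)
    # Penalise volatile cash flows: shrink the offer if months vary widely.
    volatility = (max(monthly_gst_turnover) - min(monthly_gst_turnover)) / avg_turnover
    amount = avg_turnover * advance_rate * max(0.5, 1 - volatility / 2)
    return {"amount": round(amount, 2), "tenure_days": tenure_days}

# A borrower with steady monthly GST turnover of around Rs. 2 lakh.
offer = sachet_loan_offer([200_000, 220_000, 180_000])  # → small 30-day loan
```

Because the tenure is short, the loan revolves several times a year, and each repaid cycle adds verified repayment history that can justify a larger next offer.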
Getting started with GST Data Based Lending – Basics
Over 8 million businesses in India will file GST returns
Every invoice in the GSTN system is verified by the counterparty
GST returns are digitally signed and this data can be accessed through consent of a small business
To access this data, you need to understand the three types of GST APIs:
Authentication – Allows a taxpayer to log into their GST account from any application
Returns – Allows a taxpayer to file their returns from any application
Ledger – Allows a taxpayer to view & share their tax data with any application
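A hypothetical client mirroring these three API families might look like the sketch below. The class, method behaviour, and payload shapes are all mocked for illustration; the real request formats are defined in the GSTN sandbox (bit.ly/GSTAPIs) and are reached via a GSP gateway.

```python
# Illustrative mock of the three GST API families. Nothing here performs
# real network calls; it only shows the shape of the flow.
class GSTClient:
    def __init__(self, gsp_base_url: str, gstin: str):
        self.base = gsp_base_url   # the GSP gateway an ASP would integrate with
        self.gstin = gstin         # the taxpayer's GST identification number
        self.token = None

    def authenticate(self, otp: str) -> str:
        """Authentication API: lets a taxpayer log in from any application."""
        self.token = f"session-for-{self.gstin}-{otp}"  # stand-in for a real token
        return self.token

    def file_returns(self, period: str, invoices: list) -> dict:
        """Returns API: file a return for a tax period."""
        assert self.token, "authenticate first"
        return {"period": period, "invoice_count": len(invoices), "status": "FILED"}

    def fetch_ledger(self, from_period: str, to_period: str) -> dict:
        """Ledger API: view and, with the taxpayer's consent, share tax data."""
        assert self.token, "authenticate first"
        return {"gstin": self.gstin, "range": (from_period, to_period)}

client = GSTClient("https://gsp.example.com", "29ABCDE1234F1Z5")
client.authenticate(otp="123456")
filed = client.file_returns("042018", invoices=[{"no": "INV-1"}, {"no": "INV-2"}])
```

The key point of the design is the separation: authentication establishes who is asking, returns cover filing, and the ledger APIs are what a lender would consume (with consent) for underwriting.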
You can access the GSTN Sandbox & APIs here: bit.ly/GSTAPIs
If you want more insights, do join the GSTN Discussion Forum here: bit.ly/GSTgroup
The GSTN Tech Ecosystem
The Goods and Services Tax Network (GSTN) is a Section 8 company set up to provide common and shared IT infrastructure and services to the Central and State Governments, taxpayers, and other stakeholders for the implementation of the Goods & Services Tax (GST).
In this context, it is important to understand the following roles in the GSTN ecosystem:
Direct portal for taxpayers – https://services.gst.gov.in/services/login
GST Suvidha Provider (GSP) – Companies which provide the GST API gateway as a service to application service providers; they are appointed by the GSTN, and the list of GSPs can be accessed here: http://www.gstn.org/gsp-list/
ASPs (Application Service Providers) – Companies which provide the user interface for businesses to file or fetch their returns from the GSTN
Naturally, ASPs are a great fit as distribution partners for lending, as they own and control the end-user experience of small businesses. Some examples are:
Accounting Software Providers
They help small businesses manage their accounting, inventory & even payroll;
They have rich data sets about the small business, including their GST returns. E.g.: Tally (desktop), Zoho/Cleartax/Profitbooks (cloud-based)
Tax Filing Software Providers
These companies help businesses that use Excel/manual billing/custom software to prepare their GST returns & file them every month;
One of the key stakeholders here is the accountant, who is essentially the business advisor for an SMB; tapping into accountants as an influencer channel is a great opportunity. E.g.: Cleartax, SahiGST etc.
Supply Chain Automation Companies:
Today many FMCG and large manufacturing companies are using software to track sales/inventory in their supply chains, e.g. Asian Paints, Tata Steel, ITC etc.
As these companies enable a large number of wholesalers and retailers to use their software platform, there is a great opportunity to extend credit to their entire ecosystem
Eg: Moglix, Channel Konnekt, Bizom etc.
Example of a Lender – ASP Partnership
Consider a services-based company which provides advertising services to multiple companies
Let’s assume they use an accounting software like Cleartax or Zoho
In the software, the SMB sees a one-click credit button (This is enabled through an integration with the ASP & lender)
In a few clicks, the SMB is able to share multiple types of data like – GST, Payroll, Balance Sheet, Bank Statement etc. with the lender
With consent, the lender uses this data for underwriting, builds a credit score, and makes a credit offer to the SMB
The SMB provides their bank account details for real-time loan disbursement, and KYC can be completed based on the type of business
A repayment mandate is taken either digitally or physically, depending on the customer
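The journey above can be sketched end to end as a handful of stub functions. All names, data shapes, and numbers below are illustrative, not an actual lender or ASP integration.

```python
# Sketch of the ASP-embedded loan journey: consented data sharing,
# data-based underwriting, then disbursement with a repayment mandate.

def share_data_with_consent(smb: dict, data_types: list) -> dict:
    """Step 1: the SMB shares GST/bank data with the lender via one click."""
    return {dt: smb["data"].get(dt) for dt in data_types}

def underwrite(shared: dict) -> dict:
    """Step 2: the lender builds a score and an offer from the shared data.
    The scoring rule here is a toy placeholder."""
    turnover = sum(shared.get("gst_invoices", []))
    score = min(900, 300 + turnover // 1000)
    return {"score": score, "offer_amount": turnover * 0.1}

def disburse_and_mandate(offer: dict, bank_account: str, kyc_done: bool) -> dict:
    """Steps 3-4: disburse to the SMB's account and register a repayment mandate."""
    if not kyc_done:
        return {"status": "PENDING_KYC"}
    return {"status": "DISBURSED", "account": bank_account, "mandate": "e-NACH"}

smb = {"data": {"gst_invoices": [150_000, 120_000, 130_000]}}
shared = share_data_with_consent(smb, ["gst_invoices"])
offer = underwrite(shared)
result = disburse_and_mandate(offer, "XXXX-1234", kyc_done=True)
```

The design point is that every step is driven by consented data rather than collateral, so the whole journey can run inside the ASP's interface in a few clicks.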
There are various other data sources one could use to improve underwriting – smartphone data, payments data from the bank, bill payments, electronic toll collection & various others. Algorithms can use these data sources along with other public data sets – seasonal demand for a product, import/export, GDP, consumption patterns – to do contextual lending.
We recommend you go through the presentation above to understand these basics & do watch the pre-recorded webinar session below on How to Leverage GST data for Flow-based lending for more details.
At iSPIRT, we are working with multiple stakeholders to create a winning implementation of Flow-Based Lending. Do watch out for future announcements from us for entrepreneurs working in this space or write to us [email protected] to know more.
About the Author
Nikhil Kumar is a full-time fellow with iSPIRT Foundation, a not-for-profit think tank, and has been focussed on building the developer ecosystem for the India Stack.