Axis Bank has acquired FreeCharge, a mobile wallet company, opening the door to many such deals in the future. As a consequence, do you think banks and fintech startups have started working towards a common goal?
One day in early 2016, Rajiv Anand, the Executive Director of Retail Banking at Axis Bank, asked his team, hard at work, “Do present-day customers know how a bank would look in the future?”
Banking risk refers to the future uncertainty that creates stochasticity in the cash flow from receivables of outstanding balances. Banking risks can be described in the Von Neumann–Morgenstern (VNM) framework of money lotteries. In this framework, the set of outcomes is assumed to be continuous and monetary in nature, and a lottery is a list of probabilities associated with those continuous outcomes. Applied to banking, the cash flows (the set of outcomes) are assumed to be continuous and stochastic in nature. A theoretical model for the risk is represented in the framework below:
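The continuous framework itself is not reproduced here, but the core VNM idea, evaluating a money lottery by its expected utility rather than its expected monetary value, can be sketched with a discrete, purely illustrative lottery (the outcomes, probabilities and log-utility below are assumptions for the example, not part of the original model):

```python
import math

# Purely illustrative money lottery: (outcome, probability) pairs
lottery = [(-1000, 0.1), (0, 0.6), (2000, 0.3)]

# Risk-neutral valuation: the expected monetary value of the lottery
expected_value = sum(p * x for x, p in lottery)

# A risk-averse VNM agent instead computes expected utility,
# here with an assumed concave utility u(x) = ln(wealth + x)
wealth = 10_000
expected_utility = sum(p * math.log(wealth + x) for x, p in lottery)
```

Because the utility is concave, the expected utility of the lottery is lower than the utility of its expected value, which is exactly why a risky cash flow is worth less to a bank than a certain one of the same mean.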
CREDIT RISK:
Credit risk arises when the borrower fails to honour the repayment commitments on their debts. Such a risk arises as a result of adverse selection (screening) of applicants at the acquisition stage, or due to a change in the financial capabilities of the borrower over the course of repayment. A loan will default if the borrower’s assets (A) at maturity (T) fall below the contractual value of the obligations payable (B) (Vasicek, 1991). Let A_i be the asset value of the i-th borrower, which is described by the process:
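The asset process itself is not shown above, but in the standard Vasicek/Merton setting A_i is assumed to follow a geometric Brownian motion, under which the default probability P(A_T < B) has a closed form. A minimal sketch under that assumption (the numeric parameters are illustrative, not calibrated):

```python
import math

def default_probability(A0, B, mu, sigma, T):
    """P(A_T < B) when the asset value A_t follows a geometric Brownian
    motion with drift mu and volatility sigma -- the standard
    Vasicek/Merton assumption, adopted here for illustration."""
    d = (math.log(B / A0) - (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(d / math.sqrt(2.0)))

# Assets of 120 against debt of 100 due in 1 year, 5% drift, 20% volatility
p = default_probability(A0=120, B=100, mu=0.05, sigma=0.20, T=1.0)
```

As expected, raising the contractual obligation B (or lowering the initial asset value A0) pushes the default probability up.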
MARKET RISK:
Market risk is the risk that arises for banks from fluctuations in market variables such as asset prices, price levels and the unemployment rate. It arises from both on-balance-sheet and off-balance-sheet items, and includes risk from macroeconomic factors such as sharp declines in asset prices and adverse stock market movements. Recessions and sudden adverse demand and supply shocks also affect borrowers’ delinquency rates. Market risk covers a whole family of risks: stock market risk, counterparty default risk, interest rate risk, liquidity risk, price level movements etc.
OPERATIONAL RISK:
Operational risk arises from the operational inefficiencies of the human resources and business processes of an organisation. It includes fraud risk, bankruptcy risk, risk arising from cyber hacks etc. These risks are uncorrelated across industries and are very organisation-specific. However, operational risk excludes strategy risk and reputation risk.
To learn more about Data Analyst with Advanced excel course – Enrol Now. To learn more about Data Analyst with R Course – Enrol Now. To learn more about Big Data Course – Enrol Now.
To learn more about Machine Learning Using Python and Spark – Enrol Now. To learn more about Data Analyst with SAS Course – Enrol Now. To learn more about Data Analyst with Apache Spark Course – Enrol Now. To learn more about Data Analyst with Market Risk Analytics and Modelling Course – Enrol Now.
Banks, as financial institutions, play an important role in the economic development of a nation. The primary function of banks has been to channel funds appropriately and efficiently through the economy. Households deposit cash in banks, which the latter lend out to those businesses and households that need credit. Credit lent to businesses is known as commercial credit (asset-backed loans, cash flow loans, factoring loans, franchisee finance, equipment finance) and credit lent to households is known as retail credit (credit cards, personal loans, vehicle loans, mortgages etc.). Figure 1 below shows the important interlinkages between the banking sector and the different segments of the economy:
Figure 1: Interlinkages of the banking sector with other sectors of the economy
Banks borrow from the low-risk segment (deposits from the household sector) and lend to the high-risk segment (commercial and retail credit); the profit from lending is earned through the interest differential between the two segments. For example, suppose 200 customers on the books of Bank XYZ deposit $1,000 each on 1st January 2016. These depositors keep their money with the bank for 1 year and do not withdraw it before then. The bank pays 5% interest on the deposits, plus the principal, after 1 year. On the very same day, an entrepreneur asks for a loan of $200,000 to finance his business idea. The bank lends the amount to the entrepreneur at an interest rate of 15% per annum, under the agreement that he will pay back the principal plus interest on 31st December 2016. Therefore, as on 1st January 2016 the balance sheet of Bank XYZ is:
Consider two scenarios:
Scenario 1: The Entrepreneur pays off the Principal plus the interest to the bank on 31st December, 2016
This is a win-win situation for all. The pay-offs were as follows:
Entrepreneur: Met the capital requirements of his business through the funding he obtained from the bank.
Depositors: The depositors got back their principal, with interest (total amount = 1000 + 0.05 × 1000 = $1,050).
Bank: The bank earned a net profit of 10%. The profit earned by the bank is the Net Interest Income = Interest received – Interest paid = $30,000 – $10,000 = $20,000.
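The arithmetic behind that net interest income can be checked in a few lines:

```python
deposits = 200 * 1_000               # 200 depositors x $1,000 each
interest_paid = 0.05 * deposits      # 5% owed to the depositors
interest_received = 0.15 * deposits  # 15% earned on the $200,000 loan
net_interest_income = interest_received - interest_paid
# $30,000 - $10,000 = $20,000, i.e. 10% of the $200,000 lent
```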
Scenario 2: The Entrepreneur defaults on the loan commitment on 31st December, 2016
This is a disastrous situation for the bank. The damage would spread through the following channel:
Entrepreneur: Defaults on the whole amount lent.
Bank: Does not have the funds to pay back the depositors. The bank runs into a liquidity crisis and is on its way to collapse.
Depositors: Do not get their money back. They lose confidence in the bank.
The only way to save the situation is a bailout.
The second scenario highlights some critical underlying assumptions in the lending process which resulted in the drastic outcome:
Assumption 1: The entrepreneur (obligor) was assumed to be a ‘good’ borrower. No specific screening procedure was used to assess the obligor’s affordability for the loan.
Observation: The sources of borrower and transaction risks associated with an obligor must be duly assessed before lending out credit. A basic tenet of risk management is to ensure that appropriate controls are in place at the acquisition phase so that the affordability and the reliability of the borrower can be assessed appropriately. Accurate appraisal of the sources of an obligor’s origination risk helps in streamlining credit to the better class of applicants.
Assumption 2: The entire amount of the deposits was lent out. The bank was over-optimistic about growth opportunities. Underestimation of risk and overemphasis on growth objectives led to the liquidation of the bank.
Observation: The bank failed to keep back sufficient reserves to fall back on in case of default. Two extreme lending possibilities for a bank are: a. the bank keeps 100% reserves and lends out 0%; b. the bank keeps 0% reserves and lends out 100%. Under the first extreme, the bank does not grow at all. Under the second extreme (the case here), the bank runs the risk of liquidation if a default occurs. Every bank must solve an optimisation problem between risk and growth opportunities.
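That trade-off can be made concrete with a deliberately simple expected-value model (the 5% default probability and all other numbers are illustrative assumptions, not the author’s):

```python
def expected_profit(reserve_ratio, deposits=200_000, deposit_rate=0.05,
                    loan_rate=0.15, p_default=0.05):
    """Expected profit when a fraction (1 - reserve_ratio) of deposits is
    lent out and the entire loan is lost with probability p_default.
    Illustrative sketch only -- not a regulatory capital model."""
    lent = (1 - reserve_ratio) * deposits
    interest_owed = deposit_rate * deposits        # always due to depositors
    expected_income = (1 - p_default) * loan_rate * lent - p_default * lent
    return expected_income - interest_owed

# Extreme a: 100% reserves -> no lending income, but interest still owed
loss_no_lending = expected_profit(1.0)     # -10,000.0
# Extreme b: 0% reserves -> highest expected profit, but ruin if default hits
profit_full_lending = expected_profit(0.0)  # about 8,500
```

In this linear sketch expected profit rises with lending, which is precisely the point: expected value alone ignores the ruin in the default branch, and it is that tail risk which forces a bank to hold reserves somewhere between the two extremes.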
The discussion above highlights some important questions on lending and its associated risks:
What are the different types of risks associated with the lending process of a bank?
How can the risk from lending to different types of customers be identified?
How can the adequate amount of capital to be reserved by banks be identified?
The answers to these questions will be discussed in subsequent blogs.
Stay glued to our site for further details about banking structure and risk modelling. DexLab Analytics offers a unique module on Credit Risk Modelling Using SAS. Contact us today for more details!
Interested in a career as a Data Analyst?
In the aftermath of the Great Recession and the credit crunch that followed, financial institutions across the globe are facing an increasing amount of regulatory scrutiny, and for good reason. Regulatory efforts necessitate new, in-depth analyses, reports, templates and assessments from financial institutions in the form of call reports and loan loss summaries, all of which ensure better accountability and thus help business initiatives.
Regulators have also started asking for more transparency. Their main objective is to verify that a bank possesses thorough knowledge of its customers and their associated credit risk. Moreover, the new Basel III regulations entail an even bigger regulatory burden for banks.
What are the challenges faced by CRM Managers?
Sloppy data management – inability to access data when it’s needed most, due to inefficient data management.
No group-wide risk modeling framework – banks need strong, meaningful risk measures to see the larger picture. Without such a framework, it becomes really difficult to get to the root of the problem.
Too much duplication of effort – as analysts cannot alter model parameters, they face a great deal of duplicated work and constant rework. This may negatively affect a bank’s efficiency ratio.
Inefficient risk tools – banks need a potent risk solution; otherwise, how can they identify portfolio concentrations or re-grade portfolios to mitigate upcoming risks?
Long, unwieldy reporting processes – manual, spreadsheet-based reporting overburdens IT analysts and researchers.
What are the Best Practices to fight the Challenges Noted Above?
For the most effective credit risk management solution, one needs to gain an in-depth understanding of a bank’s overall credit risk, viewing risk at the individual, customer and portfolio levels.
While banks attach immense importance to a structured understanding of their risk profiles, a lot of information remains strewn across various business units. For this reason, intensive risk assessment is needed; otherwise a bank can never know whether its capital reserves accurately reflect risks or whether its loan loss reserves sufficiently cover prospective short-term credit losses. Banks that are not in good shape are usually put under close scrutiny by investors and regulators, as they may pile up draining losses in the future.
Adopt a well-integrated, comprehensive credit risk solution. It helps curb loan losses while ensuring capital reserves that accurately reflect the risk profile. Such a solution lets banks move quickly beyond simple portfolio measures towards a more sophisticated credit risk management approach, which includes:
Improved model management, stretching over the whole modeling life cycle
Real-time scoring and limits monitoring
Powerful stress-testing capabilities
Data visualization capabilities and robust BI tools that help deliver crucial information to anyone who needs it
In summary, if your credit risk is controlled properly, the rest takes care of itself. To manage credit risk well, put your trust in credit risk professionals – they understand the pressing need to decrease default rates and improve the accuracy with which credit is issued, and to do so they must devise new approaches and apply data analytics to Big Data.
The noteworthy triumphs over us humans in poker, Go, speech recognition, language translation, image identification and virtual assistance have boosted the market for AI, machine learning and neural networks, fuelling the exponential rise of Apple (#1 as of February 2017), Google (#2), Microsoft (#3), Amazon (#5) and Facebook (#6). While these digital natives command the daily headlines, a tug of war has lately been brewing between two ace developers – Equifax and SAS: the former is busy developing deep learning tools to refine credit scoring, while the latter is adding new deep learning functionality to its bouquet of data mining tools and providing a deep learning API.
New-age technologies are dominating the present business environment. Mobility, cloud computing, social media and analytics have been affecting the different realms of business at an ever-increasing rate. Though most of the impacts are favourable, it would be reckless to ignore the severity of the negative ones.
Amidst it all, cloud computing has grabbed the most attention. The benefits of cloud computing are myriad: better productivity, lower costs and quicker time to market. A surging number of employees use cloud applications to collaborate on work-related matters. Nevertheless, data security remains a leading concern.
Traditional threats are no longer the chief worry: most organisations have devised ways to safeguard themselves against those predictable threats, while newer threats call for better IT security to realise high-profile business priorities. A well-researched study by VMware, a pioneer in cloud infrastructure and digital workspace technology, revealed that though businesses small, medium and large are more than keen to implement cloud computing to secure future goals and efficiency, information security on the cloud will have a profound impact on enterprises in the next 3-5 years.
The Cloud Security
Another study, by the eminent research firm Kantar IMRB, highlighted that though organisations are taking steps towards a modern workspace environment, they are most interested in having a safe and secure digital environment, owing to the rising number of cyber threats and thefts. Going by the figures, in the next 3-5 years more than 86% of enterprises will enhance their IT budgets, and 80% of organisations will be eager to spend more time, skill and money on cloud technology.
In this context, Arun Parameswaran, managing director of VMware India, said, “With nearly 25% of all IT workloads being managed on the cloud today, and the number expected to double by 2021, it is evident that the traditional on-premises IT environment is undergoing a profound change.” He further added, “Today, CIOs play an extremely essential role in their organisations’ IT, and it is of utmost importance to have enterprise data available always—anytime and anywhere while tightly secured.”
Enhanced productivity and better profitability will always remain prime priorities, but as per recent studies IT security has now also become a chief concern on the list of business priorities. However, despite heavy investments in IT, CIOs of well-established companies are unhappy because the budget is either poorly structured or inadequate. The studies also reveal that government and BFSI respondents think the budget for IT security is quite low and should be increased by at least 25% by next year.
Cloud is the best thing since sliced bread, and companies are relying on it more and more to store sensitive data. Cloud is the future, so companies should look for ways to balance the risks against the explicit advantages this evolving technology brings.
At last month’s R user group meeting in Melbourne, the theme was “Experiences with using SAS and R in insurance and banking”. There, Hong Ooi from ANZ (Australia and New Zealand Banking Group) spoke on his experiences in credit risk analysis with R, giving a presentation that tells a great story, through slides, about implementing R programming for financial analyses at a few major banks.
In his slides, one can see the following:
How R is used to fit models for mortgage loss at ANZ
A customised model to assess the probability of default for individual loans, with a heavy-tailed t distribution for volatility.
One slide goes on to display how the standard lm function for regression is adapted for a non-Gaussian error distribution — one of the many benefits of having the source code available in R.
A comparison between R and SAS for fitting such non-standard models
Mr. Ooi also notes that SAS does contain various options for modelling variance, for instance PROC MIXED and PROC NLIN. However, none of these is as flexible or powerful as R. The main difference, as per Ooi, is that R modelling functions return an object, as opposed to mere textual output. That object can later be manipulated and adapted to a new modelling situation, and used to generate summaries, predictions and more – manipulation an R programmer can do directly.
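Ooi’s point about model functions returning objects rather than text can be illustrated outside R as well. Here is a minimal Python analogue (a toy least-squares fit, not his ANZ code): the caller gets back an object whose parameters can be queried and reused, instead of a printed report that would have to be parsed.

```python
class FitResult:
    """A model object: holds fitted parameters and knows how to predict."""
    def __init__(self, slope, intercept):
        self.slope, self.intercept = slope, intercept

    def predict(self, x):
        return self.intercept + self.slope * x

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, returning an object
    (in the spirit of R's lm) rather than printing a textual report."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return FitResult(slope, my - slope * mx)

model = fit_line([0, 1, 2, 3], [1.0, 3.1, 4.9, 7.0])
prediction = model.predict(4)  # reuse the object, no text parsing needed
```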
We can use cohort models to aggregate the point estimates for default into an overall portfolio risk as follows:
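The slide itself is not reproduced here, but a cohort-style aggregation, rolling per-cohort default estimates up into a portfolio expected loss, might look like this (all cohort figures below are hypothetical):

```python
# Hypothetical cohorts: (number_of_loans, prob_default,
#                        exposure_per_loan, loss_given_default)
cohorts = [
    (1000, 0.02, 10_000, 0.40),   # low-risk mortgages
    (500,  0.05, 25_000, 0.45),   # mid-risk loans
    (200,  0.10, 50_000, 0.50),   # high-risk loans
]

# Portfolio expected loss = sum over cohorts of n * PD * EAD * LGD
expected_loss = sum(n * pd * ead * lgd for n, pd, ead, lgd in cohorts)
```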
He also revealed how ANZ implemented a stress-testing simulation, which was made available to business users via an Excel interface:
The primary analysis runs in R, usually within 2 minutes, compared with the SAS versions that took 4 hours to run and frequently crashed due to lack of disk space. As the data is stored within SAS, SAS code is still used to create the source data; an R script automates the process of writing that SAS code, working flexibly around SAS’s limitations.
A comparison between the use of R and SAS’s IML language to implement algorithms:
Mr. Ooi’s R code has a neat trick of creating a matrix of R list objects, which is fairly difficult to do with IML’s matrix-only data structures.
He also discussed some of the challenges one may face when trying to deploy open-source R in a commercial organisation, like “who do I yell at if things do not work right?”
And lastly, he discussed a collection of typically useful R resources.
People who work in a bank and need help adopting R in their workflow can use this presentation to learn more. Also, feel free to get in touch with our in-house R programming experts at DexLab Analytics, the premier R programming training institute in India.
As per the latest research from Strategy Analytics, global smartwatch shipments grew by 1 percent annually to hit a record 8.2 million units in the fourth quarter of 2016. Apple Watch drove the growth and dominated with a 63 percent share of the global smartwatch market, while Samsung continues to hold second position.
Neil Mawston, Executive Director at Strategy Analytics, commented that global shipments grew by 1 percent annually, from 8.1 million units in Q4 2015 to 8.2 million in Q4 2016. The fourth quarter marked a return to growth for the smartwatch industry after two consecutive quarters of declining volumes. The recovery owes something to new product launches from other major companies and to seasonal demand, with a giant such as Apple stimulating stronger demand in major developed markets like the US and UK. Hence, international smartwatch shipments grew by 1 percent annually, from 20.8 million in full-year 2015 to a record high of 21.1 million in 2016.
Over the past few decades, banking institutions have collected plenty of data describing the default behaviour of their clientele. Good examples are historical data about a person’s date of birth, income, gender, employment status etc. All of this data has been neatly stored in huge databases or data warehouses (e.g. relational ones).
On top of all this, banks have accumulated considerable business experience with their credit products. For instance, many credit experts have done a pretty good job of discriminating between low-risk and high-risk mortgages using their business expertise alone. It is the goal of credit scoring to analyse both sources of information in detail and come up with a statistically based decision model that scores future credit applications and ultimately decides which ones to accept and which to reject.
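A statistically based decision model of this kind is often a logistic-regression scorecard. A minimal sketch (the features, weights and cut-off below are made up for illustration, not fitted to any real data):

```python
import math

# Hypothetical scorecard weights (in practice fitted to historical data)
WEIGHTS = {"income_k": 0.03, "age": 0.02, "prior_defaults": -1.5}
INTERCEPT = -2.0
CUTOFF = 0.5  # accept if the predicted repayment probability is >= 50%

def score(applicant):
    """Predicted probability of repayment via the logistic function."""
    z = INTERCEPT + sum(w * applicant[k] for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def decide(applicant):
    return "accept" if score(applicant) >= CUTOFF else "reject"

good = {"income_k": 80, "age": 40, "prior_defaults": 0}
risky = {"income_k": 80, "age": 40, "prior_defaults": 2}
```

Two prior defaults drag the risky applicant’s score below the cut-off, so the same income and age lead to the opposite decision.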
The surfacing of Big Data has created both opportunities and challenges for credit scoring. Big Data is often characterised in terms of its four Vs: Variety, Velocity, Volume and Veracity. To illustrate this, let us briefly look at some key sources and processes that generate Big Data.
Traditional sources of Big Data are large-scale transactional enterprise systems such as OLTP (Online Transaction Processing), ERP (Enterprise Resource Planning) and CRM (Customer Relationship Management) applications. Classical credit scoring models are generally constructed using data extracted from these traditional transactional systems.
Online social graphs are a more recent example. Simply think about the major social media networks: Weibo, WeChat, Facebook, Twitter etc. Together these networks capture information about close to two billion people, relating to their friends, preferences and other behaviours, leaving behind a huge trail of digital footprints.
Also think about the IoT (Internet of Things), the emerging sensor-enabled ecosystem that will link various objects (e.g. cars, homes) with each other and with humans. Finally, we see more and more open or public data, such as data about weather, maps, traffic and the macro-economy. Clearly, all of these new data sources offer tremendous potential for building better credit scoring models.
The main challenges:
The data-generating processes mentioned above can all be characterised by the sheer volume of data they create. This evidently poses a serious challenge: setting up a scalable storage architecture, combined with a distributed approach to data manipulation and querying, is difficult.
Big Data also comes in a lot of variety, in several formats. Traditional structured data, such as a customer’s name and birth date, is increasingly complemented by unstructured data such as images, tweets, emails, sensor data, Facebook pages and GPS data. While the former is easily stored in traditional databases, the latter needs to be accommodated using appropriate database technology facilitating the storage, querying and manipulation of these types of unstructured data. This requires considerable effort, since it is estimated that at least 80 percent of all data is unstructured.
Velocity is the speed at which data is generated, and at which it must be stored and analysed. Think of streaming applications such as on-line trading platforms, SMS messages, YouTube, credit card swipes and phone calls: these are all examples of high-velocity data and form an important concern.
Veracity, the quality or trustworthiness of the data, is yet another factor to be considered. Sadly, more data does not automatically mean better data, so the quality of the data being generated must be closely monitored and guaranteed.
In closing, as the volume, variety, velocity and veracity of data keep growing, so will the opportunities to build better credit scoring models.