
SAS and Equifax Clouts Deep Learning and AI to Improve Credit Risk Analysis


The noteworthy triumphs over humans in poker, Go, speech recognition, language translation, image identification and virtual assistance have energised the market for AI, machine learning and neural networks, fuelling the exponential rise of Apple (#1 as of February 17), Google (#2), Microsoft (#3), Amazon (#5) and Facebook (#6). While these digital natives command the daily headlines, a tug of war has been brewing of late between two ace developers, Equifax and SAS: the former is busy developing deep learning tools to refine credit scoring, while the latter is adding new deep learning functionality to its bouquet of data mining tools and providing a deep learning API.


ANZ uses R programming for Credit Risk Analysis


At the previous month's R user group meeting in Melbourne, the theme was "Experiences with using SAS and R in insurance and banking". Hong Ooi from ANZ (Australia and New Zealand Banking Group) spoke about his experiences in credit risk analysis with R, in a presentation whose slides tell a great story about implementing R for financial analyses at a major bank.

His slides cover the following:

How R is used to fit models for mortgage loss at ANZ

A customised model is used to assess the probability of default for individual loans, with a heavy-tailed t distribution for volatility.

One slide goes on to show how the standard lm function for regression is adapted to a non-Gaussian error distribution, one of the many benefits of having the source code available in R.
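To make the idea concrete, here is a minimal sketch (in Python rather than R, on synthetic data) of what "adapting lm for a non-Gaussian error distribution" amounts to: fitting the regression by maximising a Student-t likelihood instead of assuming Gaussian errors. This is an illustration of the technique only, not ANZ's actual model.

```python
import numpy as np
from scipy import optimize, stats

# Synthetic data: a linear relationship with heavy-tailed (t, df=3) noise.
rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + stats.t.rvs(df=3, size=n, random_state=rng)

def neg_loglik(params):
    """Negative log-likelihood of a regression with Student-t errors."""
    intercept, slope, log_scale, log_df = params
    resid = y - intercept - slope * x
    return -np.sum(stats.t.logpdf(resid, df=np.exp(log_df),
                                  scale=np.exp(log_scale)))

fit = optimize.minimize(neg_loglik, x0=[0.0, 0.0, 0.0, 1.0],
                        method="Nelder-Mead",
                        options={"maxiter": 5000, "maxfev": 5000})
intercept, slope = fit.x[:2]
print(intercept, slope)  # should land close to the true 1.0 and 2.0
```

The point of returning a fitted object, as R does, is that the same likelihood function can be swapped for any error distribution without rewriting the surrounding machinery.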

A comparison between R and SAS for fitting such non-standard models

Mr. Ooi also notes that SAS does contain various options for modelling variance, for instance PROC MIXED and PROC NLIN, but none of these are as flexible or powerful as R. The main difference, as per Ooi, is that R modelling functions return an object rather than mere textual output. That object can later be manipulated to adapt to a new modelling situation and to generate summaries, predictions and more.

 


 

We can use cohort models to aggregate the point estimates for default into an overall portfolio risk figure as follows:

Photo courtesy of revolution-computing.typepad.com
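The cohort idea can be sketched as follows, with entirely synthetic loans and PDs: individual point estimates of default probability are grouped by origination cohort and summed into expected default counts for the portfolio.

```python
import numpy as np

# Hypothetical sketch of cohort-style aggregation. Every figure below is
# synthetic; a real portfolio would use estimated per-loan PDs.
rng = np.random.default_rng(3)
n_loans = 10_000
cohort = rng.integers(2005, 2010, n_loans)   # origination year of each loan
pd_est = rng.beta(2, 50, n_loans)            # per-loan point estimate of PD

for year in np.unique(cohort):
    mask = cohort == year
    # Expected number of defaults in a cohort = sum of its individual PDs.
    print(year, mask.sum(), round(pd_est[mask].sum(), 1))

print(round(pd_est.sum(), 1))  # expected defaults across the whole book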

He revealed how ANZ implemented a stress-testing simulation, which was made available to business users via an Excel interface:

The primary analysis is done in R, usually within 2 minutes, compared with SAS versions that took 4 hours to run and frequently crashed for lack of disk space. As the data is stored within SAS, SAS code is often used to create the source data…

An R script is even used to automate the process of writing the SAS code, working around some of SAS's limitations.

 


 

Comparison between use of R and SAS’s IML language to implement algorithms:

Mr. Ooi's R code uses a neat trick of creating a matrix of R list objects, which is fairly difficult to do with IML's matrix-only data structures.

He also discussed some of the challenges one may face when deploying open-source R in a commercial organisation, such as "who do I yell at if things don't work right?".

And lastly, he shared a collection of typically useful R resources.

People who work in a bank and need help adopting R in their workflow can use this presentation to learn more about the subject. Also feel free to get in touch with our in-house R programming experts at DexLab Analytics, the premier R programming training institute in India.

 

Reference: https://www.r-bloggers.com/how-anz-uses-r-for-credit-risk-analysis/

 

Interested in a career as a Data Analyst?

To learn more about Data Analyst with Advanced excel course – Enrol Now.
To learn more about Data Analyst with R Course – Enrol Now.
To learn more about Big Data Course – Enrol Now.

To learn more about Machine Learning Using Python and Spark – Enrol Now.
To learn more about Data Analyst with SAS Course – Enrol Now.
To learn more about Data Analyst with Apache Spark Course – Enrol Now.
To learn more about Data Analyst with Market Risk Analytics and Modelling Course – Enrol Now.

The Opportunities and Challenges in Credit Scoring with Big Data


Over the past few decades, banking institutions have collected plenty of data describing the default behaviour of their clients. Good examples are historical data about a person's date of birth, income, gender, employment status, etc. All of this data has been neatly stored in several huge databases or data warehouses (e.g. relational ones).

On top of this, banks have accumulated considerable business experience with their credit products. For instance, many credit experts have done a pretty good job of discriminating between low-risk and high-risk mortgages using their business expertise alone. The goal of credit scoring is now to analyse both sources of information in more detail and to come up with a statistically based decision model that allows future credit applications to be scored, and ultimately to decide which ones to accept and which to reject.
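The kind of decision model described above can be sketched with a toy logistic-regression scorecard. Everything here, the two variables, the 0.5 cut-off and the synthetic data-generating rule, is a hypothetical illustration rather than a production scorecard.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic applicant histories: in this made-up rule, lower income and
# younger age raise the odds of default.
rng = np.random.default_rng(0)
n = 2000
income = rng.normal(50_000, 15_000, n)        # annual income
age = rng.integers(21, 70, n).astype(float)   # applicant age
logit = 2.0 - income / 25_000 - age / 50 + rng.normal(0, 1, n)
default = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Fit the scoring model (income rescaled to thousands for stability).
X = np.column_stack([income / 1000, age])
model = LogisticRegression(max_iter=1000).fit(X, default)

# Score a new application and apply an accept/reject cut-off.
applicant = np.array([[40.0, 30.0]])          # 40k income, age 30
pd_estimate = model.predict_proba(applicant)[0, 1]
decision = "reject" if pd_estimate > 0.5 else "accept"
print(pd_estimate, decision)
```

In practice the same logistic model is usually rescaled into scorecard points per characteristic, but the accept/reject logic is the same.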

The surfacing of Big Data has created both opportunities and challenges for credit scoring. Big Data is often characterised in terms of its four Vs: Volume, Variety, Velocity and Veracity. To illustrate this, let us briefly look at some key sources and processes that generate Big Data.

The traditional sources of Big Data are large-scale transactional enterprise systems such as OLTP (Online Transaction Processing), ERP (Enterprise Resource Planning) and CRM (Customer Relationship Management) applications. Classical credit scoring models are generally constructed using data extracted from these traditional transactional systems.

The online social graph is a more recent example. Simply think of the major social media networks: Weibo, WeChat, Facebook, Twitter and so on. Together, these networks capture information about close to two billion people relating to their friends, preferences and other behaviours, leaving behind a huge trail of digital footprints in the form of data.

Also think about the IoT (Internet of Things), the emerging sensor-enabled ecosystem that will link various objects (e.g. cars, homes) with each other and with humans. Finally, we see more and more transparent or public data, such as data about weather, maps, traffic and the macro-economy. Clearly, all of these new data sources offer tremendous potential for building better credit scoring models.

The main challenges:

All of the above data-generating processes can be characterised by the sheer volume of data they create. This poses a serious challenge: setting up a scalable storage architecture combined with a distributed approach to data manipulation and querying.

Big Data also comes in a lot of variety, i.e. in several formats. Traditional, structured data such as a customer's name and birth date is more and more complemented by unstructured data such as images, tweets, emails, sensor data, Facebook pages and GPS data. While the former can easily be stored in traditional databases, the latter needs to be accommodated using appropriate database technologies that facilitate the storage, querying and manipulation of each of these unstructured data types. This requires substantial effort, since at least 80 percent of all data is thought to be unstructured.

Velocity is the speed at which data is generated, and the speed at which it must be stored and analysed. Streaming applications such as online trading platforms, SMS messages, YouTube, credit card swipes and phone calls are all examples of high-velocity data and form an important concern.

Veracity, the quality or trustworthiness of the data, is yet another factor to consider. Sadly, more data does not automatically mean better data, so the quality of the data being generated must be closely monitored and guaranteed.

In closing: as volume, variety, velocity and veracity keep growing, so will the opportunities to build better credit scoring models.

Looking for credit risk modelling courses? Take up our credit risk management course, online or classroom-based, at DexLab Analytics and get your career moving.

 

 


Understanding Credit Risk Management With Modelling and Validation

The term credit risk encompasses all types of risk associated with counterparties failing to meet their obligations on different financial instruments: default risk (for example, a debtor has not met his or her legal duties according to the debt contract), migration risk (arising from adverse internal or external rating movements) and country risk (the debtor cannot pay because of measures or events initiated by political or monetary agencies of the country itself).

In compliance with the Basel regulations, most banks choose to develop their own credit risk parameters: Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD). Several MNCs have gathered solid experience by developing models for the Internal Ratings-Based Approach (IRBA) for different clients.
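As a quick illustration of how the three parameters combine, the standard expected-loss relationship EL = PD x LGD x EAD can be computed directly; the figures below are hypothetical.

```python
# Back-of-the-envelope expected loss from the three Basel parameters.
# All numbers are made up for illustration.
pd_ = 0.02        # probability of default over one year
lgd = 0.45        # loss given default (fraction of exposure lost)
ead = 250_000.0   # exposure at default, in currency units

expected_loss = pd_ * lgd * ead   # EL = PD x LGD x EAD
print(expected_loss)
```

Regulatory capital formulas build on exactly these three inputs, which is why each of them is modelled and validated separately.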

For implementation of these Credit Risk Assessment parameters, we need the following data analytics and visualization tools:

  • SAS Credit Risk modelling for banking
  • SAS Enterprise Miner and SAS Credit Scoring
  • Matlab
Default Probability Curve for Each Counterparty (Image source: businessdecision.be)

Credit and counterparty risk validation:

The models that are built for the computation of risks must be revalidated on a regular basis.

On the one hand, the second pillar of the Basel regulations implies that supervisors should check that banks' risk models work consistently. On the other hand, recent crises have drawn the focus of the banks' stakeholders (business, CRO) towards a higher interest in the models.

The validation process consists of a review of the development process and all related aspects of model implementation. It can be divided into two parts:

  1. Qualitatively, quality control is mainly concerned with the ongoing monitoring of the model in use, the quality of the input variables, judgemental decisions and the resulting model output.
  2. Quantitatively, with backtesting, we can statistically compare the predicted periodic risk parameters with their actual outcomes.

In the context of credit risk, validation is concerned with three main parameters: probability of default (PD), exposure at default (EAD) and loss given default (LGD). For each of these three, complete backtesting is done at three levels:

  1. Discriminatory power: the ability of the model to differentiate between defaults and non-defaults, or between high losses and low losses.
  2. Predictive power: a statistical comparison of the predicted risk parameters with the realised defaults and losses.
  3. Stability: how much the portfolio has changed between the time the model was developed and now.
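The three levels can be sketched numerically. The sketch below, on synthetic scores and outcomes, uses AUC for discriminatory power, a predicted-versus-realised default-rate comparison for predictive power, and the population stability index (PSI) for stability; the thresholds a bank applies to each statistic are policy choices not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
pd_scores = rng.beta(2, 20, n)                      # model PD estimates
defaults = (rng.random(n) < pd_scores).astype(int)  # realised outcomes

# 1. Discriminatory power: AUC via the Mann-Whitney rank-sum formulation.
ranks = pd_scores.argsort().argsort() + 1
n_pos, n_neg = defaults.sum(), (1 - defaults).sum()
auc = (ranks[defaults == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# 2. Predictive power: predicted portfolio default rate vs realised rate.
predicted_rate, realised_rate = pd_scores.mean(), defaults.mean()

# 3. Stability: PSI between development-time and current score distributions.
current_scores = rng.beta(2, 18, n)   # a slightly shifted population
bins = np.quantile(pd_scores, np.linspace(0, 1, 11))
e, _ = np.histogram(pd_scores, bins)
a, _ = np.histogram(current_scores, bins)
e, a = e / e.sum(), a / a.sum()
psi = np.sum((a - e) * np.log(np.maximum(a, 1e-6) / np.maximum(e, 1e-6)))

print(auc, predicted_rate, realised_rate, psi)
```

A real validation report would repeat these statistics for each of PD, EAD and LGD, filling out the three-by-three matrix described next.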

In the resulting three-by-three matrix (parameter × level), each component has one or more standardised tests. With the right credit risk modelling training, an individual can implement all of the above tests and provide the required reporting.

In the counterparty credit risk context, one must also consider the uncertainty of exposure and the bilateral nature of the risk. Hence, exposure at default is replaced by the EPE (expected positive exposure) and the EEPE (effective expected positive exposure).

The tests include comparing the observed P&L with the EEPE (making sure violations are moderate and the pass rate meets a predetermined level, for instance 70%).
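A simplified sketch of how EPE and EEPE are obtained from simulated exposure paths (a plain random walk with equal time steps stands in for a real trade's mark-to-market; regulatory EEPE also involves time-weighting over the first year, omitted here):

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps = 10_000, 12   # monthly grid over one year

# Simulate mark-to-market paths of a trade as a random walk.
mtm = np.cumsum(rng.normal(0, 1, (n_paths, n_steps)), axis=1)
exposure = np.maximum(mtm, 0)   # only positive MtM is credit exposure

ee = exposure.mean(axis=0)      # expected exposure profile EE(t)
eee = np.maximum.accumulate(ee) # effective EE: running maximum of EE(t)
epe = ee.mean()                 # expected positive exposure
eepe = eee.mean()               # effective EPE

print(epe, eepe)
```

Because effective EE is the running maximum of EE, EEPE can never fall below EPE, which makes it the more conservative exposure measure.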


For better visualisation, here is an example of the same (image source: businessdecision.be):

Risk models:

The National Bank of Belgium (NBB), the Belgian regulator, insists that appropriate conservative measures be incorporated to compensate for the deficiencies of value and risk models. For example, the NBB requires an assessment of model risk based on an inventory of:

  1. The risks the model covers, along with an assessment of the quality of the results it calculates (maturity of the model, adequacy of the assumptions made, weaknesses and limitations of the model, etc.) and the improvements planned over time.
  2. The risks not yet covered by the model, along with an assessment of their materiality and of the process for handling them.
  3. The elements covered by a general modelling method, along with those covered by a more simplified method, or not covered at all.

A quality credit risk management course can provide you with the necessary functional and technical knowledge to assess model risk.

 


Introduction To Credit Score Cards: Its Use in Crisis

The incident we are about to describe took place at a party circa 2009, a year in which the world was going through one of its worst financial crises in a long time. Every average bloke on the street was aware of terms like mortgage-backed securities (MBS), sub-prime lending and credit crisis; after all, these were the reasons for his plight.

 


 

But at this party, I was fortunate enough to meet an informed and highly compassionate elderly woman, and after a few minutes of discussion the topic came round to what we do for a living. She wanted to know more about credit scorecard systems. As I went on to explain the details of how the system works, her expression changed from plainly curious to angry to pained.

Credit Risk Managers Must use Big Data in These Three Ways


While the developed nations slowly recover from the financial chaos that followed the crisis, credit risk managers face growing default rates as household debt increases with almost no relief in sight. According to international finance reports at the end of 2015, household debt has risen by USD 7.7 trillion since 2007. It now stands at a heart-stopping USD 44 trillion, with USD 6.2 trillion of the increase coming from emerging markets. Household loans per adult in emerging economies rose by 120 percent over the period, to around USD 3,000.

To thrive in this market of increasing debt, credit risk managers must consider innovative methods to keep accuracy in check and decrease default rates. A good solution is applying data analytics to Big Data.

Facts about Remittances for Credits and Rent Losses – Part 1


 

The Allowance for Loan and Lease Losses (ALLL) is a valuation reserve, established and maintained by charges against the bank's operating income. It is an estimate of uncollectible amounts used to reduce the book value of loans and leases to the amount expected to be collected. The ALLL forms part of Tier 2 capital; hence it is maintained to cover losses that are probable and estimable at the date of evaluation. It does not serve as a buffer against all possible future losses; that protection is provided by Tier 1 capital. To establish and maintain an adequate allowance, a bank should:


Banking Business And Banking Instruments- Part 2

Banking Business And Banking Instruments- Part 2
 

In the last blog we discussed three types of banking instruments, namely the current account, the savings account and the certificate of deposit. In this blog we discuss credit cards. Credit cards are the most expensive and most profitable type of loan a bank can extend. A credit card is issued by a financial institution and gives the holder an option to borrow funds, usually at the point of sale. Credit cards charge interest and are primarily used for short-term financing. Interest usually begins one month after a purchase is made, and the borrowing limit is pre-set according to the individual's credit rating. Credit cards carry higher interest rates than most consumer loans or lines of credit.
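A small worked example of how such revolving interest compounds; the 24% APR, the USD 1,000 balance and the flat USD 50 payment are all hypothetical, and real cards accrue interest on average daily balances under each issuer's own terms.

```python
# Hypothetical revolving balance: interest accrues monthly on the unpaid
# balance, and only a fixed minimum payment is made each month.
annual_rate = 0.24            # 24% APR, illustrating the high card rates noted
monthly_rate = annual_rate / 12
balance = 1000.00             # balance carried past the grace period

for month in range(6):        # carry the balance for six months
    balance = balance * (1 + monthly_rate) - 50.00

print(round(balance, 2))      # remaining balance after six payments
```

Even after paying USD 300, most of the original balance remains, which is exactly why cards are the bank's most profitable loan type.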

