Neural Networks Training Archives - DexLab Analytics | Big Data Hadoop SAS R Analytics Predictive Modeling & Excel VBA

ARMA – Time Series Analysis Part 4


The ARMA(p,q) model in time series forecasting is a combination of the Autoregressive (AR) process and the Moving Average (MA) process, where p is the order of the autoregressive part and q is the order of the moving average part.


Autoregressive Process (AR) :- When the value of Yt in a time series is regressed on its own past values, it is called an autoregressive process, where p is the order of the lag taken into consideration.
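The equation itself originally appeared as an image; reconstructed from the definitions below, the AR(1) form is:

Yt = α1Yt-1 + ut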

Where,

Yt = the observation to be predicted

α1 = parameter of the autoregressive model

Yt-1 = observation in the previous period

ut = error term

The equation above follows a first-order autoregressive process, AR(1), where the value of p is 1. Hence the value of Yt in period ‘t’ depends upon its value in the previous period and a random error term.

Moving Average (MA) Process :- When the value of Yt in a time series depends on a weighted sum of the current and the q most recent error terms, i.e. a linear combination of error terms, it is called a moving average process of order q, which can be written as :-
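The equation originally appeared as an image; a standard MA(q) form consistent with the definitions below is:

Yt = α + β0ut + β1ut-1 + … + βqut-q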

Yt = the observation to be predicted

α = constant term

βqut-q = weighted error term at lag q

ARMA (Autoregressive Moving Average) Process :-
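The equation originally appeared as an image; based on the description below (p = 1, q = 1), the ARMA(1,1) form is:

Yt = α + α1Yt-1 + ut + β1ut-1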

The above equation shows that the value of Y in time period ‘t’ is derived from the lagged observations up to order p (here 1, i.e. the previous period’s observation) and the weighted error terms up to order q (here also 1).

How to decide the values of p and q?

Two of the most important methods to obtain the best possible values of p and q are ACF and PACF plots.

ACF (Auto-correlation function) :- This function calculates the auto-correlation of the complete series at successive lags, and when plotted it helps us choose the value of q to be considered for finding the value of Yt. In simple words, the ACF tells us how many lagged error terms can help us predict the value of Yt: if the correlation at a lag is above a certain threshold, that many lagged values can be used to predict Yt.

Using the stock price of Tesla between the years 2012 and 2017, we can plot the ACF in Python to obtain the value of q.

The DataReader() method from the pandas-datareader library is used to extract the data from the web.
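The original snippet appeared as an image; the sketch below is a minimal reconstruction of that workflow (the data source, ticker and column choice are assumptions):

```python
import pandas_datareader.data as web
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

# Extract Tesla's stock price from the web for 2012-2017
tesla = web.DataReader('TSLA', data_source='yahoo',
                       start='2012-01-01', end='2017-12-31')

# Plot the autocorrelation of the closing price across many lags
plot_acf(tesla['Close'], lags=400)
plt.show()
```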

The above graph shows that beyond lag 350 the correlation moves towards 0 and then turns negative.

PACF (Partial auto-correlation function) :- The PACF finds the direct effect of a past lag on Yt by removing the residual effect of the lags in between. The PACF helps in obtaining the value of the AR order p, whereas the ACF helps in obtaining the MA order q. Together, the two methods can be used to find the optimum values of p and q for a time series data set.

Let’s check out how to apply the PACF in Python.
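Continuing with the same Tesla series, a minimal sketch:

```python
from statsmodels.graphics.tsaplots import plot_pacf

# Plot the partial autocorrelation; lags outside the confidence
# band have a significant direct effect on Yt
plot_pacf(tesla['Close'], lags=40)
plt.show()
```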

As you can see in the above graph, after the second lag the line moves within the confidence band; therefore the value of p will be 2.

 

So, with that we come to the end of the discussion on the ARMA model. Hopefully it helped you understand the topic; for more information you can also watch the video tutorial attached to this blog. The blog is designed and prepared by Niharika Rai, Analytics Consultant, DexLab Analytics. DexLab Analytics offers machine learning courses in Gurgaon. To keep on learning more, follow the DexLab Analytics blog.



Skewness and Kurtosis: A Definitive Guide

While dealing with data distributions, skewness and kurtosis are two vital concepts you need to be aware of. Today we will be discussing both concepts to help you gain a new perspective.

Skewness gives an idea about the shape of the distribution of your data. It helps you identify the side towards which your data is inclined: the plot of the distribution is stretched more to one side than to the other. This means that in the case of skewness the mean, median and mode of your dataset are not equal, and the data does not follow the assumptions of a normally distributed curve.

Positive skewness :- When the curve is stretched more towards the right side, it is called a positively skewed curve. In this case the mean is greater than the median and the median is greater than the mode.

(Mean>Median>Mode)

Let’s see how we can plot a positively skewed graph using the Python programming language.

  • First we will have to import all the necessary libraries.

  • Then let’s create the data using the following code:-
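The original snippet appeared as an image; here is a minimal sketch matching the description that follows (the starting value and step size are assumptions):

```python
# 100 observations whose increments grow with the loop count, so the
# points cluster at low values with a long right tail (positive skew)
data = []
value = 0.1
for i in range(100):
    value = value + i * 0.1
    data.append(value)
```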

In the above code we first created an empty list and then wrote a loop generating 100 observations: the initial value is raised by 0.1, and each subsequent observation is raised by the loop count.

  • To get a visual representation of the above data we will be using the Seaborn library and to add more attributes to our graph we will use the Matplotlib methods.
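A sketch of that visualization (the title and labels are illustrative):

```python
import seaborn as sns
import matplotlib.pyplot as plt

# Distribution plot of the generated data; use sns.histplot(data, kde=True)
# on newer Seaborn versions where distplot has been removed
sns.distplot(data)
plt.title('Positively Skewed Distribution')
plt.xlabel('Value')
plt.show()
```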


In the above graph you can see that the data is stretched towards the right; hence the data is positively skewed.

  • Now let’s cross-validate whether Mean > Median > Mode holds or not.
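A likely version of that check (the original cell was an image):

```python
import numpy as np

print('Mean  :', np.mean(data))
print('Median:', np.median(data))
# The mode is skipped: every observation here is unique
```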


Since each observation in the dataset is unique, the mode cannot be calculated.

Calculation of skewness:

Formula:-

  • In case we have the value of the mode, then skewness can be measured by Mean ─ Mode
  • In case the mode is ill-defined, then skewness can be measured by 3(Mean ─ Median)
  • To obtain relative measures of skewness, as in dispersion, we divide by the standard deviation using the following formula:-

When mode is defined:- Skewness = (Mean ─ Mode) / Standard Deviation

When mode is ill-defined:- Skewness = 3(Mean ─ Median) / Standard Deviation


To calculate positive skewness using the Python programming language, we use the following code:-
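One common way to do this is with scipy (the original snippet was an image, so the exact function used may differ):

```python
from scipy.stats import skew

# A positive value confirms the right-stretched shape of the data
print('Skewness:', skew(data))
```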


Negative skewness :- When the curve is stretched more towards the left side, it is called a negatively skewed curve. In this case the mean is less than the median and the median is less than the mode.

(Mean<Median<Mode)

Now let’s see how we can plot a negatively skewed graph using the Python programming language.

Since we have already imported all the necessary libraries, we can head towards generating the data.
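A minimal sketch mirroring the earlier loop (the starting value is an assumption):

```python
# Decreasing increments cluster the points near the start value and
# leave a long left tail, producing negative skew
data2 = []
value = 100.0
for i in range(100):
    value = value - i * 0.1
    data2.append(value)
```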


In the above code, instead of raising the value of each observation, we are reducing it.

  • To visualize the data we have just created, we will again use the Seaborn and Matplotlib libraries.
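As before, a short sketch:

```python
sns.distplot(data2)
plt.title('Negatively Skewed Distribution')
plt.show()
```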


The above graph is stretched towards the left; hence it is negatively skewed.

  • To check whether Mean < Median < Mode holds or not, we will again use the following code:-
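A likely version of the check:

```python
print('Mean  :', np.mean(data2))
print('Median:', np.median(data2))
```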


The above result shows that the value of the mean is less than the median, and since each observation is unique the mode cannot be calculated.

  • Now let’s calculate skewness in Python.
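Again using scipy as one possible approach:

```python
# A negative value confirms the left-stretched shape
print('Skewness:', skew(data2))
```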


Kurtosis

Kurtosis is nothing but the flatness or the peakedness of a distribution curve.

  • Platykurtic :- This kind of distribution has the lowest or the flattest peak.
  • Mesokurtic :- This kind of distribution has a medium peak, like the normal distribution.
  • Leptokurtic :- This kind of distribution has the highest peak.


The video attached below will help you clear any query you might have.

So, this was the discussion on Skewness and Kurtosis; by the end of it you have surely become familiar with both concepts. The DexLab Analytics blog has informative posts on diverse topics, such as neural network machine learning with Python, which you should explore to keep yourself updated. DexLab Analytics offers cutting-edge courses like machine learning certification courses in gurgaon.



Learn How To Do Image Recognition Using LSTM

This is a tutorial where we teach you to do image recognition using LSTM. To get to the core, you first have to understand how a convolutional neural network perceives data. In this tutorial the data is four-dimensional, so you need to convert the dataset accordingly. You can find the tutorial video attached to this blog.

Now suppose there is a 28 by 28 pixel image; if the image is black and white, there is only one channel. To put the data into a CNN, the input shape is the number of samples, followed by the number of rows of the data, then the number of columns, then the channels. These are the four values that need to be provided at the input layer, at the very beginning. Now these values must be converted for the LSTM. The LSTM wants inputs of the form (samples, time steps, features): the number of samples; the number of time steps, i.e. how many steps back you want to go for making further predictions (because the LSTM is a sequence generator); and the number of features. So we will be converting each image from one sample of 28 by 28 by 1 into one sample of 28 time steps with 28 features; that is the only job you have to do, and all you need to accomplish it is to prepare the data accordingly.
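The full code sheet accompanies the video; the sketch below illustrates the reshaping and a minimal LSTM classifier, assuming MNIST-style 28×28 grayscale images:

```python
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Load 28x28 grayscale images; a CNN would want (samples, 28, 28, 1)
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# The LSTM instead wants (samples, time steps, features) = (samples, 28, 28):
# each image row becomes one time step with 28 features
x_train = x_train.reshape(-1, 28, 28).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28).astype('float32') / 255.0

model = Sequential([
    LSTM(64, input_shape=(28, 28)),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Only a few epochs, as in the tutorial
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```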

There are no mysteries here; in fact it is a normal LSTM neural network that anybody can run in its most simple form, and in this tutorial it is also run in its most simple form: there is no complexity involved and only a few epochs will be run.

You can find the code sheet you need for this at

 

Also follow the video that explains the process step by step, so that you can easily grasp how LSTM can be used for image recognition. To access more informative tutorials like this, follow the DexLab Analytics blog.



How AI Is Reshaping The Finance Industry?

Technology is bringing about rapid changes in almost every field it touches. Traditional finance tools no longer suit the current tech-friendly generation of investors, who are used to getting information and services at their fingertips. Unless the gap is bridged, it will be hard for firms to retain clients. Some financial firms have already started investing in AI technology to develop business models that satisfy the changing requirements of customers and leverage their business.

The adoption of AI has finally enabled firms to access customer-centric information, develop plans that suit individual financial goals, and offer personalized experiences.

AI is impacting the financial industry in more ways than one. Let’s take a look.

Mitigating risks

The application of AI has enabled institutions to assess and mitigate risk factors. AI tools allow the processing of huge volumes of financial records, comprising structured as well as unstructured data, to recognize patterns and predict risk. So while approving a loan, for example, an institution is better prepared because it can identify the customers who are likely to default, and having personnel with a background in credit risk management courses can certainly be of immense help here.

Detecting fraud

One of the most niggling issues banks face is fraud, and AI applications make fraud identification easier. When such a case happens it becomes almost impossible for institutions to recover the money. Along with that, banks especially have to deal with false-positive cases that can harm their business. Credit card fraud has also become rampant, giving customers and banks sleepless nights. AI technology can be a great weapon for fighting and preventing such cases: by analyzing data on a customer's transactions, behavior, spending habits and past cases, if any, an oddity can be spotted early, an alarm raised, and the situation monitored so that measures can be taken accordingly.

Trading gets easier

Investment always comes with a set of risks; a changing market scenario can put your money in a volatile situation. With AI in place, however, large datasets can be handled easily, and detecting market conditions helps make investors aware of trends so they can change their investment decisions accordingly. Faster data processing leads to quicker decision-making, and coupled with accurate predictions of the market situation, trading gets smarter: an investor can buy or sell stock as per the trends and reduce risk.

Personalized banking experience

The integration of AI can offer customers a personalized financial experience. Chatbots are there to help customers manage their affairs without needing any human intervention. Be it checking a balance or scheduling payments, everything is streamlined. In addition, customers now have access to apps that help keep their financial transactions in check, track their investments, and plan finances without any hassle. There has been dynamic progress in the field of NLP, and the chatbots being developed now are getting smarter than ever; pursuing a natural language processing course in gurgaon could lead to lucrative job opportunities.

Process Automation

Every financial institution needs to run operations with maximum efficiency while adopting cost-cutting measures. The adoption of RPA (Robotic Process Automation) has significantly changed the way these institutions function. Manual tasks that require time and labor can easily be automated, with fewer chances of error. Be it data verification or report generation, every single task can be well taken care of.


Examples of AI implementation in finance

  • Zest Automated Machine Learning (ZAML) is a platform that offers underwriting solutions; borrowers with little or no past credit history can be assessed.
  • Kensho combines the power of NLP and cloud computing to offer analytical solutions.
  • Ayasdi provides anti-money laundering (AML) detection solutions to financial institutes.
  • Abe AI is a virtual assistant that helps users with budgeting and saving while allowing them to track spending.
  • Darktrace offers cyber security solutions to financial firms.

The powerful ways AI is helping financial institutions excel in their field indicate a promising future ahead. However, integration is taking place slowly, and there is still some uncertainty regarding the technology. Proper training from a premier analytics institute could help bridge the knowledge gap and ensure full integration of this dynamic technology.



A Quick Guide To Natural Language Processing (NLP) And Its Applications

When you interact with Alexa or conduct a voice search on Google, do you wonder about the technology behind it? What is it that makes it possible to communicate with machines as you would with a human being?

Natural Language Processing (NLP), also known as computational linguistics, is a subset of Artificial Intelligence that makes all of this possible by combining artificial intelligence, machine learning, and linguistics to facilitate interaction between computers and humans.

So how does NLP work?

When you give a voice command, it gets recorded, the audio is converted into text, and the NLP system starts working on the context and the intention of the command. Basically, the words we speak are labeled and converted into sets of numbers. However, human language is complex and has many nuances and underlying subtexts; the same word can have different connotations in different contexts. So when a simple command with simple words and a clear context is given, it is easy for the machine to follow through, but the system needs to evolve further to fully process the complex language patterns that have evolved through the ages. There are courses available, such as a natural language processing course in gurgaon, that can help one acquire specialized knowledge in this field.

NLP and its applications

NLP, despite being at a nascent stage, is getting recognized for its potential and is being applied to execute various tasks.

Sentiment Analysis to assess consumer behavior

This functionality of NLP is an important part of social media analytics: it parses users' comments and reactions to products and brand messages across social media platforms to detect the mood of the person. This helps businesses gauge customer behavior and make necessary modifications accordingly.

Email filtering and weeding out spam

If you are a user of Gmail, then you must have noticed the way your emails get segmented once they arrive. Primary, Social, and Promotions are the three broad categories, followed by others; there is a segment for spam as well. This smart segmentation is a stark example of NLP at work.

Basically, a text classification technique is used here to assign each mail to a pre-defined category. You must also have noticed how well your spam is sorted; this is another application of NLP, where certain words trigger the spam alert and the mail is sent straight to the spam folder. However, this sorting is yet to be perfected through further research.
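As a toy illustration of text classification for spam filtering (this is not Gmail's actual pipeline; the data and labels are made up):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

mails = ["win a free prize now", "meeting at 10 am tomorrow",
         "claim your free reward", "project report attached"]
labels = ["spam", "primary", "spam", "primary"]

# Words are converted into sets of numbers (a bag-of-words matrix)
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(mails)

# Train a naive Bayes classifier and sort a new mail
clf = MultinomialNB().fit(X, labels)
print(clf.predict(vectorizer.transform(["free prize waiting"])))
```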

Automatic summarization to find relevant information

Now, thanks to digitization, we deal with huge amounts of data, which has led to information overload. This massive amount of data needs to be processed to find actionable information. Automatic summarization makes this possible by processing a big document and presenting an accurate and short summary of it.

Chatbots

No discussion of NLP can ever be complete without mentioning chatbots. The customer service segment is gaining huge benefits from smart chatbots that offer virtual assistance to customers 24×7. Chatbots not only enhance the customer experience but are also great for reducing costs for any business. However, modern-day chatbots can handle only simple, mundane queries that do not require special knowledge or skill; in the future we can hope to see bots handling specialized queries in real time.

Spell and grammar checker

If you have ever used Grammarly and felt impressed with the result, you must have wondered at some point how it does it. When you put in a text, it not only checks punctuation but also points out spelling errors and flags grammatical errors, such as places where there is no subject-verb agreement. In fact, you also get alternative suggestions to improve your writing. All of this is possible thanks to the transformers used in NLP.

Machine Translation

If you are familiar with Google and its myriad apps, you must be familiar with Google Translate and how quickly it translates your sentences into a preferred language. Machine translation is one application of NLP that is transforming the world. We always talk about big data, but making it accessible to people scattered across the globe, divided by language barriers, could be a big problem. NLP enables machine translation, which uses the power of smart algorithms to translate without any human supervision or intervention. However, there is still huge room for improvement, as languages are full of nuanced meanings that only a human is capable of understanding.

What are some examples of NLP at work?

We are not including Siri or Alexa here, as you are already familiar with them.

  • SignAll is an excellent NLP powered tool that is used for converting sign language into text.
  • Nina is a virtual assistant that deals with banking queries of customers.
  • Translation gets easier with another tool called Lilt that can integrate with other platforms as well.
  • HubSpot integrated the autocorrect feature into its site search function to make searching hassle-free for users.
  • MarketMuse helps writers create content that is high-quality and most importantly relevant.


Just like AI and its various subsets, NLP is still evolving and has a long journey ahead. Language processing needs more research, because simulating human interaction is one thing, but processing languages that are so nuanced is no cakewalk. However, there are plenty of good career opportunities available, and undergoing an artificial intelligence course in delhi would be a sound career move.

 



Top Six Applications of Natural Language Processing (NLP)

Words are all around us – in the form of spoken language, texts, sound bites and even videos. The world would have been a chaotic place had it not been for the words and languages that help us communicate with each other.

Now, if we were to enhance language with the attributes of artificial intelligence, we would be working with what is known as Natural Language Processing or NLP – the confluence of artificial intelligence and computational linguistics.

In other words, “NLP is the machine’s ability to process what was said to it, structure the information received, determine the necessary response and respond in a language that we understand”.

Here is a list of popular applications of NLP in the modern world.

1. Machine Translation

When a machine translates from one language to another, we deal with Machine Translation (MT). "The idea behind MT is simple — to develop computer algorithms to allow automatic translation without any human intervention. The best-known application is probably Google Translate."

2. Voice and Speech Recognition

Though voice recognition technology has been around for 50 years, it is only in the last few decades, owing to NLP, that scientists have achieved significant success in the field. "Now we have a whole variety of speech recognition software programs that allow us to decode the human voice," be it in mobile telephony, home automation, hands-free computing, virtual assistance or video games.

3. Sentiment Analysis

“Sentiment analysis (also known as opinion mining or emotion AI) is an interesting type of data mining that measures the inclination of people’s opinions. The task of this analysis is to identify subjective information in the text”. Companies use sentiment analysis to keep abreast of their reputation and customer satisfaction.

4. Question Answering

Question-Answering concerns building systems that "automatically answer questions posed by humans in a natural language". Real examples of Question-Answering applications are Siri, OK Google, chatbots and virtual assistants.

5. Automatic Summarization

Automatic Summarization is the process of creating a short, accurate, and fluent summary of a longer text document. The most important advantage of using a summary is that it reduces the time taken to read a piece of text. Here are some applications: Aylien Text Analysis, MeaningCloud Summarization, ML Analyzer, Summarize Text and Text Summary.


6. Chatbots

Chatbots currently operate on several channels like the Internet, web applications and messaging platforms. “Businesses today are interested in developing bots that can not only understand a person but also communicate with him at one level”.

While such applications celebrate the use of NLP in modern computing, there are some glitches that arise in systems that cannot be ignored. “The very nature of human natural language makes some NLP tasks difficult…For example, the task of automatically detecting sarcasm, irony, and implications in texts has not yet been effectively solved. NLP technologies still struggle with the complexities inherent in elements of speech such as similes and metaphors.”

To know more, do take a look at the DexLab Analytics website. DexLab Analytics is a premier institute that trains professionals in NLP deep learning classification in Delhi.

 



What is a Neural Network?

Before we get started with the process of building a Neural Network, we need to understand first what a Neural Network is.

A neural network is a collection of neurons connected by synapses. This collection is organized into three main layers: the input layer, the hidden layer, and the output layer.

In an artificial neural network, there are several inputs, which are called features, producing a single output, known as a label.

Analogy between Human Mind and Neural Network

Scientists believe that a living creature’s brain processes information through the use of a biological neural network. The human brain has as many as 100 trillion synapses – gaps between neurons – which form specific patterns when activated.

In the field of Deep Learning, a neural network is represented by a series of layers that work much like a living brain’s synapses. It is becoming a popular course now, with an array of career opportunities. Thus, Deep learning Certification in Gurgaon is a must for everyone.

Scientists use neural networks to teach computers how to do things for themselves. The whole concept of Neural network and its varied applications are pretty interesting. Moreover, with the matchless Neural Networks Training in Delhi, you need not look any further.

There are numerous kinds of deep learning and neural networks:

  1. Feedforward Neural Network – Artificial Neuron
  2. Radial Basis Function Neural Network
  3. Kohonen Self-Organizing Neural Network
  4. Recurrent Neural Network (RNN) – Long Short-Term Memory
  5. Convolutional Neural Network
  6. Modular Neural Network
  7. Generative Adversarial Networks (GANs)


Working of a Simple Feedforward Neural Network

  1. It takes inputs as a matrix (2D array of numbers).
  2. Multiplies the input by a set weight (performs a dot product aka matrix multiplication).
  3. Applies an activation function.
  4. Returns an output.
  5. The error is calculated by taking the difference between the desired output from the data and the predicted output. From this error a gradient is computed (gradient descent), which we can use to alter the weights.
  6. The weights are then altered slightly according to the error.
  7. To train, this process is repeated 1,000+ times. The more the data is trained upon, the more accurate our outputs will be.
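The numbered steps above map directly onto a few lines of NumPy; below is a minimal sketch on a toy problem (the architecture, learning rate and data are all illustrative):

```python
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs as a 2D array
y = np.array([[0.], [1.], [1.], [0.]])                  # desired outputs (XOR)

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(2, 4)), rng.normal(size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):          # step 7: repeat 1,000+ times
    h = sigmoid(X @ W1)            # steps 2-3: dot product + activation
    out = sigmoid(h @ W2)          # step 4: return an output
    err = y - out                  # step 5: desired minus predicted
    # step 6: alter the weights slightly along the gradient
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += 0.5 * h.T @ d_out
    W1 += 0.5 * X.T @ d_h

print(out.round(2))                # predictions approach y after training
```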

Implementation of a Neural Network with Python and Keras

Keras has two types of models:

  • Sequential model
  • The model class used with functional API

The Sequential model is probably the most used feature of Keras. Primarily, it represents the array of Keras Layers. It is convenient and builds different types of Neural Networks really quickly, just by adding layers to it. Keras also has different types of Layers like Dense Layers, Convolutional Layers, Pooling Layers, Activation Layers, Dropout Layers etc.

The most basic layer is Dense Layer. It has many options for setting the inputs, activation function and so on. So, let’s see how one can build a Neural Network using Sequential and Dense. 

First, let’s import the necessary code from Keras:
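The original snippet appeared as an image; a minimal sketch (the layer sizes and input dimension are assumptions):

```python
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=10))
model.add(Dense(1, activation='sigmoid'))
```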

After this step, the model is ready for compilation. The compilation step asks us to define the loss function and the kind of optimizer to be used. These options depend on the problem one is trying to solve.
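For example, assuming a binary classification problem:

```python
# Swap the loss and optimizer to match your own problem
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
```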

Now the model is ready to be trained: the parameters get tuned to provide the correct outputs for a given input. Training is done by feeding inputs at the input layer and obtaining an output, then calculating the loss function from that output and using backpropagation to tune the model parameters. This fits the model parameters to the data.
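A sketch of the training step, with random toy data standing in for a real dataset:

```python
import numpy as np

X_train = np.random.rand(200, 10)                 # 200 samples, 10 features
y_train = np.random.randint(0, 2, size=(200, 1))  # binary labels

history = model.fit(X_train, y_train, epochs=10, batch_size=32)
```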

The output of the above cell, i.e. the training log, shows the loss decreasing and the accuracy increasing over time. At this point, one can experiment with the hyper-parameters and the neural network architecture to get the best accuracy.

After settling on the final model architecture, one can take the model, run feed-forward passes and predict on new inputs. To start making predictions, one can use the testing dataset with the model created previously. Keras enables one to make predictions with the .predict() function.
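For instance, continuing the sketch above:

```python
# Feed-forward pass on unseen data (a random stand-in for the test set)
X_test = np.random.rand(20, 10)
predictions = model.predict(X_test)
print(predictions[:5])
```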

Some points to be remembered while building a strong Neural Network

1. Adding Regularization to Fight Over-Fitting

The predictive models mentioned above are prone to the problem of overfitting: a scenario whereby the model memorizes the results in the training set and isn't able to generalize to data it hasn't seen.

In neural networks, regularization is the technique that fights overfitting, either by penalizing large weights or by adding a layer to the network. It can be done in 3 ways:

  • L1 Regularization
  • L2 Regularization
  • Dropout Regularization

Out of these, Dropout is a commonly used regularization technique. It adds a Dropout layer to the neural network which, in every training iteration, deactivates some neurons. The choice of which neurons to deactivate is random.
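Extending the earlier sketch with a Dropout layer:

```python
from keras.layers import Dropout

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=10))
model.add(Dropout(0.2))   # randomly deactivates 20% of these neurons each step
model.add(Dense(1, activation='sigmoid'))
```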

2. Hyperparameter Tuning

Grid search is a technique you can use to experiment with different model parameters to obtain the ones that give the best accuracy. It works by trying different parameter combinations and returning those that give the best results, thereby improving model accuracy.
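A hand-rolled sketch of the idea, reusing the toy X_train and y_train from above (scikit-learn's GridSearchCV can automate this; the parameter grid here is illustrative):

```python
best_acc, best_cfg = 0.0, None
for units in (16, 32, 64):                 # candidate hidden-layer sizes
    for batch in (16, 32):                 # candidate batch sizes
        m = Sequential()
        m.add(Dense(units, activation='relu', input_dim=10))
        m.add(Dense(1, activation='sigmoid'))
        m.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
        hist = m.fit(X_train, y_train, epochs=5, batch_size=batch, verbose=0)
        acc = hist.history['accuracy'][-1]  # key is 'acc' on very old Keras
        if acc > best_acc:
            best_acc, best_cfg = acc, (units, batch)

print('Best config (units, batch):', best_cfg, 'accuracy:', best_acc)
```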

Conclusion

Neural Networks are keeping up with the fast pace of modern technology remarkably well, thereby inducing the necessity of courses like Neural Network Machine Learning Python, Neural Networks in Python, and more. Though these advanced technologies are just at their nascent stage, they are promising enough to lead the way to the future.

In this article, building and training a simple Neural Network was shown. This simple Neural Network can be extended to a Convolutional Neural Network or a Recurrent Neural Network for more advanced applications in Computer Vision and Natural Language Processing respectively.

Reference Blogs:

https://keras.rstudio.com

https://www.khanacademy.org/math/precalculus/x9e81a4f98389efdf:matrices/x9e81a4f98389efdf:multiplying-matrices-by-matrices/v/matrix-multiplication-intro

 

Interested in a career in Data Analyst?

To learn more about Data Analyst with Advanced excel course – Enrol Now.
To learn more about Data Analyst with R Course – Enrol Now.
To learn more about Big Data Course – Enrol Now.

To learn more about Machine Learning Using Python and Spark – Enrol Now.
To learn more about Data Analyst with SAS Course – Enrol Now.
To learn more about Data Analyst with Apache Spark Course – Enrol Now.
To learn more about Data Analyst with Market Risk Analytics and Modelling Course – Enrol Now.

Call us to know more