Today we will continue from our previous session of Machine Learning.
So, let's see how we can practically work with probability distributions in Python.
For that, we will take the help of some random numbers, through which we will understand these distributions.
So, let's begin.
Firstly, we will start with normal distribution.
For it let us import our packages.
So, import scipy dot stats,
and also import numpy as np
Third, import matplotlib dot pyplot as plt.
Thereafter, we will start our programme.
Firstly, for np dot random dot seed we will put the value 1234, chosen randomly.
Now, we will generate our sample data: samples is equal to np dot random dot lognormal. Here we will generate data that will help us see a normal-shaped distribution.
Here, mean we will keep as 1.0, sigma as 0.4, and size as 10,000 records.
Now, we will write shape, loc and scale is equal to scipy dot stats dot lognorm dot fit. Here we pass samples (the data that we had generated). The second property we have is floc, which holds the location parameter fixed to a specific value; we will keep it as zero, so floc is equal to zero.
Here, we will identify the number of bins, so the number of bins we will take is 50.
So, the number of bins became 50.
You can give colors if you want, I am not giving it here.
Next, we will initialise the parameters: counts, edges, patches is equal to plt dot hist. With this we will plot the histogram; here we pass samples and bins, and we will also define a color, so for now we will keep it red.
Ok! So it's showing bins not defined.
So, here we have num bins, so we will put bins equal to num bins.
So, here you can see this distribution, which is a normal distribution, though we can see some skewness over here, but it is a normal distribution.
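Putting the steps above together, here is a minimal sketch of the session's normal-distribution example, with the same seed and parameters as dictated:

```python
import numpy as np
import scipy.stats
import matplotlib.pyplot as plt

np.random.seed(1234)

# Generate 10,000 log-normally distributed samples
samples = np.random.lognormal(mean=1.0, sigma=0.4, size=10000)

# Fit a log-normal distribution to the samples,
# holding the location parameter fixed at zero (floc=0)
shape, loc, scale = scipy.stats.lognorm.fit(samples, floc=0)

# Plot a histogram with 50 bins
num_bins = 50
counts, edges, patches = plt.hist(samples, bins=num_bins, color='r')
plt.show()
```

The fitted shape parameter corresponds to sigma (about 0.4 here), and scale corresponds to exp(mean).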
Now, we will move ahead and see the binomial distribution.
So, from 'bi' we can understand that there will be only two outcomes, like success or failure, or day and night.
So, for binomial we have libraries already.
But I will still show you once
Here we will import seaborn; then from scipy dot stats we will import binom, and that's it.
Now, we will generate data from here only
So, data is equal to binom dot rvs; let this get executed, it's taking some time.
So, now it's done. In binom dot rvs, for n we will give 17 as the value, for p we will give 0.7, for loc we will give zero, and size we will keep as one thousand and ten.
So, this is the data that we have generated.
So, let's check if this data gets successfully generated, and yes it has done!
Now, we will use ax is equal to seaborn dot distplot. Here we will input the data that we had generated; the next parameter is the kde parameter, which we will also keep as True.
So, true should be written starting with a capital letter.
Next is color; let me give it as pink.
Next, we will customise the histogram with hist underscore kws. In it we will pass linewidth as 22, then the alpha parameter, and the value of alpha we will give as 0.77.
Let me execute this.
Ok, so it's unable to understand hist kws.
Let me cross check this; ok, so here it should have been distplot instead of dist.
So, let me correct this.
So, here you can see that we have got this distribution.
So, this is an example of binomial distribution.
Where we have only two outcomes: true or false, or success or failure.
Now, the third distribution I will show you is the Poisson distribution.
So, for that we have already imported numpy, so we don't have to do it again. And we have everything else also.
So, we need to create data; for that we use np dot random dot poisson, and here I will pass 5 and 10000 as the parameters.
So, I have passed the lambda value and the size.
Now, we will plot this and see: plt dot hist, and s is the dataset that we are passing. Along with that we have to give bins, so I am giving 16 here; normed we are keeping as True; and for color we are giving 'g' for green.
So, let's see how the plot will come.
Now, here we can see the plotting; so it is throwing an error that there is no property named normed.
So, let's cross check this; maybe the property name has been changed. Let's try to see suggestions, but it is unable to give me any suggestions right now.
So, we will plot this and see the kind of distribution it will give.
So, the Poisson distribution gives us random counts of independent events.
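A minimal sketch of the Poisson example above. Note that the normed argument was removed from plt dot hist in newer matplotlib versions; density is its replacement, which is likely why the session's call failed:

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(1234)

# Generate 10,000 Poisson-distributed samples with rate lambda = 5
s = np.random.poisson(5, 10000)

# 'density=True' normalises the histogram to a probability density
# ('normed' was the old name for this option, now removed)
plt.hist(s, bins=16, density=True, color='g')
plt.show()
```

With lambda = 5, both the mean and the variance of the samples should be close to 5.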
So, alongside the binomial distribution we have the Bernoulli distribution.
So, I will implement that also and show,
Bernoulli is a special case of binomial.
In binomial we have success or failure, or win or loss; Bernoulli is a special case of that.
In Binomial we create multiple experiments whereas in Bernoulli there is only a single experiment.
So, let's implement and see this also.
So, the Bernoulli distribution.
So we already have the packages for it.
So, just create the data over here: s is equal to np dot random dot binomial, and let's pass the parameters over here; we will pass 10, 0.5, 1000.
Now, we have got the data generated.
So, here we have got the data 6,4,5.
Now, we will do plt dot hist to put it in a histogram; here we will put the data and the bins. As normed is not working I will not put that, and directly put the color; let's put the color as brown.
So, here you can see the plotting based on the Bernoulli distribution in Python.
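A minimal sketch of the Bernoulli example. The session passes n=10 to np dot random dot binomial, which yields binomial samples (counts like 6, 4, 5); a strict Bernoulli trial is the special case n=1, so this sketch uses that, and every sample comes out as 0 or 1:

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(1234)

# A Bernoulli trial is a binomial with a single experiment (n=1),
# so each of the 1000 samples is either 0 (failure) or 1 (success)
s = np.random.binomial(1, 0.5, 1000)

plt.hist(s, color='brown')
plt.show()
```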
So, this is how we practically implemented and saw that, in python we have different packages available which we can use to generate sample data.
So, we used the scipy dot stats package of Python for these distributions: lognorm for normal sample data, binom for binomial sample data generation; we also learned about the Bernoulli distribution; and lastly, for Poisson we generated data through np dot random dot poisson.
So, friends, let's conclude today's session here.
And we will continue in the next session.
Till then keep learning and remain motivated.
If you have any questions related to this course, or any comments, then you can click on the discussion button below this video and post them there.
In this way you can discuss this course with many other learners like you.