Hello. So first, machine learning, which sits at the heart of data science and data analysis. Machine learning is how a program becomes "smart" from experience. For example, after seeing many labeled pictures of fruits and vegetables, it learns to tell them apart. There are four main types: supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning.

Supervised learning splits into classification and regression. Classification predicts a category, for example whether a fruit is a banana or a fig. Regression predicts a number, that is, continuous values. Unsupervised learning covers clustering and dimensionality reduction. Semi-supervised learning combines a small labeled set with a large unlabeled one.

Linear regression models the relationship between a dependent variable y and one or more independent variables using the line y = mx + b.

Logistic regression handles yes/no classification, for example spam or not spam. "You won a million, claim your prize now" and "Congratulations, you are a lucky winner, click here to claim your million-dollar prize today" are spam; "Weekly Newsletter: latest updates and articles for subscribers" is not. The model combines the features as beta0 + beta1*X1 + beta2*X2 + ..., where X1, X2, and so on are features such as the number of words or the number of spam keywords, and the result is squashed into a probability, so the final answer comes out as yes or no.

Decision trees split the data using information gain. Entropy of a set is minus the sum over classes of p*log2(p); say the entropy of the full data set is 0.97. After a split you take the weighted average of the child entropies, say 0.249, and information gain is the parent entropy minus that weighted average. The Gini index uses the formula 1 minus the sum of the squared class probabilities p_j, giving for example 0.833. The attribute with the best score becomes the split, and you keep splitting until each leaf gives a final yes/no answer.

Next, K-nearest neighbors, which classifies a new point by looking at its k closest neighbors. For a new point C, compute the distance to every data point: say 5 to data point 1, 4.47 to data point 2, and so on for data points 3, 4, and 5. Sort the distances, take the k nearest, and let them vote.

Finally, underfitting and overfitting in machine learning. Hold out part of the data, say 25%, for testing. A model that is too simple misses the true relationship in the data; that is underfitting. A model that memorizes the training data, say predicting exam score as an exact function of hours studied down to the noise, is overfitting: it looks perfect on the training data but fails on new data.
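The entropy and Gini formulas above can be sketched in a few lines of Python. The 2-yes / 3-no class counts below are hypothetical, chosen only because they reproduce an entropy close to the lecture's 0.97:

```python
import math

def entropy(probs):
    """Entropy = -sum(p * log2(p)) over the class probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gini(probs):
    """Gini index = 1 - sum(p_j ** 2) over the class probabilities."""
    return 1 - sum(p * p for p in probs)

# A node holding 2 "yes" and 3 "no" examples (hypothetical counts).
probs = [2 / 5, 3 / 5]
print(round(entropy(probs), 3))  # 0.971
print(round(gini(probs), 3))     # 0.48
```

Information gain would then be the parent's entropy minus the weighted average of the children's entropies computed the same way.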
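The K-nearest-neighbors vote can likewise be sketched. The 2D training points and labels here are made up for illustration; only the distance-sort-vote procedure comes from the lecture:

```python
import math
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest neighbors.
    `train` is a list of ((x, y), label) pairs."""
    dists = sorted((math.dist(point, query), label) for point, label in train)
    top_k = [label for _, label in dists[:k]]
    return Counter(top_k).most_common(1)[0][0]

# Hypothetical 2D points: three of class "A" near the origin, two of "B" far away.
train = [((0, 0), "A"), ((1, 1), "A"), ((1, 0), "A"),
         ((6, 6), "B"), ((7, 7), "B")]
print(knn_predict(train, (0.5, 0.5), 3))  # A
```

With k = 3 the three nearest neighbors of (0.5, 0.5) are all class "A", so the vote is unanimous.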
The cure for overfitting is to capture the true relationship, not the noise: use a validation set and regularization.

Next, ensemble learning. Bagging uses the bootstrap technique, which means sampling with replacement. One bootstrap sample might be customers 3, 5, 2, 4, 1; another might contain customer 1 twice and leave customer 6 out entirely. Suppose the salaries are $62,000, $26,000, $22,000, and so on: decision tree 1 gives one prediction, decision tree 2 another, decision tree 3 another, and the final prediction is the average over the decision trees. Bagged decision trees with random feature selection give you a random forest.

Next, support vector machines. An SVM finds the hyperplane that separates the classes with the maximum margin; the data points closest to the hyperplane are the support vectors. When the data is not linearly separable, you add features. For example, mammals can't fly but birds can fly, so with feature one = has fur, feature two = can fly, feature three = lays eggs, a mammal might be encoded [1, 0, 0] and a bird [0, 1, 1]. Mapping x and y into a 3D space and separating there, via a kernel, converts the nonlinear problem into a linear one; that is nonlinear SVM.

Next, Naive Bayes, used for text classification and spam filtering. Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B). Here P(A|B) is the posterior probability, P(B|A) the likelihood, P(A) the prior probability, and P(B) the marginal probability, also called the evidence.

Fruit example: the features are color (red or yellow), size (small or large), and taste (sweet or sour), and the classes are apple, orange, and banana. From the table you count the priors, say P(apple) = 0.3, and the conditionals: P(red|apple), P(red|orange), P(red|banana), P(small|orange), P(small|banana), P(sweet|orange), P(sweet|banana), and so on. To classify a fruit that is red in color, large in size, and sweet in taste, apply P(A|B) = P(A) * P(B|A) / P(B): the numerator for apple is P(apple) * P(red|apple) * P(large|apple) * P(sweet|apple), say 0.12. Compute the same for orange and banana and pick the class with the highest probability. The same P(A|B) = P(A) * P(B|A) / P(B) formula also powers recommendation systems.

Hello again; now the coding part. In plain Python you sum an array with a loop, like for i in array: s = s + i. For multidimensional arrays that does not scale, which is where NumPy comes in.
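The bagging procedure just described can be sketched in miniature. Each "tree" below is replaced by a simple sample mean so the bootstrap-then-average structure stays visible, and the salary list is partly hypothetical:

```python
import random

random.seed(0)
salaries = [62000, 26000, 22000, 45000, 51000]  # partly hypothetical customers

def bootstrap_sample(data):
    """Draw len(data) items WITH replacement, as bagging does."""
    return [random.choice(data) for _ in data]

# Stand-in for three fitted decision trees: each predicts its sample's mean.
tree_predictions = [sum(bootstrap_sample(salaries)) / len(salaries)
                    for _ in range(3)]
bagged_prediction = sum(tree_predictions) / len(tree_predictions)
print(round(bagged_prediction, 1))
```

A real random forest fits an actual decision tree on each bootstrap sample and also randomizes which features each tree may split on.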
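The fruit classification reduces to "posterior is proportional to prior times likelihoods". All of the probabilities below are hypothetical stand-ins for the lecture's counted table:

```python
# Hypothetical priors and per-class likelihoods P(feature | class).
priors = {"apple": 0.3, "orange": 0.3, "banana": 0.4}
likelihoods = {
    "apple":  {"red": 0.8, "large": 0.5, "sweet": 0.9},
    "orange": {"red": 0.1, "large": 0.6, "sweet": 0.7},
    "banana": {"red": 0.0, "large": 0.7, "sweet": 0.8},
}

def naive_bayes_score(cls, features):
    """Unnormalized posterior: prior * product of feature likelihoods."""
    score = priors[cls]
    for f in features:
        score *= likelihoods[cls][f]
    return score

features = ["red", "large", "sweet"]
scores = {c: naive_bayes_score(c, features) for c in priors}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # apple 0.108
```

Dividing each score by their sum would give proper posteriors, but the marginal P(B) is the same for every class, so the argmax is unchanged.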
So, NumPy first. A single-dimension array is np.array([1, 2, 3]); a 2D one is data_2d = np.array([[1, 2, 3], [4, 5, 6]]), whose shape is (2, 3). For indexing, data[0] gives the first number of a 1D array; data_2d[0, 1] gives the element at row 0, column 1; print(data_2d[:, 0]) prints the first column; and data_2d[0] is the first row, 1, 2, 3.

Element-wise math: first create a = np.array([1, 2, 3]), next b = np.array([4, 5, 6]). Then a + b is [5, 7, 9] (1 + 4, 2 + 5, 3 + 6) and a * b is [4, 10, 18] (1 * 4 = 4, 2 * 5 = 10, 3 * 6 = 18). np.log and np.exp apply to every value, and adding a scalar broadcasts: 10 + [1, 2] gives [11, 12]. With data = np.array([[1, 2, 3], [4, 5, 6]]), total_sum = np.sum(data) is straightforward: 1 + 2 + 3 + 4 + 5 + 6 = 21. The column sums are 1 + 4, 2 + 5, 3 + 6, that is [5, 7, 9], and the row means come from np.mean(data, axis=1): first row (1 + 2 + 3) / 3 = 2.

Now, in parts 11 and 12, pandas, the other number-one important library for machine learning, data analysis, and data science. Install it with pip if needed, then import pandas as pd. A Series is one-dimensional: data = pd.Series([10, 20, 30, 40, 50]) gets a default 0 to 4 index, or pass a custom index, index=['a', 'b', 'c', 'd', 'e'], no problem.

A DataFrame is two-dimensional. First data = {'name': ['A', 'B', 'C', 'D'], 'age': [20, 21, 22, 23]}, then df = pd.DataFrame(data). df['name'] with brackets selects a column; df.loc looks rows up by index label and df.iloc by position. pd.read_csv reads a CSV into rows and columns. df.head() shows the first five rows of the frame. df.describe() gives the summary statistics: count, mean, std, min, max. df.shape gives (rows, columns), for example 5 rows and some number of columns, and df.columns lists the column names. df.isnull() flags missing values; df.dropna() drops those rows; df.fillna(20, inplace=True) fills them with something like 20. Better, fill age with its mean: df['age'].fillna(df['age'].mean(), inplace=True), so a missing age becomes the full-column mean value, say 38. The mode works the same way. df.drop_duplicates() is very simple, and df.rename(columns={'name': 'full_name'}, inplace=True) renames a column; if you see "object is not callable", check that you passed the columns= dictionary correctly.

Grouping: df.groupby('department')['age'].mean() averages within each group, printing something like 31.3 for one department and 42 for another. Correlation: df.corr() shows the relationships, and it appears the correlation between age and salary is 0.9, a good correlation. Roughly, 0.6 to 0.9 counts as a good positive correlation and -0.6 to -0.9 as a good negative one. To actually see the relationship between x and y, plot it with matplotlib.
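The NumPy operations just walked through, runnable as-is:

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
print(a + b)                  # [5 7 9]
print(a * b)                  # [ 4 10 18]
print(10 + np.array([1, 2]))  # [11 12]  scalar broadcasting

data_2d = np.array([[1, 2, 3], [4, 5, 6]])
print(data_2d.shape)             # (2, 3)
print(data_2d[0, 1])             # 2
print(np.sum(data_2d))           # 21
print(np.sum(data_2d, axis=0))   # [5 7 9]  column sums
print(np.mean(data_2d, axis=1))  # [2. 5.]  row means
```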
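And a small pandas sketch of the fillna and groupby steps; the names, departments, and ages below are invented for the demonstration:

```python
import pandas as pd

df = pd.DataFrame({
    "name": ["A", "B", "C", "D"],
    "dept": ["sales", "sales", "hr", "hr"],
    "age":  [20, None, 22, 23],
})

# Fill the missing age with the column mean (mean of 20, 22, 23).
df["age"] = df["age"].fillna(df["age"].mean())
print(df["age"].tolist())

# Average age within each department.
print(df.groupby("dept")["age"].mean())
```

Assigning the filled column back (rather than using inplace=True on the column) avoids the chained-assignment pitfalls in recent pandas versions.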
Import matplotlib.pyplot as plt as usual, then df.plot(kind='scatter', x='age', y='salary') gives a scatter plot of the data, with age on the x axis and salary on the y axis and the axis labels "age" and "salary". And finally, import all the libraries under their usual short forms: import numpy as np, import pandas as pd, import matplotlib.pyplot as plt; the csv module is just import csv, no short form needed.

Next, feature selection: finding which independent features actually matter for the dependent feature, on the same machine learning data. The filter method uses correlation. So first, with loan approval as the dependent feature, take the data frame's correlation with loan_approval, say 0.47 for one feature and 0.30 for another, take the absolute values of the correlations, and sort with ascending=False; the features with correlations 0.47 and 0.30 rank at the top, and the weakly related ones fall to the bottom.

With scikit-learn: load a dataset from sklearn.datasets, import the tools from sklearn.feature_selection, and build a DataFrame from data.data with data.feature_names as the columns and data.target as y. Then use SelectKBest with the f_classif scoring function and k=2, fit and transform X with respect to y, and use get_support(indices=True) to recover the names: selected_features = df.columns[selector.get_support(indices=True)]. Next, a different approach with the same goal: recursive feature elimination (RFE), which repeatedly fits a model and eliminates the weakest feature.

Next, feature encoding, because machine learning models like decision trees need numbers, not categorical values like dog or cat, yes or no, or different flavor names. Label encoding maps each category to an integer. For sizes small, medium, large, or clothing like shirt, jeans, dress, t-shirt, or colors like red, white, black: encoded = encoder.fit_transform(values), so for example medium becomes 1 and small becomes 1 less or more depending on its position; encoded_colors = encoder.fit_transform(colors) works the same way. One-hot encoding instead makes one binary column per category: for red, yellow, orange, purple, build df = pd.DataFrame(encoded, columns=encoder.get_feature_names_out()), one 0/1 column per category. For a yes/no column like "portable", the encoder turns portable or not into 1 or 0 directly. So the main schemes are label encoding, one-hot encoding, and binary encoding.

Next, imbalanced data. Say the training set has 800 legitimate transactions and only a handful of fraudulent ones. Undersampling downsamples the majority class, say from 800 down to 600; oversampling duplicates or synthesizes minority examples until the classes balance.

Then the train-test split: first train, then test, with splits like 80/20, 75/25, 67/33, or
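Label encoding and one-hot encoding can be shown without scikit-learn at all. This plain-Python sketch uses alphabetical category order, which happens to match what sklearn's LabelEncoder does:

```python
sizes = ["small", "medium", "large", "medium", "small"]

# Label encoding: map each category to an integer, in alphabetical order.
categories = sorted(set(sizes))             # ['large', 'medium', 'small']
label_map = {c: i for i, c in enumerate(categories)}
encoded = [label_map[s] for s in sizes]
print(encoded)     # [2, 1, 0, 1, 2]

# One-hot encoding: one binary column per category.
one_hot = [[1 if s == c else 0 for c in categories] for s in sizes]
print(one_hot[0])  # [0, 0, 1]  -> "small"
```

Label encoding quietly imposes an order (large < medium < small here), which is fine for ordinal data but misleading for colors; one-hot encoding avoids that at the cost of more columns.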
even 50/50. Using pandas as pd, next import the splitter: from sklearn.model_selection import train_test_split. Build df = pd.DataFrame(data) with the feature columns and a target column, X = df[['feature1', 'feature2']] and y = df['target'], and then X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42). A test_size of 0.25 or 0.2 both work; random_state=42 fixes the random shuffle so X_train comes out the same every run.

Next in machine learning, feature scaling of the independent variables, features like sugar, flour, and chocolate chips that live on different scales. Min-max scaling, also called normalization, rescales each feature by its minimum and maximum: X_scaled = (X - X_min) / (X_max - X_min), where X_min is the feature's minimum value and X_max its maximum. Say sugar ranges from a minimum of 50 to a maximum of 90: for the value 50, (50 - 50) / (90 - 50) = 0. Say flour ranges from a minimum of 30 to a maximum of 100: for the value 80, (80 - 30) / (100 - 30) is about 0.71 (the figure quoted in the lecture was 0.625). Either way, every scaled value lands between 0 and 1.

Next, standardization, or Z-score scaling. Standardization is like making a fair game where all scores are equal on average and spread out the same, say centered on an average of 50 with a spread of 10 points. The formula: Z-score = (current value - mean) / standard deviation. Straightforward: (40 - 50) / 10 = -1, so the standardized value is -1. And (75 - 60) / 15, with 60 the average and 15 the standard deviation, equals 1. So the two feature scaling types are min-max scaling (normalization) and Z-score scaling (standardization).

Even in machine learning, outliers are important. First, spot them with simple data visualization; and remember that linear regression is sensitive to outliers and tends to chase them. For example, take prices 250, 275, 280, 300, 320, 325, 330, 350, 360, and then 10,000: that 10,000 is the outlier. With a Z-score threshold, outliers = np.where(np.abs(zscore(data)) > threshold) over the length of the data, then drop those indices; about 3 standard deviations is the usual cutoff, though some use a different threshold or a percentile rule.

Now evaluation metrics, starting with accuracy: the number of correct predictions divided by the total number of predictions. First count the correct predictions: if the model got 18 of one class and 15 of the other right, accuracy is (18 + 15) divided by the total. The important building blocks are true positives, true negatives, false positives, and false negatives. A true positive is a positive case predicted positive; a true negative is a negative case predicted negative; a false positive is a negative case predicted positive; and
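The two scaling formulas, with the lecture's worked numbers plugged in. Note that the 80-within-[30, 100] case comes out to about 0.714 by the formula, not the 0.625 quoted in the lecture:

```python
def min_max_scale(x, x_min, x_max):
    """Normalization: rescale x into [0, 1] using the feature's min and max."""
    return (x - x_min) / (x_max - x_min)

def z_score(x, mean, std):
    """Standardization: how many standard deviations x sits from the mean."""
    return (x - mean) / std

print(min_max_scale(50, 50, 90))             # 0.0
print(round(min_max_scale(80, 30, 100), 3))  # 0.714
print(z_score(40, 50, 10))                   # -1.0
print(z_score(75, 60, 15))                   # 1.0
```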
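A Z-score outlier check on the lecture's price list. I have padded in a few extra in-range values (290 through 315, my own addition), because with only ten points the population Z-score mathematically cannot exceed 3 and the cutoff would never fire:

```python
import numpy as np

data = np.array([250, 275, 280, 290, 295, 300, 305, 310, 315,
                 320, 325, 330, 350, 360, 10000])
z = (data - data.mean()) / data.std()   # Z-score of every value
outliers = data[np.abs(z) > 3]          # usual cutoff: 3 standard deviations
print(outliers)  # [10000]
```

Dropping those indices before fitting keeps the regression line from being dragged toward the 10,000.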
a false negative is a positive case predicted negative. Precision and recall are built from these, and you want both to be high. Precision = true positives / (true positives + false positives), so 7 / (7 + 3) = 0.7, that is 70%. Recall = true positives / (true positives + false negatives), so with 7 true positives it is 7 divided by 7 plus the false negatives. Next, the F1 score: F1 = 2 * precision * recall / (precision + recall). For example, with 90 true positives and 10 false positives, precision = 90 / (90 + 10) = 0.9; recall divides instead by true positives plus false negatives, the positives of the negative-predicted class the model missed. Those are the classification metrics, for yes/no, discrete values.

For regression, continuous values, first the mean absolute error, MAE: the average absolute difference between predicted and actual values in a data set. Say the actual price is 3 lakhs (300,000) and the predicted value is 280,000: 300,000 minus 280,000 equals 20,000, the first absolute difference. With five absolute differences of 20,000, 10,000, 20,000, 10,000, and again 20,000, the sum is 80,000, and 1/5 of that, 80,000 / 5, which in turn is equal to 16,000, is the mean absolute error. Next, mean squared error, MSE: the average of (actual value - predicted value) squared, same idea with actual sales versus predicted sales, but the differences are squared. Then R-squared, the coefficient of determination: R-squared = 1 - (unexplained variation / total variation); an R-squared around 0.7 or above is usually considered good. So that covers the regression metrics, alongside the classification metrics and the confusion matrix.

Next, K-means clustering; K-means means grouping unlabeled data into k groups. The steps: first, choose the number of clusters k; place the k centroids; then repeat the assignment and improvement, assigning each point to its nearest centroid and moving each centroid to the mean of its points, until the clusters stop changing. The elbow method helps choose k.

Next, dimensionality reduction. Dimensions are the features, attributes, or columns of the data; the dimension of the data is how many features it has. The goal is to reduce the number of dimensions or features while at the same time keeping most of the important information. There are two routes: feature selection, covered earlier, and next, feature extraction. PCA, principal component analysis, is feature extraction: with the help of an orthogonal transformation it converts observations of correlated features into a set of linearly uncorrelated principal components, so the components are transformations of the original data. So for principal component analysis the steps are: first, standardize the data so each feature has mean 0 and standard deviation 1;
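The precision, recall, and F1 formulas with the lecture's 7-and-3 counts plugged in; the 3 false negatives are my assumption, added only so recall is computable:

```python
def precision(tp, fp):
    """Of everything predicted positive, how much really was positive."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Of everything really positive, how much the model caught."""
    return tp / (tp + fn)

def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

p = precision(7, 3)   # 7 TP, 3 FP  -> 0.7
r = recall(7, 3)      # 3 FN assumed -> 0.7
print(p, r, round(f1(p, r), 2))
```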
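MAE and MSE over five hypothetical actual/predicted pairs whose absolute differences match the lecture's 20,000 / 10,000 pattern:

```python
actual    = [300_000, 250_000, 400_000, 350_000, 500_000]  # hypothetical prices
predicted = [280_000, 260_000, 380_000, 360_000, 480_000]
# absolute differences: 20,000  10,000  20,000  10,000  20,000

errors = [a - p for a, p in zip(actual, predicted)]
mae = sum(abs(e) for e in errors) / len(errors)   # mean absolute error
mse = sum(e ** 2 for e in errors) / len(errors)   # mean squared error
print(mae)  # 16000.0
print(mse)  # 280000000.0
```

MSE punishes the 20,000 misses four times as hard as the 10,000 ones, which is exactly why it is more sensitive to outliers than MAE.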
then compute the covariance matrix; find its eigenvalues and eigenvectors, where the eigenvectors give the principal components and the eigenvalues say how important each component is; and keep the components with the highest explained variance, say enough components to explain 0.9, that is 90%, of the variance. That is principal component analysis.
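The full PCA recipe in NumPy, run on synthetic correlated data; the two random features here are invented purely to demonstrate the steps:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=100)
# Second feature is strongly correlated with the first (plus small noise).
X = np.column_stack([x, 2 * x + rng.normal(scale=0.3, size=100)])

# Step 1: standardize (mean 0, standard deviation 1 per feature).
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
# Step 2: covariance matrix.
cov = np.cov(Xs, rowvar=False)
# Step 3: eigenvalues and eigenvectors (eigh returns them in ascending order).
eigvals, eigvecs = np.linalg.eigh(cov)
# Step 4: explained variance ratio, largest component first.
ratio = eigvals[::-1] / eigvals.sum()
print(ratio)  # first component carries nearly all the variance here
```

Because the two features are almost perfectly correlated, one principal component explains well over 90% of the variance, so the data could be reduced from two dimensions to one with little information lost.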