Consider the 'Advertising' dataset, which consists of columns such as 'TV', 'Radio', 'Newspaper', and 'Sales'. You are required to determine the relationship between the 'Sales' column and the other independent variables.
Download the dataset from this link
You are required to use gradient descent and sklearn's linear regression to answer the questions below.
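For context, the model being fitted is an ordinary multiple linear regression of the form (the coefficient symbols here are illustrative, not taken from the exercise):

Sales = b0 + b1*TV + b2*Radio + b3*Newspaper + error

where b0 is the intercept and b1, b2, b3 are the coefficients estimated both by gradient descent and by sklearn.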
Note:
- The data doesn't require any cleaning
- You need to normalise the data before building the gradient descent model (see the standardisation formula below)
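For reference, the normalisation used in the code below is the standard z-score transform, applied column by column (a general formula, not something specific to this dataset):

z = (x - mean(x)) / std(x)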
Q1: Using gradient descent, what is the cost at the 0th iteration?
- 0.55
- 0.48
- 0.32
Q2: What is the cost at the 999th iteration?
- 0.6
- 0.05
- 0.45
Q3: After plotting the cost against the number of iterations, at roughly what point does the cost curve flatten out?
- >200 iterations
- <200 iterations
Q4: Is there any difference between the coefficients obtained from the from-scratch gradient descent and the coefficients obtained from sklearn's linear regression?
- Yes
- No
Code
import pandas as pd
ad = pd.read_csv('advertising.csv')
ad.head()
# Standardise every column (z-score): subtract the mean and divide by the standard deviation
ad = (ad - ad.mean())/ad.std()
ad.head()
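# Optional sanity check (not required by the exercise): after z-score
# standardisation every column should have mean ~ 0 and standard deviation ~ 1.
print(ad.mean().round(3))
print(ad.std().round(3))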
# Putting feature variable to X
X = ad[['TV','Radio','Newspaper']]
# Putting response variable to y
y = ad['Sales']
X = X.copy()  # work on a copy to avoid a SettingWithCopyWarning when adding the intercept column
X['intercept'] = 1
X = X.reindex(columns=['intercept','TV','Radio','Newspaper'])
X.head()
import numpy as np
X = np.array(X)
y = np.array(y)
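# Optional sanity check (not part of the original exercise): X should now have
# shape (n_samples, 4) -- the intercept column plus the three features -- and
# y should have shape (n_samples,).
print(X.shape, y.shape)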
# Theta must have one entry per column of X (intercept plus features), so four here.
theta = np.zeros(X.shape[1])
alpha = 0.01
iterations = 1000
def compute_cost(X, y, theta):
    # Mean squared error cost: J(theta) = (1 / 2m) * sum((X @ theta - y)^2)
    return np.sum(np.square(np.matmul(X, theta) - y)) / (2 * len(y))

def gradient_descent_multi(X, y, theta, alpha, iterations):
    theta = np.zeros(X.shape[1])  # start from all-zero coefficients (re-initialises the theta argument)
    m = len(X)
    gdm_df = pd.DataFrame(columns=['beta', 'cost'])
    for i in range(iterations):
        # Gradient of the cost: (1/m) * X^T (X theta - y)
        gradient = (1/m) * np.matmul(X.T, np.matmul(X, theta) - y)
        theta = theta - alpha * gradient
        cost = compute_cost(X, y, theta)
        gdm_df.loc[i] = [theta, cost]
    return gdm_df
results = gradient_descent_multi(X, y, theta, alpha, iterations)
results['cost'] = results['cost'].astype(float)  # ensure a numeric dtype for plotting
# Coefficients and cost at the final (999th) iteration
print(results.values[999])
# Plot the cost against the iteration number (Q3)
results.reset_index().plot.line(x='index', y=['cost'])
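# Optional, rough check for Q3 (assumes the gradient-descent output is stored
# in `results` as above; the 1e-4 threshold is an arbitrary illustrative choice,
# not part of the exercise): print the first iteration at which the cost drops
# by less than 1e-4 compared with the previous iteration.
print((results['cost'].diff().abs() < 1e-4).idxmax())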
# import LinearRegression from sklearn
from sklearn.linear_model import LinearRegression
# Representing LinearRegression as lr(Creating LinearRegression Object)
lr = LinearRegression()
# fit() updates the lr object in place, so there is no need to store its return value.
lr.fit(X, y)
print(lr.intercept_)
print(lr.coef_)
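# Optional check for Q4: compare the final gradient-descent coefficients with
# sklearn's estimates (assumes the gradient-descent output is stored in
# `results` with a 'beta' column, as above). Note that X already contains the
# constant 'intercept' column, so lr.coef_ has four entries; with the default
# fit_intercept=True, the entry for that constant column should be close to 0.
print("gradient descent:", results['beta'].iloc[-1])
print("sklearn:", lr.intercept_, lr.coef_)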