Gradient Descent with Python

Introduction

In the previous two articles, we talked about linear and logistic regression, which between them cover most linear models. And that would be it, if only these models scaled well to data with a large number of features, that is x1, x2, ..., xn-1, xn. Let's see how our classification model deals with the Olivetti faces dataset:

from sklearn import datasets, metrics, linear_model, cross_validation
import matplotlib.pyplot as plt
import time

# Load the Olivetti faces dataset
faces = datasets.fetch_olivetti_faces()

start = time.time()

X = faces.data
Y = faces.target
x_train, x_test, y_train, y_test = cross_validation.train_test_split(X, Y)

# Create a classifier: logistic regression
model = linear_model.LogisticRegression(C=10000)

# We learn the faces of the train set
model.fit(x_train, y_train)

# Now predict the person on the test set
expected = y_test
predicted = model.predict(x_test)

end = time.time()
print("%.2f seconds" % (end - start))  # 8.85 seconds
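
Since the slow fit above is the motivation for gradient descent, here is a minimal sketch of the idea for binary logistic regression in plain NumPy. The names (sigmoid, gradient_descent), the learning rate, the iteration count, and the tiny example data are illustrative choices, not code or data from this article:

import numpy as np

def sigmoid(z):
    # Logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, learning_rate=0.1, n_iter=1000):
    # Fit binary logistic regression weights with batch gradient descent
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = sigmoid(X @ w)                 # predicted probabilities
        grad = X.T @ (p - y) / len(y)      # gradient of the average log-loss
        w -= learning_rate * grad          # step against the gradient
    return w

# Tiny illustrative usage (hypothetical data)
X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = gradient_descent(X, y)

The sketch uses the whole data matrix X on every step (batch gradient descent); stochastic variants apply the same update one sample or one mini-batch at a time, which is what makes the approach attractive when the number of features and samples grows.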