Question

A question about the tutorial How To Build a Deep Learning Model to Predict Employee Retention Using Keras and TensorFlow

First, thanks for the tutorial and the explanation in How To Build a Deep Learning Model to Predict Employee Retention Using Keras and TensorFlow. My question relates to the coding part, which I was following along with. At the hyperparameter tuning step, fitting the model with grid_search = grid_search.fit(X_train, y_train) throws the error TypeError: make_classifier() missing 1 required positional argument: 'optimizer'. Could you please guide me on this error?

Ref: https://www.digitalocean.com/community/tutorials/how-to-build-a-deep-learning-model-to-predict-employee-retention-using-keras-and-tensorflow


Answer

Hi @happydarkbluewalrus,

Looking at the tutorial, in Step 8 you create the following function:

```python
def make_classifier():
    classifier = Sequential()
    classifier.add(Dense(9, kernel_initializer="uniform", activation="relu", input_dim=18))
    classifier.add(Dense(1, kernel_initializer="uniform", activation="sigmoid"))
    classifier.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return classifier
```

Later, in Step 10 — Hyperparameter Tuning, you need to add this code to your notebook to modify the make_classifier function so you can test out different optimizer functions:

```python
from sklearn.model_selection import GridSearchCV

def make_classifier(optimizer):
    classifier = Sequential()
    classifier.add(Dense(9, kernel_initializer="uniform", activation="relu", input_dim=18))
    classifier.add(Dense(1, kernel_initializer="uniform", activation="sigmoid"))
    classifier.compile(optimizer=optimizer, loss="binary_crossentropy", metrics=["accuracy"])
    return classifier
```

Notice the difference: make_classifier now takes an optimizer parameter instead of hard-coding "adam". A bit further down the article, you'll see the following parameters configured:

```python
params = {
    'batch_size': [20, 35],
    'epochs': [2, 3],
    'optimizer': ['adam', 'rmsprop']
}
```

This params dictionary is where make_classifier actually receives its optimizer argument: GridSearchCV forwards each value from the 'optimizer' list through the KerasClassifier wrapper into the function. The TypeError you're seeing means make_classifier is being called without an optimizer, which typically happens when the cell redefining the function in Step 10 hasn't been re-run, or when 'optimizer' is missing from the params dictionary you pass to GridSearchCV. Re-run the cells in order and double-check the params before calling grid_search.fit.
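
For reference, here is a minimal sketch of how the pieces fit together at this step. It follows the tutorial's use of the legacy keras.wrappers.scikit_learn.KerasClassifier wrapper; the scoring and cv values below are illustrative and should be matched to your own notebook:

```python
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

# Wrap the Step 10 version of make_classifier (the one that accepts
# an optimizer argument) so scikit-learn can treat it as an estimator.
classifier = KerasClassifier(build_fn=make_classifier)

params = {
    'batch_size': [20, 35],
    'epochs': [2, 3],
    'optimizer': ['adam', 'rmsprop']
}

# GridSearchCV forwards each candidate value from params to the
# wrapper, which in turn passes 'optimizer' to make_classifier.
# If 'optimizer' is left out of params, make_classifier is called
# with no arguments and raises the TypeError from the question.
grid_search = GridSearchCV(estimator=classifier,
                           param_grid=params,
                           scoring="accuracy",  # illustrative; match your notebook
                           cv=2)                # illustrative; match your notebook
grid_search = grid_search.fit(X_train, y_train)
```

After fitting, grid_search.best_params_ will show which combination of batch size, epochs, and optimizer scored best.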