Exam Professional Machine Learning Engineer topic 1 question 195 discussion - ExamTopics


AI Summary

Problem

A retail company needs a model to predict daily customer product purchases, using data including customer ID, product ID, date, days since last purchase, average purchase frequency, and purchase (binary). The challenge is interpreting individual model predictions.

Options

  • A: Build a boosted tree classifier in BigQuery ML and inspect partition rules.
  • B: Use Vertex AI AutoML to train a model, deploy it, enable feature attributions, and use the 'explain' method for interpretation.
  • C: Build a logistic regression model in BigQuery ML and interpret feature importance from coefficient values.
  • D: Train an AutoML model in Vertex AI, deploy it, and use L1 regularization for feature selection.

Solution

The suggested answer is B. Vertex AI feature attributions (Explainable AI) return a per-feature attribution value with each prediction via the "explain" method, which directly explains individual predictions. By contrast, tree partition rules (A) and logistic regression coefficients (C) describe the model globally rather than any single prediction, and L1 regularization (D) is a training-time feature-selection technique, not a per-prediction explanation method.
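As a sketch of how option B might look in practice (the endpoint path and the attribution values below are hypothetical, not real model output): calling explain on a deployed Vertex AI endpoint returns per-feature attribution values for each instance, and a small helper can rank them to surface the features that drove a given prediction.

```python
# Sketch of option B. The commented-out section assumes the
# google-cloud-aiplatform SDK and a hypothetical deployed endpoint;
# the attribution dict at the bottom is illustrative sample data.
#
# from google.cloud import aiplatform
# endpoint = aiplatform.Endpoint(
#     "projects/PROJECT_ID/locations/us-central1/endpoints/ENDPOINT_ID")
# response = endpoint.explain(instances=[{
#     "Days_since_last_purchase": 12,
#     "Average_purchase_frequency": 0.1,
# }])
# attributions = response.explanations[0].attributions[0].feature_attributions

def top_attributions(feature_attributions, k=3):
    """Rank features by absolute attribution for one prediction."""
    return sorted(feature_attributions.items(),
                  key=lambda kv: abs(kv[1]), reverse=True)[:k]

# Illustrative attribution values for a single prediction:
sample = {
    "Days_since_last_purchase": -0.31,
    "Average_purchase_frequency": 0.22,
    "Product_id": 0.05,
}
print(top_attributions(sample, k=2))
# [('Days_since_last_purchase', -0.31), ('Average_purchase_frequency', 0.22)]
```

The sign of each attribution indicates whether the feature pushed the prediction toward or away from the purchase class for that particular row.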


You work for a retail company. You have been asked to develop a model to predict whether a customer will purchase a product on a given day. Your team has processed the company's sales data and created a table with the following columns:

  • Customer_id
  • Product_id
  • Date
  • Days_since_last_purchase (measured in days)
  • Average_purchase_frequency (measured in 1/days)
  • Purchase (binary class: whether the customer purchased the product on the Date)

You need to interpret your model’s results for each individual prediction. What should you do?

  • A. Create a BigQuery table. Use BigQuery ML to build a boosted tree classifier. Inspect the partition rules of the trees to understand how each prediction flows through the trees.
  • B. Create a Vertex AI tabular dataset. Train an AutoML model to predict customer purchases. Deploy the model to a Vertex AI endpoint and enable feature attributions. Use the “explain” method to get feature attribution values for each individual prediction.
  • C. Create a BigQuery table. Use BigQuery ML to build a logistic regression classification model. Use the values of the coefficients of the model to interpret the feature importance, with higher values corresponding to more importance.
  • D. Create a Vertex AI tabular dataset. Train an AutoML model to predict customer purchases. Deploy the model to a Vertex AI endpoint. At each prediction, enable L1 regularization to detect non-informative features.
Suggested Answer: B 🗳️
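For contrast, a minimal sketch of why option C's coefficient-based interpretation is global rather than per-prediction (the weights and instances below are made up, not a trained model): a logistic regression has one coefficient vector shared by every row, so a coefficient only says which feature matters on average, while the contribution to an individual prediction (coefficient times that row's value) varies from row to row.

```python
import math

# Hypothetical logistic regression weights (made-up values for illustration).
coefs = {"Days_since_last_purchase": -0.8, "Average_purchase_frequency": 2.0}
intercept = -0.5

def predict_proba(x):
    """Logistic regression purchase probability for one instance."""
    z = intercept + sum(coefs[f] * v for f, v in x.items())
    return 1.0 / (1.0 + math.exp(-z))

def per_instance_contributions(x):
    """coef * value per feature: varies per row even though coefs are global."""
    return {f: coefs[f] * v for f, v in x.items()}

a = {"Days_since_last_purchase": 1, "Average_purchase_frequency": 0.5}
b = {"Days_since_last_purchase": 30, "Average_purchase_frequency": 0.5}
print(per_instance_contributions(a))  # Days_since_last_purchase contributes -0.8
print(per_instance_contributions(b))  # same coefficient, contribution is -24.0
```

The coefficient for Days_since_last_purchase is identical in both rows, yet its effect on the two predictions differs by a factor of 30, which is why per-prediction feature attributions (option B) are needed to interpret individual results.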
