Exam Professional Machine Learning Engineer topic 1 question 255 discussion - ExamTopics


AI Summary

Problem:

Productionize a TensorFlow classification model trained on tabular data, with a Dataflow pipeline that transforms terabytes of data into TFRecords, and have predictions automatically uploaded to a BigQuery table on a weekly schedule.

Options:

  • A: Use Vertex AI for model deployment and Vertex AI Pipelines with DataflowPythonJobOp and ModelBatchPredictOp.
  • B: Use Vertex AI for deployment; create a Dataflow pipeline to send requests to the endpoint and upload predictions to BigQuery.
  • C: Import the model to Vertex AI, then use Vertex AI Pipelines with DataflowPythonJobOp and ModelBatchPredictOp.
  • D: Import the model to BigQuery, use SQL for data processing, and utilize Vertex AI Pipelines with BigqueryQueryJobOp and BigqueryPredictModelJobOp.

Correct Answer:

C is the suggested solution: import the model into Vertex AI (no online endpoint is needed for batch workloads), then use a Vertex AI Pipeline in which DataflowPythonJobOp reuses the existing preprocessing logic and ModelBatchPredictOp runs batch prediction and writes the results to BigQuery on the weekly schedule.


You have recently used TensorFlow to train a classification model on tabular data. You have created a Dataflow pipeline that can transform several terabytes of data into training or prediction datasets consisting of TFRecords. You now need to productionize the model, and you want the predictions to be automatically uploaded to a BigQuery table on a weekly schedule. What should you do?

  • A. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and the ModelBatchPredictOp components.
  • B. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. Create a Dataflow pipeline that reuses the data processing logic, sends requests to the endpoint, and then uploads predictions to a BigQuery table.
  • C. Import the model into Vertex AI. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and the ModelBatchPredictOp components.
  • D. Import the model into BigQuery. Implement the data processing logic in a SQL query. On Vertex AI Pipelines, create a pipeline that uses the BigqueryQueryJobOp and the BigqueryPredictModelJobOp components.
Suggested Answer: C
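As a rough sketch of option C, the pipeline below chains a Dataflow preprocessing step into a Vertex AI batch prediction step that writes to BigQuery. It assumes the `kfp` and `google-cloud-pipeline-components` (v1) packages; the project, region, bucket, and model resource name are all placeholders, and parameter names may differ between component versions:

```python
# Sketch only: placeholder names throughout; requires kfp and
# google-cloud-pipeline-components, and GCP credentials to actually run.
from kfp import compiler, dsl
from google_cloud_pipeline_components.types import artifact_types
from google_cloud_pipeline_components.v1.batch_predict_job import ModelBatchPredictOp
from google_cloud_pipeline_components.v1.dataflow import DataflowPythonJobOp
from google_cloud_pipeline_components.v1.wait_gcp_resources import WaitGcpResourcesOp

PROJECT = "my-project"      # placeholder
REGION = "us-central1"      # placeholder
BUCKET = "gs://my-bucket"   # placeholder
MODEL = f"projects/{PROJECT}/locations/{REGION}/models/1234567890"  # placeholder


@dsl.pipeline(name="weekly-batch-predict")
def weekly_batch_predict():
    # Reuse the existing Dataflow preprocessing logic to emit TFRecords.
    dataflow_op = DataflowPythonJobOp(
        project=PROJECT,
        location=REGION,
        python_module_path=f"{BUCKET}/src/preprocess.py",  # placeholder
        temp_location=f"{BUCKET}/tmp",
        args=["--output", f"{BUCKET}/tfrecords/"],
    )
    # The Dataflow job is launched asynchronously; block until it finishes.
    wait_op = WaitGcpResourcesOp(gcp_resources=dataflow_op.outputs["gcp_resources"])

    # Make the already-imported Vertex AI model available as a pipeline artifact.
    model_op = dsl.importer(
        artifact_uri=f"https://{REGION}-aiplatform.googleapis.com/v1/{MODEL}",
        artifact_class=artifact_types.VertexModel,
        metadata={"resourceName": MODEL},
    )

    # Batch-predict over the TFRecords and land predictions in BigQuery.
    ModelBatchPredictOp(
        project=PROJECT,
        location=REGION,
        model=model_op.outputs["artifact"],
        job_display_name="weekly-predictions",
        gcs_source_uris=[f"{BUCKET}/tfrecords/*"],
        instances_format="tf-record",
        predictions_format="bigquery",
        bigquery_destination_output_uri=f"bq://{PROJECT}.predictions",
    ).after(wait_op)


compiler.Compiler().compile(weekly_batch_predict, "weekly_batch_predict.json")
```

The weekly cadence can then be handled by scheduling the compiled pipeline, e.g. with the Vertex AI SDK's `PipelineJob.create_schedule(cron="0 0 * * 1", ...)` or a Cloud Scheduler trigger; no custom serving infrastructure or online endpoint is involved, which is why C is preferred over A and B.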
