I am using the Google AI Platform (Unified) Python client to export a trained model to a Google Cloud Storage bucket. I am following the sample code from export_model_sample.
The application has "owner" credentials at the moment because I want to make sure it is not a permissions issue. However, when I try to execute the sample code I am getting the following error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "Exporting artifact for model `projects/101010101010/locations/us-central1/models/123123123123123` in format `` is not supported."
    debug_error_string = "{"created":"@1611864688.554145696","description":"Error received from peer ipv4:172.217.12.202:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Exporting artifact for model `projects/110101010101/locations/us-central1/models/123123123123123` in format `` is not supported.","grpc_status":9}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/main.py", line 667, in <module>
    response = aiplatform_model_client.export_model(name=name, output_config=output_config)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/aiplatform_v1beta1/services/model_service/client.py", line 937, in export_model
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.FailedPrecondition: 400 Exporting artifact for model `projects/111101010101/locations/us-central1/models/123123123123123123` in format `` is not supported.
(I have redacted the project ID and the model ID, using placeholders like 10101 and 123123.)
I have verified my inputs and everything seems OK:
gcs_destination_output_uri_prefix = "gs://my-bucket-vcm/model-123123123123123/tflite/2021-01-28T16:00:00.000Z/"
gcs_destination = {"output_uri_prefix": gcs_destination_output_uri_prefix}
output_config = {"artifact_destination": gcs_destination,}
name = "projects/10101010101/locations/us-central1/models/123123123123123"
response = aiplatform_model_client.export_model(name=name, output_config=output_config)
print("Long running operation:", response.operation.name)
export_model_response = response.result(timeout=300)
print("export_model_response:", export_model_response)
I am also using the latest version of google-cloud-aiplatform (0.4.0). The model that I am trying to export is of type MOBILE_TF_LOW_LATENCY_1.
I would like to just export the model to a Cloud Storage bucket, not deploy it as a service.
The export_model_sample is missing a request field. You should include "export_format_id": string in the output_config. You can further explore the output_config fields required by the export endpoint in the AI Platform Unified REST API Reference.
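You can also ask the service which export formats your particular model supports before calling export_model. Here is a minimal sketch, assuming the GAPIC Model message exposes a supported_export_formats field (it does in aiplatform_v1beta1) and using the same placeholder project and model IDs as above:

from google.cloud import aiplatform

# Same regional endpoint as in the export sample below.
client = aiplatform.gapic.ModelServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)
model = client.get_model(
    name="projects/your-project-id/locations/us-central1/models/your-model-id"
)

# Each entry carries an export format id (e.g. "tflite") and what it can export.
for export_format in model.supported_export_formats:
    print(export_format.id, list(export_format.exportable_contents))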
The accepted values for export_format_id are the following:
tflite: Used for Android mobile devices.
edgetpu-tflite: Used for Edge TPU devices.
tf-saved-model: A TensorFlow model in SavedModel format.
tf-js: A TensorFlow.js model that can be used in the browser and in Node.js using JavaScript.
core-ml: Used for iOS mobile devices.
custom-trained: A Model that was uploaded or trained by custom code.

The code should look like this. In this case I used tflite for the export_format_id.
from google.cloud import aiplatform


def export_model_sample(
    project: str = "your-project-id",
    model_id: str = "your-model-id",
    gcs_destination_output_uri_prefix: str = "your-bucket-destination",
    location: str = "us-central1",
    api_endpoint: str = "us-central1-aiplatform.googleapis.com",
    timeout: int = 300,
):
    # The AI Platform services require regional API endpoints.
    client_options = {"api_endpoint": api_endpoint}
    # Initialize client that will be used to create and send requests.
    # This client only needs to be created once, and can be reused for multiple requests.
    client = aiplatform.gapic.ModelServiceClient(client_options=client_options)
    output_config = {
        # export_format_id is the required field that was missing from the sample.
        "export_format_id": "tflite",
        "artifact_destination": {"output_uri_prefix": gcs_destination_output_uri_prefix},
    }
    name = client.model_path(project=project, location=location, model=model_id)
    response = client.export_model(name=name, output_config=output_config)
    print("Long running operation:", response.operation.name)
    export_model_response = response.result(timeout=timeout)
    print("export_model_response:", export_model_response)


export_model_sample()
After the operation completed, I got a model exported to a path named in this manner:
gs://your-bucket-destination/your-model-id/tflite/2021-01-29T04:15:51.672336Z/model.tflite
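If you don't want to reconstruct that timestamped path yourself, the long-running operation's metadata should also report where the artifacts were written. A short sketch, assuming the metadata is an ExportModelOperationMetadata message with an output_info.artifact_output_uri field (that is how it is defined in aiplatform_v1beta1; adjust if your client version differs):

# Inside export_model_sample, after response.result(...) has completed:
response = client.export_model(name=name, output_config=output_config)
response.result(timeout=timeout)

# The operation metadata carries the destination chosen by the service.
metadata = response.metadata  # ExportModelOperationMetadata
print("Artifacts written to:", metadata.output_info.artifact_output_uri)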