Tags: python, amazon-sagemaker, amazon-sagemaker-studio

SageMaker HyperparameterTuner and fixed hyperparameters (StaticHyperParameters)


I used to use this type of hyperparameter (optimisation) specification:

 "OutputDataConfig": {"S3OutputPath": output_path},
    "ResourceConfig": {"InstanceCount": 1, "InstanceType": "ml.m4.xlarge", "VolumeSizeInGB": 3},
    "RoleArn": role_arn,
    "StaticHyperParameters": {
        "objective": "reg:squarederror"
    },
    "StoppingCondition": {"MaxRuntimeInSeconds": 10000} 

TBH I do not even know whether this is an old way of doing things or just a different SDK - SageMaker can be very confusing sometimes. Anyway, I want to use this SDK/API instead - more precisely, the HyperparameterTuner. How would I specify StaticHyperParameters (e.g. "objective": "quantile")? Simply by not giving this hyperparameter a range and hard-coding it instead? Thanks!


Solution

  • HyperparameterTuner takes an Estimator object as one of its parameters. You can keep the static hyperparameters as part of the estimator, something like below:

    from sagemaker.pytorch import PyTorch

    estimator = PyTorch(
        entry_point="mnist.py",
        role=role,
        py_version="py3",
        framework_version="1.8.0",
        instance_count=1,
        instance_type="ml.c5.2xlarge",
        # Static hyperparameters: fixed for every job the tuner launches.
        hyperparameters={"epochs": 1, "backend": "gloo"},
    )
    

    Once you have the estimator initialized, you can pass it to the tuner along with the parameters that have to be tuned, as shown below:

    from sagemaker.tuner import (
        HyperparameterTuner,
        ContinuousParameter,
        CategoricalParameter,
    )

    # Only these parameters are searched; everything set on the
    # estimator above stays fixed.
    hyperparameter_ranges = {
        "lr": ContinuousParameter(0.001, 0.1),
        "batch-size": CategoricalParameter([32, 64, 128, 256, 512]),
    }
    tuner = HyperparameterTuner(
        estimator,
        objective_metric_name,
        hyperparameter_ranges,
        metric_definitions,
        max_jobs=9,
        max_parallel_jobs=3,
        objective_type=objective_type,
    )
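
    Note that objective_metric_name, metric_definitions, and objective_type are not defined in the snippet above. A minimal sketch of what they could look like, modelled on the linked MNIST example - the metric name and regex are assumptions and must match whatever your training script prints:

    objective_metric_name = "average test loss"
    objective_type = "Minimize"
    # Assumed regex; it must capture the loss value logged by mnist.py.
    metric_definitions = [
        {"Name": "average test loss",
         "Regex": "Test set: Average loss: ([0-9\\.]+)"}
    ]

    # Start the tuning job; the "training" channel name and the S3 path
    # are placeholders for your own data location.
    tuner.fit({"training": "s3://your-bucket/train"})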
    

    Please refer to this example for a complete solution:

    https://github.com/aws/amazon-sagemaker-examples/blob/main/hyperparameter_tuning/pytorch_mnist/hpo_pytorch_mnist.ipynb
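
    To answer the question directly: yes - with the SageMaker Python SDK you hard-code the fixed value on the estimator (the SDK's counterpart of StaticHyperParameters in the low-level boto3 CreateHyperParameterTuningJob request) and simply leave it out of hyperparameter_ranges. A minimal sketch with the built-in XGBoost algorithm, mirroring the snippet in the question - the container version and the tuned eta range are assumptions:

    import sagemaker
    from sagemaker.estimator import Estimator
    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

    session = sagemaker.Session()
    # Assumed XGBoost container version; pick whichever you need.
    image_uri = sagemaker.image_uris.retrieve(
        framework="xgboost", region=session.boto_region_name, version="1.5-1"
    )

    xgb = Estimator(
        image_uri=image_uri,
        role=role_arn,
        instance_count=1,
        instance_type="ml.m4.xlarge",
        output_path=output_path,
        sagemaker_session=session,
    )
    # Static hyperparameters: fixed for every training job in the tuning run.
    xgb.set_hyperparameters(objective="reg:squarederror", num_round=100)

    tuner = HyperparameterTuner(
        xgb,
        objective_metric_name="validation:rmse",  # built-in XGBoost metric
        hyperparameter_ranges={"eta": ContinuousParameter(0.01, 0.3)},
        max_jobs=9,
        max_parallel_jobs=3,
        objective_type="Minimize",
    )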