At ArcGIS Server 10.7, administrators can change the jobs directory of a geoprocessing service (or multiple geoprocessing services) from a disk location to a Microsoft Azure cloud storage location. If your geoprocessing services consistently produce large outputs, you can use this option to scale your storage resources. Be aware that services may run more slowly with this configuration.
Note:
The output map image layer option is not available with this configuration.
Warning: The JSON schema file, helpURLs, file, and raster outputs will have a file signature visible to anyone with access to the service. Although that signature expires after a few hours, there are no restrictions on where a user can use it. Ensure your entire blob store does not contain sensitive data, and evaluate whether this configuration conforms to the security policy of your organization.
Prepare the Azure environment
You need a Microsoft Azure account to create a storage account, Blob containers, tables, and queues.
Create an Azure storage account
The storage account must meet the following requirements:
- A standard performance storage account is required.
- The account must be a General-purpose v2 account. Other types of storage accounts do not provide all of the storage services required for this configuration, although an existing General-purpose v1 account may still work.
- The Hot access tier is recommended.
- Other advanced settings of the storage account can be adjusted based on your organization's needs.
Once the storage account is deployed, copy key1 from the access keys of your storage account; you will need it when you register the account as a cloud store with ArcGIS Server.
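If you prefer to script the Azure setup, the following Python sketch creates a General-purpose v2 account with standard performance and the Hot access tier, then reads key1. It assumes the azure-identity and azure-mgmt-storage packages; the subscription ID, resource group, account name, and region are placeholders.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# General-purpose v2 account, standard performance, Hot access tier.
client.storage_accounts.begin_create(
    "my-resource-group",                       # placeholder resource group
    "mystorageaccountname",                    # placeholder account name
    StorageAccountCreateParameters(
        location="eastus",                     # placeholder region
        kind="StorageV2",
        sku=Sku(name="Standard_LRS"),
        access_tier="Hot",
    ),
).result()

# key1 is the first access key returned; save it for registering the cloud store.
key1 = client.storage_accounts.list_keys(
    "my-resource-group", "mystorageaccountname"
).keys[0].value

You can also copy key1 directly from the Access keys page of the storage account in the Azure portal.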
Create a Blob container, a table, and a queue
Create a Blob container for all geoprocessing services, and an additional table and queue for asynchronous geoprocessing services. All of them must be in the same storage account.
- Create a Blob container.
- If you are configuring an asynchronous geoprocessing service, you can optionally create a table; if you don't, ArcGIS Server will create one for you.
- Create a unique queue for each asynchronous geoprocessing service, and add "jobsStoreQueue":"<name of the queue>" to the serviceProperties of each service.
Note the exact names of the container, the table, and any queues you create; you'll use them in the following steps.
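The container, table, and queue can also be created programmatically. Below is a minimal sketch using the azure-storage-blob, azure-data-tables, and azure-storage-queue Python packages; the account name, key, and resource names are placeholders matching the examples in this topic.

from azure.storage.blob import BlobServiceClient
from azure.data.tables import TableServiceClient
from azure.storage.queue import QueueServiceClient

# Connection string built from the storage account name and key1 copied earlier.
conn_str = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageaccountname;"
    "AccountKey=<key1>;"
    "EndpointSuffix=core.windows.net"
)

# Blob container shared by all geoprocessing services.
BlobServiceClient.from_connection_string(conn_str).create_container("containername")

# Optional table for asynchronous geoprocessing services.
TableServiceClient.from_connection_string(conn_str).create_table("tablename")

# One unique queue per asynchronous geoprocessing service (placeholder queue name).
QueueServiceClient.from_connection_string(conn_str).create_queue("mygpservicequeue")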
Move the jobs directory to Azure
Once the Azure Blob container and the table are deployed, register the Blob container in ArcGIS Server and change the service properties accordingly.
1. Sign in to the ArcGIS Server Administrator Directory and browse to Register Item.
2. Provide the connection information for your Azure Blob container and table as JSON, following the sample below.
Register Item
In this example, replace dataname, myaccountkey, mystorageaccountname, containername, optionalfoldername, and tablename with your own values.
{ "path": "/cloudStores/dataname", "type": "cloudStore", "provider": "azure", "info": { "isManaged": false, "connectionString": { "accountKey":"myaccontkey", "accountName":"mystorageaccountname", "defaultEndpointsProtocol":"https", "accountEndpoint":"core.windows.net", "credentialType":"accessKey" }, "objectStore": "containername/optionalfoldername", "tableStore":"tablename" } }
3. Return to the home page of the Administrator Directory, and click Services.
4. Locate the geoprocessing service you want to configure to use the Azure Blob container, click the service name, and click edit.
5. In the JSON representation of the service, add the following key-value pairs with a new unique serviceId, the name of your cloud store, and, for asynchronous services, the queue for that service:
Asynchronous geoprocessing configuration
{ "serviceId": "<a unique service ID>", "jobQueueStore":"/cloudStores/<name of your cloud store>", "jobTableStore": "/cloudStores/<name of your cloud store>", "outputStore": "/cloudStores/<name of your cloud store>", "jobObjectStore": "/cloudStores/<name of your cloud store>", "jobsStoreQueue": "<name of the queue>" }
Synchronous geoprocessing configuration
{ "serviceId": "<a unique service ID>", "outputStore": "/cloudStores/<name of your cloud store>", "jobObjectStore": "/cloudStores/<name of your cloud store>" }
Tip:
The <name of your cloud store> is the final part of the cloud store's data item URL in the Administrator Directory.
6. Click Save Edits to confirm. The geoprocessing service will automatically restart, which takes a moment.
7. If you're configuring multiple geoprocessing services to use the Azure Blob container as their jobs directory, repeat steps 4 through 6 for each service.
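When many services need the same change, scripting step 5 against the Administrator Directory's edit operation can save time. The sketch below is a minimal example, assuming the key-value pairs are added under the service's properties object as in the truncated example that follows; the server URL, service name, and credentials are placeholders.

import json
import requests

admin_url = "https://gisserver.domain.com:6443/arcgis/admin"   # placeholder server URL
service_url = f"{admin_url}/services/myGPService1.GPServer"    # placeholder service

# Get an Administrator Directory token.
token = requests.post(
    f"{admin_url}/generateToken",
    data={"username": "admin", "password": "<password>", "client": "requestip", "f": "json"},
).json()["token"]

# Read the current service JSON.
service = requests.get(service_url, params={"f": "json", "token": token}).json()

# Add the asynchronous jobs directory keys (use only serviceId, outputStore, and
# jobObjectStore for a synchronous service).
service["properties"].update({
    "serviceId": "<a unique service ID>",
    "jobQueueStore": "/cloudStores/<name of your cloud store>",
    "jobTableStore": "/cloudStores/<name of your cloud store>",
    "outputStore": "/cloudStores/<name of your cloud store>",
    "jobObjectStore": "/cloudStores/<name of your cloud store>",
    "jobsStoreQueue": "<name of the queue>",
})

# Save the edits; the service restarts automatically.
print(requests.post(f"{service_url}/edit",
                    data={"service": json.dumps(service), "f": "json", "token": token}).json())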
JSON example
Change the service properties JSON of your geoprocessing service by adding the key-value pairs required in step 5 above.
Edit GPServer. Below is a truncated JSON for an asynchronous geoprocessing service configuration.
{
  "serviceName": "myGPService1",
  "properties": {
    "resultMapServer": "false",
    "maximumRecords": "1000",
    "virtualOutputDir": "/rest/directories/arcgisoutput",
    "serviceId": "<this_is_a_unique_serviceid>",
    "jobQueueStore": "/cloudStores/<name of your cloud store>",
    "jobTableStore": "/cloudStores/<name of your cloud store>",
    "outputStore": "/cloudStores/<name of your cloud store>",
    "jobObjectStore": "/cloudStores/<name of your cloud store>",
    "jobsStoreQueue": "<this_is_a_unique_queue_name>",
    "portalURL": "https://domain/webadaptor/",
    "toolbox": "<... removed to save space ...>"
  },
  "portalProperties": "",
  "extensions": "",
  "frameworkProperties": {},
  "datasets": []
}