From 70650bf28e3554792a846da1a91391753ccbdf95 Mon Sep 17 00:00:00 2001 From: Blake-Madden Date: Fri, 5 Jan 2024 12:01:55 -0500 Subject: [PATCH] Fix a few typos --- ...communication-administration-readme-pre.md | 388 +++--- .../latest/ai-metricsadvisor-readme.md | 1076 +++++++-------- .../communication-callautomation-readme.md | 2 +- .../latest/communication-chat-readme.md | 4 +- .../latest/confidentialledger-readme.md | 1216 ++++++++--------- .../latest/mgmt-managementgroups-readme.md | 70 +- .../latest/mgmt-search-readme.md | 2 +- .../latest/security-attestation-readme.md | 700 +++++----- .../latest/storage-file-datalake-readme.md | 2 +- .../latest/storage-fileshare-readme.md | 726 +++++----- .../legacy/storage-file-datalake-readme.md | 472 +++---- .../legacy/storage-fileshare-readme.md | 726 +++++----- .../communication-administration-readme.md | 384 +++--- .../preview/communication-chat-readme.md | 1140 ++++++++-------- docs-ref-services/preview/core-readme.md | 432 +++--- docs-ref-services/preview/corehttp-readme.md | 2 +- .../preview/maps-geolocation-readme.md | 406 +++--- .../mixedreality-remoterendering-readme.md | 762 +++++------ .../preview/purview-workflow-readme.md | 2 +- .../schemaregistry-avroserializer-readme.md | 634 ++++----- .../preview/storage-file-datalake-readme.md | 2 +- .../preview/storage-fileshare-readme.md | 726 +++++----- 22 files changed, 4937 insertions(+), 4937 deletions(-) diff --git a/docs-ref-services/communication-administration-readme-pre.md b/docs-ref-services/communication-administration-readme-pre.md index 555eb6687d22..3041540ac884 100644 --- a/docs-ref-services/communication-administration-readme-pre.md +++ b/docs-ref-services/communication-administration-readme-pre.md @@ -8,198 +8,198 @@ ms.service: ms.technology: azure ms.prod: azure --- -[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) - -# Azure Communication Administration Package client library for Python - version 1.0.0b4 - - -This package has been deprecated. Please use [azure-communication-identity](https://pypi.org/project/azure-communication-identity/) and [azure-communication-phonenumbers](https://pypi.org/project/azure-communication-phonenumbers/) instead. - -The requested features were implemented in the new libraries. See change log for more details. - -# Getting started -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) - -### Install the package -Install the Azure Communication Administration client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-communication-administration -``` - -# Key concepts -## CommunicationIdentityClient -`CommunicationIdentityClient` provides operations for: - -- Create/delete identities to be used in Azure Communication Services. Those identities can be used to make use of Azure Communication offerings and can be scoped to have limited abilities through token scopes. - -- Create/revoke scoped user access tokens to access services such as chat, calling, sms. Tokens are issued for a valid Azure Communication identity and can be revoked at any time. 
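A minimal sketch of those identity and token operations, assuming a connection string is available in the `AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING` environment variable; the method names (`create_user`, `issue_token`, `revoke_tokens`, `delete_user`) reflect the beta API of this package and are worth double-checking against the linked samples:

```python
import os
from azure.communication.administration import CommunicationIdentityClient

connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING')
identity_client = CommunicationIdentityClient.from_connection_string(connection_str)

# Create an identity, then issue an access token scoped to chat for it.
user = identity_client.create_user()
token_response = identity_client.issue_token(user, scopes=["chat"])
print(token_response.token, token_response.expires_on)

# Revoke the identity's tokens and delete the identity when no longer needed.
identity_client.revoke_tokens(user)
identity_client.delete_user(user)
```

Because tokens are scoped and revocable independently of the identity, revoking tokens does not delete the identity itself.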
- -## CommunicationPhoneNumberClient -### Initializing Phone Number Client -```python -# You can find your endpoint and access token from your resource in the Azure Portal -import os -from azure.communication.administration import PhoneNumberAdministrationClient - -connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING') -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) -``` -### Phone plans overview - -Phone plans come in two types; Geographic and Toll-Free. Geographic phone plans are phone plans associated with a location, whose phone numbers' area codes are associated with the area code of a geographic location. Toll-Free phone plans are phone plans not associated location. For example, in the US, toll-free numbers can come with area codes such as 800 or 888. - -All geographic phone plans within the same country are grouped into a phone plan group with a Geographic phone number type. All Toll-Free phone plans within the same country are grouped into a phone plan group. - -### Searching and Acquiring numbers - -Phone numbers search can be search through the search creation API by providing a phone plan id, an area code and quantity of phone numbers. The provided quantity of phone numbers will be reserved for ten minutes. This search of phone numbers can either be cancelled or purchased. If the search is cancelled, then the phone numbers will become available to others. If the search is purchased, then the phone numbers are acquired for the Azure resources. - -### Configuring / Assigning numbers - -Phone numbers can be assigned to a callback URL via the configure number API. As part of the configuration, you will need an acquired phone number, callback URL and application id. - -# Examples -The following section provides several code snippets covering some of the most common Azure Communication Services tasks, including: - -[Create/delete Azure Communication Service identities][identitysamples] - -[Create/revoke scoped user access tokens][identitysamples] - -## Communication Phone number -### Get Countries - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -supported_countries = phone_number_administration_client.list_all_supported_countries() -for supported_country in supported_countries: - print(supported_country) -``` - -### Get Phone Plan Groups - -Phone plan groups come in two types, Geographic and Toll-Free. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_plan_groups_response = phone_number_administration_client.list_phone_plan_groups( - country_code='' -) -for phone_plan_group in phone_plan_groups_response: - print(phone_plan_group) -``` - -### Get Phone Plans - -Unlike Toll-Free phone plans, area codes for Geographic Phone Plans are empty. Area codes are found in the Area Codes API. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_plans_response = phone_number_administration_client.list_phone_plans( - country_code='', - phone_plan_group_id='' -) -for phone_plan in phone_plans_response: - print(phone_plan) -``` - -### Get Location Options - -For Geographic phone plans, you can query the available geographic locations. The locations options are structured like the geographic hierarchy of a country. For example, the US has states and within each state are cities. 
- -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -location_options_response = phone_number_administration_client.get_phone_plan_location_options( - country_code='', - phone_plan_group_id='', - phone_plan_id='' -) -print(location_options_response) -``` - -### Get Area Codes - -Fetching area codes for geographic phone plans will require the the location options queries set. You must include the chain of geographic locations traversing down the location options object returned by the GetLocationOptions API. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -all_area_codes = phone_number_administration_client.get_all_area_codes( - location_type="NotRequired", - country_code='', - phone_plan_id='' -) -print(all_area_codes) -``` - -### Create Search - -```python -from azure.communication.administration import CreateSearchOptions -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -searchOptions = CreateSearchOptions( - area_code='', - description="testsearch20200014", - display_name="testsearch20200014", - phone_plan_ids=[''], - quantity=1 -) -search_response = phone_number_administration_client.create_search( - body=searchOptions -) -print(search_response) -``` - -### Get search by id -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_number_search_response = phone_number_administration_client.get_search_by_id( - search_id='' -) -print(phone_number_search_response) -``` - -### Purchase Search - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_number_administration_client.purchase_search( - search_id='' -) -``` - -# Troubleshooting -The Azure Communication Service Identity client will raise exceptions defined in [Azure Core][azure_core]. - -# Next steps -## More sample code - -Please take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b3/sdk/communication/azure-communication-administration/samples) directory for detailed examples of how to use this library to manage identities and tokens. - -## Provide Feedback - -If you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project - -# Contributing -This project welcomes contributions and suggestions. Most contributions require you to agree to a -Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the -PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). -For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. 
- - -[identitysamples]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b3/sdk/communication/azure-communication-administration/samples/identity_samples.py +[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) + +# Azure Communication Administration Package client library for Python - version 1.0.0b4 + + +This package has been deprecated. Please use [azure-communication-identity](https://pypi.org/project/azure-communication-identity/) and [azure-communication-phonenumbers](https://pypi.org/project/azure-communication-phonenumbers/) instead. + +The requested features were implemented in the new libraries. See change log for more details. + +# Getting started +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) + +### Install the package +Install the Azure Communication Administration client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-communication-administration +``` + +# Key concepts +## CommunicationIdentityClient +`CommunicationIdentityClient` provides operations for: + +- Create/delete identities to be used in Azure Communication Services. Those identities can be used to make use of Azure Communication offerings and can be scoped to have limited abilities through token scopes. + +- Create/revoke scoped user access tokens to access services such as chat, calling, sms. Tokens are issued for a valid Azure Communication identity and can be revoked at any time. + +## CommunicationPhoneNumberClient +### Initializing Phone Number Client +```python +# You can find your endpoint and access token from your resource in the Azure Portal +import os +from azure.communication.administration import PhoneNumberAdministrationClient + +connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING') +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) +``` +### Phone plans overview + +Phone plans come in two types; Geographic and Toll-Free. Geographic phone plans are phone plans associated with a location, whose phone numbers' area codes are associated with the area code of a geographic location. Toll-Free phone plans are phone plans not associated location. For example, in the US, toll-free numbers can come with area codes such as 800 or 888. + +All geographic phone plans within the same country are grouped into a phone plan group with a Geographic phone number type. All Toll-Free phone plans within the same country are grouped into a phone plan group. + +### Searching and Acquiring numbers + +Phone numbers search can be search through the search creation API by providing a phone plan id, an area code and quantity of phone numbers. The provided quantity of phone numbers will be reserved for ten minutes. This search of phone numbers can either be cancelled or purchased. If the search is cancelled, then the phone numbers will become available to others. If the search is purchased, then the phone numbers are acquired for the Azure resources. + +### Configuring / Assigning numbers + +Phone numbers can be assigned to a callback URL via the configure number API. As part of the configuration, you will need an acquired phone number, callback URL and application id. 
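The sketch below shows the assumed shape of that configuration call; `PstnConfiguration` and `configure_number` are assumptions about this beta API, and the phone number, callback URL, and application id values are placeholders, so confirm the names against the package reference before relying on them:

```python
import os
from azure.communication.administration import PhoneNumberAdministrationClient, PstnConfiguration

connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING')
phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str)

# Assumed call shape: associate an acquired number with a callback URL and application id.
pstn_configuration = PstnConfiguration(
    callback_url="https://my-app.example.com/callback",  # placeholder callback URL
    application_id=""                                    # placeholder application id
)
phone_number_administration_client.configure_number(
    pstn_configuration=pstn_configuration,
    phone_number=""                                      # placeholder acquired phone number
)
```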
+ +# Examples +The following section provides several code snippets covering some of the most common Azure Communication Services tasks, including: + +[Create/delete Azure Communication Service identities][identitysamples] + +[Create/revoke scoped user access tokens][identitysamples] + +## Communication Phone number +### Get Countries + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +supported_countries = phone_number_administration_client.list_all_supported_countries() +for supported_country in supported_countries: + print(supported_country) +``` + +### Get Phone Plan Groups + +Phone plan groups come in two types, Geographic and Toll-Free. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_plan_groups_response = phone_number_administration_client.list_phone_plan_groups( + country_code='' +) +for phone_plan_group in phone_plan_groups_response: + print(phone_plan_group) +``` + +### Get Phone Plans + +Unlike Toll-Free phone plans, area codes for Geographic Phone Plans are empty. Area codes are found in the Area Codes API. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_plans_response = phone_number_administration_client.list_phone_plans( + country_code='', + phone_plan_group_id='' +) +for phone_plan in phone_plans_response: + print(phone_plan) +``` + +### Get Location Options + +For Geographic phone plans, you can query the available geographic locations. The locations options are structured like the geographic hierarchy of a country. For example, the US has states and within each state are cities. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +location_options_response = phone_number_administration_client.get_phone_plan_location_options( + country_code='', + phone_plan_group_id='', + phone_plan_id='' +) +print(location_options_response) +``` + +### Get Area Codes + +Fetching area codes for geographic phone plans will require the location options queries set. You must include the chain of geographic locations traversing down the location options object returned by the GetLocationOptions API. 
+ +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +all_area_codes = phone_number_administration_client.get_all_area_codes( + location_type="NotRequired", + country_code='', + phone_plan_id='' +) +print(all_area_codes) +``` + +### Create Search + +```python +from azure.communication.administration import CreateSearchOptions +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +searchOptions = CreateSearchOptions( + area_code='', + description="testsearch20200014", + display_name="testsearch20200014", + phone_plan_ids=[''], + quantity=1 +) +search_response = phone_number_administration_client.create_search( + body=searchOptions +) +print(search_response) +``` + +### Get search by id +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_number_search_response = phone_number_administration_client.get_search_by_id( + search_id='' +) +print(phone_number_search_response) +``` + +### Purchase Search + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_number_administration_client.purchase_search( + search_id='' +) +``` + +# Troubleshooting +The Azure Communication Service Identity client will raise exceptions defined in [Azure Core][azure_core]. + +# Next steps +## More sample code + +Please take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b3/sdk/communication/azure-communication-administration/samples) directory for detailed examples of how to use this library to manage identities and tokens. + +## Provide Feedback + +If you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project + +# Contributing +This project welcomes contributions and suggestions. Most contributions require you to agree to a +Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the +PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. + +This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). +For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. 
+ + +[identitysamples]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b3/sdk/communication/azure-communication-administration/samples/identity_samples.py [azure_core]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b4/sdk/core/azure-core/README.md - + diff --git a/docs-ref-services/latest/ai-metricsadvisor-readme.md b/docs-ref-services/latest/ai-metricsadvisor-readme.md index 7cb7a93a6692..91bf6859a1e0 100644 --- a/docs-ref-services/latest/ai-metricsadvisor-readme.md +++ b/docs-ref-services/latest/ai-metricsadvisor-readme.md @@ -8,542 +8,542 @@ ms.service: applied-ai-services ms.subservice: metrics-advisor ms.technology: azure --- -# Azure Metrics Advisor client library for Python - version 1.0.0 - -Metrics Advisor is a scalable real-time time series monitoring, alerting, and root cause analysis platform. Use Metrics Advisor to: - -- Analyze multi-dimensional data from multiple data sources -- Identify and correlate anomalies -- Configure and fine-tune the anomaly detection model used on your data -- Diagnose anomalies and help with root cause analysis - -[Source code][src_code] | [Package (Pypi)][package] | [API reference documentation][reference_documentation] | [Product documentation][ma_docs] | [Samples][samples_readme] - -## Getting started - -### Install the package - -Install the Azure Metrics Advisor client library for Python with pip: - -```commandline -pip install azure-ai-metricsadvisor --pre -``` - -### Prerequisites - -* Python 2.7, or 3.6 or later is required to use this package. -* You need an [Azure subscription][azure_sub], and a [Metrics Advisor serivce][ma_service] to use this package. - -### Authenticate the client - -You will need two keys to authenticate the client: - -1) The subscription key to your Metrics Advisor resource. You can find this in the Keys and Endpoint section of your resource in the Azure portal. -2) The API key for your Metrics Advisor instance. You can find this in the web portal for Metrics Advisor, in API keys on the left navigation menu. - -We can use the keys to create a new `MetricsAdvisorClient` or `MetricsAdvisorAdministrationClient`. - -```py -import os -from azure.ai.metricsadvisor import ( - MetricsAdvisorKeyCredential, - MetricsAdvisorClient, - MetricsAdvisorAdministrationClient, -) - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") - -client = MetricsAdvisorClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key)) - -admin_client = MetricsAdvisorAdministrationClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key)) -``` - -## Key concepts - -### MetricsAdvisorClient - -`MetricsAdvisorClient` helps with: - -- listing incidents -- listing root causes of incidents -- retrieving original time series data and time series data enriched by the service. -- listing alerts -- adding feedback to tune your model - -### MetricsAdvisorAdministrationClient - -`MetricsAdvisorAdministrationClient` allows you to - -- manage data feeds -- manage anomaly detection configurations -- manage anomaly alerting configurations -- manage hooks - -### DataFeed - -A `DataFeed` is what Metrics Advisor ingests from your data source, such as Cosmos DB or a SQL server. 
A data feed contains rows of: - -- timestamps -- zero or more dimensions -- one or more measures - -### Metric - -A `DataFeedMetric` is a quantifiable measure that is used to monitor and assess the status of a specific business process. It can be a combination of multiple time series values divided into dimensions. For example a web health metric might contain dimensions for user count and the en-us market. - -### AnomalyDetectionConfiguration - -`AnomalyDetectionConfiguration` is required for every time series, and determines whether a point in the time series is an anomaly. - -### Anomaly & Incident - -After a detection configuration is applied to metrics, `AnomalyIncident`s are generated whenever any series within it has an `DataPointAnomaly`. - -### Alert - -You can configure which anomalies should trigger an `AnomalyAlert`. You can set multiple alerts with different settings. For example, you could create an alert for anomalies with lower business impact, and another for more important alerts. - -### Notification Hook - -Metrics Advisor lets you create and subscribe to real-time alerts. These alerts are sent over the internet, using a notification hook like `EmailNotificationHook` or `WebNotificationHook`. - -## Examples - -- [Add a data feed from a sample or data source](#add-a-data-feed-from-a-sample-or-data-source "Add a data feed from a sample or data source") -- [Check ingestion status](#check-ingestion-status "Check ingestion status") -- [Configure anomaly detection configuration](#configure-anomaly-detection-configuration "Configure anomaly detection configuration") -- [Configure alert configuration](#configure-alert-configuration "Configure alert configuration") -- [Query anomaly detection results](#query-anomaly-detection-results "Query anomaly detection results") -- [Query incidents](#query-incidents "Query incidents") -- [Query root causes](#query-root-causes "Query root causes") -- [Add hooks for receiving anomaly alerts](#add-hooks-for-receiving-anomaly-alerts "Add hooks for receiving anomaly alerts") - -### Add a data feed from a sample or data source - -Metrics Advisor supports connecting different types of data sources. Here is a sample to ingest data from SQL Server. 
- -```py -import os -import datetime -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient -from azure.ai.metricsadvisor.models import ( - SqlServerDataFeedSource, - DataFeedSchema, - DataFeedMetric, - DataFeedDimension, - DataFeedRollupSettings, - DataFeedMissingDataPointFillSettings - ) - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -sql_server_connection_string = os.getenv("SQL_SERVER_CONNECTION_STRING") -query = os.getenv("SQL_SERVER_QUERY") - -client = MetricsAdvisorAdministrationClient( - service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -data_feed = client.create_data_feed( - name="My data feed", - source=SqlServerDataFeedSource( - connection_string=sql_server_connection_string, - query=query, - ), - granularity="Daily", - schema=DataFeedSchema( - metrics=[ - DataFeedMetric(name="cost", display_name="Cost"), - DataFeedMetric(name="revenue", display_name="Revenue") - ], - dimensions=[ - DataFeedDimension(name="category", display_name="Category"), - DataFeedDimension(name="city", display_name="City") - ], - timestamp_column="Timestamp" - ), - ingestion_settings=datetime.datetime(2019, 10, 1), - data_feed_description="cost/revenue data feed", - rollup_settings=DataFeedRollupSettings( - rollup_type="AutoRollup", - rollup_method="Sum", - rollup_identification_value="__CUSTOM_SUM__" - ), - missing_data_point_fill_settings=DataFeedMissingDataPointFillSettings( - fill_type="SmartFilling" - ), - access_mode="Private" -) - -return data_feed -``` - -### Check ingestion status - -After we start the data ingestion, we can check the ingestion status. - -```py -import datetime -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -data_feed_id = os.getenv("DATA_FEED_ID") - -client = MetricsAdvisorAdministrationClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -ingestion_status = client.list_data_feed_ingestion_status( - data_feed_id, - datetime.datetime(2020, 9, 20), - datetime.datetime(2020, 9, 25) -) -for status in ingestion_status: - print("Timestamp: {}".format(status.timestamp)) - print("Status: {}".format(status.status)) - print("Message: {}\n".format(status.message)) -``` - -### Configure anomaly detection configuration - -While a default detection configuration is automatically applied to each metric, we can tune the detection modes used on our data by creating a customized anomaly detection configuration. 
- -```py -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient -from azure.ai.metricsadvisor.models import ( - ChangeThresholdCondition, - HardThresholdCondition, - SmartDetectionCondition, - SuppressCondition, - MetricDetectionCondition, -) - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -metric_id = os.getenv("METRIC_ID") - -client = MetricsAdvisorAdministrationClient( - service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -change_threshold_condition = ChangeThresholdCondition( - anomaly_detector_direction="Both", - change_percentage=20, - shift_point=10, - within_range=True, - suppress_condition=SuppressCondition( - min_number=5, - min_ratio=2 - ) -) -hard_threshold_condition = HardThresholdCondition( - anomaly_detector_direction="Up", - upper_bound=100, - suppress_condition=SuppressCondition( - min_number=2, - min_ratio=2 - ) -) -smart_detection_condition = SmartDetectionCondition( - anomaly_detector_direction="Up", - sensitivity=10, - suppress_condition=SuppressCondition( - min_number=2, - min_ratio=2 - ) -) - -detection_config = client.create_detection_configuration( - name="my_detection_config", - metric_id=metric_id, - description="anomaly detection config for metric", - whole_series_detection_condition=MetricDetectionCondition( - condition_operator="OR", - change_threshold_condition=change_threshold_condition, - hard_threshold_condition=hard_threshold_condition, - smart_detection_condition=smart_detection_condition - ) -) -return detection_config -``` - -### Configure alert configuration - -Then let's configure in which conditions an alert needs to be triggered. - -```py -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient -from azure.ai.metricsadvisor.models import ( - MetricAlertConfiguration, - MetricAnomalyAlertScope, - TopNGroupScope, - MetricAnomalyAlertConditions, - SeverityCondition, - MetricBoundaryCondition, - MetricAnomalyAlertSnoozeCondition, -) -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") -hook_id = os.getenv("HOOK_ID") - -client = MetricsAdvisorAdministrationClient( - service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -alert_config = client.create_alert_configuration( - name="my alert config", - description="alert config description", - cross_metrics_operator="AND", - metric_alert_configurations=[ - MetricAlertConfiguration( - detection_configuration_id=anomaly_detection_configuration_id, - alert_scope=MetricAnomalyAlertScope( - scope_type="WholeSeries" - ), - alert_conditions=MetricAnomalyAlertConditions( - severity_condition=SeverityCondition( - min_alert_severity="Low", - max_alert_severity="High" - ) - ) - ), - MetricAlertConfiguration( - detection_configuration_id=anomaly_detection_configuration_id, - alert_scope=MetricAnomalyAlertScope( - scope_type="TopN", - top_n_group_in_scope=TopNGroupScope( - top=10, - period=5, - min_top_count=5 - ) - ), - alert_conditions=MetricAnomalyAlertConditions( - metric_boundary_condition=MetricBoundaryCondition( - direction="Up", - upper=50 - ) - ), - alert_snooze_condition=MetricAnomalyAlertSnoozeCondition( - auto_snooze=2, - snooze_scope="Metric", - only_for_successive=True - ) - ), - ], - hook_ids=[hook_id] -) - -return alert_config -``` - -### 
Query anomaly detection results - -We can query the alerts and anomalies. - -```py -import datetime -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -alert_config_id = os.getenv("ALERT_CONFIG_ID") -alert_id = os.getenv("ALERT_ID") - -client = MetricsAdvisorClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -results = client.list_alerts( - alert_configuration_id=alert_config_id, - start_time=datetime.datetime(2020, 1, 1), - end_time=datetime.datetime(2020, 9, 9), - time_mode="AnomalyTime", -) -for result in results: - print("Alert id: {}".format(result.id)) - print("Create time: {}".format(result.created_time)) - -results = client.list_anomalies( - alert_configuration_id=alert_config_id, - alert_id=alert_id, -) -for result in results: - print("Create time: {}".format(result.created_time)) - print("Severity: {}".format(result.severity)) - print("Status: {}".format(result.status)) -``` - -### Query incidents - -We can query the incidents for a detection configuration. - -```py -import datetime -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") - -client = MetricsAdvisorClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -results = client.list_incidents( - detection_configuration_id=anomaly_detection_configuration_id, - start_time=datetime.datetime(2020, 1, 1), - end_time=datetime.datetime(2020, 9, 9), - ) -for result in results: - print("Metric id: {}".format(result.metric_id)) - print("Incident ID: {}".format(result.id)) - print("Severity: {}".format(result.severity)) - print("Status: {}".format(result.status)) -``` - -### Query root causes - -We can also query the root causes of an incident - -```py -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") -anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") -incident_id = os.getenv("INCIDENT_ID") - -client = MetricsAdvisorClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -results = client.list_incident_root_causes( - detection_configuration_id=anomaly_detection_configuration_id, - incident_id=incident_id, - ) -for result in results: - print("Score: {}".format(result.score)) - print("Description: {}".format(result.description)) - -``` - - -### Add hooks for receiving anomaly alerts - -We can add some hooks so when an alert is triggered, we can get call back. 
- -```py -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient -from azure.ai.metricsadvisor.models import EmailNotificationHook - -service_endpoint = os.getenv("ENDPOINT") -subscription_key = os.getenv("SUBSCRIPTION_KEY") -api_key = os.getenv("API_KEY") - -client = MetricsAdvisorAdministrationClient(service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key)) - -hook = client.create_hook( - hook=EmailNotificationHook( - name="email hook", - description="my email hook", - emails_to_alert=["alertme@alertme.com"], - external_link="https://docs.microsoft.com/en-us/azure/cognitive-services/metrics-advisor/how-tos/alerts" - ) -) -``` - -### Async APIs - -This library includes a complete async API supported on Python 3.6+. To use it, you must -first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/). -See -[azure-core documentation][azure_core_docs] -for more information. - - -```py -from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential -from azure.ai.metricsadvisor.aio import MetricsAdvisorClient, MetricsAdvisorAdministrationClient - -client = MetricsAdvisorClient( - service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) - -admin_client = MetricsAdvisorAdministrationClient( - service_endpoint, - MetricsAdvisorKeyCredential(subscription_key, api_key) -) -``` - -## Troubleshooting - -### General - -The Azure Metrics Advisor clients will raise exceptions defined in [Azure Core][azure_core]. - -### Logging -This library uses the standard -[logging][python_logging] library for logging. - -Basic information about HTTP sessions (URLs, headers, etc.) is logged at `INFO` level. - -Detailed `DEBUG` level logging, including request/response bodies and **unredacted** -headers, can be enabled on the client or per-operation with the `logging_enable` keyword argument. - -See full SDK logging documentation with examples [here][sdk_logging_docs]. - -## Next steps - -### More sample code - - For more details see the [samples README][samples_readme]. - -## Contributing - -This project welcomes contributions and suggestions. Most contributions require -you to agree to a Contributor License Agreement (CLA) declaring that you have -the right to, and actually do, grant us the rights to use your contribution. For -details, visit [cla.microsoft.com][cla]. - -This project has adopted the [Microsoft Open Source Code of Conduct][code_of_conduct]. -For more information see the [Code of Conduct FAQ][coc_faq] -or contact [opencode@microsoft.com][coc_contact] with any -additional questions or comments. 
- - -[src_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-ai-metricsadvisor_1.0.0/sdk/metricsadvisor/azure-ai-metricsadvisor -[reference_documentation]: https://aka.ms/azsdk/python/metricsadvisor/docs -[ma_docs]: https://docs.microsoft.com/azure/cognitive-services/metrics-advisor/overview -[azure_cli]: https://docs.microsoft.com/cli/azure -[azure_sub]: https://azure.microsoft.com/free/ -[package]: https://aka.ms/azsdk/python/metricsadvisor/pypi -[ma_service]: https://go.microsoft.com/fwlink/?linkid=2142156 -[python_logging]: https://docs.python.org/3.5/library/logging.html -[azure_core]: https://aka.ms/azsdk/python/core/docs#module-azure.core.exceptions -[azure_core_docs]: https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-metricsadvisor_1.0.0/sdk/core/azure-core/README.md#transport -[sdk_logging_docs]: https://docs.microsoft.com/azure/developer/python/azure-sdk-logging -[samples_readme]: https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-metricsadvisor_1.0.0/sdk/metricsadvisor/azure-ai-metricsadvisor/samples/README.md - -[cla]: https://cla.microsoft.com -[code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ -[coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/ +# Azure Metrics Advisor client library for Python - version 1.0.0 + +Metrics Advisor is a scalable real-time time series monitoring, alerting, and root cause analysis platform. Use Metrics Advisor to: + +- Analyze multi-dimensional data from multiple data sources +- Identify and correlate anomalies +- Configure and fine-tune the anomaly detection model used on your data +- Diagnose anomalies and help with root cause analysis + +[Source code][src_code] | [Package (Pypi)][package] | [API reference documentation][reference_documentation] | [Product documentation][ma_docs] | [Samples][samples_readme] + +## Getting started + +### Install the package + +Install the Azure Metrics Advisor client library for Python with pip: + +```commandline +pip install azure-ai-metricsadvisor --pre +``` + +### Prerequisites + +* Python 2.7, or 3.6 or later is required to use this package. +* You need an [Azure subscription][azure_sub], and a [Metrics Advisor service][ma_service] to use this package. + +### Authenticate the client + +You will need two keys to authenticate the client: + +1) The subscription key to your Metrics Advisor resource. You can find this in the Keys and Endpoint section of your resource in the Azure portal. +2) The API key for your Metrics Advisor instance. You can find this in the web portal for Metrics Advisor, in API keys on the left navigation menu. + +We can use the keys to create a new `MetricsAdvisorClient` or `MetricsAdvisorAdministrationClient`. + +```py +import os +from azure.ai.metricsadvisor import ( + MetricsAdvisorKeyCredential, + MetricsAdvisorClient, + MetricsAdvisorAdministrationClient, +) + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") + +client = MetricsAdvisorClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key)) + +admin_client = MetricsAdvisorAdministrationClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key)) +``` + +## Key concepts + +### MetricsAdvisorClient + +`MetricsAdvisorClient` helps with: + +- listing incidents +- listing root causes of incidents +- retrieving original time series data and time series data enriched by the service. 
+- listing alerts +- adding feedback to tune your model + +### MetricsAdvisorAdministrationClient + +`MetricsAdvisorAdministrationClient` allows you to + +- manage data feeds +- manage anomaly detection configurations +- manage anomaly alerting configurations +- manage hooks + +### DataFeed + +A `DataFeed` is what Metrics Advisor ingests from your data source, such as Cosmos DB or a SQL server. A data feed contains rows of: + +- timestamps +- zero or more dimensions +- one or more measures + +### Metric + +A `DataFeedMetric` is a quantifiable measure that is used to monitor and assess the status of a specific business process. It can be a combination of multiple time series values divided into dimensions. For example a web health metric might contain dimensions for user count and the en-us market. + +### AnomalyDetectionConfiguration + +`AnomalyDetectionConfiguration` is required for every time series, and determines whether a point in the time series is an anomaly. + +### Anomaly & Incident + +After a detection configuration is applied to metrics, `AnomalyIncident`s are generated whenever any series within it has an `DataPointAnomaly`. + +### Alert + +You can configure which anomalies should trigger an `AnomalyAlert`. You can set multiple alerts with different settings. For example, you could create an alert for anomalies with lower business impact, and another for more important alerts. + +### Notification Hook + +Metrics Advisor lets you create and subscribe to real-time alerts. These alerts are sent over the internet, using a notification hook like `EmailNotificationHook` or `WebNotificationHook`. + +## Examples + +- [Add a data feed from a sample or data source](#add-a-data-feed-from-a-sample-or-data-source "Add a data feed from a sample or data source") +- [Check ingestion status](#check-ingestion-status "Check ingestion status") +- [Configure anomaly detection configuration](#configure-anomaly-detection-configuration "Configure anomaly detection configuration") +- [Configure alert configuration](#configure-alert-configuration "Configure alert configuration") +- [Query anomaly detection results](#query-anomaly-detection-results "Query anomaly detection results") +- [Query incidents](#query-incidents "Query incidents") +- [Query root causes](#query-root-causes "Query root causes") +- [Add hooks for receiving anomaly alerts](#add-hooks-for-receiving-anomaly-alerts "Add hooks for receiving anomaly alerts") + +### Add a data feed from a sample or data source + +Metrics Advisor supports connecting different types of data sources. Here is a sample to ingest data from SQL Server. 
+ +```py +import os +import datetime +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient +from azure.ai.metricsadvisor.models import ( + SqlServerDataFeedSource, + DataFeedSchema, + DataFeedMetric, + DataFeedDimension, + DataFeedRollupSettings, + DataFeedMissingDataPointFillSettings + ) + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +sql_server_connection_string = os.getenv("SQL_SERVER_CONNECTION_STRING") +query = os.getenv("SQL_SERVER_QUERY") + +client = MetricsAdvisorAdministrationClient( + service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +data_feed = client.create_data_feed( + name="My data feed", + source=SqlServerDataFeedSource( + connection_string=sql_server_connection_string, + query=query, + ), + granularity="Daily", + schema=DataFeedSchema( + metrics=[ + DataFeedMetric(name="cost", display_name="Cost"), + DataFeedMetric(name="revenue", display_name="Revenue") + ], + dimensions=[ + DataFeedDimension(name="category", display_name="Category"), + DataFeedDimension(name="city", display_name="City") + ], + timestamp_column="Timestamp" + ), + ingestion_settings=datetime.datetime(2019, 10, 1), + data_feed_description="cost/revenue data feed", + rollup_settings=DataFeedRollupSettings( + rollup_type="AutoRollup", + rollup_method="Sum", + rollup_identification_value="__CUSTOM_SUM__" + ), + missing_data_point_fill_settings=DataFeedMissingDataPointFillSettings( + fill_type="SmartFilling" + ), + access_mode="Private" +) + +return data_feed +``` + +### Check ingestion status + +After we start the data ingestion, we can check the ingestion status. + +```py +import datetime +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +data_feed_id = os.getenv("DATA_FEED_ID") + +client = MetricsAdvisorAdministrationClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +ingestion_status = client.list_data_feed_ingestion_status( + data_feed_id, + datetime.datetime(2020, 9, 20), + datetime.datetime(2020, 9, 25) +) +for status in ingestion_status: + print("Timestamp: {}".format(status.timestamp)) + print("Status: {}".format(status.status)) + print("Message: {}\n".format(status.message)) +``` + +### Configure anomaly detection configuration + +While a default detection configuration is automatically applied to each metric, we can tune the detection modes used on our data by creating a customized anomaly detection configuration. 
+ +```py +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient +from azure.ai.metricsadvisor.models import ( + ChangeThresholdCondition, + HardThresholdCondition, + SmartDetectionCondition, + SuppressCondition, + MetricDetectionCondition, +) + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +metric_id = os.getenv("METRIC_ID") + +client = MetricsAdvisorAdministrationClient( + service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +change_threshold_condition = ChangeThresholdCondition( + anomaly_detector_direction="Both", + change_percentage=20, + shift_point=10, + within_range=True, + suppress_condition=SuppressCondition( + min_number=5, + min_ratio=2 + ) +) +hard_threshold_condition = HardThresholdCondition( + anomaly_detector_direction="Up", + upper_bound=100, + suppress_condition=SuppressCondition( + min_number=2, + min_ratio=2 + ) +) +smart_detection_condition = SmartDetectionCondition( + anomaly_detector_direction="Up", + sensitivity=10, + suppress_condition=SuppressCondition( + min_number=2, + min_ratio=2 + ) +) + +detection_config = client.create_detection_configuration( + name="my_detection_config", + metric_id=metric_id, + description="anomaly detection config for metric", + whole_series_detection_condition=MetricDetectionCondition( + condition_operator="OR", + change_threshold_condition=change_threshold_condition, + hard_threshold_condition=hard_threshold_condition, + smart_detection_condition=smart_detection_condition + ) +) +return detection_config +``` + +### Configure alert configuration + +Then let's configure in which conditions an alert needs to be triggered. + +```py +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient +from azure.ai.metricsadvisor.models import ( + MetricAlertConfiguration, + MetricAnomalyAlertScope, + TopNGroupScope, + MetricAnomalyAlertConditions, + SeverityCondition, + MetricBoundaryCondition, + MetricAnomalyAlertSnoozeCondition, +) +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") +hook_id = os.getenv("HOOK_ID") + +client = MetricsAdvisorAdministrationClient( + service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +alert_config = client.create_alert_configuration( + name="my alert config", + description="alert config description", + cross_metrics_operator="AND", + metric_alert_configurations=[ + MetricAlertConfiguration( + detection_configuration_id=anomaly_detection_configuration_id, + alert_scope=MetricAnomalyAlertScope( + scope_type="WholeSeries" + ), + alert_conditions=MetricAnomalyAlertConditions( + severity_condition=SeverityCondition( + min_alert_severity="Low", + max_alert_severity="High" + ) + ) + ), + MetricAlertConfiguration( + detection_configuration_id=anomaly_detection_configuration_id, + alert_scope=MetricAnomalyAlertScope( + scope_type="TopN", + top_n_group_in_scope=TopNGroupScope( + top=10, + period=5, + min_top_count=5 + ) + ), + alert_conditions=MetricAnomalyAlertConditions( + metric_boundary_condition=MetricBoundaryCondition( + direction="Up", + upper=50 + ) + ), + alert_snooze_condition=MetricAnomalyAlertSnoozeCondition( + auto_snooze=2, + snooze_scope="Metric", + only_for_successive=True + ) + ), + ], + hook_ids=[hook_id] +) + +return alert_config +``` + +### 
Query anomaly detection results + +We can query the alerts and anomalies. + +```py +import datetime +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +alert_config_id = os.getenv("ALERT_CONFIG_ID") +alert_id = os.getenv("ALERT_ID") + +client = MetricsAdvisorClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +results = client.list_alerts( + alert_configuration_id=alert_config_id, + start_time=datetime.datetime(2020, 1, 1), + end_time=datetime.datetime(2020, 9, 9), + time_mode="AnomalyTime", +) +for result in results: + print("Alert id: {}".format(result.id)) + print("Create time: {}".format(result.created_time)) + +results = client.list_anomalies( + alert_configuration_id=alert_config_id, + alert_id=alert_id, +) +for result in results: + print("Create time: {}".format(result.created_time)) + print("Severity: {}".format(result.severity)) + print("Status: {}".format(result.status)) +``` + +### Query incidents + +We can query the incidents for a detection configuration. + +```py +import datetime +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") + +client = MetricsAdvisorClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +results = client.list_incidents( + detection_configuration_id=anomaly_detection_configuration_id, + start_time=datetime.datetime(2020, 1, 1), + end_time=datetime.datetime(2020, 9, 9), + ) +for result in results: + print("Metric id: {}".format(result.metric_id)) + print("Incident ID: {}".format(result.id)) + print("Severity: {}".format(result.severity)) + print("Status: {}".format(result.status)) +``` + +### Query root causes + +We can also query the root causes of an incident + +```py +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorClient + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") +anomaly_detection_configuration_id = os.getenv("DETECTION_CONFIGURATION_ID") +incident_id = os.getenv("INCIDENT_ID") + +client = MetricsAdvisorClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +results = client.list_incident_root_causes( + detection_configuration_id=anomaly_detection_configuration_id, + incident_id=incident_id, + ) +for result in results: + print("Score: {}".format(result.score)) + print("Description: {}".format(result.description)) + +``` + + +### Add hooks for receiving anomaly alerts + +We can add some hooks so when an alert is triggered, we can get call back. 
+ +```py +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential, MetricsAdvisorAdministrationClient +from azure.ai.metricsadvisor.models import EmailNotificationHook + +service_endpoint = os.getenv("ENDPOINT") +subscription_key = os.getenv("SUBSCRIPTION_KEY") +api_key = os.getenv("API_KEY") + +client = MetricsAdvisorAdministrationClient(service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key)) + +hook = client.create_hook( + hook=EmailNotificationHook( + name="email hook", + description="my email hook", + emails_to_alert=["alertme@alertme.com"], + external_link="https://docs.microsoft.com/en-us/azure/cognitive-services/metrics-advisor/how-tos/alerts" + ) +) +``` + +### Async APIs + +This library includes a complete async API supported on Python 3.6+. To use it, you must +first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/). +See +[azure-core documentation][azure_core_docs] +for more information. + + +```py +from azure.ai.metricsadvisor import MetricsAdvisorKeyCredential +from azure.ai.metricsadvisor.aio import MetricsAdvisorClient, MetricsAdvisorAdministrationClient + +client = MetricsAdvisorClient( + service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) + +admin_client = MetricsAdvisorAdministrationClient( + service_endpoint, + MetricsAdvisorKeyCredential(subscription_key, api_key) +) +``` + +## Troubleshooting + +### General + +The Azure Metrics Advisor clients will raise exceptions defined in [Azure Core][azure_core]. + +### Logging +This library uses the standard +[logging][python_logging] library for logging. + +Basic information about HTTP sessions (URLs, headers, etc.) is logged at `INFO` level. + +Detailed `DEBUG` level logging, including request/response bodies and **unredacted** +headers, can be enabled on the client or per-operation with the `logging_enable` keyword argument. + +See full SDK logging documentation with examples [here][sdk_logging_docs]. + +## Next steps + +### More sample code + + For more details see the [samples README][samples_readme]. + +## Contributing + +This project welcomes contributions and suggestions. Most contributions require +you to agree to a Contributor License Agreement (CLA) declaring that you have +the right to, and actually do, grant us the rights to use your contribution. For +details, visit [cla.microsoft.com][cla]. + +This project has adopted the [Microsoft Open Source Code of Conduct][code_of_conduct]. +For more information see the [Code of Conduct FAQ][coc_faq] +or contact [opencode@microsoft.com][coc_contact] with any +additional questions or comments. 
+ + +[src_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-ai-metricsadvisor_1.0.0/sdk/metricsadvisor/azure-ai-metricsadvisor +[reference_documentation]: https://aka.ms/azsdk/python/metricsadvisor/docs +[ma_docs]: https://docs.microsoft.com/azure/cognitive-services/metrics-advisor/overview +[azure_cli]: https://docs.microsoft.com/cli/azure +[azure_sub]: https://azure.microsoft.com/free/ +[package]: https://aka.ms/azsdk/python/metricsadvisor/pypi +[ma_service]: https://go.microsoft.com/fwlink/?linkid=2142156 +[python_logging]: https://docs.python.org/3.5/library/logging.html +[azure_core]: https://aka.ms/azsdk/python/core/docs#module-azure.core.exceptions +[azure_core_docs]: https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-metricsadvisor_1.0.0/sdk/core/azure-core/README.md#transport +[sdk_logging_docs]: https://docs.microsoft.com/azure/developer/python/azure-sdk-logging +[samples_readme]: https://github.com/Azure/azure-sdk-for-python/blob/azure-ai-metricsadvisor_1.0.0/sdk/metricsadvisor/azure-ai-metricsadvisor/samples/README.md + +[cla]: https://cla.microsoft.com +[code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ +[coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/ [coc_contact]: mailto:opencode@microsoft.com - + diff --git a/docs-ref-services/latest/communication-callautomation-readme.md b/docs-ref-services/latest/communication-callautomation-readme.md index 3956eec3cdeb..61fdfea40826 100644 --- a/docs-ref-services/latest/communication-callautomation-readme.md +++ b/docs-ref-services/latest/communication-callautomation-readme.md @@ -34,7 +34,7 @@ pip install azure-communication-callautomation | Name | Description | | -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | CallAutomationClient | `CallAutomationClient` is the primary interface for developers using this client library. It can be used to initiate calls by `createCall` or `answerCall`. It can also be used to do recording actions such as `startRecording` | | -| CallConnectionClient | `CallConnectionClient` represents a ongoing call. Once the call is established with `createCall` or `answerCall`, further actions can be performed for the call, such as `transfer` or `play_media`. | | +| CallConnectionClient | `CallConnectionClient` represents an ongoing call. Once the call is established with `createCall` or `answerCall`, further actions can be performed for the call, such as `transfer` or `play_media`. | | | Callback Events | Callback events are events sent back during duration of the call. It gives information and state of the call, such as `CallConnected`. `CallbackUrl` must be provided during `createCall` and `answerCall`, and callback events will be sent to this url. | | Incoming Call Event | When incoming call happens (that can be answered with `answerCall`), incoming call eventgrid event will be sent. This is different from Callback events above, and should be setup on Azure portal. See [Incoming Call][incomingcall] for detail. 
| diff --git a/docs-ref-services/latest/communication-chat-readme.md b/docs-ref-services/latest/communication-chat-readme.md index 97b18862ce77..0563182171c2 100644 --- a/docs-ref-services/latest/communication-chat-readme.md +++ b/docs-ref-services/latest/communication-chat-readme.md @@ -461,7 +461,7 @@ new_users = [identity_client.create_user() for i in range(2)] # from azure.communication.chat import CommunicationUserIdentifier # # user_id = 'some user id' -# user_display_name = "Wilma Flinstone" +# user_display_name = "Wilma Flintstone" # new_user = CommunicationUserIdentifier(user_id) # participant = ChatParticipant( # identifier=new_user, @@ -472,7 +472,7 @@ participants = [] for _user in new_users: chat_participant = ChatParticipant( identifier=_user, - display_name='Fred Flinstone', + display_name='Fred Flintstone', share_history_time=datetime.utcnow() ) participants.append(chat_participant) diff --git a/docs-ref-services/latest/confidentialledger-readme.md b/docs-ref-services/latest/confidentialledger-readme.md index d0c7ac6af743..d9a2087074e3 100644 --- a/docs-ref-services/latest/confidentialledger-readme.md +++ b/docs-ref-services/latest/confidentialledger-readme.md @@ -6,612 +6,612 @@ ms.topic: reference ms.devlang: python ms.service: confidentialledger --- -# Azure Confidential Ledger client library for Python - version 1.1.1 - - -Azure Confidential Ledger provides a service for logging to an immutable, tamper-proof ledger. As part of the [Azure Confidential Computing][azure_confidential_computing] portfolio, Azure Confidential Ledger runs in secure, hardware-based trusted execution environments, also known as enclaves. It is built on Microsoft Research's [Confidential Consortium Framework][ccf]. - -[Source code][confidential_ledger_client_src] -| [Package (PyPI)][pypi_package_confidential_ledger] -| [Package (Conda)](https://anaconda.org/microsoft/azure-confidentialledger/) -| [API reference documentation][reference_docs] -| [Product documentation][confidential_ledger_docs] - -## Getting started -### Install packages -Install [azure-confidentialledger][pypi_package_confidential_ledger] and [azure-identity][azure_identity_pypi] with [pip][pip]: -```Bash -pip install azure-identity azure-confidentialledger -``` -[azure-identity][azure_identity] is used for Azure Active Directory -authentication as demonstrated below. - -### Prerequisites -* An [Azure subscription][azure_sub] -* Python 3.6 or later -* A running instance of Azure Confidential Ledger. -* A registered user in the Confidential Ledger, typically assigned during [ARM][azure_resource_manager] resource creation, with `Administrator` privileges. - -### Authenticate the client -#### Using Azure Active Directory -This document demonstrates using [DefaultAzureCredential][default_cred_ref] to authenticate to the Confidential Ledger via Azure Active Directory. However, `ConfidentialLedgerClient` accepts any [azure-identity][azure_identity] credential. See the [azure-identity][azure_identity] documentation for more information about other credentials. - -#### Using a client certificate -As an alternative to Azure Active Directory, clients may choose to use a client certificate to authenticate via mutual TLS. `azure.confidentialledger.ConfidentialLedgerCertificateCredential` may be used for this purpose. - -### Create a client -`DefaultAzureCredential` will automatically handle most Azure SDK client scenarios. To get started, set environment variables for the AAD identity registered with your Confidential Ledger. 
-```bash -export AZURE_CLIENT_ID="generated app id" -export AZURE_CLIENT_SECRET="random password" -export AZURE_TENANT_ID="tenant id" -``` -Then, `DefaultAzureCredential` will be able to authenticate the `ConfidentialLedgerClient`. - -Constructing the client also requires your Confidential Ledger's URL and id, which you can get from the Azure CLI or the Azure Portal. When you have retrieved those values, please replace instances of `"my-ledger-id"` and `"https://my-ledger-id.confidential-ledger.azure.com"` in the examples below. You may also need to replace `"https://identity.confidential-ledger.core.azure.com"` with the hostname from the `identityServiceUri` in the ARM description of your ledger. - -Because Confidential Ledgers use self-signed certificates securely generated and stored in an enclave, the signing certificate for each Confidential Ledger must first be retrieved from the Confidential Ledger Identity Service. - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) -``` - -Conveniently, the `ConfidentialLedgerClient` constructor will fetch the ledger TLS certificate (and write it to the specified file) if it is provided with a non-existent file. The user is responsible for removing the created file as needed. - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.identity import DefaultAzureCredential - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path="ledger_certificate.pem" -) - -# The ledger TLS certificate is written to `ledger_certificate.pem`. -``` - -To make it clear that a file is being used for the ledger TLS certificate, subsequent examples will explicitly write the ledger TLS certificate to a file. - -## Key concepts -### Ledger entries and transactions -Every write to Azure Confidential Ledger generates an immutable ledger entry in the service. Writes, also referred to as transactions, are uniquely identified by transaction ids that increment with each write. Once written, ledger entries may be retrieved at any time. - -### Collections -While most use cases involve just one collection per Confidential Ledger, we provide the collection id feature in case semantically or logically different groups of data need to be stored in the same Confidential Ledger. - -Ledger entries are retrieved by their `collectionId`. The Confidential Ledger will always assume a constant, service-determined `collectionId` for entries written without a `collectionId` specified. - -### Users -Users are managed directly with the Confidential Ledger instead of through Azure. Users may be AAD-based, identified by their AAD object id, or certificate-based, identified by their PEM certificate fingerprint. 
- -### Receipts - -To enforce transaction integrity guarantees, an Azure Confidential Ledger uses a [Merkle tree][merkle_tree_wiki] data structure to record the hash of all transactions blocks that are appended to the immutable ledger. After a write transaction is committed, Azure Confidential Ledger users can get a cryptographic Merkle proof, or receipt, over the entry produced in a Confidential Ledger to verify that the write operation was correctly saved. A write transaction receipt is proof that the system has committed the corresponding transaction and can be used to verify that the entry has been effectively appended to the ledger. - -Please refer to the following [article](https://learn.microsoft.com/azure/confidential-ledger/write-transaction-receipts) for more information about Azure Confidential Ledger write transaction receipts. - -### Receipt Verification - -After getting a receipt for a write transaction, Azure Confidential Ledger users can verify the contents of the fetched receipt following a verification algorithm. The success of the verification is proof that the write operation associated to the receipt was correctly appended into the immutable ledger. - -Please refer to the following [article](https://learn.microsoft.com/azure/confidential-ledger/verify-write-transaction-receipts) for more information about the verification process for Azure Confidential Ledger write transaction receipts. - -### Application Claims -Azure Confidential Ledger applications can attach arbitrary data, called application claims, to write transactions. These claims represent the actions executed during a write operation. When attached to a transaction, the SHA-256 digest of the claims object is included in the ledger and committed as part of the write transaction. This guarantees that the digest is signed in place and cannot be tampered with. - -Later, application claims can be revealed in their un-digested form in the receipt payload corresponding to the same transaction where they were added. This allows users to leverage the information in the receipt to re-compute the same claims digest that was attached and signed in place by the Azure Confidential Ledger instance during the transaction. The claims digest can be used as part of the write transaction receipt verification process, providing an offline way for users to fully verify the authenticity of the recorded claims. - -More details on the application claims format and the digest computation algorithm can be found at the following links: - -- [Azure Confidential Ledger application claims](https://learn.microsoft.com/azure/confidential-ledger/write-transaction-receipts#application-claims) -- [Azure Confidential Ledger application claims digest verification](https://learn.microsoft.com/azure/confidential-ledger/verify-write-transaction-receipts#verify-application-claims-digest) - -Please refer to the following CCF documentation pages for more information about CCF Application claims: - -- [Application Claims](https://microsoft.github.io/CCF/main/use_apps/verify_tx.html#application-claims) -- [User-Defined Claims in Receipts](https://microsoft.github.io/CCF/main/build_apps/example_cpp.html#user-defined-claims-in-receipts) - -### Confidential computing -[Azure Confidential Computing][azure_confidential_computing] allows you to isolate and protect your data while it is being processed in the cloud. 
Azure Confidential Ledger runs on Azure Confidential Computing virtual machines, thus providing stronger data protection with encryption of data in use. - -### Confidential Consortium Framework -Azure Confidential Ledger is built on Microsoft Research's open-source [Confidential Consortium Framework (CCF)][ccf]. Under CCF, applications are managed by a consortium of members with the ability to submit proposals to modify and govern application operation. In Azure Confidential Ledger, Microsoft Azure owns an operator member identity that allows it to perform governance and maintenance actions like replacing unhealthy nodes in the Confidential Ledger and upgrading the enclave code. - -## Examples -This section contains code snippets covering common tasks, including: -- [Append entry](#append-entry) -- [Retrieving ledger entries](#retrieving-ledger-entries) -- [Making a ranged query](#making-a-ranged-query) -- [Managing users](#managing-users) -- [Using certificate authentication](#using-certificate-authentication) -- [Verify write transaction receipts](#verify-write-transaction-receipts) - -### Append entry -Data that needs to be stored immutably in a tamper-proof manner can be saved to Azure Confidential Ledger by appending an entry to the ledger. - -Since Confidential Ledger is a distributed system, rare transient failures may cause writes to be lost. For entries that must be preserved, it is advisable to verify that the write became durable. For less important writes where higher client throughput is preferred, the wait step may be skipped. - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -post_entry_result = ledger_client.create_ledger_entry( - {"contents": "Hello world!"} - ) -transaction_id = post_entry_result["transactionId"] - -wait_poller = ledger_client.begin_wait_for_commit(transaction_id) -wait_poller.wait() -print(f'Ledger entry at transaction id {transaction_id} has been committed successfully') -``` - -Alternatively, the client may wait for commit when writing a ledger entry. 
- -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -post_poller = ledger_client.begin_create_ledger_entry( - {"contents": "Hello world again!"} -) -new_post_result = post_poller.result() -print( - 'The new ledger entry has been committed successfully at transaction id ' - f'{new_post_result["transactionId"]}' -) -``` - -### Retrieving ledger entries -Getting ledger entries older than the latest may take some time as the service is loading historical entries, so a poller is provided. - -Ledger entries are retrieved by collection. The returned value is the value contained in the specified collection at the point in time identified by the transaction id. - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -post_poller = ledger_client.begin_create_ledger_entry( - {"contents": "Original hello"} -) -post_result = post_poller.result() - -post_transaction_id = post_result["transactionId"] - -latest_entry = ledger_client.get_current_ledger_entry() -print( - f'Current entry (transaction id = {latest_entry["transactionId"]}) ' - f'in collection {latest_entry["collectionId"]}: {latest_entry["contents"]}' -) - -post_poller = ledger_client.begin_create_ledger_entry( - {"contents": "Hello!"} -) -post_result = post_poller.result() - -get_entry_poller = ledger_client.begin_get_ledger_entry(post_transaction_id) -older_entry = get_entry_poller.result() -print( - f'Contents of {older_entry["entry"]["collectionId"]} at {post_transaction_id}: {older_entry["entry"]["contents"]}' -) -``` - -### Making a ranged query -Ledger entries may be retrieved over a range of transaction ids. Entries will only be returned from the default or specified collection. 
- -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -post_poller = ledger_client.begin_create_ledger_entry( - {"contents": "First message"} -) -first_transaction_id = post_poller.result()["transactionId"] - -for i in range(10): - ledger_client.create_ledger_entry( - {"contents": f"Message {i}"} - ) - -post_poller = ledger_client.begin_create_ledger_entry( - {"contents": "Last message"} -) -last_transaction_id = post_poller.result()["transactionId"] - -ranged_result = ledger_client.list_ledger_entries( - from_transaction_id=first_transaction_id, - to_transaction_id=last_transaction_id, -) -for entry in ranged_result: - print(f'Contents at {entry["transactionId"]}: {entry["contents"]}') -``` - -### Managing users -Users with `Administrator` privileges can manage users of the Confidential Ledger directly with the Confidential Ledger itself. Available roles are `Reader` (read-only), `Contributor` (read and write), and `Administrator` (read, write, and add or remove users). - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -user_id = "some AAD object id" -user = ledger_client.create_or_update_user( - user_id, {"assignedRole": "Contributor"} -) -# A client may now be created and used with AAD credentials (i.e. AAD-issued JWT tokens) for the user identified by `user_id`. - -user = ledger_client.get_user(user_id) -assert user["userId"] == user_id -assert user["assignedRole"] == "Contributor" - -ledger_client.delete_user(user_id) - -# For a certificate-based user, their user ID is the fingerprint for their PEM certificate. -user_id = "PEM certificate fingerprint" -user = ledger_client.create_or_update_user( - user_id, {"assignedRole": "Reader"} -) - -user = ledger_client.get_user(user_id) -assert user["userId"] == user_id -assert user["assignedRole"] == "Reader" - -ledger_client.delete_user(user_id) -``` - -### Using certificate authentication -Clients may authenticate with a client certificate in mutual TLS instead of via an Azure Active Directory token. `ConfidentialLedgerCertificateCredential` is provided for such clients. 
- -```python -from azure.confidentialledger import ( - ConfidentialLedgerCertificateCredential, - ConfidentialLedgerClient, -) -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = ConfidentialLedgerCertificateCredential( - certificate_path="Path to user certificate PEM file" -) -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) -``` - -### Verify write transaction receipts - -Clients can leverage the receipt verification library in the SDK to verify write transaction receipts issued by Azure Confidential Legder instances. The utility can be used to fully verify receipts offline as the verification algorithm does not require to be connected to a Confidential ledger or any other Azure service. - -Once a new entry has been appended to the ledger (please refer to [this example](https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger#append-entry)), it is possible to get a receipt for the committed write transaction. - -```python -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -# Replace this with the Confidential Ledger ID -ledger_id = "my-ledger-id" - -# Setup authentication -credential = DefaultAzureCredential() - -# Create a Ledger Certificate client and use it to -# retrieve the service identity for our ledger -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id=ledger_id -) - -# Save ledger service certificate into a file for later use -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -# Create Confidential Ledger client -ledger_client = ConfidentialLedgerClient( - endpoint=f"https://{ledger_id}.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -# The method begin_get_receipt returns a poller that -# we can use to wait for the receipt to be available for retrieval -get_receipt_poller = ledger_client.begin_get_receipt(transaction_id) -get_receipt_result = get_receipt_poller.result() - -print(f"Write receipt for transaction id {transaction_id} was successfully retrieved: {get_receipt_result}") -``` - -After fetching a receipt for a write transaction, it is possible to call the `verify_receipt` function to verify that the receipt is valid. The function can accept an optional list of application claims to verify against the receipt claims digest. - -```python -from azure.confidentialledger.receipt import ( - verify_receipt, -) - -# Read contents of service certificate file saved in previous step. 
-with open(ledger_tls_cert_file_name, "r") as service_cert_file: - service_cert_content = service_cert_file.read() - -# Optionally read application claims, if any -application_claims = get_receipt_result.get("applicationClaims", None) - -try: - # Verify the contents of the receipt. - verify_receipt(get_receipt_result["receipt"], service_cert_content, application_claims=application_claims) - print(f"Receipt for transaction id {transaction_id} successfully verified") -except ValueError: - print(f"Receipt verification for transaction id {transaction_id} failed") -``` - -A full sample Python program that shows how to append a new entry to a running Confidential Ledger instance, get a receipt for the committed transaction, and verify the receipt contents can be found under the [samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples) folder: [get_and_verify_receipt.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_and_verify_receipt.py). - -### Async API -This library includes a complete async API supported on Python 3.5+. To use it, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp). See the [azure-core documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md#transport) for more information. - -An async client is obtained from `azure.confidentialledger.aio`. Methods have the same names and signatures as the synchronous client. Samples may be found [here](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples). - -## Troubleshooting -### General -Confidential Ledger clients raise exceptions defined in [azure-core][azure_core_exceptions]. For example, if you try to get a transaction that doesn't exist, `ConfidentialLedgerClient` raises [ResourceNotFoundError](https://aka.ms/azsdk-python-core-exceptions-resource-not-found-error): - -```python -from azure.core.exceptions import ResourceNotFoundError -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name -) - -try: - ledger_client.begin_get_ledger_entry( - transaction_id="10000.100000" # Using a very high id that probably doesn't exist in the ledger if it's relatively new. - ) -except ResourceNotFoundError as e: - print(e.message) -``` - -### Logging -This library uses the standard -[logging](https://docs.python.org/3.5/library/logging.html) library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level. 
- -Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the `logging_enable` argument: -```python -import logging -import sys - -from azure.confidentialledger import ConfidentialLedgerClient -from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient -from azure.identity import DefaultAzureCredential - -# Create a logger for the 'azure' SDK -logger = logging.getLogger('azure') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -identity_client = ConfidentialLedgerCertificateClient() -network_identity = identity_client.get_ledger_identity( - ledger_id="my-ledger-id" -) - -ledger_tls_cert_file_name = "ledger_certificate.pem" -with open(ledger_tls_cert_file_name, "w") as cert_file: - cert_file.write(network_identity["ledgerTlsCertificate"]) - -credential = DefaultAzureCredential() - -# This client will log detailed information about its HTTP sessions, at DEBUG level. -ledger_client = ConfidentialLedgerClient( - endpoint="https://my-ledger-id.confidential-ledger.azure.com", - credential=credential, - ledger_certificate_path=ledger_tls_cert_file_name, - logging_enable=True, -) -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, even when it isn't enabled for the client: -```python -ledger_client.get_current_ledger_entry(logging_enable=True) -``` - -## Next steps -### More sample code -These code samples show common scenario operations with the Azure Confidential Ledger client library. - -#### Common scenarios - -- Writing to the ledger: - - [write_to_ledger.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/write_to_ledger.py) - - [write_to_ledger_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/write_to_ledger_async.py) (async version) - -- Write many ledger entries and retrieve them all afterwards: - - [list_ledger_entries.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/list_ledger_entries.py) - - [list_ledger_entries_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/list_ledger_entries_async.py) (async version) - -- Manage users using service-implemented role-based access control: - - [manage_users.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/manage_users.py) - - [manage_users_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/manage_users_async.py) (async version) - -#### Advanced scenarios - -- Using collections: - - [use_collections.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/use_collections.py) - - [use_collections_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/use_collections_async.py) (async version) - -- Getting receipts for ledger writes: - - 
[get_receipt.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_receipt.py) - - [get_receipt_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_receipt_async.py) (async version) - -- Verifying service details: - - [verify_service.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/verify_service.py) - - [verify_service_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/verify_service_async.py) (async version) - -### Additional Documentation -For more extensive documentation on Azure Confidential Ledger, see the -[API reference documentation][reference_docs]. You may also read more about Microsoft Research's open-source [Confidential Consortium Framework][ccf]. - -## Contributing -This project welcomes contributions and suggestions. Most contributions require -you to agree to a Contributor License Agreement (CLA) declaring that you have -the right to, and actually do, grant us the rights to use your contribution. -For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether -you need to provide a CLA and decorate the PR appropriately (e.g., label, -comment). Simply follow the instructions provided by the bot. You will only -need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct][code_of_conduct]. -For more information, see the -[Code of Conduct FAQ][code_of_conduct_faq] or -contact opencode@microsoft.com with any additional questions or comments. - - -[azure_cli]: /cli/azure -[azure_cloud_shell]: https://shell.azure.com/bash -[azure_confidential_computing]: https://azure.microsoft.com/solutions/confidential-compute -[azure_core_exceptions]: https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/core/azure-core#azure-core-library-exceptions -[azure_identity]: https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/identity/azure-identity -[azure_identity_pypi]: https://pypi.org/project/azure-identity/ -[azure_resource_manager]: /azure/azure-resource-manager/management/overview -[azure_sub]: https://azure.microsoft.com/free -[ccf]: https://github.com/Microsoft/CCF -[code_of_conduct]: https://opensource.microsoft.com/codeofconduct -[code_of_conduct_faq]: https://opensource.microsoft.com/codeofconduct/faq -[confidential_ledger_client_src]: https://aka.ms/azsdk/python/confidentialledger/src -[confidential_ledger_docs]: https://aka.ms/confidentialledger-servicedocs -[default_cred_ref]: https://aka.ms/azsdk/python/identity/docs#azure.identity.DefaultAzureCredential -[pip]: https://pypi.org/project/pip/ -[pypi_package_confidential_ledger]: https://aka.ms/azsdk/python/confidentialledger/pypi +# Azure Confidential Ledger client library for Python - version 1.1.1 + + +Azure Confidential Ledger provides a service for logging to an immutable, tamper-proof ledger. As part of the [Azure Confidential Computing][azure_confidential_computing] portfolio, Azure Confidential Ledger runs in secure, hardware-based trusted execution environments, also known as enclaves. 
It is built on Microsoft Research's [Confidential Consortium Framework][ccf]. + +[Source code][confidential_ledger_client_src] +| [Package (PyPI)][pypi_package_confidential_ledger] +| [Package (Conda)](https://anaconda.org/microsoft/azure-confidentialledger/) +| [API reference documentation][reference_docs] +| [Product documentation][confidential_ledger_docs] + +## Getting started +### Install packages +Install [azure-confidentialledger][pypi_package_confidential_ledger] and [azure-identity][azure_identity_pypi] with [pip][pip]: +```Bash +pip install azure-identity azure-confidentialledger +``` +[azure-identity][azure_identity] is used for Azure Active Directory +authentication as demonstrated below. + +### Prerequisites +* An [Azure subscription][azure_sub] +* Python 3.6 or later +* A running instance of Azure Confidential Ledger. +* A registered user in the Confidential Ledger, typically assigned during [ARM][azure_resource_manager] resource creation, with `Administrator` privileges. + +### Authenticate the client +#### Using Azure Active Directory +This document demonstrates using [DefaultAzureCredential][default_cred_ref] to authenticate to the Confidential Ledger via Azure Active Directory. However, `ConfidentialLedgerClient` accepts any [azure-identity][azure_identity] credential. See the [azure-identity][azure_identity] documentation for more information about other credentials. + +#### Using a client certificate +As an alternative to Azure Active Directory, clients may choose to use a client certificate to authenticate via mutual TLS. `azure.confidentialledger.ConfidentialLedgerCertificateCredential` may be used for this purpose. + +### Create a client +`DefaultAzureCredential` will automatically handle most Azure SDK client scenarios. To get started, set environment variables for the AAD identity registered with your Confidential Ledger. +```bash +export AZURE_CLIENT_ID="generated app id" +export AZURE_CLIENT_SECRET="random password" +export AZURE_TENANT_ID="tenant id" +``` +Then, `DefaultAzureCredential` will be able to authenticate the `ConfidentialLedgerClient`. + +Constructing the client also requires your Confidential Ledger's URL and id, which you can get from the Azure CLI or the Azure Portal. When you have retrieved those values, please replace instances of `"my-ledger-id"` and `"https://my-ledger-id.confidential-ledger.azure.com"` in the examples below. You may also need to replace `"https://identity.confidential-ledger.core.azure.com"` with the hostname from the `identityServiceUri` in the ARM description of your ledger. + +Because Confidential Ledgers use self-signed certificates securely generated and stored in an enclave, the signing certificate for each Confidential Ledger must first be retrieved from the Confidential Ledger Identity Service. 
+ +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) +``` + +Conveniently, the `ConfidentialLedgerClient` constructor will fetch the ledger TLS certificate (and write it to the specified file) if it is provided with a non-existent file. The user is responsible for removing the created file as needed. + +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.identity import DefaultAzureCredential + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path="ledger_certificate.pem" +) + +# The ledger TLS certificate is written to `ledger_certificate.pem`. +``` + +To make it clear that a file is being used for the ledger TLS certificate, subsequent examples will explicitly write the ledger TLS certificate to a file. + +## Key concepts +### Ledger entries and transactions +Every write to Azure Confidential Ledger generates an immutable ledger entry in the service. Writes, also referred to as transactions, are uniquely identified by transaction ids that increment with each write. Once written, ledger entries may be retrieved at any time. + +### Collections +While most use cases involve just one collection per Confidential Ledger, we provide the collection id feature in case semantically or logically different groups of data need to be stored in the same Confidential Ledger. + +Ledger entries are retrieved by their `collectionId`. The Confidential Ledger will always assume a constant, service-determined `collectionId` for entries written without a `collectionId` specified. + +### Users +Users are managed directly with the Confidential Ledger instead of through Azure. Users may be AAD-based, identified by their AAD object id, or certificate-based, identified by their PEM certificate fingerprint. + +### Receipts + +To enforce transaction integrity guarantees, an Azure Confidential Ledger uses a [Merkle tree][merkle_tree_wiki] data structure to record the hash of all transactions blocks that are appended to the immutable ledger. After a write transaction is committed, Azure Confidential Ledger users can get a cryptographic Merkle proof, or receipt, over the entry produced in a Confidential Ledger to verify that the write operation was correctly saved. A write transaction receipt is proof that the system has committed the corresponding transaction and can be used to verify that the entry has been effectively appended to the ledger. + +Please refer to the following [article](https://learn.microsoft.com/azure/confidential-ledger/write-transaction-receipts) for more information about Azure Confidential Ledger write transaction receipts. 
+ +### Receipt Verification + +After getting a receipt for a write transaction, Azure Confidential Ledger users can verify the contents of the fetched receipt following a verification algorithm. The success of the verification is proof that the write operation associated to the receipt was correctly appended into the immutable ledger. + +Please refer to the following [article](https://learn.microsoft.com/azure/confidential-ledger/verify-write-transaction-receipts) for more information about the verification process for Azure Confidential Ledger write transaction receipts. + +### Application Claims +Azure Confidential Ledger applications can attach arbitrary data, called application claims, to write transactions. These claims represent the actions executed during a write operation. When attached to a transaction, the SHA-256 digest of the claims object is included in the ledger and committed as part of the write transaction. This guarantees that the digest is signed in place and cannot be tampered with. + +Later, application claims can be revealed in their un-digested form in the receipt payload corresponding to the same transaction where they were added. This allows users to leverage the information in the receipt to re-compute the same claims digest that was attached and signed in place by the Azure Confidential Ledger instance during the transaction. The claims digest can be used as part of the write transaction receipt verification process, providing an offline way for users to fully verify the authenticity of the recorded claims. + +More details on the application claims format and the digest computation algorithm can be found at the following links: + +- [Azure Confidential Ledger application claims](https://learn.microsoft.com/azure/confidential-ledger/write-transaction-receipts#application-claims) +- [Azure Confidential Ledger application claims digest verification](https://learn.microsoft.com/azure/confidential-ledger/verify-write-transaction-receipts#verify-application-claims-digest) + +Please refer to the following CCF documentation pages for more information about CCF Application claims: + +- [Application Claims](https://microsoft.github.io/CCF/main/use_apps/verify_tx.html#application-claims) +- [User-Defined Claims in Receipts](https://microsoft.github.io/CCF/main/build_apps/example_cpp.html#user-defined-claims-in-receipts) + +### Confidential computing +[Azure Confidential Computing][azure_confidential_computing] allows you to isolate and protect your data while it is being processed in the cloud. Azure Confidential Ledger runs on Azure Confidential Computing virtual machines, thus providing stronger data protection with encryption of data in use. + +### Confidential Consortium Framework +Azure Confidential Ledger is built on Microsoft Research's open-source [Confidential Consortium Framework (CCF)][ccf]. Under CCF, applications are managed by a consortium of members with the ability to submit proposals to modify and govern application operation. In Azure Confidential Ledger, Microsoft Azure owns an operator member identity that allows it to perform governance and maintenance actions like replacing unhealthy nodes in the Confidential Ledger and upgrading the enclave code. 
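+
+To illustrate the Collections concept described above, the following sketch writes to and then reads from a named collection. It is an illustration only; the `collection_id` keyword argument is assumed here, and the `use_collections.py` sample referenced under Next steps remains the maintained example.
+
+```python
+from azure.confidentialledger import ConfidentialLedgerClient
+from azure.identity import DefaultAzureCredential
+
+credential = DefaultAzureCredential()
+ledger_client = ConfidentialLedgerClient(
+    endpoint="https://my-ledger-id.confidential-ledger.azure.com",
+    credential=credential,
+    ledger_certificate_path="ledger_certificate.pem",
+)
+
+# Write an entry into a caller-defined collection instead of the
+# service-determined default collection (the collection_id keyword is assumed).
+post_poller = ledger_client.begin_create_ledger_entry(
+    {"contents": "Hello, collection!"}, collection_id="my-collection"
+)
+post_result = post_poller.result()
+
+# Read the current entry of that collection; the response echoes the collection id.
+current_entry = ledger_client.get_current_ledger_entry(collection_id="my-collection")
+print(f'Collection {current_entry["collectionId"]}: {current_entry["contents"]}')
+```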
+ +## Examples +This section contains code snippets covering common tasks, including: +- [Append entry](#append-entry) +- [Retrieving ledger entries](#retrieving-ledger-entries) +- [Making a ranged query](#making-a-ranged-query) +- [Managing users](#managing-users) +- [Using certificate authentication](#using-certificate-authentication) +- [Verify write transaction receipts](#verify-write-transaction-receipts) + +### Append entry +Data that needs to be stored immutably in a tamper-proof manner can be saved to Azure Confidential Ledger by appending an entry to the ledger. + +Since Confidential Ledger is a distributed system, rare transient failures may cause writes to be lost. For entries that must be preserved, it is advisable to verify that the write became durable. For less important writes where higher client throughput is preferred, the wait step may be skipped. + +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +post_entry_result = ledger_client.create_ledger_entry( + {"contents": "Hello world!"} + ) +transaction_id = post_entry_result["transactionId"] + +wait_poller = ledger_client.begin_wait_for_commit(transaction_id) +wait_poller.wait() +print(f'Ledger entry at transaction id {transaction_id} has been committed successfully') +``` + +Alternatively, the client may wait for commit when writing a ledger entry. + +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +post_poller = ledger_client.begin_create_ledger_entry( + {"contents": "Hello world again!"} +) +new_post_result = post_poller.result() +print( + 'The new ledger entry has been committed successfully at transaction id ' + f'{new_post_result["transactionId"]}' +) +``` + +### Retrieving ledger entries +Getting ledger entries older than the latest may take some time as the service is loading historical entries, so a poller is provided. + +Ledger entries are retrieved by collection. The returned value is the value contained in the specified collection at the point in time identified by the transaction id. 
+ +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +post_poller = ledger_client.begin_create_ledger_entry( + {"contents": "Original hello"} +) +post_result = post_poller.result() + +post_transaction_id = post_result["transactionId"] + +latest_entry = ledger_client.get_current_ledger_entry() +print( + f'Current entry (transaction id = {latest_entry["transactionId"]}) ' + f'in collection {latest_entry["collectionId"]}: {latest_entry["contents"]}' +) + +post_poller = ledger_client.begin_create_ledger_entry( + {"contents": "Hello!"} +) +post_result = post_poller.result() + +get_entry_poller = ledger_client.begin_get_ledger_entry(post_transaction_id) +older_entry = get_entry_poller.result() +print( + f'Contents of {older_entry["entry"]["collectionId"]} at {post_transaction_id}: {older_entry["entry"]["contents"]}' +) +``` + +### Making a ranged query +Ledger entries may be retrieved over a range of transaction ids. Entries will only be returned from the default or specified collection. + +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +post_poller = ledger_client.begin_create_ledger_entry( + {"contents": "First message"} +) +first_transaction_id = post_poller.result()["transactionId"] + +for i in range(10): + ledger_client.create_ledger_entry( + {"contents": f"Message {i}"} + ) + +post_poller = ledger_client.begin_create_ledger_entry( + {"contents": "Last message"} +) +last_transaction_id = post_poller.result()["transactionId"] + +ranged_result = ledger_client.list_ledger_entries( + from_transaction_id=first_transaction_id, + to_transaction_id=last_transaction_id, +) +for entry in ranged_result: + print(f'Contents at {entry["transactionId"]}: {entry["contents"]}') +``` + +### Managing users +Users with `Administrator` privileges can manage users of the Confidential Ledger directly with the Confidential Ledger itself. Available roles are `Reader` (read-only), `Contributor` (read and write), and `Administrator` (read, write, and add or remove users). 
+ +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +user_id = "some AAD object id" +user = ledger_client.create_or_update_user( + user_id, {"assignedRole": "Contributor"} +) +# A client may now be created and used with AAD credentials (i.e. AAD-issued JWT tokens) for the user identified by `user_id`. + +user = ledger_client.get_user(user_id) +assert user["userId"] == user_id +assert user["assignedRole"] == "Contributor" + +ledger_client.delete_user(user_id) + +# For a certificate-based user, their user ID is the fingerprint for their PEM certificate. +user_id = "PEM certificate fingerprint" +user = ledger_client.create_or_update_user( + user_id, {"assignedRole": "Reader"} +) + +user = ledger_client.get_user(user_id) +assert user["userId"] == user_id +assert user["assignedRole"] == "Reader" + +ledger_client.delete_user(user_id) +``` + +### Using certificate authentication +Clients may authenticate with a client certificate in mutual TLS instead of via an Azure Active Directory token. `ConfidentialLedgerCertificateCredential` is provided for such clients. + +```python +from azure.confidentialledger import ( + ConfidentialLedgerCertificateCredential, + ConfidentialLedgerClient, +) +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = ConfidentialLedgerCertificateCredential( + certificate_path="Path to user certificate PEM file" +) +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) +``` + +### Verify write transaction receipts + +Clients can leverage the receipt verification library in the SDK to verify write transaction receipts issued by Azure Confidential Ledger instances. The utility can be used to fully verify receipts offline as the verification algorithm does not require to be connected to a Confidential ledger or any other Azure service. + +Once a new entry has been appended to the ledger (please refer to [this example](https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger#append-entry)), it is possible to get a receipt for the committed write transaction. 
+ +```python +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +# Replace this with the Confidential Ledger ID +ledger_id = "my-ledger-id" + +# Setup authentication +credential = DefaultAzureCredential() + +# Create a Ledger Certificate client and use it to +# retrieve the service identity for our ledger +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id=ledger_id +) + +# Save ledger service certificate into a file for later use +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +# Create Confidential Ledger client +ledger_client = ConfidentialLedgerClient( + endpoint=f"https://{ledger_id}.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +# The method begin_get_receipt returns a poller that +# we can use to wait for the receipt to be available for retrieval +get_receipt_poller = ledger_client.begin_get_receipt(transaction_id) +get_receipt_result = get_receipt_poller.result() + +print(f"Write receipt for transaction id {transaction_id} was successfully retrieved: {get_receipt_result}") +``` + +After fetching a receipt for a write transaction, it is possible to call the `verify_receipt` function to verify that the receipt is valid. The function can accept an optional list of application claims to verify against the receipt claims digest. + +```python +from azure.confidentialledger.receipt import ( + verify_receipt, +) + +# Read contents of service certificate file saved in previous step. +with open(ledger_tls_cert_file_name, "r") as service_cert_file: + service_cert_content = service_cert_file.read() + +# Optionally read application claims, if any +application_claims = get_receipt_result.get("applicationClaims", None) + +try: + # Verify the contents of the receipt. + verify_receipt(get_receipt_result["receipt"], service_cert_content, application_claims=application_claims) + print(f"Receipt for transaction id {transaction_id} successfully verified") +except ValueError: + print(f"Receipt verification for transaction id {transaction_id} failed") +``` + +A full sample Python program that shows how to append a new entry to a running Confidential Ledger instance, get a receipt for the committed transaction, and verify the receipt contents can be found under the [samples](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples) folder: [get_and_verify_receipt.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_and_verify_receipt.py). + +### Async API +This library includes a complete async API supported on Python 3.5+. To use it, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp). See the [azure-core documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md#transport) for more information. + +An async client is obtained from `azure.confidentialledger.aio`. Methods have the same names and signatures as the synchronous client. 
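+For illustration only, here is a rough sketch of appending an entry with the async client; it mirrors the synchronous append example above and assumes the ledger TLS certificate has already been saved to `ledger_certificate.pem`:
+
+```python
+import asyncio
+
+from azure.confidentialledger.aio import ConfidentialLedgerClient
+from azure.identity.aio import DefaultAzureCredential
+
+async def main():
+    credential = DefaultAzureCredential()
+    ledger_client = ConfidentialLedgerClient(
+        endpoint="https://my-ledger-id.confidential-ledger.azure.com",
+        credential=credential,
+        ledger_certificate_path="ledger_certificate.pem",
+    )
+    # The async context managers ensure the underlying transports are closed.
+    async with credential, ledger_client:
+        post_poller = await ledger_client.begin_create_ledger_entry(
+            {"contents": "Hello from the async client!"}
+        )
+        post_result = await post_poller.result()
+        print(f'Committed at transaction id {post_result["transactionId"]}')
+
+asyncio.run(main())
+```
+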
Samples may be found [here](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples). + +## Troubleshooting +### General +Confidential Ledger clients raise exceptions defined in [azure-core][azure_core_exceptions]. For example, if you try to get a transaction that doesn't exist, `ConfidentialLedgerClient` raises [ResourceNotFoundError](https://aka.ms/azsdk-python-core-exceptions-resource-not-found-error): + +```python +from azure.core.exceptions import ResourceNotFoundError +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name +) + +try: + ledger_client.begin_get_ledger_entry( + transaction_id="10000.100000" # Using a very high id that probably doesn't exist in the ledger if it's relatively new. + ) +except ResourceNotFoundError as e: + print(e.message) +``` + +### Logging +This library uses the standard +[logging](https://docs.python.org/3.5/library/logging.html) library for logging. Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level. + +Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the `logging_enable` argument: +```python +import logging +import sys + +from azure.confidentialledger import ConfidentialLedgerClient +from azure.confidentialledger.certificate import ConfidentialLedgerCertificateClient +from azure.identity import DefaultAzureCredential + +# Create a logger for the 'azure' SDK +logger = logging.getLogger('azure') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +identity_client = ConfidentialLedgerCertificateClient() +network_identity = identity_client.get_ledger_identity( + ledger_id="my-ledger-id" +) + +ledger_tls_cert_file_name = "ledger_certificate.pem" +with open(ledger_tls_cert_file_name, "w") as cert_file: + cert_file.write(network_identity["ledgerTlsCertificate"]) + +credential = DefaultAzureCredential() + +# This client will log detailed information about its HTTP sessions, at DEBUG level. +ledger_client = ConfidentialLedgerClient( + endpoint="https://my-ledger-id.confidential-ledger.azure.com", + credential=credential, + ledger_certificate_path=ledger_tls_cert_file_name, + logging_enable=True, +) +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, even when it isn't enabled for the client: +```python +ledger_client.get_current_ledger_entry(logging_enable=True) +``` + +## Next steps +### More sample code +These code samples show common scenario operations with the Azure Confidential Ledger client library. 
+ +#### Common scenarios + +- Writing to the ledger: + - [write_to_ledger.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/write_to_ledger.py) + - [write_to_ledger_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/write_to_ledger_async.py) (async version) + +- Write many ledger entries and retrieve them all afterwards: + - [list_ledger_entries.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/list_ledger_entries.py) + - [list_ledger_entries_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/list_ledger_entries_async.py) (async version) + +- Manage users using service-implemented role-based access control: + - [manage_users.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/manage_users.py) + - [manage_users_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/manage_users_async.py) (async version) + +#### Advanced scenarios + +- Using collections: + - [use_collections.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/use_collections.py) + - [use_collections_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/use_collections_async.py) (async version) + +- Getting receipts for ledger writes: + - [get_receipt.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_receipt.py) + - [get_receipt_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/get_receipt_async.py) (async version) + +- Verifying service details: + - [verify_service.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/verify_service.py) + - [verify_service_async.py](https://github.com/Azure/azure-sdk-for-python/blob/azure-confidentialledger_1.1.1/sdk/confidentialledger/azure-confidentialledger/samples/verify_service_async.py) (async version) + +### Additional Documentation +For more extensive documentation on Azure Confidential Ledger, see the +[API reference documentation][reference_docs]. You may also read more about Microsoft Research's open-source [Confidential Consortium Framework][ccf]. + +## Contributing +This project welcomes contributions and suggestions. Most contributions require +you to agree to a Contributor License Agreement (CLA) declaring that you have +the right to, and actually do, grant us the rights to use your contribution. +For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether +you need to provide a CLA and decorate the PR appropriately (e.g., label, +comment). Simply follow the instructions provided by the bot. You will only +need to do this once across all repos using our CLA. 
+ +This project has adopted the [Microsoft Open Source Code of Conduct][code_of_conduct]. +For more information, see the +[Code of Conduct FAQ][code_of_conduct_faq] or +contact opencode@microsoft.com with any additional questions or comments. + + +[azure_cli]: /cli/azure +[azure_cloud_shell]: https://shell.azure.com/bash +[azure_confidential_computing]: https://azure.microsoft.com/solutions/confidential-compute +[azure_core_exceptions]: https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/core/azure-core#azure-core-library-exceptions +[azure_identity]: https://github.com/Azure/azure-sdk-for-python/tree/azure-confidentialledger_1.1.1/sdk/identity/azure-identity +[azure_identity_pypi]: https://pypi.org/project/azure-identity/ +[azure_resource_manager]: /azure/azure-resource-manager/management/overview +[azure_sub]: https://azure.microsoft.com/free +[ccf]: https://github.com/Microsoft/CCF +[code_of_conduct]: https://opensource.microsoft.com/codeofconduct +[code_of_conduct_faq]: https://opensource.microsoft.com/codeofconduct/faq +[confidential_ledger_client_src]: https://aka.ms/azsdk/python/confidentialledger/src +[confidential_ledger_docs]: https://aka.ms/confidentialledger-servicedocs +[default_cred_ref]: https://aka.ms/azsdk/python/identity/docs#azure.identity.DefaultAzureCredential +[pip]: https://pypi.org/project/pip/ +[pypi_package_confidential_ledger]: https://aka.ms/azsdk/python/confidentialledger/pypi [reference_docs]: https://aka.ms/azsdk/python/confidentialledger/ref-docs - + diff --git a/docs-ref-services/latest/mgmt-managementgroups-readme.md b/docs-ref-services/latest/mgmt-managementgroups-readme.md index 1d2a27c9d8b3..db9191eb2781 100644 --- a/docs-ref-services/latest/mgmt-managementgroups-readme.md +++ b/docs-ref-services/latest/mgmt-managementgroups-readme.md @@ -8,39 +8,39 @@ ms.service: azure-resource-manager ms.subservice: management ms.technology: azure --- -## Microsoft Azure SDK for Python - -This is the Microsoft Azure Management Groups Client Library. - -Azure Resource Manager (ARM) is the next generation of management APIs -that replace the old Azure Service Management (ASM). - -This package has been tested with Python 2.7, 3.4, 3.5, 3.6 and 3.7. - -For the older Azure Service Management (ASM) libraries, see -[azure-servicemanagement-legacy](https://pypi.python.org/pypi/azure-servicemanagement-legacy) -library. - -For a more complete set of Azure libraries, see the -[azure sdk python release](https://aka.ms/azsdk/python/all). - -## Usage - - -To learn how to use this package, see the [quickstart guide](https://aka.ms/azsdk/python/mgmt) - - - -For docs and references, see [Python SDK References](https://docs.microsoft.com/python/api/overview/azure/) -Code samples for this package can be found at [Management Management](https://docs.microsoft.com/samples/browse/?languages=python&term=Getting%20started%20-%20Managing&terms=Getting%20started%20-%20Managing) on docs.microsoft.com. -Additional code samples for different Azure services are available at [Samples Repo](https://aka.ms/azsdk/python/mgmt/samples) - - -## Provide Feedback - -If you encounter any bugs or have suggestions, please file an issue in -the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) -section of the project. - +## Microsoft Azure SDK for Python + +This is the Microsoft Azure Management Groups Client Library. + +Azure Resource Manager (ARM) is the next generation of management APIs +that replace the old Azure Service Management (ASM). 
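+
+As a quick illustration only, the sketch below creates a client and lists management groups. It assumes a recent release of `azure-mgmt-managementgroups` in which the client class is `ManagementGroupsAPI`, plus the `azure-identity` package; class and parameter names differ in older releases, so see the quickstart guide linked below for authoritative usage.
+
+```python
+from azure.identity import DefaultAzureCredential
+from azure.mgmt.managementgroups import ManagementGroupsAPI
+
+# Management groups are tenant-scoped, so no subscription id is required here.
+mgmt_groups_client = ManagementGroupsAPI(credential=DefaultAzureCredential())
+
+# List the management groups visible to the signed-in identity.
+for group in mgmt_groups_client.management_groups.list():
+    print(group.display_name)
+```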
+ +This package has been tested with Python 2.7, 3.4, 3.5, 3.6 and 3.7. + +For the older Azure Service Management (ASM) libraries, see +[azure-servicemanagement-legacy](https://pypi.python.org/pypi/azure-servicemanagement-legacy) +library. + +For a more complete set of Azure libraries, see the +[azure sdk python release](https://aka.ms/azsdk/python/all). + +## Usage + + +To learn how to use this package, see the [quickstart guide](https://aka.ms/azsdk/python/mgmt) + + + +For docs and references, see [Python SDK References](https://docs.microsoft.com/python/api/overview/azure/) +Code samples for this package can be found at [Management](https://docs.microsoft.com/samples/browse/?languages=python&term=Getting%20started%20-%20Managing&terms=Getting%20started%20-%20Managing) on docs.microsoft.com. +Additional code samples for different Azure services are available at [Samples Repo](https://aka.ms/azsdk/python/mgmt/samples) + + +## Provide Feedback + +If you encounter any bugs or have suggestions, please file an issue in +the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) +section of the project. + ![image](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fazure-mgmt-managementgroups%2FREADME.png) - + diff --git a/docs-ref-services/latest/mgmt-search-readme.md b/docs-ref-services/latest/mgmt-search-readme.md index 238105c7975a..1dfd6d51d9a0 100644 --- a/docs-ref-services/latest/mgmt-search-readme.md +++ b/docs-ref-services/latest/mgmt-search-readme.md @@ -54,7 +54,7 @@ client = SearchManagementClient(credential=DefaultAzureCredential(), subscriptio ## Examples Code samples for this package can be found at: -- [Search Search Management](/samples/browse/?languages=python&term=Getting%20started%20-%20Managing&terms=Getting%20started%20-%20Managing) on docs.microsoft.com +- [Search Management](/samples/browse/?languages=python&term=Getting%20started%20-%20Managing&terms=Getting%20started%20-%20Managing) on docs.microsoft.com - [Azure Python Mgmt SDK Samples Repo](https://aka.ms/azsdk/python/mgmt/samples) diff --git a/docs-ref-services/latest/security-attestation-readme.md b/docs-ref-services/latest/security-attestation-readme.md index 034c0808ff50..7f9c727cacfc 100644 --- a/docs-ref-services/latest/security-attestation-readme.md +++ b/docs-ref-services/latest/security-attestation-readme.md @@ -7,354 +7,354 @@ ms.devlang: python ms.service: attestation ms.technology: azure --- -# Azure Attestation client library for Python - version 1.0.0 - - -The Microsoft Azure Attestation (MAA) service is a unified solution for remotely verifying the trustworthiness of a platform and integrity of the binaries running inside it. The service supports attestation of the platforms backed by Trusted Platform Modules (TPMs) alongside the ability to attest to the state of Trusted Execution Environments (TEEs) such as Intel(tm) Software Guard Extensions (SGX) enclaves and Virtualization-based Security (VBS) enclaves. - -Attestation is a process for demonstrating that software binaries were properly instantiated on a trusted platform. Remote relying parties can then gain confidence that only such intended software is running on trusted hardware. Azure Attestation is a unified customer-facing service and framework for attestation. - -Azure Attestation enables cutting-edge security paradigms such as Azure Confidential computing and Intelligent Edge protection. 
Customers have been requesting the ability to independently verify the location of a machine, the posture of a virtual machine (VM) on that machine, and the environment within which enclaves are running on that VM. Azure Attestation will empower these and many additional customer requests. - -Azure Attestation receives evidence from compute entities, turns them into a set of claims, validates them against configurable policies, and produces cryptographic proofs for claims-based applications (for example, relying parties and auditing authorities). - -This package has been tested with Python 2.7, 3.6 to 3.9. - -For a more complete view of Azure libraries, see the [Azure SDK for Python release page](https://aka.ms/azsdk/python/all). - -[Source code][source_code] | [Package (PyPI)][Attestation_pypi] | [API reference documentation][API_reference] | [Product documentation](https://docs.microsoft.com/azure/attestation/) - -## Getting started - -### Prerequisites - -* An Azure subscription. To use Azure services, including the Azure Attestation service, you'll need a subscription. If you do not have an existing Azure account, you may sign up for a [free trial][azure_sub] or use your [Visual Studio Subscription](https://visualstudio.microsoft.com/subscriptions/) benefits when you [create an account](https://account.windowsazure.com/Home/Index). -* An existing Azure Attestation Instance, or you can use the "shared provider" available in each Azure region. If you need to create an Azure Attestation service instance, you can use the Azure Portal or [Azure CLI][azure_cli]. - -### Install the package - -Install the Azure Attestation client library for Python with [PyPI][Attestation_pypi]: - -```Powershell -pip install azure-security-attestation -``` - -### Authenticate the client - -In order to interact with the Azure Attestation service, you'll need to create an instance of the [Attestation Client][attestation_client] or [Attestation Administration Client][attestation_admin_client] class. You need an **attestation endpoint**, which you may see as "Attest URI" in the portal, -and **client credentials (client id, client secret, tenant id)** to instantiate a client object. - -[Client secret credential][ClientSecretCredential] authentication is being used in this getting started section but you can find more ways to authenticate with the [Azure identity package][azure_identity]. To use the [DefaultAzureCredential][DefaultAzureCredential] provider shown below, -or other credential providers provided with the Azure SDK, you should install the azure-identity package: - -```Powershell -pip install azure-identity -``` - -#### Create/Get credentials - -Use the [Azure CLI][azure_cli] snippet below to create/get client secret credentials. - -* Create a service principal and configure its access to Azure resources: - - ```Powershell - az ad sp create-for-rbac -n --skip-assignment - ``` - - Output: - - ```json - { - "appId": "generated-app-ID", - "displayName": "dummy-app-name", - "name": "http://dummy-app-name", - "password": "random-password", - "tenant": "tenant-ID" - } - ``` - -* Take note of the service principal objectId - - ```Powershell - az ad sp show --id --query objectId - ``` - - Output: - - ```Powershell - "" - ``` - -* Use the returned credentials above to set **AZURE_CLIENT_ID** (appId), **AZURE_CLIENT_SECRET** (password), and **AZURE_TENANT_ID** (tenant) environment variables. 
The following example shows a way to do this in Powershell: - - ```Powershell - $Env:AZURE_CLIENT_ID="generated-app-ID" - $Env:AZURE_CLIENT_SECRET="random-password" - $Env:AZURE_TENANT_ID="tenant-ID" - ``` - -For more information about the Azure Identity APIs and how to use them, see [Azure Identity client library](https://github.com/Azure/azure-sdk-for-python/tree/azure-security-attestation_1.0.0/sdk/identity/azure-identity) - -## Key concepts - -There are four major families of functionality provided in this preview SDK: - -* [SGX and TPM enclave attestation.](#attestation) -* [MAA Attestation Token signing certificate discovery and validation.](#attestation-token-signing-certificate-discovery-and-validation) -* [Attestation Policy management.](#policy-management) -* [Attestation policy management certificate management](#policy-management-certificate-management) (yes, policy management management). - -The Microsoft Azure Attestation service runs in two separate modes: "Isolated" and "AAD". When the service is running in "Isolated" mode, the customer needs to -provide additional information beyond their authentication credentials to verify that they are authorized to modify the state of an attestation instance. - -Finally, each region in which the Azure Attestation service is available supports a "shared" instance, which -can be used to attest SGX enclaves which only need verification against the azure baseline (there are no policies applied to the shared instance). TPM attestation is not available in the shared instance. -While the shared instance requires AAD authentication, it does not have any RBAC policies - any customer with a valid AAD bearer token can attest using the shared instance. - -### Attestation - -SGX or TPM attestation is the process of validating evidence collected from -a trusted execution environment to ensure that it meets both the Azure baseline for that environment and customer defined policies applied to that environment. - -### Attestation service token signing certificate discovery and validation - -One of the core operational guarantees of the Azure Attestation Service is that the service operates "operationally out of the TCB". In other words, there is no way that a Microsoft operator could tamper with the operation of the service, or corrupt data sent from the client. To ensure this guarantee, the core of the attestation service runs in an Intel(tm) SGX enclave. - -To allow customers to verify that operations were actually performed inside the enclave, most responses from the Attestation Service are encoded in a [JSON Web Token][json_web_token], which is signed by a key held within the attestation service's enclave. - -This token will be signed by a signing certificate issued by the MAA service for the specified instance. - -If the MAA service instance is running in a region where the service runs in an SGX enclave, then -the certificate issued by the server can be verified using the [oe_verify_attestation_certificate API](https://openenclave.github.io/openenclave/api/enclave_8h_a3b75c5638360adca181a0d945b45ad86.html). - -### Policy Management - -Each attestation service instance has a policy applied to it which defines additional criteria which the customer has defined. 
- -For more information on attestation policies, see [Attestation Policy](https://docs.microsoft.com/azure/attestation/author-sign-policy) - -### Policy Management certificate management - -When an attestation instance is running in "Isolated" mode, the customer who created the instance will have provided -a policy management certificate at the time the instance is created. All policy modification operations require that the customer sign -the policy data with one of the existing policy management certificates. The Policy Management Certificate Management APIs enable -clients to "roll" the policy management certificates. - -### Isolated Mode and AAD Mode - -Each Microsoft Azure Attestation service instance operates in either "AAD" mode or "Isolated" mode. When an MAA instance is operating in AAD mode, it means that the customer which created the attestation instance allows Azure Active Directory and Azure Role Based Access control policies to verify access to the attestation instance. - -### *AttestationType* - -The Azure Attestation service supports attesting different types of evidence depending on the environment. -Currently, MAA supports the following Trusted Execution environments: - -* OpenEnclave - An Intel(tm) Processor running code in an SGX Enclave where the attestation evidence was collected using the OpenEnclave [oe_get_report](https://openenclave.io/apidocs/v0.14/enclave_8h_aefcb89c91a9078d595e255bd7901ac71.html#aefcb89c91a9078d595e255bd7901ac71) or [oe_get_evidence](https://openenclave.io/apidocs/v0.14/attester_8h_a7d197e42468636e95a6ab97b8e74c451.html#a7d197e42468636e95a6ab97b8e74c451) API. -* SgxEnclave - An Intel(tm) Processor running code in an SGX Enclave where the attestation evidence was collected using the Intel SGX SDK. -* Tpm - A Virtualization Based Security environment where the Trusted Platform Module of the processor is used to provide the attestation evidence. - -### Runtime Data and Inittime Data - -RuntimeData refers to data which is presented to the Intel SGX Quote generation logic or the `oe_get_report`/`oe_get_evidence` APIs. If the caller to the attest API provided a `runtime_data` attribute, The Azure Attestation service will validate that the first 32 bytes of the `report_data` field in the SGX Quote/OE Report/OE Evidence matches the SHA256 hash of the `runtime_data`. - -InitTime data refers to data which is used to configure the SGX enclave being attested. - -> Note that InitTime data is not supported on Azure [DCsv2-Series](https://docs.microsoft.com/azure/virtual-machines/dcv2-series) virtual machines. - -### Additional concepts - -## Examples - -* [Create an attestation client instance](#create-client-instance) -* [Attest an SGX enclave](#attest-sgx-enclave) -* [Get attestation policy](#get-attestation-policy) -* [Retrieve token validation certificates](#retrieve-token-certificates) -* [Create an attestation client instance](#create-client-instance) - -### Create client instance - -Creates an instance of the Attestation Client at uri `endpoint`. - -```python -attest_client = AttestationClient( - endpoint=base_uri, - credential=DefaultAzureCredential()) -``` - -### Get attestation policy - -The `set_policy` method retrieves the attestation policy from the service. -Attestation Policies are instanced on a per-attestation type basis, the `AttestationType` parameter defines the type to retrieve. 
- -```python -policy, token = attest_client.get_policy(AttestationType.SGX_ENCLAVE) -print('Instance SGX policy: ', policy) -print('Token: ', token) -``` - -### Set an attestation policy for a specified attestation type - -If the attestation service instance is running in Isolated mode, the set_policy API needs to provide a signing certificate (and private key) which can be used to validate that the caller is authorized to modify policy on the attestation instance. If the service instance is running in AAD mode, then the signing certificate and key are optional. - -Under the covers, the SetPolicy APIs create a [JSON Web Token][json_web_token] based on the policy document and signing information which is sent to the attestation service. - -```python -policy_set_response = attest_client.set_policy(AttestationType.SGX_ENCLAVE, - attestation_policy, - signing_key=key, - signing_certificate=signing_certificate) -new_policy, _ = attest_client.get_policy(AttestationType.SGX_ENCLAVE) -# `new_policy` will equal `attestation_policy`. -``` - -If the service instance is running in AAD mode, the call to set_policy can be -simplified: - -```python -policy_set_response = attest_client.set_policy(AttestationType.SGX_ENCLAVE, - attestation_policy) -# Now retrieve the policy which was just set. -new_policy, _ = attest_client.get_policy(AttestationType.SGX_ENCLAVE) - -``` - -Clients need to be able to verify that the attestation policy document was not modified before the policy document was received by the attestation service's enclave. - -There are two properties provided in the [PolicyResult][attestation_policy_result] that can be used to verify that the service received the policy document: - -* [policy_signer][attestation_policy_result_parameters] - if the `set_policy` call included a signing certificate, this will be the certificate provided at the time of the `set_policy` call. If no policy signer was set, this will be null. -* [policy_token_hash][attestation_policy_result_parameters] - this is the hash of the [JSON Web Token][json_web_token] sent to the service. - -To verify the hash, clients can generate an attestation policy token and verify the hash generated from that token: - -```python -from cryptography.hazmat.primitives import hashes - -expected_policy = AttestationPolicyToken( - attestation_policy, - signing_key=key, - signing_certificate=signing_certificate) -hasher = hashes.Hash(hashes.SHA256()) -hasher.update(expected_policy.serialize().encode('utf-8')) -expected_hash = hasher.finalize() - -# `expected_hash` will exactly match `policy_set_response.policy_token_hash` -``` - -### Attest SGX Enclave - -Use the [attest_sgx_enclave][attest_sgx] method to attest an SGX enclave. - -One of the core challenges customers have interacting with encrypted environments is how to ensure that you can securely communicate with the code running in the environment ("enclave code"). - -One solution to this problem is what is known as "Secure Key Release", which is a pattern that enables secure communication with enclave code. - -To implement the "Secure Key Release" pattern, the enclave code generates an ephemeral asymmetric key. It then serializes the public portion of the key to some format (possibly a JSON Web Key, or PEM, or some other serialization format). 
- -The enclave code then calculates the SHA256 value of the public key and passes it as an input to code which generates an SGX Quote (for OpenEnclave, that would be the [oe_get_evidence](https://openenclave.io/apidocs/v0.14/attester_8h_a7d197e42468636e95a6ab97b8e74c451.html#a7d197e42468636e95a6ab97b8e74c451) or [oe_get_report](https://openenclave.io/apidocs/v0.14/enclave_8h_aefcb89c91a9078d595e255bd7901ac71.html#aefcb89c91a9078d595e255bd7901ac71)). - -The client then sends the SGX quote and the serialized key to the attestation service. The attestation service will validate the quote and ensure that the hash of the key is present in the quote and will issue an "Attestation Token". - -The client can then send that Attestation Token (which contains the serialized key) to a 3rd party "relying party". The relying party then validates that the attestation token was created by the attestation service, and thus the serialized key can be used to encrypt some data held by the "relying party" to send to the service. - -This example shows one common pattern of calling into the attestation service to retrieve an attestation token associated with a request. - -This example assumes that you have an existing `AttestationClient` object which is configured with the base URI for your endpoint. It also assumes that you have an SGX Quote (`quote`) generated from within the SGX enclave you are attesting, and "Runtime Data" (`runtime_data`) which is referenced in the SGX Quote. - -```python -response, token = attest_client.attest_sgx_enclave(quote, runtime_data=runtime_data) -``` - -At this point, the enclave_held_data attribute in the attestationResult -will hold the input binary runtime_data. - -The token is now passed to the "relying party". The relying party will -validate that the token was issued by the Attestation Service. It then -extracts the asymmetric key from the EnclaveHeldData field. The relying -party will then Encrypt its "key" data using the asymmetric key and -transmits it back to the enclave. - -```python -encrypted_data = send_token_to_relying_party(attestationResult.Token) -``` - -Now the encrypted data can be passed into the enclave which can decrypt that data. - -Additional information on how to perform attestation token validation can be found in the [MAA Service Attestation Sample](https://github.com/Azure-Samples/microsoft-azure-attestation). - -### Retrieve Token Certificates - -Use `get_signing_certificates` to retrieve the certificates which can be used to validate the token returned from the attestation service. - -```python -signers = attest_client.get_signing_certificates() -for signer in signers: - from cryptography.hazmat.backends import default_backend - cert = cryptography.x509.load_pem_x509_certificate(signer.certificates[0].encode('ascii'), backend=default_backend()) - print('Cert iss:', cert.issuer, '; subject:', cert.subject) -``` - -## Troubleshooting - -Most Attestation service operations will raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-security-attestation_1.0.0/sdk/core/azure-core/README.md). The attestation service APIs will throw a `HttpResponseError` on failure with helpful error codes. Many of these errors are recoverable. - -```python -try: - response, _ = attest_client.attest_sgx_enclave( - quote, - runtime_data=AttestationData(runtime_data, is_json=False)) -except HttpResponseError as ex: - # Ignore invalid quote errors. 
- if ex.error == "InvalidParameter": - pass -} -``` - -Additional troubleshooting information for the MAA service can be found [here](https://docs.microsoft.com/python/api/overview/azure/attestation?view=azure-python-preview) - -## Next steps - -For more information about the Microsoft Azure Attestation service, please see our [documentation page](https://docs.microsoft.com/azure/attestation/). - -## Contributing - -This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit [the Contributor License Agreement site](https://cla.microsoft.com). - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct][microsoft_code_of_conduct]. For more information see the Code of Conduct FAQ or contact with any additional questions or comments. - -See [CONTRIBUTING.md][contributing] for details on building, testing, and contributing to these libraries. - -## Provide Feedback - -If you encounter any bugs or have suggestions, please file an issue in the -[Issues](https://github.com/Azure/azure-sdk-for-python/issues) -section of the project. - - -[source_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-security-attestation_1.0.0/sdk/attestation/azure-security-attestation -[azure_identity]: https://docs.microsoft.com/python/api/overview/azure/identity-readme?view=azure-python-preview -[DefaultAzureCredential]: https://docs.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential?view=azure-python -[ClientSecretCredential]: https://docs.microsoft.com/python/api/azure-identity/azure.identity.clientsecretcredential?view=azure-python -[attestation_policy_result]:https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.policyresult?view=azure-python-preview -[attestation_client]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationclient?view=azure-python-preview -[attestation_admin_client]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationadministrationclient?view=azure-python-preview -[attestation_policy_result_parameters]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.policyresult?view=azure-python-preview#parameters -[attest_sgx]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationclient?view=azure-python-preview#attest-sgx-enclave-quote--inittime-data-none--runtime-data-none--draft-policy-none----kwargs- -[attestation_pypi]: https://aka.ms/azsdk/python/azure-security-attestation -[API_reference]:https://docs.microsoft.com/python/api/overview/azure/security-attestation-readme?view=azure-python-preview -[style-guide-msft]: https://docs.microsoft.com/style-guide/capitalization -[style-guide-cloud]: https://aka.ms/azsdk/cloud-style-guide -[microsoft_code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ -[azure_cli]: https://docs.microsoft.com/cli/azure -[azure_sub]: https://azure.microsoft.com/free/ -[code_of_conduct]: 
https://opensource.microsoft.com/codeofconduct/ -[json_web_token]: https://tools.ietf.org/html/rfc7519 -[JWK]: https://tools.ietf.org/html/rfc7517 -[base64url_encoding]: https://tools.ietf.org/html/rfc4648#section-5 -[contributing]: https://github.com/Azure/azure-sdk-for-python/blob/azure-security-attestation_1.0.0/CONTRIBUTING.md -[coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/ - +# Azure Attestation client library for Python - version 1.0.0 + + +The Microsoft Azure Attestation (MAA) service is a unified solution for remotely verifying the trustworthiness of a platform and integrity of the binaries running inside it. The service supports attestation of the platforms backed by Trusted Platform Modules (TPMs) alongside the ability to attest to the state of Trusted Execution Environments (TEEs) such as Intel(tm) Software Guard Extensions (SGX) enclaves and Virtualization-based Security (VBS) enclaves. + +Attestation is a process for demonstrating that software binaries were properly instantiated on a trusted platform. Remote relying parties can then gain confidence that only such intended software is running on trusted hardware. Azure Attestation is a unified customer-facing service and framework for attestation. + +Azure Attestation enables cutting-edge security paradigms such as Azure Confidential computing and Intelligent Edge protection. Customers have been requesting the ability to independently verify the location of a machine, the posture of a virtual machine (VM) on that machine, and the environment within which enclaves are running on that VM. Azure Attestation will empower these and many additional customer requests. + +Azure Attestation receives evidence from compute entities, turns them into a set of claims, validates them against configurable policies, and produces cryptographic proofs for claims-based applications (for example, relying parties and auditing authorities). + +This package has been tested with Python 2.7, 3.6 to 3.9. + +For a more complete view of Azure libraries, see the [Azure SDK for Python release page](https://aka.ms/azsdk/python/all). + +[Source code][source_code] | [Package (PyPI)][Attestation_pypi] | [API reference documentation][API_reference] | [Product documentation](https://docs.microsoft.com/azure/attestation/) + +## Getting started + +### Prerequisites + +* An Azure subscription. To use Azure services, including the Azure Attestation service, you'll need a subscription. If you do not have an existing Azure account, you may sign up for a [free trial][azure_sub] or use your [Visual Studio Subscription](https://visualstudio.microsoft.com/subscriptions/) benefits when you [create an account](https://account.windowsazure.com/Home/Index). +* An existing Azure Attestation Instance, or you can use the "shared provider" available in each Azure region. If you need to create an Azure Attestation service instance, you can use the Azure Portal or [Azure CLI][azure_cli]. + +### Install the package + +Install the Azure Attestation client library for Python with [PyPI][Attestation_pypi]: + +```Powershell +pip install azure-security-attestation +``` + +### Authenticate the client + +In order to interact with the Azure Attestation service, you'll need to create an instance of the [Attestation Client][attestation_client] or [Attestation Administration Client][attestation_admin_client] class. 
You need an **attestation endpoint**, which you may see as "Attest URI" in the portal, +and **client credentials (client id, client secret, tenant id)** to instantiate a client object. + +[Client secret credential][ClientSecretCredential] authentication is being used in this getting started section but you can find more ways to authenticate with the [Azure identity package][azure_identity]. To use the [DefaultAzureCredential][DefaultAzureCredential] provider shown below, +or other credential providers provided with the Azure SDK, you should install the azure-identity package: + +```Powershell +pip install azure-identity +``` + +#### Create/Get credentials + +Use the [Azure CLI][azure_cli] snippet below to create/get client secret credentials. + +* Create a service principal and configure its access to Azure resources: + + ```Powershell + az ad sp create-for-rbac -n --skip-assignment + ``` + + Output: + + ```json + { + "appId": "generated-app-ID", + "displayName": "dummy-app-name", + "name": "http://dummy-app-name", + "password": "random-password", + "tenant": "tenant-ID" + } + ``` + +* Take note of the service principal objectId + + ```Powershell + az ad sp show --id --query objectId + ``` + + Output: + + ```Powershell + "" + ``` + +* Use the returned credentials above to set **AZURE_CLIENT_ID** (appId), **AZURE_CLIENT_SECRET** (password), and **AZURE_TENANT_ID** (tenant) environment variables. The following example shows a way to do this in Powershell: + + ```Powershell + $Env:AZURE_CLIENT_ID="generated-app-ID" + $Env:AZURE_CLIENT_SECRET="random-password" + $Env:AZURE_TENANT_ID="tenant-ID" + ``` + +For more information about the Azure Identity APIs and how to use them, see [Azure Identity client library](https://github.com/Azure/azure-sdk-for-python/tree/azure-security-attestation_1.0.0/sdk/identity/azure-identity) + +## Key concepts + +There are four major families of functionality provided in this preview SDK: + +* [SGX and TPM enclave attestation.](#attestation) +* [MAA Attestation Token signing certificate discovery and validation.](#attestation-token-signing-certificate-discovery-and-validation) +* [Attestation Policy management.](#policy-management) +* [Attestation policy management certificate management](#policy-management-certificate-management) (yes, policy management management). + +The Microsoft Azure Attestation service runs in two separate modes: "Isolated" and "AAD". When the service is running in "Isolated" mode, the customer needs to +provide additional information beyond their authentication credentials to verify that they are authorized to modify the state of an attestation instance. + +Finally, each region in which the Azure Attestation service is available supports a "shared" instance, which +can be used to attest SGX enclaves which only need verification against the azure baseline (there are no policies applied to the shared instance). TPM attestation is not available in the shared instance. +While the shared instance requires AAD authentication, it does not have any RBAC policies - any customer with a valid AAD bearer token can attest using the shared instance. + +### Attestation + +SGX or TPM attestation is the process of validating evidence collected from +a trusted execution environment to ensure that it meets both the Azure baseline for that environment and customer defined policies applied to that environment. 
+ +### Attestation service token signing certificate discovery and validation + +One of the core operational guarantees of the Azure Attestation Service is that the service operates "operationally out of the TCB". In other words, there is no way that a Microsoft operator could tamper with the operation of the service, or corrupt data sent from the client. To ensure this guarantee, the core of the attestation service runs in an Intel(tm) SGX enclave. + +To allow customers to verify that operations were actually performed inside the enclave, most responses from the Attestation Service are encoded in a [JSON Web Token][json_web_token], which is signed by a key held within the attestation service's enclave. + +This token will be signed by a signing certificate issued by the MAA service for the specified instance. + +If the MAA service instance is running in a region where the service runs in an SGX enclave, then +the certificate issued by the server can be verified using the [oe_verify_attestation_certificate API](https://openenclave.github.io/openenclave/api/enclave_8h_a3b75c5638360adca181a0d945b45ad86.html). + +### Policy Management + +Each attestation service instance has a policy applied to it which defines additional criteria which the customer has defined. + +For more information on attestation policies, see [Attestation Policy](https://docs.microsoft.com/azure/attestation/author-sign-policy) + +### Policy Management certificate management + +When an attestation instance is running in "Isolated" mode, the customer who created the instance will have provided +a policy management certificate at the time the instance is created. All policy modification operations require that the customer sign +the policy data with one of the existing policy management certificates. The Policy Management Certificate Management APIs enable +clients to "roll" the policy management certificates. + +### Isolated Mode and AAD Mode + +Each Microsoft Azure Attestation service instance operates in either "AAD" mode or "Isolated" mode. When an MAA instance is operating in AAD mode, it means that the customer which created the attestation instance allows Azure Active Directory and Azure Role Based Access control policies to verify access to the attestation instance. + +### *AttestationType* + +The Azure Attestation service supports attesting different types of evidence depending on the environment. +Currently, MAA supports the following Trusted Execution environments: + +* OpenEnclave - An Intel(tm) Processor running code in an SGX Enclave where the attestation evidence was collected using the OpenEnclave [oe_get_report](https://openenclave.io/apidocs/v0.14/enclave_8h_aefcb89c91a9078d595e255bd7901ac71.html#aefcb89c91a9078d595e255bd7901ac71) or [oe_get_evidence](https://openenclave.io/apidocs/v0.14/attester_8h_a7d197e42468636e95a6ab97b8e74c451.html#a7d197e42468636e95a6ab97b8e74c451) API. +* SgxEnclave - An Intel(tm) Processor running code in an SGX Enclave where the attestation evidence was collected using the Intel SGX SDK. +* Tpm - A Virtualization Based Security environment where the Trusted Platform Module of the processor is used to provide the attestation evidence. + +### Runtime Data and Inittime Data + +RuntimeData refers to data which is presented to the Intel SGX Quote generation logic or the `oe_get_report`/`oe_get_evidence` APIs. 
If the caller to the attest API provided a `runtime_data` attribute, the Azure Attestation service will validate that the first 32 bytes of the `report_data` field in the SGX Quote/OE Report/OE Evidence match the SHA256 hash of the `runtime_data`.
+
+InitTime data refers to data which is used to configure the SGX enclave being attested.
+
+> Note that InitTime data is not supported on Azure [DCsv2-Series](https://docs.microsoft.com/azure/virtual-machines/dcv2-series) virtual machines.
+
+### Additional concepts
+
+## Examples
+
+* [Create an attestation client instance](#create-client-instance)
+* [Attest an SGX enclave](#attest-sgx-enclave)
+* [Get attestation policy](#get-attestation-policy)
+* [Set an attestation policy](#set-an-attestation-policy-for-a-specified-attestation-type)
+* [Retrieve token validation certificates](#retrieve-token-certificates)
+
+### Create client instance
+
+Creates an instance of the Attestation Client for the attestation instance at `base_uri`.
+
+```python
+from azure.identity import DefaultAzureCredential
+from azure.security.attestation import AttestationClient
+
+attest_client = AttestationClient(
+    endpoint=base_uri,
+    credential=DefaultAzureCredential())
+```
+
+### Get attestation policy
+
+The `get_policy` method retrieves the attestation policy from the service.
+Attestation Policies are instanced on a per-attestation-type basis; the `AttestationType` parameter defines the type to retrieve.
+
+```python
+policy, token = attest_client.get_policy(AttestationType.SGX_ENCLAVE)
+print('Instance SGX policy: ', policy)
+print('Token: ', token)
+```
+
+### Set an attestation policy for a specified attestation type
+
+If the attestation service instance is running in Isolated mode, the set_policy API needs to provide a signing certificate (and private key) which can be used to validate that the caller is authorized to modify policy on the attestation instance. If the service instance is running in AAD mode, then the signing certificate and key are optional.
+
+Under the covers, the SetPolicy APIs create a [JSON Web Token][json_web_token] based on the policy document and signing information which is sent to the attestation service.
+
+```python
+policy_set_response = attest_client.set_policy(AttestationType.SGX_ENCLAVE,
+    attestation_policy,
+    signing_key=key,
+    signing_certificate=signing_certificate)
+new_policy, _ = attest_client.get_policy(AttestationType.SGX_ENCLAVE)
+# `new_policy` will equal `attestation_policy`.
+```
+
+If the service instance is running in AAD mode, the call to set_policy can be
+simplified:
+
+```python
+policy_set_response = attest_client.set_policy(AttestationType.SGX_ENCLAVE,
+    attestation_policy)
+# Now retrieve the policy which was just set.
+new_policy, _ = attest_client.get_policy(AttestationType.SGX_ENCLAVE)
+```
+
+Clients need to be able to verify that the attestation policy document was not modified before the policy document was received by the attestation service's enclave.
+
+There are two properties provided in the [PolicyResult][attestation_policy_result] that can be used to verify that the service received the policy document:
+
+* [policy_signer][attestation_policy_result_parameters] - if the `set_policy` call included a signing certificate, this will be the certificate provided at the time of the `set_policy` call. If no policy signer was set, this will be `None`.
+* [policy_token_hash][attestation_policy_result_parameters] - this is the hash of the [JSON Web Token][json_web_token] sent to the service.
+ +To verify the hash, clients can generate an attestation policy token and verify the hash generated from that token: + +```python +from cryptography.hazmat.primitives import hashes + +expected_policy = AttestationPolicyToken( + attestation_policy, + signing_key=key, + signing_certificate=signing_certificate) +hasher = hashes.Hash(hashes.SHA256()) +hasher.update(expected_policy.serialize().encode('utf-8')) +expected_hash = hasher.finalize() + +# `expected_hash` will exactly match `policy_set_response.policy_token_hash` +``` + +### Attest SGX Enclave + +Use the [attest_sgx_enclave][attest_sgx] method to attest an SGX enclave. + +One of the core challenges customers have interacting with encrypted environments is how to ensure that you can securely communicate with the code running in the environment ("enclave code"). + +One solution to this problem is what is known as "Secure Key Release", which is a pattern that enables secure communication with enclave code. + +To implement the "Secure Key Release" pattern, the enclave code generates an ephemeral asymmetric key. It then serializes the public portion of the key to some format (possibly a JSON Web Key, or PEM, or some other serialization format). + +The enclave code then calculates the SHA256 value of the public key and passes it as an input to code which generates an SGX Quote (for OpenEnclave, that would be the [oe_get_evidence](https://openenclave.io/apidocs/v0.14/attester_8h_a7d197e42468636e95a6ab97b8e74c451.html#a7d197e42468636e95a6ab97b8e74c451) or [oe_get_report](https://openenclave.io/apidocs/v0.14/enclave_8h_aefcb89c91a9078d595e255bd7901ac71.html#aefcb89c91a9078d595e255bd7901ac71)). + +The client then sends the SGX quote and the serialized key to the attestation service. The attestation service will validate the quote and ensure that the hash of the key is present in the quote and will issue an "Attestation Token". + +The client can then send that Attestation Token (which contains the serialized key) to a 3rd party "relying party". The relying party then validates that the attestation token was created by the attestation service, and thus the serialized key can be used to encrypt some data held by the "relying party" to send to the service. + +This example shows one common pattern of calling into the attestation service to retrieve an attestation token associated with a request. + +This example assumes that you have an existing `AttestationClient` object which is configured with the base URI for your endpoint. It also assumes that you have an SGX Quote (`quote`) generated from within the SGX enclave you are attesting, and "Runtime Data" (`runtime_data`) which is referenced in the SGX Quote. + +```python +response, token = attest_client.attest_sgx_enclave(quote, runtime_data=runtime_data) +``` + +At this point, the enclave_held_data attribute in the attestationResult +will hold the input binary runtime_data. + +The token is now passed to the "relying party". The relying party will +validate that the token was issued by the Attestation Service. It then +extracts the asymmetric key from the EnclaveHeldData field. The relying +party will then Encrypt its "key" data using the asymmetric key and +transmits it back to the enclave. + +```python +encrypted_data = send_token_to_relying_party(attestationResult.Token) +``` + +Now the encrypted data can be passed into the enclave which can decrypt that data. 
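+
+To make the flow above more concrete, here is a purely illustrative sketch of what a relying party might do with the token. Everything in it is an assumption rather than part of this SDK: the claim name `x-ms-sgx-ehd` for the enclave-held data, the choice of a PEM-encoded RSA public key as the enclave's serialization format, and the placeholder secret. A production relying party must also validate the token's signature against the MAA signing certificates (see "Retrieve Token Certificates" below) before trusting any claims; that step is elided here.
+
+```python
+import base64
+import json
+
+from cryptography.hazmat.backends import default_backend
+from cryptography.hazmat.primitives import hashes, serialization
+from cryptography.hazmat.primitives.asymmetric import padding
+
+
+def send_token_to_relying_party(token):
+    # Decode the body of the attestation token (signature validation elided).
+    body = token.split(".")[1]
+    claims = json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
+
+    # Assumed claim layout: the enclave-held data carries the enclave's
+    # PEM-encoded RSA public key.
+    ehd = claims["x-ms-sgx-ehd"]
+    enclave_public_key_pem = base64.urlsafe_b64decode(ehd + "=" * (-len(ehd) % 4))
+    public_key = serialization.load_pem_public_key(enclave_public_key_pem, backend=default_backend())
+
+    # Encrypt the relying party's secret with the enclave's public key; only
+    # the enclave holding the private key can decrypt it.
+    return public_key.encrypt(
+        b"relying-party secret",
+        padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None),
+    )
+```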
+
+Additional information on how to perform attestation token validation can be found in the [MAA Service Attestation Sample](https://github.com/Azure-Samples/microsoft-azure-attestation).
+
+### Retrieve Token Certificates
+
+Use `get_signing_certificates` to retrieve the certificates which can be used to validate the token returned from the attestation service.
+
+```python
+import cryptography.x509
+from cryptography.hazmat.backends import default_backend
+
+signers = attest_client.get_signing_certificates()
+for signer in signers:
+    cert = cryptography.x509.load_pem_x509_certificate(signer.certificates[0].encode('ascii'), backend=default_backend())
+    print('Cert iss:', cert.issuer, '; subject:', cert.subject)
+```
+
+## Troubleshooting
+
+Most Attestation service operations will raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-security-attestation_1.0.0/sdk/core/azure-core/README.md). The attestation service APIs raise an `HttpResponseError` on failure with helpful error codes. Many of these errors are recoverable.
+
+```python
+from azure.core.exceptions import HttpResponseError
+
+try:
+    response, _ = attest_client.attest_sgx_enclave(
+        quote,
+        runtime_data=AttestationData(runtime_data, is_json=False))
+except HttpResponseError as ex:
+    # Ignore invalid quote errors.
+    if ex.error and ex.error.code == "InvalidParameter":
+        pass
+```
+
+Additional troubleshooting information for the MAA service can be found [here](https://docs.microsoft.com/python/api/overview/azure/attestation?view=azure-python-preview).
+
+## Next steps
+
+For more information about the Microsoft Azure Attestation service, please see our [documentation page](https://docs.microsoft.com/azure/attestation/).
+
+## Contributing
+
+This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit [the Contributor License Agreement site](https://cla.microsoft.com).
+
+When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
+
+This project has adopted the [Microsoft Open Source Code of Conduct][microsoft_code_of_conduct]. For more information see the [Code of Conduct FAQ][coc_faq] or contact opencode@microsoft.com with any additional questions or comments.
+
+See [CONTRIBUTING.md][contributing] for details on building, testing, and contributing to these libraries.
+
+## Provide Feedback
+
+If you encounter any bugs or have suggestions, please file an issue in the
+[Issues](https://github.com/Azure/azure-sdk-for-python/issues)
+section of the project.
+ + +[source_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-security-attestation_1.0.0/sdk/attestation/azure-security-attestation +[azure_identity]: https://docs.microsoft.com/python/api/overview/azure/identity-readme?view=azure-python-preview +[DefaultAzureCredential]: https://docs.microsoft.com/python/api/azure-identity/azure.identity.defaultazurecredential?view=azure-python +[ClientSecretCredential]: https://docs.microsoft.com/python/api/azure-identity/azure.identity.clientsecretcredential?view=azure-python +[attestation_policy_result]:https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.policyresult?view=azure-python-preview +[attestation_client]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationclient?view=azure-python-preview +[attestation_admin_client]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationadministrationclient?view=azure-python-preview +[attestation_policy_result_parameters]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.policyresult?view=azure-python-preview#parameters +[attest_sgx]: https://docs.microsoft.com/python/api/azure-security-attestation/azure.security.attestation.attestationclient?view=azure-python-preview#attest-sgx-enclave-quote--inittime-data-none--runtime-data-none--draft-policy-none----kwargs- +[attestation_pypi]: https://aka.ms/azsdk/python/azure-security-attestation +[API_reference]:https://docs.microsoft.com/python/api/overview/azure/security-attestation-readme?view=azure-python-preview +[style-guide-msft]: https://docs.microsoft.com/style-guide/capitalization +[style-guide-cloud]: https://aka.ms/azsdk/cloud-style-guide +[microsoft_code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ +[azure_cli]: https://docs.microsoft.com/cli/azure +[azure_sub]: https://azure.microsoft.com/free/ +[code_of_conduct]: https://opensource.microsoft.com/codeofconduct/ +[json_web_token]: https://tools.ietf.org/html/rfc7519 +[JWK]: https://tools.ietf.org/html/rfc7517 +[base64url_encoding]: https://tools.ietf.org/html/rfc4648#section-5 +[contributing]: https://github.com/Azure/azure-sdk-for-python/blob/azure-security-attestation_1.0.0/CONTRIBUTING.md +[coc_faq]: https://opensource.microsoft.com/codeofconduct/faq/ + ![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fsdk%2Fattestation%2Fazure-security-attestation%2FREADME.png) - + diff --git a/docs-ref-services/latest/storage-file-datalake-readme.md b/docs-ref-services/latest/storage-file-datalake-readme.md index 5c7c8eb0f7fa..cec41de12007 100644 --- a/docs-ref-services/latest/storage-file-datalake-readme.md +++ b/docs-ref-services/latest/storage-file-datalake-readme.md @@ -86,7 +86,7 @@ DataLake storage offers four types of resources: * The storage account * A file system in the storage account * A directory under the file system -* A file in a the file system or under directory +* A file in the file system or under directory ### Async Clients This library includes a complete async API supported on Python 3.5+. 
To use it, you must diff --git a/docs-ref-services/latest/storage-fileshare-readme.md b/docs-ref-services/latest/storage-fileshare-readme.md index 0a94b8dd676f..e3b6ec23fb28 100644 --- a/docs-ref-services/latest/storage-fileshare-readme.md +++ b/docs-ref-services/latest/storage-fileshare-readme.md @@ -9,367 +9,367 @@ ms.service: storage ms.technology: azure manager: twolley --- -# Azure Files for Python Readme - Version 12.1.1 -Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. - -Azure file shares can be used to: - -* Replace or supplement on-premises file servers -* "Lift and shift" applications -* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) - -## Getting started - -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an -[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. - -### Install the package -Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-storage-file-share -``` - -### Create a storage account -If you wish to create a new storage account, you can use the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), -[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), -or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): - -```bash -# Create a new resource group to hold the storage account - -# if using an existing resource group, skip this step -az group create --name my-resource-group --location westus2 - -# Create the storage account -az storage account create -n my-storage-account-name -g my-resource-group -``` - -### Create the client -The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage -account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a -[client](#clients). 
To create a client object, you will need the storage account's file service URL and a -credential that allows you to access the storage account: - -```python -from azure.storage.fileshare import ShareServiceClient - -service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) -``` - -#### Looking up the account URL -You can find the storage account's file service URL using the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), -[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), -or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): - -```bash -# Get the file service URL for the storage account -az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" -``` - -#### Types of credentials -The `credential` parameter may be provided in a number of different forms, depending on the type of -[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: -1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), - provide the token as a string. If your account URL includes the SAS token, omit the credential parameter. - You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` - functions to create a sas token for the storage account, share, or file: - - ```python - from datetime import datetime, timedelta - from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions - - sas_token = generate_account_sas( - account_name="", - account_key="", - resource_types=ResourceTypes(service=True), - permission=AccountSasPermissions(read=True), - expiry=datetime.utcnow() + timedelta(hours=1) - ) - - share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) - ``` - -2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) - (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" - section or by running the following Azure CLI command: - - ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` - - Use the key as the credential parameter to authenticate the client: - ```python - from azure.storage.fileshare import ShareServiceClient - service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") - ``` - -#### Creating the client from a connection string -Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage -connection string instead of providing the account URL and credential separately. 
To do this, pass the storage -connection string to the client's `from_connection_string` class method: - -```python -from azure.storage.fileshare import ShareServiceClient - -connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" -service = ShareServiceClient.from_connection_string(conn_str=connection_string) -``` - -The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: - -```bash -az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount -``` - -## Key concepts -The following components make up the Azure File Share Service: -* The storage account itself -* A file share within the storage account -* An optional hierarchy of directories within the file share -* A file within the file share, which may be up to 1 TiB in size - -The Azure Storage File Share client library for Python allows you to interact with each of these components through the -use of a dedicated client object. - -### Clients -Four different clients are provided to to interact with the various components of the File Share Service: -1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - - this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured - client instances to access the file shares within. It provides operations to retrieve and configure the service - properties as well as list, create, and delete shares within the account. To perform operations on a specific share, - retrieve a client using the `get_share_client` method. -2. [ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - - this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire - preconfigured client instances to access the directories and files within. It provides operations to create, delete, - configure, or create snapshots of a share and includes operations to create and enumerate the contents of - directories within it. To perform operations on a specific directory or file, retrieve a client using the - `get_directory_client` or `get_file_client` methods. -3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - - this client represents interaction with a specific directory (which need not exist yet). It provides operations to - create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create - and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can - also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. -4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - - this client represents interaction with a specific file (which need not exist yet). It provides operations to - upload, download, create, delete, and copy a file. - -For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). 
- -## Examples -The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: - -* [Creating a file share](#creating-a-file-share "Creating a file share") -* [Uploading a file](#uploading-a-file "Uploading a file") -* [Downloading a file](#downloading-a-file "Downloading a file") -* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") - -### Creating a file share -Create a file share to store your files - -```python -from azure.storage.fileshare import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -share.create_share() -``` - -Use the async client to create a file share - -```python -from azure.storage.fileshare.aio import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -await share.create_share() -``` - -### Uploading a file -Upload a file to the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - file_client.upload_file(source_file) -``` - -Upload a file asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - await file_client.upload_file(source_file) -``` - -### Downloading a file -Download a file from the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = file_client.download_file() - data.readinto(file_handle) -``` - -Download a file asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = await file_client.download_file() - await data.readinto(file_handle) -``` - -### Listing contents of a directory -List all directories and files under a parent directory - -```python -from azure.storage.fileshare import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_list = list(parent_dir.list_directories_and_files()) -print(my_list) -``` - -List contents of a directory asynchronously - -```python -from azure.storage.fileshare.aio import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_files = [] -async for item in parent_dir.list_directories_and_files(): - my_files.append(item) -print(my_files) -``` - -## Optional Configuration - -Optional keyword arguments that can be passed in at the client and per-operation level. - -### Retry Policy configuration - -Use the following keyword arguments when instantiating a client to configure the retry policy: - -* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts. -Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10. -* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3. 
-* __retry_read__ (int): How many times to retry on read errors. Defaults to 3. -* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3. -* __retry_to_secondary__ (bool): Whether the request should be retried to secondary, if able. -This should only be enabled of RA-GRS accounts are used and potentially stale data can be handled. -Defaults to `False`. - -### Other client / per-operation configuration - -Other optional configuration keyword arguments that can be specified on the client or per-operation. - -**Client keyword arguments:** - -* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds. -* __transport__ (Any): User-provided transport to send the HTTP request. - -**Per-operation keyword arguments:** - -* __raw_response_hook__ (callable): The given callback uses the response returned from the service. -* __raw_request_hook__ (callable): The given callback uses the request before being sent to service. -* __client_request_id__ (str): Optional user specified identification of the request. -* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request. -* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at -the client level to enable it for all requests. -* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}` - - -## Troubleshooting -### General -Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md). -All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes). - -### Logging -This library uses the standard -[logging](https://docs.python.org/3/library/logging.html) library for logging. -Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO -level. - -Detailed DEBUG level logging, including request/response bodies and unredacted -headers, can be enabled on a client with the `logging_enable` argument: -```python -import sys -import logging -from azure.storage.fileshare import ShareServiceClient - -# Create a logger for the 'azure.storage.fileshare' SDK -logger = logging.getLogger('azure.storage.fileshare') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -# This client will log detailed information about its HTTP sessions, at DEBUG level -service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, -even when it isn't enabled for the client: -```py -service_client.get_service_properties(logging_enable=True) -``` - -## Next steps - -### More sample code - -Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). - -Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. 
These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: - -* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: - * Client creation - * Create a file share - * Upload a file - -* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: - * From a connection string - * From a shared access key - * From a shared access signature token - -* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: - * Get and set service properties - * Create, list, and delete shares - * Get a share client - -* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: - * Create a share snapshot - * Set share quota and metadata - * List directories and files - * Get the directory or file client to interact with a specific entity - -* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: - * Create a directory and add files - * Create and delete subdirectories - * Get the subdirectory client - -* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: - * Create, upload, download, and delete files - * Copy a file from a URL - -### Additional documentation +# Azure Files for Python Readme - Version 12.1.1 +Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. 
+ +Azure file shares can be used to: + +* Replace or supplement on-premises file servers +* "Lift and shift" applications +* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) + +## Getting started + +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an +[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. + +### Install the package +Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-storage-file-share +``` + +### Create a storage account +If you wish to create a new storage account, you can use the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), +[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), +or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): + +```bash +# Create a new resource group to hold the storage account - +# if using an existing resource group, skip this step +az group create --name my-resource-group --location westus2 + +# Create the storage account +az storage account create -n my-storage-account-name -g my-resource-group +``` + +### Create the client +The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage +account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a +[client](#clients). To create a client object, you will need the storage account's file service URL and a +credential that allows you to access the storage account: + +```python +from azure.storage.fileshare import ShareServiceClient + +service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) +``` + +#### Looking up the account URL +You can find the storage account's file service URL using the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), +[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), +or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): + +```bash +# Get the file service URL for the storage account +az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" +``` + +#### Types of credentials +The `credential` parameter may be provided in a number of different forms, depending on the type of +[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: +1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), + provide the token as a string. 
If your account URL includes the SAS token, omit the credential parameter. + You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` + functions to create a sas token for the storage account, share, or file: + + ```python + from datetime import datetime, timedelta + from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions + + sas_token = generate_account_sas( + account_name="", + account_key="", + resource_types=ResourceTypes(service=True), + permission=AccountSasPermissions(read=True), + expiry=datetime.utcnow() + timedelta(hours=1) + ) + + share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) + ``` + +2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) + (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" + section or by running the following Azure CLI command: + + ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` + + Use the key as the credential parameter to authenticate the client: + ```python + from azure.storage.fileshare import ShareServiceClient + service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") + ``` + +#### Creating the client from a connection string +Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage +connection string instead of providing the account URL and credential separately. To do this, pass the storage +connection string to the client's `from_connection_string` class method: + +```python +from azure.storage.fileshare import ShareServiceClient + +connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" +service = ShareServiceClient.from_connection_string(conn_str=connection_string) +``` + +The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: + +```bash +az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount +``` + +## Key concepts +The following components make up the Azure File Share Service: +* The storage account itself +* A file share within the storage account +* An optional hierarchy of directories within the file share +* A file within the file share, which may be up to 1 TiB in size + +The Azure Storage File Share client library for Python allows you to interact with each of these components through the +use of a dedicated client object. + +### Clients +Four different clients are provided to interact with the various components of the File Share Service: +1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - + this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured + client instances to access the file shares within. It provides operations to retrieve and configure the service + properties as well as list, create, and delete shares within the account. To perform operations on a specific share, + retrieve a client using the `get_share_client` method. +2. 
[ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - + this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire + preconfigured client instances to access the directories and files within. It provides operations to create, delete, + configure, or create snapshots of a share and includes operations to create and enumerate the contents of + directories within it. To perform operations on a specific directory or file, retrieve a client using the + `get_directory_client` or `get_file_client` methods. +3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - + this client represents interaction with a specific directory (which need not exist yet). It provides operations to + create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create + and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can + also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. +4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - + this client represents interaction with a specific file (which need not exist yet). It provides operations to + upload, download, create, delete, and copy a file. + +For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). + +## Examples +The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: + +* [Creating a file share](#creating-a-file-share "Creating a file share") +* [Uploading a file](#uploading-a-file "Uploading a file") +* [Downloading a file](#downloading-a-file "Downloading a file") +* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") + +### Creating a file share +Create a file share to store your files + +```python +from azure.storage.fileshare import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +share.create_share() +``` + +Use the async client to create a file share + +```python +from azure.storage.fileshare.aio import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +await share.create_share() +``` + +### Uploading a file +Upload a file to the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + file_client.upload_file(source_file) +``` + +Upload a file asynchronously + +```python +from azure.storage.fileshare.aio import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + await file_client.upload_file(source_file) +``` + +### Downloading a file +Download a file from the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("DEST_FILE", "wb") as file_handle: + data = file_client.download_file() + data.readinto(file_handle) +``` + +Download a file 
asynchronously
+
+```python
+from azure.storage.fileshare.aio import ShareFileClient
+
+file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file")
+
+with open("DEST_FILE", "wb") as file_handle:
+    data = await file_client.download_file()
+    await data.readinto(file_handle)
+```
+
+### Listing contents of a directory
+List all directories and files under a parent directory
+
+```python
+from azure.storage.fileshare import ShareDirectoryClient
+
+parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir")
+
+my_list = list(parent_dir.list_directories_and_files())
+print(my_list)
+```
+
+List contents of a directory asynchronously
+
+```python
+from azure.storage.fileshare.aio import ShareDirectoryClient
+
+parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir")
+
+my_files = []
+async for item in parent_dir.list_directories_and_files():
+    my_files.append(item)
+print(my_files)
+```
+
+## Optional Configuration
+
+Optional keyword arguments that can be passed in at the client and per-operation level.
+
+### Retry Policy configuration
+
+Use the following keyword arguments when instantiating a client to configure the retry policy:
+
+* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts.
+Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10.
+* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3.
+* __retry_read__ (int): How many times to retry on read errors. Defaults to 3.
+* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3.
+* __retry_to_secondary__ (bool): Whether the request should be retried to secondary, if able.
+This should only be enabled if RA-GRS accounts are used and potentially stale data can be handled.
+Defaults to `False`.
+
+### Other client / per-operation configuration
+
+Other optional configuration keyword arguments that can be specified on the client or per-operation.
+
+**Client keyword arguments:**
+
+* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds.
+* __transport__ (Any): User-provided transport to send the HTTP request.
+
+**Per-operation keyword arguments:**
+
+* __raw_response_hook__ (callable): The given callback uses the response returned from the service.
+* __raw_request_hook__ (callable): The given callback uses the request before being sent to service.
+* __client_request_id__ (str): Optional user specified identification of the request.
+* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request.
+* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at
+the client level to enable it for all requests.
+* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}`
+
+
+## Troubleshooting
+### General
+Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md).
+All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes).
+
+### Logging
+This library uses the standard
+[logging](https://docs.python.org/3/library/logging.html) library for logging.
+Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO +level. + +Detailed DEBUG level logging, including request/response bodies and unredacted +headers, can be enabled on a client with the `logging_enable` argument: +```python +import sys +import logging +from azure.storage.fileshare import ShareServiceClient + +# Create a logger for the 'azure.storage.fileshare' SDK +logger = logging.getLogger('azure.storage.fileshare') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +# This client will log detailed information about its HTTP sessions, at DEBUG level +service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, +even when it isn't enabled for the client: +```py +service_client.get_service_properties(logging_enable=True) +``` + +## Next steps + +### More sample code + +Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). + +Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: + +* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: + * Client creation + * Create a file share + * Upload a file + +* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: + * From a connection string + * From a shared access key + * From a shared access signature token + +* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: + * Get and set service properties + * Create, list, and delete shares + * Get a share client + +* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: + * Create a share snapshot + * Set share quota and metadata + * List directories and files + * Get the directory or file client to interact with a specific entity + +* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async 
version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: + * Create a directory and add files + * Create and delete subdirectories + * Get the subdirectory client + +* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: + * Create, upload, download, and delete files + * Copy a file from a URL + +### Additional documentation For more extensive documentation on Azure File Share storage, see the [Azure File Share storage documentation](https://docs.microsoft.com/azure/storage/files/) on docs.microsoft.com. - + diff --git a/docs-ref-services/legacy/storage-file-datalake-readme.md b/docs-ref-services/legacy/storage-file-datalake-readme.md index 1ef0b1c275fb..4c896ed38e23 100644 --- a/docs-ref-services/legacy/storage-file-datalake-readme.md +++ b/docs-ref-services/legacy/storage-file-datalake-readme.md @@ -7,240 +7,240 @@ ms.devlang: python ms.service: data-lake-storage-gen2 ms.technology: azure --- -# Azure DataLake service client library for Python - version 12.2.0 - -Overview - -This preview package for Python includes ADLS Gen2 specific API support made available in Storage SDK. This includes: -1. New directory level operations (Create, Rename, Delete) for hierarchical namespace enabled (HNS) storage account. For HNS enabled accounts, the rename/move operations are atomic. -2. Permission related operations (Get/Set ACLs) for hierarchical namespace enabled (HNS) accounts. - - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/azure/storage/filedatalake) | [Package (PyPi)](https://pypi.org/project/azure-storage-file-datalake/) | [API reference documentation](https://aka.ms/azsdk-python-storage-filedatalake-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples) - - -## Getting started - -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an -[Azure storage account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account) to use this package. 
- -### Install the package -Install the Azure DataLake Storage client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-storage-file-datalake --pre -``` - -### Create a storage account -If you wish to create a new storage account, you can use the -[Azure Portal](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal), -[Azure PowerShell](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell), -or [Azure CLI](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli): - -```bash -# Create a new resource group to hold the storage account - -# if using an existing resource group, skip this step -az group create --name my-resource-group --location westus2 - -# Install the extension 'Storage-Preview' -az extension add --name storage-preview - -# Create the storage account -az storage account create --name my-storage-account-name --resource-group my-resource-group --sku Standard_LRS --kind StorageV2 --hierarchical-namespace true -``` - -### Authenticate the client - -Interaction with DataLake Storage starts with an instance of the DataLakeServiceClient class. You need an existing storage account, its URL, and a credential to instantiate the client object. - -#### Get credentials - -To authenticate the client you have a few options: -1. Use a SAS token string -2. Use an account shared access key -3. Use a token credential from [azure.identity](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/identity/azure-identity) - -Alternatively, you can authenticate with a storage connection string using the `from_connection_string` method. See example: [Client creation with a connection string](#client-creation-with-a-connection-string). - -You can omit the credential if your account URL already has a SAS token. - -#### Create client - -Once you have your account URL and credentials ready, you can create the DataLakeServiceClient: - -```python -from azure.storage.filedatalake import DataLakeServiceClient - -service = DataLakeServiceClient(account_url="https://.dfs.core.windows.net/", credential=credential) -``` - -## Key concepts - -DataLake storage offers four types of resources: -* The storage account -* A file system in the storage account -* A directory under the file system -* A file in a the file system or under directory - -#### Clients - -The DataLake Storage SDK provides four different clients to interact with the DataLake Service: -1. **DataLakeServiceClient** - this client interacts with the DataLake Service at the account level. - It provides operations to retrieve and configure the account properties - as well as list, create, and delete file systems within the account. - For operations relating to a specific file system, directory or file, clients for those entities - can also be retrieved using the `get_file_client`, `get_directory_client` or `get_file_system_client` functions. -2. **FileSystemClient** - this client represents interaction with a specific - file system, even if that file system does not exist yet. It provides operations to create, delete, or - configure file systems and includes operations to list paths under file system, upload, and delete file or - directory in the file system. 
- For operations relating to a specific file, the client can also be retrieved using - the `get_file_client` function. - For operations relating to a specific directory, the client can be retrieved using - the `get_directory_client` function. -3. **DataLakeDirectoryClient** - this client represents interaction with a specific - directory, even if that directory does not exist yet. It provides directory operations create, delete, rename, - get properties and set properties operations. -3. **DataLakeFileClient** - this client represents interaction with a specific - file, even if that file does not exist yet. It provides file operations to append data, flush data, delete, - create, and read file. -4. **DataLakeLeaseClient** - this client represents lease interactions with a FileSystemClient, DataLakeDirectoryClient - or DataLakeFileClient. It provides operations to acquire, renew, release, change, and break leases on the resources. - -## Examples - -The following sections provide several code snippets covering some of the most common Storage DataLake tasks, including: - -* [Client creation with a connection string](#client-creation-with-a-connection-string) -* [Uploading a file](#uploading-a-file) -* [Downloading a file](#downloading-a-file) -* [Enumerating paths](#enumerating-paths) - - -### Client creation with a connection string -Create the DataLakeServiceClient using the connection string to your Azure Storage account. - -```python -from azure.storage.filedatalake import DataLakeServiceClient - -service = DataLakeServiceClient.from_connection_string(conn_str="my_connection_string") -``` - -### Uploading a file -Upload a file to your file system. - -```python -from azure.storage.filedatalake import DataLakeFileClient - -data = b"abc" -file = DataLakeFileClient.from_connection_string("my_connection_string", - file_system_name="myfilesystem", file_path="myfile") - -file.append_data(data, offset=0, length=len(data)) -file.flush_data(len(data)) -``` - -### Downloading a file -Download a file from your file system. - -```python -from azure.storage.filedatalake import DataLakeFileClient - -file = DataLakeFileClient.from_connection_string("my_connection_string", - file_system_name="myfilesystem", file_path="myfile") - -with open("./BlockDestination.txt", "wb") as my_file: - download = file.download_file() - download.readinto(my_file) -``` - -### Enumerating paths -List the paths in your file system. - -```python -from azure.storage.filedatalake import FileSystemClient - -file_system = FileSystemClient.from_connection_string("my_connection_string", file_system_name="myfilesystem") - -paths = file_system.get_paths() -for path in paths: - print(path.name + '\n') -``` - -## Troubleshooting -### General -DataLake Storage clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-storage-file-datalake_12.2.0/sdk/core/azure-core/README.md). - -This list can be used for reference to catch thrown exceptions. To get the specific error code of the exception, use the `error_code` attribute, i.e, `exception.error_code`. - -### Logging -This library uses the standard -[logging](https://docs.python.org/3/library/logging.html) library for logging. -Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO -level. 
- -Detailed DEBUG level logging, including request/response bodies and unredacted -headers, can be enabled on a client with the `logging_enable` argument: -```python -import sys -import logging -from azure.storage.filedatalake import DataLakeServiceClient - -# Create a logger for the 'azure.storage.filedatalake' SDK -logger = logging.getLogger('azure.storage') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -# This client will log detailed information about its HTTP sessions, at DEBUG level -service_client = DataLakeServiceClient.from_connection_string("your_connection_string", logging_enable=True) -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, -even when it isn't enabled for the client: -```py -service_client.list_file_systems(logging_enable=True) -``` - -## Next steps - -### More sample code - -Get started with our [Azure DataLake samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples). - -Several DataLake Storage Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with DataLake Storage: - -* [`datalake_samples_access_control.py`](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_access_control.py) - Examples for common DataLake Storage tasks: - * Set up a file system - * Create a directory - * Set/Get access control for the directory - * Create files under the directory - * Set/Get access control for each file - * Delete file system - -* [`datalake_samples_upload_download.py`](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_upload_download.py) - Examples for common DataLake Storage tasks: - * Set up a file system - * Create file - * Append data to the file - * Flush data to the file - * Download the uploaded data - * Delete file system - - -### Additional documentation - -Table for [ADLS Gen1 to ADLS Gen2 API Mapping](https://github.com/Azure/azure-sdk-for-python/blob/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/GEN1_GEN2_MAPPING.md) -For more extensive REST documentation on Data Lake Storage Gen2, see the [Data Lake Storage Gen2 documentation](https://docs.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem) on docs.microsoft.com. - - -## Contributing -This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. - +# Azure DataLake service client library for Python - version 12.2.0 + +Overview + +This preview package for Python includes ADLS Gen2 specific API support made available in Storage SDK. This includes: +1. 
New directory level operations (Create, Rename, Delete) for hierarchical namespace enabled (HNS) storage account. For HNS enabled accounts, the rename/move operations are atomic. +2. Permission related operations (Get/Set ACLs) for hierarchical namespace enabled (HNS) accounts. + + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/azure/storage/filedatalake) | [Package (PyPi)](https://pypi.org/project/azure-storage-file-datalake/) | [API reference documentation](https://aka.ms/azsdk-python-storage-filedatalake-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples) + + +## Getting started + +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an +[Azure storage account](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account) to use this package. + +### Install the package +Install the Azure DataLake Storage client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-storage-file-datalake --pre +``` + +### Create a storage account +If you wish to create a new storage account, you can use the +[Azure Portal](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-the-azure-portal), +[Azure PowerShell](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-powershell), +or [Azure CLI](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-quickstart-create-account#create-an-account-using-azure-cli): + +```bash +# Create a new resource group to hold the storage account - +# if using an existing resource group, skip this step +az group create --name my-resource-group --location westus2 + +# Install the extension 'Storage-Preview' +az extension add --name storage-preview + +# Create the storage account +az storage account create --name my-storage-account-name --resource-group my-resource-group --sku Standard_LRS --kind StorageV2 --hierarchical-namespace true +``` + +### Authenticate the client + +Interaction with DataLake Storage starts with an instance of the DataLakeServiceClient class. You need an existing storage account, its URL, and a credential to instantiate the client object. + +#### Get credentials + +To authenticate the client you have a few options: +1. Use a SAS token string +2. Use an account shared access key +3. Use a token credential from [azure.identity](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/identity/azure-identity) + +Alternatively, you can authenticate with a storage connection string using the `from_connection_string` method. See example: [Client creation with a connection string](#client-creation-with-a-connection-string). + +You can omit the credential if your account URL already has a SAS token. 
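+
+As an illustration of the third option, here is a minimal sketch that authenticates with a token credential from `azure.identity`. It assumes the separate `azure-identity` package is installed, that `<my-storage-account>` is a placeholder for your own account name, and that the environment can supply a credential (environment variables, managed identity, or a developer sign-in):
+
+```python
+from azure.identity import DefaultAzureCredential
+from azure.storage.filedatalake import DataLakeServiceClient
+
+# DefaultAzureCredential tries several authentication methods in turn
+# (environment variables, managed identity, developer sign-in, etc.).
+token_credential = DefaultAzureCredential()
+
+# The token credential is passed to the client in the same way as a
+# SAS token or account key string would be.
+service = DataLakeServiceClient(
+    account_url="https://<my-storage-account>.dfs.core.windows.net/",
+    credential=token_credential
+)
+```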
+
+#### Create client
+
+Once you have your account URL and credentials ready, you can create the DataLakeServiceClient:
+
+```python
+from azure.storage.filedatalake import DataLakeServiceClient
+
+service = DataLakeServiceClient(account_url="https://.dfs.core.windows.net/", credential=credential)
+```
+
+## Key concepts
+
+DataLake storage offers four types of resources:
+* The storage account
+* A file system in the storage account
+* A directory under the file system
+* A file in the file system or under directory
+
+#### Clients
+
+The DataLake Storage SDK provides five different clients to interact with the DataLake Service:
+1. **DataLakeServiceClient** - this client interacts with the DataLake Service at the account level.
+    It provides operations to retrieve and configure the account properties
+    as well as list, create, and delete file systems within the account.
+    For operations relating to a specific file system, directory or file, clients for those entities
+    can also be retrieved using the `get_file_client`, `get_directory_client` or `get_file_system_client` functions.
+2. **FileSystemClient** - this client represents interaction with a specific
+    file system, even if that file system does not exist yet. It provides operations to create, delete, or
+    configure file systems and includes operations to list paths under file system, upload, and delete file or
+    directory in the file system.
+    For operations relating to a specific file, the client can also be retrieved using
+    the `get_file_client` function.
+    For operations relating to a specific directory, the client can be retrieved using
+    the `get_directory_client` function.
+3. **DataLakeDirectoryClient** - this client represents interaction with a specific
+    directory, even if that directory does not exist yet. It provides operations to create, delete, and rename
+    a directory, as well as to get and set directory properties.
+4. **DataLakeFileClient** - this client represents interaction with a specific
+    file, even if that file does not exist yet. It provides file operations to append data, flush data, delete,
+    create, and read file.
+5. **DataLakeLeaseClient** - this client represents lease interactions with a FileSystemClient, DataLakeDirectoryClient
+    or DataLakeFileClient. It provides operations to acquire, renew, release, change, and break leases on the resources.
+
+## Examples
+
+The following sections provide several code snippets covering some of the most common Storage DataLake tasks, including:
+
+* [Client creation with a connection string](#client-creation-with-a-connection-string)
+* [Uploading a file](#uploading-a-file)
+* [Downloading a file](#downloading-a-file)
+* [Enumerating paths](#enumerating-paths)
+
+
+### Client creation with a connection string
+Create the DataLakeServiceClient using the connection string to your Azure Storage account.
+
+```python
+from azure.storage.filedatalake import DataLakeServiceClient
+
+service = DataLakeServiceClient.from_connection_string(conn_str="my_connection_string")
+```
+
+### Uploading a file
+Upload a file to your file system.
+
+```python
+from azure.storage.filedatalake import DataLakeFileClient
+
+data = b"abc"
+file = DataLakeFileClient.from_connection_string("my_connection_string",
+                                                 file_system_name="myfilesystem", file_path="myfile")
+
+file.append_data(data, offset=0, length=len(data))
+file.flush_data(len(data))
+```
+
+### Downloading a file
+Download a file from your file system.
+ +```python +from azure.storage.filedatalake import DataLakeFileClient + +file = DataLakeFileClient.from_connection_string("my_connection_string", + file_system_name="myfilesystem", file_path="myfile") + +with open("./BlockDestination.txt", "wb") as my_file: + download = file.download_file() + download.readinto(my_file) +``` + +### Enumerating paths +List the paths in your file system. + +```python +from azure.storage.filedatalake import FileSystemClient + +file_system = FileSystemClient.from_connection_string("my_connection_string", file_system_name="myfilesystem") + +paths = file_system.get_paths() +for path in paths: + print(path.name + '\n') +``` + +## Troubleshooting +### General +DataLake Storage clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-storage-file-datalake_12.2.0/sdk/core/azure-core/README.md). + +This list can be used for reference to catch thrown exceptions. To get the specific error code of the exception, use the `error_code` attribute, i.e, `exception.error_code`. + +### Logging +This library uses the standard +[logging](https://docs.python.org/3/library/logging.html) library for logging. +Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO +level. + +Detailed DEBUG level logging, including request/response bodies and unredacted +headers, can be enabled on a client with the `logging_enable` argument: +```python +import sys +import logging +from azure.storage.filedatalake import DataLakeServiceClient + +# Create a logger for the 'azure.storage.filedatalake' SDK +logger = logging.getLogger('azure.storage') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +# This client will log detailed information about its HTTP sessions, at DEBUG level +service_client = DataLakeServiceClient.from_connection_string("your_connection_string", logging_enable=True) +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, +even when it isn't enabled for the client: +```py +service_client.list_file_systems(logging_enable=True) +``` + +## Next steps + +### More sample code + +Get started with our [Azure DataLake samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples). + +Several DataLake Storage Python SDK samples are available to you in the SDK's GitHub repository. 
These samples provide example code for additional scenarios commonly encountered while working with DataLake Storage: + +* [`datalake_samples_access_control.py`](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_access_control.py) - Examples for common DataLake Storage tasks: + * Set up a file system + * Create a directory + * Set/Get access control for the directory + * Create files under the directory + * Set/Get access control for each file + * Delete file system + +* [`datalake_samples_upload_download.py`](https://github.com/Azure/azure-sdk-for-python/tree/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/samples/datalake_samples_upload_download.py) - Examples for common DataLake Storage tasks: + * Set up a file system + * Create file + * Append data to the file + * Flush data to the file + * Download the uploaded data + * Delete file system + + +### Additional documentation + +Table for [ADLS Gen1 to ADLS Gen2 API Mapping](https://github.com/Azure/azure-sdk-for-python/blob/azure-storage-file-datalake_12.2.0/sdk/storage/azure-storage-file-datalake/GEN1_GEN2_MAPPING.md) +For more extensive REST documentation on Data Lake Storage Gen2, see the [Data Lake Storage Gen2 documentation](https://docs.microsoft.com/rest/api/storageservices/datalakestoragegen2/filesystem) on docs.microsoft.com. + + +## Contributing +This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. + This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. - + diff --git a/docs-ref-services/legacy/storage-fileshare-readme.md b/docs-ref-services/legacy/storage-fileshare-readme.md index 577a3f4a5de3..bdc1e8935542 100644 --- a/docs-ref-services/legacy/storage-fileshare-readme.md +++ b/docs-ref-services/legacy/storage-fileshare-readme.md @@ -10,367 +10,367 @@ ms.subservice: files ms.technology: azure manager: twolley --- -# Azure Files for Python Readme - Version 12.1.1 -Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. 
- -Azure file shares can be used to: - -* Replace or supplement on-premises file servers -* "Lift and shift" applications -* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) - -## Getting started - -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an -[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. - -### Install the package -Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-storage-file-share -``` - -### Create a storage account -If you wish to create a new storage account, you can use the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), -[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), -or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): - -```bash -# Create a new resource group to hold the storage account - -# if using an existing resource group, skip this step -az group create --name my-resource-group --location westus2 - -# Create the storage account -az storage account create -n my-storage-account-name -g my-resource-group -``` - -### Create the client -The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage -account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a -[client](#clients). To create a client object, you will need the storage account's file service URL and a -credential that allows you to access the storage account: - -```python -from azure.storage.fileshare import ShareServiceClient - -service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) -``` - -#### Looking up the account URL -You can find the storage account's file service URL using the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), -[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), -or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): - -```bash -# Get the file service URL for the storage account -az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" -``` - -#### Types of credentials -The `credential` parameter may be provided in a number of different forms, depending on the type of -[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: -1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), - provide the token as a string. 
If your account URL includes the SAS token, omit the credential parameter. - You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` - functions to create a sas token for the storage account, share, or file: - - ```python - from datetime import datetime, timedelta - from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions - - sas_token = generate_account_sas( - account_name="", - account_key="", - resource_types=ResourceTypes(service=True), - permission=AccountSasPermissions(read=True), - expiry=datetime.utcnow() + timedelta(hours=1) - ) - - share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) - ``` - -2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) - (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" - section or by running the following Azure CLI command: - - ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` - - Use the key as the credential parameter to authenticate the client: - ```python - from azure.storage.fileshare import ShareServiceClient - service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") - ``` - -#### Creating the client from a connection string -Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage -connection string instead of providing the account URL and credential separately. To do this, pass the storage -connection string to the client's `from_connection_string` class method: - -```python -from azure.storage.fileshare import ShareServiceClient - -connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" -service = ShareServiceClient.from_connection_string(conn_str=connection_string) -``` - -The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: - -```bash -az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount -``` - -## Key concepts -The following components make up the Azure File Share Service: -* The storage account itself -* A file share within the storage account -* An optional hierarchy of directories within the file share -* A file within the file share, which may be up to 1 TiB in size - -The Azure Storage File Share client library for Python allows you to interact with each of these components through the -use of a dedicated client object. - -### Clients -Four different clients are provided to to interact with the various components of the File Share Service: -1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - - this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured - client instances to access the file shares within. It provides operations to retrieve and configure the service - properties as well as list, create, and delete shares within the account. To perform operations on a specific share, - retrieve a client using the `get_share_client` method. -2. 
[ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - - this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire - preconfigured client instances to access the directories and files within. It provides operations to create, delete, - configure, or create snapshots of a share and includes operations to create and enumerate the contents of - directories within it. To perform operations on a specific directory or file, retrieve a client using the - `get_directory_client` or `get_file_client` methods. -3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - - this client represents interaction with a specific directory (which need not exist yet). It provides operations to - create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create - and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can - also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. -4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - - this client represents interaction with a specific file (which need not exist yet). It provides operations to - upload, download, create, delete, and copy a file. - -For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). - -## Examples -The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: - -* [Creating a file share](#creating-a-file-share "Creating a file share") -* [Uploading a file](#uploading-a-file "Uploading a file") -* [Downloading a file](#downloading-a-file "Downloading a file") -* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") - -### Creating a file share -Create a file share to store your files - -```python -from azure.storage.fileshare import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -share.create_share() -``` - -Use the async client to create a file share - -```python -from azure.storage.fileshare.aio import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -await share.create_share() -``` - -### Uploading a file -Upload a file to the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - file_client.upload_file(source_file) -``` - -Upload a file asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - await file_client.upload_file(source_file) -``` - -### Downloading a file -Download a file from the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = file_client.download_file() - data.readinto(file_handle) -``` - -Download a file 
asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = await file_client.download_file() - await data.readinto(file_handle) -``` - -### Listing contents of a directory -List all directories and files under a parent directory - -```python -from azure.storage.fileshare import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_list = list(parent_dir.list_directories_and_files()) -print(my_list) -``` - -List contents of a directory asynchronously - -```python -from azure.storage.fileshare.aio import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_files = [] -async for item in parent_dir.list_directories_and_files(): - my_files.append(item) -print(my_files) -``` - -## Optional Configuration - -Optional keyword arguments that can be passed in at the client and per-operation level. - -### Retry Policy configuration - -Use the following keyword arguments when instantiating a client to configure the retry policy: - -* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts. -Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10. -* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3. -* __retry_read__ (int): How many times to retry on read errors. Defaults to 3. -* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3. -* __retry_to_secondary__ (bool): Whether the request should be retried to secondary, if able. -This should only be enabled of RA-GRS accounts are used and potentially stale data can be handled. -Defaults to `False`. - -### Other client / per-operation configuration - -Other optional configuration keyword arguments that can be specified on the client or per-operation. - -**Client keyword arguments:** - -* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds. -* __transport__ (Any): User-provided transport to send the HTTP request. - -**Per-operation keyword arguments:** - -* __raw_response_hook__ (callable): The given callback uses the response returned from the service. -* __raw_request_hook__ (callable): The given callback uses the request before being sent to service. -* __client_request_id__ (str): Optional user specified identification of the request. -* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request. -* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at -the client level to enable it for all requests. -* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}` - - -## Troubleshooting -### General -Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md). -All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes). - -### Logging -This library uses the standard -[logging](https://docs.python.org/3/library/logging.html) library for logging. 
-Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO -level. - -Detailed DEBUG level logging, including request/response bodies and unredacted -headers, can be enabled on a client with the `logging_enable` argument: -```python -import sys -import logging -from azure.storage.fileshare import ShareServiceClient - -# Create a logger for the 'azure.storage.fileshare' SDK -logger = logging.getLogger('azure.storage.fileshare') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -# This client will log detailed information about its HTTP sessions, at DEBUG level -service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, -even when it isn't enabled for the client: -```py -service_client.get_service_properties(logging_enable=True) -``` - -## Next steps - -### More sample code - -Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). - -Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: - -* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: - * Client creation - * Create a file share - * Upload a file - -* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: - * From a connection string - * From a shared access key - * From a shared access signature token - -* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: - * Get and set service properties - * Create, list, and delete shares - * Get a share client - -* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: - * Create a share snapshot - * Set share quota and metadata - * List directories and files - * Get the directory or file client to interact with a specific entity - -* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async 
version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: - * Create a directory and add files - * Create and delete subdirectories - * Get the subdirectory client - -* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: - * Create, upload, download, and delete files - * Copy a file from a URL - -### Additional documentation +# Azure Files for Python Readme - Version 12.1.1 +Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. + +Azure file shares can be used to: + +* Replace or supplement on-premises file servers +* "Lift and shift" applications +* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) + +## Getting started + +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an +[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. + +### Install the package +Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-storage-file-share +``` + +### Create a storage account +If you wish to create a new storage account, you can use the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), +[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), +or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): + +```bash +# Create a new resource group to hold the storage account - +# if using an existing resource group, skip this step +az group create --name my-resource-group --location westus2 + +# Create the storage account +az storage account create -n my-storage-account-name -g my-resource-group +``` + +### Create the client +The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage +account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a +[client](#clients). 
To create a client object, you will need the storage account's file service URL and a +credential that allows you to access the storage account: + +```python +from azure.storage.fileshare import ShareServiceClient + +service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) +``` + +#### Looking up the account URL +You can find the storage account's file service URL using the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), +[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), +or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): + +```bash +# Get the file service URL for the storage account +az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" +``` + +#### Types of credentials +The `credential` parameter may be provided in a number of different forms, depending on the type of +[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: +1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), + provide the token as a string. If your account URL includes the SAS token, omit the credential parameter. + You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` + functions to create a sas token for the storage account, share, or file: + + ```python + from datetime import datetime, timedelta + from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions + + sas_token = generate_account_sas( + account_name="", + account_key="", + resource_types=ResourceTypes(service=True), + permission=AccountSasPermissions(read=True), + expiry=datetime.utcnow() + timedelta(hours=1) + ) + + share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) + ``` + +2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) + (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" + section or by running the following Azure CLI command: + + ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` + + Use the key as the credential parameter to authenticate the client: + ```python + from azure.storage.fileshare import ShareServiceClient + service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") + ``` + +#### Creating the client from a connection string +Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage +connection string instead of providing the account URL and credential separately. 
To do this, pass the storage +connection string to the client's `from_connection_string` class method: + +```python +from azure.storage.fileshare import ShareServiceClient + +connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" +service = ShareServiceClient.from_connection_string(conn_str=connection_string) +``` + +The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: + +```bash +az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount +``` + +## Key concepts +The following components make up the Azure File Share Service: +* The storage account itself +* A file share within the storage account +* An optional hierarchy of directories within the file share +* A file within the file share, which may be up to 1 TiB in size + +The Azure Storage File Share client library for Python allows you to interact with each of these components through the +use of a dedicated client object. + +### Clients +Four different clients are provided to interact with the various components of the File Share Service: +1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - + this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured + client instances to access the file shares within. It provides operations to retrieve and configure the service + properties as well as list, create, and delete shares within the account. To perform operations on a specific share, + retrieve a client using the `get_share_client` method. +2. [ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - + this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire + preconfigured client instances to access the directories and files within. It provides operations to create, delete, + configure, or create snapshots of a share and includes operations to create and enumerate the contents of + directories within it. To perform operations on a specific directory or file, retrieve a client using the + `get_directory_client` or `get_file_client` methods. +3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - + this client represents interaction with a specific directory (which need not exist yet). It provides operations to + create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create + and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can + also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. +4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - + this client represents interaction with a specific file (which need not exist yet). It provides operations to + upload, download, create, delete, and copy a file. + +For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). 
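+As a quick illustration of how these clients fit together, the sketch below (using placeholder names such as "myshare" and "mydir") walks from the service client down to a file client with the `get_share_client`, `get_directory_client`, and `get_file_client` methods described above. Each client can also be built directly with `from_connection_string`, as the Examples section shows.
+
+```python
+from azure.storage.fileshare import ShareServiceClient
+
+# Placeholder account URL and credential - substitute your own values
+service = ShareServiceClient(account_url="https://<my-account>.file.core.windows.net/", credential="<account_key>")
+
+# Drill down from the storage account to a specific share, directory, and file
+share = service.get_share_client("myshare")
+directory = share.get_directory_client("mydir")
+file = directory.get_file_client("myfile.txt")
+```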
+ +## Examples +The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: + +* [Creating a file share](#creating-a-file-share "Creating a file share") +* [Uploading a file](#uploading-a-file "Uploading a file") +* [Downloading a file](#downloading-a-file "Downloading a file") +* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") + +### Creating a file share +Create a file share to store your files + +```python +from azure.storage.fileshare import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +share.create_share() +``` + +Use the async client to create a file share + +```python +from azure.storage.fileshare.aio import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +await share.create_share() +``` + +### Uploading a file +Upload a file to the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + file_client.upload_file(source_file) +``` + +Upload a file asynchronously + +```python +from azure.storage.fileshare.aio import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + await file_client.upload_file(source_file) +``` + +### Downloading a file +Download a file from the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("DEST_FILE", "wb") as file_handle: + data = file_client.download_file() + data.readinto(file_handle) +``` + +Download a file asynchronously + +```python +from azure.storage.fileshare.aio import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("DEST_FILE", "wb") as file_handle: + data = await file_client.download_file() + await data.readinto(file_handle) +``` + +### Listing contents of a directory +List all directories and files under a parent directory + +```python +from azure.storage.fileshare import ShareDirectoryClient + +parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") + +my_list = list(parent_dir.list_directories_and_files()) +print(my_list) +``` + +List contents of a directory asynchronously + +```python +from azure.storage.fileshare.aio import ShareDirectoryClient + +parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") + +my_files = [] +async for item in parent_dir.list_directories_and_files(): + my_files.append(item) +print(my_files) +``` + +## Optional Configuration + +Optional keyword arguments that can be passed in at the client and per-operation level. + +### Retry Policy configuration + +Use the following keyword arguments when instantiating a client to configure the retry policy: + +* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts. +Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10. +* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3. 
+* __retry_read__ (int): How many times to retry on read errors. Defaults to 3. +* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3. +* __retry_to_secondary__ (bool): Whether the request should be retried to secondary, if able. +This should only be enabled of RA-GRS accounts are used and potentially stale data can be handled. +Defaults to `False`. + +### Other client / per-operation configuration + +Other optional configuration keyword arguments that can be specified on the client or per-operation. + +**Client keyword arguments:** + +* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds. +* __transport__ (Any): User-provided transport to send the HTTP request. + +**Per-operation keyword arguments:** + +* __raw_response_hook__ (callable): The given callback uses the response returned from the service. +* __raw_request_hook__ (callable): The given callback uses the request before being sent to service. +* __client_request_id__ (str): Optional user specified identification of the request. +* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request. +* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at +the client level to enable it for all requests. +* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}` + + +## Troubleshooting +### General +Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md). +All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes). + +### Logging +This library uses the standard +[logging](https://docs.python.org/3/library/logging.html) library for logging. +Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO +level. + +Detailed DEBUG level logging, including request/response bodies and unredacted +headers, can be enabled on a client with the `logging_enable` argument: +```python +import sys +import logging +from azure.storage.fileshare import ShareServiceClient + +# Create a logger for the 'azure.storage.fileshare' SDK +logger = logging.getLogger('azure.storage.fileshare') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +# This client will log detailed information about its HTTP sessions, at DEBUG level +service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, +even when it isn't enabled for the client: +```py +service_client.get_service_properties(logging_enable=True) +``` + +## Next steps + +### More sample code + +Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). + +Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. 
These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: + +* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: + * Client creation + * Create a file share + * Upload a file + +* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: + * From a connection string + * From a shared access key + * From a shared access signature token + +* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: + * Get and set service properties + * Create, list, and delete shares + * Get a share client + +* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: + * Create a share snapshot + * Set share quota and metadata + * List directories and files + * Get the directory or file client to interact with a specific entity + +* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: + * Create a directory and add files + * Create and delete subdirectories + * Get the subdirectory client + +* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: + * Create, upload, download, and delete files + * Copy a file from a URL + +### Additional documentation For more extensive documentation on Azure File Share storage, see the [Azure File Share storage documentation](https://docs.microsoft.com/azure/storage/files/) on docs.microsoft.com. 
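+As a final sketch tying together the Optional Configuration section above, a few of the documented retry and timeout keyword arguments might be passed at client construction like this (the values are arbitrary examples, not recommendations):
+
+```python
+from azure.storage.fileshare import ShareServiceClient
+
+# Keyword arguments from the "Optional Configuration" section can be supplied when the client is created
+service_client = ShareServiceClient.from_connection_string(
+    "your_connection_string",
+    retry_total=5,           # total number of retries to allow
+    retry_connect=2,         # retries on connection-related errors
+    connection_timeout=20,   # connect/read timeout in seconds
+    logging_enable=False,    # set to True to log request/response details at DEBUG level
+)
+```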
- + diff --git a/docs-ref-services/preview/communication-administration-readme.md b/docs-ref-services/preview/communication-administration-readme.md index 782dd83466b9..c24f23d1da69 100644 --- a/docs-ref-services/preview/communication-administration-readme.md +++ b/docs-ref-services/preview/communication-administration-readme.md @@ -7,196 +7,196 @@ ms.devlang: python ms.service: azure-communication-services ms.technology: azure --- -[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) - -# Azure Communication Administration Package client library for Python - version 1.0.0b2 - - -Azure Communication Administration client package is intended to be used to setup the basics for opening a way to use Azure Communication Service offerings. This package helps to create identities user tokens to be used by other client packages such as chat, calling, sms. - -# Getting started -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) - -### Install the package -Install the Azure Communication Administration client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-communication-administration -``` - -# Key concepts -## CommunicationIdentityClient -`CommunicationIdentityClient` provides operations for: - -- Create/delete identities to be used in Azure Communication Services. Those identities can be used to make use of Azure Communication offerings and can be scoped to have limited abilities through token scopes. - -- Create/revoke scoped user access tokens to access services such as chat, calling, sms. Tokens are issued for a valid Azure Communication identity and can be revoked at any time. - -## CommunicationPhoneNumberClient -### Initializing Phone Number Client -```python -# You can find your endpoint and access token from your resource in the Azure Portal -import os -from azure.communication.administration import PhoneNumberAdministrationClient - -connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING') -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) -``` -### Phone plans overview - -Phone plans come in two types; Geographic and Toll-Free. Geographic phone plans are phone plans associated with a location, whose phone numbers' area codes are associated with the area code of a geographic location. Toll-Free phone plans are phone plans not associated location. For example, in the US, toll-free numbers can come with area codes such as 800 or 888. - -All geographic phone plans within the same country are grouped into a phone plan group with a Geographic phone number type. All Toll-Free phone plans within the same country are grouped into a phone plan group. - -### Searching and Acquiring numbers - -Phone numbers search can be search through the search creation API by providing a phone plan id, an area code and quantity of phone numbers. The provided quantity of phone numbers will be reserved for ten minutes. This search of phone numbers can either be cancelled or purchased. If the search is cancelled, then the phone numbers will become available to others. If the search is purchased, then the phone numbers are acquired for the Azure resources. 
- -### Configuring / Assigning numbers - -Phone numbers can be assigned to a callback URL via the configure number API. As part of the configuration, you will need an acquired phone number, callback URL and application id. - -# Examples -The following section provides several code snippets covering some of the most common Azure Communication Services tasks, including: - -[Create/delete Azure Communication Service identities][identitysamples] - -[Create/revoke scoped user access tokens][identitysamples] - -## Communication Phone number -### Get Countries - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -supported_countries = phone_number_administration_client.list_all_supported_countries() -for supported_country in supported_countries: - print(supported_country) -``` - -### Get Phone Plan Groups - -Phone plan groups come in two types, Geographic and Toll-Free. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_plan_groups_response = phone_number_administration_client.list_phone_plan_groups( - country_code='' -) -for phone_plan_group in phone_plan_groups_response: - print(phone_plan_group) -``` - -### Get Phone Plans - -Unlike Toll-Free phone plans, area codes for Geographic Phone Plans are empty. Area codes are found in the Area Codes API. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_plans_response = phone_number_administration_client.list_phone_plans( - country_code='', - phone_plan_group_id='' -) -for phone_plan in phone_plans_response: - print(phone_plan) -``` - -### Get Location Options - -For Geographic phone plans, you can query the available geographic locations. The locations options are structured like the geographic hierarchy of a country. For example, the US has states and within each state are cities. - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -location_options_response = phone_number_administration_client.get_phone_plan_location_options( - country_code='', - phone_plan_group_id='', - phone_plan_id='' -) -print(location_options_response) -``` - -### Get Area Codes - -Fetching area codes for geographic phone plans will require the the location options queries set. You must include the chain of geographic locations traversing down the location options object returned by the GetLocationOptions API. 
- -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -all_area_codes = phone_number_administration_client.get_all_area_codes( - location_type="NotRequired", - country_code='', - phone_plan_id='' -) -print(all_area_codes) -``` - -### Create Search - -```python -from azure.communication.administration import CreateSearchOptions -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -searchOptions = CreateSearchOptions( - area_code='', - description="testsearch20200014", - display_name="testsearch20200014", - phone_plan_ids=[''], - quantity=1 -) -search_response = phone_number_administration_client.create_search( - body=searchOptions -) -print(search_response) -``` - -### Get search by id -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_number_search_response = phone_number_administration_client.get_search_by_id( - search_id='' -) -print(phone_number_search_response) -``` - -### Purchase Search - -```python -phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) - -phone_number_administration_client.purchase_search( - search_id='' -) -``` - -# Troubleshooting -The Azure Communication Service Identity client will raise exceptions defined in [Azure Core][azure_core]. - -# Next steps -## More sample code - -Please take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-administration_1.0.0b2/sdk/communication/azure-communication-administration/samples) directory for detailed examples of how to use this library to manage identities and tokens. - -## Provide Feedback - -If you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project - -# Contributing -This project welcomes contributions and suggestions. Most contributions require you to agree to a -Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the -PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). -For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. - - -[identitysamples]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b2/sdk/communication/azure-communication-administration/samples/identity_samples.py +[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) + +# Azure Communication Administration Package client library for Python - version 1.0.0b2 + + +Azure Communication Administration client package is intended to be used to setup the basics for opening a way to use Azure Communication Service offerings. 
This package helps to create identities user tokens to be used by other client packages such as chat, calling, sms. + +# Getting started +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) + +### Install the package +Install the Azure Communication Administration client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-communication-administration +``` + +# Key concepts +## CommunicationIdentityClient +`CommunicationIdentityClient` provides operations for: + +- Create/delete identities to be used in Azure Communication Services. Those identities can be used to make use of Azure Communication offerings and can be scoped to have limited abilities through token scopes. + +- Create/revoke scoped user access tokens to access services such as chat, calling, sms. Tokens are issued for a valid Azure Communication identity and can be revoked at any time. + +## CommunicationPhoneNumberClient +### Initializing Phone Number Client +```python +# You can find your endpoint and access token from your resource in the Azure Portal +import os +from azure.communication.administration import PhoneNumberAdministrationClient + +connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING') +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) +``` +### Phone plans overview + +Phone plans come in two types; Geographic and Toll-Free. Geographic phone plans are phone plans associated with a location, whose phone numbers' area codes are associated with the area code of a geographic location. Toll-Free phone plans are phone plans not associated location. For example, in the US, toll-free numbers can come with area codes such as 800 or 888. + +All geographic phone plans within the same country are grouped into a phone plan group with a Geographic phone number type. All Toll-Free phone plans within the same country are grouped into a phone plan group. + +### Searching and Acquiring numbers + +Phone numbers search can be search through the search creation API by providing a phone plan id, an area code and quantity of phone numbers. The provided quantity of phone numbers will be reserved for ten minutes. This search of phone numbers can either be cancelled or purchased. If the search is cancelled, then the phone numbers will become available to others. If the search is purchased, then the phone numbers are acquired for the Azure resources. + +### Configuring / Assigning numbers + +Phone numbers can be assigned to a callback URL via the configure number API. As part of the configuration, you will need an acquired phone number, callback URL and application id. + +# Examples +The following section provides several code snippets covering some of the most common Azure Communication Services tasks, including: + +[Create/delete Azure Communication Service identities][identitysamples] + +[Create/revoke scoped user access tokens][identitysamples] + +## Communication Phone number +### Get Countries + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +supported_countries = phone_number_administration_client.list_all_supported_countries() +for supported_country in supported_countries: + print(supported_country) +``` + +### Get Phone Plan Groups + +Phone plan groups come in two types, Geographic and Toll-Free. 
+ +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_plan_groups_response = phone_number_administration_client.list_phone_plan_groups( + country_code='' +) +for phone_plan_group in phone_plan_groups_response: + print(phone_plan_group) +``` + +### Get Phone Plans + +Unlike Toll-Free phone plans, area codes for Geographic Phone Plans are empty. Area codes are found in the Area Codes API. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_plans_response = phone_number_administration_client.list_phone_plans( + country_code='', + phone_plan_group_id='' +) +for phone_plan in phone_plans_response: + print(phone_plan) +``` + +### Get Location Options + +For Geographic phone plans, you can query the available geographic locations. The locations options are structured like the geographic hierarchy of a country. For example, the US has states and within each state are cities. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +location_options_response = phone_number_administration_client.get_phone_plan_location_options( + country_code='', + phone_plan_group_id='', + phone_plan_id='' +) +print(location_options_response) +``` + +### Get Area Codes + +Fetching area codes for geographic phone plans will require the location options queries set. You must include the chain of geographic locations traversing down the location options object returned by the GetLocationOptions API. + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +all_area_codes = phone_number_administration_client.get_all_area_codes( + location_type="NotRequired", + country_code='', + phone_plan_id='' +) +print(all_area_codes) +``` + +### Create Search + +```python +from azure.communication.administration import CreateSearchOptions +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +searchOptions = CreateSearchOptions( + area_code='', + description="testsearch20200014", + display_name="testsearch20200014", + phone_plan_ids=[''], + quantity=1 +) +search_response = phone_number_administration_client.create_search( + body=searchOptions +) +print(search_response) +``` + +### Get search by id +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_number_search_response = phone_number_administration_client.get_search_by_id( + search_id='' +) +print(phone_number_search_response) +``` + +### Purchase Search + +```python +phone_number_administration_client = PhoneNumberAdministrationClient.from_connection_string(connection_str) + +phone_number_administration_client.purchase_search( + search_id='' +) +``` + +# Troubleshooting +The Azure Communication Service Identity client will raise exceptions defined in [Azure Core][azure_core]. + +# Next steps +## More sample code + +Please take a look at the [samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-administration_1.0.0b2/sdk/communication/azure-communication-administration/samples) directory for detailed examples of how to use this library to manage identities and tokens. 
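+For quick orientation before exploring those samples, a minimal identity flow might look like the sketch below. This is an illustrative assumption based on this preview package; method names (for example `issue_token`) can differ between preview versions, so confirm them against the linked identity samples.
+
+```python
+import os
+from azure.communication.administration import CommunicationIdentityClient
+
+connection_str = os.getenv('AZURE_COMMUNICATION_SERVICE_CONNECTION_STRING')
+identity_client = CommunicationIdentityClient.from_connection_string(connection_str)
+
+# Create an identity, then issue it a scoped access token (assumed method names for this preview)
+user = identity_client.create_user()
+token_response = identity_client.issue_token(user, scopes=["chat"])
+print(token_response.token)
+```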
+ +## Provide Feedback + +If you encounter any bugs or have suggestions, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-python/issues) section of the project + +# Contributing +This project welcomes contributions and suggestions. Most contributions require you to agree to a +Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the +PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. + +This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). +For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. + + +[identitysamples]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b2/sdk/communication/azure-communication-administration/samples/identity_samples.py [azure_core]: https://github.com/Azure/azure-sdk-for-python/blob/azure-communication-administration_1.0.0b2/sdk/core/azure-core/README.md - + diff --git a/docs-ref-services/preview/communication-chat-readme.md b/docs-ref-services/preview/communication-chat-readme.md index 22e0381ec489..924c6a99ed26 100644 --- a/docs-ref-services/preview/communication-chat-readme.md +++ b/docs-ref-services/preview/communication-chat-readme.md @@ -8,574 +8,574 @@ ms.service: azure-communication-services ms.subservice: chat ms.technology: azure --- -# Azure Communication Chat Package client library for Python - version 1.1.0b1 - - -This package contains a Python SDK for Azure Communication Services for Chat. -Read more about Azure Communication Services [here](https://docs.microsoft.com/azure/communication-services/overview) - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-chat_1.1.0b1/sdk/communication/azure-communication-chat) | [Package (Pypi)](https://pypi.org/project/azure-communication-chat/) | [API reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-communication-chat/1.0.0b5/index.html) | [Product documentation](https://docs.microsoft.com/azure/communication-services/) - -# Getting started - -## Prerequisites - -- Python 2.7, or 3.6 or later is required to use this package. -- A deployed Communication Services resource. You can use the [Azure Portal](https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-azp) or the [Azure PowerShell](https://docs.microsoft.com/powershell/module/az.communication/new-azcommunicationservice) to set it up. - -## Install the package - -Install the Azure Communication Service Chat SDK. - -```bash -pip install --pre azure-communication-chat -``` - -## User Access Tokens - -User access tokens enable you to build client applications that directly authenticate to Azure Communication Services. You can generate these tokens with azure.communication.identity module, and then use them to initialize the Communication Services SDKs. 
Example of using azure.communication.identity: - -```bash -pip install --pre azure-communication-identity -``` - -```python -from azure.communication.identity import CommunicationIdentityClient -identity_client = CommunicationIdentityClient.from_connection_string("") -user = identity_client.create_user() -tokenresponse = identity_client.get_token(user, scopes=["chat"]) -token = tokenresponse.token -``` - -The `user` created above will be used later, because that user should be added as a participant of new chat thread when you creating -it with this token. It is because the initiator of the create request must be in the list of the participants of the chat thread. - -## Create the Chat Client - -This will allow you to create, get, list or delete chat threads. - -```python -from azure.communication.chat import ChatClient, CommunicationTokenCredential - -# Your unique Azure Communication service endpoint -endpoint = "https://.communcationservices.azure.com" -chat_client = ChatClient(endpoint, CommunicationTokenCredential(token)) -``` - -## Create Chat Thread Client - -The ChatThreadClient will allow you to perform operations specific to a chat thread, like send message, get message, update -the chat thread topic, add participants to chat thread, etc. - -You can get it by creating a new chat thread using ChatClient: - -```python -create_chat_thread_result = chat_client.create_chat_thread(topic) -chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) -``` - -Additionally, the client can also direct so that the request is repeatable; that is, if the client makes the -request multiple times with the same Idempotency-Token and it will get back an appropriate response without -the server executing the request multiple times. The value of the Idempotency-Token is an opaque string -representing a client-generated, globally unique for all time, identifier for the request. - -```python -create_chat_thread_result = chat_client.create_chat_thread( - topic, - thread_participants=thread_participants, - idempotency_token=idempotency_token -) -chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) -``` - -Alternatively, if you have created a chat thread before and you have its thread_id, you can create it by: - -```python -chat_thread_client = chat_client.get_chat_thread_client(thread_id) # thread_id is the id of an existing chat thread -``` - -# Key concepts - -A chat conversation is represented by a chat thread. Each user in the thread is called a thread participant. -Thread participants can chat with one another privately in a 1:1 chat or huddle up in a 1:N group chat. -Users also get near real-time updates for when others are typing and when they have read the messages. 
- -Once you initialized a `ChatClient` class, you can do the following chat operations: - -## Create, get, update, and delete threads - -Perform CRD(Create-Read-Delete) operations on threads - -```Python -create_chat_thread(topic, **kwargs) -list_chat_threads(**kwargs) -delete_chat_thread(thread_id, **kwargs) -``` - -Once you initialized a `ChatThreadClient` class, you can do the following chat operations: - -## Update thread - -Perform Update operation on thread topic - -```python -update_topic(topic, **kwargs) -``` - -## Get Chat thread properties -```python -get_properties(**kwargs) -``` - -## Send, get, update, and delete messages - -Perform CRUD(Create-Read-Update-Delete) operations on messages - -```Python -send_message(content, **kwargs) -get_message(message_id, **kwargs) -list_messages(**kwargs) -update_message(message_id, content, **kwargs) -delete_message(message_id, **kwargs) -``` - -## Get, add, and remove participants - -Perform CRD(Create-Read-Delete) operations on thread participants - -```Python -list_participants(**kwargs) -add_participants(thread_participants, **kwargs) -remove_participant(participant_identifier, **kwargs) -``` - -## Send typing notification - -Notify the service of typing notification - -```python -send_typing_notification(**kwargs) -``` - -## Send and get read receipt - -Notify the service that a message is read and get list of read messages. - -```Python -send_read_receipt(message_id, **kwargs) -list_read_receipts(**kwargs) -``` - -# Examples - -The following sections provide several code snippets covering some of the most common tasks, including: - -- [Thread Operations](#thread-operations) -- [Message Operations](#message-operations) -- [Thread Participant Operations](#thread-participant-operations) -- [Events Operations](#events-operations) - -## Thread Operations - -### Create a thread - -Use the `create_chat_thread` method to create a chat thread. - -- Use `topic`, required, to give a thread topic; -- Use `thread_participants`, optional, to provide a list the `ChatParticipant` to be added to the thread; - - `user`, required, it is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() - from User Access Tokens - - - `display_name`, optional, is the display name for the thread participant. - - `share_history_time`, optional, time from which the chat history is shared with the participant. -- Use `idempotency_token`, optional, to specify the unique identifier for the request. - - -`CreateChatThreadResult` is the result returned from creating a thread, you can use it to fetch the `id` of -the chat thread that got created. This `id` can then be used to fetch a `ChatThreadClient` object using -the `get_chat_thread_client` method. `ChatThreadClient` can be used to perform other chat operations to this chat thread. 
- -```Python -# Without idempotency_token and thread_participants -topic = "test topic" -create_chat_thread_result = chat_client.create_chat_thread(topic) -chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) -``` - -```Python -# With idempotency_token and thread_participants -from azure.communication.identity import CommunicationIdentityClient -from azure.communication.chat import ChatParticipant, ChatClient, CommunicationTokenCredential -import uuid -from datetime import datetime - -# create an user -identity_client = CommunicationIdentityClient.from_connection_string('') -user = identity_client.create_user() - -# user access tokens -tokenresponse = identity_client.get_token(user, scopes=["chat"]) -token = tokenresponse.token - -## OR pass existing user -# from azure.communication.chat import CommunicationUserIdentifier -# user_id = 'some_user_id' -# user = CommunicationUserIdentifier(user_id) - -# create the chat_client -endpoint = "https://.communcationservices.azure.com" -chat_client = ChatClient(endpoint, CommunicationTokenCredential(token)) - -# modify function to implement customer logic -def get_unique_identifier_for_request(**kwargs): - res = uuid.uuid4() - return res - -topic = "test topic" -thread_participants = [ChatParticipant( - identifier=user, - display_name='name', - share_history_time=datetime.utcnow() -)] - -# obtains idempotency_token using some customer logic -idempotency_token = get_unique_identifier_for_request() - -create_chat_thread_result = chat_client.create_chat_thread( - topic, - thread_participants=thread_participants, - idempotency_token=idempotency_token) -thread_id = create_chat_thread_result.chat_thread.id - -# fetch ChatThreadClient -chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) - -# Additionally, you can also check if all participants were successfully added or not -# and subsequently retry adding the failed participants again -def decide_to_retry(error, **kwargs): - """ - Insert some custom logic to decide if retry is applicable based on error - """ - return True - -retry = [thread_participant for thread_participant, error in create_chat_thread_result.errors if decide_to_retry(error)] -if retry: - chat_thread_client.add_participants(retry) -``` - - -### Get a thread - -Use `get_properties` method retrieves a `ChatThreadProperties` from the service; `thread_id` is the unique ID of the thread. - -```Python -chat_thread_properties = chat_thread_client.get_properties() -``` - -### List chat threads -Use `list_chat_threads` method retrieves the list of created chat threads - -- Use `results_per_page`, optional, The maximum number of messages to be returned per page. -- Use `start_time`, optional, The start time where the range query. - -An iterator of `[ChatThreadItem]` is the response returned from listing threads - -```python -from azure.communication.chat import ChatClient, CommunicationTokenCredential -from datetime import datetime, timedelta - -token = "" -endpoint = "https://.communcationservices.azure.com" -chat_client = ChatClient(endpoint, CommunicationTokenCredential(token)) -start_time = datetime.utcnow() - timedelta(days=2) - -chat_threads = chat_client.list_chat_threads(results_per_page=5, start_time=start_time) -for chat_thread_item_page in chat_threads.by_page(): - for chat_thread_item in chat_thread_item_page: - print("thread id:", chat_thread_item.id) -``` - -### Update a thread topic - -Use `update_topic` method to update a thread's properties. 
`topic` is used to describe the change of the thread topic -- Use `topic` to give thread a new topic; - -```python -topic = "new topic" -chat_thread_client.update_topic(topic=topic) - -chat_thread = chat_thread_client.get_properties(thread_id) - -assert chat_thread.topic == topic -``` - -### Delete a thread - -Use `delete_chat_thread` method to delete a thread; `thread_id` is the unique ID of the thread. -- Use `thread_id`, required, to specify the unique ID of the thread. -```Python -chat_client.delete_chat_thread(thread_id=thread_id) -``` - -## Message Operations - -### Send a message - -Use `send_message` method to sends a message to a thread identified by `thread_id`. - -- Use `content`, required, to provide the chat message content. -- Use `chat_message_type`, optional, to provide the chat message type. Possible values include: `ChatMessageType.TEXT`, - `ChatMessageType.HTML`, `'text'`, `'html'`; if not specified, `ChatMessageType.TEXT` will be set -- Use `sender_display_name`,optional, to specify the display name of the sender, if not specified, empty name will be set - -`SendChatMessageResult` is the response returned from sending a message, it contains an id, which is the unique ID of the message. - -```Python -from azure.communication.chat import ChatMessageType - -topic = "test topic" -create_chat_thread_result = chat_client.create_chat_thread(topic) -thread_id = create_chat_thread_result.chat_thread.id -chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) - -content='hello world' -sender_display_name='sender name' -chat_message_type = ChatMessageType.TEXT - -# without specifying sender_display_name and chat_message_type -send_message_result = chat_thread_client.send_message(content) -send_message_result_id = send_message_result.id -print("Message sent: id: ", send_message_result_id) - -# specifying sender_display_name and chat_message_type -send_message_result_w_type = chat_thread_client.send_message( - content, - sender_display_name=sender_display_name, - chat_message_type=chat_message_type # equivalent to chat_message_type = 'text' -) -send_message_result_w_type_id = send_message_result_w_type.id -print("Message sent: id: ", send_message_result_w_type_id) -``` - -### Get a message - -Use `get_message` method retrieves a message from the service; `message_id` is the unique ID of the message. -- Use `message_id`,required, to specify message id of an existing message -`ChatMessage` is the response returned from getting a message, it contains an id, which is the unique ID of the message, and other fields please refer to azure.communication.chat.ChatMessage - -```python -chat_message = chat_thread_client.get_message(message_id=send_message_result_id) -print("get_chat_message succeeded, message id:", chat_message.id, "content: ", chat_message.content) -``` - -### List messages - -Use `list_messages` method retrieves messages from the service. -- Use `results_per_page`, optional, The maximum number of messages to be returned per page. -- Use `start_time`, optional, The start time where the range query. 
- -An iterator of `[ChatMessage]` is the response returned from listing messages - -```Python -from datetime import datetime, timedelta - -start_time = datetime.utcnow() - timedelta(days=1) - -chat_messages = chat_thread_client.list_messages(results_per_page=1, start_time=start_time) -for chat_message_page in chat_messages.by_page(): - for chat_message in chat_message_page: - print("ChatMessage: Id=", chat_message.id, "; Content=", chat_message.content.message) -``` - -### Update a message - -Use `update_message` to update a message identified by threadId and messageId. -- Use `message_id`,required, is the unique ID of the message. -- Use `content`, optional, is the message content to be updated; if not specified it is assigned to be empty - -```Python -content = "updated message content" -chat_thread_client.update_message(send_message_result_id, content=content) - -chat_message = chat_thread_client.get_message(message_id=send_message_result_id) - -assert chat_message.content.message == content -``` - -### Delete a message - -Use `delete_message` to delete a message. -- Use `message_id`, required, is the unique ID of the message. - -```python -chat_thread_client.delete_message(message_id=send_message_result_id) -``` - -## Thread Participant Operations - -### List thread participants - -Use `list_participants` to retrieve the participants of the thread. -- Use `results_per_page`, optional, The maximum number of participants to be returned per page. -- Use `skip`, optional, to skips participants up to a specified position in response. - -An iterator of `[ChatParticipant]` is the response returned from listing participants - -```python -chat_participants = chat_thread_client.list_participants(results_per_page=5, skip=5) -for chat_participant_page in chat_participants.by_page(): - for chat_participant in chat_participant_page: - print("ChatParticipant: ", chat_participant) -``` - -### Add thread participants - -Use `add_participants` method to add thread participants to the thread. - -- Use `thread_participants`, required, to list the `ChatParticipant` to be added to the thread; - - `user`, required, it is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() from User Access Tokens - - - `display_name`, optional, is the display name for the thread participant. - - `share_history_time`, optional, time from which the chat history is shared with the participant. - -A `list(tuple(ChatParticipant, ChatError))` is returned. When participant is successfully added, -an empty list is expected. In case of an error encountered while adding participant, the list is populated -with the failed participants along with the error that was encountered. 
- -```Python -from azure.communication.identity import CommunicationIdentityClient -from azure.communication.chat import ChatParticipant -from datetime import datetime - -# create 2 users -identity_client = CommunicationIdentityClient.from_connection_string('') -new_users = [identity_client.create_user() for i in range(2)] - -# # conversely, you can also add an existing user to a chat thread; provided the user_id is known -# from azure.communication.chat import CommunicationUserIdentifier -# -# user_id = 'some user id' -# user_display_name = "Wilma Flinstone" -# new_user = CommunicationUserIdentifier(user_id) -# participant = ChatParticipant( -# identifier=new_user, -# display_name=user_display_name, -# share_history_time=datetime.utcnow()) - -participants = [] -for _user in new_users: - chat_participant = ChatParticipant( - identifier=_user, - display_name='Fred Flinstone', - share_history_time=datetime.utcnow() - ) - participants.append(chat_participant) - -response = chat_thread_client.add_participants(thread_participants=participants) - -def decide_to_retry(error, **kwargs): - """ - Insert some custom logic to decide if retry is applicable based on error - """ - return True - -# verify if all users has been successfully added or not -# in case of partial failures, you can retry to add all the failed participants -retry = [p for p, e in response if decide_to_retry(e)] -if retry: - chat_thread_client.add_participants(retry) -``` - -### Remove thread participant - -Use `remove_participant` method to remove thread participant from the thread identified by threadId. -`identifier` is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() from `azure-communication-identity` - -and was added into this chat thread. -- Use `identifier` to specify the `CommunicationUserIdentifier` you created -```python -chat_thread_client.remove_participant(identifier=new_user) - -# # conversely you can also do the following; provided the user_id is known -# from azure.communication.chat import CommunicationUserIdentifier -# -# user_id = 'some user id' -# chat_thread_client.remove_participant(identifier=CommunicationUserIdentifier(new_user)) - -``` - -## Events Operations - -### Send typing notification - -Use `send_typing_notification` method to post a typing notification event to a thread, on behalf of a user. - -```Python -chat_thread_client.send_typing_notification() -``` - -### Send read receipt - -Use `send_read_receipt` method to post a read receipt event to a thread, on behalf of a user. -- Use `message_id` to specify the id of the message whose read receipt is to be sent -```python -content='hello world' -send_message_result = chat_thread_client.send_message(content) -send_message_result_id = send_message_result.id -chat_thread_client.send_read_receipt(message_id=send_message_result_id) -``` - -### List read receipts - -Use `list_read_receipts` method retrieves read receipts for a thread. -- Use `results_per_page`, optional, The maximum number of read receipts to be returned per page. -- Use `skip`,optional, to skips read receipts up to a specified position in response. 
- -An iterator of `[ChatMessageReadReceipt]` is the response returned from listing read receipts - -```python -read_receipts = chat_thread_client.list_read_receipts(results_per_page=5, skip=5) - -for read_receipt_page in read_receipts.by_page(): - for read_receipt in read_receipt_page: - print(read_receipt) - print(read_receipt.sender) - print(read_receipt.chat_message_id) - print(read_receipt.read_on) -``` - -## Sample Code - -These are code samples that show common scenario operations with the Azure Communication Chat client library. -The async versions of the samples (the python sample files appended with `_async`) show asynchronous operations, -and require Python 3.6 or later. -Before run the sample code, refer to Prerequisites - -to create a resource, then set some Environment Variables - -```bash -set AZURE_COMMUNICATION_SERVICE_ENDPOINT="https://.communcationservices.azure.com" -set COMMUNICATION_SAMPLES_CONNECTION_STRING="" - -pip install azure-communication-identity - -python samples\chat_client_sample.py -python samples\chat_client_sample_async.py -python samples\chat_thread_client_sample.py -python samples\chat_thread_client_sample_async.py -``` - -# Troubleshooting - -Running into issues? This section should contain details as to what to do there. - -# Next steps - -More sample code should go [here](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-chat_1.1.0b1/sdk/communication/azure-communication-chat/samples), along with links out to the appropriate example tests. - -# Contributing - -If you encounter any bugs or have suggestions, please file an issue in the [Issues]() section of the project. - +# Azure Communication Chat Package client library for Python - version 1.1.0b1 + + +This package contains a Python SDK for Azure Communication Services for Chat. +Read more about Azure Communication Services [here](https://docs.microsoft.com/azure/communication-services/overview) + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-chat_1.1.0b1/sdk/communication/azure-communication-chat) | [Package (Pypi)](https://pypi.org/project/azure-communication-chat/) | [API reference documentation](https://azuresdkdocs.blob.core.windows.net/$web/python/azure-communication-chat/1.0.0b5/index.html) | [Product documentation](https://docs.microsoft.com/azure/communication-services/) + +# Getting started + +## Prerequisites + +- Python 2.7, or 3.6 or later is required to use this package. +- A deployed Communication Services resource. You can use the [Azure Portal](https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-azp) or the [Azure PowerShell](https://docs.microsoft.com/powershell/module/az.communication/new-azcommunicationservice) to set it up. + +## Install the package + +Install the Azure Communication Service Chat SDK. + +```bash +pip install --pre azure-communication-chat +``` + +## User Access Tokens + +User access tokens enable you to build client applications that directly authenticate to Azure Communication Services. You can generate these tokens with azure.communication.identity module, and then use them to initialize the Communication Services SDKs. 
Example of using azure.communication.identity:
+
+```bash
+pip install --pre azure-communication-identity
+```
+
+```python
+from azure.communication.identity import CommunicationIdentityClient
+identity_client = CommunicationIdentityClient.from_connection_string("")
+user = identity_client.create_user()
+tokenresponse = identity_client.get_token(user, scopes=["chat"])
+token = tokenresponse.token
+```
+
+The `user` created above will be used later, because that user must be added as a participant of the new chat thread when you create
+it with this token, since the initiator of the create request must be in the list of participants of the chat thread.
+
+## Create the Chat Client
+
+This will allow you to create, get, list, or delete chat threads.
+
+```python
+from azure.communication.chat import ChatClient, CommunicationTokenCredential
+
+# Your unique Azure Communication service endpoint
+endpoint = "https://.communcationservices.azure.com"
+chat_client = ChatClient(endpoint, CommunicationTokenCredential(token))
+```
+
+## Create Chat Thread Client
+
+The ChatThreadClient will allow you to perform operations specific to a chat thread, such as sending a message, getting a message, updating
+the chat thread topic, and adding participants to the chat thread.
+
+You can get it by creating a new chat thread using ChatClient:
+
+```python
+create_chat_thread_result = chat_client.create_chat_thread(topic)
+chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id)
+```
+
+Additionally, the client can direct the request to be repeatable; that is, if the client makes the
+request multiple times with the same Idempotency-Token, it will get back an appropriate response without
+the server executing the request multiple times. The value of the Idempotency-Token is an opaque string
+representing a client-generated, globally unique (for all time) identifier for the request.
+
+```python
+create_chat_thread_result = chat_client.create_chat_thread(
+    topic,
+    thread_participants=thread_participants,
+    idempotency_token=idempotency_token
+)
+chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id)
+```
+
+Alternatively, if you have created a chat thread before and you have its thread_id, you can create a ChatThreadClient for it directly:
+
+```python
+chat_thread_client = chat_client.get_chat_thread_client(thread_id) # thread_id is the id of an existing chat thread
+```
+
+# Key concepts
+
+A chat conversation is represented by a chat thread. Each user in the thread is called a thread participant.
+Thread participants can chat with one another privately in a 1:1 chat or huddle up in a 1:N group chat.
+Users also get near real-time updates for when others are typing and when they have read the messages.
+ +Once you initialized a `ChatClient` class, you can do the following chat operations: + +## Create, get, update, and delete threads + +Perform CRD(Create-Read-Delete) operations on threads + +```Python +create_chat_thread(topic, **kwargs) +list_chat_threads(**kwargs) +delete_chat_thread(thread_id, **kwargs) +``` + +Once you initialized a `ChatThreadClient` class, you can do the following chat operations: + +## Update thread + +Perform Update operation on thread topic + +```python +update_topic(topic, **kwargs) +``` + +## Get Chat thread properties +```python +get_properties(**kwargs) +``` + +## Send, get, update, and delete messages + +Perform CRUD(Create-Read-Update-Delete) operations on messages + +```Python +send_message(content, **kwargs) +get_message(message_id, **kwargs) +list_messages(**kwargs) +update_message(message_id, content, **kwargs) +delete_message(message_id, **kwargs) +``` + +## Get, add, and remove participants + +Perform CRD(Create-Read-Delete) operations on thread participants + +```Python +list_participants(**kwargs) +add_participants(thread_participants, **kwargs) +remove_participant(participant_identifier, **kwargs) +``` + +## Send typing notification + +Notify the service of typing notification + +```python +send_typing_notification(**kwargs) +``` + +## Send and get read receipt + +Notify the service that a message is read and get list of read messages. + +```Python +send_read_receipt(message_id, **kwargs) +list_read_receipts(**kwargs) +``` + +# Examples + +The following sections provide several code snippets covering some of the most common tasks, including: + +- [Thread Operations](#thread-operations) +- [Message Operations](#message-operations) +- [Thread Participant Operations](#thread-participant-operations) +- [Events Operations](#events-operations) + +## Thread Operations + +### Create a thread + +Use the `create_chat_thread` method to create a chat thread. + +- Use `topic`, required, to give a thread topic; +- Use `thread_participants`, optional, to provide a list the `ChatParticipant` to be added to the thread; + - `user`, required, it is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() + from User Access Tokens + + - `display_name`, optional, is the display name for the thread participant. + - `share_history_time`, optional, time from which the chat history is shared with the participant. +- Use `idempotency_token`, optional, to specify the unique identifier for the request. + + +`CreateChatThreadResult` is the result returned from creating a thread, you can use it to fetch the `id` of +the chat thread that got created. This `id` can then be used to fetch a `ChatThreadClient` object using +the `get_chat_thread_client` method. `ChatThreadClient` can be used to perform other chat operations to this chat thread. 
+ +```Python +# Without idempotency_token and thread_participants +topic = "test topic" +create_chat_thread_result = chat_client.create_chat_thread(topic) +chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) +``` + +```Python +# With idempotency_token and thread_participants +from azure.communication.identity import CommunicationIdentityClient +from azure.communication.chat import ChatParticipant, ChatClient, CommunicationTokenCredential +import uuid +from datetime import datetime + +# create an user +identity_client = CommunicationIdentityClient.from_connection_string('') +user = identity_client.create_user() + +# user access tokens +tokenresponse = identity_client.get_token(user, scopes=["chat"]) +token = tokenresponse.token + +## OR pass existing user +# from azure.communication.chat import CommunicationUserIdentifier +# user_id = 'some_user_id' +# user = CommunicationUserIdentifier(user_id) + +# create the chat_client +endpoint = "https://.communcationservices.azure.com" +chat_client = ChatClient(endpoint, CommunicationTokenCredential(token)) + +# modify function to implement customer logic +def get_unique_identifier_for_request(**kwargs): + res = uuid.uuid4() + return res + +topic = "test topic" +thread_participants = [ChatParticipant( + identifier=user, + display_name='name', + share_history_time=datetime.utcnow() +)] + +# obtains idempotency_token using some customer logic +idempotency_token = get_unique_identifier_for_request() + +create_chat_thread_result = chat_client.create_chat_thread( + topic, + thread_participants=thread_participants, + idempotency_token=idempotency_token) +thread_id = create_chat_thread_result.chat_thread.id + +# fetch ChatThreadClient +chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id) + +# Additionally, you can also check if all participants were successfully added or not +# and subsequently retry adding the failed participants again +def decide_to_retry(error, **kwargs): + """ + Insert some custom logic to decide if retry is applicable based on error + """ + return True + +retry = [thread_participant for thread_participant, error in create_chat_thread_result.errors if decide_to_retry(error)] +if retry: + chat_thread_client.add_participants(retry) +``` + + +### Get a thread + +Use `get_properties` method retrieves a `ChatThreadProperties` from the service; `thread_id` is the unique ID of the thread. + +```Python +chat_thread_properties = chat_thread_client.get_properties() +``` + +### List chat threads +Use `list_chat_threads` method retrieves the list of created chat threads + +- Use `results_per_page`, optional, The maximum number of messages to be returned per page. +- Use `start_time`, optional, The start time where the range query. + +An iterator of `[ChatThreadItem]` is the response returned from listing threads + +```python +from azure.communication.chat import ChatClient, CommunicationTokenCredential +from datetime import datetime, timedelta + +token = "" +endpoint = "https://.communcationservices.azure.com" +chat_client = ChatClient(endpoint, CommunicationTokenCredential(token)) +start_time = datetime.utcnow() - timedelta(days=2) + +chat_threads = chat_client.list_chat_threads(results_per_page=5, start_time=start_time) +for chat_thread_item_page in chat_threads.by_page(): + for chat_thread_item in chat_thread_item_page: + print("thread id:", chat_thread_item.id) +``` + +### Update a thread topic + +Use `update_topic` method to update a thread's properties. 
The `topic` parameter describes the new topic of the thread.
+- Use `topic` to give the thread a new topic;
+
+```python
+topic = "new topic"
+chat_thread_client.update_topic(topic=topic)
+
+chat_thread = chat_thread_client.get_properties()
+
+assert chat_thread.topic == topic
+```
+
+### Delete a thread
+
+Use the `delete_chat_thread` method to delete a thread; `thread_id` is the unique ID of the thread.
+- Use `thread_id`, required, to specify the unique ID of the thread.
+```Python
+chat_client.delete_chat_thread(thread_id=thread_id)
+```
+
+## Message Operations
+
+### Send a message
+
+Use the `send_message` method to send a message to a thread identified by `thread_id`.
+
+- Use `content`, required, to provide the chat message content.
+- Use `chat_message_type`, optional, to provide the chat message type. Possible values include: `ChatMessageType.TEXT`,
+  `ChatMessageType.HTML`, `'text'`, `'html'`; if not specified, `ChatMessageType.TEXT` will be set.
+- Use `sender_display_name`, optional, to specify the display name of the sender; if not specified, an empty name will be set.
+
+`SendChatMessageResult` is the response returned from sending a message; it contains an `id`, which is the unique ID of the message.
+
+```Python
+from azure.communication.chat import ChatMessageType
+
+topic = "test topic"
+create_chat_thread_result = chat_client.create_chat_thread(topic)
+thread_id = create_chat_thread_result.chat_thread.id
+chat_thread_client = chat_client.get_chat_thread_client(create_chat_thread_result.chat_thread.id)
+
+content = 'hello world'
+sender_display_name = 'sender name'
+chat_message_type = ChatMessageType.TEXT
+
+# without specifying sender_display_name and chat_message_type
+send_message_result = chat_thread_client.send_message(content)
+send_message_result_id = send_message_result.id
+print("Message sent: id: ", send_message_result_id)
+
+# specifying sender_display_name and chat_message_type
+send_message_result_w_type = chat_thread_client.send_message(
+    content,
+    sender_display_name=sender_display_name,
+    chat_message_type=chat_message_type # equivalent to chat_message_type = 'text'
+)
+send_message_result_w_type_id = send_message_result_w_type.id
+print("Message sent: id: ", send_message_result_w_type_id)
+```
+
+### Get a message
+
+Use the `get_message` method to retrieve a message from the service; `message_id` is the unique ID of the message.
+- Use `message_id`, required, to specify the message id of an existing message.
+`ChatMessage` is the response returned from getting a message; it contains an `id`, which is the unique ID of the message. For the other fields, please refer to azure.communication.chat.ChatMessage.
+
+```python
+chat_message = chat_thread_client.get_message(message_id=send_message_result_id)
+print("get_chat_message succeeded, message id:", chat_message.id, "content: ", chat_message.content)
+```
+
+### List messages
+
+Use the `list_messages` method to retrieve messages from the service.
+- Use `results_per_page`, optional, the maximum number of messages to be returned per page.
+- Use `start_time`, optional, the start time of the range query.
+ +An iterator of `[ChatMessage]` is the response returned from listing messages + +```Python +from datetime import datetime, timedelta + +start_time = datetime.utcnow() - timedelta(days=1) + +chat_messages = chat_thread_client.list_messages(results_per_page=1, start_time=start_time) +for chat_message_page in chat_messages.by_page(): + for chat_message in chat_message_page: + print("ChatMessage: Id=", chat_message.id, "; Content=", chat_message.content.message) +``` + +### Update a message + +Use `update_message` to update a message identified by threadId and messageId. +- Use `message_id`,required, is the unique ID of the message. +- Use `content`, optional, is the message content to be updated; if not specified it is assigned to be empty + +```Python +content = "updated message content" +chat_thread_client.update_message(send_message_result_id, content=content) + +chat_message = chat_thread_client.get_message(message_id=send_message_result_id) + +assert chat_message.content.message == content +``` + +### Delete a message + +Use `delete_message` to delete a message. +- Use `message_id`, required, is the unique ID of the message. + +```python +chat_thread_client.delete_message(message_id=send_message_result_id) +``` + +## Thread Participant Operations + +### List thread participants + +Use `list_participants` to retrieve the participants of the thread. +- Use `results_per_page`, optional, The maximum number of participants to be returned per page. +- Use `skip`, optional, to skips participants up to a specified position in response. + +An iterator of `[ChatParticipant]` is the response returned from listing participants + +```python +chat_participants = chat_thread_client.list_participants(results_per_page=5, skip=5) +for chat_participant_page in chat_participants.by_page(): + for chat_participant in chat_participant_page: + print("ChatParticipant: ", chat_participant) +``` + +### Add thread participants + +Use `add_participants` method to add thread participants to the thread. + +- Use `thread_participants`, required, to list the `ChatParticipant` to be added to the thread; + - `user`, required, it is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() from User Access Tokens + + - `display_name`, optional, is the display name for the thread participant. + - `share_history_time`, optional, time from which the chat history is shared with the participant. + +A `list(tuple(ChatParticipant, ChatError))` is returned. When participant is successfully added, +an empty list is expected. In case of an error encountered while adding participant, the list is populated +with the failed participants along with the error that was encountered. 
+ +```Python +from azure.communication.identity import CommunicationIdentityClient +from azure.communication.chat import ChatParticipant +from datetime import datetime + +# create 2 users +identity_client = CommunicationIdentityClient.from_connection_string('') +new_users = [identity_client.create_user() for i in range(2)] + +# # conversely, you can also add an existing user to a chat thread; provided the user_id is known +# from azure.communication.chat import CommunicationUserIdentifier +# +# user_id = 'some user id' +# user_display_name = "Wilma Flintstone" +# new_user = CommunicationUserIdentifier(user_id) +# participant = ChatParticipant( +# identifier=new_user, +# display_name=user_display_name, +# share_history_time=datetime.utcnow()) + +participants = [] +for _user in new_users: + chat_participant = ChatParticipant( + identifier=_user, + display_name='Fred Flintstone', + share_history_time=datetime.utcnow() + ) + participants.append(chat_participant) + +response = chat_thread_client.add_participants(thread_participants=participants) + +def decide_to_retry(error, **kwargs): + """ + Insert some custom logic to decide if retry is applicable based on error + """ + return True + +# verify if all users has been successfully added or not +# in case of partial failures, you can retry to add all the failed participants +retry = [p for p, e in response if decide_to_retry(e)] +if retry: + chat_thread_client.add_participants(retry) +``` + +### Remove thread participant + +Use `remove_participant` method to remove thread participant from the thread identified by threadId. +`identifier` is the `CommunicationUserIdentifier` you created by CommunicationIdentityClient.create_user() from `azure-communication-identity` + +and was added into this chat thread. +- Use `identifier` to specify the `CommunicationUserIdentifier` you created +```python +chat_thread_client.remove_participant(identifier=new_user) + +# # conversely you can also do the following; provided the user_id is known +# from azure.communication.chat import CommunicationUserIdentifier +# +# user_id = 'some user id' +# chat_thread_client.remove_participant(identifier=CommunicationUserIdentifier(new_user)) + +``` + +## Events Operations + +### Send typing notification + +Use `send_typing_notification` method to post a typing notification event to a thread, on behalf of a user. + +```Python +chat_thread_client.send_typing_notification() +``` + +### Send read receipt + +Use `send_read_receipt` method to post a read receipt event to a thread, on behalf of a user. +- Use `message_id` to specify the id of the message whose read receipt is to be sent +```python +content='hello world' +send_message_result = chat_thread_client.send_message(content) +send_message_result_id = send_message_result.id +chat_thread_client.send_read_receipt(message_id=send_message_result_id) +``` + +### List read receipts + +Use `list_read_receipts` method retrieves read receipts for a thread. +- Use `results_per_page`, optional, The maximum number of read receipts to be returned per page. +- Use `skip`,optional, to skips read receipts up to a specified position in response. 
+ +An iterator of `[ChatMessageReadReceipt]` is the response returned from listing read receipts + +```python +read_receipts = chat_thread_client.list_read_receipts(results_per_page=5, skip=5) + +for read_receipt_page in read_receipts.by_page(): + for read_receipt in read_receipt_page: + print(read_receipt) + print(read_receipt.sender) + print(read_receipt.chat_message_id) + print(read_receipt.read_on) +``` + +## Sample Code + +These are code samples that show common scenario operations with the Azure Communication Chat client library. +The async versions of the samples (the python sample files appended with `_async`) show asynchronous operations, +and require Python 3.6 or later. +Before run the sample code, refer to Prerequisites + +to create a resource, then set some Environment Variables + +```bash +set AZURE_COMMUNICATION_SERVICE_ENDPOINT="https://.communcationservices.azure.com" +set COMMUNICATION_SAMPLES_CONNECTION_STRING="" + +pip install azure-communication-identity + +python samples\chat_client_sample.py +python samples\chat_client_sample_async.py +python samples\chat_thread_client_sample.py +python samples\chat_thread_client_sample_async.py +``` + +# Troubleshooting + +Running into issues? This section should contain details as to what to do there. + +# Next steps + +More sample code should go [here](https://github.com/Azure/azure-sdk-for-python/tree/azure-communication-chat_1.1.0b1/sdk/communication/azure-communication-chat/samples), along with links out to the appropriate example tests. + +# Contributing + +If you encounter any bugs or have suggestions, please file an issue in the [Issues]() section of the project. + ![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fsdk%2Ftemplate%2Fazure-template%2FREADME.png) - + diff --git a/docs-ref-services/preview/core-readme.md b/docs-ref-services/preview/core-readme.md index 25eae3b56dd9..6ec446239e5e 100644 --- a/docs-ref-services/preview/core-readme.md +++ b/docs-ref-services/preview/core-readme.md @@ -7,220 +7,220 @@ ms.devlang: python ms.service: azure-python ms.technology: azure --- -# Azure Core shared client library for Python - version 1.15.0b1 - - -Azure core provides shared exceptions and modules for Python SDK client libraries. -These libraries follow the [Azure SDK Design Guidelines for Python](https://azure.github.io/azure-sdk/python/guidelines/index.html) . - -If you are a client library developer, please reference [client library developer reference](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md) for more information. - -[Source code](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/) | [Package (Pypi)][package] | [API reference documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/) - -## Getting started - -Typically, you will not need to install azure core; -it will be installed when you install one of the client libraries using it. -In case you want to install it explicitly (to implement your own client library, for example), -you can find it [here](https://pypi.org/project/azure-core/). - -## Key concepts - -### Azure Core Library Exceptions - -#### AzureError -AzureError is the base exception for all errors. 
-```python -class AzureError(Exception): - def __init__(self, message, *args, **kwargs): - self.inner_exception = kwargs.get('error') - self.exc_type, self.exc_value, self.exc_traceback = sys.exc_info() - self.exc_type = self.exc_type.__name__ if self.exc_type else type(self.inner_exception) - self.exc_msg = "{}, {}: {}".format(message, self.exc_type, self.exc_value) # type: ignore - self.message = str(message) - super(AzureError, self).__init__(self.message, *args) -``` - -*message* is any message (str) to be associated with the exception. - -*args* are any additional args to be included with exception. - -*kwargs* are keyword arguments to include with the exception. Use the keyword *error* to pass in an internal exception. - -**The following exceptions inherit from AzureError:** - -#### ServiceRequestError -An error occurred while attempt to make a request to the service. No request was sent. - -#### ServiceResponseError -The request was sent, but the client failed to understand the response. -The connection may have timed out. These errors can be retried for idempotent or safe operations. - -#### HttpResponseError -A request was made, and a non-success status code was received from the service. -```python -class HttpResponseError(AzureError): - def __init__(self, message=None, response=None, **kwargs): - self.reason = None - self.response = response - if response: - self.reason = response.reason - message = "Operation returned an invalid status code '{}'".format(self.reason) - try: - try: - if self.error.error.code or self.error.error.message: - message = "({}) {}".format( - self.error.error.code, - self.error.error.message) - except AttributeError: - if self.error.message: #pylint: disable=no-member - message = self.error.message #pylint: disable=no-member - except AttributeError: - pass - super(HttpResponseError, self).__init__(message=message, **kwargs) -``` -*message* is the HTTP response error message (optional) - -*response* is the HTTP response (optional). - -*kwargs* are keyword arguments to include with the exception. - -**The following exceptions inherit from HttpResponseError:** - -#### DecodeError -An error raised during response deserialization. - -#### ResourceExistsError -An error response with status code 4xx. This will not be raised directly by the Azure core pipeline. - -#### ResourceNotFoundError -An error response, typically triggered by a 412 response (for update) or 404 (for get/post). - -#### ClientAuthenticationError -An error response with status code 4xx. This will not be raised directly by the Azure core pipeline. - -#### ResourceModifiedError -An error response with status code 4xx, typically 412 Conflict. This will not be raised directly by the Azure core pipeline. - -#### ResourceNotModifiedError -An error response with status code 304. This will not be raised directly by the Azure core pipeline. - -#### TooManyRedirectsError -An error raised when the maximum number of redirect attempts is reached. The maximum amount of redirects can be configured in the RedirectPolicy. -```python -class TooManyRedirectsError(HttpResponseError): - def __init__(self, history, *args, **kwargs): - self.history = history - message = "Reached maximum redirect attempts." - super(TooManyRedirectsError, self).__init__(message, *args, **kwargs) -``` - -*history* is used to document the requests/responses that resulted in redirected requests. - -*args* are any additional args to be included with exception. - -*kwargs* are keyword arguments to include with the exception. 
- -### Configurations - -When calling the methods, some properties can be configured by passing in as kwargs arguments. - -| Parameters | Description | -| --- | --- | -| headers | The HTTP Request headers. | -| request_id | The request id to be added into header. | -| user_agent | If specified, this will be added in front of the user agent string. | -| logging_enable| Use to enable per operation. Defaults to `False`. | -| logger | If specified, it will be used to log information. | -| response_encoding | The encoding to use if known for this service (will disable auto-detection). | -| proxies | Maps protocol or protocol and hostname to the URL of the proxy. | -| raw_request_hook | Callback function. Will be invoked on request. | -| raw_response_hook | Callback function. Will be invoked on response. | -| network_span_namer | A callable to customize the span name. | -| tracing_attributes | Attributes to set on all created spans. | -| permit_redirects | Whether the client allows redirects. Defaults to `True`. | -| redirect_max | The maximum allowed redirects. Defaults to `30`. | -| retry_total | Total number of retries to allow. Takes precedence over other counts. Default value is `10`. | -| retry_connect | How many connection-related errors to retry on. These are errors raised before the request is sent to the remote server, which we assume has not triggered the server to process the request. Default value is `3`. | -| retry_read | How many times to retry on read errors. These errors are raised after the request was sent to the server, so the request may have side-effects. Default value is `3`. | -| retry_status | How many times to retry on bad status codes. Default value is `3`. | -| retry_backoff_factor | A backoff factor to apply between attempts after the second try (most errors are resolved immediately by a second try without a delay). Retry policy will sleep for: `{backoff factor} * (2 ** ({number of total retries} - 1))` seconds. If the backoff_factor is 0.1, then the retry will sleep for [0.0s, 0.2s, 0.4s, ...] between retries. The default value is `0.8`. | -| retry_backoff_max | The maximum back off time. Default value is `120` seconds (2 minutes). | -| retry_mode | Fixed or exponential delay between attemps, default is `Exponential`. | -| timeout | Timeout setting for the operation in seconds, default is `604800`s (7 days). | -| connection_timeout | A single float in seconds for the connection timeout. Defaults to `300` seconds. | -| read_timeout | A single float in seconds for the read timeout. Defaults to `300` seconds. | -| connection_verify | SSL certificate verification. Enabled by default. Set to False to disable, alternatively can be set to the path to a CA_BUNDLE file or directory with certificates of trusted CAs. | -| connection_cert | Client-side certificates. You can specify a local cert to use as client side certificate, as a single file (containing the private key and the certificate) or as a tuple of both files' paths. | -| proxies | Dictionary mapping protocol or protocol and hostname to the URL of the proxy. | -| cookies | Dict or CookieJar object to send with the `Request`. | -| connection_data_block_size | The block size of data sent over the connection. Defaults to `4096` bytes. | - -### Async transport - -The async transport is designed to be opt-in. [AioHttp](https://pypi.org/project/aiohttp/) is one of the supported implementations of async transport. It is not installed by default. You need to install it separately. 
- -### Shared modules - -#### MatchConditions - -MatchConditions is an enum to describe match conditions. -```python -class MatchConditions(Enum): - Unconditionally = 1 - IfNotModified = 2 - IfModified = 3 - IfPresent = 4 - IfMissing = 5 -``` - -#### CaseInsensitiveEnumMeta - -A metaclass to support case-insensitive enums. -```python -from enum import Enum -from six import with_metaclass - -from azure.core import CaseInsensitiveEnumMeta - -class MyCustomEnum(with_metaclass(CaseInsensitiveEnumMeta, str, Enum)): - FOO = 'foo' - BAR = 'bar' -``` - -#### Null Sentinel Value - -A falsy sentinel object which is supposed to be used to specify attributes -with no data. This gets serialized to `null` on the wire. - -```python -from azure.core.serialization import NULL - -assert bool(NULL) is False - -foo = Foo( - attr=NULL -) -``` - -## Contributing -This project welcomes contributions and suggestions. Most contributions require -you to agree to a Contributor License Agreement (CLA) declaring that you have -the right to, and actually do, grant us the rights to use your contribution. -For details, visit [https://cla.microsoft.com](https://cla.microsoft.com). - -When you submit a pull request, a CLA-bot will automatically determine whether -you need to provide a CLA and decorate the PR appropriately (e.g., label, -comment). Simply follow the instructions provided by the bot. You will only -need to do this once across all repos using our CLA. - -This project has adopted the -[Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). -For more information, see the -[Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) -or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any -additional questions or comments. - - +# Azure Core shared client library for Python - version 1.15.0b1 + + +Azure core provides shared exceptions and modules for Python SDK client libraries. +These libraries follow the [Azure SDK Design Guidelines for Python](https://azure.github.io/azure-sdk/python/guidelines/index.html) . + +If you are a client library developer, please reference [client library developer reference](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md) for more information. + +[Source code](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/) | [Package (Pypi)][package] | [API reference documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-core_1.15.0b1/sdk/core/azure-core/) + +## Getting started + +Typically, you will not need to install azure core; +it will be installed when you install one of the client libraries using it. +In case you want to install it explicitly (to implement your own client library, for example), +you can find it [here](https://pypi.org/project/azure-core/). + +## Key concepts + +### Azure Core Library Exceptions + +#### AzureError +AzureError is the base exception for all errors. 
+```python +class AzureError(Exception): + def __init__(self, message, *args, **kwargs): + self.inner_exception = kwargs.get('error') + self.exc_type, self.exc_value, self.exc_traceback = sys.exc_info() + self.exc_type = self.exc_type.__name__ if self.exc_type else type(self.inner_exception) + self.exc_msg = "{}, {}: {}".format(message, self.exc_type, self.exc_value) # type: ignore + self.message = str(message) + super(AzureError, self).__init__(self.message, *args) +``` + +*message* is any message (str) to be associated with the exception. + +*args* are any additional args to be included with exception. + +*kwargs* are keyword arguments to include with the exception. Use the keyword *error* to pass in an internal exception. + +**The following exceptions inherit from AzureError:** + +#### ServiceRequestError +An error occurred while attempt to make a request to the service. No request was sent. + +#### ServiceResponseError +The request was sent, but the client failed to understand the response. +The connection may have timed out. These errors can be retried for idempotent or safe operations. + +#### HttpResponseError +A request was made, and a non-success status code was received from the service. +```python +class HttpResponseError(AzureError): + def __init__(self, message=None, response=None, **kwargs): + self.reason = None + self.response = response + if response: + self.reason = response.reason + message = "Operation returned an invalid status code '{}'".format(self.reason) + try: + try: + if self.error.error.code or self.error.error.message: + message = "({}) {}".format( + self.error.error.code, + self.error.error.message) + except AttributeError: + if self.error.message: #pylint: disable=no-member + message = self.error.message #pylint: disable=no-member + except AttributeError: + pass + super(HttpResponseError, self).__init__(message=message, **kwargs) +``` +*message* is the HTTP response error message (optional) + +*response* is the HTTP response (optional). + +*kwargs* are keyword arguments to include with the exception. + +**The following exceptions inherit from HttpResponseError:** + +#### DecodeError +An error raised during response deserialization. + +#### ResourceExistsError +An error response with status code 4xx. This will not be raised directly by the Azure core pipeline. + +#### ResourceNotFoundError +An error response, typically triggered by a 412 response (for update) or 404 (for get/post). + +#### ClientAuthenticationError +An error response with status code 4xx. This will not be raised directly by the Azure core pipeline. + +#### ResourceModifiedError +An error response with status code 4xx, typically 412 Conflict. This will not be raised directly by the Azure core pipeline. + +#### ResourceNotModifiedError +An error response with status code 304. This will not be raised directly by the Azure core pipeline. + +#### TooManyRedirectsError +An error raised when the maximum number of redirect attempts is reached. The maximum amount of redirects can be configured in the RedirectPolicy. +```python +class TooManyRedirectsError(HttpResponseError): + def __init__(self, history, *args, **kwargs): + self.history = history + message = "Reached maximum redirect attempts." + super(TooManyRedirectsError, self).__init__(message, *args, **kwargs) +``` + +*history* is used to document the requests/responses that resulted in redirected requests. + +*args* are any additional args to be included with exception. + +*kwargs* are keyword arguments to include with the exception. 
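+The snippet below is a minimal sketch of how calling code typically consumes these exception types. The `client` object and its `get_thing` method are placeholders for any operation on a client built with azure-core; the exception classes themselves are the real ones from `azure.core.exceptions` described above.
+
+```python
+from azure.core.exceptions import (
+    ClientAuthenticationError,
+    HttpResponseError,
+    ResourceNotFoundError,
+    ServiceRequestError,
+)
+
+def call_service(client):
+    # `client` stands in for any client library built on azure-core
+    try:
+        return client.get_thing("example")
+    except ResourceNotFoundError:
+        print("The requested resource does not exist.")
+    except ClientAuthenticationError:
+        print("Authentication failed; check your credential.")
+    except HttpResponseError as error:
+        # any other non-success status code returned by the service
+        print("Request failed:", error.status_code, error.message)
+    except ServiceRequestError as error:
+        # the request never left the client (e.g., a connection failure)
+        print("Request could not be sent:", error.message)
+```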
+ +### Configurations + +When calling the methods, some properties can be configured by passing in as kwargs arguments. + +| Parameters | Description | +| --- | --- | +| headers | The HTTP Request headers. | +| request_id | The request id to be added into header. | +| user_agent | If specified, this will be added in front of the user agent string. | +| logging_enable| Use to enable per operation. Defaults to `False`. | +| logger | If specified, it will be used to log information. | +| response_encoding | The encoding to use if known for this service (will disable auto-detection). | +| proxies | Maps protocol or protocol and hostname to the URL of the proxy. | +| raw_request_hook | Callback function. Will be invoked on request. | +| raw_response_hook | Callback function. Will be invoked on response. | +| network_span_namer | A callable to customize the span name. | +| tracing_attributes | Attributes to set on all created spans. | +| permit_redirects | Whether the client allows redirects. Defaults to `True`. | +| redirect_max | The maximum allowed redirects. Defaults to `30`. | +| retry_total | Total number of retries to allow. Takes precedence over other counts. Default value is `10`. | +| retry_connect | How many connection-related errors to retry on. These are errors raised before the request is sent to the remote server, which we assume has not triggered the server to process the request. Default value is `3`. | +| retry_read | How many times to retry on read errors. These errors are raised after the request was sent to the server, so the request may have side-effects. Default value is `3`. | +| retry_status | How many times to retry on bad status codes. Default value is `3`. | +| retry_backoff_factor | A backoff factor to apply between attempts after the second try (most errors are resolved immediately by a second try without a delay). Retry policy will sleep for: `{backoff factor} * (2 ** ({number of total retries} - 1))` seconds. If the backoff_factor is 0.1, then the retry will sleep for [0.0s, 0.2s, 0.4s, ...] between retries. The default value is `0.8`. | +| retry_backoff_max | The maximum back off time. Default value is `120` seconds (2 minutes). | +| retry_mode | Fixed or exponential delay between attempts, default is `Exponential`. | +| timeout | Timeout setting for the operation in seconds, default is `604800`s (7 days). | +| connection_timeout | A single float in seconds for the connection timeout. Defaults to `300` seconds. | +| read_timeout | A single float in seconds for the read timeout. Defaults to `300` seconds. | +| connection_verify | SSL certificate verification. Enabled by default. Set to False to disable, alternatively can be set to the path to a CA_BUNDLE file or directory with certificates of trusted CAs. | +| connection_cert | Client-side certificates. You can specify a local cert to use as client side certificate, as a single file (containing the private key and the certificate) or as a tuple of both files' paths. | +| proxies | Dictionary mapping protocol or protocol and hostname to the URL of the proxy. | +| cookies | Dict or CookieJar object to send with the `Request`. | +| connection_data_block_size | The block size of data sent over the connection. Defaults to `4096` bytes. | + +### Async transport + +The async transport is designed to be opt-in. [AioHttp](https://pypi.org/project/aiohttp/) is one of the supported implementations of async transport. It is not installed by default. You need to install it separately. 
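+As a brief, non-authoritative illustration of the two sections above: install `aiohttp` to opt in to the async transport, pass an `AioHttpTransport` instance to a client through the `transport` keyword, and override per-operation settings with the keyword arguments listed in the table. `SomeServiceClient`, `endpoint`, and `credential` are placeholders for whichever Azure SDK client library you are using; the transport class and keyword names come from azure-core.
+
+```bash
+pip install aiohttp
+```
+
+```python
+import asyncio
+from azure.core.pipeline.transport import AioHttpTransport
+
+async def main():
+    # SomeServiceClient stands in for any async client built on azure-core
+    client = SomeServiceClient(endpoint, credential, transport=AioHttpTransport())
+
+    # per-operation configuration is passed as keyword arguments
+    result = await client.some_operation(
+        retry_total=5,        # override the default retry count for this call
+        logging_enable=True,  # enable network trace logging for this call
+    )
+
+asyncio.run(main())
+```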
+ +### Shared modules + +#### MatchConditions + +MatchConditions is an enum to describe match conditions. +```python +class MatchConditions(Enum): + Unconditionally = 1 + IfNotModified = 2 + IfModified = 3 + IfPresent = 4 + IfMissing = 5 +``` + +#### CaseInsensitiveEnumMeta + +A metaclass to support case-insensitive enums. +```python +from enum import Enum +from six import with_metaclass + +from azure.core import CaseInsensitiveEnumMeta + +class MyCustomEnum(with_metaclass(CaseInsensitiveEnumMeta, str, Enum)): + FOO = 'foo' + BAR = 'bar' +``` + +#### Null Sentinel Value + +A falsy sentinel object which is supposed to be used to specify attributes +with no data. This gets serialized to `null` on the wire. + +```python +from azure.core.serialization import NULL + +assert bool(NULL) is False + +foo = Foo( + attr=NULL +) +``` + +## Contributing +This project welcomes contributions and suggestions. Most contributions require +you to agree to a Contributor License Agreement (CLA) declaring that you have +the right to, and actually do, grant us the rights to use your contribution. +For details, visit [https://cla.microsoft.com](https://cla.microsoft.com). + +When you submit a pull request, a CLA-bot will automatically determine whether +you need to provide a CLA and decorate the PR appropriately (e.g., label, +comment). Simply follow the instructions provided by the bot. You will only +need to do this once across all repos using our CLA. + +This project has adopted the +[Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). +For more information, see the +[Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) +or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any +additional questions or comments. + + [package]: https://pypi.org/project/azure-core/ - + diff --git a/docs-ref-services/preview/corehttp-readme.md b/docs-ref-services/preview/corehttp-readme.md index 939cf9d59956..1671cc93e7d6 100644 --- a/docs-ref-services/preview/corehttp-readme.md +++ b/docs-ref-services/preview/corehttp-readme.md @@ -21,7 +21,7 @@ To use `corehttp`, you will need to choose a transport implementation. `corehttp Synchronous transports: - `RequestsTransport` - A synchronous transport based on the [Requests](https://requests.readthedocs.io/en/master/) library. -- `HttpXTransport` - An synchronous transport based on the [HTTPX](https://www.python-httpx.org/) library. +- `HttpXTransport` - A synchronous transport based on the [HTTPX](https://www.python-httpx.org/) library. Asynchronous transports: - `AioHttpTransport` - An asynchronous transport based on the [aiohttp](https://docs.aiohttp.org/en/stable/) library. diff --git a/docs-ref-services/preview/maps-geolocation-readme.md b/docs-ref-services/preview/maps-geolocation-readme.md index abf87bc62763..ee530a19a4da 100644 --- a/docs-ref-services/preview/maps-geolocation-readme.md +++ b/docs-ref-services/preview/maps-geolocation-readme.md @@ -6,207 +6,207 @@ ms.topic: reference ms.devlang: python ms.service: maps --- -# Azure Maps Geolocation Package client library for Python - version 1.0.0b1 - - -This package contains a Python SDK for Azure Maps Services for Geolocation. 
-Read more about Azure Maps Services [here](/azure/azure-maps/) - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation) | [API reference documentation](/rest/api/maps/geolocation) | [Product documentation](/azure/azure-maps/) - -## _Disclaimer_ - -_Azure SDK Python packages support for Python 2.7 has ended 01 January 2022. For more information and questions, please refer to _ - -## Getting started - -### Prerequisites - -- Python 3.7 or later is required to use this package. -- An [Azure subscription][azure_subscription] and an [Azure Maps account](/azure/azure-maps/how-to-manage-account-keys). -- A deployed Maps Services resource. You can create the resource via [Azure Portal][azure_portal] or [Azure CLI][azure_cli]. - -If you use Azure CLI, replace `` and `` of your choice, and select a proper [pricing tier](/azure/azure-maps/choose-pricing-tier) based on your needs via the `` parameter. Please refer to [this page](/cli/azure/maps/account?view=azure-cli-latest#az_maps_account_create) for more details. - -```bash -az maps account create --resource-group --account-name --sku -``` - -### Install the package - -Install the Azure Maps Service Geolocation SDK. - -```bash -pip install azure-maps-geolocation -``` - -### Create and Authenticate the MapsGeolocationClient - -To create a client object to access the Azure Maps Geolocation API, you will need a **credential** object. Azure Maps Geolocation client also support two ways to authenticate. - -#### 1. Authenticate with a Subscription Key Credential - -You can authenticate with your Azure Maps Subscription Key. -Once the Azure Maps Subscription Key is created, set the value of the key as environment variable: `AZURE_SUBSCRIPTION_KEY`. -Then pass an `AZURE_SUBSCRIPTION_KEY` as the `credential` parameter into an instance of [AzureKeyCredential][azure-key-credential]. - -```python -from azure.core.credentials import AzureKeyCredential -from azure.maps.geolocation import MapsGeolocationClient - -credential = AzureKeyCredential(os.environ.get("AZURE_SUBSCRIPTION_KEY")) - -geolocation_client = MapsGeolocationClient( - credential=credential, -) -``` - -#### 2. Authenticate with an Azure Active Directory credential - -You can authenticate with [Azure Active Directory (AAD) token credential][maps_authentication_aad] using the [Azure Identity library][azure_identity]. -Authentication by using AAD requires some initial setup: - -- Install [azure-identity][azure-key-credential] -- Register a [new AAD application][register_aad_app] -- Grant access to Azure Maps by assigning the suitable role to your service principal. Please refer to the [Manage authentication page][manage_aad_auth_page]. - -After setup, you can choose which type of [credential][azure-key-credential] from `azure.identity` to use. -As an example, [DefaultAzureCredential][default_azure_credential] -can be used to authenticate the client: - -Next, set the values of the client ID, tenant ID, and client secret of the AAD application as environment variables: -`AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, `AZURE_CLIENT_SECRET` - -You will also need to specify the Azure Maps resource you intend to use by specifying the `clientId` in the client options. The Azure Maps resource client id can be found in the Authentication sections in the Azure Maps resource. Please refer to the [documentation][how_to_manage_authentication] on how to find it. 
- -```python -from azure.maps.geolocation import MapsGeolocationClient -from azure.identity import DefaultAzureCredential - -credential = DefaultAzureCredential() -geolocation_client = MapsGeolocationClient( - client_id="", - credential=credential -) -``` - -## Key concepts - -The Azure Maps Geolocation client library for Python allows you to interact with each of the components through the use of a dedicated client object. - -### Sync Clients - -`MapsGeolocationClient` is the primary client for developers using the Azure Maps Geolocation client library for Python. -Once you initialized a `MapsGeolocationClient` class, you can explore the methods on this client object to understand the different features of the Azure Maps Geolocation service that you can access. - -### Async Clients - -This library includes a complete async API supported on Python 3.5+. To use it, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/). -See [azure-core documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md#transport) for more information. - -Async clients and credentials should be closed when they're no longer needed. These -objects are async context managers and define async `close` methods. - -## Examples - -The following sections provide several code snippets covering some of the most common Azure Maps Geolocation tasks, including: - -- [Get Geolocation](#get-geolocation) - -### Get Geolocation - -This service will return the ISO country code for the provided IP address. Developers can use this information to block or alter certain content based on geographical locations where the application is being viewed from. - -```python -from azure.maps.geolocation import MapsGeolocationClient - -BLOCK_COUNTRY_LIST = ['US', 'TW', 'AF', 'AX', 'DL'] -INCOME_IP_ADDRESS = "2001:4898:80e8:b::189" -geolocation_result = client.get_country_code(ip_address=INCOME_IP_ADDRESS) - -result_country_code = geolocation_result.iso_code - -if result_country_code in BLOCK_COUNTRY_LIST: - raise Exception("These IP address is from forebiden country") -``` - -## Troubleshooting - -### General - -Maps Geolocation clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/core/azure-core/README.md). - -This list can be used for reference to catch thrown exceptions. To get the specific error code of the exception, use the `error_code` attribute, i.e, `exception.error_code`. - -### Logging - -This library uses the standard [logging](https://docs.python.org/3/library/logging.html) library for logging. -Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level. - -Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the `logging_enable` argument: - -```python -import sys -import logging -from azure.maps.geolocation import MapsGeolocationClient - -# Create a logger for the 'azure.maps.geolocation' SDK -logger = logging.getLogger('azure.maps.geolocation') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -``` - -### Additional - -Still running into issues? If you encounter any bugs or have suggestions, please file an issue in the [Issues]() section of the project. 
- -## Next steps - -### More sample code - -Get started with our [Maps Geolocation samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples) ([Async Version samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples/async_samples)). - -Several Azure Maps Geolocation Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Maps Geolocation - -```bash -set AZURE_SUBSCRIPTION_KEY="" - -pip install azure-maps-geolocation --pre - -python samples/sample_authentication.py -python sample/sample_get_country_code.py -``` - -> Notes: `--pre` flag can be optionally added, it is to include pre-release and development versions for `pip install`. By default, `pip` only finds stable versions. - -Further detail please refer to [Samples Introduction](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples/README.md) - -### Additional documentation - -For more extensive documentation on Azure Maps Geolocation, see the [Azure Maps Geolocation documentation](/rest/api/maps/geolocation) on docs.microsoft.com. - -## Contributing - -This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit . - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. - - -[azure_subscription]: https://azure.microsoft.com/free/ -[azure_identity]: https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/identity/azure-identity -[azure_portal]: https://portal.azure.com -[azure_cli]: /cli/azure -[azure-key-credential]: https://aka.ms/azsdk/python/core/azurekeycredential -[default_azure_credential]: https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/identity/azure-identity#defaultazurecredential -[register_aad_app]: /powershell/module/Az.Resources/New-AzADApplication?view=azps-8.0.0 -[maps_authentication_aad]: /azure/azure-maps/how-to-manage-authentication -[create_new_application_registration]: https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/applicationsListBlade/quickStartType/AspNetWebAppQuickstartPage/sourceType/docs -[manage_aad_auth_page]: /azure/azure-maps/how-to-manage-authentication +# Azure Maps Geolocation Package client library for Python - version 1.0.0b1 + + +This package contains a Python SDK for Azure Maps Services for Geolocation. 
+Read more about Azure Maps Services [here](/azure/azure-maps/) + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation) | [API reference documentation](/rest/api/maps/geolocation) | [Product documentation](/azure/azure-maps/) + +## _Disclaimer_ + +_Azure SDK Python packages support for Python 2.7 has ended 01 January 2022. For more information and questions, please refer to _ + +## Getting started + +### Prerequisites + +- Python 3.7 or later is required to use this package. +- An [Azure subscription][azure_subscription] and an [Azure Maps account](/azure/azure-maps/how-to-manage-account-keys). +- A deployed Maps Services resource. You can create the resource via [Azure Portal][azure_portal] or [Azure CLI][azure_cli]. + +If you use Azure CLI, replace `` and `` of your choice, and select a proper [pricing tier](/azure/azure-maps/choose-pricing-tier) based on your needs via the `` parameter. Please refer to [this page](/cli/azure/maps/account?view=azure-cli-latest#az_maps_account_create) for more details. + +```bash +az maps account create --resource-group --account-name --sku +``` + +### Install the package + +Install the Azure Maps Service Geolocation SDK. + +```bash +pip install azure-maps-geolocation +``` + +### Create and Authenticate the MapsGeolocationClient + +To create a client object to access the Azure Maps Geolocation API, you will need a **credential** object. Azure Maps Geolocation client also support two ways to authenticate. + +#### 1. Authenticate with a Subscription Key Credential + +You can authenticate with your Azure Maps Subscription Key. +Once the Azure Maps Subscription Key is created, set the value of the key as environment variable: `AZURE_SUBSCRIPTION_KEY`. +Then pass an `AZURE_SUBSCRIPTION_KEY` as the `credential` parameter into an instance of [AzureKeyCredential][azure-key-credential]. + +```python +from azure.core.credentials import AzureKeyCredential +from azure.maps.geolocation import MapsGeolocationClient + +credential = AzureKeyCredential(os.environ.get("AZURE_SUBSCRIPTION_KEY")) + +geolocation_client = MapsGeolocationClient( + credential=credential, +) +``` + +#### 2. Authenticate with an Azure Active Directory credential + +You can authenticate with [Azure Active Directory (AAD) token credential][maps_authentication_aad] using the [Azure Identity library][azure_identity]. +Authentication by using AAD requires some initial setup: + +- Install [azure-identity][azure-key-credential] +- Register a [new AAD application][register_aad_app] +- Grant access to Azure Maps by assigning the suitable role to your service principal. Please refer to the [Manage authentication page][manage_aad_auth_page]. + +After setup, you can choose which type of [credential][azure-key-credential] from `azure.identity` to use. +As an example, [DefaultAzureCredential][default_azure_credential] +can be used to authenticate the client: + +Next, set the values of the client ID, tenant ID, and client secret of the AAD application as environment variables: +`AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, `AZURE_CLIENT_SECRET` + +You will also need to specify the Azure Maps resource you intend to use by specifying the `clientId` in the client options. The Azure Maps resource client id can be found in the Authentication sections in the Azure Maps resource. Please refer to the [documentation][how_to_manage_authentication] on how to find it. 
+
+```python
+from azure.maps.geolocation import MapsGeolocationClient
+from azure.identity import DefaultAzureCredential
+
+credential = DefaultAzureCredential()
+geolocation_client = MapsGeolocationClient(
+    client_id="",
+    credential=credential
+)
+```
+
+## Key concepts
+
+The Azure Maps Geolocation client library for Python allows you to interact with each of the components through the use of a dedicated client object.
+
+### Sync Clients
+
+`MapsGeolocationClient` is the primary client for developers using the Azure Maps Geolocation client library for Python.
+Once you have initialized a `MapsGeolocationClient` instance, you can explore the methods on this client object to understand the different features of the Azure Maps Geolocation service that you can access.
+
+### Async Clients
+
+This library includes a complete async API supported on Python 3.5+. To use it, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/).
+See [azure-core documentation](https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md#transport) for more information.
+
+Async clients and credentials should be closed when they're no longer needed. These
+objects are async context managers and define async `close` methods.
+
+## Examples
+
+The following sections provide several code snippets covering some of the most common Azure Maps Geolocation tasks, including:
+
+- [Get Geolocation](#get-geolocation)
+
+### Get Geolocation
+
+This service will return the ISO country code for the provided IP address. Developers can use this information to block or alter certain content based on geographical locations where the application is being viewed from.
+
+```python
+from azure.maps.geolocation import MapsGeolocationClient
+
+# Assumes `client` is a MapsGeolocationClient created as shown in the authentication examples above.
+BLOCK_COUNTRY_LIST = ['US', 'TW', 'AF', 'AX', 'DL']
+INCOME_IP_ADDRESS = "2001:4898:80e8:b::189"
+geolocation_result = client.get_country_code(ip_address=INCOME_IP_ADDRESS)
+
+result_country_code = geolocation_result.iso_code
+
+if result_country_code in BLOCK_COUNTRY_LIST:
+    raise Exception("This IP address is from a forbidden country")
+```
+
+## Troubleshooting
+
+### General
+
+Maps Geolocation clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/core/azure-core/README.md).
+
+This list can be used for reference to catch thrown exceptions. To get the specific error code of the exception, use the `error_code` attribute, i.e., `exception.error_code`.
+
+### Logging
+
+This library uses the standard [logging](https://docs.python.org/3/library/logging.html) library for logging.
+Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO level.
+
+Detailed DEBUG level logging, including request/response bodies and unredacted headers, can be enabled on a client with the `logging_enable` argument:
+
+```python
+import sys
+import logging
+from azure.maps.geolocation import MapsGeolocationClient
+
+# Create a logger for the 'azure.maps.geolocation' SDK
+logger = logging.getLogger('azure.maps.geolocation')
+logger.setLevel(logging.DEBUG)
+
+# Configure a console output
+handler = logging.StreamHandler(stream=sys.stdout)
+logger.addHandler(handler)
+
+```
+
+### Additional
+
+Still running into issues? If you encounter any bugs or have suggestions, please file an issue in the [Issues]() section of the project.
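+
+Before filing an issue, it can help to capture the specific error code mentioned under the General section above. The following is a minimal, hedged sketch; it assumes `geolocation_client` was constructed as shown in the authentication examples, and uses `getattr` because `error_code` is only populated when the service supplies one.
+
+```python
+from azure.core.exceptions import HttpResponseError
+
+try:
+    geolocation_result = geolocation_client.get_country_code(ip_address="2001:4898:80e8:b::189")
+except HttpResponseError as error:
+    # The service-specific error code surfaced by Azure Core, when available.
+    print("Error code:", getattr(error, "error_code", None))
+    print("Message:", error.message)
+```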
+
+## Next steps
+
+### More sample code
+
+Get started with our [Maps Geolocation samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples) ([Async Version samples](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples/async_samples)).
+
+Several Azure Maps Geolocation Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Maps Geolocation.
+
+```bash
+set AZURE_SUBSCRIPTION_KEY=""
+
+pip install azure-maps-geolocation --pre
+
+python samples/sample_authentication.py
+python samples/sample_get_country_code.py
+```
+
+> Note: The `--pre` flag can optionally be added to include pre-release and development versions for `pip install`. By default, `pip` only finds stable versions.
+
+For further details, please refer to the [Samples Introduction](https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/maps/azure-maps-geolocation/samples/README.md).
+
+### Additional documentation
+
+For more extensive documentation on Azure Maps Geolocation, see the [Azure Maps Geolocation documentation](/rest/api/maps/geolocation) on docs.microsoft.com.
+
+## Contributing
+
+This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit .
+
+When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
+
+This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
+ + +[azure_subscription]: https://azure.microsoft.com/free/ +[azure_identity]: https://github.com/Azure/azure-sdk-for-python/blob/azure-maps-geolocation_1.0.0b1/sdk/identity/azure-identity +[azure_portal]: https://portal.azure.com +[azure_cli]: /cli/azure +[azure-key-credential]: https://aka.ms/azsdk/python/core/azurekeycredential +[default_azure_credential]: https://github.com/Azure/azure-sdk-for-python/tree/azure-maps-geolocation_1.0.0b1/sdk/identity/azure-identity#defaultazurecredential +[register_aad_app]: /powershell/module/Az.Resources/New-AzADApplication?view=azps-8.0.0 +[maps_authentication_aad]: /azure/azure-maps/how-to-manage-authentication +[create_new_application_registration]: https://portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/applicationsListBlade/quickStartType/AspNetWebAppQuickstartPage/sourceType/docs +[manage_aad_auth_page]: /azure/azure-maps/how-to-manage-authentication [how_to_manage_authentication]: /azure/azure-maps/how-to-manage-authentication#view-authentication-details - + diff --git a/docs-ref-services/preview/mixedreality-remoterendering-readme.md b/docs-ref-services/preview/mixedreality-remoterendering-readme.md index f88715cb6017..7a6d382fac1d 100644 --- a/docs-ref-services/preview/mixedreality-remoterendering-readme.md +++ b/docs-ref-services/preview/mixedreality-remoterendering-readme.md @@ -6,385 +6,385 @@ ms.topic: reference ms.devlang: python ms.service: remoterendering --- -[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) - -# Azure Remote Rendering client library for Python - version 1.0.0b2 - - -Azure Remote Rendering (ARR) is a service that enables you to render high-quality, interactive 3D content in the cloud and stream it in real time to devices, such as the HoloLens 2. - -This SDK offers functionality to convert assets to the format expected by the runtime, and also to manage -the lifetime of remote rendering sessions. - -This SDK supports version "2021-01-01" of the [Remote Rendering REST API](/rest/api/mixedreality/2021-01-01/remote-rendering). - -> NOTE: Once a session is running, a client application will connect to it using one of the "runtime SDKs". -> These SDKs are designed to best support the needs of an interactive application doing 3d rendering. -> They are available in ([.net](/dotnet/api/microsoft.azure.remoterendering) -> or ([C++](/cpp/api/remote-rendering/)). - -[Product documentation](/azure/remote-rendering/) - -## _Disclaimer_ - -_Azure SDK Python packages support for Python 2.7 has ended 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691_ - -# Getting started - -## Prerequisites - -You will need an [Azure subscription](https://azure.microsoft.com/free/dotnet/) and an [Azure Remote Rendering account](/azure/remote-rendering/how-tos/create-an-account) to use this package. - -In order to follow this tutorial it is highly recommended that you [link your storage account with your ARR account](/azure/remote-rendering/how-tos/create-an-account#link-storage-accounts). - -## Install the package - -Install the Azure Remote Rendering client library for Python with [pip][pip]: - -```bash -pip install --pre azure-mixedreality-remoterendering -``` - -## Create and authenticate the client - -Constructing a remote rendering client requires an authenticated account, and a remote rendering endpoint. 
-For an account created in the eastus region, the account domain will have the form "eastus.mixedreality.azure.com". -There are several different forms of authentication: - -- Account Key authentication - - Account keys enable you to get started quickly with using Azure Remote Rendering. But before you deploy your application to production, we recommend that you update your app to use Azure AD authentication. -- Azure Active Directory (AD) token authentication - - If you're building an enterprise application and your company is using Azure AD as its identity system, you can use user-based Azure AD authentication in your app. You then grant access to your Azure Remote Rendering accounts by using your existing Azure AD security groups. You can also grant access directly to users in your organization. - - Otherwise, we recommend that you obtain Azure AD tokens from a web service that supports your app. We recommend this method for production applications because it allows you to avoid embedding the credentials for access in your client application. - -See [here](/azure/remote-rendering/how-tos/authentication) for detailed instructions and information. - -In all the following examples, the client is constructed with a `endpoint` parameter. -The available endpoints correspond to regions, and the choice of endpoint determines the region in which the service performs its work. -An example is `https://remoterendering.eastus2.mixedreality.azure.com`. - -A full list of endpoints in supported regions can be found in the [Azure Remote Rendering region list](/azure/remote-rendering/reference/regions). - -> NOTE: For converting assets, it is preferable to pick a region close to the storage containing the assets. - -> NOTE: For rendering, it is strongly recommended that you pick the closest region to the devices using the service. -> The time taken to communicate with the server impacts the quality of the experience. 
- -### Authenticating with account key authentication - -Use the `AzureKeyCredential` object to use an account identifier and account key to authenticate: - -```python -from azure.core.credentials import AzureKeyCredential -from azure.mixedreality.remoterendering import RemoteRenderingClient - -account_id = "" -account_domain = "" -account_key = "" -arr_endpoint = "" - -key_credential = AzureKeyCredential(account_key) -client = RemoteRenderingClient( - endpoint=arr_endpoint, - account_id=account_id, - account_domain=account_domain, - credential=key_credential -) -``` - -### Authenticating with a static access token - -You can pass a Mixed Reality access token as an `AccessToken` previously retrieved from the -[Mixed Reality STS service](https://github.com/Azure/azure-sdk-for-python/tree/azure-mixedreality-remoterendering_1.0.0b2/sdk/mixedreality/azure-mixedreality-authentication) -to be used with a Mixed Reality client library: - -```python -from azure.mixedreality.authentication import MixedRealityStsClient -from azure.mixedreality.remoterendering import RemoteRenderingClient -account_id = "" -account_domain = "" -account_key = "" - -key_credential = AzureKeyCredential(account_key) - -client = MixedRealityStsClient(account_id, account_domain, key_credential) - -token = client.get_token() - -client = RemoteRenderingClient( - endpoint=arr_endpoint, - account_id=account_id, - account_domain=account_domain, - credential=token, -) -``` - -### Authenticating with an Azure Active Directory Credential - -Account key authentication is used in most of the examples, but you can also authenticate with Azure Active Directory -using the [Azure Identity library][azure_identity]. This is the recommended method for production applications. To use -the [DefaultAzureCredential][defaultazurecredential] provider shown below, or other credential providers provided with -the Azure SDK, please install the `@azure/identity` package: - -You will also need to [register a new AAD application][register_aad_app] and grant access to your Mixed Reality resource -by assigning the appropriate role for your Mixed Reality service to your service principal. - -```python -from azure.identity import DefaultAzureCredential -from azure.mixedreality.remoterendering import RemoteRenderingClient - -account_id = "" -account_domain = "" -default_credential = DefaultAzureCredential() - -client = RemoteRenderingClient( - endpoint=arr_endpoint, - account_id=account_id, - account_domain=account_domain, - credential=default_credential -) -``` - -## Key concepts - -### RemoteRenderingClient - -The `RemoteRenderingClient` is the client library used to access the RemoteRenderingService. -It provides methods to create and manage asset conversions and rendering sessions. - -### Long-Running Operations -Long-running operations are operations which consist of an initial request sent to the service to start an operation, -followed by polling the service at intervals to determine whether the operation has completed or failed, and if it has -succeeded, to get the result. - -Methods that convert assets, or spin up rendering sessions are modelled as long-running operations. -The client exposes a `begin_` method that returns an LROPoller or AsyncLROPoller. -Callers should wait for the operation to complete by calling result() on the poller object returned from the -`begin_` method. Sample code snippets are provided to illustrate using long-running operations -[below](#examples "Examples"). 
- -## Examples - -- [Convert an asset](#convert-an-asset) -- [List conversions](#list-conversions) -- [Create a session](#create-a-session) -- [Extend the lease time of a session](#extend-the-lease-time-of-a-session) -- [List sessions](#list-sessions) -- [Stop a session](#stop-a-session) - -### Convert an asset - -We assume that a RemoteRenderingClient has been constructed as described in the [Authenticate the Client](#authenticate-the-client) section. -The following snippet describes how to request that "box.fbx", found at at a path of "/input/box/box.fbx" of the blob container at the given storage container URI, gets converted. - -Converting an asset can take anywhere from seconds to hours. -This code uses an existing conversion poller and polls regularly until the conversion has finished or failed. -The default polling period is 5 seconds. -Note that a conversion poller can be retrieved using the client.get_asset_conversion_poller using the id of an existing conversion and a client. - -Once the conversion process finishes the output is written to the specified output container under a path of "/output//box.arrAsset". -The path can be retrieved from the output.asset_uri of a successful conversion. - -```python - conversion_id = str(uuid.uuid4()) # A randomly generated uuid is a good choice for a conversion_id. - - input_settings = AssetConversionInputSettings( - storage_container_uri="", - relative_input_asset_path="box.fbx", - blob_prefix="input/box" - ) - output_settings = AssetConversionOutputSettings( - storage_container_uri="", - blob_prefix="output/"+conversion_id, - output_asset_filename="convertedBox.arrAsset" #if no output_asset_filename .arrAsset will be the name of the resulting converted asset - ) - try: - conversion_poller = client.begin_asset_conversion( - conversion_id=conversion_id, - input_settings=input_settings, - output_settings=output_settings - ) - - print("Conversion with id:", conversion_id, "created. Waiting for completion.") - conversion = conversion_poller.result() - print("conversion output:", conversion.output.asset_uri) - - except Exception as e: - print("Conversion failed", e) -``` - -### List conversions - -You can get information about your conversions using the `list_asset_conversions` method. -This method may return conversions which have yet to start, conversions which are running and conversions which have finished. -In this example, we list all conversions and print id and creation ad as well as the output asset URIs of successful conversions. - -```python - print("conversions:") - for c in client.list_asset_conversions(): - print( - "\t conversion: id:", - c.id, - "status:", - c.status, - "created on:", - c.created_on.strftime("%m/%d/%Y, %H:%M:%S"), - ) - if c.status == AssetConversionStatus.SUCCEEDED: - print("\t\tconversion result URI:", c.output.asset_uri) -``` - -### Create a session - -We assume that a RemoteRenderingClient has been constructed as described in the [Authenticate the Client](#authenticate-the-client) section. -The following snippet describes how to request that a new rendering session be started. - -```python - print("starting rendering session with id:", session_id) - try: - session_poller = client.begin_rendering_session( - session_id=session_id, size=RenderingSessionSize.STANDARD, lease_time_minutes=20 - ) - print( - "rendering session with id:", - session_id, - "created. Waiting for session to be ready.", - ) - session = session_poller.result() - print( - "session with id:", - session.id, - "is ready. 
lease_time_minutes:", - session.lease_time_minutes, - ) - except Exception as e: - print("Session startup failed", e) -``` - -### Extend the lease time of a session - -If a session is approaching its maximum lease time, but you want to keep it alive, you will need to make a call to -increase its maximum lease time. -This example shows how to query the current properties and then extend the lease if it will expire soon. - -> NOTE: The runtime SDKs also offer this functionality, and in many typical scenarios, you would use them to -> extend the session lease. - -```python - session = client.get_rendering_session(session_id) - if session.lease_time_minutes - session.elapsed_time_minutes < 2: - session = client.update_rendering_session( - session_id=session_id, lease_time_minutes=session.lease_time_minutes + 10 - ) -``` - -### List sessions - -You can get information about your sessions using the `list_rendering_sessions` method of the client. -This method may return sessions which have yet to start and sessions which are ready. - -```python - print("sessions:") - rendering_sessions = client.list_rendering_sessions() - for session in rendering_sessions: - print( - "\t session: id:", - session.id, - "status:", - session.status, - "created on:", - session.created_on.strftime("%m/%d/%Y, %H:%M:%S"), - ) -``` - -### Stop a Session - -The following code will stop a running session with given id. Since running sessions incur ongoing costs it is -recommended to stop sessions which are not needed anymore. - -```python - client.stop_rendering_session(session_id) - print("session with id:", session_id, "stopped") -``` - -## Troubleshooting - -For general troubleshooting advice concerning Azure Remote Rendering, see [the Troubleshoot page](/azure/remote-rendering/resources/troubleshoot) for remote rendering at docs.microsoft.com. - -The client methods and waiting for poller results will throw exceptions if the request failed. - -If the asset in a conversion is invalid, the conversion poller will throw an exception with an error containing details. -Once the conversion service is able to process the file, a <assetName>.result.json file will be written to the output container. -If the input asset is invalid, then that file will contain a more detailed description of the problem. - -Similarly, sometimes when a session is requested, the session ends up in an error state. -The poller will throw an exception containing details of the error in this case. Session errors are usually transient -and requesting a new session should succeed. - -### Logging - -This library uses the standard -[logging][python_logging] library for logging. - -Basic information about HTTP sessions (URLs, headers, etc.) is logged at `INFO` level. - -Detailed `DEBUG` level logging, including request/response bodies and **unredacted** -headers, can be enabled on the client or per-operation with the `logging_enable` keyword argument. - -See full SDK logging documentation with examples [here][sdk_logging_docs]. - -### Optional Configuration - -Optional keyword arguments can be passed in at the client and per-operation level. -The azure-core [reference documentation][azure_core_ref_docs] -describes available configurations for retries, logging, transport protocols, and more. - -### Exceptions - -The Remote Rendering client library will raise exceptions defined in [Azure Core][azure_core_exceptions]. - -### Async APIs - -This library also includes a complete async API supported on Python 3.7+. 
To use it, you must -first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/). Async clients -are found under the `azure.mixedreality.remoterendering.aio` namespace. - - - -## Next steps - -- Read the [Product documentation](/azure/remote-rendering/) -- Learn about the runtime SDKs: - - .NET: /dotnet/api/microsoft.azure.remoterendering - - C++: /cpp/api/remote-rendering/ - -## Contributing - -This project welcomes contributions and suggestions. Most contributions require you to agree to a -Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us -the rights to use your contribution. For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide -a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions -provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). -For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or -contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. - -If you'd like to contribute to this library, please read the -[contributing guide](https://github.com/Azure/azure-sdk-for-python/blob/azure-mixedreality-remoterendering_1.0.0b2/CONTRIBUTING.md) to learn more about how -to build and test the code. - - -![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fsdk%remoterendering%2Fazure-mixedreality-remoterendering%2FREADME.png) - -[azure_core_ref_docs]: https://aka.ms/azsdk/python/core/docs -[azure_core_exceptions]: https://aka.ms/azsdk/python/core/docs#module-azure.core.exceptions -[azure_sub]: https://azure.microsoft.com/free/ -[azure_portal]: https://portal.azure.com -[azure_identity]: https://github.com/Azure/azure-sdk-for-python/tree/azure-mixedreality-remoterendering_1.0.0b2/sdk/identity/azure-identity - -[pip]: https://pypi.org/project/pip/ +[![Build Status](https://dev.azure.com/azure-sdk/public/_apis/build/status/azure-sdk-for-python.client?branchName=master)](https://dev.azure.com/azure-sdk/public/_build/latest?definitionId=46?branchName=master) + +# Azure Remote Rendering client library for Python - version 1.0.0b2 + + +Azure Remote Rendering (ARR) is a service that enables you to render high-quality, interactive 3D content in the cloud and stream it in real time to devices, such as the HoloLens 2. + +This SDK offers functionality to convert assets to the format expected by the runtime, and also to manage +the lifetime of remote rendering sessions. + +This SDK supports version "2021-01-01" of the [Remote Rendering REST API](/rest/api/mixedreality/2021-01-01/remote-rendering). + +> NOTE: Once a session is running, a client application will connect to it using one of the "runtime SDKs". +> These SDKs are designed to best support the needs of an interactive application doing 3d rendering. +> They are available in ([.net](/dotnet/api/microsoft.azure.remoterendering) +> or ([C++](/cpp/api/remote-rendering/)). + +[Product documentation](/azure/remote-rendering/) + +## _Disclaimer_ + +_Azure SDK Python packages support for Python 2.7 has ended 01 January 2022. 
For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691_ + +# Getting started + +## Prerequisites + +You will need an [Azure subscription](https://azure.microsoft.com/free/dotnet/) and an [Azure Remote Rendering account](/azure/remote-rendering/how-tos/create-an-account) to use this package. + +In order to follow this tutorial it is highly recommended that you [link your storage account with your ARR account](/azure/remote-rendering/how-tos/create-an-account#link-storage-accounts). + +## Install the package + +Install the Azure Remote Rendering client library for Python with [pip][pip]: + +```bash +pip install --pre azure-mixedreality-remoterendering +``` + +## Create and authenticate the client + +Constructing a remote rendering client requires an authenticated account, and a remote rendering endpoint. +For an account created in the eastus region, the account domain will have the form "eastus.mixedreality.azure.com". +There are several different forms of authentication: + +- Account Key authentication + - Account keys enable you to get started quickly with using Azure Remote Rendering. But before you deploy your application to production, we recommend that you update your app to use Azure AD authentication. +- Azure Active Directory (AD) token authentication + - If you're building an enterprise application and your company is using Azure AD as its identity system, you can use user-based Azure AD authentication in your app. You then grant access to your Azure Remote Rendering accounts by using your existing Azure AD security groups. You can also grant access directly to users in your organization. + - Otherwise, we recommend that you obtain Azure AD tokens from a web service that supports your app. We recommend this method for production applications because it allows you to avoid embedding the credentials for access in your client application. + +See [here](/azure/remote-rendering/how-tos/authentication) for detailed instructions and information. + +In all the following examples, the client is constructed with an `endpoint` parameter. +The available endpoints correspond to regions, and the choice of endpoint determines the region in which the service performs its work. +An example is `https://remoterendering.eastus2.mixedreality.azure.com`. + +A full list of endpoints in supported regions can be found in the [Azure Remote Rendering region list](/azure/remote-rendering/reference/regions). + +> NOTE: For converting assets, it is preferable to pick a region close to the storage containing the assets. + +> NOTE: For rendering, it is strongly recommended that you pick the closest region to the devices using the service. +> The time taken to communicate with the server impacts the quality of the experience. 
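+
+As a purely illustrative sketch of the two string formats described above (the values here are placeholders; use your own account's region and the region list linked above), note that the account domain follows the region in which the account was created, while the rendering endpoint region can be chosen independently:
+
+```python
+# Region in which the Azure Remote Rendering account was created.
+account_region = "eastus"
+# Region in which rendering sessions should run, ideally close to your users.
+session_region = "eastus2"
+
+account_domain = f"{account_region}.mixedreality.azure.com"
+arr_endpoint = f"https://remoterendering.{session_region}.mixedreality.azure.com"
+```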
+ +### Authenticating with account key authentication + +Use the `AzureKeyCredential` object to use an account identifier and account key to authenticate: + +```python +from azure.core.credentials import AzureKeyCredential +from azure.mixedreality.remoterendering import RemoteRenderingClient + +account_id = "" +account_domain = "" +account_key = "" +arr_endpoint = "" + +key_credential = AzureKeyCredential(account_key) +client = RemoteRenderingClient( + endpoint=arr_endpoint, + account_id=account_id, + account_domain=account_domain, + credential=key_credential +) +``` + +### Authenticating with a static access token + +You can pass a Mixed Reality access token as an `AccessToken` previously retrieved from the +[Mixed Reality STS service](https://github.com/Azure/azure-sdk-for-python/tree/azure-mixedreality-remoterendering_1.0.0b2/sdk/mixedreality/azure-mixedreality-authentication) +to be used with a Mixed Reality client library: + +```python +from azure.mixedreality.authentication import MixedRealityStsClient +from azure.mixedreality.remoterendering import RemoteRenderingClient +account_id = "" +account_domain = "" +account_key = "" + +key_credential = AzureKeyCredential(account_key) + +client = MixedRealityStsClient(account_id, account_domain, key_credential) + +token = client.get_token() + +client = RemoteRenderingClient( + endpoint=arr_endpoint, + account_id=account_id, + account_domain=account_domain, + credential=token, +) +``` + +### Authenticating with an Azure Active Directory Credential + +Account key authentication is used in most of the examples, but you can also authenticate with Azure Active Directory +using the [Azure Identity library][azure_identity]. This is the recommended method for production applications. To use +the [DefaultAzureCredential][defaultazurecredential] provider shown below, or other credential providers provided with +the Azure SDK, please install the `@azure/identity` package: + +You will also need to [register a new AAD application][register_aad_app] and grant access to your Mixed Reality resource +by assigning the appropriate role for your Mixed Reality service to your service principal. + +```python +from azure.identity import DefaultAzureCredential +from azure.mixedreality.remoterendering import RemoteRenderingClient + +account_id = "" +account_domain = "" +default_credential = DefaultAzureCredential() + +client = RemoteRenderingClient( + endpoint=arr_endpoint, + account_id=account_id, + account_domain=account_domain, + credential=default_credential +) +``` + +## Key concepts + +### RemoteRenderingClient + +The `RemoteRenderingClient` is the client library used to access the RemoteRenderingService. +It provides methods to create and manage asset conversions and rendering sessions. + +### Long-Running Operations +Long-running operations are operations which consist of an initial request sent to the service to start an operation, +followed by polling the service at intervals to determine whether the operation has completed or failed, and if it has +succeeded, to get the result. + +Methods that convert assets, or spin up rendering sessions are modelled as long-running operations. +The client exposes a `begin_` method that returns an LROPoller or AsyncLROPoller. +Callers should wait for the operation to complete by calling result() on the poller object returned from the +`begin_` method. Sample code snippets are provided to illustrate using long-running operations +[below](#examples "Examples"). 
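+
+The sync examples below use `LROPoller`; the sketch here is a hedged illustration of the equivalent async flow. It assumes the async client under the `azure.mixedreality.remoterendering.aio` namespace (mentioned in the Async APIs section below) mirrors the sync `begin_rendering_session` signature, that `RenderingSessionSize` is importable from the top-level package, and that `account_id`, `account_domain`, `account_key`, `arr_endpoint` and `session_id` are set as in the authentication examples above.
+
+```python
+import asyncio
+
+from azure.core.credentials import AzureKeyCredential
+from azure.mixedreality.remoterendering import RenderingSessionSize
+from azure.mixedreality.remoterendering.aio import RemoteRenderingClient
+
+async def start_session():
+    client = RemoteRenderingClient(
+        endpoint=arr_endpoint,
+        account_id=account_id,
+        account_domain=account_domain,
+        credential=AzureKeyCredential(account_key),
+    )
+    # The async client is an async context manager, so it is closed automatically.
+    async with client:
+        poller = await client.begin_rendering_session(
+            session_id=session_id, size=RenderingSessionSize.STANDARD, lease_time_minutes=20
+        )
+        session = await poller.result()  # wait for the long-running operation to finish
+        print("session status:", session.status)
+
+asyncio.run(start_session())
+```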
+ +## Examples + +- [Convert an asset](#convert-an-asset) +- [List conversions](#list-conversions) +- [Create a session](#create-a-session) +- [Extend the lease time of a session](#extend-the-lease-time-of-a-session) +- [List sessions](#list-sessions) +- [Stop a session](#stop-a-session) + +### Convert an asset + +We assume that a RemoteRenderingClient has been constructed as described in the [Authenticate the Client](#authenticate-the-client) section. +The following snippet describes how to request that "box.fbx", found at a path of "/input/box/box.fbx" of the blob container at the given storage container URI, gets converted. + +Converting an asset can take anywhere from seconds to hours. +This code uses an existing conversion poller and polls regularly until the conversion has finished or failed. +The default polling period is 5 seconds. +Note that a conversion poller can be retrieved using the client.get_asset_conversion_poller using the id of an existing conversion and a client. + +Once the conversion process finishes the output is written to the specified output container under a path of "/output//box.arrAsset". +The path can be retrieved from the output.asset_uri of a successful conversion. + +```python + conversion_id = str(uuid.uuid4()) # A randomly generated uuid is a good choice for a conversion_id. + + input_settings = AssetConversionInputSettings( + storage_container_uri="", + relative_input_asset_path="box.fbx", + blob_prefix="input/box" + ) + output_settings = AssetConversionOutputSettings( + storage_container_uri="", + blob_prefix="output/"+conversion_id, + output_asset_filename="convertedBox.arrAsset" #if no output_asset_filename .arrAsset will be the name of the resulting converted asset + ) + try: + conversion_poller = client.begin_asset_conversion( + conversion_id=conversion_id, + input_settings=input_settings, + output_settings=output_settings + ) + + print("Conversion with id:", conversion_id, "created. Waiting for completion.") + conversion = conversion_poller.result() + print("conversion output:", conversion.output.asset_uri) + + except Exception as e: + print("Conversion failed", e) +``` + +### List conversions + +You can get information about your conversions using the `list_asset_conversions` method. +This method may return conversions which have yet to start, conversions which are running and conversions which have finished. +In this example, we list all conversions and print id and creation ad as well as the output asset URIs of successful conversions. + +```python + print("conversions:") + for c in client.list_asset_conversions(): + print( + "\t conversion: id:", + c.id, + "status:", + c.status, + "created on:", + c.created_on.strftime("%m/%d/%Y, %H:%M:%S"), + ) + if c.status == AssetConversionStatus.SUCCEEDED: + print("\t\tconversion result URI:", c.output.asset_uri) +``` + +### Create a session + +We assume that a RemoteRenderingClient has been constructed as described in the [Authenticate the Client](#authenticate-the-client) section. +The following snippet describes how to request that a new rendering session be started. + +```python + print("starting rendering session with id:", session_id) + try: + session_poller = client.begin_rendering_session( + session_id=session_id, size=RenderingSessionSize.STANDARD, lease_time_minutes=20 + ) + print( + "rendering session with id:", + session_id, + "created. Waiting for session to be ready.", + ) + session = session_poller.result() + print( + "session with id:", + session.id, + "is ready. 
lease_time_minutes:", + session.lease_time_minutes, + ) + except Exception as e: + print("Session startup failed", e) +``` + +### Extend the lease time of a session + +If a session is approaching its maximum lease time, but you want to keep it alive, you will need to make a call to +increase its maximum lease time. +This example shows how to query the current properties and then extend the lease if it will expire soon. + +> NOTE: The runtime SDKs also offer this functionality, and in many typical scenarios, you would use them to +> extend the session lease. + +```python + session = client.get_rendering_session(session_id) + if session.lease_time_minutes - session.elapsed_time_minutes < 2: + session = client.update_rendering_session( + session_id=session_id, lease_time_minutes=session.lease_time_minutes + 10 + ) +``` + +### List sessions + +You can get information about your sessions using the `list_rendering_sessions` method of the client. +This method may return sessions which have yet to start and sessions which are ready. + +```python + print("sessions:") + rendering_sessions = client.list_rendering_sessions() + for session in rendering_sessions: + print( + "\t session: id:", + session.id, + "status:", + session.status, + "created on:", + session.created_on.strftime("%m/%d/%Y, %H:%M:%S"), + ) +``` + +### Stop a Session + +The following code will stop a running session with given id. Since running sessions incur ongoing costs it is +recommended to stop sessions which are not needed anymore. + +```python + client.stop_rendering_session(session_id) + print("session with id:", session_id, "stopped") +``` + +## Troubleshooting + +For general troubleshooting advice concerning Azure Remote Rendering, see [the Troubleshoot page](/azure/remote-rendering/resources/troubleshoot) for remote rendering at docs.microsoft.com. + +The client methods and waiting for poller results will throw exceptions if the request failed. + +If the asset in a conversion is invalid, the conversion poller will throw an exception with an error containing details. +Once the conversion service is able to process the file, a <assetName>.result.json file will be written to the output container. +If the input asset is invalid, then that file will contain a more detailed description of the problem. + +Similarly, sometimes when a session is requested, the session ends up in an error state. +The poller will throw an exception containing details of the error in this case. Session errors are usually transient +and requesting a new session should succeed. + +### Logging + +This library uses the standard +[logging][python_logging] library for logging. + +Basic information about HTTP sessions (URLs, headers, etc.) is logged at `INFO` level. + +Detailed `DEBUG` level logging, including request/response bodies and **unredacted** +headers, can be enabled on the client or per-operation with the `logging_enable` keyword argument. + +See full SDK logging documentation with examples [here][sdk_logging_docs]. + +### Optional Configuration + +Optional keyword arguments can be passed in at the client and per-operation level. +The azure-core [reference documentation][azure_core_ref_docs] +describes available configurations for retries, logging, transport protocols, and more. + +### Exceptions + +The Remote Rendering client library will raise exceptions defined in [Azure Core][azure_core_exceptions]. + +### Async APIs + +This library also includes a complete async API supported on Python 3.7+. 
To use it, you must +first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/). Async clients +are found under the `azure.mixedreality.remoterendering.aio` namespace. + + + +## Next steps + +- Read the [Product documentation](/azure/remote-rendering/) +- Learn about the runtime SDKs: + - .NET: /dotnet/api/microsoft.azure.remoterendering + - C++: /cpp/api/remote-rendering/ + +## Contributing + +This project welcomes contributions and suggestions. Most contributions require you to agree to a +Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us +the rights to use your contribution. For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide +a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions +provided by the bot. You will only need to do this once across all repos using our CLA. + +This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). +For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or +contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. + +If you'd like to contribute to this library, please read the +[contributing guide](https://github.com/Azure/azure-sdk-for-python/blob/azure-mixedreality-remoterendering_1.0.0b2/CONTRIBUTING.md) to learn more about how +to build and test the code. + + +![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-python%2Fsdk%remoterendering%2Fazure-mixedreality-remoterendering%2FREADME.png) + +[azure_core_ref_docs]: https://aka.ms/azsdk/python/core/docs +[azure_core_exceptions]: https://aka.ms/azsdk/python/core/docs#module-azure.core.exceptions +[azure_sub]: https://azure.microsoft.com/free/ +[azure_portal]: https://portal.azure.com +[azure_identity]: https://github.com/Azure/azure-sdk-for-python/tree/azure-mixedreality-remoterendering_1.0.0b2/sdk/identity/azure-identity + +[pip]: https://pypi.org/project/pip/ [sdk_logging_docs]: /azure/developer/python/azure-sdk-logging - + diff --git a/docs-ref-services/preview/purview-workflow-readme.md b/docs-ref-services/preview/purview-workflow-readme.md index fda2b1bea90e..b3e87e38170b 100644 --- a/docs-ref-services/preview/purview-workflow-readme.md +++ b/docs-ref-services/preview/purview-workflow-readme.md @@ -21,7 +21,7 @@ For more details about how to use workflow, please refer to the [service documen ## Getting started -### Prequisites +### Prerequisites - Python 3.7 or later is required to use this package. - You need an [Azure subscription][azure_sub] to use this package. diff --git a/docs-ref-services/preview/schemaregistry-avroserializer-readme.md b/docs-ref-services/preview/schemaregistry-avroserializer-readme.md index 63d9c44e38d8..9f530bb2e6fb 100644 --- a/docs-ref-services/preview/schemaregistry-avroserializer-readme.md +++ b/docs-ref-services/preview/schemaregistry-avroserializer-readme.md @@ -7,321 +7,321 @@ ms.devlang: python ms.service: event-hubs ms.technology: azure --- -# Azure Schema Registry Avro Serializer client library for Python - version 1.0.0b4 - - -Azure Schema Registry is a schema repository service hosted by Azure Event Hubs, providing schema storage, versioning, -and management. 
This package provides an Avro serializer capable of serializing and deserializing payloads containing -Schema Registry schema identifiers and Avro-encoded data. - -[Source code][source_code] | [Package (PyPi)][pypi] | [API reference documentation][api_reference] | [Samples][sr_avro_samples] | [Changelog][change_log] - -## _Disclaimer_ - -_Azure SDK Python packages support for Python 2.7 is ending 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691_ - -## Getting started - -### Install the package - -Install the Azure Schema Registry Avro Serializer client library and Azure Identity client library for Python with [pip][pip]: - -```Bash -pip install azure-schemaregistry-avroserializer azure-identity -``` - -### Prerequisites: -To use this package, you must have: -* Azure subscription - [Create a free account][azure_sub] -* [Azure Schema Registry][schemaregistry_service] -* Python 2.7, 3.6 or later - [Install Python][python] - -### Authenticate the client -Interaction with the Schema Registry Avro Serializer starts with an instance of AvroSerializer class, which takes the schema group name and the [Schema Registry Client][schemaregistry_client] class. The client constructor takes the Event Hubs fully qualified namespace and and Azure Active Directory credential: - -* The fully qualified namespace of the Schema Registry instance should follow the format: `.servicebus.windows.net`. - -* An AAD credential that implements the [TokenCredential][token_credential_interface] protocol should be passed to the constructor. There are implementations of the `TokenCredential` protocol available in the -[azure-identity package][pypi_azure_identity]. To use the credential types provided by `azure-identity`, please install the Azure Identity client library for Python with [pip][pip]: - -```Bash -pip install azure-identity -``` - -* Additionally, to use the async API supported on Python 3.6+, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/): - -```Bash -pip install aiohttp -``` - -**Create AvroSerializer using the azure-schemaregistry library:** - -```python -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -credential = DefaultAzureCredential() -# Namespace should be similar to: '.servicebus.windows.net' -fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>' -group_name = '<< GROUP NAME OF THE SCHEMA >>' -schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential) -serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) -``` - -## Key concepts - -### AvroSerializer - -Provides API to serialize to and deserialize from Avro Binary Encoding plus a -header with schema ID. Uses [SchemaRegistryClient][schemaregistry_client] to get schema IDs from schema content or vice versa. - -### Message format - -The same format is used by schema registry serializers across Azure SDK languages. - -Messages are encoded as follows: - -- 4 bytes: Format Indicator - - - Currently always zero to indicate format below. - -- 32 bytes: Schema ID - - - UTF-8 hexadecimal representation of GUID. - - 32 hex digits, no hyphens. - - Same format and byte order as string from Schema Registry service. 
- -- Remaining bytes: Avro payload (in general, format-specific payload) - - - Avro Binary Encoding - - NOT Avro Object Container File, which includes the schema and defeats the - purpose of this serialzer to move the schema out of the message payload and - into the schema registry. - - -## Examples - -The following sections provide several code snippets covering some of the most common Schema Registry tasks, including: - -- [Serialization](#serialization) -- [Deserialization](#deserialization) -- [Event Hubs Sending Integration](#event-hubs-sending-integration) -- [Event Hubs Receiving Integration](#event-hubs-receiving-integration) - -### Serialization - -Use `AvroSerializer.serialize` method to serialize dict data with the given avro schema. -The method would use a schema previously registered to the Schema Registry service and keep the schema cached for future serialization usage. It is also possible to avoid pre-registering the schema to the service and automatically register with the `serialize` method by instantiating the `AvroSerializer` with the keyword argument `auto_register_schemas=True`. - -```python -import os -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -token_credential = DefaultAzureCredential() -fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] -group_name = "" -name = "example.avro.User" -format = "Avro" - -definition = """ -{"namespace": "example.avro", - "type": "record", - "name": "User", - "fields": [ - {"name": "name", "type": "string"}, - {"name": "favorite_number", "type": ["int", "null"]}, - {"name": "favorite_color", "type": ["string", "null"]} - ] -}""" - -schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) -schema_register_client.register(group_name, name, definition, format) -serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) - -with serializer: - dict_data = {"name": "Ben", "favorite_number": 7, "favorite_color": "red"} - encoded_bytes = serializer.serialize(dict_data, schema=definition) -``` - -### Deserialization - -Use `AvroSerializer.deserialize` method to deserialize raw bytes into dict data. -The method automatically retrieves the schema from the Schema Registry Service and keeps the schema cached for future deserialization usage. - -```python -import os -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -token_credential = DefaultAzureCredential() -fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] -group_name = "" - -schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) -serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) - -with serializer: - encoded_bytes = b'' - decoded_data = serializer.deserialize(encoded_bytes) -``` - -### Event Hubs Sending Integration - -Integration with [Event Hubs][eventhubs_repo] to send serialized avro dict data as the body of EventData. 
- -```python -import os -from azure.eventhub import EventHubProducerClient, EventData -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -token_credential = DefaultAzureCredential() -fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] -group_name = "" -eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR'] -eventhub_name = os.environ['EVENT_HUB_NAME'] - -definition = """ -{"namespace": "example.avro", - "type": "record", - "name": "User", - "fields": [ - {"name": "name", "type": "string"}, - {"name": "favorite_number", "type": ["int", "null"]}, - {"name": "favorite_color", "type": ["string", "null"]} - ] -}""" - -schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) -avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name, auto_register_schemas=True) - -eventhub_producer = EventHubProducerClient.from_connection_string( - conn_str=eventhub_connection_str, - eventhub_name=eventhub_name -) - -with eventhub_producer, avro_serializer: - event_data_batch = eventhub_producer.create_batch() - dict_data = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"} - payload_bytes = avro_serializer.serialize(dict_data, schema=definition) - event_data_batch.add(EventData(body=payload_bytes)) - eventhub_producer.send_batch(event_data_batch) -``` - -### Event Hubs Receiving Integration - -Integration with [Event Hubs][eventhubs_repo] to receive `EventData` and deserialized raw bytes into avro dict data. - -```python -import os -from azure.eventhub import EventHubConsumerClient -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -token_credential = DefaultAzureCredential() -fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] -group_name = "" -eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR'] -eventhub_name = os.environ['EVENT_HUB_NAME'] - -schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) -avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) - -eventhub_consumer = EventHubConsumerClient.from_connection_string( - conn_str=eventhub_connection_str, - consumer_group='$Default', - eventhub_name=eventhub_name, -) - -def on_event(partition_context, event): - bytes_payload = b"".join(b for b in event.body) - deserialized_data = avro_serializer.deserialize(bytes_payload) - -with eventhub_consumer, avro_serializer: - eventhub_consumer.receive(on_event=on_event, starting_position="-1") -``` - -## Troubleshooting - -### General - -Azure Schema Registry Avro Serializer raise exceptions defined in [Azure Core][azure_core]. - -### Logging -This library uses the standard -[logging][python_logging] library for logging. -Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO -level. 
- -Detailed DEBUG level logging, including request/response bodies and unredacted -headers, can be enabled on a client with the `logging_enable` argument: -```python -import sys -import logging -from azure.schemaregistry import SchemaRegistryClient -from azure.schemaregistry.serializer.avroserializer import AvroSerializer -from azure.identity import DefaultAzureCredential - -# Create a logger for the SDK -logger = logging.getLogger('azure.schemaregistry') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -credential = DefaultAzureCredential() -schema_registry_client = SchemaRegistryClient("", credential, logging_enable=True) -# This client will log detailed information about its HTTP sessions, at DEBUG level -serializer = AvroSerializer(client=schema_registry_client, group_name="") -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, -even when it isn't enabled for the client: -```py -serializer.serialize(dict_data, schema=schema_definition, logging_enable=True) -``` - -## Next steps - -### More sample code - -Please find further examples in the [samples][sr_avro_samples] directory demonstrating common Azure Schema Registry Avro Serializer scenarios. - -## Contributing - -This project welcomes contributions and suggestions. Most contributions require you to agree to a -Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us -the rights to use your contribution. For details, visit https://cla.microsoft.com. - -When you submit a pull request, a CLA-bot will automatically determine whether you need to provide -a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions -provided by the bot. You will only need to do this once across all repos using our CLA. - -This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). -For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or -contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. 
- - -[pip]: https://pypi.org/project/pip/ -[pypi]: https://pypi.org/project/azure-schemaregistry-avroserializer -[python]: https://www.python.org/downloads/ -[azure_core]: https://github.com/Azure/azure-sdk-for-python/blob/azure-schemaregistry-avroserializer_1.0.0b4/sdk/core/azure-core/README.md -[azure_sub]: https://azure.microsoft.com/free/ -[python_logging]: https://docs.python.org/3/library/logging.html -[sr_avro_samples]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer/samples -[api_reference]: https://docs.microsoft.com/python/api/overview/azure/schemaregistry-avroserializer-readme -[source_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer -[change_log]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer/CHANGELOG.md -[schemaregistry_client]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry -[schemaregistry_service]: https://aka.ms/schemaregistry -[eventhubs_repo]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/eventhub/azure-eventhub -[token_credential_interface]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/core/azure-core/azure/core/credentials.py +# Azure Schema Registry Avro Serializer client library for Python - version 1.0.0b4 + + +Azure Schema Registry is a schema repository service hosted by Azure Event Hubs, providing schema storage, versioning, +and management. This package provides an Avro serializer capable of serializing and deserializing payloads containing +Schema Registry schema identifiers and Avro-encoded data. + +[Source code][source_code] | [Package (PyPi)][pypi] | [API reference documentation][api_reference] | [Samples][sr_avro_samples] | [Changelog][change_log] + +## _Disclaimer_ + +_Azure SDK Python packages support for Python 2.7 is ending 01 January 2022. For more information and questions, please refer to https://github.com/Azure/azure-sdk-for-python/issues/20691_ + +## Getting started + +### Install the package + +Install the Azure Schema Registry Avro Serializer client library and Azure Identity client library for Python with [pip][pip]: + +```Bash +pip install azure-schemaregistry-avroserializer azure-identity +``` + +### Prerequisites: +To use this package, you must have: +* Azure subscription - [Create a free account][azure_sub] +* [Azure Schema Registry][schemaregistry_service] +* Python 2.7, 3.6 or later - [Install Python][python] + +### Authenticate the client +Interaction with the Schema Registry Avro Serializer starts with an instance of AvroSerializer class, which takes the schema group name and the [Schema Registry Client][schemaregistry_client] class. The client constructor takes the Event Hubs fully qualified namespace and Azure Active Directory credential: + +* The fully qualified namespace of the Schema Registry instance should follow the format: `.servicebus.windows.net`. + +* An AAD credential that implements the [TokenCredential][token_credential_interface] protocol should be passed to the constructor. There are implementations of the `TokenCredential` protocol available in the +[azure-identity package][pypi_azure_identity]. 
To use the credential types provided by `azure-identity`, please install the Azure Identity client library for Python with [pip][pip]: + +```Bash +pip install azure-identity +``` + +* Additionally, to use the async API supported on Python 3.6+, you must first install an async transport, such as [aiohttp](https://pypi.org/project/aiohttp/): + +```Bash +pip install aiohttp +``` + +**Create AvroSerializer using the azure-schemaregistry library:** + +```python +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +credential = DefaultAzureCredential() +# Namespace should be similar to: '.servicebus.windows.net' +fully_qualified_namespace = '<< FULLY QUALIFIED NAMESPACE OF THE SCHEMA REGISTRY >>' +group_name = '<< GROUP NAME OF THE SCHEMA >>' +schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, credential) +serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) +``` + +## Key concepts + +### AvroSerializer + +Provides API to serialize to and deserialize from Avro Binary Encoding plus a +header with schema ID. Uses [SchemaRegistryClient][schemaregistry_client] to get schema IDs from schema content or vice versa. + +### Message format + +The same format is used by schema registry serializers across Azure SDK languages. + +Messages are encoded as follows: + +- 4 bytes: Format Indicator + + - Currently always zero to indicate format below. + +- 32 bytes: Schema ID + + - UTF-8 hexadecimal representation of GUID. + - 32 hex digits, no hyphens. + - Same format and byte order as string from Schema Registry service. + +- Remaining bytes: Avro payload (in general, format-specific payload) + + - Avro Binary Encoding + - NOT Avro Object Container File, which includes the schema and defeats the + purpose of this serialzer to move the schema out of the message payload and + into the schema registry. + + +## Examples + +The following sections provide several code snippets covering some of the most common Schema Registry tasks, including: + +- [Serialization](#serialization) +- [Deserialization](#deserialization) +- [Event Hubs Sending Integration](#event-hubs-sending-integration) +- [Event Hubs Receiving Integration](#event-hubs-receiving-integration) + +### Serialization + +Use `AvroSerializer.serialize` method to serialize dict data with the given avro schema. +The method would use a schema previously registered to the Schema Registry service and keep the schema cached for future serialization usage. It is also possible to avoid pre-registering the schema to the service and automatically register with the `serialize` method by instantiating the `AvroSerializer` with the keyword argument `auto_register_schemas=True`. 
+ +```python +import os +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +token_credential = DefaultAzureCredential() +fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] +group_name = "" +name = "example.avro.User" +format = "Avro" + +definition = """ +{"namespace": "example.avro", + "type": "record", + "name": "User", + "fields": [ + {"name": "name", "type": "string"}, + {"name": "favorite_number", "type": ["int", "null"]}, + {"name": "favorite_color", "type": ["string", "null"]} + ] +}""" + +schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) +schema_register_client.register(group_name, name, definition, format) +serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) + +with serializer: + dict_data = {"name": "Ben", "favorite_number": 7, "favorite_color": "red"} + encoded_bytes = serializer.serialize(dict_data, schema=definition) +``` + +### Deserialization + +Use `AvroSerializer.deserialize` method to deserialize raw bytes into dict data. +The method automatically retrieves the schema from the Schema Registry Service and keeps the schema cached for future deserialization usage. + +```python +import os +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +token_credential = DefaultAzureCredential() +fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] +group_name = "" + +schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) +serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) + +with serializer: + encoded_bytes = b'' + decoded_data = serializer.deserialize(encoded_bytes) +``` + +### Event Hubs Sending Integration + +Integration with [Event Hubs][eventhubs_repo] to send serialized avro dict data as the body of EventData. 
+ +```python +import os +from azure.eventhub import EventHubProducerClient, EventData +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +token_credential = DefaultAzureCredential() +fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] +group_name = "" +eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR'] +eventhub_name = os.environ['EVENT_HUB_NAME'] + +definition = """ +{"namespace": "example.avro", + "type": "record", + "name": "User", + "fields": [ + {"name": "name", "type": "string"}, + {"name": "favorite_number", "type": ["int", "null"]}, + {"name": "favorite_color", "type": ["string", "null"]} + ] +}""" + +schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) +avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name, auto_register_schemas=True) + +eventhub_producer = EventHubProducerClient.from_connection_string( + conn_str=eventhub_connection_str, + eventhub_name=eventhub_name +) + +with eventhub_producer, avro_serializer: + event_data_batch = eventhub_producer.create_batch() + dict_data = {"name": "Bob", "favorite_number": 7, "favorite_color": "red"} + payload_bytes = avro_serializer.serialize(dict_data, schema=definition) + event_data_batch.add(EventData(body=payload_bytes)) + eventhub_producer.send_batch(event_data_batch) +``` + +### Event Hubs Receiving Integration + +Integration with [Event Hubs][eventhubs_repo] to receive `EventData` and deserialized raw bytes into avro dict data. + +```python +import os +from azure.eventhub import EventHubConsumerClient +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +token_credential = DefaultAzureCredential() +fully_qualified_namespace = os.environ['SCHEMAREGISTRY_FULLY_QUALIFIED_NAMESPACE'] +group_name = "" +eventhub_connection_str = os.environ['EVENT_HUB_CONN_STR'] +eventhub_name = os.environ['EVENT_HUB_NAME'] + +schema_registry_client = SchemaRegistryClient(fully_qualified_namespace, token_credential) +avro_serializer = AvroSerializer(client=schema_registry_client, group_name=group_name) + +eventhub_consumer = EventHubConsumerClient.from_connection_string( + conn_str=eventhub_connection_str, + consumer_group='$Default', + eventhub_name=eventhub_name, +) + +def on_event(partition_context, event): + bytes_payload = b"".join(b for b in event.body) + deserialized_data = avro_serializer.deserialize(bytes_payload) + +with eventhub_consumer, avro_serializer: + eventhub_consumer.receive(on_event=on_event, starting_position="-1") +``` + +## Troubleshooting + +### General + +Azure Schema Registry Avro Serializer raise exceptions defined in [Azure Core][azure_core]. + +### Logging +This library uses the standard +[logging][python_logging] library for logging. +Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO +level. 
+ +Detailed DEBUG level logging, including request/response bodies and unredacted +headers, can be enabled on a client with the `logging_enable` argument: +```python +import sys +import logging +from azure.schemaregistry import SchemaRegistryClient +from azure.schemaregistry.serializer.avroserializer import AvroSerializer +from azure.identity import DefaultAzureCredential + +# Create a logger for the SDK +logger = logging.getLogger('azure.schemaregistry') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +credential = DefaultAzureCredential() +schema_registry_client = SchemaRegistryClient("", credential, logging_enable=True) +# This client will log detailed information about its HTTP sessions, at DEBUG level +serializer = AvroSerializer(client=schema_registry_client, group_name="") +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, +even when it isn't enabled for the client: +```py +serializer.serialize(dict_data, schema=schema_definition, logging_enable=True) +``` + +## Next steps + +### More sample code + +Please find further examples in the [samples][sr_avro_samples] directory demonstrating common Azure Schema Registry Avro Serializer scenarios. + +## Contributing + +This project welcomes contributions and suggestions. Most contributions require you to agree to a +Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us +the rights to use your contribution. For details, visit https://cla.microsoft.com. + +When you submit a pull request, a CLA-bot will automatically determine whether you need to provide +a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions +provided by the bot. You will only need to do this once across all repos using our CLA. + +This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). +For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or +contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. 
+ + +[pip]: https://pypi.org/project/pip/ +[pypi]: https://pypi.org/project/azure-schemaregistry-avroserializer +[python]: https://www.python.org/downloads/ +[azure_core]: https://github.com/Azure/azure-sdk-for-python/blob/azure-schemaregistry-avroserializer_1.0.0b4/sdk/core/azure-core/README.md +[azure_sub]: https://azure.microsoft.com/free/ +[python_logging]: https://docs.python.org/3/library/logging.html +[sr_avro_samples]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer/samples +[api_reference]: https://docs.microsoft.com/python/api/overview/azure/schemaregistry-avroserializer-readme +[source_code]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer +[change_log]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry-avroserializer/CHANGELOG.md +[schemaregistry_client]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/schemaregistry/azure-schemaregistry +[schemaregistry_service]: https://aka.ms/schemaregistry +[eventhubs_repo]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/eventhub/azure-eventhub +[token_credential_interface]: https://github.com/Azure/azure-sdk-for-python/tree/azure-schemaregistry-avroserializer_1.0.0b4/sdk/core/azure-core/azure/core/credentials.py [pypi_azure_identity]: https://pypi.org/project/azure-identity/ - + diff --git a/docs-ref-services/preview/storage-file-datalake-readme.md b/docs-ref-services/preview/storage-file-datalake-readme.md index 976581754f5a..fa551160b49d 100644 --- a/docs-ref-services/preview/storage-file-datalake-readme.md +++ b/docs-ref-services/preview/storage-file-datalake-readme.md @@ -86,7 +86,7 @@ DataLake storage offers four types of resources: * The storage account * A file system in the storage account * A directory under the file system -* A file in a the file system or under directory +* A file in the file system or under directory ### Async Clients This library includes a complete async API supported on Python 3.5+. To use it, you must diff --git a/docs-ref-services/preview/storage-fileshare-readme.md b/docs-ref-services/preview/storage-fileshare-readme.md index 577a3f4a5de3..bdc1e8935542 100644 --- a/docs-ref-services/preview/storage-fileshare-readme.md +++ b/docs-ref-services/preview/storage-fileshare-readme.md @@ -10,367 +10,367 @@ ms.subservice: files ms.technology: azure manager: twolley --- -# Azure Files for Python Readme - Version 12.1.1 -Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. 
- -Azure file shares can be used to: - -* Replace or supplement on-premises file servers -* "Lift and shift" applications -* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools - -[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) - -## Getting started - -### Prerequisites -* Python 2.7, or 3.5 or later is required to use this package. -* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an -[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. - -### Install the package -Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): - -```bash -pip install azure-storage-file-share -``` - -### Create a storage account -If you wish to create a new storage account, you can use the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), -[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), -or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): - -```bash -# Create a new resource group to hold the storage account - -# if using an existing resource group, skip this step -az group create --name my-resource-group --location westus2 - -# Create the storage account -az storage account create -n my-storage-account-name -g my-resource-group -``` - -### Create the client -The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage -account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a -[client](#clients). To create a client object, you will need the storage account's file service URL and a -credential that allows you to access the storage account: - -```python -from azure.storage.fileshare import ShareServiceClient - -service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) -``` - -#### Looking up the account URL -You can find the storage account's file service URL using the -[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), -[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), -or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): - -```bash -# Get the file service URL for the storage account -az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" -``` - -#### Types of credentials -The `credential` parameter may be provided in a number of different forms, depending on the type of -[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: -1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), - provide the token as a string. 
If your account URL includes the SAS token, omit the credential parameter. - You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` - functions to create a sas token for the storage account, share, or file: - - ```python - from datetime import datetime, timedelta - from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions - - sas_token = generate_account_sas( - account_name="", - account_key="", - resource_types=ResourceTypes(service=True), - permission=AccountSasPermissions(read=True), - expiry=datetime.utcnow() + timedelta(hours=1) - ) - - share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) - ``` - -2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) - (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" - section or by running the following Azure CLI command: - - ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` - - Use the key as the credential parameter to authenticate the client: - ```python - from azure.storage.fileshare import ShareServiceClient - service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") - ``` - -#### Creating the client from a connection string -Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage -connection string instead of providing the account URL and credential separately. To do this, pass the storage -connection string to the client's `from_connection_string` class method: - -```python -from azure.storage.fileshare import ShareServiceClient - -connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" -service = ShareServiceClient.from_connection_string(conn_str=connection_string) -``` - -The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: - -```bash -az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount -``` - -## Key concepts -The following components make up the Azure File Share Service: -* The storage account itself -* A file share within the storage account -* An optional hierarchy of directories within the file share -* A file within the file share, which may be up to 1 TiB in size - -The Azure Storage File Share client library for Python allows you to interact with each of these components through the -use of a dedicated client object. - -### Clients -Four different clients are provided to to interact with the various components of the File Share Service: -1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - - this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured - client instances to access the file shares within. It provides operations to retrieve and configure the service - properties as well as list, create, and delete shares within the account. To perform operations on a specific share, - retrieve a client using the `get_share_client` method. -2. 
[ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - - this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire - preconfigured client instances to access the directories and files within. It provides operations to create, delete, - configure, or create snapshots of a share and includes operations to create and enumerate the contents of - directories within it. To perform operations on a specific directory or file, retrieve a client using the - `get_directory_client` or `get_file_client` methods. -3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - - this client represents interaction with a specific directory (which need not exist yet). It provides operations to - create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create - and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can - also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. -4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - - this client represents interaction with a specific file (which need not exist yet). It provides operations to - upload, download, create, delete, and copy a file. - -For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). - -## Examples -The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: - -* [Creating a file share](#creating-a-file-share "Creating a file share") -* [Uploading a file](#uploading-a-file "Uploading a file") -* [Downloading a file](#downloading-a-file "Downloading a file") -* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") - -### Creating a file share -Create a file share to store your files - -```python -from azure.storage.fileshare import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -share.create_share() -``` - -Use the async client to create a file share - -```python -from azure.storage.fileshare.aio import ShareClient - -share = ShareClient.from_connection_string(conn_str="", share_name="my_share") -await share.create_share() -``` - -### Uploading a file -Upload a file to the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - file_client.upload_file(source_file) -``` - -Upload a file asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("./SampleSource.txt", "rb") as source_file: - await file_client.upload_file(source_file) -``` - -### Downloading a file -Download a file from the share - -```python -from azure.storage.fileshare import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = file_client.download_file() - data.readinto(file_handle) -``` - -Download a file 
asynchronously - -```python -from azure.storage.fileshare.aio import ShareFileClient - -file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") - -with open("DEST_FILE", "wb") as file_handle: - data = await file_client.download_file() - await data.readinto(file_handle) -``` - -### Listing contents of a directory -List all directories and files under a parent directory - -```python -from azure.storage.fileshare import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_list = list(parent_dir.list_directories_and_files()) -print(my_list) -``` - -List contents of a directory asynchronously - -```python -from azure.storage.fileshare.aio import ShareDirectoryClient - -parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") - -my_files = [] -async for item in parent_dir.list_directories_and_files(): - my_files.append(item) -print(my_files) -``` - -## Optional Configuration - -Optional keyword arguments that can be passed in at the client and per-operation level. - -### Retry Policy configuration - -Use the following keyword arguments when instantiating a client to configure the retry policy: - -* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts. -Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10. -* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3. -* __retry_read__ (int): How many times to retry on read errors. Defaults to 3. -* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3. -* __retry_to_secondary__ (bool): Whether the request should be retried to secondary, if able. -This should only be enabled of RA-GRS accounts are used and potentially stale data can be handled. -Defaults to `False`. - -### Other client / per-operation configuration - -Other optional configuration keyword arguments that can be specified on the client or per-operation. - -**Client keyword arguments:** - -* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds. -* __transport__ (Any): User-provided transport to send the HTTP request. - -**Per-operation keyword arguments:** - -* __raw_response_hook__ (callable): The given callback uses the response returned from the service. -* __raw_request_hook__ (callable): The given callback uses the request before being sent to service. -* __client_request_id__ (str): Optional user specified identification of the request. -* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request. -* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at -the client level to enable it for all requests. -* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}` - - -## Troubleshooting -### General -Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md). -All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes). - -### Logging -This library uses the standard -[logging](https://docs.python.org/3/library/logging.html) library for logging. 
-Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO -level. - -Detailed DEBUG level logging, including request/response bodies and unredacted -headers, can be enabled on a client with the `logging_enable` argument: -```python -import sys -import logging -from azure.storage.fileshare import ShareServiceClient - -# Create a logger for the 'azure.storage.fileshare' SDK -logger = logging.getLogger('azure.storage.fileshare') -logger.setLevel(logging.DEBUG) - -# Configure a console output -handler = logging.StreamHandler(stream=sys.stdout) -logger.addHandler(handler) - -# This client will log detailed information about its HTTP sessions, at DEBUG level -service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) -``` - -Similarly, `logging_enable` can enable detailed logging for a single operation, -even when it isn't enabled for the client: -```py -service_client.get_service_properties(logging_enable=True) -``` - -## Next steps - -### More sample code - -Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). - -Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: - -* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: - * Client creation - * Create a file share - * Upload a file - -* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: - * From a connection string - * From a shared access key - * From a shared access signature token - -* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: - * Get and set service properties - * Create, list, and delete shares - * Get a share client - -* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: - * Create a share snapshot - * Set share quota and metadata - * List directories and files - * Get the directory or file client to interact with a specific entity - -* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async 
version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: - * Create a directory and add files - * Create and delete subdirectories - * Get the subdirectory client - -* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: - * Create, upload, download, and delete files - * Copy a file from a URL - -### Additional documentation +# Azure Files for Python Readme - Version 12.1.1 +Azure File Share storage offers fully managed file shares in the cloud that are accessible via the industry standard [Server Message Block (SMB) protocol](https://docs.microsoft.com/windows/desktop/FileIO/microsoft-smb-protocol-and-cifs-protocol-overview). Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Additionally, Azure file shares can be cached on Windows Servers with Azure File Sync for fast access near where the data is being used. + +Azure file shares can be used to: + +* Replace or supplement on-premises file servers +* "Lift and shift" applications +* Simplify cloud development with shared application settings, diagnostic share, and Dev/Test/Debug tools + +[Source code](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/azure/storage/fileshare) | [Package (PyPI)](https://pypi.org/project/azure-storage-file-share/) | [API reference documentation](https://aka.ms/azsdk-python-storage-fileshare-ref) | [Product documentation](https://docs.microsoft.com/azure/storage/) | [Samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples) + +## Getting started + +### Prerequisites +* Python 2.7, or 3.5 or later is required to use this package. +* You must have an [Azure subscription](https://azure.microsoft.com/free/) and an +[Azure storage account](https://docs.microsoft.com/azure/storage/common/storage-account-overview) to use this package. + +### Install the package +Install the Azure Storage File Share client library for Python with [pip](https://pypi.org/project/pip/): + +```bash +pip install azure-storage-file-share +``` + +### Create a storage account +If you wish to create a new storage account, you can use the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal), +[Azure PowerShell](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-powershell), +or [Azure CLI](https://docs.microsoft.com/azure/storage/common/storage-quickstart-create-account?tabs=azure-cli): + +```bash +# Create a new resource group to hold the storage account - +# if using an existing resource group, skip this step +az group create --name my-resource-group --location westus2 + +# Create the storage account +az storage account create -n my-storage-account-name -g my-resource-group +``` + +### Create the client +The Azure Storage File Share client library for Python allows you to interact with four types of resources: the storage +account itself, file shares, directories, and files. Interaction with these resources starts with an instance of a +[client](#clients). 
To create a client object, you will need the storage account's file service URL and a +credential that allows you to access the storage account: + +```python +from azure.storage.fileshare import ShareServiceClient + +service = ShareServiceClient(account_url="https://.file.core.windows.net/", credential=credential) +``` + +#### Looking up the account URL +You can find the storage account's file service URL using the +[Azure Portal](https://docs.microsoft.com/azure/storage/common/storage-account-overview#storage-account-endpoints), +[Azure PowerShell](https://docs.microsoft.com/powershell/module/az.storage/get-azstorageaccount), +or [Azure CLI](https://docs.microsoft.com/cli/azure/storage/account?view=azure-cli-latest#az-storage-account-show): + +```bash +# Get the file service URL for the storage account +az storage account show -n my-storage-account-name -g my-resource-group --query "primaryEndpoints.file" +``` + +#### Types of credentials +The `credential` parameter may be provided in a number of different forms, depending on the type of +[authorization](https://docs.microsoft.com/azure/storage/common/storage-auth) you wish to use: +1. To use a [shared access signature (SAS) token](https://docs.microsoft.com/azure/storage/common/storage-sas-overview), + provide the token as a string. If your account URL includes the SAS token, omit the credential parameter. + You can generate a SAS token from the Azure Portal under "Shared access signature" or use one of the `generate_sas()` + functions to create a sas token for the storage account, share, or file: + + ```python + from datetime import datetime, timedelta + from azure.storage.fileshare import ShareServiceClient, generate_account_sas, ResourceTypes, AccountSasPermissions + + sas_token = generate_account_sas( + account_name="", + account_key="", + resource_types=ResourceTypes(service=True), + permission=AccountSasPermissions(read=True), + expiry=datetime.utcnow() + timedelta(hours=1) + ) + + share_service_client = ShareServiceClient(account_url="https://.file.core.windows.net", credential=sas_token) + ``` + +2. To use a storage account [shared key](https://docs.microsoft.com/rest/api/storageservices/authenticate-with-shared-key/) + (aka account key or access key), provide the key as a string. This can be found in the Azure Portal under the "Access Keys" + section or by running the following Azure CLI command: + + ```az storage account keys list -g MyResourceGroup -n MyStorageAccount``` + + Use the key as the credential parameter to authenticate the client: + ```python + from azure.storage.fileshare import ShareServiceClient + service = ShareServiceClient(account_url="https://.file.core.windows.net", credential="") + ``` + +#### Creating the client from a connection string +Depending on your use case and authorization method, you may prefer to initialize a client instance with a storage +connection string instead of providing the account URL and credential separately. 
To do this, pass the storage +connection string to the client's `from_connection_string` class method: + +```python +from azure.storage.fileshare import ShareServiceClient + +connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net" +service = ShareServiceClient.from_connection_string(conn_str=connection_string) +``` + +The connection string to your storage account can be found in the Azure Portal under the "Access Keys" section or by running the following CLI command: + +```bash +az storage account show-connection-string -g MyResourceGroup -n MyStorageAccount +``` + +## Key concepts +The following components make up the Azure File Share Service: +* The storage account itself +* A file share within the storage account +* An optional hierarchy of directories within the file share +* A file within the file share, which may be up to 1 TiB in size + +The Azure Storage File Share client library for Python allows you to interact with each of these components through the +use of a dedicated client object. + +### Clients +Four different clients are provided to interact with the various components of the File Share Service: +1. [ShareServiceClient](https://aka.ms/azsdk-python-storage-fileshare-shareserviceclient) - + this client represents interaction with the Azure storage account itself, and allows you to acquire preconfigured + client instances to access the file shares within. It provides operations to retrieve and configure the service + properties as well as list, create, and delete shares within the account. To perform operations on a specific share, + retrieve a client using the `get_share_client` method. +2. [ShareClient](https://aka.ms/azsdk-python-storage-fileshare-shareclient) - + this client represents interaction with a specific file share (which need not exist yet), and allows you to acquire + preconfigured client instances to access the directories and files within. It provides operations to create, delete, + configure, or create snapshots of a share and includes operations to create and enumerate the contents of + directories within it. To perform operations on a specific directory or file, retrieve a client using the + `get_directory_client` or `get_file_client` methods. +3. [ShareDirectoryClient](https://aka.ms/azsdk-python-storage-fileshare-sharedirectoryclient) - + this client represents interaction with a specific directory (which need not exist yet). It provides operations to + create, delete, or enumerate the contents of an immediate or nested subdirectory, and includes operations to create + and delete files within it. For operations relating to a specific subdirectory or file, a client for that entity can + also be retrieved using the `get_subdirectory_client` and `get_file_client` functions. +4. [ShareFileClient](http://aka.ms/azsdk-python-storage-fileshare-sharefileclient) - + this client represents interaction with a specific file (which need not exist yet). It provides operations to + upload, download, create, delete, and copy a file. + +For details on path naming restrictions, see [Naming and Referencing Shares, Directories, Files, and Metadata](https://docs.microsoft.com/rest/api/storageservices/naming-and-referencing-shares--directories--files--and-metadata). 
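As a quick illustration of how these four clients relate, the sketch below (using a placeholder connection string and example share, directory, and file names) walks down the hierarchy from the service client to a file client:

```python
from azure.storage.fileshare import ShareServiceClient

# Placeholder connection string - substitute your own.
connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net"

# Each client hands out a preconfigured client for the level below it:
# account -> share -> directory -> file.
service_client = ShareServiceClient.from_connection_string(conn_str=connection_string)
share_client = service_client.get_share_client("my_share")
directory_client = share_client.get_directory_client("my_directory")
file_client = directory_client.get_file_client("my_file")
```

The Examples section below shows the alternative approach of constructing each client directly from a connection string.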
+ +## Examples +The following sections provide several code snippets covering some of the most common Storage File Share tasks, including: + +* [Creating a file share](#creating-a-file-share "Creating a file share") +* [Uploading a file](#uploading-a-file "Uploading a file") +* [Downloading a file](#downloading-a-file "Downloading a file") +* [Listing contents of a directory](#listing-contents-of-a-directory "Listing contents of a directory") + +### Creating a file share +Create a file share to store your files + +```python +from azure.storage.fileshare import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +share.create_share() +``` + +Use the async client to create a file share + +```python +from azure.storage.fileshare.aio import ShareClient + +share = ShareClient.from_connection_string(conn_str="", share_name="my_share") +await share.create_share() +``` + +### Uploading a file +Upload a file to the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + file_client.upload_file(source_file) +``` + +Upload a file asynchronously + +```python +from azure.storage.fileshare.aio import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("./SampleSource.txt", "rb") as source_file: + await file_client.upload_file(source_file) +``` + +### Downloading a file +Download a file from the share + +```python +from azure.storage.fileshare import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("DEST_FILE", "wb") as file_handle: + data = file_client.download_file() + data.readinto(file_handle) +``` + +Download a file asynchronously + +```python +from azure.storage.fileshare.aio import ShareFileClient + +file_client = ShareFileClient.from_connection_string(conn_str="", share_name="my_share", file_path="my_file") + +with open("DEST_FILE", "wb") as file_handle: + data = await file_client.download_file() + await data.readinto(file_handle) +``` + +### Listing contents of a directory +List all directories and files under a parent directory + +```python +from azure.storage.fileshare import ShareDirectoryClient + +parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") + +my_list = list(parent_dir.list_directories_and_files()) +print(my_list) +``` + +List contents of a directory asynchronously + +```python +from azure.storage.fileshare.aio import ShareDirectoryClient + +parent_dir = ShareDirectoryClient.from_connection_string(conn_str="", share_name="my_share", directory_path="parent_dir") + +my_files = [] +async for item in parent_dir.list_directories_and_files(): + my_files.append(item) +print(my_files) +``` + +## Optional Configuration + +Optional keyword arguments that can be passed in at the client and per-operation level. + +### Retry Policy configuration + +Use the following keyword arguments when instantiating a client to configure the retry policy: + +* __retry_total__ (int): Total number of retries to allow. Takes precedence over other counts. +Pass in `retry_total=0` if you do not want to retry on requests. Defaults to 10. +* __retry_connect__ (int): How many connection-related errors to retry on. Defaults to 3. 
+* __retry_read__ (int): How many times to retry on read errors. Defaults to 3. +* __retry_status__ (int): How many times to retry on bad status codes. Defaults to 3. +* __retry_to_secondary__ (bool): Whether the request should be retried to the secondary, if able. +This should only be enabled if RA-GRS accounts are used and potentially stale data can be handled. +Defaults to `False`. + +### Other client / per-operation configuration + +Other optional configuration keyword arguments can be specified on the client or per-operation. + +**Client keyword arguments:** + +* __connection_timeout__ (int): Optionally sets the connect and read timeout value, in seconds. +* __transport__ (Any): User-provided transport to send the HTTP request. + +**Per-operation keyword arguments:** + +* __raw_response_hook__ (callable): The given callback uses the response returned from the service. +* __raw_request_hook__ (callable): The given callback uses the request before being sent to the service. +* __client_request_id__ (str): Optional user-specified identification of the request. +* __user_agent__ (str): Appends the custom value to the user-agent header to be sent with the request. +* __logging_enable__ (bool): Enables logging at the DEBUG level. Defaults to False. Can also be passed in at +the client level to enable it for all requests. +* __headers__ (dict): Pass in custom headers as key, value pairs. E.g. `headers={'CustomValue': value}`
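+ +As a rough illustration of how these keyword arguments fit together (the values are arbitrary and the `print_status` helper is purely for demonstration), retry and timeout settings are typically supplied when the client is constructed, while hooks and custom headers can be passed to an individual operation: + +```python +from azure.storage.fileshare import ShareServiceClient + +# Arbitrary example values: cap total retries and tighten the connection timeout for this client +service_client = ShareServiceClient.from_connection_string("your_connection_string", retry_total=5, connection_timeout=20) + +# A small response hook that prints the HTTP status code of each response +def print_status(response): +    print(response.http_response.status_code) + +# Per-operation keyword arguments: a custom header plus the response hook above +service_client.get_service_properties(raw_response_hook=print_status, headers={'CustomValue': '1'}) +```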
+ + +## Troubleshooting +### General +Storage File clients raise exceptions defined in [Azure Core](https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/core/azure-core/README.md). +All File service operations will throw a `StorageErrorException` on failure with helpful [error codes](https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes). + +### Logging +This library uses the standard +[logging](https://docs.python.org/3/library/logging.html) library for logging. +Basic information about HTTP sessions (URLs, headers, etc.) is logged at INFO +level. + +Detailed DEBUG level logging, including request/response bodies and unredacted +headers, can be enabled on a client with the `logging_enable` argument: +```python +import sys +import logging +from azure.storage.fileshare import ShareServiceClient + +# Create a logger for the 'azure.storage.fileshare' SDK +logger = logging.getLogger('azure.storage.fileshare') +logger.setLevel(logging.DEBUG) + +# Configure a console output +handler = logging.StreamHandler(stream=sys.stdout) +logger.addHandler(handler) + +# This client will log detailed information about its HTTP sessions, at DEBUG level +service_client = ShareServiceClient.from_connection_string("your_connection_string", logging_enable=True) +``` + +Similarly, `logging_enable` can enable detailed logging for a single operation, +even when it isn't enabled for the client: +```py +service_client.get_service_properties(logging_enable=True) +``` + +## Next steps + +### More sample code + +Get started with our [File Share samples](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples). + +Several Storage File Share Python SDK samples are available to you in the SDK's GitHub repository. These samples provide example code for additional scenarios commonly encountered while working with Storage File Share: + +* [file_samples_hello_world.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_hello_world_async.py)) - Examples found in this article: + * Client creation + * Create a file share + * Upload a file + +* [file_samples_authentication.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_authentication_async.py)) - Examples for authenticating and creating the client: + * From a connection string + * From a shared access key + * From a shared access signature token + +* [file_samples_service.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_service_async.py)) - Examples for interacting with the file service: + * Get and set service properties + * Create, list, and delete shares + * Get a share client + +* [file_samples_share.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_share_async.py)) - Examples for interacting with file shares: + * Create a share snapshot + * Set share quota and metadata + * List directories and files + * Get the directory or file client to interact with a specific entity + +* [file_samples_directory.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_directory_async.py)) - Examples for interacting with directories: + * Create a directory and add files + * Create and delete subdirectories + * Get the subdirectory client + +* [file_samples_client.py](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client.py) ([async version](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage/azure-storage-file-share/samples/file_samples_client_async.py)) - Examples for interacting with files: + * Create, upload, download, and delete files + * Copy a file from a URL + +### Additional documentation +For more extensive documentation on Azure File Share storage, see the [Azure File Share storage documentation](https://docs.microsoft.com/azure/storage/files/) on docs.microsoft.com. - +