diff --git a/powerbi-docs/collaborate-share/service-embed-report-spo.md b/powerbi-docs/collaborate-share/service-embed-report-spo.md index 589dc8b1bf..9ac02ea28c 100644 --- a/powerbi-docs/collaborate-share/service-embed-report-spo.md +++ b/powerbi-docs/collaborate-share/service-embed-report-spo.md @@ -9,7 +9,7 @@ ms.service: powerbi ms.subservice: pbi-collaborate-share ms.topic: how-to LocalizationGroup: Share your work -ms.date: 12/16/2024 +ms.date: 01/07/2025 --- # Embed a report web part in SharePoint Online @@ -26,6 +26,7 @@ For **Embed report in SharePoint Online** reports to work: * To use an embedded report, users must sign in to the Power BI service to activate their Power BI license. * To embed a web part in SharePoint Online, you need a Power BI Pro or Premium Per User (PPU) license. * Users with a free Fabric license can view a report that's hosted in a [Power BI Premium capacity (EM or P SKU)](../enterprise/service-premium-what-is.md) or [Fabric F64 or greater capacity](/fabric/enterprise/licenses#capacity-and-skus). +* SharePoint Embed is now supported in air gap environments. ## Embed your report @@ -163,7 +164,7 @@ Users viewing a report in SharePoint need either a **Power BI Pro or Premium Per * Azure B2B Guest user accounts aren't supported. Users see the Power BI logo that shows the part is loading, but it doesn't show the report. -* When viewing Power BI reports embedded in SharePoint Online, there is no option for users to switch between Power BI environments. +* When viewing Power BI reports embedded in SharePoint Online, there's no option for users to switch between Power BI environments. * Power BI doesn't support the same localized languages that SharePoint Online does. As a result, you might not see proper localization within the embedded report. @@ -173,7 +174,7 @@ Users viewing a report in SharePoint need either a **Power BI Pro or Premium Per * [URL filters](service-url-filters.md) aren't supported with the SharePoint Online web part. -* You can't view or access Power BI Apps embedded in a SharedPoint site page using a Power BI web part. To access the embedded Power BI report, access the app first in Power BI service before accessing it in the SharePoint site page. +* You can't view or access Power BI Apps embedded in a SharePoint site page using a Power BI web part. To access the embedded Power BI report, access the app first in Power BI service before accessing it in the SharePoint site page. ## Related content diff --git a/powerbi-docs/collaborate-share/service-embed-secure.md b/powerbi-docs/collaborate-share/service-embed-secure.md index f9c8166a02..2316e20bdd 100644 --- a/powerbi-docs/collaborate-share/service-embed-secure.md +++ b/powerbi-docs/collaborate-share/service-embed-secure.md @@ -7,7 +7,7 @@ ms.reviewer: lukaszp ms.service: powerbi ms.subservice: pbi-collaborate-share ms.topic: how-to -ms.date: 07/16/2024 +ms.date: 01/07/2025 LocalizationGroup: Share your work --- @@ -52,7 +52,7 @@ In the Power BI service, you can share embedded reports with users who require a ## Licensing -To view the embedded report, you need either a Power BI Pro or Premium Per User (PPU) license. Or, the content needs to be in a workspace that's in a [Power BI Premium capacity (EM or P SKU)](../enterprise/service-premium-what-is.md#capacities-and-skus). +To view the embedded report, you need either a Power BI Pro or Premium Per User (PPU) license. 
Or, the content needs to be in a workspace that's in a [Power BI Premium (EM or P SKU)](../enterprise/service-premium-what-is.md#capacities-and-skus) or a [Fabric (F SKU)](/fabric/enterprise/licenses#capacity) capacity. ## Customize your embed experience by using URL settings diff --git a/powerbi-docs/connect-data/asynchronous-refresh.md b/powerbi-docs/connect-data/asynchronous-refresh.md index 80d64cc666..44f856eb24 100644 --- a/powerbi-docs/connect-data/asynchronous-refresh.md +++ b/powerbi-docs/connect-data/asynchronous-refresh.md @@ -6,7 +6,7 @@ ms.author: kfollis ms.service: powerbi ms.subservice: pbi-data-sources ms.topic: conceptual -ms.date: 12/03/2024 +ms.date: 01/07/2025 --- # Enhanced refresh with the Power BI REST API @@ -22,6 +22,7 @@ The Power BI Refresh Dataset REST API can carry out model refresh operations asy - Applying incremental refresh policies - `GET` refresh details - Refresh cancellation +- Timeout configuration > [!NOTE] > - Previously, enhanced refresh was called *asynchronous refresh with REST API*. However, a standard refresh that uses the Refresh Dataset REST API also runs asynchronously by its inherent nature. @@ -78,6 +79,7 @@ The request body might resemble the following example: "commitMode": "transactional", "maxParallelism": 2, "retryCount": 2, + "timeout": "02:00:00", "objects": [ { "table": "DimCustomer", @@ -107,6 +109,7 @@ To do an enhanced refresh operation, you must specify one or more parameters in |`objects` | Array | Entire model | An array of objects to process. Each object includes `table` when processing an entire table, or `table` and `partition` when processing a partition. If no objects are specified, the entire model refreshes. | |`applyRefreshPolicy` | Boolean | `true` | If an incremental refresh policy is defined, determines whether to apply the policy. Modes are `true` or `false`. If the policy isn't applied, the full process leaves partition definitions unchanged, and fully refreshes all partitions in the table.

If `commitMode` is `transactional`, `applyRefreshPolicy` can be `true` or `false`. If `commitMode` is `partialBatch`, `applyRefreshPolicy` of `true` isn't supported, and `applyRefreshPolicy` must be set to `false`.|
|`effectiveDate` | Date | Current date | If an incremental refresh policy is applied, the `effectiveDate` parameter overrides the current date. If not specified, the current day is determined using time zone configuration under ['Refresh'](/power-bi/connect-data/incremental-refresh-overview#current-date-and-time). |
+|`timeout` | String | 05:00:00 (5 hours) | If a `timeout` is specified, each data refresh attempt on the semantic model adheres to that timeout. A single refresh request can include multiple attempts if `retryCount` is specified, which can cause the total refresh duration to exceed the specified timeout. For instance, setting a `timeout` of 1 hour with a `retryCount` of 2 could result in a total refresh duration of up to 3 hours (one initial attempt plus two retries). You can adjust the `timeout` to shorten the refresh duration for faster failure detection, or extend it beyond the default 5 hours for more complex data refreshes. However, the total refresh duration, including retries, can't exceed 24 hours. |
### Response
@@ -251,15 +254,15 @@ The solution is to rerun the refresh operation. To learn more about dynamic memo
#### Refresh operation time limits
-The maximum amount of time for a single refresh operation is five hours. If the refresh operation doesn't successfully complete within the five-hour limit, and `retryCount` isn't specified or is specified as `0` (the default) in the request body, a timeout error returns.
+A refresh operation can include multiple attempts if `retryCount` is specified. Each attempt has a default timeout of 5 hours, which you can adjust by using the `timeout` parameter. The total refresh duration, including retries, can't exceed 24 hours.
-If `retryCount` specifies `1` or another number, a new refresh operation with a five-hour limit starts. If this retry operation fails, the service continues to retry the refresh operation up to the greatest number of retries that `retryCount` specifies, or the enhanced refresh processing time limit of 24 hours from the beginning of the first refresh request.
+If `retryCount` specifies `1` or more, each failed attempt is followed by a new attempt with the same timeout limit. The service keeps retrying until the refresh succeeds, the `retryCount` limit is reached, or the 24-hour maximum from the first attempt is hit.
-When you plan your enhanced model refresh solution with the Refresh Dataset REST API, it's important to consider these time limits and the `retryCount` parameter. A successful refresh completion can exceed five hours if an initial refresh operation fails and `retryCount` specifies `1` or more.
+You can adjust the `timeout` to shorten the refresh duration for faster failure detection or extend it beyond the default 5 hours for more complex data refreshes.
-For example, if you request a refresh operation with `"retryCount": 1`, and the initial retry operation fails four hours from the start time, a second refresh operation for that request begins. If that second refresh operation succeeds in three hours, the total time for successful execution of the refresh request is seven hours.
+When planning your semantic model refresh with the Refresh Dataset REST API, consider these time limits and the `retryCount` parameter. A refresh can exceed the timeout if the initial attempt fails and `retryCount` is set to `1` or more.
For example, if you request a refresh with `"retryCount": 1`, and the first attempt fails after 4 hours, a second attempt begins. If this succeeds in 3 hours, the total time for the refresh is 7 hours.
-If refresh operations regularly fail, exceed the five-hour time limit, or exceed your desired successful refresh operation time, consider reducing the amount of data being refreshed from the data source. You can split refresh into multiple requests, for example a request for each table. You can also specify `partialBatch` in the `commitMode` parameter.
+If refresh operations regularly fail, exceed the timeout limit, or exceed your desired successful refresh operation time, consider reducing the amount of data being refreshed from the data source. You can split refresh into multiple requests, for example a request for each table. You can also specify `partialBatch` in the `commitMode` parameter.
## Code sample
diff --git a/powerbi-docs/enterprise/service-security-enable-data-sensitivity-labels.md b/powerbi-docs/enterprise/service-security-enable-data-sensitivity-labels.md
index dfd9e98e77..fea440e17b 100644
--- a/powerbi-docs/enterprise/service-security-enable-data-sensitivity-labels.md
+++ b/powerbi-docs/enterprise/service-security-enable-data-sensitivity-labels.md
@@ -1,5 +1,5 @@
---
-title: Enable sensitivity labels in Fabric
+title: Enable sensitivity labels in Fabric and Power BI
description: Learn how Fabric administrators can enable sensitivity labels in Fabric.
author: paulinbar
ms.author: painbar
@@ -8,7 +8,7 @@ ms.subservice: powerbi-eim
ms.topic: how-to
ms.date: 11/16/2023
---
-# Enable sensitivity labels in Fabric
+# Enable sensitivity labels in Fabric and Power BI
In order for [sensitivity labels from Microsoft Purview Information Protection](/microsoft-365/compliance/sensitivity-labels) to be used in Fabric and Power BI (including Power BI Desktop), they must be enabled on the tenant. This article shows Fabric admins how to do this.
For an overview about sensitivity labels in Fabric, see [Sensitivity labels in Fabric](service-security-sensitivity-label-overview.md). For information about applying sensitivity labels in Fabric, see [Apply sensitivity labels to Fabric items](/fabric/get-started/apply-sensitivity-labels)
diff --git a/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md b/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md
index 780b9b41fa..a0e160d364 100644
--- a/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md
+++ b/powerbi-docs/transform-model/log-analytics/desktop-log-analytics-faq.md
@@ -87,6 +87,9 @@ For workspace level configuration, you can add an Azure admin as a Power BI work
*Answer:* Azure Log Analytics bills storage, ingestion, and analytical queries independently. Cost also depends on the geographic region. It varies depending on how much activity is generated, how long you choose to store the data, and how often you query it. An average Premium capacity generates about 35 GB of logs monthly, but the storage size of logs can be higher for heavily utilized capacities. For more information, see the [pricing calculator](https://azure.microsoft.com/pricing/calculator/).
+### I've configured Log Analytics successfully, but I can't see the "PowerBIDatasetsWorkspace" table in my Log Analytics workspace. Why is that?
+*Answer:* This is expected behavior. The table is generated only after data is streamed to the Log Analytics workspace, for example, after a semantic model refresh.
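+
+To confirm whether events have started to arrive, you can run a quick query in the Log Analytics workspace. The following is a minimal example; it assumes the default `PowerBIDatasetsWorkspace` table and its standard `TimeGenerated` and `OperationName` columns:
+
+```kusto
+// Counts events ingested in the last day, grouped by operation type.
+// Returns no rows until at least one event has been streamed to the workspace.
+PowerBIDatasetsWorkspace
+| where TimeGenerated > ago(1d)
+| summarize EventCount = count() by OperationName
+| order by EventCount desc
+```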
+ ## Related content The following articles can help you learn more about Power BI, and about its integration with Azure Log Analytics.