
Commit

Updated: Build a Data Lake with Autonomous Data Warehouse LiveLabs workshop (#319)

* updates

* updates to Introduction lab

* Update introduction.md

* LiveLabs workshop version and other freetier changes

* Update introduction.md

* Update introduction.md

* livelabs and freetier changes

* Update load-analyze-json.md

* Update load-analyze-rest.md

* more changes

* more updates

* more updates

* Update adw-dcat-integrate.md

* changed shared with serverless as deployment type

* Update load-local-data.md

* update shared to serverless

* update adb-dcat workshop

* added new lab 12

* Update manifest.json

* added new folder with an .md file in it

Please work!

* more testing

* Delete introduction.md

* new workshop and labs wip

* update labs

* more updates

* more updates

* more updates

* updates

* more updates

* updates before review cycle

* Update endpoint.png

* Update setup-workshop-environment.md

* Update setup-workshop-environment.md

* more updates

* final update before review

* updates

* replacement code

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* Update create-share-recipients.md

* updates

* Update manifest.json

* folder rename

* added content to data studio folder

* Delete user-bucket-credential-diagram.png

* updates self-qa

* Update introduction.md

* remove extra text files

* Update introduction.md

* Update setup-workshop-environment.md

* Data Studio Workshop Changes

* changes to data studio workshop

* Update setup-workshop-environment.md

* adb changes

* Update recipient-diagram.png

* diagram change

* Update user-bucket-credential-diagram.png

* SME feedback

* Update create-share.md

* Nilay changes

* changes

* Update consume-share.md

* Anoosha's feedback

* Update consume-share.md

* updated 2 screens and a sentence

* minor changes

* deleted extra images and added doc references

* new ECPU changes

* more changes to data sharing workshops

* more changes to fork (data studio)

* more changes

* Marty's feedback

* Marty's feedback to plsql workshop too

* Update setup-workshop-environment.md

* Delete 7381.png

* workshop # 3 ADB set up

and a couple of minor typos in workshops 1 and 2

* changes to adb-dcat workshop

* more changes

* minor typos in all 4 workshops

* quarterly qa build data lake

* new lab 11 in build DL with ADW

* minor changes database actions drop-down list

* final changes to build data lake workshop

* AI updates

AI workshop updates

* ai workshop updates

* Update query-using-select-ai.md

* Update query-using-select-ai.md

* updates

* more updates

* Update query-using-select-ai.md

* more new updates to ai workshop

* Update query-using-select-ai.md

* a new screen capture

* push Marty's feedback to fork

Final changes.

* updates sandbox manifest

* updates

* restored sandbox manifest

* Update setup-environment.md

* updates after CloudWorld

* final updates to ai workshop (also new labs 4 and 5)

* marty's feedback

* incorporated feedback

* minor PR edits by Sarah

* removed steps 7 & 8 Lab 2 > Task 3 per Alexey

The customer asked to remove this as it's not a requirement for the bucket to be public.

* more changes

* more changes per Alexey

* Update load-os-data-public.md

* Quarterly QA

I added a new step per the PM's request in the Data Sharing PL/SQL workshop. I also made a minor edit (removed space) in the Data Lake workshop.

* more updates

* Quarterly QA changes

* Update consume-share.md

* minor edit based on workshop user

* quarterly qa November 2023

* Added new videos to the workshop

Replaced 3 old silent videos with new ones. Added two new videos.

* Adding important notes to the two data sharing workshops

Per the PM's request.

* folder structure only  push to production

This push and the PR later is to make sure the folder structure is in the production repo before I start development. Only 1 .md file and the workshops folder.

* typos

* cloud links workshop

* UPDATES

* Update query-view.png

* update

* minor updates to chat ai workshop (Fork)

* test clones

* test pr

* Alexey's feedback

* Update data-sharing-diagram.png

* sarah's edits

* changes to Data Load UI

* removed script causing ML issue

* Update load-local-data.md

* updates: deprecated procedure and new code

* updates and test

* more updates

* minor update

* testing using a building block in a workshop

* updates

* building blocks debugging

* Update manifest.json

* fixing issues

* Update manifest.json

* delete cleanup.md from workshop folder (use common file)

* use common cleanup.md instead of local cleanup.md

* test common tasks

* update data sharing data studio workshop

* Update create-recipient.png

* PM's 1 feedback

* quarterly qa

* missing "Lab 2" from Manifest

* always free note addition

added a note

* always free change

* Update setup-environment.md

* update manage and monitor workshop

* Folder structure for new data share workshop (plus introduction.md)

* Updated Load and Analyze from clone

* Data Lake minor changes from clone

* manage and monitor workshop

* Remove the lab from the workshop per Marty's request

* mark-hornick-feedback

* used marty's setup file

* replaced notebook with a new one

* updates to lab 6 of manage and monitor

* Update adb-auto-scaling.md

* Nilay's feedback

* Update adb-auto-scaling.md

* updates to second ai workshop

* note change

* Changes to Load and Analyze workshop (other minor changes too)

* quarterly qa

* Update diagrams per Alexey (remove delta share icon)

* updated the 15-minutes workshop

* Update analyzing-movie-sales-data.md

* ords updates and misc

* updated data studio workshop

* ORDS and Misc updates

* updated freetier version

* updated livelabs version

---------

Co-authored-by: Michelle Malcher <[email protected]>
Co-authored-by: Sarah Hirschfeld <[email protected]>
3 people authored May 2, 2024
1 parent 537b278 commit 663434f
Showing 101 changed files with 180 additions and 244 deletions.
@@ -76,8 +76,13 @@ In this task, you will navigate to an AWS Glue Data Catalog instance and explore
2. Open the **Navigation** menu and click **Oracle Database**. Under **Oracle Database**, click **Autonomous Database**.
<if type="livelabs">
3. On the **Autonomous Databases** page, click your **DB-DCAT** ADB instance.
</if>
<if type="freetier">
3. On the **Autonomous Databases** page, click your **ADW-Data-Lake** ADB instance.
![The Autonomous Database is displayed and highlighted.](./images/adb-page.png " ")
</if>
4. On the **Autonomous Database details** page, click the **Database actions** drop-down list, and then click **SQL**.
@@ -193,7 +198,7 @@ In this task, you will create the required credentials in Autonomous Database th
2. Inspect the new protected schema in Autonomous Database that was created by the sync operation and the two external tables in that schema that were created on top of the two tables in the AWS S3 bucket. In the **Navigator** tab on the left, search for the schema with a name that starts with **`GLUE`**, **`$`**, the AWS Glue connection name, **`AWS_GLUE_CONN`**, the AWS Glue database name, **`PARQ`**, followed by other strings. The generated schema name in this example is **`GLUE$AWS_GLUE_CONN_PARQ_TPCDS_ORACLE_PARQ`**. This schema contains the **`ITEM`** and **`STORE`** external tables.
>**Note:** You might have to click the **Refresh** icon in the **Navigator** tab before you can see the newly created protected schema. If the Refresh doesn't work, select **Sign Out** from the **`ADMIN`** drop-down list, and then click **Leave**. Next, on the **Sign-in** page, sign in as the **admin** user. On the **Launchpad**, in the **Development** section, click the **SQL** card.
>**Note:** You might have to click the **Refresh** icon in the **Navigator** tab before you can see the newly created protected schema. If the Refresh doesn't work, select **Sign Out** from the **`ADMIN`** drop-down list, and then click **Leave**. Next, on the **Sign-in** page, sign in as the **admin** user. On the **Launchpad** page, click the **Development** tab, and then click the **SQL** tab to display the SQL Worksheet.
![Explore protected schema](images/explore-protected-schema.png)
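The generated schema name described above follows a composable pattern: `GLUE$`, the connection name, and the Glue database name, followed by a generated suffix. As a small illustrative sketch (the trailing suffix is produced by the sync job itself, so only the prefix is modeled here; the function name is hypothetical):

```python
# Hypothetical sketch of the sync-generated protected schema naming pattern:
# "GLUE$" + <connection name> + "_" + <Glue database name> + <generated suffix>.
# Only the documented prefix is modeled; the suffix is produced by the sync job.

def glue_schema_prefix(connection_name: str, database_name: str) -> str:
    """Return the expected prefix of a sync-generated protected schema name."""
    return f"GLUE${connection_name}_{database_name}"

# The example schema from this task:
schema = "GLUE$AWS_GLUE_CONN_PARQ_TPCDS_ORACLE_PARQ"
prefix = glue_schema_prefix("AWS_GLUE_CONN", "PARQ")
print(schema.startswith(prefix))  # True
```

This is only a naming-convention check, not an API call; the actual schema name is assigned by Autonomous Database during the sync operation.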
@@ -209,15 +214,11 @@ In this task, you will create the required credentials in Autonomous Database th
![Run store query](images/run-store-query.png)
6. Let's examine the Data Definition Language (DDL) code for the **`Store`** external table. In the **Navigator** tab, make sure that the newly created schema is selected. Next, right-click **`STORE`**, and then select **Edit** from the context menu.
![Examine the Store external table](images/edit-store.png)
7. In the **Table Properties** panel, in the **External Table** tab, click the **DDL** category.
6. Let's examine the Data Definition Language (DDL) code for the **`Store`** external table. In the **Navigator** tab, make sure that the newly created schema is selected. Next, right-click **`STORE`**, and then select **Quick DDL > Save to File** from the context menu.
![Click DDL](images/click-ddl.png)
![Examine the Store external table](images/ddl-file.png =60%x*)
8. Click the **Create** tab, and then scroll-down to the bottom of the **`CREATE TABLE`** command. The **`Location`** parameter indicates that the source data for the **`store`** external table is in AWS S3.
7. Open the downloaded text file, and then scroll down to the bottom of the **`CREATE TABLE`** command. The **`Location`** parameter indicates that the source data for the **`store`** external table is in AWS S3.
![Location of the data](images/store-location.png)
@@ -232,15 +233,13 @@ You may now proceed to the next lab.
## Acknowledgements
* **Author:**
* Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:**
+ Alexey Filanovskiy, Senior Principal Product Manager
* **Last Updated By/Date:** Lauran K. Serhal, November 2023
* **Author:** Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:** Alexey Filanovskiy, Senior Principal Product Manager
* **Last Updated By/Date:** Lauran K. Serhal, April 2024
Data about movies in this workshop were sourced from Wikipedia.
Copyright (C) Oracle Corporation.
Copyright (C) 2024 Oracle Corporation.
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3
@@ -65,6 +65,10 @@ This is not a hands-on task. In this task, you will navigate to an OCI Data Cata

In this task, you will learn how to register an OCI Data Catalog instance in ADW in order to connect to that instance and link to the data asset's entities to create external tables in ADW.

If you already accessed the SQL Worksheet in the previous lab, click **Database Actions | SQL** in the banner to display the **Launchpad** page. Click the **Data Studio** tab, and then click the **Data Load** tab to display the **Data Load Dashboard**. You can then skip to **Step 5**.

If you closed the Web browser tab where the SQL Worksheet was displayed, navigate to the **Data Load** page as follows:

1. Log in to the **Oracle Cloud Console**, if you are not already logged in as the Cloud Administrator.

2. Open the **Navigation** menu and click **Oracle Database**. Under **Oracle Database**, click **Autonomous Database**.
@@ -81,7 +85,7 @@ In this task, you will learn how to register an OCI Data Catalog instance in ADW

4. On the **Autonomous Database details** page, click the **Database actions** drop-down list, and then select **Data Load**.

5. In the **Administration** section, click **CONNECTIONS** to display the **Connections** page.
5. Click the **CONNECTIONS** tile.

![Click the Connections card.](./images/click-connections.png " ")

@@ -93,10 +97,11 @@ In this task, you will learn how to register an OCI Data Catalog instance in ADW

* **Catalog Name:** Enter a meaningful name. **Note:** The name must conform to the Oracle object naming conventions, which do not allow spaces or **hyphens**.
* **Description:** Enter an optional description.
* **Register Data Catalog Connection:** Select this checkbox.
* **Select Credential:** Select your **`OBJ_STORAGE_CRED`** credential that you created in **Lab 5 > Task 6** from the drop-down list. This could take a minute.
* **Catalog Type:** Accept the default **OCI Data Catalog** from the drop-down list.
* **Credential for Data Catalog Connection:** Select your **`OBJ_STORAGE_CRED`** credential that you created in **Lab 5 > Task 6** from the drop-down list. This could take a minute.
* **Region:** Your region should be already selected after you chose your credential.
* **Data Catalog ID:** If you have several Data Catalog instances, select the Data Catalog instance that you want to register from the drop-down list.
* **Data Catalog ID:** If you have several Data Catalog instances, select the Data Catalog instance that you want to register from the drop-down list. In this workshop, you only have one Data Catalog instance.
* **Register Data Catalog Connection:** Click this field to enable it.

![The completed Register Data Catalog panel is displayed.](./images/register-data-catalog-panel.png " ")

@@ -116,43 +121,43 @@ In this task, you will learn how to register an OCI Data Catalog instance in ADW

In this task, you will link to data assets from the registered Data Catalog and create two external tables in your Autonomous Database instance based on those data assets.
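The renaming applied later in this task follows a simple convention: each linked asset becomes an external table named after the asset, upper-cased, with a `_DCAT` suffix. A tiny sketch of that convention (the helper function is illustrative, not part of any Oracle API; the asset names come from the task steps themselves):

```python
# Sketch of the naming convention used in this task: each linked Data Catalog
# asset becomes an external table named <ASSET_NAME_UPPERCASED>_DCAT.
# The function below is illustrative only, not an Oracle API.

def external_table_name(asset_name: str) -> str:
    return f"{asset_name.upper()}_DCAT"

assets = ["customer_promotions", "moviestream_churn"]
tables = {a: external_table_name(a) for a in assets}
print(tables["customer_promotions"])  # CUSTOMER_PROMOTIONS_DCAT
```

Keeping a distinct suffix like `_DCAT` makes it easy to tell Data Catalog-linked external tables apart from tables loaded by other jobs.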

1. Click **Oracle Database Actions** in the banner to display the Launchpad landing page.
1. Click the **Data Load** link in the breadcrumbs to display the Data Load page.

2. In the **Data Studio**, click **DATA LOAD**.
2. Click the **LINK DATA** tile.

3. In the **What do you want to do with your data?** section, click **LINK DATA**.
![Click Connections.](./images/click-link-data.png " ")

4. In the **Where is your data?** section, select **CLOUD STORE**, and then click **Next**.
3. The **Link Data** page is displayed. You will use this page to drag and drop tables from the registered Data Catalog instance to the data linking job area. Click the **Data Catalog** tab. Next, select the **`REGISTER_DCAT_INSTANCE`** connection that you created from the **Select Cloud Store Location or enter public URL** drop-down list.

5. The **Link Cloud Object** page is displayed. You will use this page to drag and drop tables from the registered Data Catalog instance to the data linking job area. Click the drop-down list. Under the **Catalog Locations** category, select the **`REGISTER_DCAT_INSTANCE`** that you created.
![Drag the data assets from the Data Catalog bucket onto the linking job section.](images/click-data-catalog-tab.png)

![Drag the data assets from the Data Catalog bucket onto the linking job section.](images/drag-and-drop-assets.png)

6. Drag the **`customer_promotions`** and **`moviestream_churn`** data assets from the **moviestream\_sandbox** bucket in our Data Catalog instance, and drop them onto the data linking job section.
4. Drag the **`customer_promotions`** and **`moviestream_churn`** data assets from the **moviestream\_sandbox** bucket in our Data Catalog instance, and drop them onto the data linking job section.

![Click the pencil icon to open settings viewer for customer_contact load task](images/customer-promotions-settings.png)

7. Let's change the default name for the **`customer_promotions`** external table that will be created. Click the **Actions** icon (3-dot vertical ellipsis) for the **`customer_promotions`** link task, and then select **Settings** from the context menu. The **Link Data from Cloud Store Location customer\_promotions** settings panel is displayed. Change the external table name to **`CUSTOMER_PROMOTIONS_DCAT`**, and then click **Close**.
5. Let's change the default name for the **`customer_promotions`** external table that will be created. Click the **Settings** icon (pencil) for the **`customer_promotions`** link task. The **Link Data from Cloud Store Location customer\_promotions** settings panel is displayed. Change the external table name to **`CUSTOMER_PROMOTIONS_DCAT`**, and then click **Close**.

![Change name of the external table.](images/customer-promotions-dcat.png =65%x*)

![Change name of the external table.](images/customer-promotions-dcat.png)
6. Let's change the default name for the **`moviestream_churn`** external table that will be created. Click the **Settings** icon (pencil) for the **`moviestream_churn`** link task. The **Link Data from Cloud Store Location moviestream\_churn** settings panel is displayed. Change the name of the external table to **`MOVIESTREAM_CHURN_DCAT`**, and then click **Close**.

8. Let's change the default name for the **`moviestream_churn`** external table that will be created. Click the **Actions** icon (3-dot vertical ellipsis) for the **`moviestream_churn`** link task, and then select **Settings** from the context menu to view the settings for this task. The **Link Data from Cloud Store Location moviestream\_churn** settings panel is displayed. Change the name of the external table name to **`MOVIESTREAM_CHURN_DCAT`**, and then click **Close**.
![Change name of the external table.](images/moviestream-churn-dcat.png =65%x*)

9. Click **Start** to run the data link job. In the **Run Data Load Job** dialog box, click **Run**.
7. Click **Start** to run the data link job. In the **Run Data Load Job** dialog box, click **Run**.

![Run the data link job](images/run-data-link.png)
![Run the data link job](images/run-data-link.png =65%x*)

10. After the load job is completed, make sure that all of the data link cards have green check marks next to them. This indicates that your data link tasks have completed successfully.
8. After the load job is completed, make sure that the two data link cards have the link icon next to them. This indicates that your data link tasks have completed successfully.

![The Link job tasks completed.](images/link-completed.png)

11. Click **Oracle Database Actions** in the banner to display the Launchpad landing page. In the **Development** section, click **SQL** to display your SQL Worksheet.
9. Click **Database Actions | Data Load** in the banner to display the **Launchpad** page. Click the **Development** tab, and then click the **SQL** tab to display your SQL Worksheet.

12. From the **Navigator** pane, drag and drop the newly created **`CUSTOMER_PROMOTIONS_DCAT`** external table onto the Worksheet. In the **Choose Type of insertion**, click **Select**, and then click **Apply**.
10. From the **Navigator** pane, drag and drop the newly created **`CUSTOMER_PROMOTIONS_DCAT`** external table onto the Worksheet. In the **Choose Type of insertion**, click **Select**, and then click **Apply**.

![Drag and drop the external table onto the worksheet.](images/drag-drop-external-table.png)

13. The auto-generated query is displayed in the worksheet. Click the **Run Statement** icon to run the query. The results are displayed. You just accessed the data in the registered Data Catalog instance!
11. The auto-generated query is displayed in the worksheet. Click the **Run Statement** icon to run the query. The results are displayed. You just accessed the data in the registered Data Catalog instance!

![Run the query.](images/run-query.png)

@@ -172,15 +177,13 @@ You may now proceed to the next lab.

## Acknowledgements

* **Author:**
* Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:**
+ Alexey Filanovskiy, Senior Principal Product Manager
* **Last Updated By/Date:** Lauran K. Serhal, January 2024
* **Author:** Lauran K. Serhal, Consulting User Assistance Developer
* **Contributor:** Alexey Filanovskiy, Senior Principal Product Manager
* **Last Updated By/Date:** Lauran K. Serhal, April 2024

Data about movies in this workshop were sourced from Wikipedia.

Copyright (C) Oracle Corporation.
Copyright (C) 2024 Oracle Corporation.

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3
@@ -39,35 +39,15 @@ Our movie data set has a series of columns that contain different types of detai

JSON data is organized very differently than typical warehouse data. There is a single entry for **producer** but the corresponding key **names** actually has multiple values. This is referred to as an **array** - specifically a JSON array.
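The producer/names shape described above is easy to see in a tiny standalone example (the record below is illustrative, not the actual movie data set):

```python
import json

# Illustrative record shaped like the movie documents discussed above:
# a single "producer" entry whose "names" key holds a JSON array.
doc = json.loads("""
{
  "title": "Example Movie",
  "producer": { "names": ["Producer One", "Producer Two"] }
}
""")

names = doc["producer"]["names"]   # a Python list, i.e. a JSON array
print(len(names))  # 2
```

One key, many values: this is exactly why JSON collections need array-aware access (dot notation, `JSON_TABLE`, and so on) rather than a flat column mapping.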

>**Note:**
If you already have the **Oracle Database Actions** browser tab open from the previous lab, click the **Database Actions | Launchpad** banner to display the **Database Actions | Launchpad** Home page. In the **Development** section, click the **SQL** card to display the SQL Worksheet. Next, skip to **step 6**; otherwise, start with **step 1** below.
1. You should be already on the **Data Load Dashboard** page from the previous lab. Click **Database Actions | Data Load** in the banner.

![Click the banner.](./images/click-banner.png " ")
![Click the banner.](./images/click-banner.png " ")

1. Log in to the **Oracle Cloud Console**, if you are not already logged as the Cloud Administrator.
2. On the **Database Actions | Launchpad** page, click the **Development** tab, and then click the **SQL** tab.

2. Open the **Navigation** menu and click **Oracle Database**. Under **Oracle Database**, click **Autonomous Database**.

<if type="livelabs">
3. On the **Autonomous Databases** page, click your **DB-DCAT** ADB instance.
![On the Autonomous Databases page, the Autonomous Database that is assigned to your LiveLabs workshop reservation is displayed.](./images/ll-adb-page.png " ")
</if>

<if type="freetier">
3. On the **Autonomous Databases** page, click your **ADW-Data-Lake** ADB instance.
![On the Autonomous Databases page, the Autonomous Database that you provisioned is displayed and highlighted.](./images/adb-page.png " ")
</if>

4. On the **Autonomous Database details** page, click the **Database actions** drop-down list, and then click **SQL**.

<if type="freetier">
![The Database Actions button is highlighted.](./images/click-db-actions.png " ")
</if>

5. The SQL Worksheet is displayed.

![The SQL worksheet is displayed.](./images/sql-worksheet-displayed.png " ")
![The SQL worksheet is displayed.](./images/click-development-sql-tabs.png " ")

The SQL Worksheet is displayed.
6. Use the Autonomous Database ``DBMS_CLOUD.COPY_COLLECTION`` procedure to create and load the movie collection from object storage. Copy and paste the following script into your SQL Worksheet, and then click the **Run Script (F5)** icon in the Worksheet toolbar.
```
<copy>
@@ -272,4 +252,4 @@ You may now proceed to the next lab.
* **Author** - Marty Gubar, Autonomous Database Product Management
* **Contributor:** Lauran K. Serhal, Consulting User Assistance Developer
* **Last Updated By/Date:** Lauran K. Serhal, February 2024
* **Last Updated By/Date:** Lauran K. Serhal, April 2024
@@ -83,21 +83,15 @@ Create a **News API** account as follows:

Now that you have the API key, create a PL/SQL function that queries the REST endpoint using the parameters above.

1. Log in to the **Oracle Cloud Console**, if you are not already logged as the Cloud Administrator.
1. Navigate to the SQL Worksheet. Click **Database Actions | Data Load** in the banner.

2. Open the **Navigation** menu and click **Oracle Database**. Under **Oracle Database**, click **Autonomous Database**.
![Click Data Load in the banner.](./images/click-data-load-banner.png " ")

<if type="livelabs">
3. On the **Autonomous Databases** page, click your **DB-DCAT** ADB instance.
</if>
2. On the **Launchpad** page, click the **Development** tab, and then click the **SQL** tab.

<if type="freetier">
3. On the **Autonomous Databases** page, click your **ADW-Data-Lake** ADB instance.
</if>
![Click Development > SQL tabs.](./images/click-development-sql-tabs.png " ")

4. On the **Autonomous Database details** page, click the **Database actions** drop-down list, and then click **SQL**.

5. The SQL Worksheet is displayed.
The SQL Worksheet is displayed.

![The SQL worksheet is displayed.](./images/sql-worksheet-displayed.png " ")
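Before wrapping the call in PL/SQL, it can help to see the REST request the function will issue. A minimal Python sketch that builds a News API query URL (the endpoint path and parameter names are assumptions based on the public News API; the key is a placeholder, and no network call is made here):

```python
from urllib.parse import urlencode

# Build a News API "everything" query URL. The base URL and parameter names
# (q, pageSize, apiKey) are assumptions based on the public News API docs;
# substitute your own API key. This only constructs the URL.
def build_news_url(query: str, api_key: str, page_size: int = 20) -> str:
    base = "https://newsapi.org/v2/everything"
    params = urlencode({"q": query, "pageSize": page_size, "apiKey": api_key})
    return f"{base}?{params}"

url = build_news_url("autonomous database", "YOUR_API_KEY")
print(url)
```

The PL/SQL function in the next step assembles the same kind of URL and fetches it server-side from Autonomous Database.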

@@ -351,4 +345,4 @@ You may now proceed to the next lab.
* **Authors:**
* Marty Gubar, Autonomous Database Product Management
* Lauran K. Serhal, Consulting User Assistance Developer
* **Last Updated By/Date:** Lauran K. Serhal, February 2024
* **Last Updated By/Date:** Lauran K. Serhal, April 2024

0 comments on commit 663434f
