Modified Notebooks according to feedback (#106)
* Create load-CSV-data-S3

* Added notebooks for Load data sections of UI

* Modified with suggested changes

* Modified with suggested changes

* Remove extra header

* Modified with suggested changes and changed Kai Credentials

* Modified with suggested changes and add JSON notebook

* Update notebook.ipynb

* Update notebook.ipynb

* Modified with pre-commit checks

---------

Co-authored-by: chetan thote <[email protected]>
Co-authored-by: Kevin D Smith <[email protected]>
3 people authored Jul 26, 2024
1 parent 5df9689 commit 6a39ef5
Showing 5 changed files with 577 additions and 26 deletions.
63 changes: 52 additions & 11 deletions notebooks/load-csv-data-s3/notebook.ipynb
@@ -81,7 +81,7 @@
"id": "2d22fd53-2c18-40e5-bb38-6d8ebc06f1b8",
"metadata": {},
"source": [
"## Create a database\n",
"## Create a database (You can skip this Step if you are using Free Starter Tier)\n",
"\n",
"We need to create a database to work with in the following examples."
]
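The hunk above shows only the heading and intro cell; the `CREATE DATABASE` statement itself sits in an unexpanded part of the diff. A minimal sketch of that step, assuming only the `SalesAnalysis` name that appears in the cleanup cell further down:

```sql
-- Sketch of the database-creation step (skip on the Free Starter Tier,
-- where a shared database is already provisioned).
CREATE DATABASE IF NOT EXISTS SalesAnalysis;
USE SalesAnalysis;
```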
@@ -161,6 +161,15 @@
"START PIPELINE SalesData_Pipeline;"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "a402a924-5e09-4213-88f6-2723b39ee2aa",
"metadata": {},
"source": [
"### It might take around 1 min to load data from S3 to SingleStore table"
]
},
{
"cell_type": "code",
"execution_count": 4,
@@ -169,7 +178,7 @@
"outputs": [],
"source": [
"%%sql\n",
"SELECT * FROM SalesData LIMIT 10"
"SELECT count(*) FROM SalesData"
]
},
{
@@ -296,28 +305,60 @@
"source": [
"## Conclusion\n",
"\n",
"<div class=\"alert alert-block alert-warning\">\n",
" <b class=\"fa fa-solid fa-exclamation-circle\"></b>\n",
" <div>\n",
" <p><b>Action Required</b></p>\n",
" <p> If you created a new database in your Standard or Premium Workspace, you can drop the database by running the cell below. Note: this will not drop your database for Free Starter Workspaces. To drop a Free Starter Workspace, terminate the Workspace using the UI. </p>\n",
" </div>\n",
"</div>\n",
"\n",
"We have shown how to insert data from a Amazon S3 using `Pipelines` to SingleStoreDB. These techniques should enable you to\n",
"integrate your Amazon S3 with SingleStoreDB."
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "83b2d1e6-58b8-493e-a698-2fd46e2ac5a1",
"metadata": {},
"source": [
"## Clean up"
]
},
{
"cell_type": "markdown",
"id": "f028e26e-66c0-44dc-9024-221687334301",
"metadata": {},
"source": [
"#### Drop Pipeline"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "f1f7b94f-2018-464e-9a28-b71cb89d65e3",
"metadata": {},
"outputs": [],
"source": [
"%%sql\n",
"STOP PIPELINE SalesData_Pipeline;\n",
"\n",
"DROP PIPELINE SalesData_Pipeline;"
]
},
{
"cell_type": "markdown",
"id": "33a246bd-36a3-4027-b44d-8c46768ff96d",
"metadata": {},
"source": [
"#### Drop Data"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "d5053a52-5579-4fea-9594-5250f6fcc289",
"metadata": {},
"outputs": [],
"source": [
"shared_tier_check = %sql show variables like 'is_shared_tier'\n",
"if not shared_tier_check or shared_tier_check[0][1] == 'OFF':\n",
" %sql DROP DATABASE IF EXISTS SalesAnalysis;"
" %sql DROP DATABASE IF EXISTS SalesAnalysis;\n",
"else:\n",
" %sql DROP TABLE SalesData;"
]
},
{
12 changes: 12 additions & 0 deletions notebooks/load-data-json/meta.toml
@@ -0,0 +1,12 @@
[meta]
authors=["chetan-thote"]
title="Employee Data Analysis JSON Dataset"
description="""\
The Employee Data Analysis use case illustrates how to leverage SingleStore's capabilities to process and analyze JSON data from an Amazon S3 data source.
"""
difficulty="beginner"
tags=["starter", "loaddata", "json"]
lesson_areas=["Ingest"]
icon="database"
destinations=["spaces"]
minimum_tier="free-shared"
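The JSON notebook added by this commit is not expanded in the diff. As a rough illustration only, a SingleStore pipeline ingesting JSON from S3 follows the same shape as the CSV one, using `FORMAT JSON` with subvalue mappings; every identifier below is an assumption, not taken from the notebook:

```sql
-- Hypothetical JSON pipeline; table, columns, and paths are placeholders.
CREATE PIPELINE employees_pipeline AS
LOAD DATA S3 's3://your-bucket/employees.json'
CONFIG '{"region": "us-east-1"}'
CREDENTIALS '{"aws_access_key_id": "...", "aws_secret_access_key": "..."}'
INTO TABLE employees
FORMAT JSON
(id <- id, name <- name, department <- department);
```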
