
Reorder sidebar #1787

Merged
merged 22 commits into from
Sep 14, 2024
Changes from 11 commits
14 changes: 1 addition & 13 deletions docs/website/docs/dlt-ecosystem/index.md
@@ -3,16 +3,4 @@ title: Integrations
description: List of integrations
keywords: ['integrations', 'sources', 'destinations']
---
import DocCardList from '@theme/DocCardList';
import Link from '../_book-onboarding-call.md';

Speed up the process of creating data pipelines by using dlt's multiple pre-built sources and destinations:

- Each [dlt verified source](verified-sources) allows you to create [pipelines](../general-usage/pipeline) that extract data from a particular source: a database, a cloud service, or an API.
- [Destinations](destinations) are where you want to load your data. dlt supports a variety of destinations, including databases, data warehouses, and data lakes.

<DocCardList />

:::tip
Most source-destination pairs work seamlessly together. If the merge [write disposition](../general-usage/incremental-loading#choosing-a-write-disposition) is not supported by a destination (for example, the [filesystem destination](destinations/filesystem)), dlt will automatically fall back to the [append](../general-usage/incremental-loading#append) write disposition.
:::
> Could not remove this page, please ignore it for now.
@@ -1,6 +1,6 @@
---
title: Filesystem
description: dlt verified source for Readers Source and Filesystem
description: AWS S3, Google Cloud Storage, Azure Blob Storage, local files
keywords: [readers source and filesystem, filesystem, readers source]
---
import Header from './_source-info-header.md';
@@ -123,7 +123,7 @@ For more information, read the

3. You can pass the bucket URL and glob pattern or use `config.toml`. For local filesystems, use
`file://` as follows:

```toml
[sources.filesystem] # use [sources.readers.credentials] for the "readers" source
bucket_url='file://Users/admin/Documents/csv_files'
file_glob="*"
```

@@ -136,7 +136,7 @@

```toml
bucket_url='~\Documents\csv_files\'
file_glob="*"
```

In the example above, we use a Windows path to the current user's Documents folder. Note that a literal TOML string (single quotes) was used so the backslashes don't need to be escaped.

46 changes: 30 additions & 16 deletions docs/website/docs/dlt-ecosystem/verified-sources/index.md
@@ -1,31 +1,45 @@
---
title: Verified sources
description: List of verified sources
keywords: ['verified source']
title: Sources
> **Collaborator comment:** one note about this page: we also have a couple of templates now as demonstrated on Thursday. @AstrakhantsevaAA knows all about this :)

description: Available sources
keywords: ['source']
---
import DocCardList from '@theme/DocCardList';
import Link from '../../_book-onboarding-call.md';
import DocCardList from '@theme/DocCardList';
import {useCurrentSidebarCategory} from '@docusaurus/theme-common';

Choose from our collection of verified sources, developed and maintained by the dlt team and community. Each source is rigorously tested against a real API and provided as Python code for easy customization.

Planning to use dlt in production and need a source that isn't listed? We're happy to help you build it: <Link />.
Planning to use dlt in production and need a source that isn't listed? We're happy to help you build it: <Link/>.

### Popular sources
### Core sources

- [SQL databases](sql_database). Supports PostgreSQL, MySQL, MS SQL Server, BigQuery, Redshift, and more.
- [REST API generic source](rest_api). Loads data from REST APIs using declarative configuration.
- [OpenAPI source generator](openapi-generator). Generates a source from an OpenAPI 3.x spec using the REST API source.
- [Cloud and local storage](filesystem). Retrieves data from AWS S3, Google Cloud Storage, Azure Blob Storage, local files, and more.
<DocCardList items={useCurrentSidebarCategory().items.filter(
item => item.label === '30+ SQL Databases' || item.label === 'REST API generic source' || item.label === 'Filesystem'
)} />
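The "declarative configuration" mentioned for the REST API generic source can be sketched as a plain Python dictionary. The base URL, endpoint names, and parameters below are hypothetical, but the shape (a `client` section plus a list of `resources`) follows the style that source accepts:

```python
# Hypothetical declarative config for the REST API generic source.
# It would be passed to rest_api_source(config) to build a dlt source.
config = {
    "client": {
        "base_url": "https://api.example.com/v1/",
    },
    "resources": [
        {
            "name": "issues",
            "endpoint": {
                "path": "issues",
                "params": {"state": "open"},
            },
        },
        # Shorthand: a bare string uses the name as the endpoint path.
        "contributors",
    ],
}

resource_names = [r if isinstance(r, str) else r["name"] for r in config["resources"]]
print(resource_names)
```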

### Full list of verified sources
### Verified sources

<DocCardList />
Choose from our collection of verified sources, developed and maintained by the dlt team and community. Each source is rigorously tested against a real API and provided as Python code for easy customization.

:::tip
If you're looking for a source that isn't listed and it provides a REST API, be sure to check out our [REST API generic source](rest_api).
If you can't find a source implementation, you can easily create your own; check our [tutorial](../../tutorial/grouping-resources) to learn how!
:::

<DocCardList items={useCurrentSidebarCategory().items.filter(
item => item.label !== '30+ SQL Databases' && item.label !== 'REST API generic source' && item.label !== 'Filesystem'
)} />

### What's the difference?

The main difference between the [core sources](#core-sources) and [verified sources](#verified-sources) lies in their structure.
Core sources are generic collections, meaning they can connect to a variety of systems. For example, the [SQL Database source](sql_database) can connect to any
database that SQLAlchemy supports.

According to our telemetry, core sources are the most widely used among our users!

It's also important to note that core sources are integrated into the `dlt` core library,
whereas verified sources are maintained in a separate [repository](https://github.com/dlt-hub/verified-sources).
To use a verified source, run the `dlt init` command, which downloads the verified source code into your working directory.


### Get help

@@ -1,6 +1,6 @@
---
title: REST API generic source
description: dlt verified source for REST APIs
description: Loads data from REST APIs using declarative configuration
keywords: [rest api, restful api]
---
import Header from './_source-info-header.md';
@@ -1,6 +1,6 @@
---
title: 30+ SQL Databases
description: dlt pipeline for SQL Database
description: PostgreSQL, MySQL, MS SQL Server, BigQuery, Redshift, and more
keywords: [sql connector, sql database pipeline, sql database]
---
import Header from './_source-info-header.md';