
Commit

Fix links
petedejoy committed Mar 4, 2019
1 parent 756b85a commit 3bd74ca
Showing 8 changed files with 10 additions and 10 deletions.
6 changes: 3 additions & 3 deletions guides/facebook-ads-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/FBToRedshift_pre
tags: ["Building DAGs", "Redshift", "Facebook Ads"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your ad data from Facebook Ads to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your ad data from Facebook Ads to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

@@ -51,7 +51,7 @@ Before we get started, be sure you have the following on hand:

### 1. Add Connections in Airflow UI

Begin by creating all of the necessary connections in your Airflow UI. To do this, log into your Airflow dashboard and navigate to Admin-->Connections. In order to build this pipeline, you’ll need to create a connection to your Facebook Ads account, your S3 bucket, and your Redshift instance. For more info on how to fill out the fields within your connections, check out our [documentation here](https://docs.astronomer.io/v2/apache_airflow/tutorial/connections.html).
Begin by creating all of the necessary connections in your Airflow UI. To do this, log into your Airflow dashboard and navigate to Admin-->Connections. In order to build this pipeline, you’ll need to create a connection to your Facebook Ads account, your S3 bucket, and your Redshift instance. For more info on how to fill out the fields within your connections, check out our [documentation here](https://astronomer.io/guides/connections).
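
Besides the Admin-->Connections UI described above, Airflow can also read connections from `AIRFLOW_CONN_<ID>` environment variables formatted as URIs, which is handy for scripted setups. A minimal sketch of building such a URI; the connection IDs, host, and credentials here are placeholders, not values from the guide:

```python
import os
from urllib.parse import quote

def set_airflow_conn(conn_id, conn_type, host="", login="",
                     password="", schema="", port=None):
    """Expose a connection to Airflow via the AIRFLOW_CONN_<ID> environment
    variable, which Airflow parses as conn_type://login:password@host:port/schema."""
    netloc = ""
    if login:
        netloc = quote(login, safe="")
        if password:
            netloc += ":" + quote(password, safe="")
        netloc += "@"
    netloc += host
    if port:
        netloc += ":%d" % port
    uri = "%s://%s/%s" % (conn_type, netloc, schema)
    os.environ["AIRFLOW_CONN_%s" % conn_id.upper()] = uri
    return uri

# Placeholder Redshift connection (Redshift speaks the Postgres protocol):
set_airflow_conn("redshift_default", "postgres",
                 host="example.redshift.amazonaws.com",
                 login="airflow", password="s3cret",
                 schema="analytics", port=5439)
```

The same helper works for the S3 and Facebook Ads connections; match each `conn_id` to whatever the DAG expects.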

### 2. Clone the plugin

@@ -82,4 +82,4 @@ REDSHIFT_SCHEMA = ''

Once you have those credentials plugged into your DAG, test and deploy it!
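
Only `REDSHIFT_SCHEMA` appears in the hunk above; the other variable names and all values below are illustrative assumptions about what a module-level credentials block for this DAG might look like:

```python
# Credentials the DAG reads at parse time -- fill these in before deploying.
# REDSHIFT_SCHEMA comes from the guide; every other name here is a placeholder.
S3_CONN_ID = "s3_default"              # Airflow connection ID for the S3 bucket
S3_BUCKET = "my-etl-staging"           # bucket that stages the extracted data
REDSHIFT_CONN_ID = "redshift_default"  # Airflow connection ID for Redshift
REDSHIFT_SCHEMA = "facebook_ads"       # target schema in Redshift

def credentials_complete():
    """Sanity check: no credential may be left as an empty string."""
    return all([S3_CONN_ID, S3_BUCKET, REDSHIFT_CONN_ID, REDSHIFT_SCHEMA])
```

Running a check like `credentials_complete()` before deploying catches the common mistake of leaving one of these as `''`.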

If you don't have Airflow already set up in your production environment, head over to [our app](https://app.astronomer.io/signup) to get spun up with your own managed instance!
If you don't have Airflow already set up in your production environment, head over to [our getting started guide](https://astronomer.io/docs/getting-started) to get spun up with your own managed instance!
2 changes: 1 addition & 1 deletion guides/github-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/GithubToRedshift
tags: ["Building DAGs", "Redshift", "Github"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from Github to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from Github to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides/hubspot-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/HubspotToRedshif
tags: ["Building DAGs", "Redshift", "Hubspot"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your CRM data from Hubspot to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your CRM data from Hubspot to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides/imap-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: null
tags: ["Building DAGs", "Redshift", "IMAP"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from an IMAP server to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from an IMAP server to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides/marketo-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/MarketoToRedshif
tags: ["Building DAGs", "Redshift", "Marketo"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from Marketo to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from Marketo to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides/mongo-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/MongoDBToRedshif
tags: ["Building DAGs", "Redshift", "MongoDB"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from your MongoDB to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your data from your MongoDB to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides/salesforce-to-redshift.md
@@ -7,7 +7,7 @@ heroImagePath: "https://assets.astronomer.io/website/img/guides/SalesforceToReds
tags: ["Building DAGs", "Redshift", "Salesforce"]
---

In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your CRM data from Salesforce to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).
In this guide, we’ll explore how you can use [Apache Airflow](https://airflow.apache.org/) to move your CRM data from Salesforce to Redshift. Note that this is an effective and flexible alternative to point-and-click ETL tools like [Segment](https://segment.com), [Alooma](https://alooma.com), [Xplenty](https://www.xplenty.com), [Stitch](https://stitchdata.com), and [ETLeap](https://etleap.com/).

Before we get started, be sure you have the following on hand:

2 changes: 1 addition & 1 deletion guides_in_progress/apache_airflow/tutorial/sample-dag.md
@@ -128,4 +128,4 @@ t1 >> t2
t2 >> t3
~~~

With that, we have successfully written an Airflow DAG! Head [here](/v2/apache_airflow/tutorial/dag-deployment) to see how to deploy this DAG. Or, check out our docs on [example DAGS](https://github.com/astronomerio/example-dags) and [DAG Writing Best Practices](/v2/apache_airflow/tutorial/best-practices).
With that, we have successfully written an Airflow DAG! Head [here](https://astronomer.io/docs/getting-started) to see how to deploy this DAG. Or, check out our docs on [example DAGS](https://github.com/astronomerio/example-dags) and [DAG Writing Best Practices](https://www.astronomer.io/guides/dag-best-practices/).
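
The `t1 >> t2` lines in the hunk above set task ordering: Airflow overloads the `>>` operator so that `a >> b` makes `b` run downstream of `a`. A dependency-free sketch of that operator pattern; this `Task` class is a stand-in for illustration, not Airflow's actual operator class:

```python
class Task:
    """Minimal stand-in for an Airflow operator, showing how the
    t1 >> t2 dependency syntax works via __rshift__."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []

    def __rshift__(self, other):
        # a >> b: record b as downstream of a; return b so
        # chains like t1 >> t2 >> t3 compose left to right
        self.downstream.append(other)
        return other

t1, t2, t3 = Task("extract"), Task("stage"), Task("load")
t1 >> t2 >> t3  # same dependency shape as the DAG in the guide

print([t.task_id for t in t1.downstream])  # ['stage']
```

Writing `t1 >> t2` and `t2 >> t3` on separate lines, as the guide does, produces the same graph as the chained form.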
