Scrappy is a simple Python script for my personal use that aggregates other Python scripts and can run them one by one.
The initial aim is to provide one common environment for many different web-scraping scripts.
Start the Docker environment:

```
docker compose up
```

Install the dependencies:

```
pip install --no-cache-dir -r requirements.txt -r tools/requirements.txt -r scripts/requirements.txt
```

Usage:

```
python scrappy.py [SCRIPT_NAME]...
```
Run all scripts:

```
python scrappy.py
```

Run a single script:

```
python scrappy.py Script1
```

Run multiple selected scripts:

```
python scrappy.py Script1 Script2 Script3 ...
```
The scraping scripts should be placed in the `app/scripts` folder. Each script must be a class that extends the base class `ScrapMe` and implements the `scrap` method.
```python
# app/scripts/MyScrapScriptClass.py
from tools.ScrapMe import ScrapMe


class MyScrapScriptClass(ScrapMe):
    def scrap(self) -> None:
        print('scraping time...')
```
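Once the file is in place, the script can be run by name. Judging by the usage examples above, the positional argument appears to match the class (and file) name, so the call would presumably be:

```
python scrappy.py MyScrapScriptClass
```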
The base class `ScrapMe` uses the BeautifulSoup library and contains a `_loadPage` method with which you can download the content of a web page. The `_loadPage` method returns a `BeautifulSoup` object.
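For orientation, here is a minimal sketch of what such a base class could look like. This is an illustration only: the HTTP client (`requests`), the parser (`html.parser`), and the error handling are assumptions, and the real `tools/ScrapMe.py` may differ.

```python
# tools/ScrapMe.py - illustrative sketch only; the actual implementation
# may use a different HTTP client, parser, or error handling.
import requests
from bs4 import BeautifulSoup


class ScrapMe:
    def scrap(self) -> None:
        # Each scraping script must override this method.
        raise NotImplementedError

    def _loadPage(self, url: str) -> BeautifulSoup:
        # Download the page and return it parsed as a BeautifulSoup object.
        response = requests.get(url)
        response.raise_for_status()
        return BeautifulSoup(response.text, 'html.parser')
```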
In the `app/scripts` directory you can find three examples of usage:

- `app/scripts/Sample1.py` - just "Hello world"
- `app/scripts/Sample2.py` - this will throw an exception
- `app/scripts/FakePythonJobs.py` - this will scrape the Fake Python Jobs site
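As a rough idea of what the last example might contain, here is a hedged sketch of a script scraping the public Fake Python Jobs demo site (https://realpython.github.io/fake-jobs/). The URL, class name, and selectors are assumptions for illustration, not the repo's actual code:

```python
# app/scripts/FakePythonJobs.py - illustrative sketch; the URL and
# selectors are assumptions based on the public fake-jobs demo site.
from tools.ScrapMe import ScrapMe


class FakePythonJobs(ScrapMe):
    def scrap(self) -> None:
        # _loadPage (from the base class) returns a BeautifulSoup object.
        soup = self._loadPage('https://realpython.github.io/fake-jobs/')
        # On the demo site, each job title is an <h2> whose classes include "title".
        for title in soup.find_all('h2', class_='title'):
            print(title.get_text(strip=True))
```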