This repository contains the projects I worked on during the CS Internship program.
## Zip Extractor from Scratch

In Step 4, I wrote a program to extract and read the Forex data.

Modules used:

- `os`
- `zipfile`

To run the script:

- Ensure you have Python installed on your system.
- Clone the repository:

  ```bash
  git clone https://github.com/nargesghan/cs_internship_journey.git
  cd cs_internship_journey/zip\ extractor\ from\ scratch
  ```

- Run the zip extractor script:

  ```bash
  python zip_extractor.py
  ```

To run the tests:

- Navigate to the `zip extractor from scratch` directory:

  ```bash
  cd cs_internship_journey/zip\ extractor\ from\ scratch
  ```

- Run the tests using `unittest`:

  ```bash
  python -m unittest zip_extractor_test.py
  ```
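Under the hood, extracting an archive needs little more than the standard library's `zipfile` module. A minimal sketch (the function name and structure are illustrative, not necessarily what `zip_extractor.py` does):

```python
import os
import zipfile

def extract_zip(zip_path, dest_dir):
    """Extract all members of zip_path into dest_dir and return their names."""
    os.makedirs(dest_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(dest_dir)
        return archive.namelist()
```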
## Time Series (Forex and Bicycle Counter)

In Step 5, I worked on the Forex data mentioned earlier and calculated its moving average. Additionally, I practiced with the Seattle bicycle counts, following the *Python Data Science Handbook* by Jake VanderPlas.

Modules used:

- `zipfile`
- `pandas`
- `numpy`

To run the notebook:

- Ensure you have Python installed on your system.
- Clone the repository:

  ```bash
  git clone https://github.com/nargesghan/cs_internship_journey.git
  cd "cs_internship_journey/time series(forex and bicycle counter)"
  ```

- Install the required dependencies:

  ```bash
  pip install pandas numpy
  ```

- Run the Jupyter notebook:

  ```bash
  jupyter notebook Forex_Historical_Data-bicycle-counter.ipynb
  ```
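The moving-average part of the notebook boils down to pandas' `rolling` window. A minimal sketch with made-up prices (the values are illustrative, not real Forex data):

```python
import pandas as pd

def moving_average(series, window):
    """Simple moving average: the mean over a sliding window of `window` points."""
    return series.rolling(window=window).mean()

prices = pd.Series([1.10, 1.12, 1.11, 1.15, 1.14])
ma3 = moving_average(prices, 3)  # first two entries are NaN (incomplete window)
```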
## NumPy, Pandas, and Matplotlib Cheat Sheets

In Step 5, I also prepared three learning notebooks for machine learning libraries, using their cheat sheets and documentation as sources. The notebooks collect useful methods from each library, along with examples and explanations.

To run the notebooks:

- Ensure you have Python installed on your system.
- Clone the repository:

  ```bash
  git clone https://github.com/nargesghan/cs_internship_journey.git
  cd cs_internship_journey/numpy_pandas_matplotlib
  ```

- Install the required dependencies:

  ```bash
  pip install numpy pandas matplotlib
  ```

- Run the Jupyter notebooks:

  ```bash
  jupyter notebook numpy_cheat_sheet.ipynb
  jupyter notebook pandas_cheat_sheet.ipynb
  jupyter notebook matplotlib_cheat_sheet.ipynb
  ```

To run the tests:

- Navigate to the `numpy_pandas_matplotlib` directory:

  ```bash
  cd cs_internship_journey/numpy_pandas_matplotlib
  ```

- Run the tests using `unittest`:

  ```bash
  python -m unittest test_numpy_cheat_sheet.py
  ```
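For a flavor of what the notebooks contain, here is a representative snippet (the data is made up): building a pandas DataFrame from a NumPy array and summarizing it.

```python
import numpy as np
import pandas as pd

# Build a DataFrame from a NumPy array, then use pandas to summarize it.
data = np.arange(12).reshape(4, 3)           # 4 rows, 3 columns: 0..11
df = pd.DataFrame(data, columns=["a", "b", "c"])
col_means = df.mean()                        # per-column means
row_sums = df.sum(axis=1)                    # per-row sums
```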
## Simple Classification

In Step 6, I implemented a simple one-layer neural network using the NumPy library. I used three different learning algorithms for this task: Perceptron, Adaline, and Adaline with stochastic gradient descent, all trained on the Iris dataset. If you're interested in learning how to implement a single-layer neural network from scratch, don't miss this notebook.

To run the notebook:

- Ensure you have Python installed on your system.
- Clone the repository:

  ```bash
  git clone https://github.com/nargesghan/cs_internship_journey.git
  cd cs_internship_journey/simple-classification
  ```

- Install the required dependencies:

  ```bash
  pip install numpy pandas matplotlib
  ```

- Run the Jupyter notebook:

  ```bash
  jupyter notebook perceptron.ipynb
  ```
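The core of the perceptron is just a weight vector that gets nudged whenever a prediction is wrong. A minimal sketch of that learning rule (not the notebook's exact code; the hyperparameters are illustrative):

```python
import numpy as np

class Perceptron:
    """Single-layer perceptron for labels in {-1, 1}."""

    def __init__(self, eta=0.1, n_iter=20):
        self.eta = eta        # learning rate
        self.n_iter = n_iter  # passes over the training data

    def fit(self, X, y):
        self.w_ = np.zeros(X.shape[1])  # one weight per feature
        self.b_ = 0.0                   # bias term
        for _ in range(self.n_iter):
            for xi, target in zip(X, y):
                # update is zero when the prediction is already correct
                update = self.eta * (target - self.predict(xi))
                self.w_ += update * xi
                self.b_ += update
        return self

    def predict(self, X):
        # threshold the net input at zero
        return np.where(X @ self.w_ + self.b_ >= 0.0, 1, -1)
```

Adaline differs only in that the update uses the raw net input (a linear activation) instead of the thresholded prediction, which turns training into gradient descent on a squared-error cost.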
## Dimensionality Reduction

In this project, I worked on data preprocessing and dimensionality reduction techniques.

To run the notebook:

- Ensure you have Python installed on your system.
- Clone the repository:

  ```bash
  git clone https://github.com/nargesghan/cs_internship_journey.git
  cd cs_internship_journey/Dimensionality\ Reduction
  ```

- Install the required dependencies:

  ```bash
  pip install numpy pandas matplotlib scikit-learn
  ```

- Run the Jupyter notebook:

  ```bash
  jupyter notebook Compressing_Data_via_Dimensionality_Reduction.ipynb
  ```

To run the tests:

- Navigate to the `Dimensionality Reduction` directory:

  ```bash
  cd cs_internship_journey/Dimensionality\ Reduction
  ```

- Run the tests using `unittest`:

  ```bash
  python -m unittest test_dimensionality_reduction.py
  ```
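A standard baseline for this kind of work is PCA via scikit-learn: standardize the features, then project onto the top principal components. A minimal sketch on random toy data (the dataset and component count are illustrative, not the notebook's actual setup):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))              # toy data: 100 samples, 5 features

X_std = StandardScaler().fit_transform(X)  # PCA is sensitive to feature scale
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_std)            # project onto the top 2 components
```

`pca.explained_variance_ratio_` then reports how much of the total variance the kept components preserve.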