
Merge pull request #1 from cypienta/v0.4
V0.4
ezzeldinadel authored Jun 4, 2024
2 parents f33dd3b + 427501d commit af68628
Showing 20 changed files with 349 additions and 65 deletions.
9 changes: 2 additions & 7 deletions README.rst
@@ -1,9 +1,4 @@
Template for the Read the Docs tutorial
Read the Docs for Cypienta AWS offering
=======================================

This GitHub template includes fictional Python library
with some basic Sphinx docs.

Read the tutorial here:

https://docs.readthedocs.io/en/stable/tutorial/
This is the GitHub repository for the Read the Docs documentation of the Cypienta AWS offering.
7 changes: 0 additions & 7 deletions docs/source/api.rst

This file was deleted.

6 changes: 3 additions & 3 deletions docs/source/conf.py
@@ -2,9 +2,9 @@

# -- Project information

project = 'Lumache'
copyright = '2021, Graziella'
author = 'Graziella'
project = 'Cypienta'
copyright = '2024, Cypienta'
author = 'Cypienta'

release = '0.1'
version = '0.1.0'
84 changes: 84 additions & 0 deletions docs/source/deploy.rst
@@ -0,0 +1,84 @@
AWS Deployment
==============

Deploy resources using the CloudFormation template
--------------------------------------------------

1. Clone the GitHub repo

   .. code-block:: shell

      $ git clone -b v0.4 https://github.com/cypienta/Lambda.git

   .. note::
      This command clones the repository and checks out the ``v0.4`` branch.

2. Navigate to the AWS console, and search for ``CloudFormation``.

3. Click on ``Stacks`` in the left-hand side panel, then click on the ``Create stack`` dropdown and select ``With new resources (standard)`` to start creating a stack.

   .. image:: resources/create_stack_start.png
      :alt: Create stack
      :align: center

4. For the ``Prerequisite - Prepare template`` section, select ``Choose an existing template``, and then select ``Upload a template file``. This enables a ``Choose file`` button; click it to upload the template, which is present in the root directory of the Lambda repository you cloned. Then click on ``Next``.

   .. image:: resources/upload_template_file.png
      :alt: Upload a template file
      :align: center

5. Now you can input all the parameters needed for the CloudFormation
   stack. A few parameters are pre-filled with recommended default
   values; change them as required.

   Give the stack a name in ``Stack name``.

   Fill in the following parameter values, as they require user input:

   **BucketName:**\ The name of the S3 bucket that you want to create
   (you must change this, as the pre-populated value may not be
   valid). Follow these
   `rules <https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucketnamingrules.html#general-purpose-bucket-names>`__
   for naming a bucket. AWS requires bucket names to be globally
   unique, so your CloudFormation stack may fail if the name provided
   is already taken. You can see the failure reason by clicking on the
   stack that was created and opening the ``Events`` tab.

   **TechniqueModelARN:**\ The ARN of the subscribed model package for
   the ATTACK Technique Detector.

   **ClusterModelARN:**\ The ARN of the subscribed model package for
   Temporal Clustering.

   **FlowModelARN:**\ The ARN of the subscribed model package for the
   MITRE ATTACK Flow Detector.

   Parameter with a recommended value:

   **ChunkSize:**\ The size of a single chunk that will be processed
   at a time for an input file uploaded to S3. The recommended chunk
   size is below ``50000``.
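
   If you prefer the AWS CLI over the console, a minimal equivalent
   sketch is shown below. The stack name, template file name, and
   parameter values are placeholders; ``CAPABILITY_NAMED_IAM`` is
   assumed because the template creates IAM roles.

   .. code-block:: shell

      $ aws cloudformation create-stack \
          --stack-name cypienta-stack \
          --template-body file://template.yaml \
          --capabilities CAPABILITY_NAMED_IAM \
          --parameters \
              ParameterKey=BucketName,ParameterValue=<your-unique-bucket-name> \
              ParameterKey=TechniqueModelARN,ParameterValue=<technique-model-arn> \
              ParameterKey=ClusterModelARN,ParameterValue=<cluster-model-arn> \
              ParameterKey=FlowModelARN,ParameterValue=<flow-model-arn> \
              ParameterKey=ChunkSize,ParameterValue=50000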

6. Click on ``Next`` after adding the parameters.

7. On the ``Configure stack options`` page, under the ``Stack
   failure options`` section, select ``Roll back all stack resources``
   for ``Behavior on provisioning failure``. Select ``Delete all newly
   created resources`` for ``Delete newly created resources during a
   rollback``. Then click on ``Next``.

8. On the ``Review and create`` page, you can review your parameters.
   At the bottom of the page, select all checkboxes for ``I
   acknowledge…`` and click on ``Submit``. This will start creating the
   required resources.

9. You can monitor the stack's events by clicking on the recently
   created stack and going to the ``Events`` tab (a CLI sketch follows
   this list).

10. Once the stack has been created successfully, you can start using
    the products.
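
As referenced in step 9, a minimal CLI sketch for monitoring the stack
(``cypienta-stack`` is a placeholder for the stack name you chose):

.. code-block:: shell

   # Block until creation finishes (the command fails if creation fails)
   $ aws cloudformation wait stack-create-complete --stack-name cypienta-stack

   # Inspect per-resource events, including failure reasons
   $ aws cloudformation describe-stack-events \
       --stack-name cypienta-stack \
       --query "StackEvents[].[Timestamp,ResourceStatus,LogicalResourceId,ResourceStatusReason]" \
       --output table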

Now all your resources are ready to be used.

You may now go to :doc:`end_to_end_test` to start testing
your application.
106 changes: 106 additions & 0 deletions docs/source/end_to_end_test.rst
@@ -0,0 +1,106 @@
Sample Test
==================================================

How to test end-to-end
--------------------------

1. Navigate to the AWS console and search for ``S3``. Select the S3 bucket
   that you created, and click on ``Create folder``. Name the folder
   ``input`` and create it.
2. Sample input JSON file:

   .. code-block:: JSON

      {
          "input": [
              {
                  "instalertid": 25485,
                  "src": "Internal_User_1",
                  "dst": "ServerA",
                  "time": 1.2741200091934108,
                  "tech": [
                      "T1490"
                  ],
                  "name": " ET DNS Query for .cc TLD ",
                  "other_attributes_dict": {
                      "priority": 1
                  }
              }
              ...
          ]
      }
   View the `sample input file <https://drive.google.com/file/d/1b9KLQ5k-259zklX1u56Gpk255SUUFeXP/view?usp=drive_link>`__ for reference.

   Input data JSON description:

   .. code-block:: JSON

      {
          "input": [
              {
                  "instalertid": int,   // alert id
                  "time": float,        // timestamp of the alert
                  "src": string,        // IP address/hostname of the source
                  "dst": string,        // IP address/hostname of the destination
                  "name": string,       // text value of the alert
                  "tech": [
                      string            // optional. Keep an empty list if there is no technique label.
                  ],                    // list of technique labels for the alert
                  "other_attributes_dict": {
                      "priority": float,    // optional field of type FLOAT. Priority of the alert. Only include if the value is not null. (Ascending or descending importance both work, as long as it is consistent.)
                      "port": int,          // optional. Must be an integer 0-65535. Port of the destination. Only include if the value is not null.
                      "url": string,        // optional field of type STRING. URL of the alert (if applicable). Only include if the value is not null.
                      "user_agent": string, // optional field of type STRING. User agent of the alert (if applicable). Only include if the value is not null.
                      "cert": string        // optional field of type STRING. Certificate of the alert (if applicable). Only include if the value is not null.
                  }
              }, ...
          ],
          "node_feature": {             // optional. Add any number of nodes, with any number of keys per node. Use the same node key/id as in its relevant events. Add any features to each node, and omit features that do not exist for that node.
              "ServerA": {
                  "userid": "machine_1",
                  "user_group": "HR",
                  "OS": "Linux",
                  "Internal/External": "Internal"
              },
              "Internal_User_1": {
                  "userid": "machine_2",
                  "user_group": "EMP",
                  "OS": "Windows11"
              }, ...
          }
      }

   All fields are required unless mentioned otherwise.

3. Upload the input JSON file to the S3 bucket under the path ``s3://{bucket-name}/input/``. The name of the input file does not matter to the end-to-end flow. Note that if you upload a file with the same name, it will be overwritten in the S3 bucket.
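
   Before uploading, you can sanity-check the file locally and copy it
   with the AWS CLI. A sketch, assuming the AWS CLI is configured for
   the right account and region and that ``python`` is Python 3;
   replace ``{bucket-name}`` with your bucket name:

   .. code-block:: shell

      $ python -m json.tool input.json > /dev/null && echo "valid JSON"
      $ aws s3 cp input.json s3://{bucket-name}/input/input.json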

1. Once you upload the input file, say ``input.json``, the control flow will be as follows:

- Enrich_with_technique: lambda function
- Transform-job-tech-{unique-id}: Batch transform job

- Reads input from: ``{bucket}/intermediate/{unique-id}/input_classification.json``
- Output to: ``{bucket}/response/classification_out/{unique-id}/input_classification.json.out``

- Process_enriched_with_technique: lambda function
- Create_cluster: lambda function
- Transform-job-cluster-{unique-id}: Batch transform job

- Reads input from: ``{bucket}/output/classification/{unique-id}/input.zip``
- Output to: ``{bucket}/response/cluster_out/{unique-id}/input.zip.out``

- Process_cluster: lambda function
- Create_flow: lambda function
- Transform-job-flow-{unique-id}: Batch transform job

- Reads input from: ``{bucket}/output/cluster/{unique-id}/input_flow.json``
- Output to: ``{bucket}/response/flow_out/{unique-id}/input_flow.json.out``

- Process_flow: lambda function

2. You can use the Amazon SageMaker console and navigate to Inference → Batch transform jobs to view the created jobs for your input.

3. You can monitor progress in the CloudWatch logs for each Lambda function and transform job created, as sketched below.
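
   A sketch for tailing a Lambda function's logs (requires AWS CLI v2;
   the deployed function names in your account may carry a stack
   prefix, so list them first):

   .. code-block:: shell

      $ aws lambda list-functions --query "Functions[].FunctionName" --output table
      $ aws logs tail /aws/lambda/Enrich_with_technique --follow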

4. Wait for the complete output to show up in the S3 bucket at ``s3://{bucket-name}/output/flow/{unique-id}/``.
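
   A sketch for listing and downloading the final output; replace
   ``{bucket-name}`` and ``{unique-id}`` with your values:

   .. code-block:: shell

      $ aws s3 ls s3://{bucket-name}/output/flow/ --recursive
      $ aws s3 cp s3://{bucket-name}/output/flow/{unique-id}/ ./flow_output --recursive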
26 changes: 14 additions & 12 deletions docs/source/index.rst
@@ -1,22 +1,24 @@
Welcome to Lumache's documentation!
Cypienta
===================================

**Lumache** (/lu'make/) is a Python library for cooks and food lovers
that creates recipes mixing random ingredients.
It pulls data from the `Open Food Facts database <https://world.openfoodfacts.org/>`_
and offers a *simple* and *intuitive* API.
Welcome to the official documentation for Cypienta. This guide will help you understand and utilize the powerful features of our software, ensuring a smooth and efficient experience from subscription to installation.

Check out the :doc:`usage` section for further information, including
how to :ref:`installation` the project.
In this documentation, you will find detailed instructions for:

.. note::
- **Prerequisites**: A list of prerequisites for installation and usage of the product
- **Subscribing to Cypienta products on AWS Marketplace**: A step-by-step guide to help you subscribe to our offering on AWS Marketplace.
- **AWS Deployment**: Detailed instructions to ensure a smooth deployment process to AWS.
- **Sample Test**: An example of how to use the product with a sample test.
- **Troubleshooting and Support**: Solutions to common issues and information on how to get further assistance.

This project is under active development.

Contents
Getting Started
---------------

.. toctree::

usage
api
prerequisites
subscription
deploy
end_to_end_test
troubleshoot
32 changes: 32 additions & 0 deletions docs/source/prerequisites.rst
@@ -0,0 +1,32 @@
Prerequisites
=============

Permissions
-----------
Make sure that the IAM user you will be using has the required permissions for the following resources (an illustrative policy sketch follows the list):

- SageMaker
- Lambda
- S3
- ECS
- EC2
- IAM - create and edit roles
- CloudFormation
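
For illustration only, a broad policy sketch covering these services.
It is deliberately permissive; scope the actions and resources down to
your organization's standards before attaching it to the IAM user.

.. code-block:: JSON

   {
       "Version": "2012-10-17",
       "Statement": [
           {
               "Effect": "Allow",
               "Action": [
                   "sagemaker:*",
                   "lambda:*",
                   "s3:*",
                   "ecs:*",
                   "ec2:*",
                   "iam:CreateRole",
                   "iam:PutRolePolicy",
                   "iam:AttachRolePolicy",
                   "iam:PassRole",
                   "cloudformation:*"
               ],
               "Resource": "*"
           }
       ]
   }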


Quotas
------

Instance types
~~~~~~~~~~~~~~

Verify your instance type quotas in the AWS console: search for ``Service Quotas`` and select SageMaker from the AWS services list, then search for ``transform job usage``. ``ATTACK Technique Detector`` and ``Temporal Clustering`` require a GPU instance type, so check the supported and recommended instance types for each product before subscribing, and request a quota increase if the applied value is less than 1. The ``MITRE ATTACK Flow Detector`` requires a CPU-based instance type. (A CLI sketch follows the note below.)

.. note::
   To check the supported and recommended instance types, open the AWS Marketplace model product page, scroll down to the ``Pricing`` section, and click on ``Model Batch Transform`` under ``Software Pricing``.
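
You can also list the relevant quotas from the CLI. A sketch, assuming
the AWS CLI is configured for the target region:

.. code-block:: shell

   $ aws service-quotas list-service-quotas --service-code sagemaker \
       --query "Quotas[?contains(QuotaName, 'transform job usage')].[QuotaName,Value]" \
       --output table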

Lambda concurrent executions
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Verify the quota limit for ``Concurrent executions`` for AWS Lambda. In the AWS console for the region where you want to deploy your resources, search for ``Service Quotas`` and select ``AWS Lambda`` from the AWS services list. Search for the quota name ``Concurrent executions``. Make sure that the applied account-level quota value is more than 10, to allow reserved concurrency for the ``update_lookup_table`` Lambda function. If the value is not greater than 10, select ``Concurrent executions``, click on ``Request increase at account level``, and request any value greater than 10.
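
A sketch for checking the applied Lambda quota from the CLI, under the
same assumptions as above:

.. code-block:: shell

   $ aws service-quotas list-service-quotas --service-code lambda \
       --query "Quotas[?QuotaName=='Concurrent executions'].[QuotaName,Value]" \
       --output table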

Binary file added docs/source/resources/accept_offer.png
Binary file added docs/source/resources/create_stack_start.png
Binary file added docs/source/resources/model_package_arn.png
Binary file added docs/source/resources/quota_instance_types.png
Binary file added docs/source/resources/service_quotas.png
Binary file added docs/source/resources/subscribe_to_flow_detector.png
Binary file added docs/source/resources/subscribe_to_technique_detector.png
Binary file added docs/source/resources/subscribe_to_temporal_clustering.png
Binary file added docs/source/resources/upload_template_file.png
82 changes: 82 additions & 0 deletions docs/source/subscription.rst
@@ -0,0 +1,82 @@
Subscribing to Cypienta products on AWS Marketplace
===================================================

ATTACK Technique Detector
-------------------------

1. Use the `link <https://aws.amazon.com/marketplace/pp/prodview-ygn2hithg564w?sr=0-2&ref_=beagle&applicationId=AWSMPContessa>`_ to explore the marketplace model packages in AWS. Search for ``Cypienta ATTACK Technique Detector``.

Click on ``Continue to Subscribe``.

.. image:: resources/subscribe_to_technique_detector.png
:alt: Subscribe to technique detector
:align: center

2. Click on the ``Accept offer`` button on the next page.

   .. image:: resources/accept_offer.png
      :alt: Accept offer
      :align: center

.. note::
   Do not click on ``Continue to configuration`` at the top of the page. You can move to the next step.

3. Make note of the technique detector model package ARN by going to the SageMaker console. On the left-hand side panel, navigate to Inference → Marketplace model packages. Select the ``AWS Marketplace subscriptions`` tab. Click on the desired product title and copy the ``Model package ARN``.

   .. image:: resources/model_package_arn.png
      :alt: Model package ARN
      :align: center


Temporal Clustering
-------------------

1. Use the `link <https://aws.amazon.com/marketplace/pp/prodview-a6owq2ddgrcrc?sr=0-3&ref_=beagle&applicationId=AWSMPContessa>`_ to explore the marketplace model packages in AWS. Search for ``Cypienta Temporal Clustering``.

Click on ``Continue to Subscribe``.

.. image:: resources/subscribe_to_temporal_clustering.png
:alt: Subscribe to temporal clustering
:align: center

2. Click on the ``Accept offer`` button on the next page.

   .. image:: resources/accept_offer.png
      :alt: Accept offer
      :align: center

.. note::
   Do not click on ``Continue to configuration`` at the top of the page. You can move to the next step.

3. Make note of the temporal clustering model package ARN by going to the SageMaker console. On the left-hand side panel, navigate to Inference → Marketplace model packages. Select the ``AWS Marketplace subscriptions`` tab. Click on the desired product title and copy the ``Model package ARN``.

   .. image:: resources/model_package_arn.png
      :alt: Model package ARN
      :align: center


MITRE ATTACK Flow Detector
--------------------------

1. Use the `link <https://aws.amazon.com/marketplace/pp/prodview-4dismc5uwx4dk?sr=0-1&ref_=beagle&applicationId=AWSMPContessa>`_ to explore the marketplace model packages in AWS. Search for ``Cypienta MITRE ATTACK Flow Detector``.

Click on ``Continue to Subscribe``.

   .. image:: resources/subscribe_to_flow_detector.png
      :alt: Subscribe to flow detector
      :align: center

2. Click on the ``Accept offer`` button on the next page.

   .. image:: resources/accept_offer.png
      :alt: Accept offer
      :align: center

.. note::
   Do not click on ``Continue to configuration`` at the top of the page. You can move to the next step.

3. Make note of the flow detector model package ARN by going to the SageMaker console. On the left-hand side panel, navigate to Inference → Marketplace model packages. Select the ``AWS Marketplace subscriptions`` tab. Click on the desired product title and copy the ``Model package ARN``.

   .. image:: resources/model_package_arn.png
      :alt: Model package ARN
      :align: center
