Commit f15c65b
0 parents
Showing 66 changed files with 7,277 additions and 0 deletions.
@@ -0,0 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config: 7c38a6d79d19c67a234cb3141041178f
tags: 645f666f9bcd5a90fca523b33c5a78b7
(7 binary files not shown; 1 empty file.)
@@ -0,0 +1,25 @@
Backends
========

Backends connect users to the DSI Core middleware and allow DSI middleware data structures to read from and write to persistent external storage. Backends are modular to support user contribution. Backend contributors are encouraged to offer custom backend abstract classes and backend implementations. A contributed backend abstract class may extend another backend to inherit the properties of the parent. To be compatible with the DSI core middleware, backends should provide an interface to Python built-in data structures or data structures from the Python ``collections`` library. Backend extensions will be accepted conditional on the extension of ``backends/tests`` to demonstrate the new Backend capability. We cannot accept pull requests that are not tested.

Note that any contributed backends or extensions must include unit tests in ``backends/tests`` to demonstrate the new Backend capability.

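As an illustration of this data-structure contract, here is a minimal, hypothetical sketch of a backend-style class that persists an ``OrderedDict`` of column lists to a JSON file. The class and method names below are assumptions made for the example, not the actual DSI base-class API; a real contribution would subclass the appropriate class in ``dsi.backends`` and ship matching tests under ``backends/tests``::

    import json
    from collections import OrderedDict

    class JsonFileBackend:
        """Toy backend sketch: stores an OrderedDict of column -> list-of-values as JSON."""

        def __init__(self, filename):
            self.filename = filename

        def put_artifacts(self, collection):
            # Write the core's in-memory metadata to persistent external storage.
            with open(self.filename, 'w') as f:
                json.dump(collection, f)

        def get_artifacts(self):
            # Read the metadata back into a core-compatible structure.
            with open(self.filename) as f:
                return OrderedDict(json.load(f))
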
.. figure:: BackendClassHierarchy.png
   :alt: Figure depicting the current backend class hierarchy.
   :class: with-shadow
   :scale: 100%

   The current DSI backend class hierarchy.

.. automodule:: dsi.backends.filesystem
   :members:

.. automodule:: dsi.backends.sqlite
   :members:

.. automodule:: dsi.backends.gufi
   :members:

.. automodule:: dsi.backends.parquet
   :members:
@@ -0,0 +1,86 @@
Core
===================
The DSI Core middleware defines the Terminal concept. An instantiated Terminal is the human/machine DSI interface. The person setting up a Core Terminal only needs to know how they want to ask questions, and what metadata they want to ask questions about. If they don't see an option to ask questions the way they like, or they don't see the metadata they want to ask questions about, then they should ask a Driver Contributor or a Plugin Contributor, respectively.

A Core Terminal is a home for Plugins (Readers/Writers), and an interface for Backends. A Core Terminal is instantiated with a set of default Plugins and Backends, but they must be loaded before a user query is attempted. Here's an example of how you might work with DSI using an interactive Python interpreter for your data science workflows::

    >>> from dsi.core import Terminal
    >>> a=Terminal()
    >>> a.list_available_modules('plugin')
    >>> # ['Bueno', 'Hostname', 'SystemKernel']
    >>> a.load_module('plugin','Bueno','reader',filename='./data/bueno.data')
    >>> # Bueno plugin reader loaded successfully.
    >>> a.load_module('plugin','Hostname','writer')
    >>> # Hostname plugin writer loaded successfully.
    >>> a.list_loaded_modules()
    >>> # {'writer': [<dsi.plugins.env.Hostname object at 0x7f21232474d0>],
    >>> #  'reader': [<dsi.plugins.env.Bueno object at 0x7f2123247410>],
    >>> #  'front-end': [],
    >>> #  'back-end': []}

At this point, you might decide that you are ready to collect data for inspection. It is possible to utilize DSI Backends to load additional metadata to supplement your Plugin metadata, but you can also sample Plugin data and search it directly.

The process of transforming a set of Plugin writers and readers into a queryable format is called transloading. A DSI Core Terminal has a ``transload()`` method which may be called to execute all Plugins at once::

    >>> a.transload()
    >>> a.active_metadata
    >>> # OrderedDict([('uid', [1000]), ('effective_gid', [1000]), ('moniker', ['qwofford'])...

Once a Core Terminal has been transloaded, no further Plugins may be added. However, the ``transload()`` method can be used to take samples of each plugin as many times as you like::

    >>> a.transload()
    >>> a.transload()
    >>> a.transload()
    >>> a.active_metadata
    >>> # OrderedDict([('uid', [1000, 1000, 1000, 1000]), ('effective_gid', [1000, 1000, 1000...

If you perform data science tasks using Python, it is not necessary to create a DSI Core Terminal front-end because the data is already in a Python data structure. If your data science tasks can be completed in one session, it is not required to interact with DSI Backends. However, if you do want to save your work, you can load a DSI Backend with a back-end function::

    >>> a.list_available_modules('backend')
    >>> # ['Gufi', 'Sqlite', 'Parquet']
    >>> a.load_module('backend','Parquet','back-end',filename='parquet.data')
    >>> # Parquet backend loaded successfully.
    >>> a.list_loaded_modules()
    >>> # {'writer': [<dsi.plugins.env.Hostname object at 0x7f21232474d0>],
    >>> #  'reader': [<dsi.plugins.env.Bueno object at 0x7f2123247410>],
    >>> #  'front-end': [],
    >>> #  'back-end': [<dsi.backends.parquet.Parquet object at 0x7f212325a110>]}
    >>> a.artifact_handler(interaction_type='put')

The contents of the active DSI Core Terminal metadata storage will be saved to a Parquet object at the path you provided at module loading time.

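If you want to sanity-check the saved artifact outside of DSI, a Parquet file can be read back with standard tooling. This is only an illustration, not part of the DSI API; it assumes ``pandas`` plus a Parquet engine such as ``pyarrow`` are installed, and the column names shown depend on which Plugins you loaded::

    >>> import pandas as pd
    >>> df = pd.read_parquet('parquet.data')   # same path given at module loading time
    >>> list(df.columns)
    >>> # ['uid', 'effective_gid', 'moniker', ...]
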
It is possible that you prefer to perform data science tasks using a higher-level abstraction than Python itself. This is the purpose of the DSI Driver front-end functionality. Unlike Plugins, Drivers can be added after the initial ``transload()`` operation has been performed::

    >>> a.load_module('backend','Parquet','front-end',filename='parquet.data')
    >>> # Parquet backend front-end loaded successfully.
    >>> a.list_loaded_modules()
    >>> # {'writer': [<dsi.plugins.env.Hostname object at 0x7fce3c612b50>],
    >>> #  'reader': [<dsi.plugins.env.Bueno object at 0x7fce3c622110>],
    >>> #  'front-end': [<dsi.backends.parquet.Parquet object at 0x7fce3c622290>],
    >>> #  'back-end': [<dsi.backends.parquet.Parquet object at 0x7fce3c622650>]}

Any front-end may be used, but in this case the Parquet backend has a front-end implementation which builds a Jupyter notebook from scratch that loads your metadata collection into a Pandas DataFrame. The Parquet front-end will then launch the Jupyter notebook to support an interactive data science workflow::

    >>> a.artifact_handler(interaction_type='inspect')
    >>> # Writing Jupyter notebook...
    >>> # Opening Jupyter notebook...

.. image:: jupyter_frontend.png
   :scale: 33%

You can then close your Jupyter notebook, call ``transload()`` again to increase your sample size, and use the interface to explore more data.

Although this demonstration only used one Plugin per Plugin functionality, any number of Plugins can be added to collect an arbitrary amount of queryable metadata::

    >>> a.load_module('plugin','SystemKernel','writer')
    >>> # SystemKernel plugin writer loaded successfully
    >>> a.list_loaded_modules()
    >>> # {'writer': [<dsi.plugins.env.Hostname object at 0x7fce3c612b50>, <dsi.plugins.env.SystemKernel object at 0x7fce68519250>],
    >>> #  'reader': [<dsi.plugins.env.Bueno object at 0x7fce3c622110>],
    >>> #  'front-end': [<dsi.backends.parquet.Parquet object at 0x7fce3c622290>],
    >>> #  'back-end': [<dsi.backends.parquet.Parquet object at 0x7fce3c622650>]}

.. automodule:: dsi.core
   :members:
@@ -0,0 +1,26 @@
.. DSI documentation master file, created by
   sphinx-quickstart on Fri Apr 14 14:04:07 2023.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

=============================================
The Data Science Infrastructure Project (DSI)
=============================================

.. toctree::
   :maxdepth: 2
   :caption: Contents:

   introduction
   installation
   plugins
   backends
   core

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
@@ -0,0 +1,19 @@
Installation
===================

1. Create or activate a DSI virtual environment.
2. ``cd`` into the project space root.
3. ``python -m pip install .``
4. [Optional] If you are running DSI unit tests: ``python -m pip install pytest gitpython coverage-badge pytest-cov``.
5. [Optional] If you are building HTML documentation: ``python -m pip install sphinx sphinx_rtd_theme``.

How to create and activate a virtual environment
--------------------------------------------------
We recommend Miniconda for virtual environment management (https://docs.conda.io/en/latest/miniconda.html). To create and activate a Miniconda virtual environment:

1. Download and install the appropriate Miniconda installer for your platform.
2. If this is the first time creating a DSI virtual environment: ``conda create -n 'dsi' python=3.11``. The ``-n`` name argument can be anything you like.
3. Once the virtual environment is created, activate it with ``conda activate dsi``, or whatever name you picked in the preceding step.
4. Proceed with Step 2 in the "Installation" section.
5. When you've completed work, deactivate the conda environment with ``conda deactivate``.

@@ -0,0 +1,74 @@
============
Introduction
============

The goal of the Data Science Infrastructure Project (DSI) is to manage data through metadata capture and curation. DSI capabilities can be used to develop workflows to support management of simulation data, AI/ML approaches, ensemble data, and other sources of data typically found in scientific computing. DSI infrastructure is designed to be flexible and is built with these considerations in mind:

- Data management is subject to strict, POSIX-enforced file security.
- DSI capabilities support a wide range of common metadata queries.
- DSI interfaces with multiple database technologies and archival storage options.
- Query-driven data movement is supported and is transparent to the user.
- The DSI API can be used to develop user-specific workflows.

.. figure:: data_lifecycle.png
   :alt: Figure depicting the data life cycle
   :class: with-shadow
   :scale: 50%

   A depiction of the data life cycle. The Data Science Infrastructure API helps the user manage the life cycle aspects of their data.

DSI system design has been driven by specific use cases, both AI/ML and more generic usage. These use cases can often be generalized to user stories and needs that can be addressed by specific features, e.g., flexible, human-readable query capabilities. DSI uses object-oriented design principles to encourage modularity and to support contributions by the user community. The DSI API is Python-based.

Implementation Overview
=======================

The DSI API is broken into three main categories:

- Plugins: these are frontend capabilities that will be commonly used by the generic DSI user. They include readers and writers.
- Backends: these are used to interact with storage devices and other ways of moving data.
- DSI Core: the *middleware* that contains the basic functionality to use the DSI API.

Plugin Abstract Classes
-----------------------

Plugins transform an arbitrary data source into a format that is compatible with the DSI core. The parsed and queryable attributes of the data are called *metadata* -- data about the data. Metadata share the same security profile as the source data.

Plugins can operate as data readers or data writers. A simple data reader might parse an application's output file and place it into a core-compatible data structure such as Python built-ins and members of the popular Python ``collections`` module. A simple data writer might execute an application to supplement existing data and queryable metadata, e.g., adding the locations of output data or plots after running an analysis workflow.

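For example, a reader's output can be as simple as an ``OrderedDict`` that maps column names to lists of values. The column names and values below are made up purely for illustration::

    >>> from collections import OrderedDict
    >>> parsed = OrderedDict()
    >>> parsed['app_name'] = ['sweep3d']        # hypothetical application name
    >>> parsed['runtime_seconds'] = [42.7]      # hypothetical measurement
    >>> parsed['hostname'] = ['cn001']          # hypothetical compute node
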
Plugins are defined by a base abstract class and support child abstract classes which inherit the properties of their ancestors.

Currently, DSI has the following readers:

- CSV file reader: reads in comma-separated value (CSV) files.
- Bueno reader: can be used to capture performance data from `Bueno <https://github.com/lanl/bueno>`_.

.. figure:: PluginClassHierarchy.png
   :alt: Figure depicting the current plugin class hierarchy.
   :class: with-shadow
   :scale: 100%

   The current DSI plugin class hierarchy.

Backend Abstract Classes
------------------------

Backends are an interface between the core and a storage medium.
Backends are designed to support user-needed functionality. Given a set of user metadata captured by a DSI frontend, a typical functionality needed by DSI users is to query that metadata with SQL. Because the files associated with the queryable metadata may be spread across filesystems and security domains, a supporting backend is required to assemble query results and present them to the DSI core for transformation and return.

.. figure:: user_story.png
   :alt: This figure depicts a user asking a typical query on the user's metadata
   :class: with-shadow
   :scale: 50%

   In this typical **user story**, the user has metadata about their data stored in DSI storage of some type. The user needs to extract all files with the variable **foo** above a specific threshold. DSI backends query the DSI metadata store to locate and return all such files.

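As a concrete sketch of this user story, the query below uses Python's built-in ``sqlite3`` module directly rather than the DSI API, and the database path, table name, column names, and threshold are assumptions made for illustration::

    >>> import sqlite3
    >>> con = sqlite3.connect('dsi_metadata.db')   # hypothetical DSI metadata store
    >>> cur = con.execute("SELECT file_path FROM metadata WHERE foo > ?", (0.9,))
    >>> cur.fetchall()
    >>> # [('/project/run_017/output.h5',), ('/project/run_042/output.h5',)]
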
Current DSI backends include:

- Sqlite: Python-based SQL database and backend; the default DSI API backend.
- GUFI: the `Grand Unified File Index <https://github.com/mar-file-system/GUFI>`_; developed at LANL, GUFI provides fast, secure metadata search across a filesystem, accessible to both privileged and unprivileged users.
- Parquet: a columnar storage format for `Apache Hadoop <https://hadoop.apache.org>`_.

DSI Core
--------

DSI basic functionality is contained within the middleware known as the *core*. The DSI core is focused on delivering user queries on unified metadata, which can be distributed across many files and security domains. DSI currently supports Linux and is tested on RedHat- and Debian-based distributions. The DSI core is a home for DSI Plugins and an interface for DSI Backends.
@@ -0,0 +1,23 @@
Plugins
=======
Plugins connect data-producing applications to DSI core functionalities. Plugins provide *reader* and *writer* functions. A Plugin reader function deals with existing data files or input streams, while a Plugin writer deals with generating new data. Plugins are modular to support user contribution.

Plugin contributors are encouraged to offer custom Plugin abstract classes and Plugin implementations. A contributed Plugin abstract class may extend another plugin to inherit the properties of the parent. To be compatible with the DSI core, Plugins should produce data in Python built-in data structures or data structures sourced from the Python ``collections`` library.

Note that any contributed plugins or extensions must include unit tests in ``plugins/tests`` to demonstrate the new Plugin capability.

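As a rough sketch of the contribution pattern, the reader below parses ``key,value`` lines into a core-compatible ``OrderedDict``. The class and method names are illustrative assumptions rather than the actual ``dsi.plugins`` API; a real contribution would subclass the appropriate abstract class and ship tests in ``plugins/tests``::

    from collections import OrderedDict

    class KeyValueReader:
        """Toy reader sketch: parses 'key,value' lines into an OrderedDict of columns."""

        def __init__(self, filename):
            self.filename = filename

        def read(self):
            # Produce data in Python built-ins / collections types, as required
            # for compatibility with the DSI core.
            data = OrderedDict()
            with open(self.filename) as f:
                for line in f:
                    key, value = line.strip().split(',', 1)
                    data.setdefault(key, []).append(value)
            return data
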
.. figure:: PluginClassHierarchy.png
   :alt: Figure depicting the current plugin class hierarchy.
   :class: with-shadow
   :scale: 100%

   The current DSI plugin class hierarchy.

.. automodule:: dsi.plugins.plugin
   :members:

.. automodule:: dsi.plugins.metadata
   :members:

.. automodule:: dsi.plugins.env
   :members:
@@ -0,0 +1,123 @@
/* Compatability shim for jQuery and underscores.js.
 *
 * Copyright Sphinx contributors
 * Released under the two clause BSD licence
 */

/**
 * small helper function to urldecode strings
 *
 * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/decodeURIComponent#Decoding_query_parameters_from_a_URL
 */
jQuery.urldecode = function(x) {
  if (!x) {
    return x
  }
  return decodeURIComponent(x.replace(/\+/g, ' '));
};

/**
 * small helper function to urlencode strings
 */
jQuery.urlencode = encodeURIComponent;

/**
 * This function returns the parsed url parameters of the
 * current request. Multiple values per key are supported,
 * it will always return arrays of strings for the value parts.
 */
jQuery.getQueryParameters = function(s) {
  if (typeof s === 'undefined')
    s = document.location.search;
  var parts = s.substr(s.indexOf('?') + 1).split('&');
  var result = {};
  for (var i = 0; i < parts.length; i++) {
    var tmp = parts[i].split('=', 2);
    var key = jQuery.urldecode(tmp[0]);
    var value = jQuery.urldecode(tmp[1]);
    if (key in result)
      result[key].push(value);
    else
      result[key] = [value];
  }
  return result;
};

/**
 * highlight a given string on a jquery object by wrapping it in
 * span elements with the given class name.
 */
jQuery.fn.highlightText = function(text, className) {
  function highlight(node, addItems) {
    if (node.nodeType === 3) {
      var val = node.nodeValue;
      var pos = val.toLowerCase().indexOf(text);
      if (pos >= 0 &&
          !jQuery(node.parentNode).hasClass(className) &&
          !jQuery(node.parentNode).hasClass("nohighlight")) {
        var span;
        var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg");
        if (isInSVG) {
          span = document.createElementNS("http://www.w3.org/2000/svg", "tspan");
        } else {
          span = document.createElement("span");
          span.className = className;
        }
        span.appendChild(document.createTextNode(val.substr(pos, text.length)));
        node.parentNode.insertBefore(span, node.parentNode.insertBefore(
          document.createTextNode(val.substr(pos + text.length)),
          node.nextSibling));
        node.nodeValue = val.substr(0, pos);
        if (isInSVG) {
          var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect");
          var bbox = node.parentElement.getBBox();
          rect.x.baseVal.value = bbox.x;
          rect.y.baseVal.value = bbox.y;
          rect.width.baseVal.value = bbox.width;
          rect.height.baseVal.value = bbox.height;
          rect.setAttribute('class', className);
          addItems.push({
            "parent": node.parentNode,
            "target": rect});
        }
      }
    }
    else if (!jQuery(node).is("button, select, textarea")) {
      jQuery.each(node.childNodes, function() {
        highlight(this, addItems);
      });
    }
  }
  var addItems = [];
  var result = this.each(function() {
    highlight(this, addItems);
  });
  for (var i = 0; i < addItems.length; ++i) {
    jQuery(addItems[i].parent).before(addItems[i].target);
  }
  return result;
};

/*
 * backward compatibility for jQuery.browser
 * This will be supported until firefox bug is fixed.
 */
if (!jQuery.browser) {
  jQuery.uaMatch = function(ua) {
    ua = ua.toLowerCase();

    var match = /(chrome)[ \/]([\w.]+)/.exec(ua) ||
      /(webkit)[ \/]([\w.]+)/.exec(ua) ||
      /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) ||
      /(msie) ([\w.]+)/.exec(ua) ||
      ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) ||
      [];

    return {
      browser: match[ 1 ] || "",
      version: match[ 2 ] || "0"
    };
  };
  jQuery.browser = {};
  jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true;
}