diff --git a/README.md b/README.md
index afad896..4b4f10b 100644
--- a/README.md
+++ b/README.md
@@ -2,73 +2,44 @@
-```python
-## Logging and plotting metrics
-```python
-from torch_snippets import Report
-
-n_epochs = 5
-log = Report(n_epochs)
-
-for epoch in range(n_epochs):
-    # No need to --> train_epoch_losses, train_epoch_accuracies = [], []
-    N = len(trn_dl)
-    for ix, batch in enumerate(iter(trn_dl)):
-        ...
-        pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs
-        # give any number of kwargs that need to be reported and stored.
-        # args should be float
-        log.record(pos=pos, train_acc=np.mean(is_correct), train_loss=batch_loss, end='\r') # impersistent log
-
-    N = len(val_dl)
-    for ix, batch in enumerate(iter(val_dl)):
-        ...
-        pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs
-        log.record(pos=pos, val_loss=batch_loss, end='\r') # impersistent log
-    log.report_avgs(epoch+1) # persist the report
+## Auxiliary Functions
+There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write
-
-```
-![](../assets/demo.gif)
-* Auto calculates time remaining
-* No need to preinitialize empty lists
-* Automatically stores metrics as collection of key words
-* Persistent vs Transient logging (use `end='\r'`)
-```python
->>> print(log.logged) # get logged metric names
-# ['train_loss', 'train_acc', 'val_loss', 'val_acc']
->>> log.plot() # plot all the logged metrics
-```
+#### Images
+`show`, `inspect`, `Glob`, `read`, `resize`, `rotate`
+#### Files and Paths
+`stem`, `Glob`, `parent`, `name`, `fname`,
-
-* Auto calculates average of all metrics in an epoch
-* Plot entire training history with one command
-![](../assets/avgs0.png)
+`makedir`, `zip_files`, `unzip_file`,
-
-* selectively plot logged metrics using regex
-```python
->>> log.plot('*_loss')
-# or you can list out the metrics that need plotting
-# >>> log.plot(['train_acc','val_acc'])
-```
-![](assets/avgs1.png)
-## Auxiliary Functions
-There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write
-#### Images
-`show`, `inspect`, `Glob`, `read`
-#### FilePaths
-`stem`, `Glob`, `parent`, `name`
+`find`, `extn`,
+
+
+`readlines`, `writelines`
+
+#### Lists
+`L`, `flatten`
+
 #### Dump and load python objects
 `loaddill`,`dumpdill`
+
 #### Misc
-`Tqdm`, `Timer`, `randint`, `unique`, `diff`
+`Tqdm`, `Timer`, `randint`, `Logger`
+
+#### Sets
+`unique`, `diff`, `choose`, `common`
+
 #### Pytorch Modules
 `Reshape` and `Permute` (`nn.Modules`)
+#### Report as Pytorch Lightning Callback
+`LightningReport`
+
+and many more to come...
-and many more... 
- 
 ## Install
 `pip install torch_snippets`
@@ -78,4 +49,3 @@
 import pytorch_snippets
 dir(pytorch_snippets)
 ```
-```
diff --git a/dist/torch_snippets-0.315-py3-none-any.whl b/dist/torch_snippets-0.315-py3-none-any.whl
deleted file mode 100644
index 220439b..0000000
Binary files a/dist/torch_snippets-0.315-py3-none-any.whl and /dev/null differ
diff --git a/docs/index.html b/docs/index.html
index 6ab3cc0..9e7d533 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -29,95 +29,29 @@
 {% endraw %}
-
- {% raw %}
-
-
- -
-
-
```python
-from torch_snippets import Report
-
-n_epochs = 5
-log = Report(n_epochs)
-
-for epoch in range(n_epochs):
-    # No need to --> train_epoch_losses, train_epoch_accuracies = [], []
-    N = len(trn_dl)
-    for ix, batch in enumerate(iter(trn_dl)):
-        ...
-        pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs
-        # give any number of kwargs that need to be reported and stored.
-        # args should be float
-        log.record(pos=pos, train_acc=np.mean(is_correct), train_loss=batch_loss, end='\r') # impersistent log
-
-    N = len(val_dl)
-    for ix, batch in enumerate(iter(val_dl)):
-        ...
-        pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs
-        log.record(pos=pos, val_loss=batch_loss, end='\r') # impersistent log
-    log.report_avgs(epoch+1) # persist the report
-
-```
-![](../assets/demo.gif)
-* Auto calculates time remaining
-* No need to preinitialize empty lists
-* Automatically stores metrics as collection of key words
-* Persistent vs Transient logging (use `end='\r'`)  
-```python
->>> print(log.logged) # get logged metric names
-# ['train_loss', 'train_acc', 'val_loss', 'val_acc']
->>> log.plot() # plot all the logged metrics
-```
-
-
-* Auto calculates average of all metrics in an epoch
-* Plot entire training history with one command  
-![](../assets/avgs0.png)
-
-
-* selectively plot logged metrics using regex
-```python
->>> log.plot('*_loss')
-# or you can list out the metrics that need plotting
-# >>> log.plot(['train_acc','val_acc'])
-```
-![](assets/avgs1.png)
-
-## Auxiliary Functions
-There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write
-#### Images
-`show`, `inspect`, `Glob`, `read`
-#### FilePaths
-`stem`, `Glob`, `parent`, `name`
-#### Dump and load python objects
-`loaddill`,`dumpdill`
-#### Misc 
-`Tqdm`, `Timer`, `randint`, `unique`, `diff`
-#### Pytorch Modules
-`Reshape` and `Permute` (`nn.Modules`)
-
-
-and many more... 
- 
-## Install
-`pip install torch_snippets`
-
-## Usage
-```python
-import pytorch_snippets
+
+Auxiliary Functions
+
+There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write
+
+Images
+show, inspect, Glob, read, resize, rotate
+
+Files and Paths
+stem, Glob, parent, name, fname,
+makedir, zip_files, unzip_file,
+find, extn,
+readlines, writelines
+
+Lists
+L, flatten
+
+Dump and load python objects
+loaddill, dumpdill
+
+Misc
+Tqdm, Timer, randint, Logger
+
+Sets
+unique, diff, choose, common
+
+Pytorch Modules
+Reshape and Permute (nn.Modules)
+
+Report as Pytorch Lightning Callback
+LightningReport
+
+and many more to come...
+
+Install
+pip install torch_snippets
+
+Usage
+import pytorch_snippets
 dir(pytorch_snippets)
-```
 
-
-
- {% endraw %}
-
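The page added above only lists the helper names. The sketch below strings a few of them together so the intent is concrete: the names (`Glob`, `read`, `show`, `stem`, `parent`, `extn`, `dumpdill`, `loaddill`, `flatten`, `unique`) are taken from this diff, but the exact signatures, arguments, and return types are assumptions and should be checked against the library's docstrings.

```python
# Illustrative sketch only: helper names come from the diff above;
# signatures and return types are assumed, not confirmed by this change.
from torch_snippets import Glob, read, show, stem, parent, extn, dumpdill, loaddill, flatten, unique

files = Glob('images/*.jpg')        # hypothetical folder of images
im = read(files[0], 1)              # assumed: second argument 1 loads the image in colour
show(im)                            # display the image

print(stem(files[0]))               # file name without the extension
print(parent(files[0]))             # containing folder
print(extn(files[0]))               # file extension

dumpdill(files, 'file_list.dill')   # persist any python object via dill
files_again = loaddill('file_list.dill')

print(unique(flatten([[1, 2], [2, 3]])))  # assumed list/set helpers: flatten, then de-duplicate
```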
diff --git a/docs/report.html b/docs/report.html
index def6e02..66df76b 100644
--- a/docs/report.html
+++ b/docs/report.html
@@ -220,6 +220,31 @@
 {% endraw %}
+
+Features
+
+  • Auto calculates time remaining
+  • No need to preinitialize empty lists
+  • Automatically stores metrics as collection of key words
+  • Persistent vs Transient logging (use end='\r')
+
+>>> print(log.logged) # get logged metric names
+# ['train_loss', 'train_acc', 'val_loss', 'val_acc']
+>>> log.plot() # plot all the logged metrics
+
+  • Auto calculates average of all metrics in an epoch
+  • Plot entire training history with one command
+
+  • selectively plot logged metrics using regex
+
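The Features block added above refers to the `Report` logger whose full walkthrough this diff removes from the README. For reference, here is that pattern in condensed, runnable form; the dataloaders and the train/validation steps are stand-ins you would replace with your own.

```python
import numpy as np
from torch_snippets import Report

# Stand-ins so the sketch runs end-to-end; replace with your own dataloaders and steps.
trn_dl = [None] * 100
val_dl = [None] * 20

def train_batch(batch):       # placeholder training step, returns (loss, acc)
    return np.random.rand(), np.random.rand()

def validate_batch(batch):    # placeholder validation step, returns loss
    return np.random.rand()

n_epochs = 5
log = Report(n_epochs)

for epoch in range(n_epochs):
    N = len(trn_dl)
    for ix, batch in enumerate(trn_dl):
        batch_loss, batch_acc = train_batch(batch)
        pos = epoch + (ix + 1) / N                     # float position between 0 and n_epochs
        log.record(pos=pos, train_loss=batch_loss, train_acc=batch_acc, end='\r')  # transient log

    N = len(val_dl)
    for ix, batch in enumerate(val_dl):
        batch_loss = validate_batch(batch)
        pos = epoch + (ix + 1) / N
        log.record(pos=pos, val_loss=batch_loss, end='\r')
    log.report_avgs(epoch + 1)                         # persist the per-epoch averages

print(log.logged)    # names of every logged metric
log.plot()           # plot the full history
log.plot('*_loss')   # or select metrics with a pattern, as the bullets above note
```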
diff --git a/nbs/index.ipynb b/nbs/index.ipynb index 7a5c116..e15a536 100644 --- a/nbs/index.ipynb +++ b/nbs/index.ipynb @@ -8,77 +8,47 @@ ] }, { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ - "## Logging and plotting metrics\n", - "```python\n", - "from torch_snippets import Report\n", - "\n", - "n_epochs = 5\n", - "log = Report(n_epochs)\n", - "\n", - "for epoch in range(n_epochs):\n", - " # No need to --> train_epoch_losses, train_epoch_accuracies = [], []\n", - " N = len(trn_dl)\n", - " for ix, batch in enumerate(iter(trn_dl)):\n", - " ...\n", - " pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs\n", - " # give any number of kwargs that need to be reported and stored.\n", - " # args should be float\n", - " log.record(pos=pos, train_acc=np.mean(is_correct), train_loss=batch_loss, end='\\r') # impersistent log\n", - "\n", - " N = len(val_dl)\n", - " for ix, batch in enumerate(iter(val_dl)):\n", - " ...\n", - " pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs\n", - " log.record(pos=pos, val_loss=batch_loss, end='\\r') # impersistent log\n", - " log.report_avgs(epoch+1) # persist the report\n", - "\n", - "```\n", - "![](../assets/demo.gif)\n", - "* Auto calculates time remaining\n", - "* No need to preinitialize empty lists\n", - "* Automatically stores metrics as collection of key words\n", - "* Persistent vs Transient logging (use `end='\\r'`) \n", - "```python\n", - ">>> print(log.logged) # get logged metric names\n", - "# ['train_loss', 'train_acc', 'val_loss', 'val_acc']\n", - ">>> log.plot() # plot all the logged metrics\n", - "```\n", + "## Auxiliary Functions\n", + "There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write\n", "\n", + "#### Images\n", + "`show`, `inspect`, `Glob`, `read`, `resize`, `rotate`\n", "\n", - "* Auto calculates average of all metrics in an epoch\n", - "* Plot entire training history with one command \n", - "![](../assets/avgs0.png)\n", + "#### Files and Paths\n", + "`stem`, `Glob`, `parent`, `name`, `fname`,\n", "\n", "\n", - "* selectively plot logged metrics using regex\n", - "```python\n", - ">>> log.plot('*_loss')\n", - "# or you can list out the metrics that need plotting\n", - "# >>> log.plot(['train_acc','val_acc'])\n", - "```\n", - "![](assets/avgs1.png)\n", + "`makedir`, `zip_files`, `unzip_file`, \n", + "\n", + "\n", + "`find`, `extn`, \n", + "\n", + "\n", + "`readlines`, `writelines`\n", + "\n", + "#### Lists\n", + "`L`, `flatten`\n", "\n", - "## Auxiliary Functions\n", - "There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write\n", - "#### Images\n", - "`show`, `inspect`, `Glob`, `read`\n", - "#### FilePaths\n", - "`stem`, `Glob`, `parent`, `name`\n", "#### Dump and load python objects\n", "`loaddill`,`dumpdill`\n", + "\n", "#### Misc \n", - "`Tqdm`, `Timer`, `randint`, `unique`, `diff`\n", + "`Tqdm`, `Timer`, `randint`, `Logger`\n", + "\n", + "#### Sets\n", + "`unique`, `diff`, `choose`, `common`\n", + "\n", "#### Pytorch Modules\n", "`Reshape` and `Permute` (`nn.Modules`)\n", "\n", + "#### Report as Pytorch Lightning Callback\n", + "`LightningReport`\n", + "\n", + "and many more to come... \n", "\n", - "and many more... 
\n", - " \n", "## Install\n", "`pip install torch_snippets`\n", "\n", @@ -88,6 +58,13 @@ "dir(pytorch_snippets)\n", "```\n" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { diff --git a/nbs/report.ipynb b/nbs/report.ipynb index bc63f71..5451fac 100644 --- a/nbs/report.ipynb +++ b/nbs/report.ipynb @@ -256,6 +256,32 @@ "log.plot(['val_loss','train_loss'])" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Features\n", + "\n", + "![](https://i.imgur.com/5iZl2s3.gif)\n", + "* Auto calculates time remaining\n", + "* No need to preinitialize empty lists\n", + "* Automatically stores metrics as collection of key words\n", + "* Persistent vs Transient logging (use `end='\\r'`) \n", + "```python\n", + ">>> print(log.logged) # get logged metric names\n", + "# ['train_loss', 'train_acc', 'val_loss', 'val_acc']\n", + ">>> log.plot() # plot all the logged metrics\n", + "```\n", + "\n", + "\n", + "* Auto calculates average of all metrics in an epoch\n", + "* Plot entire training history with one command \n", + "![](https://i.imgur.com/BrQIaR7.png)\n", + "\n", + "\n", + "* selectively plot logged metrics using regex" + ] + }, { "cell_type": "code", "execution_count": null, diff --git a/scripts.ipynb b/scripts.ipynb index e67e8f4..73359fb 100644 --- a/scripts.ipynb +++ b/scripts.ipynb @@ -9,7 +9,7 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 21, "metadata": {}, "outputs": [ { @@ -18,7 +18,8 @@ "text": [ "Converted charts.ipynb.\n", "Converted index.ipynb.\n", - "Converted report.ipynb.\n" + "Converted report.ipynb.\n", + "Converted show.ipynb.\n" ] } ], @@ -27,24 +28,6 @@ "notebook2script()" ] }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Create a softlink" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "%cd nbs\n", - "!ln -s ../ .\n", - "%cd .." - ] - }, { "cell_type": "markdown", "metadata": {}, @@ -54,7 +37,7 @@ }, { "cell_type": "code", - "execution_count": 13, + "execution_count": 22, "metadata": {}, "outputs": [ { @@ -71,7 +54,13 @@ "name": "stdout", "output_type": "stream", "text": [ + "converting: /Users/yreddy31/Documents/github/torch_snippets/nbs/show.ipynb\n", "converting: /Users/yreddy31/Documents/github/torch_snippets/nbs/report.ipynb\n", + "converting: /Users/yreddy31/Documents/github/torch_snippets/nbs/charts.ipynb\n", + "converting: /Users/yreddy31/Documents/github/torch_snippets/nbs/index.ipynb\n", + "Path('/Users/yreddy31/Documents/github/torch_snippets/nbs/../assets/demo.gif') and Path('/Users/yreddy31/Documents/github/torch_snippets/docs/../assets/demo.gif') are the same file\n", + "Conversion failed on the following:\n", + "index.ipynb\n", "converting /Users/yreddy31/Documents/github/torch_snippets/nbs/index.ipynb to README.md\n" ] } @@ -97,6 +86,34 @@ "from nbdev.sync import nbdev_update_lib\n", "nbdev_update_lib()" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create a softlink" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "/Users/yreddy31/Documents/github/torch_snippets/nbs\n", + "/bin/sh: libname: No such file or directory\n", + "/Users/yreddy31/Documents/github/torch_snippets\n" + ] + } + ], + "source": [ + "%cd nbs\n", + "!ln -s ../ .\n", + "%cd .." 
+ ] } ], "metadata": { diff --git a/settings.ini b/settings.ini index e71c959..4746ec6 100644 --- a/settings.ini +++ b/settings.ini @@ -13,7 +13,7 @@ author = Yeshwanth Reddy author_email = 1992chinna@gmail.com copyright = sizhky branch = master -version = 0.322 +version = 0.400 min_python = 3.6 audience = Developers language = English diff --git a/torch_snippets.egg-info/PKG-INFO b/torch_snippets.egg-info/PKG-INFO index 361a18a..b66b24a 100644 --- a/torch_snippets.egg-info/PKG-INFO +++ b/torch_snippets.egg-info/PKG-INFO @@ -1,6 +1,6 @@ Metadata-Version: 2.1 Name: torch-snippets -Version: 0.322 +Version: 0.400 Summary: One line functions for common tasks Home-page: https://github.com/sizhky/torch_snippets/tree/master/ Author: Yeshwanth Reddy @@ -10,73 +10,44 @@ Description: # Utilities for simple needs - ```python - ## Logging and plotting metrics - ```python - from torch_snippets import Report - - n_epochs = 5 - log = Report(n_epochs) - - for epoch in range(n_epochs): - # No need to --> train_epoch_losses, train_epoch_accuracies = [], [] - N = len(trn_dl) - for ix, batch in enumerate(iter(trn_dl)): - ... - pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs - # give any number of kwargs that need to be reported and stored. - # args should be float - log.record(pos=pos, train_acc=np.mean(is_correct), train_loss=batch_loss, end='\r') # impersistent log - - N = len(val_dl) - for ix, batch in enumerate(iter(val_dl)): - ... - pos = (epoch + (ix+1)/N) # a float between 0 - n_epochs - log.record(pos=pos, val_loss=batch_loss, end='\r') # impersistent log - log.report_avgs(epoch+1) # persist the report + ## Auxiliary Functions + There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write - ``` - ![](../assets/demo.gif) - * Auto calculates time remaining - * No need to preinitialize empty lists - * Automatically stores metrics as collection of key words - * Persistent vs Transient logging (use `end='\r'`) - ```python - >>> print(log.logged) # get logged metric names - # ['train_loss', 'train_acc', 'val_loss', 'val_acc'] - >>> log.plot() # plot all the logged metrics - ``` + #### Images + `show`, `inspect`, `Glob`, `read`, `resize`, `rotate` + #### Files and Paths + `stem`, `Glob`, `parent`, `name`, `fname`, - * Auto calculates average of all metrics in an epoch - * Plot entire training history with one command - ![](../assets/avgs0.png) + `makedir`, `zip_files`, `unzip_file`, - * selectively plot logged metrics using regex - ```python - >>> log.plot('*_loss') - # or you can list out the metrics that need plotting - # >>> log.plot(['train_acc','val_acc']) - ``` - ![](assets/avgs1.png) - ## Auxiliary Functions - There are simple functions that are overloaded to take inputs and perform repetitive tasks that usually take a few lines to write - #### Images - `show`, `inspect`, `Glob`, `read` - #### FilePaths - `stem`, `Glob`, `parent`, `name` + `find`, `extn`, + + + `readlines`, `writelines` + + #### Lists + `L`, `flatten` + #### Dump and load python objects `loaddill`,`dumpdill` + #### Misc - `Tqdm`, `Timer`, `randint`, `unique`, `diff` + `Tqdm`, `Timer`, `randint`, `Logger` + + #### Sets + `unique`, `diff`, `choose`, `common` + #### Pytorch Modules `Reshape` and `Permute` (`nn.Modules`) + #### Report as Pytorch Lightning Callback + `LightningReport` + + and many more to come... - and many more... 
-        
         ## Install
         `pip install torch_snippets`
         
@@ -86,7 +57,6 @@ Description: # Utilities for simple needs
         dir(pytorch_snippets)
         ```
         
-        ```
         
Keywords: snippets,torch
Platform: UNKNOWN
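Both the README and PKG-INFO changes mention `Reshape` and `Permute` as `nn.Module`s. The point of such modules is that shape plumbing can sit inside an `nn.Sequential` instead of a hand-written `forward`; the constructor conventions below (target shape for `Reshape`, dimension order for `Permute`) are assumed rather than confirmed by this diff.

```python
import torch
from torch import nn
from torch_snippets import Reshape, Permute  # modules named in the diff; signatures assumed

# Assumed conventions: Reshape(*target_shape) wraps tensor.view/reshape,
# Permute(*dim_order) wraps tensor.permute.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    Permute(0, 2, 3, 1),         # NCHW -> NHWC as a layer rather than a forward() call
    Reshape(-1, 32 * 32 * 16),   # flatten as a layer
    nn.Linear(32 * 32 * 16, 10),
)

x = torch.randn(8, 3, 32, 32)
print(model(x).shape)  # expected under these assumptions: torch.Size([8, 10])
```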