Develop (#28)
- Updated IOSXE base config to include netconf setup for consistency w/ scrapli_netconf
- Removed "pipes" authentication for system ssh -- this is mostly an internal change that simplifies the way that system transport authenticates. We lose the ability to easily read stderr to see what is going on, so even when we auth with a key we now have to "confirm" that we are authenticated, but this removes a fair bit of code, unifies things, and enables the next line item...
- Added support for `auth_private_key_passphrase` to system transport -- allows for entering ssh key passphrase to decrypt ssh keys
- Added an example of how to deal with "weird" things like banners and macros -- these types of things change how the ssh channel works in that they are pseudo "interactive" -- meaning the prompt is modified/removed, so scrapli can't ever "know" when a command is done being entered. It would be possible to support these types of config items more "natively", but doing so would lose some of the smarts about how scrapli enters/confirms inputs sent, so for now (and probably forever) these will need to be configured in a "special" fashion
- Updated IOSXE for functional tests to use 16.12.03 -- this includes updates to the base config/expected configs... AFAIK there is some better netconf/restconf support in this version which may be handy for tests for scrapli-netconf
- Update channel/drivers to never decode bytes -- this now only happens in the response object; primary motivation for this is to not have to decode/re-encode in general, and in scrapli-netconf in particular
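A hedged sketch of what the new `auth_private_key_passphrase` option looks like in use -- the host, key path, and passphrase below are placeholders for illustration, not values taken from this commit:

```python
# Hypothetical connection settings -- host, key path, and passphrase are
# placeholders; system transport (the default, OpenSSH-based one) is what
# gained passphrase support in this commit.
my_device = {
    "host": "172.18.0.11",
    "auth_username": "vrnetlab",
    "auth_private_key": "/path/to/encrypted_ssh_key",
    "auth_private_key_passphrase": "my-key-passphrase",
    "auth_strict_key": False,
}

# with scrapli installed, the dict is splatted into a driver as usual:
# from scrapli.driver.core import IOSXEDriver
# conn = IOSXEDriver(**my_device)
# conn.open()
```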
carlmontanari authored Jul 4, 2020
1 parent 0f06079 commit f7fefeb
Showing 39 changed files with 662 additions and 483 deletions.
1 change: 0 additions & 1 deletion .github/workflows/commit.yaml
@@ -42,7 +42,6 @@ jobs:
python -m pip install --upgrade pip
python -m pip install setuptools
python -m pip install nox
chmod 0600 tests/test_data/files/vrnetlab_key
- name: run nox
env:
# needed to make the terminal a tty (i think? without this system ssh is super broken)
1 change: 0 additions & 1 deletion .github/workflows/weekly.yaml
@@ -47,7 +47,6 @@ jobs:
python -m pip install --upgrade pip
python -m pip install setuptools
python -m pip install nox
chmod 0600 tests/test_data/files/vrnetlab_key
- name: run nox
env:
# needed to make the terminal a tty (i think? without this system ssh is super broken)
18 changes: 18 additions & 0 deletions CHANGELOG.md
@@ -1,6 +1,24 @@
CHANGELOG
=======

# 2020.XX.XX
- Updated IOSXE base config to include netconf setup for consistency w/ scrapli_netconf
- Removed "pipes" authentication for system ssh -- this is mostly an internal change that simplifies the way that
system transport authenticates. We lose the ability to easily read stderr to see what is going on, so even when we
auth with a key we now have to "confirm" that we are authenticated, but this removes a fair bit of code, unifies
things, and enables the next line item...
- Added support for `auth_private_key_passphrase` to system transport -- allows for entering ssh key passphrase to
decrypt ssh keys
- Added an example of how to deal with "weird" things like banners and macros -- these types of things change how the
ssh channel works in that they are pseudo "interactive" -- meaning the prompt is modified/removed, so scrapli can't
ever "know" when a command is done being entered. It would be possible to support these types of config items more
"natively", but doing so would lose some of the smarts about how scrapli enters/confirms inputs sent, so for now
(and probably forever) these will need to be configured in a "special" fashion
- Updated IOSXE for functional tests to use 16.12.03 -- this includes updates to the base config/expected configs
... AFAIK there is some better netconf/restconf support in this version which may be handy for tests for scrapli-netconf
- Update channel/drivers to never decode bytes -- this now only happens in the response object; primary motivation
for this is to not have to decode/re-encode in general, and in scrapli-netconf in particular

# 2020.06.06
- Converted all priv levels to be kwargs instead of just args for setup -- simple thing but makes it more readable IMO.
- Added to the Juniper prompt pattern to include matching the RE prompt that is on the line "above" the "normal
26 changes: 19 additions & 7 deletions README.md
@@ -27,6 +27,7 @@ Feel free to join the very awesome networktocode slack workspace [here](https://
- [More Examples](#more-examples)
- [Documentation](#documentation)
- [Wiki](#wiki)
- [Other Stuff](#other-stuff)
- [scrapli: What is it](#scrapli-what-is-it)
- [Supported Platforms](#supported-platforms)
- [Advanced Installation](#advanced-installation)
@@ -134,6 +135,7 @@ end
- [Transport Options](examples/transport_options/system_ssh_args.py)
- [Configuration Modes - IOSXR Configure Exclusive](examples/configuration_modes/iosxr_configure_exclusive.py)
- [Configuration Modes - EOS Configure Session](examples/configuration_modes/eos_configure_session.py)
- [Banners, Macros, and other "weird" Things](examples/banners_macros_etc/iosxe_banners_macros_etc.py)


## Documentation
@@ -162,6 +164,14 @@ Extra, generally platform/transport-specific, examples/documentation/information
with this repository. You can find it [here](https://github.com/carlmontanari/scrapli/wiki).


## Other Stuff

Other scrapli related docs/blogs/videos/info:

- [Scrapli on Dmitry Figol's Network Automation Channel](https://www.youtube.com/watch?v=OJa2typq7yI)
- [Scrapli Intro on Wim Wauters' blog](https://blog.wimwauters.com/networkprogrammability/2020-04-09_scrapli_introduction/)


# scrapli: What is it

As stated, scrapli is a python library focused on connecting to devices, specifically network devices via SSH or Telnet.
@@ -238,7 +248,7 @@ scrapli "core" drivers cover basically the [NAPALM](https://github.com/napalm-au
synchronous and an asynchronous version of each of these drivers. Below are the core driver platforms and
currently tested version.

- Cisco IOS-XE (tested on: 16.04.01)
- Cisco IOS-XE (tested on: 16.12.03)
- Cisco NX-OS (tested on: 9.2.4)
- Juniper JunOS (tested on: 17.3R2.10)
- Cisco IOS-XR (tested on: 6.5.3)
@@ -312,7 +322,7 @@ The available optional installation extras options are:
If you would like to install all of the optional extras, you can do so with the `full` option:

```
pip isntall scrapli[full]
pip install scrapli[full]
```

As for platforms to *run* scrapli on -- it has and will be tested on MacOS and Ubuntu regularly and should work on any
@@ -729,7 +739,8 @@ The basic usage section outlined the most commonly used driver arguments, this o
| auth_username | username for authentication | Scrape |
| auth_password | password for authentication | Scrape |
| auth_secondary | password for secondary authentication (enable password) | NetworkDriver |
| auth_private_key | private key for authentication | Scrape |
| auth_private_key | private key for authentication | Scrape |
| auth_private_key_passphrase | passphrase for ssh key | Scrape |
| auth_strict_key | strict key checking -- TRUE by default! | Scrape |
| auth_bypass | bypass ssh auth prompts after ssh establishment | Scrape |
| timeout_socket | timeout value for initial socket connection | Scrape |
@@ -1038,7 +1049,7 @@ Without the `send_command` and similar methods, you must directly access the `Ch
Using the `Scrape` driver directly is nice enough, however you may not want to have to change the prompt pattern, or
deal with accessing the channel to send commands to the device. In this case there is a `GenericDriver` available to
you. This driver has a *very* broad pattern that it matches for base prompts, has no concept of disabling paging or
privilege levels (like `Scrape`), but does provide `send_command`, `send_commands`, `send_interact`, and
privilege levels (like `Scrape`), but does provide `send_command`, `send_commands`, `send_interactive`, and
`get_prompt` methods for a more NetworkDriver-like experience.

Hopefully this `GenericDriver` can be used as a starting point for devices that don't fall under the core supported
@@ -1259,14 +1270,15 @@ scrapli.exceptions.ScrapliCommandFailure

- Any arguments passed to the `SystemSSHTransport` class will override arguments in your ssh config file. This is
because the arguments get crafted into an "open_cmd" (the command that actually fires off the ssh session), and
these cli arguments take precedence over the config file arguments.
these cli arguments take precedence over the config file arguments. The most important implication of this is the
`auth_strict_key` setting, so keep that in mind!
- If you set `ssh_config_file` to `False` the `SystemSSHTransport` class will set the config file used to `/dev/null
` so that no ssh config file configs are accidentally used.
- There is zero Windows support for system ssh transport - I would strongly encourage the use of WSL or cygwin and
sticking with systemssh instead of using paramiko/ssh2 natively in Windows -- system ssh is very much the focus of
development for scrapli!
- SystemSSH needs to have a terminal set -- without this it fails. My understanding is that without a terminal being
set there is no tty which causes the popen/ptyprocess portions of scrapli to not be able to read from the session
set there is no tty which causes the ptyprocess portions of scrapli to not be able to read from the session
. The fix for this is simply to ensure that there is a `TERM` set -- for example in the GitHub Actions setup for
systemssh tests we simply set `TERM=xterm` as an environment variable. Setting this within scrapli did not seem to
have any affect, but is something worth revisiting later -- meaning it would be nice to have scrapli be able to set
@@ -1302,7 +1314,7 @@ scrapli.exceptions.ScrapliCommandFailure

## asyncssh

- scrapli asyncssh is not production ready yet!
- None yet

### SSH Config Supported Arguments

66 changes: 66 additions & 0 deletions examples/banners_macros_etc/iosxe_banners_macros_etc.py
@@ -0,0 +1,66 @@
"""examples.banners_macros_etc.iosxe_banners_macros_etc"""
from scrapli.driver.core import IOSXEDriver

MY_DEVICE = {
    "host": "172.18.0.11",
    "auth_username": "vrnetlab",
    "auth_password": "VR-netlab9",
    "auth_strict_key": False,
}


def main():
    """Simple example of configuring banners and macros on an IOSXEDevice"""
    conn = IOSXEDriver(**MY_DEVICE)
    conn.open()

    my_banner = """This is my router, get outa here!
I'm serious, you can't be in here!
Go away!
"""

    # the overall pattern/process is that we must use send_interactive as this is an "interactive"
    # style command/input because the prompt changes and relies on a human to understand what is
    # going on. this whole operation is completed by the `send_interactive` method, but we break it
    # up here so it's easier to understand what is going on. first we have a "start" point -- where
    # we send the actual command that kicks things off -- in this case "banner motd ^" -- we need to
    # tell scrapli what to expect so it knows there is success; "Enter TEXT message." in this
    # example. We set the "hidden input" to `True` because this forces scrapli to not try to read
    # the inputs back off the channel -- we can't read the inputs because they are interrupted by
    # the prompt of enter your text blah blah.
    banner_start = ("banner motd ^", "Enter TEXT message.", True)
    # next we can simply create an "event" for each line of the banner we want to send, we don't
    # need to set the "hidden_prompt" value to `True` here because scrapli can simply read the
    # inputs off the channel normally as there are no prompts/inputs from the device
    banner_lines = [(line, "\n") for line in my_banner.splitlines()]
    # then we need to "end" our interactive event and ensure scrapli knows how to find the prompt
    # that we'll be left at at the end of this operation. note that you could just capture the
    # config mode prompt via `get_prompt` if you wanted and pass that value here, but we'll set it
    # manually for this example
    banner_end = ("^", "csr1000v(config)#", True)
    # finally we need to add all these sections up into a single list of tuples so that we can pass
    # this to the `send_interactive` method -- note the `*` in front of the `banner_lines` argument
    # we "unpack" the tuples from the list into this final list object
    banner_events = [banner_start, *banner_lines, banner_end]
    result = conn.send_interactive(interact_events=banner_events, privilege_level="configuration")
    print(result.result)

    # Note: csr1000v (at least the version scrapli is regularly tested with) does not support
    # macros; the following has been tested and works on a 3560 switch
    my_macro = """# description
desc this_is_a_neat_macro
# do a thing
power inline never
"""

    macro_start = ("macro name my_macro", "Enter macro commands one per line.", True)
    macro_lines = [(line, "\n", True) for line in my_macro.splitlines()]
    macro_end = ("@", "csr1000v(config)#", True)
    macro_events = [macro_start, *macro_lines, macro_end]
    result = conn.send_interactive(interact_events=macro_events, privilege_level="configuration")
    print(result.result)


if __name__ == "__main__":
    main()
5 changes: 5 additions & 0 deletions noxfile.py
@@ -50,12 +50,17 @@ def unit_tests(session):
N/A
"""
# ensure test ssh key permissions are appropriate
session.run("chmod", "0600", "tests/test_data/files/vrnetlab_key", external=True)
session.run("chmod", "0600", "tests/test_data/files/vrnetlab_key_encrypted", external=True)

# install this repo in editable mode so that other scrapli libs can depend on a yet to be
# released version. for example, scrapli_asyncssh is new and released and requires the *next*
# release of scrapli; if we set the version to the next release in __init__ and install locally
# we can avoid a kind of circular dependency thing where pypi version of scrapli is not yet
# updated to match the new pins in other scrapli libs
session.install("-e", ".")

session.install("-r", "requirements-dev.txt")
session.run(
"pytest",
12 changes: 6 additions & 6 deletions requirements-dev.txt
@@ -1,16 +1,16 @@
nox>=2020.5.24
black>=19.10b0
isort>=4.3.21
mypy>=0.780
isort==4.3.21
mypy>=0.782
pytest>=5.4.3
pytest-cov>=2.9.0
pytest-asyncio>=0.12.0
pytest-cov>=2.10.0
pytest-asyncio>=0.14.0
pyfakefs>=4.0.2
pylama>=7.7.1
pycodestyle>=2.6.0
pydocstyle>=5.0.2
pylint>=2.5.2
darglint>=1.4.0
pylint>=2.5.3
darglint>=1.4.1
pdoc3>=0.8.1 ; sys_platform != "win32"
asyncssh>=2.2.1
napalm>=3.0.1
8 changes: 5 additions & 3 deletions scrapli/channel/async_channel.py
@@ -147,7 +147,9 @@ async def get_prompt(self) -> str:
current_prompt = channel_match.group(0)
return current_prompt.decode().strip()

async def send_input(self, channel_input: str, strip_prompt: bool = True) -> Tuple[str, str]:
async def send_input(
self, channel_input: str, strip_prompt: bool = True
) -> Tuple[bytes, bytes]:
"""
Primary entry point to send data to devices in async shell mode; accept input, return result
@@ -167,7 +169,7 @@ async def send_input(self, channel_input: str, strip_prompt: bool = True) -> Tup
raw_result, processed_result = await self._async_send_input(
channel_input=channel_input, strip_prompt=strip_prompt
)
return raw_result.decode(), processed_result.decode()
return raw_result, processed_result

@operation_timeout("timeout_ops", "Timed out sending input to device.")
async def _async_send_input(
@@ -209,7 +211,7 @@ async def _async_send_input(
@operation_timeout("timeout_ops", "Timed out sending interactive input to device.")
async def send_inputs_interact(
self, interact_events: List[Tuple[str, str, Optional[bool]]]
) -> Tuple[str, str]:
) -> Tuple[bytes, bytes]:
"""
Async interact with a device with changing prompts per input.
6 changes: 3 additions & 3 deletions scrapli/channel/base_channel.py
@@ -205,7 +205,7 @@ def _pre_send_inputs_interact(interact_events: List[Tuple[str, str, Optional[boo
if not isinstance(interact_events, list):
raise TypeError(f"`interact_events` expects a List, got {type(interact_events)}")

def _post_send_inputs_interact(self, output: bytes) -> Tuple[str, str]:
def _post_send_inputs_interact(self, output: bytes) -> Tuple[bytes, bytes]:
"""
Handle pre "send_inputs_interact" tasks for consistency between sync/async versions
@@ -220,6 +220,6 @@ def _post_send_inputs_interact(self, output: bytes) -> Tuple[str, str]:
"""
processed_output = self._restructure_output(output=output, strip_prompt=False)
raw_result = output.decode()
processed_result = processed_output.decode()
raw_result = output
processed_result = processed_output
return raw_result, processed_result
6 changes: 3 additions & 3 deletions scrapli/channel/channel.py
@@ -145,7 +145,7 @@ def get_prompt(self) -> str:
current_prompt = channel_match.group(0)
return current_prompt.decode().strip()

def send_input(self, channel_input: str, strip_prompt: bool = True,) -> Tuple[str, str]:
def send_input(self, channel_input: str, strip_prompt: bool = True) -> Tuple[bytes, bytes]:
"""
Primary entry point to send data to devices in shell mode; accept input and returns result
@@ -165,7 +165,7 @@ def send_input(self, channel_input: str, strip_prompt: bool = True,) -> Tuple[st
raw_result, processed_result = self._send_input(
channel_input=channel_input, strip_prompt=strip_prompt
)
return raw_result.decode(), processed_result.decode()
return raw_result, processed_result

@operation_timeout("timeout_ops", "Timed out sending input to device.")
def _send_input(self, channel_input: str, strip_prompt: bool) -> Tuple[bytes, bytes]:
@@ -202,7 +202,7 @@ def _send_input(self, channel_input: str, strip_prompt: bool) -> Tuple[bytes, by
@operation_timeout("timeout_ops", "Timed out sending interactive input to device.")
def send_inputs_interact(
self, interact_events: List[Tuple[str, str, Optional[bool]]]
) -> Tuple[str, str]:
) -> Tuple[bytes, bytes]:
"""
Interact with a device with changing prompts per input.
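The bytes-only channel change in the diffs above can be illustrated with a minimal stand-in (these are not scrapli's actual classes): the channel hands back raw bytes, and decoding happens exactly once, at the response boundary.

```python
# Minimal sketch of the "decode once at the boundary" pattern -- class and
# function names here are illustrative stand-ins, not scrapli's API.
class Response:
    def __init__(self, raw_result: bytes) -> None:
        self.raw_result = raw_result  # bytes kept as-is, never re-encoded

    @property
    def result(self) -> str:
        # decoding happens exactly once, when the consumer asks for a str
        return self.raw_result.decode()


def fake_channel_send_input() -> bytes:
    # channels/drivers now deal purely in bytes
    return b"hostname router1\r\n"


response = Response(raw_result=fake_channel_send_input())
print(response.result.strip())  # -> hostname router1
```

Libraries layered on top (scrapli-netconf in particular) can then work on the raw bytes directly instead of paying for a decode/re-encode round trip.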
26 changes: 15 additions & 11 deletions scrapli/decorators.py
@@ -1,8 +1,8 @@
"""scrapli.decorators"""
from concurrent.futures import ThreadPoolExecutor, wait
import multiprocessing.pool
from typing import TYPE_CHECKING, Any, Callable, Dict, Union

from scrapli.exceptions import ConnectionNotOpened
from scrapli.exceptions import ConnectionNotOpened, ScrapliTimeout

if TYPE_CHECKING:
from scrapli.channel import Channel # pragma: no cover
@@ -15,7 +15,7 @@ def operation_timeout(attribute: str, message: str = "") -> Callable[..., Any]:
Wrap an operation, check class for given attribute and use that for the timeout duration.
Historically this operation timeout decorator used signals instead of the concurrent_futures
Historically this operation timeout decorator used signals instead of the multiprocessing
seen here. The signals method was probably a bit more elegant, however there were issues with
supporting the system transport as system transport subprocess/ptyprocess components spawn
threads of their own, and signals must operate in the main thread.
@@ -29,14 +29,14 @@ def operation_timeout(attribute: str, message: str = "") -> Callable[..., Any]:
decorate: wrapped function
Raises:
TimeoutError: if timeout exceeded
ScrapliTimeout: if timeout exceeded
"""

def decorate(wrapped_func: Callable[..., Any]) -> Callable[..., Any]:
def timeout_wrapper(
channel_or_transport: Union["Channel", "Transport"],
*args: Union[str, int],
*args: Any,
**kwargs: Dict[str, Union[str, int]],
) -> Any:
# import here to avoid circular dependency
@@ -63,18 +63,22 @@ def timeout_wrapper(
session_lock = channel_or_transport.session_lock
close = channel_or_transport.close

pool = ThreadPoolExecutor(max_workers=1)
future = pool.submit(wrapped_func, channel_or_transport, *args, **kwargs)
wait([future], timeout=timeout_duration)
if not future.done():
pool = multiprocessing.pool.ThreadPool(processes=1)
func_args = [channel_or_transport, *args]
future = pool.apply_async(wrapped_func, func_args, kwargs)
try:
result = future.get(timeout=timeout_duration)
pool.terminate()
return result
except multiprocessing.context.TimeoutError:
pool.terminate()
channel_or_transport.logger.info(message)
if timeout_exit:
channel_or_transport.logger.info("timeout_exit is True, closing transport")
if session_lock.locked():
session_lock.release()
close()
raise TimeoutError(message)
return future.result()
raise ScrapliTimeout(message)

return timeout_wrapper

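A simplified, self-contained sketch of the `multiprocessing.pool.ThreadPool` timeout pattern the diff above moves to -- `ScrapliTimeout` here is a local stand-in for scrapli's exception, and the session-lock/transport-close handling from the real decorator is omitted:

```python
import multiprocessing.pool
import time


class ScrapliTimeout(Exception):
    """Stand-in for scrapli.exceptions.ScrapliTimeout."""


def operation_timeout(timeout: float):
    # run the wrapped function in a one-thread pool; .get(timeout=...) bounds
    # the wait and raises multiprocessing's TimeoutError if it expires
    def decorate(wrapped_func):
        def timeout_wrapper(*args, **kwargs):
            pool = multiprocessing.pool.ThreadPool(processes=1)
            future = pool.apply_async(wrapped_func, args, kwargs)
            try:
                result = future.get(timeout=timeout)
                pool.terminate()
                return result
            except multiprocessing.context.TimeoutError:
                pool.terminate()
                # re-raise as the library's own exception type
                raise ScrapliTimeout("operation timed out")

        return timeout_wrapper

    return decorate


@operation_timeout(timeout=0.1)
def slow():
    time.sleep(1)


@operation_timeout(timeout=1.0)
def fast():
    return "ok"
```

The thread-pool approach (vs the earlier signal-based one) works even though system transport's subprocess/ptyprocess machinery spawns its own threads, since signals can only be handled in the main thread.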
