Commit ff796b1
Fix book.yaml
Remove 'via'

PiperOrigin-RevId: 220486241
lamberta authored and tensorflower-gardener committed Nov 7, 2018
1 parent 1e417fb commit ff796b1
Showing 5 changed files with 8 additions and 12 deletions.
4 changes: 0 additions & 4 deletions tensorflow_serving/g3doc/_book.yaml
@@ -25,12 +25,8 @@ upper_tabs:
     path: /serving/setup
   - title: Serve a TensorFlow model
     path: /serving/serving_basic
-  - title: REST API
-    path: /serving/api_rest
   - title: Build a TensorFlow ModelServer
     path: /serving/serving_advanced
-  - title: Use TensorFlow Serving with Docker
-    path: /serving/docker
   - title: Use TensorFlow Serving with Kubernetes
     path: /serving/serving_kubernetes
   - title: Create a new kind of servable
2 changes: 1 addition & 1 deletion tensorflow_serving/g3doc/custom_servable.md
@@ -30,7 +30,7 @@ document).
 
 In addition to your `Loader`, you will need to define a `SourceAdapter` that
 instantiates a `Loader` from a given storage path. Most simple use-cases can
-specify the two objects concisely via the `SimpleLoaderSourceAdapter` class
+specify the two objects concisely with the `SimpleLoaderSourceAdapter` class
 (in `core/simple_loader.h`). Advanced use-cases may opt to specify `Loader` and
 `SourceAdapter` classes separately using the lower-level APIs, e.g. if the
 `SourceAdapter` needs to retain some state, and/or if state needs to be shared
10 changes: 5 additions & 5 deletions tensorflow_serving/g3doc/docker.md
@@ -1,6 +1,6 @@
-# Using TensorFlow Serving via Docker
+# Using TensorFlow Serving with Docker
 
-One of the easiest ways to get started using TensorFlow Serving is via
+One of the easiest ways to get started using TensorFlow Serving is with
 [Docker](http://www.docker.com/).
 
 ## Installing Docker
@@ -136,8 +136,8 @@ deploy and will load your model for serving on startup.
 
 ### Serving example
 
-Let's run through a full example where we load a SavedModel and call it via the
-REST API. First pull the serving image:
+Let's run through a full example where we load a SavedModel and call it using
+the REST API. First pull the serving image:
 
 ```shell
 docker pull tensorflow/serving
 ```

@@ -209,7 +209,7 @@ details, see [running a serving image](#running-a-serving-image).
 ### GPU Serving example
 
 Let's run through a full example where we load a model with GPU-bound ops and
-call it via the REST API.
+call it using the REST API.
 
 First install [`nvidia-docker`](#install-nvidia-docker). Next you can pull the
 latest TensorFlow Serving GPU docker image by running:
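
The pull command itself sits in lines collapsed out of this view. Given the surrounding docker.md text, it presumably fetches the GPU-tagged image; a minimal sketch, assuming the `latest-gpu` tag:

```shell
# Presumed content of the collapsed line; the exact tag is an assumption.
docker pull tensorflow/serving:latest-gpu
```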
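Both serving examples above end by calling the model over the REST API. A minimal end-to-end sketch, assuming a hypothetical `half_plus_two` SavedModel directory on the host, the image's documented `MODEL_NAME` convention, and the default REST port 8501:

```shell
# Serve a SavedModel over REST on port 8501 (host path is illustrative).
docker run -t --rm -p 8501:8501 \
  -v "/path/to/saved_model_half_plus_two_cpu:/models/half_plus_two" \
  -e MODEL_NAME=half_plus_two tensorflow/serving &

# Query it: half_plus_two returns x/2 + 2 for each instance.
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
  -X POST http://localhost:8501/v1/models/half_plus_two:predict
# Expected response shape: {"predictions": [2.5, 3.0, 4.5]}
```

The same `curl` call works against a container started from the GPU image, provided the model has GPU kernels.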
2 changes: 1 addition & 1 deletion tensorflow_serving/g3doc/overview.md
@@ -121,7 +121,7 @@ TensorFlow Serving Managers provide a simple, narrow interface --
 
 ### Core
 
-**TensorFlow Serving Core** manages (via standard TensorFlow Serving APIs) the
+Using the standard TensorFlow Serving APIs, *TensorFlow Serving Core* manages the
 following aspects of servables:
 
 * lifecycle
2 changes: 1 addition & 1 deletion tensorflow_serving/g3doc/setup.md
@@ -4,7 +4,7 @@
 
 ### Installing using Docker
 
-The easiest and most straight-forward way of using TensorFlow Serving is via
+The easiest and most straight-forward way of using TensorFlow Serving is with
 [Docker images](docker.md). We highly recommend this route unless you have
 specific needs that are not addressed by running in a container.
 
