Logging is disabled by default, and you can enable it by passing { logger: true } or { logger: { level: 'info' } } when you create a Fastify instance. Note that if the logger is disabled, it is impossible to enable it at runtime. We use abstract-logging for this purpose.

As Fastify is focused on performance, it uses pino as its logger, with the default log level, when enabled, set to 'info'.
Enabling the production JSON logger:
const fastify = require('fastify')({
  logger: true
})
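Equivalently, and as mentioned above, the logger can be enabled with an explicit level:

const fastify = require('fastify')({
  logger: {
    level: 'info'
  }
})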
Enabling the logger with an appropriate configuration for local development, production, and test environments requires a bit more setup:
const envToLogger = {
  development: {
    transport: {
      target: 'pino-pretty',
      options: {
        translateTime: 'HH:MM:ss Z',
        ignore: 'pid,hostname',
      },
    },
  },
  production: true,
  test: false,
}

// `environment` is assumed here to come from e.g. process.env.NODE_ENV
const environment = process.env.NODE_ENV ?? 'production'

const fastify = require('fastify')({
  logger: envToLogger[environment] ?? true // defaults to true if no entry matches in the map
})
pino-pretty needs to be installed as a dev dependency (for example with npm i pino-pretty --save-dev); it is not included by default for performance reasons.
You can use the logger like this in your route handlers:
fastify.get('/', options, function (request, reply) {
  request.log.info('Some info about the current request')
  reply.send({ hello: 'world' })
})
You can trigger new logs outside route handlers by using the Pino instance from the Fastify instance:
fastify.log.info('Something important happened!');
If you want to pass some options to the logger, just pass them to Fastify. You can find all available options in the Pino documentation. If you want to specify a file destination, use:
const fastify = require('fastify')({
  logger: {
    level: 'info',
    file: '/path/to/file' // Will use pino.destination()
  }
})

fastify.get('/', options, function (request, reply) {
  request.log.info('Some info about the current request')
  reply.send({ hello: 'world' })
})
If you want to pass a custom stream to the Pino instance, just add a stream field to the logger object.
const split = require('split2')
const stream = split(JSON.parse)

const fastify = require('fastify')({
  logger: {
    level: 'info',
    stream: stream
  }
})
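If you consume such a stream yourself, each chunk emitted by split2 with JSON.parse is already a parsed log object. A minimal sketch (the level threshold and handling below are only illustrative):

stream.on('data', (logObject) => {
  // each `logObject` is one parsed log line, e.g. { level: 30, time: ..., msg: '...' }
  if (logObject.level >= 50) {
    // error-level and above: forward to an alerting system, for example
  }
})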
By default, Fastify adds an ID to every request for easier tracking. If the requestIdHeader option is set and the corresponding header is present, then its value is used; otherwise, a new incremental ID is generated. See Fastify Factory requestIdHeader and Fastify Factory genReqId for customization options.
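A brief sketch of these factory options used together (the header name and UUID generator below are example choices, not Fastify defaults):

const { randomUUID } = require('node:crypto')

const fastify = require('fastify')({
  logger: true,
  // reuse the client-supplied ID when this header is present
  requestIdHeader: 'x-request-id',
  // otherwise generate our own ID instead of the default incremental one
  genReqId (req) {
    return randomUUID()
  }
})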
The default logger is configured with a set of standard serializers that serialize objects with req, res, and err properties. The object received by req is the Fastify Request object, while the object received by res is the Fastify Reply object. This behavior can be customized by specifying custom serializers.
const fastify = require('fastify')({
  logger: {
    serializers: {
      req (request) {
        return { url: request.url }
      }
    }
  }
})
For example, the response payload and headers could be logged using the approach below (even if it is not recommended):
const fastify = require('fastify')({
  logger: {
    transport: {
      target: 'pino-pretty'
    },
    serializers: {
      res (reply) {
        // The default
        return {
          statusCode: reply.statusCode
        }
      },
      req (request) {
        return {
          method: request.method,
          url: request.url,
          path: request.routeOptions.url,
          parameters: request.params,
          // Including the headers in the log could be in violation
          // of privacy laws, e.g. GDPR. You should use the "redact" option to
          // remove sensitive fields. It could also leak authentication data in
          // the logs.
          headers: request.headers
        };
      }
    }
  }
});
Note: In certain cases, the Reply object passed to the res serializer cannot be fully constructed. When writing a custom res serializer, it is necessary to check for the existence of any properties on reply aside from statusCode, which is always present. For example, the existence of getHeaders must be verified before it can be called:
const fastify = require('fastify')({
  logger: {
    transport: {
      target: 'pino-pretty'
    },
    serializers: {
      res (reply) {
        // The default
        return {
          statusCode: reply.statusCode,
          headers: typeof reply.getHeaders === 'function'
            ? reply.getHeaders()
            : {}
        }
      },
    }
  }
});
Note: The body cannot be serialized inside a req method because the request is serialized when we create the child logger. At that time, the body is not yet parsed.

See the following approach to log req.body:
app.addHook('preHandler', function (req, reply, done) {
  if (req.body) {
    req.log.info({ body: req.body }, 'parsed body')
  }
  done()
})
Note: Care should be taken to ensure serializers never throw, as an error thrown from a serializer has the potential to cause the Node process to exit. See the Pino documentation on serializers for more information.
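One defensive pattern (a sketch, not something prescribed by Fastify) is to wrap custom serializer logic in a try/catch and fall back to a minimal object:

const fastify = require('fastify')({
  logger: {
    serializers: {
      req (request) {
        try {
          return { method: request.method, url: request.url }
        } catch (err) {
          // never let a serializer throw; log a minimal placeholder instead
          return { serializationFailed: true }
        }
      }
    }
  }
})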
Any logger other than Pino will ignore the serializers option.
You can also supply your own logger instance. Instead of passing configuration options, pass the instance. The logger you supply must conform to the Pino interface; that is, it must have the following methods: info, error, debug, fatal, warn, trace, silent, child, and a string property level.
Example:
const log = require('pino')({ level: 'info' })
const fastify = require('fastify')({ logger: log })
log.info('does not have request information')
fastify.get('/', function (request, reply) {
  request.log.info('includes request information, but is the same logger instance as `log`')
  reply.send({ hello: 'world' })
})
The logger instance for the current request is available in every part of the lifecycle.
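For example, the same request-scoped logger can be used from a hook; a minimal sketch using the standard onResponse hook:

fastify.addHook('onResponse', function (request, reply, done) {
  // same child logger as in the route handler, so the request ID is preserved
  request.log.info({ statusCode: reply.statusCode }, 'request completed')
  done()
})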
Pino supports low-overhead log redaction for obscuring values of specific properties in recorded logs. As an example, we might want to log all the HTTP headers minus the Authorization header for security concerns:
const fastify = Fastify({
  logger: {
    stream: stream,
    redact: ['req.headers.authorization'],
    level: 'info',
    serializers: {
      req (request) {
        return {
          method: request.method,
          url: request.url,
          headers: request.headers,
          host: request.host,
          remoteAddress: request.ip,
          remotePort: request.socket.remotePort
        }
      }
    }
  }
})
See https://getpino.io/#/docs/redaction for more details.
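Pino's redact option also accepts an object form with paths and a custom censor value; a brief sketch, assuming the req shape produced by the serializers shown earlier:

const fastify = require('fastify')({
  logger: {
    redact: {
      paths: ['req.headers.authorization', 'req.headers.cookie'],
      // replace matched values instead of the default '[Redacted]' placeholder
      censor: '**redacted**'
    },
    level: 'info'
  }
})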