Node Interactive: Node.js Performance and Highly Scalable Micro-Services


Highly Scalable Node.js Microservices

Chris Bailey (@Chris__Bailey), Node.js and Swift Runtimes

Chris Vignola (@ChrisVignola), Cloud Native Node.js

CLOUD NATIVE: Key Technologies

• Container
• Container Orchestration
• Package and Deploy
• Monitoring
• Distributed Tracing

MICROSERVICES: Key Performance Characteristics

IO Speed • Performance • Scale
Startup • Availability • Scaling
Memory • Efficiency • Cost

[Bar charts compare two stacks on each characteristic: roughly 897 vs 1,150 for I/O throughput, 13.7 vs 0.9 for startup time, and 422 vs 23.6 for memory footprint.]

Benchmarking WG: Performance Gains

[Charts of Benchmarking Working Group results for Node.js 4.x, 6.x, 8.x, and master across three benchmarks; reported values range from 1,866 to 3,049 on the first, 42,500 to 71,000 on the second, and 88,960 to 92,840 on the third.]

Building Scalable Microservices

public/*
test/*
server/server.js
package.json
README.md
.gitignore

const appName = require('./../package').name;
const express = require('express');
const log4js = require('log4js');

const logger = log4js.getLogger(appName);
const app = express();

app.get('/', function (req, res) {
  res.send('Hello World!');
});

const port = process.env.PORT || 3000;
app.listen(port, function(){
  logger.info(`Express listening on: ` + port);
});
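The Kubernetes deployment shown later in the talk probes a /health endpoint for its livenessProbe, which the generated server above does not define. A minimal sketch of such a route (the route name and payload shape here are assumptions) could look like this:

// Liveness endpoint probed by Kubernetes; any non-2xx response
// would eventually cause the pod to be restarted.
app.get('/health', function (req, res) {
  res.status(200).json({ status: 'UP' });
});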

Dockerfile
Dockerfile-tools
.dockerignore

FROM ibmcom/ibmnode

ENV NODE_ENV production
ENV PORT 3000

WORKDIR "/app"

# Install app dependencies
COPY package.json /app/
RUN cd /app; npm install

# Bundle app source
COPY . /app

EXPOSE 3000
CMD ["npm", "start"]

.dockerignore:

node_modules/
npm-debug.log

$ docker build -t <your username>/node-app .
$ docker run -p 49160:3000 -d <your username>/node-app

HELM CHARTS

chart/node-app/Chart.yaml
chart/node-app/templates/deployment.yaml
chart/node-app/templates/hpa.yaml
chart/node-app/templates/service.yaml
chart/node-app/values.yaml

apiVersion: v1
description: A Helm chart for Kubernetes
name: node-app
version: 1.0.0

apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: "node-app-deployment"
  labels:
    chart: "node-app-1.0.0"
spec:
  replicas: 5
  revisionHistoryLimit: 1
  template:
    metadata:
      labels:
        app: "node-app-selector"
        version: "1.0.0"
    spec:
      containers:
      - name: "node-app"
        image: "repository:1.0.0"
        imagePullPolicy: Always
        livenessProbe:
          httpGet:
            path: /health
            port: 3000
          initialDelaySeconds: 3000
          periodSeconds: 1000
        resources:
          requests:
            cpu: "200m"
            memory: "300Mi"
        env:
          - name: PORT
            value: "3000"

apiVersion: autoscaling/v2alpha1
kind: HorizontalPodAutoscaler
metadata:
  name: "node-app-hpa-policy"
  namespace: default
spec:
  scaleTargetRef:
    apiVersion: apps/v1beta1
    kind: Deployment
    name: "node-app-deployment"
  minReplicas: 5
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      targetAverageUtilization: 70
  - type: Resource
    resource:
      name: memory
      targetAverageUtilization: 70

apiVersion: v1
kind: Service
metadata:
  name: "node-app"
  labels:
    chart: "node-app-1.0.0"
spec:
  type: NodePort
  ports:
  - port: 3000
  selector:
    app: "node-app-selector"

$ helm package ./chart/node-app
$ helm install ./node-app-1.0.0.tgz

Understanding Microservices Performance

[Reference architecture diagram: traffic enters from the PUBLIC NETWORK through a LOAD BALANCER into the CLOUD NETWORK, where BACKEND FOR FRONTEND services call the CATALOG, ORDER, INVENTORY, and USER microservices, which are backed by MySQL, MongoDB, SPARK, and ELASTICSEARCH services.]

[Request timeline diagram: over TIME, a request flows from the BROWSER to the LOAD BALANCER, to the WEB BFF, to the ORDER SERVICE and MongoDB, then to the INVENTORY SERVICE and MySQL, showing where time is spent at each hop.]
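To make that flow concrete, below is a minimal sketch of what the Web BFF endpoint might look like, assuming hypothetical order and inventory service URLs and the same 'rest' client used later in this talk:

const rest = require('rest');
const express = require('express');

const app = express();

// Hypothetical downstream endpoints; in the reference architecture these would
// resolve to the Kubernetes service names for the order and inventory services.
const ORDER_SERVICE = process.env.ORDER_SERVICE || 'http://order-service:3000/orders';
const INVENTORY_SERVICE = process.env.INVENTORY_SERVICE || 'http://inventory-service:3000/inventory';

// The BFF fans out to the downstream microservices and aggregates their
// responses into a single payload for the browser.
app.get('/dashboard', (req, res) => {
  Promise.all([rest(ORDER_SERVICE), rest(INVENTORY_SERVICE)])
    .then(([orders, inventory]) => res.json({
      orders: orders.entity,
      inventory: inventory.entity
    }))
    .catch(err => {
      console.error('Error', err.stack);
      res.status(500).send('Downstream call failed');
    });
});

app.listen(process.env.PORT || 3000);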

Request Tracking: OpenTracing and Zipkin

• Collects data from each enabled service

• Propagates a correlation ID using HTTP headers

• Provides sampling, tracing, and debug capabilities

• Collects microsecond timestamps

• Correlates data in the Zipkin server

• Presents data in the Zipkin dashboard

const zipkin = require('appmetrics-zipkin');
const rest = require('rest');
const express = require('express');
const log4js = require('log4js');

const logger = log4js.getLogger();
const app = express();

app.get('/', (req, res) => {
  rest('http://localhost:9000/api')
    .then(response => res.send(response.entity))
    .catch(err => console.error('Error', err.stack));
});

const port = process.env.PORT || 3000;
app.listen(port, function(){
  logger.info(`Express listening on: ` + port);
});

Microservice Metrics: Prometheus

• Collects data from each enabled service

• Requires a /metrics endpoint providing data

• Provides storage and correlation capabilities

• Provides a customisable dashboard

• Integrates with Grafana, Graphite, etc.

const zipkin = require('appmetrics-zipkin');
const prometheus = require('appmetrics-prometheus').attach();
const rest = require('rest');
const express = require('express');
const log4js = require('log4js');

const logger = log4js.getLogger();
const app = express();

app.get('/', (req, res) => {
  rest('http://localhost:9000/api')
    .then(response => res.send(response.entity))
    .catch(err => console.error('Error', err.stack));
});

const port = process.env.PORT || 3000;
app.listen(port, function(){
  logger.info(`Express listening on: ` + port);
});
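As a quick check that metrics are being published, the scrape endpoint can be fetched with the same 'rest' client. This sketch assumes appmetrics-prometheus exposes /metrics on the application's own port (3000 here):

const rest = require('rest');

// Fetch the Prometheus scrape endpoint and print the raw metric text.
rest('http://localhost:3000/metrics')
  .then(response => console.log(response.entity))
  .catch(err => console.error('Error', err.stack));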

Deep Analysis: 'appmetrics-dash' and Flame Graphs

• 'appmetrics-dash' provides self-hosted monitoring

• Inbound and outbound request performance

• Resource and event loop monitoring

• Request a node-report (a sketch follows the code below)

• Enable profiling and flame graphs

const zipkin = require('appmetrics-zipkin');
const prometheus = require('appmetrics-prometheus').attach();
const dash = require('appmetrics-dash').attach();
const rest = require('rest');
const express = require('express');
const log4js = require('log4js');

const logger = log4js.getLogger();
const app = express();

app.get('/', (req, res) => {
  rest('http://localhost:9000/api')
    .then(response => res.send(response.entity))
    .catch(err => console.error('Error', err.stack));
});

const port = process.env.PORT || 3000;
app.listen(port, function(){
  logger.info(`Express listening on: ` + port);
});
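The node-report mentioned in the bullets above can also be requested programmatically. A minimal sketch, assuming the node-report module is installed and using a hypothetical /admin/report route:

const nodereport = require('node-report/api');
const express = require('express');

const app = express();

// Hypothetical admin route that writes a diagnostic report to disk on demand
// and tells the caller where it was written.
app.get('/admin/report', (req, res) => {
  const filename = nodereport.triggerReport();
  res.send(`node-report written to ${filename}`);
});

app.listen(process.env.PORT || 3000);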

['appmetrics-dash' dashboard screenshots]

$ yo nodeserver

developer.ibm.com/node

const rest = require('rest');
const express = require('express');
const CLSContext = require('zipkin-context-cls');
const {Tracer} = require('zipkin');
const {recorder} = require('./recorder');

const ctxImpl = new CLSContext('zipkin');
const tracer = new Tracer({ctxImpl, recorder});

const app = express();

const zipkinMiddleware = require('zipkin-instrumentation-express').expressMiddleware;
app.use(zipkinMiddleware({tracer, serviceName: 'frontend'}));

const {restInterceptor} = require('zipkin-instrumentation-cujojs-rest');
const zipkinRest = rest.wrap(restInterceptor, {tracer, remoteServiceName: 'backend'});

app.get('/', (req, res) => {
  zipkinRest('http://localhost:9000/api')
    .then(response => res.send(response.entity))
    .catch(err => console.error('Error', err.stack));
});
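The './recorder' module is not shown in the transcript. One possible implementation, following the zipkin-js pattern and assuming a Zipkin collector running locally on its default port, is:

const {BatchRecorder} = require('zipkin');
const {HttpLogger} = require('zipkin-transport-http');

// Batch completed spans and send them to a Zipkin collector,
// assumed here to be running locally on port 9411.
const recorder = new BatchRecorder({
  logger: new HttpLogger({
    endpoint: 'http://localhost:9411/api/v1/spans'
  })
});

module.exports = {recorder};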