SplunkLive! Stockholm 2016 - iZettle
Transcript of SplunkLive! Stockholm 2016 - iZettle
Copyright © 2015 Splunk Inc.
Splunk at iZettle
Johannes Löfgren, Head of DevOps
Johannes Löfgren - Head of DevOps and Infrastructure
The challenge…
Show me a PCI-DSS compliant centralised logging solution in 5 weeks!
Why Splunk?
- PCI-DSS: Payment Card Industry Data Security Standard
- iZettle's first PCI-DSS audit was in Q2 2012
- Starting point: local logs on around 10 backend servers
- Before the audit deadline: prove our control of operations and security using our centralised log solution
Starting out
- Daily report email
- Scheduled alerts (email and SMS)
- File integrity monitoring
- Learn basic search skills
Starting out
- < 1 s: expected result of automated deploy
- 90 minutes: further investigation needed
- File integrity monitoring
- Daily report email
iZettle expansion
All backend systems logging to Splunk

| | 2011 | 2013 |
|---|---|---|
| Markets | One market | Multiple markets on three continents |
| Backend | Monolithic backend | Distributed backend |
| Hosting | Single-location traditional hosting | Hybrid cloud infrastructure |
Splunk at iZettle Today

Usage
- 50% of total implemented alerts
- 80+ users
- All backend services log to Splunk

Teams: Support, Security, Development, QA, Operations

Benefits
- Easy to scale
- Easy to move
- Search across multiple services
- Adapt alert triggers to trends
- FIM (file integrity monitoring)
Follow the trend - example
Weekly and daily trend below
Follow the trend - example

The _internal index tracks logged bytes per source:

    earliest=-1h@h latest=@h index=_internal source=*license_usage.log type=Usage s=merchant-reports
    | eval MB=b/1024/1024
    | stats sum(MB) as last
Follow the trend - example

Run a subsearch for the same hour, 7 days ago; appendcols adds its result as an extra column:

    earliest=-1h@h latest=@h index=_internal source=*license_usage.log type=Usage s=merchant-reports
    | eval MB=b/1024/1024
    | stats sum(MB) as last
    | appendcols [ search earliest=-169h@h latest=-168h@h index=_internal source=*license_usage.log type=Usage s=merchant-reports
        | eval MB=b/1024/1024
        | stats sum(MB) as comparator ]
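The comparison window works because 7 days is exactly 168 hours: `-169h@h` to `-168h@h` is the same one-hour slot, snapped to the hour boundary, one week earlier. A quick Python sketch of that time arithmetic (the example timestamp is made up for illustration):

```python
from datetime import datetime, timedelta

def snap_to_hour(t: datetime) -> datetime:
    """Mimic Splunk's @h snap: truncate to the start of the hour."""
    return t.replace(minute=0, second=0, microsecond=0)

now = datetime(2016, 3, 10, 14, 37)        # hypothetical "now"
latest = snap_to_hour(now)                 # @h      -> 14:00 today
earliest = latest - timedelta(hours=1)     # -1h@h   -> 13:00 today

# 7 days = 168 hours, so the comparison window is -169h@h .. -168h@h
cmp_latest = snap_to_hour(now - timedelta(hours=168))
cmp_earliest = snap_to_hour(now - timedelta(hours=169))

print(earliest, latest)            # this hour's window
print(cmp_earliest, cmp_latest)    # the same hour, exactly one week earlier
```

Because the offsets are whole hours, the snapped comparison window lines up exactly one week behind the current window.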
Follow the trend - example

Calculate the percentage difference and add explanatory labels:

    earliest=-1h@h latest=@h index=_internal source=*license_usage.log type=Usage s=merchant-reports
    | eval MB=b/1024/1024
    | stats sum(MB) as last
    | appendcols [ search earliest=-169h@h latest=-168h@h index=_internal source=*license_usage.log type=Usage s=merchant-reports
        | eval MB=b/1024/1024
        | stats sum(MB) as comparator ]
    | eval percent_change=100*(last/comparator)-100
    | rename last as "MB Latest hour", comparator as "MB same hour, 7 days ago"
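The eval step is plain relative-growth arithmetic. A minimal Python sketch of the same formula, with hypothetical hourly volumes in MB:

```python
def percent_change(last: float, comparator: float) -> float:
    """eval percent_change=100*(last/comparator)-100:
    relative change of this hour vs the same hour last week, in percent."""
    return 100 * (last / comparator) - 100

# Hypothetical volumes (MB logged in the hour):
print(percent_change(150.0, 120.0))   # 25.0  -> 25% more data than a week ago
print(percent_change(60.0, 120.0))    # -50.0 -> half the usual volume
```

A result of 0 means this hour logged exactly as much as the same hour last week; positive values mean growth, negative values a drop.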
Follow the trend - example

The _internal index is lightweight to search.

What to do with this?
- Create an alert that triggers on a positive or a negative threshold of the "percent_change" variable
- Generic enough to suit any system
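The alert condition described above can be sketched in a few lines of Python; the 50% threshold here is a made-up example, not a value from the talk:

```python
def should_alert(percent_change: float, threshold: float = 50.0) -> bool:
    """Fire when the change exceeds the threshold in either direction,
    mirroring an alert condition like:
        percent_change > 50 OR percent_change < -50
    The same check works for any system, since it compares each system
    against its own volume a week earlier."""
    return abs(percent_change) > threshold

print(should_alert(12.0))     # False: within normal week-over-week variation
print(should_alert(-73.5))    # True: volume dropped sharply vs last week
print(should_alert(140.0))    # True: volume spiked vs last week
```

Using a symmetric, relative threshold is what makes the alert generic: a quiet service and a chatty one both alert only when they deviate from their own weekly baseline.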
Key lessons
- Insert all your systems to cross-search logging services
- Take a generic anomaly approach to alerts
- Make use of what's already summarised for lightweight searching
- Use dynamic alert thresholds
Thank You