Transcript of "RIPQ: Advanced Photo Caching on Flash for Facebook"

Page 1:

RIPQ: Advanced Photo Caching on Flash for Facebook

Linpeng Tang (Princeton)
Qi Huang (Cornell & Facebook)
Wyatt Lloyd (USC & Facebook)
Sanjeev Kumar (Facebook)
Kai Li (Princeton)

Page 2:

Photo Serving Stack

2 Billion* Photos Shared Daily
(* Facebook 2014 Q4 Report)

Storage Backend

Page 3:

Photo Serving Stack

Photo Caches:
• Edge Cache: close to users, reduces backbone traffic
• Origin Cache: co-located with the backend, reduces backend IO

[Diagram: requests flow through the Edge Cache and the Origin Cache (on flash) to the Storage Backend]

Page 4:

Photo Serving Stack

[Diagram: Edge Cache and Origin Cache (on flash) in front of the Storage Backend]

An Analysis of Facebook Photo Caching [Huang et al. SOSP '13]:
• Segmented LRU-3: 10% less backbone traffic
• Greedy-Dual-Size-Frequency-3: 23% fewer backend IOs

Advanced caching algorithms help!

Page 5:

Photo Serving Stack, In Practice

[Diagram: Edge Cache and Origin Cache (on flash) in front of the Storage Backend]

FIFO was still used: there was no known way to implement advanced algorithms efficiently on flash.

Page 6:

Theory: advanced caching helps
• 23% fewer backend IOs
• 10% less backbone traffic

Practice: difficult to implement on flash
• FIFO still used

Restricted Insertion Priority Queue (RIPQ): efficiently implement advanced caching algorithms on flash

Page 7:

Outline

• Why are advanced caching algorithms difficult to implement efficiently on flash?
• How does RIPQ solve this problem?
  – Why use a priority queue?
  – How to implement one efficiently on flash?
• Evaluation
  – 10% less backbone traffic
  – 23% fewer backend IOs

Page 8:

Outline

• Why are advanced caching algorithms difficult to implement efficiently on flash?
  – Write pattern of FIFO and LRU
• How does RIPQ solve this problem?
  – Why use a priority queue?
  – How to implement one efficiently on flash?
• Evaluation
  – 10% less backbone traffic
  – 23% fewer backend IOs

Page 9:

FIFO Does Sequential Writes

[Diagram: cache space of FIFO, head → tail]

Page 10:

FIFO Does Sequential Writes

[Diagram: a miss appends the new object at the head]

Page 11:

FIFO Does Sequential Writes

[Diagram: a hit leaves the object in place]

Page 12:

FIFO Does Sequential Writes

[Diagram: the oldest objects are evicted from the tail]

No random writes needed for FIFO
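To make the contrast with LRU concrete, here is a minimal sketch (my own illustration, not Facebook's code) of why FIFO maps cleanly onto flash: the cache is an append-only log, so misses write sequentially at the head, eviction reclaims the tail, and hits write nothing at all.

```python
from collections import OrderedDict

# Minimal FIFO-on-flash sketch (illustration only; all names are mine).
class FIFOFlashCache:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.head = 0                 # logical offset of the next sequential write
        self.tail = 0                 # logical offset of the oldest cached object
        self.index = OrderedDict()    # key -> (offset, size), oldest first

    def get(self, key):
        return self.index.get(key)    # a hit requires no flash write

    def put(self, key, size):
        # Evict from the tail until the new object fits.
        while self.head + size - self.tail > self.capacity:
            _, (_, old_size) = self.index.popitem(last=False)
            self.tail += old_size
        self.index[key] = (self.head, size)
        self.head += size             # physical address would be head % capacity
```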

Page 13:

LRU Needs Random Writes

[Diagram: cache space of LRU, head → tail; a hit moves the object to the queue head]

Locations on flash ≠ locations in the LRU queue

Page 14:

LRU Needs Random Writes

[Diagram: objects adjacent in the LRU queue are non-contiguous on flash]

Random writes needed to reuse space

Page 15:

Why Care About Random Writes?

• Write-heavy workload
  – Long-tail access pattern, moderate hit ratio
  – Each miss triggers a write to cache
• Small random writes are harmful for flash
  – e.g. Min et al. FAST '12
  – High write amplification → low write throughput, short device lifetime

Page 16:

What Write Size Do We Need?

• Large writes
  – High write throughput at high utilization
  – 16~32 MiB in Min et al. FAST '12
• What's the trend since then?
  – Random writes tested on 3 modern devices
  – 128~512 MiB needed now

100 MiB+ writes needed for efficiency

Page 17:

Outline

• Why are advanced caching algorithms difficult to implement efficiently on flash?
• How does RIPQ solve this problem?
• Evaluation

Page 18:

RIPQ Architecture (Restricted Insertion Priority Queue)

[Diagram: an advanced caching policy (SLRU, GDSF, …) sits on top of RIPQ through a priority queue API; RIPQ spans RAM and flash, implements an approximate priority queue, and emits flash-friendly workloads]

• Efficient caching on flash
• Caching algorithms approximated as well

Page 19:

RIPQ Architecture (Restricted Insertion Priority Queue)

[Diagram as above, annotated with RIPQ's techniques]

• Restricted insertion
• Section merge/split
• Large writes
• Lazy updates

Page 20:

Priority Queue API

• No single best caching policy
• Segmented LRU [Karedla '94]
  – Reduces both backend IO and backbone traffic
  – SLRU-3: best algorithm for Edge so far
• Greedy-Dual-Size-Frequency [Cherkasova '98]
  – Favors small objects
  – Further reduces backend IO
  – GDSF-3: best algorithm for Origin so far

Page 21:

Segmented LRU

• Concatenation of K LRU caches

[Diagram: cache space of SLRU-3, segments L1, L2, L3 between head and tail; a miss arrives]

Page 22:

Segmented LRU

• Concatenation of K LRU caches

[Diagram: the missed object is inserted into the lowest segment]

Page 23:

Segmented LRU

• Concatenation of K LRU caches

[Diagram: a hit promotes the object one segment toward the head]

Page 24:

Segmented LRU

• Concatenation of K LRU caches

[Diagram: another hit promotes the object again]
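A minimal RAM-only sketch of the SLRU behavior these slides animate (segment sizes and all names are illustrative, not RIPQ code): K LRU segments sit head-to-tail; a miss enters the lowest segment, a hit promotes one segment toward the head, and overflow demotes toward the tail until it falls out of the cache.

```python
from collections import OrderedDict

# SLRU-K sketch: segments[0] is the most-protected segment at the head.
class SLRU:
    def __init__(self, k, segment_capacity):
        self.segments = [OrderedDict() for _ in range(k)]
        self.cap = segment_capacity

    def access(self, key):
        for i, seg in enumerate(self.segments):
            if key in seg:
                del seg[key]
                self._insert(max(i - 1, 0), key)    # hit: promote one segment
                return True
        self._insert(len(self.segments) - 1, key)   # miss: lowest segment
        return False

    def _insert(self, i, key):
        seg = self.segments[i]
        seg[key] = True
        seg.move_to_end(key, last=False)             # newest at segment head
        while len(seg) > self.cap:
            demoted, _ = seg.popitem()               # overflow off segment tail
            if i + 1 < len(self.segments):
                self._insert(i + 1, demoted)         # demote toward the tail
            # else: evicted from the cache entirely
```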

Page 25:

Greedy-Dual-Size-Frequency

• Favoring small objects

[Diagram: cache space of GDSF-3, head → tail]

Page 26:

Greedy-Dual-Size-Frequency

• Favoring small objects

[Diagram: a missed object is inserted at a priority-dependent position]

Page 27:

Greedy-Dual-Size-Frequency

• Favoring small objects

[Diagram: another miss; smaller objects are inserted closer to the head]

Page 28:

Greedy-Dual-Size-Frequency

• Favoring small objects

[Diagram: cache space of GDSF-3, head → tail]

• Write workload more random than LRU
• Operations similar to a priority queue (see the sketch below)
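For reference, a sketch of the standard GDSF priority (Cherkasova '98) that produces this size-favoring behavior: H(x) = L + freq(x) * cost(x) / size(x), where L is an inflation value set to the priority of the last evicted object and cost is often fixed to 1 for hit-ratio caching. Eviction takes the minimum-H object, which is exactly why GDSF wants a priority queue rather than a simple queue. The class below is my illustration, not RIPQ code.

```python
class GDSF:
    """Sketch of the GDSF priority: H(x) = L + freq(x) * cost(x) / size(x)."""
    def __init__(self):
        self.L = 0.0              # inflation clock: priority of the last eviction
        self.freq = {}            # object -> access count
        self.priority = {}        # object -> current H value

    def on_access(self, obj, size, cost=1.0):
        self.freq[obj] = self.freq.get(obj, 0) + 1
        self.priority[obj] = self.L + self.freq[obj] * cost / size
        return self.priority[obj]

    def evict(self):
        victim = min(self.priority, key=self.priority.get)  # lowest H loses
        self.L = self.priority.pop(victim)                  # age the clock
        self.freq.pop(victim, None)
        return victim
```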

Page 29:

Relative Priority Queue for Advanced Caching Algorithms

[Diagram: cache space from head (priority 1.0) to tail (priority 0.0)]

Miss object: insert(x, p)

Page 30:

Relative Priority Queue for Advanced Caching Algorithms

[Diagram: cache space from head (1.0) to tail (0.0)]

Hit object: increase(x, p')

Page 31:

Relative Priority Queue for Advanced Caching Algorithms

[Diagram: cache space from head (1.0) to tail (0.0)]

Implicit demotion on insert/increase: objects with lower priorities move toward the tail

Page 32:

Relative Priority Queue for Advanced Caching Algorithms

[Diagram: cache space from head (1.0) to tail (0.0)]

Evict from queue tail

The relative priority queue captures the dynamics of many caching algorithms!
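A toy exact, RAM-only version of this interface (my sketch; RIPQ itself only approximates it on flash, as the next slides show). Positions are relative, so placing one object ahead of others implicitly demotes everything behind it toward the tail.

```python
# Toy exact relative priority queue (illustration only).
# items[0] is the tail (priority 0.0); items[-1] is the head (priority 1.0).
class RelativePriorityQueue:
    def __init__(self, capacity):
        self.items = []
        self.capacity = capacity

    def _place(self, x, p):
        # Place x so a fraction p of the queue sits behind it (toward the
        # tail); everything behind x is implicitly demoted by one slot.
        self.items.insert(int(p * len(self.items)), x)

    def insert(self, x, p):            # on a miss
        self._place(x, p)
        while len(self.items) > self.capacity:
            self.items.pop(0)          # evict from the queue tail

    def increase(self, x, p):          # on a hit; callers pass a higher p
        self.items.remove(x)
        self._place(x, p)
```

For instance, SLRU-K can map onto this interface by inserting misses near the lowest segment boundary (p = 1/K) and raising hits one boundary higher, and GDSF can normalize its H values into [0, 1]; both mappings here are my hedged reading of "captures the dynamics of many caching algorithms".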

Page 33:

RIPQ Design: Large Writes

• Need to buffer object writes (tens of KiB) into block writes (sketched below)
• Once written, blocks are immutable!
• 256 MiB block size, 90% utilization
  – Large caching capacity
  – High write throughput
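A sketch of that buffering step, using the sizes from this slide; the flash-append callback is an assumed stand-in, not a real device API. Objects accumulate in RAM and reach flash only as one large sequential write, after which the block is sealed and never rewritten.

```python
BLOCK_SIZE = 256 * 1024 * 1024         # 256 MiB, per the slide

class ActiveBlock:
    """Buffers small object writes into one large block write (sketch)."""
    def __init__(self, flash_append):
        self.buf = bytearray()
        self.flash_append = flash_append   # assumed: appends bytes sequentially

    def add(self, obj_bytes):
        self.buf += obj_bytes              # tens-of-KiB objects land in RAM
        if len(self.buf) >= BLOCK_SIZE:
            self.seal()

    def seal(self):
        self.flash_append(bytes(self.buf)) # one large sequential flash write
        self.buf = bytearray()             # a sealed block is immutable
```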

Page 34:

RIPQ Design: Restricted Insertion Points

• Exact priority queue: insert to any block in the queue
• So each block needs a separate buffer
• The whole flash space would be buffered in RAM!

Page 35:

RIPQ Design: Restricted Insertion Points

Solution: restricted insertion points

Page 36:

Section is Unit for Insertion

[Diagram: the queue is split into three sections with priority ranges 1 .. 0.6, 0.6 .. 0.35, and 0.35 .. 0; each section holds sealed blocks on flash plus one active block with a RAM buffer]

Each section has one insertion point

Page 37:

Section is Unit for Insertion

insert(x, 0.55)

[Diagram: x goes into the middle section's active block (+1); section ranges shift from 1 .. 0.6, 0.6 .. 0.35, 0.35 .. 0 to 1 .. 0.62, 0.62 .. 0.33, 0.33 .. 0]

Insert procedure (sketched in code below):
• Find the corresponding section
• Copy data into its active block
• Update the section priority ranges
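The three steps above in code, a sketch under my own naming (RIPQ's actual structures differ): because priorities are relative, adding bytes to one section shifts every section's share of the [0, 1] range, which matches the 0.6 → 0.62 shift in the diagram.

```python
class Section:
    def __init__(self):
        self.lo, self.hi = 0.0, 1.0   # priority range covered by this section
        self.nbytes = 0               # bytes held (sealed + active)
        self.active = bytearray()     # RAM buffer of the one active block

def insert(sections, x_bytes, p):
    # 1. Find the section whose priority range covers p.
    target = next(s for s in sections if s.lo <= p <= s.hi)
    # 2. Copy the data into that section's active block.
    target.active += x_bytes
    target.nbytes += len(x_bytes)
    # 3. Recompute every section's priority range, tail (0.0) upward.
    total = sum(s.nbytes for s in sections)
    acc = 0
    for s in reversed(sections):      # sections are ordered head -> tail
        s.lo = acc / total
        acc += s.nbytes
        s.hi = acc / total
```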

Page 38: RIPQ: Advanced Photo Caching on Flash for Facebook* Facebook 2014 Q4 Report 22 2 Billion Photo Serving Stack * Photos Shared Daily Storage Backend. 3 Photo Caches Close to users Reduce

1 .. 0.62 0.62 .. 0.33 0.33 .. 0

Section is Unit for Insertion

38

Active block with

RAM buffer

Sealed block

on flash

Head Tail

Relative orders within one section not guaranteed!

Section Section Section

Page 39:

Trade-off in Section Size

Section size controls the approximation error:
• More, smaller sections → lower approximation error
• Fewer, larger sections → less RAM buffer

[Diagram: sections with ranges 1 .. 0.62, 0.62 .. 0.33, 0.33 .. 0]

Page 40:

RIPQ Design: Lazy Update

increase(x, 0.9)

Naïve approach: copy x to the corresponding active block (+1)

Problem with the naïve approach:
• Data copying/duplication on flash

Page 41:

RIPQ Design: Lazy Update

Solution: use a virtual block to track the updated location!

Page 42:

RIPQ Design: Lazy Update

[Diagram: virtual blocks placed among the sections]

Solution: use a virtual block to track the updated location!

Page 43:

Virtual Block Remembers the Update Location

[Diagram: increase(x, 0.9) increments the virtual block's count (+1); x's data stays where it is]

No data is written during a virtual update

Page 44:

Actual Update During Eviction

[Diagram: x's block has reached the tail]

x is now in the tail block.

Page 45:

Actual Update During Eviction

[Diagram: x's data is copied to the active block of the section holding its virtual block (+1); the virtual block's count is decremented (-1)]

Always one copy of the data on flash
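Both halves of lazy update in one sketch (field and function names are mine, not RIPQ's): a hit only re-points the object's index entry at a virtual block, and the single real copy moves only when the tail block is reclaimed.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Entry:
    section: int                            # section physically holding the data
    virtual_section: Optional[int] = None   # pending lazy-update target

def increase(index, key, target_section):
    # Hit path: bookkeeping only; nothing is written to flash.
    index[key].virtual_section = target_section

def evict_tail_block(index, tail_objects, active_buffers):
    # Eviction path: tail_objects is the (key, data) content of the block
    # being reclaimed; active_buffers maps section id -> bytearray buffer.
    for key, data in tail_objects:
        entry = index.get(key)
        if entry and entry.virtual_section is not None:
            active_buffers[entry.virtual_section] += data   # the one real copy
            index[key] = Entry(section=entry.virtual_section)
        else:
            index.pop(key, None)            # cold object: falls out of the cache
```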

Page 46:

RIPQ Design

• Relative priority queue API
• RIPQ design points
  – Large writes
  – Restricted insertion points
  – Lazy update
  – Section merge/split: balances section sizes and RAM buffer usage (see the sketch below)
• Static caching
  – Photos are static
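One plausible balancing rule for section merge/split, as a sketch only; the thresholds and the Sec type are illustrative and the slides do not specify RIPQ's actual policy. The goal is to keep each section near its fair share of cache space, bounding both the approximation error and the number of RAM buffers.

```python
from dataclasses import dataclass

@dataclass
class Sec:
    nbytes: int

    def split(self):                  # halve an oversized section
        half = self.nbytes // 2
        return [Sec(half), Sec(self.nbytes - half)]

    def merge(self, other):           # fuse two adjacent small sections
        return Sec(self.nbytes + other.nbytes)

def rebalance(sections, cache_bytes):
    target = cache_bytes / max(len(sections), 1)
    out = []
    for s in sections:                # sections ordered head -> tail
        if s.nbytes > 2 * target:
            out.extend(s.split())                 # too big: split
        elif out and out[-1].nbytes + s.nbytes < target:
            out[-1] = out[-1].merge(s)            # too small: merge neighbors
        else:
            out.append(s)
    return out
```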

Page 47:

Outline

• Why are advanced caching algorithms difficult to implement efficiently on flash?
• How does RIPQ solve this problem?
• Evaluation

Page 48:

Evaluation Questions

• How much RAM buffer is needed?
• How good is RIPQ's approximation?
• What is RIPQ's throughput?

Page 49:

Evaluation Approach

• Real-world Facebook workloads
  – Origin
  – Edge
• 670 GiB flash card
  – 256 MiB block size
  – 90% utilization
• Baselines
  – FIFO
  – SIPQ: Single Insertion Priority Queue

Page 50:

RIPQ Needs a Small Number of Insertion Points

[Chart: object-wise hit ratio (%, 25 to 45) vs. number of insertion points (2, 4, 8, 16, 32) for Exact GDSF-3, GDSF-3, Exact SLRU-3, SLRU-3, and FIFO; annotated hit-ratio gains over FIFO of +6% and +16%]

Page 51:

RIPQ Needs a Small Number of Insertion Points

[Chart repeated: object-wise hit ratio vs. insertion points]

Page 52:

RIPQ Needs a Small Number of Insertion Points

[Chart repeated: object-wise hit ratio vs. insertion points]

You don't need much RAM buffer (2 GiB)!

Page 53:

RIPQ Has High Fidelity

[Chart: object-wise hit ratio (%, 20 to 45) for SLRU-1/2/3 and GDSF-1/2/3, Exact vs. RIPQ, with FIFO as baseline]

Page 54:

RIPQ Has High Fidelity

[Chart repeated]

Page 55:

RIPQ Has High Fidelity

[Chart repeated]

RIPQ achieves ≤0.5% hit-ratio difference for all algorithms

Page 56:

RIPQ Has High Fidelity

[Chart repeated, with a +16% annotation on GDSF-3]

+16% hit ratio ➔ 23% fewer backend IOs

Page 57:

RIPQ Has High Throughput

[Chart: throughput (req./sec, 0 to 30000) for SLRU-1/2/3 and GDSF-1/2/3, RIPQ vs. FIFO]

RIPQ throughput is comparable to FIFO (≤10% difference)

Page 58:

Related Work

• RAM-based advanced caching: SLRU (Karedla '94), GDSF (Young '94, Cao '97, Cherkasova '01), SIZE (Abrams '96), LFU (Maffeis '93), LIRS (Jiang '02), …
  → RIPQ enables their use on flash
• Flash-based caching solutions: Facebook FlashCache, Janus (Albrecht '13), Nitro (Li '13), OP-FCL (Oh '12), FlashTier (Saxena '12), Hec (Yang '13), …
  → RIPQ supports advanced algorithms
• Flash performance: Stoica '09, Chen '09, Bouganim '09, Min '12, …
  → Trend continues for modern flash cards

Page 59:

RIPQ

• First framework for advanced caching on flash
  – Relative priority queue interface
  – Large writes
  – Restricted insertion points
  – Lazy update
  – Section merge/split
• Enables SLRU-3 & GDSF-3 for Facebook photos
  – 10% less backbone traffic
  – 23% fewer backend IOs