Ceph as Storage backend
Jaime Ibar, Systems Administrator, [email protected]
08/11/2018

HEAnet · Trinity College Dublin, The University of Dublin


Page 1

Ceph as Storage backend

Jaime Ibar
Systems Administrator
[email protected]

08/11/2018

Page 2

Outline

• Ceph

• Ceph overview

• Ceph architecture

• OpenNebula cloud backed by Ceph storage

• Ceph infrastructure details

• Ceph services

• Pools configuration

• Issues

• Use cases

• Research IT

• DRI (Digital Repository of Ireland)

• Next steps

Page 3

Ceph overview

‒ Open-source, software-defined storage

‒ Unified Storage Platform

● Block storage – RADOS Block Device (RBD)

● Object storage – RadosGW (RGW), S3-compatible storage

● File system – POSIX-compliant file system (CephFS)

‒ No single point of failure

‒ Popular in private clouds as VM image service and S3-compatible object storage service

‒ Easy to scale

● Expanding capacity is as easy as adding more OSDs
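The scaling and no-single-point-of-failure properties follow from RADOS computing placement rather than storing it: any client can hash an object name to a placement group (PG) and let CRUSH map that PG to OSDs, so no central directory is consulted. A toy Python sketch of the first step (heavily simplified: real Ceph uses its own `ceph_str_hash` and a "stable modulo", not MD5):

```python
import hashlib

def object_to_pg(object_name: str, pg_num: int) -> int:
    """Toy illustration: hash an object name to a placement group (PG).

    In real Ceph, CRUSH then maps the PG to a set of OSDs; because
    placement is computed, adding OSDs only reshuffles PGs and never
    requires a central lookup service.
    """
    digest = hashlib.md5(object_name.encode()).digest()
    return int.from_bytes(digest[:4], "little") % pg_num

pg = object_to_pg("rbd_data.1234.0000000000000000", 128)
assert 0 <= pg < 128
# Deterministic: every client computes the same placement independently.
assert pg == object_to_pg("rbd_data.1234.0000000000000000", 128)
```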

Page 4

Ceph architecture

Page 5

Ceph network architecture

Page 6

Page 7

Trinity College Dublin, The University of Dublin

OpenNebula cloud backed by Ceph storage

● Librbd client integration with libvirt/qemu

● Support for live migration

● Directly supported with the Ceph datastore (rbd CLI)

● Images are stored in a Ceph pool (one-<IMAGE ID>)

● Virtual machine disks stored by default in the same pool

● Specific user for accessing Ceph (libvirt by default)

● OpenNebula folder exported over CephFS to the clients
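The librbd/libvirt integration above ultimately shows up as a network disk in the domain XML that libvirt is given for each VM. A hand-written, illustrative fragment (the monitor hostname, image name `one-42`, and secret UUID are placeholders; the cephx user matches the default `libvirt` user mentioned above):

```xml
<disk type='network' device='disk'>
  <!-- qemu opens the image through librbd; no kernel rbd mapping needed -->
  <source protocol='rbd' name='one/one-42'>
    <host name='mon1.example.org' port='6789'/>
  </source>
  <!-- cephx authentication via a libvirt secret holding the user's key -->
  <auth username='libvirt'>
    <secret type='ceph' uuid='00000000-0000-0000-0000-000000000000'/>
  </auth>
  <target dev='vda' bus='virtio'/>
</disk>
```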

Page 8

OpenNebula cloud backed by Ceph storage: Limitations

● OpenNebula nodes must be part of a running Ceph cluster

● Xen support?

● Hypervisor nodes must be part of a working Ceph cluster

● libvirt and QEMU packages need to be recent enough

Page 9

Ceph infrastructure details

● 3 nodes for mon/mds/mgr

● 7 nodes for OSDs (12 OSDs each)

● 1 VM for RadosGW

● Ubuntu 16.04

● 32 GB memory

● 72 x 4TB HDD

● 12 x 8TB HDD

● 1 Gb network uplink (public and private/cluster networks)

● 349 TB total capacity
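The 349 TB figure is consistent with the drive counts once the drives' decimal terabytes are converted to the binary tebibytes that Ceph tooling reports. A quick sanity check (assuming all 84 HDDs are OSD data disks):

```python
TB = 10**12   # decimal terabyte, how drive capacity is sold
TIB = 2**40   # binary tebibyte (TiB), what Ceph-style tools report

raw_bytes = 72 * 4 * TB + 12 * 8 * TB  # 72 x 4TB + 12 x 8TB HDDs
raw_tib = raw_bytes / TIB

print(round(raw_tib))  # 349
```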

Page 10

Ceph dashboard (servers)

Page 11

Ceph services

● 3 monitor servers (quorum and HA)

● 3 metadata servers (1 active, 2 standby)

● 3 manager daemons

– Introduced in Luminous

– Dashboard enabled

● 1 RADOS Gateway for object storage (S3)

● 7 pools

– 3 for RBD

– 3 for CephFS

– 1 for RadosGW objects

● 84 OSDs

● CephFS (kernel client instead of ceph-fuse)

● Recently upgraded from Jewel 10.2.10 to Luminous 12.2.8

Page 12

Ceph dashboard (Cluster health)

Page 13

Ceph dashboard (OSD daemons)

Page 14

CephFS server (I)

Page 15

CephFS server (II)

Page 16

CephFS clients

Page 17

Pools configuration

● 1 replicated pool for OpenNebula

– Replica factor 3

● 2 replicated pools for raw storage

– Replica factor 2 or 3, depending on users' requirements

● 3 pools for CephFS

– 1 erasure pool for data

● Replica factor 5

– 1 replicated pool for metadata

● Replica factor 2

– 1 replicated pool for cache tier

● Replica factor 2

● 1 replicated pool for RadosGW (object storage)

– Replica factor 3
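Replica factor translates directly into usable capacity: a size-3 replicated pool keeps only one third of its raw space as unique data, while erasure coding keeps k/(k+m), which is why it is attractive for the bulk CephFS data pool. A rough Python illustration (the k=4, m=1 profile is an assumed reading of "replica factor 5" for the erasure pool; all overheads are ignored):

```python
def usable_replicated(raw_tb: float, size: int) -> float:
    """Usable capacity of a replicated pool: every object stored `size` times."""
    return raw_tb / size

def usable_erasure(raw_tb: float, k: int, m: int) -> float:
    """Usable capacity of an erasure-coded pool: k data + m coding chunks."""
    return raw_tb * k / (k + m)

RAW = 349  # cluster raw capacity (TB) from the slides above
print(round(usable_replicated(RAW, 3)))  # ~116 TB if it all went to a size-3 pool
print(round(usable_erasure(RAW, 4, 1)))  # ~279 TB under the hypothetical k=4,m=1 profile
```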

Page 18

Ceph dashboard (OpenNebula pool details)

Page 19

Issues

Issues we found after upgrading to Luminous

● Maximum number of PGs per OSD reduced to 200 (a "too many PGs per OSD" warning showed up)

● Number of placement groups suboptimal for some pools

– Impact on performance and recovery

– Decreasing the number of PGs is not an option

– Solution: recreate the pools with the right number of PGs

● Not clear how to calculate the right number of PGs, though

● Multi-MDS broke CephFS for some clients

– Kernel version too old (4.4)

– Minimum recommended is 4.9

● Recovery after an OSD failure caused service interruption and unresponsive VMs
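The "right number of PGs" question above is commonly answered with the old pgcalc rule of thumb: budget roughly 100 PGs per OSD across all pools, divide by the replica size, and round to a power of two. A sketch (the 100-PG target and the round-up choice are conventions, not Ceph requirements):

```python
import math

def suggested_pg_num(num_osds: int, size: int, target_per_osd: int = 100) -> int:
    """Rule-of-thumb total PG count for pools spanning all OSDs."""
    raw = num_osds * target_per_osd / size
    return 2 ** math.ceil(math.log2(raw))  # round up to the next power of two

# The cluster above: 84 OSDs, replica factor 3 -> 2800, rounded up to 4096
print(suggested_pg_num(84, 3))
```

With several pools sharing the same OSDs, the per-pool counts have to split this budget, which is part of why the calculation is not obvious in practice.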

Page 20

Use cases

● Research IT

● Digital Repository of Ireland (DRI)

Page 21

Use case: Research IT

● Ceph Image datastore for OpenNebula

● OS and datablock images (raw storage)

● Running VMs (RBD block devices)

● Core services running on OpenNebula

– DNS

– DHCP

– Web apps

– ...

● https://www.tchpc.tcd.ie/

Page 22

Use case: Digital Repository of Ireland (DRI)

– Core services running on OpenNebula

– CephFS for storing uploaded files

– Metadata text files

– Data files (jpg, mp3, …)

– RadosGW for S3 access (object storage)

Page 23

Next steps

● Migration to BlueStore backend

– It has to be done one OSD at a time

● Improve cluster network redundancy

● PGs

● SSD disks for journal

● 10G network

● ...

Page 24

Thank You