Ceph?
In any real production environment, high availability is an unavoidable topic: no user will tolerate paying for a service they cannot reach, let alone losing the data that service produced. Across the whole cloud-computing stack, storage is the only stateful service; compute and networking are stateless and can be rescheduled onto new physical machines at any time. Building a highly available storage system is therefore especially important.
Ceph is, without question, the best choice here:
Storage system | Ceph | GlusterFS | Sheepdog | Lustre | Swift | Cinder | TFS | HDFS | MooseFS | FastDFS | MogileFS |
---|---|---|---|---|---|---|---|---|---|---|---|
Implementation language | C++ | C | C | C | Python | Python | C++ | Java | C | C | Perl |
License | LGPL | GPL v3 | GPL v2 | GPL | Apache | Apache | GPL v2 | Apache | GPL v3 | GPL v3 | GPL |
Storage type | object/file/block | file/block | block | object | object | block | file | file | block | file/block | file |
Inter-node protocol | proprietary (TCP) | proprietary (TCP) / RDMA (remote direct memory access) | Totem protocol | proprietary (TCP) / RDMA (remote direct memory access) | TCP | unknown | TCP | TCP | TCP | TCP | HTTP |
Dedicated metadata server | uses MDS | none | none | dual MDS | none | unknown | uses NS | uses MDS | uses MFS master | none | uses DB |
Online expansion | supported | supported | supported | supported | supported | unknown | supported | supported | supported | supported | supported |
Redundancy / replication | supported | supported | supported | none | supported | unknown | supported | supported | supported | supported | not supported |
Single point of failure | none | none | none | yes | none | unknown | yes | yes | yes | none | yes |
Cross-cluster sync | not supported | supported | unknown | unknown | unknown | unknown | supported | not supported | not supported | partially supported | not supported |
Ease of use | simple install, professional official docs | simple install, professional official docs | unknown | complex; Lustre depends heavily on the kernel and requires recompiling it | unknown | framework not yet mature, some issues remain | complex install, little official documentation | simple install, professional official docs | simple install, plenty of documentation | simple install, fairly active community | unknown |
Typical use case | small/medium/large files in a single cluster | cross-cluster cloud storage | elastic block storage for VMs | large-file reads/writes | OpenStack object storage | OpenStack block storage | small files across clusters | file storage for MapReduce | medium/large files in a single cluster | small/medium files in a single cluster | unknown |
FUSE mount | supported | supported | supported | supported | supported | unknown | unknown | supported | supported | not supported | not supported |
Access interface | POSIX | POSIX | unknown | POSIX/MPI | POSIX | unknown | non-POSIX | non-POSIX | POSIX | non-POSIX | non-POSIX |
Quick Deployment
We assume that by this point you have already taken care of the prerequisites: hostname configuration, /etc/hosts entries, passwordless SSH login, time synchronization, and Docker installation.
Here we pick Cephadm as the deployment tool, so Ceph itself runs from container images on Docker. That puts OpenStack and Ceph side by side on Docker, which not only isolates their environments against runtime breakage but also makes version upgrades trivial. In other words: Kolla + Cephadm + Ansible is the dream team!
```bash
apt install -y cephadm
mkdir -p /etc/ceph
```
Since we use Ubuntu 20.04 as the base system, cephadm is available straight from apt; on other operating systems you may need to install Cephadm manually and add the corresponding repo.
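For reference, the upstream docs for Pacific-era releases describe a distro-agnostic path (the release name below is only an example; pick the one matching your target version):

```bash
# Fetch the standalone cephadm script (Pacific used as an example release)
curl --silent --remote-name --location https://github.com/ceph/ceph/raw/pacific/src/cephadm/cephadm
chmod +x cephadm
# Add the matching package repo and install cephadm system-wide
./cephadm add-repo --release pacific
./cephadm install
```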
First, create a ceph-initialization.conf that sets the default pool size to 1, so that a single machine with a single disk is enough to run Ceph:
```ini
[global]
osd pool default size = 1
osd pool default min size = 1
```
Run the following command to bootstrap the "cluster":
```bash
cephadm bootstrap --config ceph-initialization.conf --mon-ip ${IPAddress}
```
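When the bootstrap finishes, a quick sanity check through cephadm's containerized shell should show a single monitor and manager (a sketch; output will vary per environment):

```bash
cephadm shell -- ceph -s
```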
Turn off the warning about pools having no redundancy (expected here, since we deliberately set the pool size to 1):
```bash
ceph config set global mon_warn_on_pool_no_redundancy false
```
Cluster Configuration
Deploying OSDs
```bash
# List the storage devices cephadm has discovered on each host
ceph orch device ls
# Create an OSD on every available (unused, unpartitioned) device
ceph orch apply osd --all-available-devices
```
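It can take a moment for the OSDs to appear; something like the following confirms they are up and in:

```bash
ceph osd tree
ceph -s
```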
Creating Pools
```bash
# ceph osd pool create <pool> <pg_num> <pgp_num>
ceph osd pool create images 32 32
ceph osd pool create volumes 32 32
ceph osd pool create backups 32 32
```
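Since these pools will back RBD images for Glance, Cinder and Nova, recent Ceph releases will also warn about pools without an application tag; initializing them for RBD covers that. This is optional housekeeping on top of the original steps:

```bash
rbd pool init images
rbd pool init volumes
rbd pool init backups
```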
One-Step Keyring Generation
Every service needs its own Ceph account and keyring file, so here is a one-step script that generates them all:
```bash
# Config directories Kolla expects for each service
mkdir -p /etc/kolla/config/cinder/cinder-backup
mkdir -p /etc/kolla/config/cinder/cinder-volume
mkdir -p /etc/kolla/config/nova
mkdir -p /etc/kolla/config/glance

# cinder-volume and nova use client.cinder
ceph auth get-or-create client.cinder
ceph auth caps client.cinder mon 'allow r' osd 'allow rwx pool=volumes'
ceph auth get client.cinder -o /etc/kolla/config/cinder/cinder-volume/ceph.client.cinder.keyring
ceph auth get client.cinder -o /etc/kolla/config/cinder/cinder-backup/ceph.client.cinder.keyring
ceph auth get client.cinder -o /etc/kolla/config/nova/ceph.client.cinder.keyring

# cinder-backup uses client.cinder-backup
ceph auth get-or-create client.cinder-backup
ceph auth caps client.cinder-backup mon 'allow r' osd 'allow rwx pool=volumes, allow rwx pool=backups'
ceph auth get client.cinder-backup -o /etc/kolla/config/cinder/cinder-backup/ceph.client.cinder-backup.keyring

# glance uses client.glance
ceph auth get-or-create client.glance
ceph auth caps client.glance mon 'allow r' osd 'allow rwx pool=images'
ceph auth get client.glance -o /etc/kolla/config/glance/ceph.client.glance.keyring
```
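To double-check that the caps and keyring files landed where Kolla expects them (paths as used above):

```bash
# Show the caps actually recorded for one of the clients
ceph auth get client.glance
# Confirm the keyrings exist under the Kolla config tree
ls -l /etc/kolla/config/{glance,nova,cinder/cinder-volume,cinder/cinder-backup}/
```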
As of this writing, this permission setup still has an unresolved issue:
```
2022-03-05 13:41:38.987 8 INFO nova.compute.manager [req-3a14a22b-d0f7-46db-883f-4193d98b7ff0 d2f9727b2c2b4af4bfb3e44a5720a2ed 96cf095c53194fcf871bdc6d8c51122f - default default] [instance: 6c195337-1fee-4a0e-b116-5e614cd80e79] Successfully reverted task state from deleting on failure for instance.
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server [req-3a14a22b-d0f7-46db-883f-4193d98b7ff0 d2f9727b2c2b4af4bfb3e44a5720a2ed 96cf095c53194fcf871bdc6d8c51122f - default default] Exception during message handling: rbd.PermissionError: [errno 1] RBD permission error (error listing images)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/exception_wrapper.py", line 71, in wrapped
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification(
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.force_reraise()
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise self.value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/exception_wrapper.py", line 63, in wrapped
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 183, in decorated_function
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server LOG.warning("Failed to revert task state for instance. "
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.force_reraise()
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise self.value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 154, in decorated_function
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/utils.py", line 1433, in decorated_function
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 211, in decorated_function
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context,
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.force_reraise()
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise self.value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 200, in decorated_function
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 3095, in terminate_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_concurrency/lockutils.py", line 360, in inner
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return f(*args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 3093, in do_terminate_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self._set_instance_obj_error_state(instance)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.force_reraise()
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise self.value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 3083, in do_terminate_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 3018, in _delete_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 2910, in _shutdown_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance,
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 227, in __exit__
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.force_reraise()
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise self.value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/compute/manager.py", line 2897, in _shutdown_instance
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.driver.destroy(context, instance, network_info,
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/virt/libvirt/driver.py", line 1423, in destroy
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self.cleanup(context, instance, network_info, block_device_info,
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/virt/libvirt/driver.py", line 1493, in cleanup
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server return self._cleanup(
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/virt/libvirt/driver.py", line 1566, in _cleanup
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server self._cleanup_rbd(instance)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/virt/libvirt/driver.py", line 1643, in _cleanup_rbd
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server rbd_utils.RBDDriver().cleanup_volumes(filter_fn)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/nova/storage/rbd_utils.py", line 413, in cleanup_volumes
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server volumes = RbdProxy().list(client.ioctx)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/eventlet/tpool.py", line 193, in doit
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server result = proxy_call(self._autowrap, f, *args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/eventlet/tpool.py", line 151, in proxy_call
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server rv = execute(f, *args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/eventlet/tpool.py", line 132, in execute
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server six.reraise(c, e, tb)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/usr/local/lib/python3.8/dist-packages/six.py", line 719, in reraise
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server raise value
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "/var/lib/kolla/venv/lib/python3.8/site-packages/eventlet/tpool.py", line 86, in tworker
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server rv = meth(*args, **kwargs)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server File "rbd.pyx", line 720, in rbd.RBD.list
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server rbd.PermissionError: [errno 1] RBD permission error (error listing images)
2022-03-05 13:41:38.991 8 ERROR oslo_messaging.rpc.server
```
A possible fix, which did not work when we tried it: openstack - Permissions for glance user in Ceph - Stack Overflow
Temporary workaround: grant allow * caps.
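For a lab setup, one blunt reading of that workaround is to widen the caps of the client that is failing (here client.cinder, which Nova uses); this grants full access and is not something to keep in production:

```bash
# Lab-only: replace the restrictive caps with unrestricted ones
ceph auth caps client.cinder mon 'allow *' osd 'allow *'
```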
Copying the Ceph Configuration
The OpenStack services also need ceph.conf in order to pick up the settings required to talk to Ceph:
The /etc/ceph/ceph.conf generated by default contains tab characters (\t), but Kolla requires a configuration file without them, so copy ceph.conf and strip every \t from the copy first!
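One way to produce that tab-free copy in the current working directory (which is where the cp commands below pick it up from) is a simple sed pass:

```bash
# Strip every tab character from the generated config
sed 's/\t//g' /etc/ceph/ceph.conf > ceph.conf
```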
```bash
cp ceph.conf /etc/kolla/config/cinder/cinder-volume/
cp ceph.conf /etc/kolla/config/cinder/cinder-backup/
cp ceph.conf /etc/kolla/config/nova/
cp ceph.conf /etc/kolla/config/glance/
```
Burn It All Down!
All kinds of problems will inevitably pop up during installation. Don't despair; there is an undo button:
```bash
# Tear down the cluster identified by the fsid recorded in /etc/ceph/ceph.conf
cephadm rm-cluster --force --fsid `perl -ne 'print $1 if m/fsid = (\S+)/' /etc/ceph/ceph.conf`
# Remove leftover config, systemd units and data
rm -rf /etc/ceph /etc/systemd/system/ceph* /var/lib/ceph/
```
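Note that rm-cluster does not necessarily wipe the OSD disks themselves; if you want to redeploy onto the same disks, the leftover LVM volumes usually have to go too. A rough sketch (the device name is only an example; this destroys data, so triple-check it):

```bash
# Example device name only; point this at your actual OSD disk
DISK=/dev/sdb
# Remove any leftover ceph-* volume groups created by ceph-volume
vgremove -f $(vgs --noheadings -o vg_name 2>/dev/null | grep ceph-) || true
# Wipe the partition table and remaining signatures
sgdisk --zap-all "$DISK"
wipefs --all "$DISK"
```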
Hooray~
Todo