Oracle 18c RAC CRS-6706 Clusterware Release patch level () does not match Software patch level () - Resolution
Author: dave
While applying an RU (Release Update) to an Oracle 18c RAC, node 1 went through without any problem, but node 2 hit various errors during patching. I tried rebooting the operating system, and on startup CRS refused to come up with the following error:
[root@www.cndba.cn ~]# crsctl start crs
CRS-6706: Oracle Clusterware Release patch level ('4180884209') does not match Software patch level ('1930182575'). Oracle Clusterware cannot be started.
CRS-4000: Command Start failed, or completed with errors.
The root cause is that the patch was only half applied, leaving the Clusterware release patch level and the software patch level inconsistent.
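Besides kfod, the two levels can also be compared directly with crsctl on the failing node. This is a quick check using the standard crsctl query options (output omitted here); on the failing node the two values will differ, which is exactly what CRS-6706 complains about:
crsctl query crs releasepatch
crsctl query crs softwarepatch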
There is a MOS note that mentions this problem, but the solution it gives is very vague:
CRS-6706: Oracle Clusterware Release patch level ('nnn') does not match Software patch level ('mmm') (Doc ID 1639285.1)
Patches on node 1:
[grid@www.cndba.cn ~]$ kfod op=patches
---------------
List of Patches
===============
27908644
27923415
28090523
28090553
28090557
28256701
28507480
28547619
28655784
28655916
28655963
28656071
[grid@www.cndba.cn ~]$
[grid@www.cndba.cn ~]$ kfod op=patchlvl
-------------------
Current Patch level
===================
3985437637
Patches on node 2:
[grid@www.cndba.cn trace]$ kfod op=patches
---------------
List of Patches
===============
27908644
27923415
28090523
28090553
28090557
28090564
28256701
28507480
28547619
[grid@www.cndba.cn trace]$
[grid@www.cndba.cn trace]$ kfod op=patchlvl
-------------------
Current Patch level
===================
4180884209
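A quick way to see exactly what differs is to dump the patch list on each node and diff the two files. This is a simple sketch; the file names and the scp step are just examples for this two-node setup:
kfod op=patches > /tmp/patches_$(hostname -s).txt   # run as the grid user on each node
scp rac1:/tmp/patches_rac1.txt /tmp/                # pull node 1's list over to node 2
diff /tmp/patches_rac1.txt /tmp/patches_rac2.txt
Here node 2 still lists 28090564, which node 1 no longer has, and is missing the four 2865xxxx patches present on node 1, a clear sign that the update stopped partway through.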
Because the RU scripts failed on node 2, its patch level is no longer consistent with the previous level either, so after the reboot CRS cannot start.
The solution given in the MOS note is vague; it only says to run the following scripts against the GI home:
<GI_HOME>/crs/install/rootcrs.pl -prepatch
<GI_HOME>/crs/install/rootcrs.pl -postpatch
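In our environment the GI home is /u01/app/18.3.0/grid (as shown in the opatch output below), so a concrete run would look roughly like this. This is only a sketch of what the note suggests, since we did not take this route; on 18c the shell wrapper rootcrs.sh in the same directory is normally what gets called:
# as root on the problem node (node 2)
/u01/app/18.3.0/grid/crs/install/rootcrs.sh -prepatch
/u01/app/18.3.0/grid/crs/install/rootcrs.sh -postpatch
The -postpatch step is intended to update the recorded patch level so that it agrees with the software in the home again.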
Instead, we can roll back the patch we just applied and then start CRS.
First, use opatch lsinventory to list the patches currently installed:
[grid@www.cndba.cn u01]$ $ORACLE_HOME/OPatch/opatch lsinventory
Oracle Interim Patch Installer version 12.2.0.1.14
Copyright (c) 2018, Oracle Corporation. All rights reserved.
Oracle Home : /u01/app/18.3.0/grid
Central Inventory : /u01/app/oraInventory
from : /u01/app/18.3.0/grid/oraInst.loc
OPatch version : 12.2.0.1.14
OUI version : 12.2.0.4.0
Log file location : /u01/app/18.3.0/grid/cfgtoollogs/opatch/opatch2018-11-17_22-37-59PM_1.log
Lsinventory Output file location : /u01/app/18.3.0/grid/cfgtoollogs/opatch/lsinv/lsinventory2018-11-17_22-37-59PM.txt
--------------------------------------------------------------------------------
Local Machine Information::
Hostname: rac2
ARU platform id: 226
ARU platform description:: Linux x86-64
Installed Top-level Products (1):
Oracle Grid Infrastructure 18c 18.0.0.0.0
There are 1 products installed in this Oracle Home.
Interim patches (7) :
Patch 28547619 : applied on Sat Nov 17 15:59:31 CST 2018
Unique Patch ID: 22406652
Patch description: "TOMCAT RELEASE UPDATE 18.0.0.0.0 (28547619)"
Created on 29 Aug 2018, 11:32:05 hrs PST8PDT
Bugs fixed:
27869283, 28402313
……
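If the full lsinventory listing is too verbose, opatch lspatches prints a one-line-per-patch summary that is easier to scan (supported by recent OPatch releases, including the 12.2.0.1.14 shown above):
$ORACLE_HOME/OPatch/opatch lspatches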
In our case the patch in question is 28547619; roll it back with the following command:
[grid@www.cndba.cn u01]$ $ORACLE_HOME/OPatch/opatch rollback -id 28547619
Oracle Interim Patch Installer version 12.2.0.1.14
Copyright (c) 2018, Oracle Corporation. All rights reserved.
Oracle Home : /u01/app/18.3.0/grid
Central Inventory : /u01/app/oraInventory
from : /u01/app/18.3.0/grid/oraInst.loc
OPatch version : 12.2.0.1.14
OUI version : 12.2.0.4.0
Log file location : /u01/app/18.3.0/grid/cfgtoollogs/opatch/opatch2018-11-17_22-38-51PM_1.log
Patches will be rolled back in the following order:
28547619
The following patch(es) will be rolled back: 28547619
Please shutdown Oracle instances running out of this ORACLE_HOME on the local system.
(Oracle Home = '/u01/app/18.3.0/grid')
Is the local system ready for patching? [y|n]
y
User Responded with: Y
Rolling back patch 28547619...
RollbackSession rolling back interim patch '28547619' from OH '/u01/app/18.3.0/grid'
Patching component oracle.tomcat.crs, 18.0.0.0.0...
RollbackSession removing interim patch '28547619' from inventory
Inactive sub-set patch [28256701] has become active due to the rolling back of a super-set patch [28547619].
Please refer to Doc ID 2161861.1 for any possible further required actions.
Log file location: /u01/app/18.3.0/grid/cfgtoollogs/opatch/opatch2018-11-17_22-38-51PM_1.log
OPatch succeeded.
[grid@www.cndba.cn u01]$
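Before attempting the restart, it is worth confirming that the local software patch level is consistent again. A quick sanity check (the values reported will depend on your environment):
kfod op=patchlvl
crsctl query crs softwarepatch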
Then start CRS again; this time it comes up successfully:
[root@www.cndba.cn ~]# crsctl start crs
CRS-4123: Oracle High Availability Services has been started.
[root@www.cndba.cn ~]#
[root@www.cndba.cn ~]# crsctl stat res -t
--------------------------------------------------------------------------------
Name Target State Server State details
--------------------------------------------------------------------------------
Local Resources
--------------------------------------------------------------------------------
ora.ASMNET1LSNR_ASM.lsnr
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.DATA.dg
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.LISTENER.lsnr
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.MGMT.GHCHKPT.advm
OFFLINE OFFLINE rac1 STABLE
OFFLINE OFFLINE rac2 STABLE
ora.MGMT.dg
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.OCR.dg
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.helper
OFFLINE OFFLINE rac1 STABLE
OFFLINE OFFLINE rac2 IDLE,STABLE
ora.mgmt.ghchkpt.acfs
OFFLINE OFFLINE rac1 STABLE
OFFLINE OFFLINE rac2 STABLE
ora.net1.network
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.ons
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
ora.proxy_advm
ONLINE ONLINE rac1 STABLE
ONLINE ONLINE rac2 STABLE
--------------------------------------------------------------------------------
Cluster Resources
--------------------------------------------------------------------------------
ora.LISTENER_SCAN1.lsnr
1 ONLINE ONLINE rac1 STABLE
ora.MGMTLSNR
1 ONLINE ONLINE rac1 192.168.56.100,STABL
E
ora.asm
1 ONLINE ONLINE rac1 Started,STABLE
2 ONLINE ONLINE rac2 Started,STABLE
3 OFFLINE OFFLINE STABLE
ora.cndba.cndba_taf.svc
2 ONLINE ONLINE rac1 STABLE
ora.cndba.db
1 ONLINE OFFLINE Instance Shutdown,ST
ABLE
2 ONLINE ONLINE rac1 Open,HOME=/u01/app/o
racle/product/18.3.0
/db_1,STABLE
ora.cndba.pdb_taf.svc
1 ONLINE ONLINE rac1 STABLE
ora.cvu
1 ONLINE ONLINE rac1 STABLE
ora.mgmtdb
1 ONLINE ONLINE rac1 Open,STABLE
ora.qosmserver
1 OFFLINE OFFLINE STABLE
ora.rac1.vip
1 ONLINE ONLINE rac1 STABLE
ora.rac2.vip
1 ONLINE ONLINE rac2 STABLE
ora.rhpserver
1 OFFLINE OFFLINE STABLE
ora.scan1.vip
1 ONLINE ONLINE rac1 STABLE
--------------------------------------------------------------------------------
[root@www.cndba.cn ~]#
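As a final check, the cluster-wide upgrade state and active patch level can be confirmed with the standard crsctl query (output omitted here):
crsctl query crs activeversion -f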