Details
- Type: Bug
- Status: Resolved
- Priority: Critical
- Resolution: Fixed
- Fix Version/s: 2.6.0
- Component/s: None
- Labels: None
Description
STR:
- Install a cluster with Spark and Spark2
- Remove Spark2
- Change a Spark configuration
- Run the Spark Service check
- Attempt to perform an upgrade
The upgrade pre-checks block the upgrade with a message like:
02 Aug 2017 10:17:23,701 INFO [ambari-client-thread-28] ServiceCheckValidityCheck:149 - Service SPARK latest config change is 08-02-2017 09:45:16, latest service check executed at 12-31-1969 03:59:59
The 12-31-1969 service-check time corresponds to a stored value of -1 in my cluster. What actually happened is that I aborted my SPARK2 service check and then removed the service. The pre-check, however, matches on the service name and incorrectly detects the old SPARK2 check as the one for SPARK:
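The 12-31-1969 timestamp is consistent with formatting an unset epoch value of -1 ms. A minimal sketch (the format pattern is an assumption, chosen to match the log line above):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class EpochDemo {
    public static void main(String[] args) {
        // A missing service-check timestamp stored as -1 ms sits just before the Unix epoch.
        SimpleDateFormat fmt = new SimpleDateFormat("MM-dd-yyyy HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        // In UTC this formats as 12-31-1969 23:59:59; a local time zone
        // shifts the clock time but keeps the tell-tale 1969 date.
        System.out.println(fmt.format(new Date(-1L)));
    }
}
```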
boolean serviceCheckWasExecuted = false;
for (HostRoleCommandEntity command : latestTimestamps.values()) {
  if (null != command.getCommandDetail() && command.getCommandDetail().contains(serviceName)) {
Because contains() finds SPARK inside SPARK2_SERVICE_CHECK, the pre-check picks up timestamps that belong to the wrong service...
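A self-contained sketch of the faulty substring match and one possible stricter alternative (the command-detail strings and the regex-based fix are illustrative assumptions, not the actual Ambari patch; the real values come from HostRoleCommandEntity.getCommandDetail()):

```java
public class ServiceCheckMatchDemo {
    // The pre-check's substring test: "SPARK" also matches "SPARK2_SERVICE_CHECK".
    static boolean matchesBuggy(String commandDetail, String serviceName) {
        return commandDetail != null && commandDetail.contains(serviceName);
    }

    // A stricter match: require the exact service token, bounded by word boundaries,
    // so SPARK no longer matches SPARK2's service check.
    static boolean matchesStrict(String commandDetail, String serviceName) {
        return commandDetail != null
                && commandDetail.matches(".*\\b" + serviceName + "_SERVICE_CHECK\\b.*");
    }

    public static void main(String[] args) {
        System.out.println(matchesBuggy("SPARK2_SERVICE_CHECK", "SPARK"));  // true: wrong hit
        System.out.println(matchesStrict("SPARK2_SERVICE_CHECK", "SPARK")); // false
        System.out.println(matchesStrict("SPARK_SERVICE_CHECK", "SPARK"));  // true
    }
}
```

With the substring test, the aborted SPARK2 check (timestamp -1) shadows the real SPARK service check, which is exactly the behavior described above.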