[Starlingx-discuss] ERROR when deploy stx-monitor.

Peters, Matt Matt.Peters at windriver.com
Wed Jun 10 17:36:13 UTC 2020


Hi Rahmat,
The stx-monitor Armada application is no longer being actively maintained, since there wasn’t much interest from the community in continuing to support it.
The individual container services can still be deployed using Helm on StarlingX if you require them.
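For example, the upstream Elastic Helm charts can be installed directly. A rough sketch only (assuming Helm 3 and the public Elastic chart repository; the release names, namespace, and node selector below are illustrative, not the stx-monitor defaults):

```
# Add the public Elastic chart repository (Helm 3 assumed).
helm repo add elastic https://helm.elastic.co
helm repo update

# Pin the data pods to the nodes carrying the elastic-data=enabled label.
helm install mon-elasticsearch elastic/elasticsearch \
  --namespace monitor --create-namespace \
  --set nodeSelector.elastic-data=enabled

# Kibana's chart defaults point it at the elasticsearch-master service created above.
helm install mon-kibana elastic/kibana --namespace monitor
```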

There are also several other monitoring projects within the CNCF landscape that can be considered:
https://landscape.cncf.io/category=observability-and-analysis&format=card-mode&grouping=category

I hope that answers your question.

Regards, Matt

From: Rahmat Agung <agung at btech.id>
Date: Sunday, June 7, 2020 at 10:34 PM
To: "starlingx-discuss at lists.starlingx.io" <starlingx-discuss at lists.starlingx.io>
Subject: [Starlingx-discuss] ERROR when deploy stx-monitor.

I am trying to deploy stx-monitor on 3 worker nodes labeled like this:

```
worker-3       Ready    <none>   2d18h   v1.16.2   beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,elastic-client=enabled,elastic-controller=enabled,elastic-data=enabled,elastic-master=enabled,kubernetes.io/arch=amd64,kubernetes.io/hostname=worker-3,kubernetes.io/os=linux
worker-4       Ready    <none>   2d18h   v1.16.2   beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,elastic-client=enabled,elastic-controller=enabled,elastic-data=enabled,elastic-master=enabled,kubernetes.io/arch=amd64,kubernetes.io/hostname=worker-4,kubernetes.io/os=linux
worker-5       Ready    <none>   2d16h   v1.16.2   beta.kubernetes.io/arch=amd64,beta.kubernetes.io/os=linux,elastic-master=enabled,kubernetes.io/arch=amd64,kubernetes.io/hostname=worker-5,kubernetes.io/os=linux
```
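
The elastic-* labels shown above were applied per node; a minimal sketch of how such labels are assigned (hostnames and label values taken from the listing above, exact commands may differ on your setup):

```
# On StarlingX these labels are normally assigned through the host CLI, e.g.
#   system host-label-assign worker-5 elastic-master=enabled
# On a plain Kubernetes cluster the equivalent is:
kubectl label node worker-5 elastic-master=enabled
kubectl get nodes --show-labels
```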
When I checked the logs:

```
us: <_Rendezvous of RPC that terminated with:
status = StatusCode.UNKNOWN
details = "release mon-kibana failed: timed out waiting for the condition"
debug_error_string = "{"created":"@1591538841.195787781","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1017,"grpc_message":"release mon-kibana failed: timed out waiting for the condition","grpc_status":2}"
>
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller Traceback (most recent call last):
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/tiller.py", line 473, in install_release
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller     metadata=self.metadata)
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller   File "/usr/local/lib/python3.6/dist-packages/grpc/_channel.py", line 533, in __call__
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller     return _end_unary_response_blocking(state, call, False, None)
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller   File "/usr/local/lib/python3.6/dist-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller     raise _Rendezvous(state, None, None, deadline)
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller status = StatusCode.UNKNOWN
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller details = "release mon-kibana failed: timed out waiting for the condition"
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller debug_error_string = "{"created":"@1591538841.195787781","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1017,"grpc_message":"release mon-kibana failed: timed out waiting for the condition","grpc_status":2}"
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller >
2020-06-07 14:07:21.196 7963 ERROR armada.handlers.tiller
2020-06-07 14:07:21.199 7963 DEBUG armada.handlers.tiller [-] [chart=kibana]: Helm getting release status for release=mon-kibana, version=0 get_release_status /usr/local/lib/python3.6/dist-packages/armada/handlers/tiller.py:539
2020-06-07 14:07:21.402 7963 DEBUG armada.handlers.tiller [-] [chart=kibana]: GetReleaseStatus= name: "mon-kibana"
info {
  status {
    code: FAILED
  }
  first_deployed {
    seconds: 1591538240
    nanos: 977775758
  }
  last_deployed {
    seconds: 1591538240
    nanos: 977775758
  }
  Description: "Release \"mon-kibana\" failed: timed out waiting for the condition"
}
namespace: "monitor"
 get_release_status /usr/local/lib/python3.6/dist-packages/armada/handlers/tiller.py:547
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada [-] Chart deploy [kibana] failed: armada.exceptions.tiller_exceptions.ReleaseException: Failed to Install release: mon-kibana - Tiller Message: b'Release "mon-kibana" failed: timed out waiting for the condition'
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada Traceback (most recent call last):
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/tiller.py", line 473, in install_release
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     metadata=self.metadata)
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/grpc/_channel.py", line 533, in __call__
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     return _end_unary_response_blocking(state, call, False, None)
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/grpc/_channel.py", line 467, in _end_unary_response_blocking
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     raise _Rendezvous(state, None, None, deadline)
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada status = StatusCode.UNKNOWN
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada details = "release mon-kibana failed: timed out waiting for the condition"
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada debug_error_string = "{"created":"@1591538841.195787781","description":"Error received from peer","file":"src/core/lib/surface/call.cc","file_line":1017,"grpc_message":"release mon-kibana failed: timed out waiting for the condition","grpc_status":2}"
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada >
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada During handling of the above exception, another exception occurred:
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada Traceback (most recent call last):
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 225, in handle_result
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     result = get_result()
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 236, in <lambda>
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     if (handle_result(chart, lambda: deploy_chart(chart))):
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 214, in deploy_chart
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     chart, cg_test_all_charts, prefix, known_releases)
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/chart_deploy.py", line 239, in execute
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     timeout=timer)
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/tiller.py", line 486, in install_release
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada     raise ex.ReleaseException(release, status, 'Install')
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada armada.exceptions.tiller_exceptions.ReleaseException: Failed to Install release: mon-kibana - Tiller Message: b'Release "mon-kibana" failed: timed out waiting for the condition'
2020-06-07 14:07:21.404 7963 ERROR armada.handlers.armada
2020-06-07 14:07:21.406 7963 ERROR armada.handlers.armada [-] Chart deploy(s) failed: ['kibana']
2020-06-07 14:07:21.478 7963 INFO armada.handlers.lock [-] Releasing lock
2020-06-07 14:07:21.486 7963 ERROR armada.cli [-] Caught internal exception: armada.exceptions.armada_exceptions.ChartDeployException: Exception deploying charts: ['kibana']
2020-06-07 14:07:21.486 7963 ERROR armada.cli Traceback (most recent call last):
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/local/lib/python3.6/dist-packages/armada/cli/__init__.py", line 38, in safe_invoke
2020-06-07 14:07:21.486 7963 ERROR armada.cli     self.invoke()
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/local/lib/python3.6/dist-packages/armada/cli/apply.py", line 213, in invoke
2020-06-07 14:07:21.486 7963 ERROR armada.cli     resp = self.handle(documents, tiller)
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/lock.py", line 81, in func_wrapper
2020-06-07 14:07:21.486 7963 ERROR armada.cli     return future.result()
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/lib/python3.6/concurrent/futures/_base.py", line 425, in result
2020-06-07 14:07:21.486 7963 ERROR armada.cli     return self.__get_result()
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
2020-06-07 14:07:21.486 7963 ERROR armada.cli     raise self._exception
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/lib/python3.6/concurrent/futures/thread.py", line 56, in run
2020-06-07 14:07:21.486 7963 ERROR armada.cli     result = self.fn(*self.args, **self.kwargs)
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/local/lib/python3.6/dist-packages/armada/cli/apply.py", line 256, in handle
2020-06-07 14:07:21.486 7963 ERROR armada.cli     return armada.sync()
2020-06-07 14:07:21.486 7963 ERROR armada.cli   File "/usr/local/lib/python3.6/dist-packages/armada/handlers/armada.py", line 252, in sync
2020-06-07 14:07:21.486 7963 ERROR armada.cli     raise armada_exceptions.ChartDeployException(failures)
2020-06-07 14:07:21.486 7963 ERROR armada.cli armada.exceptions.armada_exceptions.ChartDeployException: Exception deploying charts: ['kibana']
2020-06-07 14:07:21.486 7963 ERROR armada.cli
```
What does the error above mean?
I just want to know: is stx-monitor stable or still experimental? I could not find any documentation about it.
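
For reference, the "timed out waiting for the condition" message generally means the chart's pods never reached Ready before Tiller gave up. Generic checks like these may surface the blocked resource (just a sketch; the `monitor` namespace comes from the release status above, and the `app=kibana` label is only an assumption):

```
# Why did the mon-kibana release never become Ready?
kubectl -n monitor get pods -o wide
kubectl -n monitor describe pods -l app=kibana   # label selector is an assumption
kubectl -n monitor get events --sort-by=.lastTimestamp
kubectl -n monitor logs -l app=kibana --tail=100
```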
