Starlingx-discuss

starlingx-discuss@lists.starlingx.io

June 2023

  • 26 participants
  • 48 discussions
[Starlingx-discuss] [build-report] master STX_build_debian_master - Build # 208 - Failure!
by starlingx.build@gmail.com 26 Jun '23

Project: STX_build_debian_master
Build #: 208
Status: Failure
Timestamp: 20230627T060000Z
Branch: master
Check logs at: http://mirror.starlingx.cengn.ca/mirror/starlingx/master/debian/monolithic/…
--------------------------------------------------------------------------------
Parameters
BUILD_PACKAGES_LIST:
CLEAN_DOWNLOADS: false
BUILD_HELM_CHARTS: true
USE_DOCKER_CACHE: true
DOCKER_IMAGE_LIST:
PUSH_DOCKER_IMAGES: true
CLEAN_DOCKER: true
REFRESH_SOURCE: true
DRY_RUN: false
CLEAN_PACKAGES: true
BUILD_RT: true
BUILD_PACKAGES: true
CLEAN_REPOMGR: true
PKG_REUSE: false
JENKINS_SCRIPTS_BRANCH: master
BUILD_DOCKER_BASE_IMAGE: true
BUILD_DOCKER_IMAGES: true
FORCE_BUILD: false
REBUILD_BUILDER_IMAGES: true
CLEAN_ISO: true
BUILD_ISO: true
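
The parameter block in the report above is a flat run of KEY: value pairs from the Jenkins notification. As an illustrative sketch only (the function name and the shortened sample string are hypothetical, not part of the StarlingX build tooling), the block can be split back into a dictionary like this, in Python:

import re

def parse_build_params(text: str) -> dict:
    # Split on UPPER_CASE parameter names followed by ':'; keeping the names in
    # a capture group makes keys and values alternate in the result.
    parts = re.split(r"\s*([A-Z][A-Z0-9_]*):", text)
    return {key: value.strip() for key, value in zip(parts[1::2], parts[2::2])}

sample = ("BUILD_PACKAGES_LIST: CLEAN_DOWNLOADS: false BUILD_HELM_CHARTS: true "
          "DOCKER_IMAGE_LIST: PUSH_DOCKER_IMAGES: true DRY_RUN: false")
print(parse_build_params(sample))
# {'BUILD_PACKAGES_LIST': '', 'CLEAN_DOWNLOADS': 'false', 'BUILD_HELM_CHARTS': 'true',
#  'DOCKER_IMAGE_LIST': '', 'PUSH_DOCKER_IMAGES': 'true', 'DRY_RUN': 'false'}

Parameters with no value (for example DOCKER_IMAGE_LIST) simply come back as empty strings.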
[Starlingx-discuss] Minutes: Community Call (Jun 21, 2023)
by Khalil, Ghada 23 Jun '23

Etherpad: https://etherpad.opendev.org/p/stx-status
Minutes from the community call June 21, 2023

Standing topics
- Build
  - Main Branch Debian Builds
    - Some build failures during the last week
    - One build issue with the container images due to a Debian package upgrade >> already fixed
    - Build failed yesterday due to a networking outage; this is now happening once/twice per week
    - There's a build currently in progress
  - stx.6.0 Weekly RC Builds - Green
    - Build Output: http://mirror.starlingx.cengn.ca/mirror/starlingx/rc/6.0/
  - stx.7.0 Weekly RC Builds - Failed due to a recent change to facilitate the transition off of CENGN; ScottL is investigating
    - Build Output: http://mirror.starlingx.cengn.ca/mirror/starlingx/rc/7.0/
    - Note: No container image builds are done for stx.7.0
  - stx.8.0 Weekly RC Builds - Green
    - Build Output: https://mirror.starlingx.cengn.ca/mirror/starlingx/rc/8.0/
  - Transition from CENGN
    - Build team is working on transitioning the builds to WR infrastructure as CENGN will no longer be able to host the StarlingX builds.
    - Timing: Tentatively end-July / early-Aug
    - Status:
      - All mirror jobs are working
      - Able to build Debian: the main branch builds (iso+container) as well as stx.8.0 RC builds, with similar reliability
      - Now working on building stx.7.0 (CentOS build) and investigating issues
      - Working w/ WR IT on publicly posting build results << still under discussion
- Sanity
  - Debian Main Branch Platform Sanity
    - Last sanity email sent on June 20: https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014226.html
    - Status: Green for SX and DX
  - Debian Main Branch stx-openstack Sanity
    - Last sanity email sent on June 18: https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014218.html
    - Status: Green
- Gerrit Reviews in Need of Attention
  - tools repo reviews - Virtual Env Setup - https://review.opendev.org/c/starlingx/tools/+/880870
    - Key reviewers have not been active in reviewing the code
    - The changes are isolated and should be low risk
    - Agreed that Davlet will merge the changes based on the current +1s
    - Also agreed that the tools repo will be split into smaller repos. This action is with the build team to plan/implement.
  - From previous meetings: Fixes to libvirt env: https://review.opendev.org/c/starlingx/tools/+/863735
    - Review comments provided in Nov; still waiting for the author (Scott Kamp) to respond/address review comments
    - Jan 18: Some activity in the review as of Jan 15
    - Feb 1: Alternative fix proposed on Jan 18; waiting for ScottK's review
    - Feb 15: ScottK is going back to this
    - Mar 22: Next action is w/ ScottK; as per today's meeting, he'll be looking at the comments
    - Apr 11: Review is still open. Next action is w/ ScottK (the author)
    - May 4: ScottK lost his setup, so needs to recover the review
    - Jun 21: No updates
  - Reference Links:
    - Active Branch (open): https://review.opendev.org/q/projects:starlingx+is:open+branch:+master
    - Active Branch (merged): https://review.opendev.org/q/projects:starlingx+is:merged+branch:master

Topics for this week
- Manifest repo re-configuration / split to allow stx-openstack to be built independently from the platform
  - https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014206.html
  - No concerns from the community regarding this proposal.
  - The key issue is scheduling time for the build team to implement the change.
  - Action: Build team to propose a timeline based on their priorities/availability.
  - StoryBoard tracking the work: https://storyboard.openstack.org/#!/story/2010797
- Renaming App Repos to remove the "armada"
  - https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014215.html
  - Minor / cosmetic change. Lower priority, but would be nice to do. Should have no impact on runtime.
  - Action: Build team to propose a timeline based on their priorities/availability. (lower priority than the manifest repo re-config)

ARs from Previous Meetings
- Restarting the containers subproject meeting?
  - Action: Bruce to send the question to the mailing list to see if there is any interest in restarting the meetings
  - Status: Open / No update
- github Mirroring
  - Action: Greg to review the requirements for github mirroring to confirm if it's required for all repos or just a subset to meet k8s conformance requirements.
  - Status: Closed
    - Greg reviewed his notes from the discussion w/ CNCF and confirmed that the requirement is to only mirror the core repos (the config repo is considered the core repo).
    - There is no explicit requirement to mirror new repos.
- ARM Support
  - Action: Scott Kamp to respond to Jackie/the mailing list to provide assistance/access to some arm machines
  - Action: Scott Little to respond regarding build questions
    - Cannot mirror more files on CENGN; will need to wait until we transition away from CENGN
  - Action: Scott Kamp to explore providing a hosting env temporarily
    - No plan to publish images to dockerhub/starlingx until they are part of the official build infrastructure
  - Under discussion in the OS subproject meeting
  - Jackie is donating hardware, but it needs to be managed by someone
  - Status: Open - Greg took the action to identify the right prime to drive the ARM Support project within StarlingX
- StarlingX Subproject Meetings Update
  - Action: Scott Little to confirm w/ Mike Matteson on the build meetings
    - Scott is suggesting to cover build topics as part of the OS-distro meeting. Scott to discuss with the PL/TL.
    - Status: Open / No Update
  - Action: Chris Friesen to provide an update on the containerization meetings
    - Bruce will follow up on the mailing list on whether to restart the containerization meetings
    - Status: Open / No Update

Open Requests for Help
- Issue w/ pxebooting a second node on StarlingX
  - Question raised on the community call (Apr 26) by Kevin
  - The initial node installs/boots correctly, but an identical second node doesn't. This setup uses VMware to run StarlingX in VMs.
  - Status: Open / Greg has some information to share, but isn't sure how to get in touch w/ Kevin
- Build System Setup
  - Question was posted on IRC by Kevin
  - StarlingX documentation has a lot of outdated information. It took a lot of digging to find the right documentation for setting up a build env with minikube.
  - Requested that Kevin open an LP to the doc team and/or send an email with more details to the mailing list
  - Status: Open / No email sent to the mailing list yet. Unsure if an LP was opened.
- Support for StarlingX 8 setup bring-up
  - Reported by Prashanthi
  - Email thread: https://lists.starlingx.io/pipermail/starlingx-discuss/2023-April/014026.ht…
  - Status: Open / Discussion is active on the mailing list. Greg is providing assistance. The issues are related to date/time issues on the servers being set up as well as firewall issues.
    - CURRENTLY PAUSED ... Prashanthi had higher internal priorities that have taken her away from looking at StarlingX for the time being; waiting for Prashanthi to get back to us.
[Starlingx-discuss] Minutes: StarlingX Release Meeting - Jun 21/2023
by Khalil, Ghada 23 Jun '23

Agenda/Minutes are posted at: https://etherpad.openstack.org/p/stx-releases
Release Team Meeting - Jun 21 2023

stx.9.0
- Release/Feature Planning: https://docs.google.com/spreadsheets/d/1aTjYzUkExodfayt-rjTv466jE-DP8b_YjrT…
- PTG Release Discussion
  - Walked through the candidate feature list for stx.9.0
  - Discussed the more interesting features
- Upcoming Feature Milestones
  - A number of features are past due their Code Merge Date
    - Armada Deprecation / Replacement - FluxCD (cont'd) / Bruce Jones / Code Merge: 6/12
    - K8S Upversion Duration Reduction for AIO-SX / Bruce Jones / Code Merge: 6/6
    - Remove Armada / Bruce Jones / Code Merge: 6/12 << duplicate of "Armada Deprecation / Replacement - FluxCD (cont'd)"?
    - Containerization Component Refresh - k8s 1.26 / Bruce Jones / Code Merge: 6/13
    - WAD Users sudo and local linux group assignment / Ghada Khalil / Code Merge: 5/19
    - Subcloud Error Root Cause Correction Action / Carlos Fleck / Code Merge: 5/27
    - Support for long latency between SystemController and Subclouds / Ram S / Code Merge: 6/19
  - Action: Ghada to follow up with feature primes for updates

Blogs
- FEC Device Configurability (fec-operator Integration) for ACC100 & N3000
  - Prime: Balendu (Mouli) Burla
  - Forecast: May 1 >> May 19 >> needs re-forecast
  - Status: Confirmed / Not posted
- Platform Single Core Tuning
  - Prime: Guilherme Batista Leite
  - Forecast: May 9 >> Jun 6
  - Status: Posted for review
  - Pull Request: https://github.com/StarlingXWeb/starlingx-website/pull/238
- PTP O-RAN Compliant API Notification
  - Prime: Ghada Khalil
  - Forecast: Jun 30
  - Status: Confirmed / Not posted
[Starlingx-discuss] Manifest decoupling: StarlingX and stx-openstack application
by Cervi, Thales Elero 22 Jun '23

Greetings to the community 🙂

I would like to bring to this mailing list a discussion we are having about splitting the stx-openstack application repositories from the main starlingx manifest into a new one. In order to decouple the OpenStack clients currently shared between the platform and the application, we are suggesting splitting the platform and application builds into separate manifests (e.g. starlingx/manifest/default.xml and starlingx/manifest/stx-openstack.xml). I understand that the new manifest needs to be created and the build jobs configured properly and tested; I can also create the Stories for mapping this work later.

BENEFITS: we gain the freedom of building different versions of the same packages (i.e. the openstack clients) that are shared between the platform and the application. The end goal would be something like:

Remain as is (default manifest aligns to upstream Debian version => Victoria)
============
stx/upstream openstack/barbican
stx/upstream openstack/keystone
stx/upstream openstack/python-barbicanclient
stx/upstream openstack/python-keystoneclient
stx/upstream openstack/python-horizon
stx/upstream openstack/python-openstackclient

Only keep the services needed by the platform. Some existing patches might have been brought forward to support stx-openstack, so we should re-evaluate as part of this. Those would still be built as part of STX Platform builds, i.e. starlingx/manifest/default.xml. The manifest should not include the stx/openstack-armada-app repo anymore.

Relocate (stx-openstack app independently aligns to version of choice: Victoria [current] => Antelope [future])
====================
stx/upstream openstack/barbican copy-to stx/openstack-armada-app/upstream/openstack/barbican
stx/upstream openstack/keystone copy-to stx/openstack-armada-app/upstream/openstack/keystone
stx/upstream openstack/python-barbicanclient copy-to stx/openstack-armada-app/upstream/openstack/python-barbicanclient
stx/upstream openstack/python-keystoneclient copy-to stx/openstack-armada-app/upstream/openstack/python-keystoneclient
stx/upstream openstack/python-horizon copy-to stx/openstack-armada-app/upstream/openstack/python-horizon
stx/upstream openstack/python-openstackclient copy-to stx/openstack-armada-app/upstream/openstack/python-openstackclient
stx/upstream openstack/openstack-pkg-tools move-to stx/openstack-armada-app/upstream/openstack/openstack-pkg-tools
stx/upstream openstack/openstack-ras move-to stx/openstack-armada-app/upstream/openstack/openstack-ras
stx/upstream openstack/python-aodhclient move-to stx/openstack-armada-app/upstream/openstack/python-aodhclient
stx/upstream openstack/python-cinderclient move-to stx/openstack-armada-app/upstream/openstack/python-cinderclient
stx/upstream openstack/python-glanceclient move-to stx/openstack-armada-app/upstream/openstack/python-glanceclient
stx/upstream openstack/python-gnocchiclient move-to stx/openstack-armada-app/upstream/openstack/python-gnocchiclient
stx/upstream openstack/python-heatclient move-to stx/openstack-armada-app/upstream/openstack/python-heatclient
stx/upstream openstack/python-ironicclient move-to stx/openstack-armada-app/upstream/openstack/python-ironicclient
stx/upstream openstack/python-keystoneclient move-to stx/openstack-armada-app/upstream/openstack/python-keystoneclient
stx/upstream openstack/python-neutronclient move-to stx/openstack-armada-app/upstream/openstack/python-neutronclient
stx/upstream openstack/python-novaclient move-to stx/openstack-armada-app/upstream/openstack/python-novaclient
stx/upstream openstack/python-openstacksdk move-to stx/openstack-armada-app/upstream/openstack/python-openstacksdk
stx/upstream openstack/python-osc-lib move-to stx/openstack-armada-app/upstream/openstack/python-osc-lib
stx/upstream openstack/python-oslo-messaging move-to stx/openstack-armada-app/upstream/openstack/python-oslo-messaging
stx/upstream openstack/python-pankoclient move-to stx/openstack-armada-app/upstream/openstack/python-pankoclient
stx/upstream openstack/python-wsme move-to stx/openstack-armada-app/upstream/openstack/python-wsme
stx/upstream openstack/rabbitmq-server move-to stx/openstack-armada-app/upstream/openstack/rabbitmq-server

The OpenStack Distro Team will create a new directory under stx/openstack-armada-app to store all application-needed packages. This will include the clients and the stx-openstack Docker images we build only for the application. This way, on the stx-openstack build we can build the same packages (by name) but using different versions/source code. The manifest does not include the upstream repo.
------------------------
We have discussed this within a small group of people, so I would like to bring it to the attention of everyone and let the community speak in case there are any concerns against it and/or to support this change proposal. This work will be done in a couple of phases, progressively moving towards achieving the end goal explained here.

Looking forward to hearing back from you 🙂
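
As a rough sketch of what the proposed split could look like mechanically (the file names, element layout and path prefix below are assumptions for illustration, not the agreed implementation), a repo-style manifest could be divided by filtering project entries on their path, in Python:

import copy
import xml.etree.ElementTree as ET

# Assumed path prefix for application packages relocated under the app repo.
APP_PATH_PREFIX = "stx/openstack-armada-app"

def split_manifest(default_xml: str, app_xml: str) -> None:
    tree = ET.parse(default_xml)
    root = tree.getroot()

    # Start the application manifest with the same <remote>/<default> settings.
    app_root = ET.Element("manifest")
    for elem in root:
        if elem.tag in ("remote", "default"):
            app_root.append(copy.deepcopy(elem))

    # Move application projects out of the platform manifest into the new one.
    for project in root.findall("project"):
        if project.get("path", "").startswith(APP_PATH_PREFIX):
            app_root.append(copy.deepcopy(project))
            root.remove(project)

    ET.ElementTree(app_root).write(app_xml)
    tree.write(default_xml)

# Hypothetical usage:
# split_manifest("default.xml", "stx-openstack.xml")

The actual separation would be done through the manifest repo and build job configuration; the sketch only illustrates splitting platform and application projects by path.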
[Starlingx-discuss] [build-report] master STX_build_debian_master - Build # 204 - Failure!
by starlingx.build@gmail.com 21 Jun '23

Project: STX_build_debian_master
Build #: 204
Status: Failure
Timestamp: 20230622T060000Z
Branch: master
Check logs at: http://mirror.starlingx.cengn.ca/mirror/starlingx/master/debian/monolithic/…
--------------------------------------------------------------------------------
Parameters
BUILD_PACKAGES_LIST:
CLEAN_DOWNLOADS: false
BUILD_HELM_CHARTS: true
USE_DOCKER_CACHE: true
DOCKER_IMAGE_LIST:
PUSH_DOCKER_IMAGES: true
CLEAN_DOCKER: true
REFRESH_SOURCE: true
DRY_RUN: false
CLEAN_PACKAGES: true
BUILD_RT: true
BUILD_PACKAGES: true
CLEAN_REPOMGR: true
PKG_REUSE: false
JENKINS_SCRIPTS_BRANCH: master
BUILD_DOCKER_BASE_IMAGE: true
BUILD_DOCKER_IMAGES: true
FORCE_BUILD: false
REBUILD_BUILDER_IMAGES: true
CLEAN_ISO: true
BUILD_ISO: true
[Starlingx-discuss] OpenInfra Summit, PTG and Forum summary
by Ildiko Vancsa 21 Jun '23

Hi StarlingX Community,

I’m reaching out about the OpenInfra Summit, which happened last week in Vancouver, BC, Canada. We had a great event with people from all around the world. We had StarlingX contributors and people interested in the project attending in person to make connections and bring discussions forward. Contributors and users presented at the conference to talk about their StarlingX use cases or the capabilities of the project.

The conference sessions were recorded and will be made available in the coming weeks on the event’s YouTube playlist: https://www.youtube.com/playlist?list=PLKqaoAnDyfgqsxQDbLj4LVpKiZSDbntuC
The StarlingX related videos will be added to the StarlingX playlist on YouTube, once available: https://www.youtube.com/playlist?list=PLKqaoAnDyfgp7KWad7EAHnZ30Mdg3Ejqf

* Forum
Forum sessions are interactive sessions that provide the space and opportunity for users and project contributors to get together and discuss feedback, priorities and next steps. The StarlingX community had a short session to learn about the use cases that people are considering StarlingX for. Main discussion topics included:
- Connectivity in rural areas
- Industrial IoT and sensors
- MSPs (Managed Service Providers)
Please see the Forum etherpad for notes: https://etherpad.opendev.org/p/r.65f95cd991ecd9d4de6934b865ec0f80

* PTG
Contributors and newcomers to the project had a small PTG session to discuss current work items and challenges. Main discussion topics included:
- Project communication - a follow-up email is in progress
- Onboarding users and new contributors
- Activities to move forward with implementing Arm support for StarlingX
Please see the PTG etherpad for notes: https://etherpad.opendev.org/p/r.ff22c936a2ed93cc90a8d5d342c323d1

Thanks and Best Regards,
Ildikó
———
Ildikó Váncsa
Director of Community
Open Infrastructure Foundation
[Starlingx-discuss] Minutes: OpenStack Distro Team Call (Jun 20, 2023)
by Cervi, Thales Elero 20 Jun '23

Etherpad: https://etherpad.opendev.org/p/stx-distro-openstack-meetings
Minutes from the OpenStack Distro team call Jun 20, 2023

Build:
- Build Issues: None

Installation:
- Installation Issues
  # LP 2023085: STX-Openstack | fails to apply - mariadb-server-1 pod in CrashLoopBackoff
    - Fix Released and Tested

Sanity with stx-openstack (main branch):
- Last Successful Execution: (OVS) Mon Jun 19 12:39:12 UTC 2023
- Overall status: GREEN
  Sanity - Passed: 15 (100.00%) | Failed: 0 (0.0%)
  Regression - Passed: 14 (100.00%) | Failed: 0 (0.0%)
- Bugs Affecting Weekly Sanity/Regression:
  - No reproducible bugs currently open. Only intermittent issues:
    # LP 2012389: STX-Openstack: Failed to activate binding for port for live migration
    # LP 2007303: STX-Openstack: "nova live-migration" fails to live migrate after host is forcefully turned off/on

StarlingX 9.0 Release:
- Storyboard/Tasks for containerizing OpenStack clients: https://storyboard.openstack.org/#!/story/2010797 - In progress
- Storyboard/Tasks for decoupling platform and application manifests: https://storyboard.openstack.org/#!/story/2010797
  - Discussion on a mailing list thread; the community has no concerns with it and agrees with this plan: https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014205.html
- Storyboard/Tasks for OpenStack upversion (Antelope): https://storyboard.openstack.org/#!/story/2010715 - Planned

General Topics:
- STX OpenStack Distro Team: OpenInfra Summit - Vancouver 2023
  - Community interested in running OpenStack on K8S clusters: keep OpenStack services up to date (as much as we can)
  - How to make users' lives easier when they want to upversion the stx-openstack services release (e.g., Antelope -> Bobcat)
  - STX Social Media and Documentation: make stx-openstack more visible and understandable
- Remove Armada related code:
  - Should we also rename the repository? starlingx/openstack-armada-app
  - Discussion on a mailing list thread; the community has no concerns with it and agrees with this plan: https://lists.starlingx.io/pipermail/starlingx-discuss/2023-June/014215.html

--
Best Regards,
Thales Cervi
[Starlingx-discuss] Sanity Master Test LAYERED build ISO 20230618T060000Z
by Peng, Peng 20 Jun '23

Sanity Test from 2023 June 19 (https://mirror.starlingx.cengn.ca/mirror/starlingx/master/debian/monolithic…)
Status: GREEN

SX sanity
Passed: 17 (100.0%)
Failed: 0 (0.0%)
Total Executed: 17

List of Test Cases:
------------------------------------------------------
PASS test_system_health_pre_session[pods]
PASS test_system_health_pre_session[alarms]
PASS test_system_health_pre_session[system_apps]
PASS test_horizon_host_inventory_display
PASS test_lock_unlock_host
PASS test_pod_to_pod_connection
PASS test_pod_to_service_connection
PASS test_host_to_service_connection
PASS test_push_docker_image_to_local_registry_active
PASS test_upload_charts_via_helm_upload
PASS test_host_operations_with_custom_kubectl_app
PASS test_isolated_2p_2_big_pod_best_effort_HT_AIO
PASS test_sriovdp_netdev_single_pod[1-1-lock/unlock]
PASS test_sriovdp_netdev_connectivity_ipv4[1-1-calico-ipam]
PASS test_sriovdp_mixed_add_vf_interface[1]
PASS test_system_coredumps_and_crashes[core_dumps]
PASS test_system_coredumps_and_crashes[crash_reports]

DX sanity
Passed: 19 (100.0%)
Failed: 0 (0.0%)
Total Executed: 19

List of Test Cases:
------------------------------------------------------
PASS test_system_health_pre_session[pods]
PASS test_system_health_pre_session[alarms]
PASS test_system_health_pre_session[system_apps]
PASS test_horizon_host_inventory_display
PASS test_lock_unlock_host
PASS test_swact_controller_platform
PASS test_pod_to_pod_connection
PASS test_pod_to_service_connection
PASS test_host_to_service_connection
PASS test_push_docker_image_to_local_registry_active
PASS test_push_docker_image_to_local_registry_standby
PASS test_upload_charts_via_helm_upload
PASS test_host_operations_with_custom_kubectl_app
PASS test_force_reboot_host[active_controller-True]
PASS test_force_reboot_host[active_controller-False]
PASS test_force_reboot_host[standby_controller-False]
PASS test_bmc_verify_bm_type_ipmi
PASS test_system_coredumps_and_crashes[core_dumps]
PASS test_system_coredumps_and_crashes[crash_reports]

Regards,
PV team
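
The pass/fail percentages in these sanity reports are straightforward tallies (17 of 17 passed = 100.0%). A minimal, hypothetical Python sketch of that arithmetic over a result listing like the one above (not part of the PV team's tooling):

def summarize(results_text: str) -> str:
    # Count lines that start with PASS or FAIL and compute the percentages
    # used in the report summaries.
    statuses = [line.split(None, 1)[0] for line in results_text.splitlines() if line.strip()]
    passed = statuses.count("PASS")
    failed = statuses.count("FAIL")
    total = passed + failed
    return (f"Passed: {passed} ({100.0 * passed / total:.1f}%) "
            f"Failed: {failed} ({100.0 * failed / total:.1f}%) "
            f"Total Executed: {total}")

print(summarize("PASS test_lock_unlock_host\nPASS test_pod_to_pod_connection\nFAIL test_example_case"))
# Passed: 2 (66.7%) Failed: 1 (33.3%) Total Executed: 3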
[Starlingx-discuss] Application repositories renaming after Armada deprecation
by Cervi, Thales Elero 20 Jun '23

Hi everyone.

During our recent discussions on the stx-openstack application manifest separation, something that our colleague Robert Church pointed out is that the repository for this application still mentions "armada" in its name: starlingx/openstack-armada-app. The naming does not make sense anymore, especially after our last release: the apps are now FluxCD apps and not Armada apps anymore. As he pointed out, this is true for other application repositories too, so he suggested renaming them. The list follows (as per Bob's comments):

audit-armada-app => app-audit
cert-manager-armada-app => app-cert-manager
metrics-server-armada-app => app-metrics-server
nginx-ingress-controller-armada-app => app-nginx-ingress-controller
oidc-auth-armada-app => app-oidc-auth
openstack-armada-app => app-openstack
portieris-armada-app => app-portieris
ptp-notification-armada-app => app-ptp-notification
rook-ceph => app-rook
SDO-rv-service => app-sdo-rv
snmp-armada-app => app-snmp
vault-armada-app => app-vault
platform-armada-app => app-ceph-storage
  NOTE: Initial naming was based on the assumption that we would include all platform apps here, enabling helm charts as needed. This has proven to be incorrect as each app is its own repo. This contains only the bare-metal ceph application.
monitor-armada-app => app-monitor
  NOTE: Don’t think this is maintained anymore, nor is it part of stx-openstack. This might actually be able to be removed. Can rename at a minimum.

Our colleague Scott Little is already looking into the best way to proceed with renaming those, but I am sending this email so anyone in the community can be heard in case there are any concerns related to this repository renaming.

Looking forward to hearing back from you :)

Thales Cervi
[Starlingx-discuss] Sanity and Regression - StarlingX + STX-Openstack MASTER build [20230618T060000Z] results - Jun-18
by Calixto de Paula, Gabriel 20 Jun '23

Hi all,

StarlingX + STX-Openstack sanity and regression results:
Overall run status: GREEN

Build Info:
Build date: 18-Jun
ISO: https://mirror.starlingx.cengn.ca/mirror/starlingx/master/debian/monolithic…
Helm Charts: https://mirror.starlingx.cengn.ca/mirror/starlingx/master/debian/monolithic…

AIO-DX Baremetal with VSWITCH_TYPE=OVS

Sanity Results:
Overall Status: GREEN
Automated Test Results Summary:
------------------------------------------------------
Passed: 15 (100.00%)
Failed: 0 (0.00%)
Total Executed: 15

List of Test Cases:
------------------------------------------------------
PASS  20230620 04:00:12 test_ssh_to_hosts
PASS  20230620 04:01:36 test_lock_unlock_host
PASS  20230620 04:21:46 test_openstack_services_healthy
PASS  20230620 04:22:48 test_reapply_stx_openstack_no_change[controller-0]
PASS  20230620 04:24:35 test_reapply_stx_openstack_no_change[controller-1]
PASS  20230620 04:33:30 test_horizon_create_delete_instance
PASS  20230620 04:37:15 test_swact_controllers
PASS  20230620 04:44:44 test_ping_between_two_vms[tis-centos-guest-virtio-virtio]
PASS  20230620 04:51:29 test_migrate_vm[tis-centos-guest-live-None]
PASS  20230620 04:56:28 test_nova_actions[tis-centos-guest-dedicated-pause-unpause]
PASS  20230620 05:01:02 test_nova_actions[tis-centos-guest-dedicated-suspend-resume]
PASS  20230620 05:05:39 test_evacuate_vms
PASS  20230620 05:38:32 test_system_coredumps_and_crashes[core_dumps]
PASS  20230620 05:38:47 test_system_coredumps_and_crashes[crash_reports]
PASS  20230620 05:38:53 test_system_alarms
------------------------------------------------------

We'd like to point out that there is a Launchpad open for the TC test_reapply_stx_openstack_no_change[controller-1]: "STX-O | config-out-of-date alarm won't disappear on controller-1 after swact and reapplying app" <https://bugs.launchpad.net/starlingx/+bug/2023657>. This TC is passing on a re-run.

Regression Results:
Overall Status: GREEN
Automated Test Results Summary:
------------------------------------------------------
Passed: 12 (100.0%)
Failed: 0 (0.0%)
Total Executed: 12

List of Test Cases:
------------------------------------------------------
PASS  20230620 05:48:10 test_lldp_neighbor_remote_port
PASS  20230620 05:49:44 test_kernel_module_signatures
PASS  20230620 05:50:45 test_delete_heat_after_swact[OS_Cinder_Volume.yaml]
PASS  20230620 05:55:37 test_multiports_on_same_network_vm_actions[virtio_x4]
PASS  20230620 06:16:08 test_cpu_pol_vm_actions[2-dedicated-image-volume]
PASS  20230620 06:28:11 test_vm_mem_pool_default_config[2048]
PASS  20230620 06:31:07 test_vm_mem_pool_default_config[1048576]
PASS  20230620 06:36:52 test_resize_vm_positive[local_image-4_1_512-5_2_1024-image]
PASS  20230620 06:43:57 test_server_group_boot_vms[affinity-2]
PASS  20230620 06:49:18 test_server_group_boot_vms[anti_affinity-2]
PASS  20230620 06:54:55 test_vm_with_config_drive
PASS  20230620 07:21:58 test_lock_with_vms
------------------------------------------------------

regards,
STX-Openstack Distro Team