[Starlingx-discuss] build-pkgs cannot complete std build----aclocal: too many loops

Saul Wold sgw at linux.intel.com
Tue Oct 2 13:29:59 UTC 2018


Folks,

What's the status of the review of these PRs for stx-ceph? This is still
causing build breakage in the 2018.10 branch.

I think this will also require an update to the 2018.10 manifest.

Sau!


On 09/29/2018 06:47 AM, Sun, Austin wrote:
> Hi Cindy and All:
>      I have generated two PRs, one for master and another for r/2018.10. Please help check whether the PR process is correct.
> 
> -[0] https://github.com/starlingx-staging/stx-ceph/pull/2
> -[1] https://github.com/starlingx-staging/stx-ceph/pull/3
> 
> 
> Thanks.
> BR
> Austin Sun.
> 
> 
> -----Original Message-----
> From: Xie, Cindy
> Sent: Saturday, September 29, 2018 6:03 PM
> To: Sun, Austin <austin.sun at intel.com>; Saul Wold <sgw at linux.intel.com>; Scott Little <scott.little at windriver.com>; starlingx-discuss at lists.starlingx.io
> Subject: RE: [Starlingx-discuss] build-pkgs cannot complete std build----aclocal: too many loops
> 
> Austin,
> Thanks for the finding - can we generate pull requests for ax_require_defined.m4 against StarlingX-staging?
> 
> Thx. - cindy
> 
> -----Original Message-----
> From: Sun, Austin [mailto:austin.sun at intel.com]
> Sent: Saturday, September 29, 2018 4:41 PM
> To: Saul Wold <sgw at linux.intel.com>; Scott Little <scott.little at windriver.com>; starlingx-discuss at lists.starlingx.io
> Subject: Re: [Starlingx-discuss] build-pkgs cannot complete std build----aclocal: too many loops
> 
> Hi Saul, Scott and Erich:
> 
>      I hit the same issue in my local environment, so I studied the aclocal script and ran some tests.
> 
>      According to your analysis, autoconf-archive-2017.03.21-1.el7.noarch.rpm was added, which includes a lot of system m4 files used by aclocal. According to [0] and [1], the '--install' option copies system macros (.m4) into the package's local m4 folder.
> 
> From the error log:
> 
> BUILDSTDERR: aclocal: installing /usr/share/aclocal/ax_require_defined.m4 to m4/ax_require_defined.m4
> BUILDSTDERR: aclocal: installing 'm4/ax_require_defined.m4' from '/usr/share/aclocal/ax_require_defined.m4'
> BUILDSTDERR: aclocal: running: cp /usr/share/aclocal/ax_require_defined.m4 m4/ax_require_defined.m4
> BUILDSTDERR: aclocal: running aclocal anew, because some files were installed locally
> BUILDSTDERR: aclocal: error: too many loops
> 
> ax_require_defined.m4 causes this issue. If ax_require_defined.m4 is copied into the code base at stx/git/ceph/m4/, ceph builds successfully.
> 
> Since I cannot upload the change to starlingx-staging, please use the ax_require_defined.m4 file from [2], which is the same as the file in the autoconf-archive-2017.03.21-1.el7.noarch.rpm package.
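> 
> For example, something along these lines should work from the top of a
> stx-ceph checkout (the fetch command is only an illustration; grabbing the
> file from the installed autoconf-archive rpm works just as well):
> 
>     # copy the upstream macro into ceph's local m4 directory and stage it
>     curl -o m4/ax_require_defined.m4 \
>         'http://git.savannah.gnu.org/gitweb/?p=autoconf-archive.git;a=blob_plain;f=m4/ax_require_defined.m4'
>     git add m4/ax_require_defined.m4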
> 
> - About why ax_require_defined.m4 is required for the ceph build:
>      ax_cxx_compile_stdcxx_11.m4 is upgraded from the local serial 4 to the system serial 18 shipped in the autoconf-archive rpm, and that version uses AX_REQUIRE_DEFINED, which is defined in ax_require_defined.m4.
>      I have also tried force-upgrading the local m4/ax_cxx_compile_stdcxx_11.m4 to serial 19, and that solves this issue too.
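> 
> A quick way to see the serial numbers aclocal is comparing (a sketch; the
> paths assume the ceph source tree and a build root with autoconf-archive
> installed):
> 
>     # local copy shipped with ceph vs. system copy from autoconf-archive
>     grep -i 'serial' m4/ax_cxx_compile_stdcxx_11.m4 | head -1
>     grep -i 'serial' /usr/share/aclocal/ax_cxx_compile_stdcxx_11.m4 | head -1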
> 
> -[0] https://www.gnu.org/software/automake/manual/html_node/Serials.html#Serials
> -[1] https://www.gnu.org/software/automake/manual/html_node/aclocal-Invocation.html#aclocal-Invocation
> -[2] http://git.savannah.gnu.org/gitweb/?p=autoconf-archive.git;a=blob_plain;f=m4/ax_require_defined.m4
> 
> Thanks.
> BR
> Austin Sun.
> 
> -----Original Message-----
> From: Saul Wold [mailto:sgw at linux.intel.com]
> Sent: Saturday, September 29, 2018 8:20 AM
> To: Scott Little <scott.little at windriver.com>; starlingx-discuss at lists.starlingx.io
> Subject: Re: [Starlingx-discuss] build-pkgs cannot complete std build
> 
> 
> 
> On 09/28/2018 01:39 PM, Scott Little wrote:
>> Ok, we've seen 3 ceph failures in our last 6 builds.
>>
>> The common factor:  tpm2-tools builds on 'b0' before ceph builds.
>>
>> Our theory: the BuildRequires of tpm2-tools causes autoconf-archive
>> to be installed... which installs a bunch of .m4 files in
>> /usr/share/aclocal ... which causes ceph grief when it calls aclocal.
>>
>> I don't really know automake or aclocal all that well.  I'm assuming
>> /usr/share/aclocal is acting something like a cache, but it's a cache
>> whose contents are incompatible with ceph.
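>>
>> For example, just as a sanity check (assuming autoconf-archive is
>> installed in the build root), something like:
>>
>>     rpm -ql autoconf-archive | grep '/usr/share/aclocal' | head
>>
>> should list the .m4 files it drops into that directory.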
>>
>> Do we have any autotools / aclocal / m4 experts in the house?
>>
>> Possible fixes:
>> - ceph: can we tell it not to use the aclocal cache, either explicitly
>> (a flag to aclocal?) or implicitly (update ceph's m4 files so they
>> look 'newer' than the cache)?
> 
> Not sure about that; I would have to dig deeper into aclocal, and it's been a while since I last did that.
> 
>> - tpm2-tools: Can we remove the dependence on autoconf-archive? No
>> other package we build seems to need it.
>>
> A quick scan shows that the autoconf-archive dependency was put in there for Travis support, and it went away upstream this past March when they converted to using a container for Travis.  If we could use a newer version of tpm2-tools, that might solve this.
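> 
> As a quick check, something like the following against the tpm2-tools spec
> file in our build tree should confirm whether the dependency is still
> declared (the spec path below is just a placeholder):
> 
>     grep -i 'BuildRequires.*autoconf-archive' path/to/tpm2-tools.spec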
> 
> Maybe Erich's solution can work.
> 
> Sau!
> 
> 
>> Scott
>>
>>
>>
>> On 18-09-27 04:45 PM, Saul Wold wrote:
>>>
>>> And of course it worked the third time!
>>>
>>> So, I lost the good logs.
>>>
>>> Sau!
>>>
>>>
>>> On 09/27/2018 12:56 PM, Scott Little wrote:
>>>> On 18-09-27 03:53 PM, Scott Little wrote:
>>>>> Our latest build, based on code synced at 2018-09-27T15:28:00  UTC,
>>>>> built successfully.
>>>>>
>>>>> It took three attempts to get ceph built.  The first two passes
>>>>> aborted quickly due to missing packages.  The final pass did not
>>>>> exhibit the 'aclocal: too many loops' issue.
>>>>>
>>>>> The only build I have that exhibited the too many loops error was a
>>>>> snapshot on 2018-09-20T15:50:40 UTC
>>>>>
>>>>> I do have a designer with an older snapshot that seems to hit it
>>>>> regularly, so I'll work with him and see if we can learn more.
>>>>>
>>>>> I think we need more data from the community:
>>>>> - Whose build is failing on ceph with 'aclocal: too many loops'?
>>>>> - Who is building successfully?
>>>>> - Who can build only intermittently?
>>>>>
>>>>>
>>>>>
>>>>> Info to collect for failed builds:
>>>>> - repo sync timestamp
>>>>> - build command used?
>>>>> - Was it a new workspace, a cleaned workspace, or a previously used
>>>>> workspace?
>>>>> - $MY_WORKSPACE/CONTEXT
>>>>> - $MY_WORKSPACE/build-std.log
>>>>> - $MY_WORKSPACE/std/results/*/ceph-*/*.log
>>>>>
>>>>> For successful builds, same info. Rather than full build logs, I
>>>>> can settle for:
>>>>> - grep '\(Success building\|iteration\|building ceph\)' $MY_WORKSPACE/build-std.log
>>>>> - grep compute_resources: build-std.log
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On 18-09-27 02:21 PM, Saul Wold wrote:
>>>>>> On 09/26/2018 09:16 AM, Scott Little wrote:
>>>>>>> aclocal 'too many loops' has been popping up sporadically for a
>>>>>>> week or two now.  Possibly 7.5 related.
>>>>>>>
>>>>>>> I suspect that there is a build order and/or race condition
>>>>>>> element to this.   It often goes away if you just run build-pkgs
>>>>>>> a second time.
>>>>>>>
>>>>>> I am seeing this failure also, but it does not go away after a
>>>>>> second rebuild.  I have the latest stx-root (build-tools) with
>>>>>> the recent patches.
>>>>>>
>>>>>> Is this directly related to the fuzz issue, or is there something
>>>>>> else we need to address in CEPH itself?
>>>>>>
>>>>>> This is blocking my local build.
>>>>>>
>>>>>> Sau!
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>
> 
> _______________________________________________
> Starlingx-discuss mailing list
> Starlingx-discuss at lists.starlingx.io
> http://lists.starlingx.io/cgi-bin/mailman/listinfo/starlingx-discuss
> 


