Bamboo BuildLogger.getLastNLogEntries & getLogEntryCount limited to 100 lines

Hello

We are investigating a customer issue and noticed that the root cause is that BuildLogger.getLastNLogEntries and BuildLogger.getLogEntryCount are limited to 100 entries.
So our app, which analyzes build logs, doesn’t see the full log.

I’ve checked the docs and no such limit is mentioned: BuildLogger (Atlassian Bamboo 9.6.2 API)

This API feels broken. Its names promise a count of log entries and a way to get log entries, but that is not what the API does: it only gives you a tail.
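
For illustration, roughly what our analysis code does (a simplified sketch; the two BuildLogger methods are the real API, the surrounding class and the ~100-entry cap are from our own observations):

    import com.atlassian.bamboo.build.logger.BuildLogger;
    import com.atlassian.bamboo.build.logger.LogEntry;

    import java.util.List;

    public class BuildLogAnalyser {

        // Ask the logger for "all" entries of the build.
        // In practice both calls are silently capped at around 100 entries,
        // so the result only ever contains the tail of the log.
        public List<LogEntry> readAllEntries(BuildLogger buildLogger) {
            int count = buildLogger.getLogEntryCount();   // never larger than the cap
            return buildLogger.getLastNLogEntries(count);
        }
    }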

I think the docs should be updated. I’d even consider deprecating these methods, as they are a sure way to shoot yourself in the foot.

So, my question for now is: what is the best way to read all the logs of a build?

  • Update: There seems to be a ‘LogMemorisingInterceptor’. But it’s kind of silly for the app to cache everything in memory again. We would prefer to read the logs directly: Bamboo already keeps them somewhere.

cc: @MartynaWojtas

Thanks.


I don’t really know the architecture of Bamboo. I need two clarifications about your question:

  1. Are all the different build logs put into one single file? (I don’t think they are.)
  2. Just to confirm, the default log file for Bamboo is not present in the <bamboo_home>/log directory?

@Saurabh.Vamhi I’m looking for the logs of a specific build and accessing them from the Bamboo Java API.

The regular logs are present in <bamboo_home>/log. But those are the overall instance logs, not necessarily the build job logs, right?

Hi @RomanStoffel! These two system properties haven’t been changed recently - the limit has been there from the beginning according to the product dev team, for the purpose of limiting performance issues with logs. We could prepare a patch for these two properties for you. I still wonder why the problem hasn’t appeared before, as I assume it’s not a new app? cc: @pskierczynski

It’s a newer app, which worked in some cases but didn’t work in others. The reason was that it didn’t see all the logs.

We use the LogMemorisingInterceptor in other apps. We will probably fix it with that if there is no other suggestion.

I just want to let you know that this API invites bugs in apps, as the API names suggest that they give you the full logs and the documentation does not mention any limits.

I see getLastNLogEntries and getLogEntryCount => I expect to get a count and the log lines I requested. Even when I carefully double-check the docs, nothing implies any limit.

My (unrealistic) hope:

  1. Remove the limit? But that might then cause issues with apps which relied on this limit.

Therefore:

  1. Document that limitation in the JavaDocs.
  2. Deprecate the API, giving a hint that it doesn’t act as you would expect.
  3. Maybe: add better-named alternatives? (See the sketch below.)
    • getLogEntryCount => can be dropped. If it’s limited to 100, I’m not that interested in a count.
    • getLogTailEntries => (better names welcome). Tell me that this doesn’t just get log lines; it has limits.
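
To make the last point concrete, a rough sketch of what I have in mind (purely my suggestion, not an existing Bamboo interface):

    import com.atlassian.bamboo.build.logger.LogEntry;

    import java.util.List;

    // Suggested shape only - not part of the Bamboo API today.
    public interface TailAwareBuildLogger {

        /**
         * Returns at most maxEntries entries from the END of the build log.
         * The logger only retains a bounded tail (currently 100 entries),
         * so this can never return the full log.
         */
        List<LogEntry> getLogTailEntries(int maxEntries);
    }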

Anyway, for now: is there any better API than the LogMemorisingInterceptor to get the logs of a build in an app? I would prefer to avoid buffering log lines once more if there is a good alternative API.


Hi @RomanStoffel,
Thank you for letting us know about this problem. I agree it’s confusing and the documentation will be updated. As @MartynaWojtas mentioned, if you need urgent support we can provide you with a patch to increase both limits (for logs and error logs) via a system property; however, storing a larger amount of logs can negatively impact the performance of your Bamboo instance. That is why we didn’t introduce such customisation so far, and I’m not sure if we will consider it for the next Bamboo versions. I can’t think of an alternative API, but if the LogMemorisingInterceptor solves your issue in some way, I would recommend using it instead.


Ok, I’m trying to use the LogMemorisingInterceptor.

The difficulty I have is that I need to add it before the build starts and then have a reference to it when the build completes. There seems to be no good way to do that?
I’m basically missing some ‘build context’ where I can put the LogMemorisingInterceptor.

My current hacky workaround is this (rough sketch below):

  • Have a CustomPreBuildAction; it creates the LogMemorisingInterceptor for that build.
    • Fishy workaround: it puts that LogMemorisingInterceptor into a thread local.
  • Have my CustomBuildProcessor get that instance of the LogMemorisingInterceptor out of the thread local.

This reliance on a thread local is very fishy and relies on implementation details. What I’m missing is a way to ‘add’ the LogMemorisingInterceptor before the build starts and get it back when the CustomBuildProcessor is called.
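
Roughly, the hand-off looks like this (heavily simplified; the holder class is my own, and the LogMemorisingInterceptor constructor and interceptor-stack call are from memory, so they may differ in your Bamboo version):

    import com.atlassian.bamboo.build.logger.BuildLogger;
    import com.atlassian.bamboo.build.logger.interceptors.LogMemorisingInterceptor;

    // My own helper, not a Bamboo class: smuggles the interceptor from the
    // CustomPreBuildAction over to the CustomBuildProcessor via a thread local.
    public final class LogInterceptorHolder {

        private static final ThreadLocal<LogMemorisingInterceptor> CURRENT = new ThreadLocal<>();

        // Called from the CustomPreBuildAction, before any task has run.
        // The constructor argument (max lines to memorise) is an assumption.
        public static void attach(BuildLogger buildLogger, int maxLines) {
            LogMemorisingInterceptor interceptor = new LogMemorisingInterceptor(maxLines);
            buildLogger.getInterceptorStack().add(interceptor);
            CURRENT.set(interceptor);
        }

        // Called from the CustomBuildProcessor once the build has finished.
        public static LogMemorisingInterceptor detach() {
            LogMemorisingInterceptor interceptor = CURRENT.get();
            CURRENT.remove();
            return interceptor;
        }
    }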

Update 1: Hmm, this seems not to work if there is more than one task. The LogMemorisingInterceptor only receives logs for the 1st task.

Update 2: Ok, the LogInterceptorStack is cleared when the next task runs. So that doesn’t work, as it removes the added LogMemorisingInterceptor. Is there a way / callback for each task? Or some other way to add a LogMemorisingInterceptor for the complete job?

@MartynaWojtas @WiolettaDys Ok, I’m stuck on the LogInterceptorStack being reset between tasks. Is there any CustomPreTaskAction or a way to keep a LogMemorisingInterceptor present throughout a job / build?

Update 3: Another thing I tried is using BuildLogFileAccessorFactory.createBuildLogFileAccessor, which points to the right log file. However, at CustomBuildProcessor time the file isn’t filled with the logs yet. It looks like that log file is flushed later on, so it doesn’t contain the logs of the build at that point.


Ok, my current solution, which probably works but is hacky, is this:

  1. Use BuildLogFileAccessorFactory.createBuildLogFileAccessor to get to the log. It is backed by the actual log file.
  2. However, these logs are flushed only periodically, so the log file might not be up to date. Therefore, flush it.
  3. BuildLoggerManager.getLogger() does return the current logger, but it has no flush method.
  4. However, in practice this logger is backed by an implementation which does have a flush method. Therefore, use reflection to call it.
  5. Now the file is up to date for BuildLogFileAccessorFactory.createBuildLogFileAccessor.

Pseudo code, no error handling etc.:

    // Flush the in-memory logger so the log file on disk is up to date.
    val buildLogger = buildLoggerManager.getLogger(buildContext.getPlanResultKey)
    val flush = buildLogger.getClass.getMethod("flush") // flush() is only on the implementation, not on the BuildLogger interface
    flush.invoke(buildLogger)
    val logLines = KeepLogsPretask.retrieveLogsClearMemorizer(buildLogger) // our own helper; retrieves and clears the memorised logs

    // The file behind the accessor now contains the flushed lines.
    val fullLog = buildLogFileAccessorFactory.createBuildLogFileAccessor(buildContext.getPlanResultKey)
    fullLog.openFileForIteration() // Looks most efficient; the other methods tend to load everything into memory.
    // iterate over it

Well, I still have to check whether that works on build agents and in the latest Bamboo versions. But it seems the best option so far.


Hi @RomanStoffel,
Maybe there is a less “hacky” way to fetch the logs you need. As you mentioned, <bamboo_home>/logs contains the overall logs of the instance, but the specific logs for each job are located, depending on your Bamboo version, in <bamboo_shared_home>/builds (for DC) or <bamboo_home>/xml-data/builds (for a Server licence). The directory pattern is the following: “plan-<plan_id>-<job_key>” - once you have found the desired job directory, you can find the logs for a specific build in /download-data/build_logs.
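
For example, something along these lines (only a sketch following the layout described above; the exact file names inside build_logs may differ between Bamboo versions):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    public class BuildLogFiles {

        // buildsDir is <bamboo_shared_home>/builds (DC) or <bamboo_home>/xml-data/builds (Server).
        // Lists the per-build log files of one job, following the
        // "plan-<plan_id>-<job_key>" directory pattern described above.
        public static void listBuildLogs(Path buildsDir, long planId, String jobKey) throws IOException {
            Path buildLogsDir = buildsDir
                    .resolve("plan-" + planId + "-" + jobKey)
                    .resolve("download-data")
                    .resolve("build_logs");

            try (Stream<Path> logFiles = Files.list(buildLogsDir)) {
                logFiles.forEach(System.out::println); // the log files of the individual build results
            }
        }
    }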

Hello

So, I finally found a satisfying solution:

  1. There is StorageLocationService.getLogFile, which returns the log file, so I can parse it directly. There is also the BuildLogFileAccessorFactory, which already returns a higher-level facility. However, parsing the file directly is better, as I can ensure that it isn’t loaded into memory by accident.
  2. I had to move the processing from the Agent to the Server =( (CustomBuildProcessor → CustomBuildProcessorServer). This is for two reasons:
    • On the Agent the StorageLocationService still gives you the final location, not the ‘spool’ location. I could calculate that myself.
    • On the Agent the logs are not yet flushed, and there is no API to flush them. So the spool file is missing entries. I would need very, very deep reflection hacks to flush it. That isn’t great.

So, the solution is working and I’m mostly happy with it. The only bad thing is that the processing needs to be done on the Server. I would prefer to run it on the Agent, so that it distributes the load.

So, the code is roughly this:

    // Injected Bamboo component:
    StorageLocationService storageLocationService = ...

    Path fullLog = storageLocationService.getLogFile(buildContext.getPlanResultKey()).toPath();
    try (Stream<String> lines = Files.lines(fullLog)) {
        // stream-process the lines
    }
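
For context, this is roughly how it sits inside the server-side processor (a sketch only; the class name and injection style are mine, and interface details may differ slightly between Bamboo versions):

    import com.atlassian.bamboo.build.CustomBuildProcessorServer;
    import com.atlassian.bamboo.storage.StorageLocationService;
    import com.atlassian.bamboo.v2.build.BuildContext;

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    public class LogAnalysingBuildProcessor implements CustomBuildProcessorServer {

        private final StorageLocationService storageLocationService; // injected
        private BuildContext buildContext;

        public LogAnalysingBuildProcessor(StorageLocationService storageLocationService) {
            this.storageLocationService = storageLocationService;
        }

        @Override
        public void init(BuildContext buildContext) {
            this.buildContext = buildContext;
        }

        @Override
        public BuildContext call() {
            Path fullLog = storageLocationService.getLogFile(buildContext.getPlanResultKey()).toPath();
            try (Stream<String> lines = Files.lines(fullLog)) {
                lines.forEach(line -> {
                    // stream-process each log line here
                });
            } catch (IOException e) {
                // don't fail the build result just because log analysis failed
            }
            return buildContext;
        }
    }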


Ok, for our app the BuildLogFileAccessorFactory solution also doesn’t work, because we will also analyze part of the working directory, based on results from the logs.
And in the CustomBuildProcessorServer we do not have the working directory, as we are not running on the build agent.

So, overall, I think there is no solution for our case at the moment. We need the logs on the Bamboo agent side, and there:

  • The spool file is missing the last lines, as it is flushed lazily, and there is no way to flush it.
  • The BuildLogger has the 100-line limit.

We will probably analyze both and hope that, overall, we get all the log lines. Of course, it isn’t guaranteed to work, but it is probably the best we can do at the moment.

Update:
Analyzing both the BuildLogger and the spool file doesn’t seem to work quite right.
So, the current workaround seems to be this (sketch below):

  • Write a ridiculously long log line into the logs. This provokes a flush of the previous log lines.
  • Then analyze the log file.
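
A minimal sketch of that workaround (addBuildLogEntry is a real BuildLogger method; the helper class and the padding size are our own guesses):

    import com.atlassian.bamboo.build.logger.BuildLogger;

    // Workaround: push one oversized entry through the logger so that the
    // previous, real log lines get flushed to the spool file before we read it.
    public final class LogFlushKludge {

        public static void provokeFlush(BuildLogger buildLogger) {
            String padding = " ".repeat(64 * 1024); // "ridiculously long"; the size actually needed is a guess
            buildLogger.addBuildLogEntry("log-analyser flush marker" + padding);
            // ...then read the build's log file as in the earlier snippets.
        }
    }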