Task #10954

There is no limit to the access log file number

Added by Tommaso Piccioli over 7 years ago. Updated over 6 years ago.

Status: Closed
Priority: Low
Category: Application
Target version:
Start date: Jan 17, 2018
Due date:
% Done: 100%
Estimated time:
Infrastructure: Pre-Production

Description

It seems that there is no limit on the number of access.log files kept after they are rotated (once a day).

I noticed this on gcube@collector-pre.d4science.org and gcube@registry-pre.d4science.org.
The problem is probably present on other servers too, in pre-prod, dev, and prod.


Files

Screen Shot 2018-01-18 at 17.53.21.png (481 KB), Pasquale Pagano, Jan 18, 2018 05:53 PM

Related issues

Related to D4Science Infrastructure - Task #11522: Provide a cronjob for removing old access.log files (Closed, Roberto Cirillo, Mar 26, 2018)

Actions #1

Updated by Roberto Cirillo over 7 years ago

  • Tracker changed from Support to Task
  • Status changed from New to In Progress
  • Priority changed from Normal to Low

I guess this problem is present in all gCore-based containers.
I've checked the following nodes and the issue is present on all of them:

node66.p.d4science
node24.p.d4science
collector-pre
registry-pre

The configuration is the following, so we should have at most 30 access.log files per container:

#GCUBEHandler logger 
log4j.logger.org.gcube.common.handlers=TRACE,ACCESS
log4j.appender.ACCESS=org.apache.log4j.DailyRollingFileAppender
log4j.appender.ACCESS.file=${GLOBUS_LOCATION}/logs/access.log
log4j.appender.ACCESS.DatePattern='.'yyyy-MM-dd
log4j.appender.ACCESS.layout=org.apache.log4j.PatternLayout
log4j.appender.ACCESS.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} [%t,%M:%L] %m%n
log4j.appender.ACCESS.threshold=INFO
log4j.appender.ACCESS.MaxBackupIndex=30
log4j.appender.ACCESS.MaxFileSize=10000KB

I'm going to investigate this.

Actions #2

Updated by Roberto Cirillo over 7 years ago

  • Status changed from In Progress to Feedback
  • Assignee changed from Roberto Cirillo to Tommaso Piccioli
  • % Done changed from 0 to 100

The MaxBackupIndex property is set on the appender, but it has no effect: the current version of Log4j (Apache log4j 1.2.6) does not provide any mechanism to delete old log files when DailyRollingFileAppender is used.
So I guess we can provide a quick solution by installing a cronjob that removes the old files, deployed on every gCore-based node. What do you think about it?
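The proposed cronjob could look roughly like this (a minimal sketch, not the script actually deployed; the 30-day retention mirrors the MaxBackupIndex value in the configuration above, and GLOBUS_LOCATION is assumed to point at the container home as in the appender's file path):

```shell
#!/bin/sh
# Hypothetical cleanup sketch: remove rotated access.log files older than 30 days.
# The live access.log is left alone, because only rotated copies carry a date
# suffix (access.log.yyyy-MM-dd, per the DatePattern in the log4j config).
LOG_DIR="${GLOBUS_LOCATION:-$HOME}/logs"
if [ -d "$LOG_DIR" ]; then
    find "$LOG_DIR" -name 'access.log.*' -type f -mtime +30 -delete
fi
```

Installed from a daily crontab entry on each gCore node, this would cap the backlog at roughly 30 rotated files without touching the container itself.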

Actions #3

Updated by Tommaso Piccioli over 7 years ago

I would prefer a good solution rather than a quick one.
Could changing the log4j version help?

Actions #4

Updated by Roberto Cirillo over 7 years ago

Tommaso Piccioli wrote:

I would prefer a good solution rather than a quick one.
Could changing the log4j version help?

It is not so easy to do. If I'm not wrong, we would have to migrate to log4j2 and change the appender, but I don't know whether that is feasible on gCore-based containers. @lucio.lelii@isti.cnr.it, @pasquale.pagano@isti.cnr.it, what do you think?
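For reference, had the migration been pursued, the equivalent appender in log4j2's properties format could look roughly like this (an untested sketch, not an actual gCore configuration; log4j2's Delete action, available since 2.5, provides the age-based cleanup that DailyRollingFileAppender lacks):

```properties
# Hypothetical log4j2 equivalent of the ACCESS appender above (sketch only)
appender.access.type = RollingFile
appender.access.name = ACCESS
appender.access.fileName = ${sys:GLOBUS_LOCATION}/logs/access.log
appender.access.filePattern = ${sys:GLOBUS_LOCATION}/logs/access.log.%d{yyyy-MM-dd}
appender.access.layout.type = PatternLayout
appender.access.layout.pattern = %d{ISO8601} %-5p %c{2} [%t,%M:%L] %m%n
appender.access.policies.type = Policies
appender.access.policies.time.type = TimeBasedTriggeringPolicy
appender.access.strategy.type = DefaultRolloverStrategy
appender.access.strategy.delete.type = Delete
appender.access.strategy.delete.basePath = ${sys:GLOBUS_LOCATION}/logs
appender.access.strategy.delete.ifLastModified.type = IfLastModified
appender.access.strategy.delete.ifLastModified.age = 30d

logger.access.name = org.gcube.common.handlers
logger.access.level = trace
logger.access.appenderRef.access.ref = ACCESS
```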

Actions #5

Updated by Pasquale Pagano over 7 years ago

The attachment should list the servers running gCore. It is a small number of servers, and they should be decommissioned in the first half of 2018. I would implement the quickest and cheapest solution for those servers.

Actions #6

Updated by Lucio Lelii over 7 years ago

I agree with Lino and Roberto: the cheapest solution is the best, since the gCore-based services will be decommissioned in a short time.

Actions #7

Updated by Roberto Cirillo about 7 years ago

  • Related to Task #11522: Provide a cronjob for removing old access.log files added
Actions #8

Updated by Roberto Cirillo over 6 years ago

  • Status changed from Feedback to Closed