Task #9703
Closed
Wrong software and data distribution on node75.d4science.org
100%
Description
Two kinds of problems on this host:
- more than one copy of the occopus software directory:
/home/gcube/nodes/occopus /home/gcube/occopus /home/gcube/tomcat/logs/occopus /home/gcube/tomcat/occopus
- more than 32000 data files in /home/gcube/tomcat
There is a third potential problem: the occopus software seems to use an embedded version of Python, and this is not welcome at all.
Please fix and clean up the first two problems, and let us know the reasons for the third.
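For the record, the two reported problems can be audited from a shell on the node with standard `find` invocations (paths taken from the ticket; this is a read-only sketch, it changes nothing):

```shell
#!/bin/sh
# Audit sketch for the two reported problems (paths from the ticket).

# 1. Spot the duplicated occopus software directories under the gcube home.
find /home/gcube -type d -name occopus 2>/dev/null

# 2. Count the data files directly under the tomcat directory (reported: >32000).
find /home/gcube/tomcat -maxdepth 1 -type f 2>/dev/null | wc -l
```

The same two commands, pointed at the cleaned node, also serve to verify the fix (the first should print nothing, the second should print 0).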
Updated by Nunzio Andrea Galante over 7 years ago
- Status changed from New to In Progress
- % Done changed from 0 to 10
Updated by Nunzio Andrea Galante over 7 years ago
- % Done changed from 10 to 50
First, let me say that we need to upgrade that node to Java 8; otherwise we have to compile the entire set of dependencies of this component under Java 7.
In the past we used the Occopus framework in the context of the EGI project, and yes, it needs its own Python to work correctly. Anyway, since the Occopus features of interest to us (e.g. the OCCI support) are/were already covered by the FHNManager Service, I have uninstalled Occopus from the node, and a new version of the service will be deployed by tomorrow. This has no impact on the correct execution of the FHNManager Service, and the minor changes made will be totally transparent to the portlet using the service.
The only difference is that you will no longer find any Occopus directories, and the /home/gcube/tomcat directory will be free of data files (about 32000 files have actually been cleaned).
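The exact cleanup command is not recorded in the ticket; a minimal sketch of that kind of cleanup, assuming the stale data files sat directly under the tomcat directory, would be:

```shell
#!/bin/sh
# Hypothetical cleanup sketch: delete regular files directly under the
# tomcat directory, leaving its subdirectories (webapps, logs, ...) intact.
if [ -d /home/gcube/tomcat ]; then
  find /home/gcube/tomcat -maxdepth 1 -type f -delete
fi
```

`-maxdepth 1` is the safety net here: it keeps the deletion from recursing into webapps or logs.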
Updated by Andrea Dell'Amico over 7 years ago
Nunzio Andrea Galante wrote:
First, let me say that we need to upgrade that node to Java 8; otherwise we have to compile the entire set of dependencies of this component under Java 7.
Yes, that should have already happened.
In the past we used the Occopus framework in the context of the EGI project, and yes, it needs its own Python to work correctly. Anyway, since the Occopus features of interest to us (e.g. the OCCI support) are/were already covered by the FHNManager Service, I have uninstalled Occopus from the node, and a new version of the service will be deployed by tomorrow. This has no impact on the correct execution of the FHNManager Service, and the minor changes made will be totally transparent to the portlet using the service.
The only difference is that you will no longer find any Occopus directories, and the /home/gcube/tomcat directory will be free of data files (about 32000 files have actually been cleaned).
I noticed that the voms/occi packages were all manually installed. Since all the components must be deployed by the provisioning system, that part should be added to the playbook.
We already have roles that install those; you can find them in the ansible playbooks repository under library/roles/egi. The defaults should be replaced with the correct values.
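Wiring those roles into the provisioning playbook would look roughly like the fragment below (the host group and everything under it are placeholders; the real roles live under library/roles/egi in the ansible playbooks repository, and their defaults must be overridden with the correct values):

```yaml
# Hypothetical playbook fragment: host group and variable names are
# placeholders, not taken from the actual playbooks repository.
- hosts: fhn_manager_nodes
  become: yes
  roles:
    - role: library/roles/egi
      # Override the role defaults here with the correct values for this
      # node, e.g. the voms/occi packages that were installed manually.
```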
Updated by Nunzio Andrea Galante over 7 years ago
Please refer to the fhn-manager-service-1.2.3-4.7.0-153229 component, available in the repository, for the new Staging version to deploy.
Updated by Nunzio Andrea Galante over 7 years ago
- % Done changed from 50 to 100
Please refer to the fhn-manager-service-1.2.3-4.7.0-153229 component, available in the repository, for the new Staging version to deploy.
Updated by Andrea Dell'Amico over 7 years ago
@tommaso.piccioli@isti.cnr.it noted yesterday that the running distribution is Ubuntu 12.04. That VM must be recreated as 14.04, because 12.04 is not compatible with Java 8.
Updated by Roberto Cirillo over 7 years ago
- Tracker changed from Incident to Task
I'm going to change this tracker to Task, since the problem is solved, and open a subtask for creating a new VM to deploy the new fhn-manager version.
Updated by Nunzio Andrea Galante over 7 years ago
- Due date set to Sep 29, 2017
- Start date changed from Sep 20, 2017 to Sep 29, 2017
due to changes in a related task
Updated by Tommaso Piccioli over 7 years ago
Could you please clarify the situation of this server?
Updated by Andrea Dell'Amico over 7 years ago
You can destroy it. There's a newer one: fhn-manager-t.pre.d4science.org.
Updated by Tommaso Piccioli over 7 years ago
I suspect that there is something different still running on node75...
Updated by Tommaso Piccioli over 7 years ago
Could you please tell me what this node is doing?
If there is no answer, I will shut it down in 24 hours.
Updated by Tommaso Piccioli over 7 years ago
- Status changed from In Progress to Closed