
Getting error : GC overhead limit exceeded while doing manual backup in confluence

saravanan subramanian February 13, 2013

Hi Team,

I am getting the error below, either "GC overhead limit exceeded" or "Java heap space", whenever I run a manual backup in Confluence. My current JVM memory settings in setenv.sh are:

JAVA_OPTS="-Xms1769m -Xmx1769m -XX:MaxPermSize=640m $JAVA_OPTS -Djava.awt.headless=true "

Because of this error, the scheduled backup fails as well. I have tried increasing MaxPermSize by an additional 128m and restarting the server. That works only the first time after the restart; the next day both the manual and the scheduled backup fail again with the error below, and they only work again after another server restart. Could you please advise on a permanent solution for this manual backup error? Any help as soon as possible would be appreciated.
Thanks,
Saravanan Subramanian.

Error Logs:

javax.servlet.ServletException: Servlet execution threw an exception

at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:313)

Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at net.sf.hibernate.type.LongType.get(LongType.java:21)

Stack Trace:

javax.servlet.ServletException: Servlet execution threw an exception
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:313)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
	at com.atlassian.plugin.servlet.filter.IteratingFilterChain.doFilter(IteratingFilterChain.java:46)
	at com.atlassian.plugin.servlet.filter.DelegatingPluginFilter$1.doFilter(DelegatingPluginFilter.java:66)
	at com.atlassian.applinks.core.rest.context.ContextFilter.doFilter(ContextFilter.java:25)
	at com.atlassian.plugin.servlet.filter.DelegatingPluginFilter.doFilter(DelegatingPluginFilter.java:74)
	at com.atlassian.plugin.servlet.filter.IteratingFilterChain.doFilter(IteratingFilterChain.java:42)
	at com.atlassian.plugin.servlet.filter.ServletFilterModuleContainerFilter.doFilter(ServletFilterModuleContainerFilter.java:77)
	at com.atlassian.plugin.servlet.filter.ServletFilterModuleContainerFilter.doFilter(ServletFilterModuleContainerFilter.java:63)
	at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
	at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
	at net.sf.hibernate.type.LongType.get(LongType.java:21)
	at net.sf.hibernate.type.NullableType.nullSafeGet(NullableType.java:62)
	at net.sf.hibernate.type.NullableType.nullSafeGet(NullableType.java:53)
	at net.sf.hibernate.type.ManyToOneType.hydrate(ManyToOneType.java:61)
	at net.sf.hibernate.loader.Loader.hydrate(Loader.java:690)
	at net.sf.hibernate.loader.Loader.loadFromResultSet(Loader.java:631)
	at net.sf.hibernate.loader.Loader.instanceNotYetLoaded(Loader.java:590)
	at net.sf.hibernate.loader.Loader.getRow(Loader.java:505)
	at net.sf.hibernate.loader.Loader.getRowFromResultSet(Loader.java:218)
	at net.sf.hibernate.loader.Loader.doQuery(Loader.java:285)
	at net.sf.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:138)
	at net.sf.hibernate.loader.Loader.doList(Loader.java:1063)
	at net.sf.hibernate.loader.Loader.list(Loader.java:1054)
	at net.sf.hibernate.hql.QueryTranslator.list(QueryTranslator.java:854)
	at net.sf.hibernate.impl.SessionImpl.find(SessionImpl.java:1556)
	at net.sf.hibernate.impl.SessionImpl.find(SessionImpl.java:1533)
	at net.sf.hibernate.impl.SessionImpl.find(SessionImpl.java:1525)
	at sun.reflect.GeneratedMethodAccessor682.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.springframework.orm.hibernate.HibernateTemplate$CloseSuppressingInvocationHandler.invoke(HibernateTemplate.java:1123)
	at $Proxy99.find(Unknown Source)

2 answers

0 votes
Septa Cahyadiputra
Rising Star
February 13, 2013

Hi Saravanan,

First and foremost, please understand that the Confluence XML backup and restore feature is meant for evaluation purposes and is known to be unreliable for large instances. It also requires significant resources from Confluence to generate the backup every day.

We highly recommend that you follow Atlassian's production backup strategy (native database and home directory backups) for your production instance instead of relying on XML backups.

If you still want to use the XML backup strategy, please try configuring the following parameters:

JAVA_OPTS="-Xms2048m -Xmx4096m -XX:MaxPermSize=256m $JAVA_OPTS -Djava.awt.headless=true"

Hope it helps.

Cheers,
Septa Cahyadiputra

0 votes
Harry Chan
Rising Star
February 13, 2013

Hi, I don't think it is a PermGen issue, but a heap issue. I would increase the heap memory instead. Set Xmx higher if possible.
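A heap-only change to the line from the question might look roughly like this (the 3072m value is purely illustrative and would need testing against your data):

JAVA_OPTS="-Xms1769m -Xmx3072m -XX:MaxPermSize=640m $JAVA_OPTS -Djava.awt.headless=true"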

saravanan subramanian February 13, 2013

Hi Harry,

Thanks for your reply. Could you please tell me the exact heap memory size I should increase to?

My current JVM memory settings in setenv.sh are:

JAVA_OPTS="-Xms1769m -Xmx1769m -XX:MaxPermSize=640m $JAVA_OPTS -Djava.awt.headless=true "

I have tried adding 128m to MaxPermSize (i.e. 640 + 128 = 768m), but it works only the first time after a server restart; the next day it throws the same error again. Can you tell me the exact size I need to increase to in order to avoid the error?

Thanks,

Saravanan

Harry Chan
Rising Star
February 13, 2013

Too many variables here. What version of Confluence? What version of Java? 64bit or 32bit?

How big is your data? The problem is that the more data you have, the larger the XML. Unfortunately, the XML export loads most things into memory while processing, hence the huge memory requirement.
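One way to confirm where the memory goes is to watch GC utilisation while a manual backup runs, using the standard JDK monitoring tools (the <pid> below is a placeholder for the Confluence/Tomcat process id):

jps -l                    # list running JVMs and find the Confluence (Tomcat) pid
jstat -gcutil <pid> 5000  # print eden/old-gen/permgen utilisation (%) every 5 seconds

If the O (old generation) column sits near 100% for long stretches during the export, the heap is simply too small for the data set.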

saravanan subramanian February 13, 2013

Current Confluence Version : 4.2.8

Java : 64 bit

Backup size: 741M (daily-backup-2013-02-13.zip)

Harry Chan
Rising Star
February 13, 2013

Is this Java 6 or Java 7? GC Overhead Limit Exceeded means not enough heap memory generally. I suspect >2GB would be required, but I don't have a hard number. You'll need to do some testing.
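If you want a data point rather than guesswork, a heap dump taken at the moment of failure shows how much memory the export actually needs. These are standard HotSpot flags that can be added alongside the existing memory settings in setenv.sh; the dump path is an assumption, so point it at a disk with enough free space:

JAVA_OPTS="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp $JAVA_OPTS"

The resulting .hprof file can then be opened in a heap analyser such as Eclipse Memory Analyzer (MAT) to see what is retaining the memory.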

saravanan subramanian February 13, 2013

Hi Harry,

JAVA SE DEVELOPMENT KIT (JDK), VERSION 6

jdk 1.6.0.18-64

Can I increase the memory to the sizes below?

JAVA_OPTS="-Xms2048m -Xmx4096m -XX:MaxPermSize=1024m $JAVA_OPTS -Djava.awt.headless=true"

Old Java Heap size :

JAVA_OPTS="-Xms1769m -Xmx1769m -XX:MaxPermSize=640m $JAVA_OPTS -Djava.awt.headless=true "

saravanan subramanian February 13, 2013

Hi Team,

Can anyone advise on this issue?

Thanks

Saravanan

Harry Chan
Rising Star
February 13, 2013

This is a community forum. If you need help from Atlassian support staff, you should create a ticket with Atlassian Support.

Either way, upgrading to Java 7 should help, as it generally reduces memory usage. Updating your settings as you have proposed would improve the situation too.
