From: <bo...@hy...> - 2010-01-04 09:19:08
|
Author: bob Date: 2010-01-04 01:18:59 -0800 (Mon, 04 Jan 2010) New Revision: 14128 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14128 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1309 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2010-01-03 09:23:38 UTC (rev 14127) +++ trunk/etc/version.properties 2010-01-04 09:18:59 UTC (rev 14128) @@ -1,3 +1,3 @@ -#Sun Jan 03 00:32:42 PST 2010 +#Mon Jan 04 00:27:24 PST 2010 version=4.3.0 -build=1308 +build=1309 |
From: <bo...@hy...> - 2010-01-03 09:23:51
|
Author: bob Date: 2010-01-03 01:23:38 -0800 (Sun, 03 Jan 2010) New Revision: 14127 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14127 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1308 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2010-01-02 09:26:53 UTC (rev 14126) +++ trunk/etc/version.properties 2010-01-03 09:23:38 UTC (rev 14127) @@ -1,3 +1,3 @@ -#Sat Jan 02 00:33:02 PST 2010 +#Sun Jan 03 00:32:42 PST 2010 version=4.3.0 -build=1307 +build=1308 |
From: <bo...@hy...> - 2010-01-02 09:27:01
|
Author: bob Date: 2010-01-02 01:26:53 -0800 (Sat, 02 Jan 2010) New Revision: 14126 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14126 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1307 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2010-01-01 09:22:06 UTC (rev 14125) +++ trunk/etc/version.properties 2010-01-02 09:26:53 UTC (rev 14126) @@ -1,3 +1,3 @@ -#Fri Jan 01 00:29:13 PST 2010 +#Sat Jan 02 00:33:02 PST 2010 version=4.3.0 -build=1306 +build=1307 |
From: jrallen9 <jus...@te...> - 2010-01-01 17:31:39
|
I have a problem with how Sigar calls its native methods. I built a simple Java test program that makes method calls to gather stats on a Windows XP x86 computer. Using a Sigar object I was able to call CpuPerc, Mem, FileSystem, and DiskUsage and display the results to a console. Since I'm using Eclipse to develop this program, all I had to do was add sigar.jar to my classpath; Java was able to load the jar, and Sigar was able to load sigar-winnt-x86.dll (which was in its default lib directory alongside sigar.jar). Fast forward to integrating this code into a production project. Since adding this Sigar code to the production project I'm building, I'm faced with a problem where the program calls the native methods for CpuPerc.gather(), Mem.gather(), and FileSystem.gather() just fine, without exceptions, but when calling DiskUsage.gather() I get a SigarFileNotFoundException indicating the program can't find the native method for DiskUsage.gather() to return valid data. This error really confuses me, because when I don't include the path to the lib directory where sigar-winnt-x86.dll resides, every method that uses the Sigar API fails at the point it requires the native code. However, when I do include the lib directory for sigar-winnt-x86.dll, all of the above objects are able to find the native code except for the DiskUsage object. The reason I mentioned the test program is that both the test program and the production program reference the same sigar.jar and sigar-winnt-x86.dll locations and files. The production program uses multiple threads, but only one thread is responsible for executing code that uses Sigar API objects. In trying to troubleshoot this problem I've been sifting through the source distro of the Sigar download and found win32_sigar.c and a C function: SIGAR_DECLARE(int) sigar_disk_usage_get(sigar_t *sigar, const char *dirname, sigar_disk_usage_t *disk) Is this function the native function which DiskUsage.fetch() calls into? Is win32_sigar.c included in sigar-winnt-x86.dll? Any help in resolving this problem would be very much appreciated. Feel free to email me if you need additional stack traces, configuration details, or have other probative questions. My email is justin (dot) allen (at) teradata (dot) com |
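A minimal standalone check along these lines can help narrow down whether the DLL itself or only the DiskUsage entry point is failing to load. This is a sketch, not the poster's actual code: it assumes sigar.jar is on the classpath, and both the directory passed via the org.hyperic.sigar.path system property and the "C:\\" argument are example values.

import org.hyperic.sigar.CpuPerc;
import org.hyperic.sigar.DiskUsage;
import org.hyperic.sigar.Mem;
import org.hyperic.sigar.Sigar;
import org.hyperic.sigar.SigarException;

public class SigarSmokeTest {
    public static void main(String[] args) throws SigarException {
        // Example path: point Sigar at the directory holding
        // sigar-winnt-x86.dll if it is not already on java.library.path.
        System.setProperty("org.hyperic.sigar.path", "C:\\sigar\\lib");

        Sigar sigar = new Sigar();
        try {
            // Gather CPU and memory stats through the native library.
            CpuPerc cpu = sigar.getCpuPerc();
            Mem mem = sigar.getMem();
            System.out.println("cpu idle: " + cpu.getIdle());
            System.out.println("mem used: " + mem.getUsed());

            // DiskUsage is keyed by a device or directory name; "C:\\" is an
            // example argument. If only this call fails, the problem is in how
            // the DiskUsage entry point is resolved, not in loading the DLL.
            DiskUsage du = sigar.getDiskUsage("C:\\");
            System.out.println("disk reads: " + du.getReads());
        } finally {
            sigar.close();
        }
    }
}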
From: <bo...@hy...> - 2010-01-01 09:22:16
|
Author: bob Date: 2010-01-01 01:22:06 -0800 (Fri, 01 Jan 2010) New Revision: 14125 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14125 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1306 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-31 09:20:08 UTC (rev 14124) +++ trunk/etc/version.properties 2010-01-01 09:22:06 UTC (rev 14125) @@ -1,3 +1,3 @@ -#Thu Dec 31 00:24:17 PST 2009 +#Fri Jan 01 00:29:13 PST 2010 version=4.3.0 -build=1305 +build=1306 |
From: <bo...@hy...> - 2009-12-31 09:20:22
|
Author: bob Date: 2009-12-31 01:20:08 -0800 (Thu, 31 Dec 2009) New Revision: 14124 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14124 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1305 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-30 20:29:00 UTC (rev 14123) +++ trunk/etc/version.properties 2009-12-31 09:20:08 UTC (rev 14124) @@ -1,3 +1,3 @@ -#Wed Dec 30 00:29:37 PST 2009 +#Thu Dec 31 00:24:17 PST 2009 version=4.3.0 -build=1304 +build=1305 |
From: <gla...@hy...> - 2009-12-30 20:29:16
|
Author: glaullon Date: 2009-12-30 12:29:00 -0800 (Wed, 30 Dec 2009) New Revision: 14123 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14123 Modified: trunk/plugins/websphere/src/org/hyperic/hq/plugin/websphere/ApplicationCollector.java Log: HHQ-3618 Modified: trunk/plugins/websphere/src/org/hyperic/hq/plugin/websphere/ApplicationCollector.java =================================================================== --- trunk/plugins/websphere/src/org/hyperic/hq/plugin/websphere/ApplicationCollector.java 2009-12-30 09:20:41 UTC (rev 14122) +++ trunk/plugins/websphere/src/org/hyperic/hq/plugin/websphere/ApplicationCollector.java 2009-12-30 20:29:00 UTC (rev 14123) @@ -28,9 +28,13 @@ import org.hyperic.hq.product.PluginException; import com.ibm.websphere.management.AdminClient; +import org.apache.commons.logging.Log; +import org.apache.commons.logging.LogFactory; public class ApplicationCollector extends WebsphereCollector { + private static final Log log = LogFactory.getLog(ApplicationCollector.class.getName()); + protected void init(AdminClient mServer) throws PluginException { super.init(mServer); @@ -60,11 +64,13 @@ Object state = getAttribute(getMBeanServer(), this.name, "state"); - if (state == null) { + log.debug("[collect] name='"+name.getKeyProperty("name")+"' state='"+state+"("+state.getClass()+")'"); + + if ((state == null) || (!(state instanceof Integer))) { setAvailability(false); } else { - setAvailability(true); + setAvailability(((Integer)state).intValue()==1); } } } |
From: raghu <sig...@hy...> - 2009-12-30 18:57:22
|
Hi, can anyone help me compile Hyperic Sigar for ARM? I saw libsigar-x86-linux.so and the other bundled libraries, which are compiled for x86; can I compile for the ARM architecture in the same way? Please provide some input on this issue. Thank you. Regards, Raghu Kiran |
From: <bo...@hy...> - 2009-12-30 09:20:56
|
Author: bob Date: 2009-12-30 01:20:41 -0800 (Wed, 30 Dec 2009) New Revision: 14122 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14122 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1304 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-29 09:19:13 UTC (rev 14121) +++ trunk/etc/version.properties 2009-12-30 09:20:41 UTC (rev 14122) @@ -1,3 +1,3 @@ -#Tue Dec 29 00:27:17 PST 2009 +#Wed Dec 30 00:29:37 PST 2009 version=4.3.0 -build=1303 +build=1304 |
From: Raghu K. <ark...@gm...> - 2009-12-29 09:36:11
|
Hi, I am an Android developer, currently working on a project that has to retrieve the CPU usage and memory usage of a particular running process. This feature is not available in the Android platform, and while googling for it I found this site, which is useful. The issue is that Android uses arm-linux, which is not among the binaries provided on the Sigar site. Can anyone please guide me on how to compile the Hyperic Sigar source code for arm-linux? Thank you. BR, Raghu Kiran A |
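Until an arm-linux build of the native library is available, it is worth noting that Sigar's Linux backend ultimately reads the /proc filesystem, which can also be read directly from Java as a stopgap. The sketch below is illustrative only and is not part of Sigar: the PID comes from the command line, and the field positions follow the standard proc(5) layout.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ProcStatsSketch {
    public static void main(String[] args) throws IOException {
        int pid = Integer.parseInt(args[0]); // PID of the process to inspect

        // CPU time: fields 14 (utime) and 15 (stime) of /proc/<pid>/stat,
        // expressed in clock ticks (see proc(5)).
        BufferedReader statReader =
                new BufferedReader(new FileReader("/proc/" + pid + "/stat"));
        String stat = statReader.readLine();
        statReader.close();
        // The second field (comm) may contain spaces, so split after the ')'.
        String[] fields = stat.substring(stat.lastIndexOf(')') + 2).split(" ");
        long utime = Long.parseLong(fields[11]); // 14th field overall
        long stime = Long.parseLong(fields[12]); // 15th field overall
        System.out.println("cpu ticks used: " + (utime + stime));

        // Resident memory: the VmRSS line of /proc/<pid>/status.
        BufferedReader statusReader =
                new BufferedReader(new FileReader("/proc/" + pid + "/status"));
        for (String line; (line = statusReader.readLine()) != null; ) {
            if (line.startsWith("VmRSS:")) {
                System.out.println("resident memory: "
                        + line.substring("VmRSS:".length()).trim());
            }
        }
        statusReader.close();
    }
}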
From: <bo...@hy...> - 2009-12-29 09:19:23
|
Author: bob Date: 2009-12-29 01:19:13 -0800 (Tue, 29 Dec 2009) New Revision: 14121 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14121 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1303 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-28 09:19:11 UTC (rev 14120) +++ trunk/etc/version.properties 2009-12-29 09:19:13 UTC (rev 14121) @@ -1,3 +1,3 @@ -#Mon Dec 28 00:27:04 PST 2009 +#Tue Dec 29 00:27:17 PST 2009 version=4.3.0 -build=1302 +build=1303 |
From: <bo...@hy...> - 2009-12-28 09:19:26
|
Author: bob Date: 2009-12-28 01:19:11 -0800 (Mon, 28 Dec 2009) New Revision: 14120 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14120 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1302 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-27 09:19:39 UTC (rev 14119) +++ trunk/etc/version.properties 2009-12-28 09:19:11 UTC (rev 14120) @@ -1,3 +1,3 @@ -#Sun Dec 27 00:30:09 PST 2009 +#Mon Dec 28 00:27:04 PST 2009 version=4.3.0 -build=1301 +build=1302 |
From: <bo...@hy...> - 2009-12-27 09:19:47
|
Author: bob Date: 2009-12-27 01:19:39 -0800 (Sun, 27 Dec 2009) New Revision: 14119 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14119 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1301 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-26 09:20:08 UTC (rev 14118) +++ trunk/etc/version.properties 2009-12-27 09:19:39 UTC (rev 14119) @@ -1,3 +1,3 @@ -#Sat Dec 26 00:29:25 PST 2009 +#Sun Dec 27 00:30:09 PST 2009 version=4.3.0 -build=1300 +build=1301 |
From: <bo...@hy...> - 2009-12-26 09:20:27
|
Author: bob Date: 2009-12-26 01:20:08 -0800 (Sat, 26 Dec 2009) New Revision: 14118 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14118 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1300 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-25 09:20:08 UTC (rev 14117) +++ trunk/etc/version.properties 2009-12-26 09:20:08 UTC (rev 14118) @@ -1,3 +1,3 @@ -#Fri Dec 25 00:29:13 PST 2009 +#Sat Dec 26 00:29:25 PST 2009 version=4.3.0 -build=1299 +build=1300 |
From: <bo...@hy...> - 2009-12-25 09:20:22
|
Author: bob Date: 2009-12-25 01:20:08 -0800 (Fri, 25 Dec 2009) New Revision: 14117 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14117 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1299 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-24 09:16:55 UTC (rev 14116) +++ trunk/etc/version.properties 2009-12-25 09:20:08 UTC (rev 14117) @@ -1,3 +1,3 @@ -#Thu Dec 24 00:27:18 PST 2009 +#Fri Dec 25 00:29:13 PST 2009 version=4.3.0 -build=1298 +build=1299 |
From: <no...@gi...> - 2009-12-24 23:18:40
|
Branch: refs/heads/master Home: http://github.com/hyperic/hqapi Commit: 80726f621b6a657f1abfe8eda98981940bcf6182 http://github.com/hyperic/hqapi/commit/80726f621b6a657f1abfe8eda98981940bcf6182 Author: pnguyen <pnguyen@192.168.2.231> Date: 2009-12-24 (Thu, 24 Dec 2009) Changed paths: M src/org/hyperic/hq/hqapi1/test/MaintenanceSchedule_test.java Log Message: ----------- Validate the resource's availability state before and during a maintenance window |
From: <bo...@hy...> - 2009-12-24 09:17:09
|
Author: bob Date: 2009-12-24 01:16:55 -0800 (Thu, 24 Dec 2009) New Revision: 14116 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14116 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1298 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-23 09:09:41 UTC (rev 14115) +++ trunk/etc/version.properties 2009-12-24 09:16:55 UTC (rev 14116) @@ -1,3 +1,3 @@ -#Wed Dec 23 00:23:46 PST 2009 +#Thu Dec 24 00:27:18 PST 2009 version=4.3.0 -build=1297 +build=1298 |
From: <no...@gi...> - 2009-12-24 00:01:22
|
Branch: refs/heads/master Home: http://github.com/hyperic/hqapi Commit: 16a30b5e2345f59923121ba5a0d3fca372fa73dc http://github.com/hyperic/hqapi/commit/16a30b5e2345f59923121ba5a0d3fca372fa73dc Author: pnguyen <pnguyen@192.168.2.116> Date: 2009-12-23 (Wed, 23 Dec 2009) Changed paths: M src/org/hyperic/hq/hqapi1/test/AlertDefinitionSync_test.java M src/org/hyperic/hq/hqapi1/test/AlertDefinitionTestBase.java Log Message: ----------- Improve integration test validation when syncing resource type alert definitions |
From: <bo...@hy...> - 2009-12-23 09:09:57
|
Author: bob Date: 2009-12-23 01:09:41 -0800 (Wed, 23 Dec 2009) New Revision: 14115 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14115 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1297 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-22 21:55:42 UTC (rev 14114) +++ trunk/etc/version.properties 2009-12-23 09:09:41 UTC (rev 14115) @@ -1,3 +1,3 @@ -#Tue Dec 22 00:30:26 PST 2009 +#Wed Dec 23 00:23:46 PST 2009 version=4.3.0 -build=1296 +build=1297 |
From: <tr...@hy...> - 2009-12-22 21:55:53
|
Author: trader Date: 2009-12-22 13:55:42 -0800 (Tue, 22 Dec 2009) New Revision: 14114 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14114 Modified: trunk/unittest/src/org/hyperic/util/file/DiskListTest.java Log: Two new test cases, both initially pass Modified: trunk/unittest/src/org/hyperic/util/file/DiskListTest.java =================================================================== --- trunk/unittest/src/org/hyperic/util/file/DiskListTest.java 2009-12-22 20:52:41 UTC (rev 14113) +++ trunk/unittest/src/org/hyperic/util/file/DiskListTest.java 2009-12-22 21:55:42 UTC (rev 14114) @@ -154,6 +154,107 @@ } } + public void testFillAndDeleteAllAndReopen() throws Exception { + + DiskListDataHolder holder = null; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + String toPut = String.valueOf("dummystring"); + + // Insert until we *almost* spill over + while (holder.list.dataFile.length() < MAXSIZE) { + holder.list.addToList(toPut); + } + + // Iterate and delete + Iterator it = holder.list.getListIterator(); + while (it.hasNext()) { + it.next(); + it.remove(); + } + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new DiskList(holder.dataFile, + RECSIZE, + CHKSIZE, + CHKPERC, + MAXSIZE); + + it = holder.list.getListIterator(); + // This is current behavior, but bad behavior + assertNull(it); + + } finally { + + holder.dispose(); + + } + } + + public void testFillAndDeleteAllButLastAndReopen() throws Exception { + + DiskListDataHolder holder = null; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + String toPut = String.valueOf("dummystring"); + + // Insert until we *almost* spill over + long nRecs = 0; + while (holder.list.dataFile.length() < MAXSIZE) { + holder.list.addToList(toPut); + nRecs++; + } + + // Iterate and delete all but the last record. By not deleting the last record, + // we prevent maintenance from happening. + Iterator it = holder.list.getListIterator(); + long nDeleted = 0; + while (it.hasNext()) { + it.next(); + if (++nDeleted < nRecs) { + it.remove(); + } + } + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new DiskList(holder.dataFile, + RECSIZE, + CHKSIZE, + CHKPERC, + MAXSIZE); + + it = holder.list.getListIterator(); + while (it.hasNext()) { + it.next(); + } + + } finally { + + holder.dispose(); + + } + } + public void testFreeListWithInsertsAndNoDeletes() throws Exception { DiskListDataHolder holder = null; |
From: <sc...@hy...> - 2009-12-22 20:52:51
|
Author: scottmf Date: 2009-12-22 12:52:41 -0800 (Tue, 22 Dec 2009) New Revision: 14113 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14113 Modified: trunk/src/org/hyperic/hq/hqu/rendit/metaclass/ResourceCategory.groovy Log: augmenting error message to make it less cryptic Modified: trunk/src/org/hyperic/hq/hqu/rendit/metaclass/ResourceCategory.groovy =================================================================== --- trunk/src/org/hyperic/hq/hqu/rendit/metaclass/ResourceCategory.groovy 2009-12-22 09:19:14 UTC (rev 14112) +++ trunk/src/org/hyperic/hq/hqu/rendit/metaclass/ResourceCategory.groovy 2009-12-22 20:52:41 UTC (rev 14113) @@ -523,8 +523,15 @@ def servers = svrMan.getServersByPlatformServiceType(subject, parent.instanceId, proto.instanceId) - assert servers.size() == 1, "Unable to find appropriate virtual server for " + - proto.name + " parent = " + parent.name + assert servers.size() == 1, "Cannot create any platform services of " + proto.name + " for " + parent.name + ". This is because the virtual server which relates " + + parent.name + " to the service does not exist in the database." + + " To find out which virtual server is missing, find the " + + "plugin where the platform service exists and get the server " + + "that it should belong to. >From there either remove the agent's datadir " + + "and re-initialize it or contact support for further options." + + " (instanceId=" + parent.instanceId + ")" + server = svrMan.findServerById(servers[0].id) // value -> pojo } else if (parent.isServer()) { |
From: <bo...@hy...> - 2009-12-22 09:19:27
|
Author: bob Date: 2009-12-22 01:19:14 -0800 (Tue, 22 Dec 2009) New Revision: 14112 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14112 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1296 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-22 01:11:31 UTC (rev 14111) +++ trunk/etc/version.properties 2009-12-22 09:19:14 UTC (rev 14112) @@ -1,3 +1,3 @@ -#Mon Dec 21 00:29:58 PST 2009 +#Tue Dec 22 00:30:26 PST 2009 version=4.3.0 -build=1295 +build=1296 |
From: <sc...@hy...> - 2009-12-22 01:11:47
|
Author: scottmf Date: 2009-12-21 17:11:31 -0800 (Mon, 21 Dec 2009) New Revision: 14111 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14111 Modified: branches/HQ_4_2_0_PATCH/src/org/hyperic/hq/events/server/session/MultiConditionEvaluator.java branches/HQ_4_2_0_PATCH/unittest/src/org/hyperic/hq/events/server/session/MultiConditionEvaluatorTest.java Log: [HHQ-3613] removing from a collection while it is being iterated throws ConcurrentModificationException. Need to remove from the iterator itself Modified: branches/HQ_4_2_0_PATCH/src/org/hyperic/hq/events/server/session/MultiConditionEvaluator.java =================================================================== --- branches/HQ_4_2_0_PATCH/src/org/hyperic/hq/events/server/session/MultiConditionEvaluator.java 2009-12-21 09:15:58 UTC (rev 14110) +++ branches/HQ_4_2_0_PATCH/src/org/hyperic/hq/events/server/session/MultiConditionEvaluator.java 2009-12-22 01:11:31 UTC (rev 14111) @@ -114,7 +114,7 @@ // than System.currentTimeMillis Map.Entry entry = (Map.Entry) iter.next(); if (isExpired((AbstractEvent) entry.getValue())) { - events.remove(entry.getKey()); + iter.remove(); } } } Modified: branches/HQ_4_2_0_PATCH/unittest/src/org/hyperic/hq/events/server/session/MultiConditionEvaluatorTest.java =================================================================== --- branches/HQ_4_2_0_PATCH/unittest/src/org/hyperic/hq/events/server/session/MultiConditionEvaluatorTest.java 2009-12-21 09:15:58 UTC (rev 14110) +++ branches/HQ_4_2_0_PATCH/unittest/src/org/hyperic/hq/events/server/session/MultiConditionEvaluatorTest.java 2009-12-22 01:11:31 UTC (rev 14111) @@ -30,6 +30,7 @@ public void setUp() throws Exception { super.setUp(); +Thread.sleep(5000); this.executionStrategy = EasyMock.createMock(ExecutionStrategy.class); } @@ -297,13 +298,20 @@ mockEvent2.setTimestamp(System.currentTimeMillis() - (12 * 60 * 1000)); TriggerFiredEvent triggerFired2 = new TriggerFiredEvent(trigger1Id, mockEvent2); + MockEvent mockEvent3 = new MockEvent(4l, 5); + // trigger fired event occurred 12 minutes ago and is already expired + mockEvent2.setTimestamp(System.currentTimeMillis() - (12 * 60 * 1000)); + TriggerFiredEvent triggerFired3 = new TriggerFiredEvent(trigger1Id, mockEvent3); + EasyMock.replay(executionStrategy); evaluator.triggerFired(triggerFired); + // this setTimestamp() is signifying that time has passed since the last + // trigger fired + mockEvent.setTimestamp(System.currentTimeMillis() - (12 * 60 * 1000)); evaluator.triggerFired(triggerFired2); EasyMock.verify(executionStrategy); Map expectedEvents = new LinkedHashMap(); - expectedEvents.put(trigger2Id, triggerFired); assertEquals(expectedEvents, events); } |
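The pattern behind the HHQ-3613 fix above is general Java behavior rather than anything Hyperic-specific: structural removal while iterating must go through the Iterator itself, otherwise the next iterator operation fails fast with ConcurrentModificationException. A minimal sketch, with illustrative names rather than the project's classes:

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class IteratorRemoveSketch {
    public static void main(String[] args) {
        Map<String, Long> events = new LinkedHashMap<String, Long>();
        events.put("expired", 0L);
        events.put("fresh", System.currentTimeMillis());

        Iterator<Map.Entry<String, Long>> iter = events.entrySet().iterator();
        while (iter.hasNext()) {
            Map.Entry<String, Long> entry = iter.next();
            boolean expired = entry.getValue() == 0L; // stand-in for the real expiry check
            if (expired) {
                // events.remove(entry.getKey()) here would throw
                // ConcurrentModificationException on the next iter.next();
                // removing through the iterator is the supported way.
                iter.remove();
            }
        }
        System.out.println(events); // only the "fresh" entry remains
    }
}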
From: <bo...@hy...> - 2009-12-21 09:16:11
|
Author: bob Date: 2009-12-21 01:15:58 -0800 (Mon, 21 Dec 2009) New Revision: 14110 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14110 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1295 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-21 05:58:21 UTC (rev 14109) +++ trunk/etc/version.properties 2009-12-21 09:15:58 UTC (rev 14110) @@ -1,3 +1,3 @@ -#Sun Dec 20 00:30:04 PST 2009 +#Mon Dec 21 00:29:58 PST 2009 version=4.3.0 -build=1294 +build=1295 |
From: <pn...@hy...> - 2009-12-21 05:58:36
|
Author: pnguyen Date: 2009-12-20 21:58:21 -0800 (Sun, 20 Dec 2009) New Revision: 14109 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14109 Removed: trunk/src/org/hyperic/hq/appdef/server/session/AsyncDeleteAgentCache.java Modified: trunk/src/org/hyperic/hq/appdef/shared/ResourcesCleanupZevent.java trunk/src/org/hyperic/hq/bizapp/server/session/AppdefBossEJBImpl.java trunk/src/org/hyperic/hq/measurement/server/session/MeasurementManagerEJBImpl.java trunk/src/org/hyperic/hq/measurement/server/session/MeasurementProcessorEJBImpl.java Log: [HHQ-3598] Eliminate the AsyncDeleteAgentCache singleton class. Inject the same info as a ZeventPayload into the ResourcesCleanupZevent. Unschedule the metrics from the agent in one batch call instead of multiple calls during the async delete process to improve performance. Deleted: trunk/src/org/hyperic/hq/appdef/server/session/AsyncDeleteAgentCache.java =================================================================== --- trunk/src/org/hyperic/hq/appdef/server/session/AsyncDeleteAgentCache.java 2009-12-20 09:15:15 UTC (rev 14108) +++ trunk/src/org/hyperic/hq/appdef/server/session/AsyncDeleteAgentCache.java 2009-12-21 05:58:21 UTC (rev 14109) @@ -1,92 +0,0 @@ -/* - * NOTE: This copyright does *not* cover user programs that use HQ - * program services by normal system calls through the application - * program interfaces provided as part of the Hyperic Plug-in Development - * Kit or the Hyperic Client Development Kit - this is merely considered - * normal use of the program, and does *not* fall under the heading of - * "derived work". - * - * Copyright (C) [2004-2009], Hyperic, Inc. - * This file is part of HQ. - * - * HQ is free software; you can redistribute it and/or modify - * it under the terms version 2 of the GNU General Public License as - * published by the Free Software Foundation. This program is distributed - * in the hope that it will be useful, but WITHOUT ANY WARRANTY; without - * even the implied warranty of MERCHANTABILITY or FITNESS FOR A - * PARTICULAR PURPOSE. See the GNU General Public License for more - * details. - * - * You should have received a copy of the GNU General Public License - * along with this program; if not, write to the Free Software - * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 - * USA. - */ - -package org.hyperic.hq.appdef.server.session; - -import java.util.Collections; -import java.util.HashMap; -import java.util.Map; - -import org.apache.commons.logging.Log; -import org.apache.commons.logging.LogFactory; - -import org.hyperic.hq.appdef.shared.AppdefEntityID; - -/** - * This class is used during the aynchronous delete process - * to unschedule metrics. It is an in-memory map of resources - * and its agent because that info no longer exists in the DB. 
- */ -public class AsyncDeleteAgentCache { - private Log log = LogFactory.getLog(AsyncDeleteAgentCache.class); - - private final Map _cache; - - private static final AsyncDeleteAgentCache singleton = - new AsyncDeleteAgentCache(); - - private AsyncDeleteAgentCache() { - _cache = Collections.synchronizedMap(new HashMap()); - } - - /** - * - * @param key The AppdefEntityID of the async deleted resource - * @return Integer The agentId of the async deleted resource - */ - public Integer get(AppdefEntityID key) { - return (Integer) _cache.get(key); - } - - /** - * - * @param key The AppdefEntityID of the async deleted resource - * @param Integer The agentId of the async deleted resource - */ - public void put(AppdefEntityID key, Integer value) { - _cache.put(key, value); - } - - public void remove(AppdefEntityID key) { - _cache.remove(key); - } - - public void clear() { - _cache.clear(); - } - - public int getSize() { - return _cache.size(); - } - - public String toString() { - return _cache.toString(); - } - - public static AsyncDeleteAgentCache getInstance() { - return singleton; - } - -} Modified: trunk/src/org/hyperic/hq/appdef/shared/ResourcesCleanupZevent.java =================================================================== --- trunk/src/org/hyperic/hq/appdef/shared/ResourcesCleanupZevent.java 2009-12-20 09:15:15 UTC (rev 14108) +++ trunk/src/org/hyperic/hq/appdef/shared/ResourcesCleanupZevent.java 2009-12-21 05:58:21 UTC (rev 14109) @@ -6,7 +6,7 @@ * normal use of the program, and does *not* fall under the heading of * "derived work". * - * Copyright (C) [2004-2008], Hyperic, Inc. + * Copyright (C) [2004-2009], Hyperic, Inc. * This file is part of HQ. * * HQ is free software; you can redistribute it and/or modify @@ -25,7 +25,10 @@ package org.hyperic.hq.appdef.shared; +import java.util.Map; + import org.hyperic.hq.zevents.Zevent; +import org.hyperic.hq.zevents.ZeventPayload; public class ResourcesCleanupZevent extends Zevent { @@ -33,4 +36,45 @@ super(null, null); } + /** + * @param agentCache {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + public ResourcesCleanupZevent(Map agentCache) { + super(null, new ResourcesCleanupZeventPayload(agentCache)); + } + + /** + * @return {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + public Map getAgents() { + Map agents = null; + if (getPayload() != null) { + agents = ((ResourcesCleanupZeventPayload)getPayload()).getAgents(); + } + return agents; + } + + private static class ResourcesCleanupZeventPayload + implements ZeventPayload + { + private Map agentCache; + + /** + * @param agentCache {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + public ResourcesCleanupZeventPayload(Map agentCache) { + this.agentCache = agentCache; + } + + /** + * @return {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + public Map getAgents() { + return this.agentCache; + } + } } Modified: trunk/src/org/hyperic/hq/bizapp/server/session/AppdefBossEJBImpl.java =================================================================== --- trunk/src/org/hyperic/hq/bizapp/server/session/AppdefBossEJBImpl.java 2009-12-20 09:15:15 UTC (rev 14108) +++ trunk/src/org/hyperic/hq/bizapp/server/session/AppdefBossEJBImpl.java 2009-12-21 05:58:21 UTC (rev 14109) @@ -58,7 +58,6 @@ import org.hyperic.hq.appdef.server.session.AppdefResourceType; import org.hyperic.hq.appdef.server.session.Application; 
import org.hyperic.hq.appdef.server.session.ApplicationType; -import org.hyperic.hq.appdef.server.session.AsyncDeleteAgentCache; import org.hyperic.hq.appdef.server.session.CPropResource; import org.hyperic.hq.appdef.server.session.CPropResourceSortField; import org.hyperic.hq.appdef.server.session.Cprop; @@ -1341,16 +1340,21 @@ } AppdefEntityID[] removed = resMan.removeResourcePerms( subject, res, false); + Map agentCache = null; try { final Integer id = aeid.getId(); switch (aeid.getType()) { case AppdefEntityConstants.APPDEF_TYPE_SERVER : final ServerManagerLocal sMan = getServerManager(); - removeServer(subject, sMan.findServerById(id)); + Server server = sMan.findServerById(id); + agentCache = buildAsyncDeleteAgentCache(server); + removeServer(subject, server); break; case AppdefEntityConstants.APPDEF_TYPE_PLATFORM: final PlatformManagerLocal pMan = getPlatformManager(); - removePlatform(subject, pMan.findPlatformById(id)); + Platform platform = pMan.findPlatformById(id); + agentCache = buildAsyncDeleteAgentCache(platform); + removePlatform(subject, platform); break; case AppdefEntityConstants.APPDEF_TYPE_SERVICE : final ServiceManagerLocal svcMan = getServiceManager(); @@ -1378,7 +1382,7 @@ } ZeventManager.getInstance().enqueueEventAfterCommit( - new ResourcesCleanupZevent()); + new ResourcesCleanupZevent(agentCache)); return removed; } @@ -1388,52 +1392,36 @@ * Method is "NotSupported" since all the resource deletes may take longer * than the jboss transaction timeout. No need for a transaction in this * context. + * + * @param agentCache {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + * * @ejb:transaction type="NotSupported" * @ejb:interface-method */ - public void removeDeletedResources() + public void removeDeletedResources(Map agentCache) throws ApplicationException, VetoException, RemoveException { final boolean debug = log.isDebugEnabled(); final StopWatch watch = new StopWatch(); final AuthzSubject subject = getAuthzSubjectManager().findSubjectById(AuthzConstants.overlordId); - AsyncDeleteAgentCache cache = AsyncDeleteAgentCache.getInstance(); - if (debug) { - log.debug("AsyncDeleteAgentCache start size=" + cache.getSize()); - } - + watch.markTimeBegin("unscheduleMeasurementsForAsyncDelete"); + unscheduleMeasurementsForAsyncDelete(agentCache); + watch.markTimeEnd("unscheduleMeasurementsForAsyncDelete"); + + // Look through services, servers, platforms, applications, and groups watch.markTimeBegin("removeApplications"); Collection applications = getApplicationManager().findDeletedApplications(); - for (Iterator it = applications.iterator(); it.hasNext(); ) { - try { - getOne()._removeApplicationInNewTran( - subject, (Application)it.next()); - } catch (Exception e) { - log.error("Unable to remove application: " + e, e); - } - } + removeApplications(subject, applications); watch.markTimeEnd("removeApplications"); - if (debug) { - log.debug("Removed " + applications.size() + " applications"); - } watch.markTimeBegin("removeResourceGroups"); Collection groups = getResourceGroupManager().findDeletedGroups(); - for (Iterator it = groups.iterator(); it.hasNext(); ) { - try { - getOne()._removeGroupInNewTran(subject, (ResourceGroup)it.next()); - } catch (Exception e) { - log.error("Unable to remove group: " + e, e); - } - } + removeResourceGroups(subject, groups); watch.markTimeEnd("removeResourceGroups"); - if (debug) { - log.debug("Removed " + groups.size() + " resource groups"); - } - // Look through services, servers, platforms, 
applications, and groups watch.markTimeBegin("removeServices"); Collection services = getServiceManager().findDeletedServices(); removeServices(subject, services); @@ -1451,13 +1439,69 @@ if (debug) { log.debug("removeDeletedResources: " + watch); - log.debug("AsyncDeleteAgentCache end size=" + cache.getSize()); - if (cache.getSize() > 0) { - log.debug("AsyncDeleteAgentCache " + cache); + } + } + + /** + * Disable measurements and unschedule from the agent in bulk + * with the agent cache info because the resources have been + * de-referenced from the agent + * + * @param agentCache {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + private void unscheduleMeasurementsForAsyncDelete(Map agentCache) { + if (agentCache == null) { + return; + } + + try { + AuthzSubject subject = + getAuthzSubjectManager().findSubjectById(AuthzConstants.overlordId); + + for (Iterator j = agentCache.keySet().iterator(); j.hasNext(); ) { + Integer agentId = (Integer)j.next(); + Agent agent = getAgentManager().getAgent(agentId); + List resources = (List) agentCache.get(agentId); + + getMetricManager().disableMeasurements( + subject, + agent, + (AppdefEntityID[]) resources.toArray(new AppdefEntityID[0]), + true); } + } catch (Exception e) { + log.error("Error unscheduling measurements during async delete", e); } } + private final void removeApplications(AuthzSubject subject, Collection applications) { + for (Iterator it = applications.iterator(); it.hasNext(); ) { + try { + getOne()._removeApplicationInNewTran( + subject, (Application)it.next()); + } catch (Exception e) { + log.error("Unable to remove application: " + e, e); + } + } + if (log.isDebugEnabled()) { + log.debug("Removed " + applications.size() + " applications"); + } + } + + private final void removeResourceGroups(AuthzSubject subject, Collection groups) { + for (Iterator it = groups.iterator(); it.hasNext(); ) { + try { + getOne()._removeGroupInNewTran(subject, (ResourceGroup)it.next()); + } catch (Exception e) { + log.error("Unable to remove group: " + e, e); + } + } + if (log.isDebugEnabled()) { + log.debug("Removed " + groups.size() + " resource groups"); + } + } + private final void removePlatforms(AuthzSubject subject, Collection platforms) { for (final Iterator it = platforms.iterator(); it.hasNext(); ) { final Platform platform = (Platform)it.next(); @@ -1569,11 +1613,6 @@ StopWatch watch = new StopWatch(); try { - // Update cache for asynchronous delete process - if (debug) watch.markTimeBegin("buildAsyncDeleteAgentCache"); - buildAsyncDeleteAgentCache(platform); - if (debug) watch.markTimeEnd("buildAsyncDeleteAgentCache"); - // Disable all measurements for this platform. We don't actually // remove the measurements here to avoid delays in deleting // resources. 
@@ -1630,11 +1669,6 @@ StopWatch watch = new StopWatch(); try { - // Update cache for asynchronous delete process - if (debug) watch.markTimeBegin("buildAsyncDeleteAgentCache"); - buildAsyncDeleteAgentCache(server); - if (debug) watch.markTimeEnd("buildAsyncDeleteAgentCache"); - // now remove the measurements if (debug) watch.markTimeBegin("disableMeasurements"); getMetricManager().disableMeasurements( @@ -1675,57 +1709,96 @@ } } - private void buildAsyncDeleteAgentCache(Server server) { - AsyncDeleteAgentCache cache = AsyncDeleteAgentCache.getInstance(); + /** + * @param server The server being deleted + * + * @return {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + private Map buildAsyncDeleteAgentCache(Server server) { + Map cache = new HashMap(); try { Agent agent = findResourceAgent(server.getEntityId()); + List resources = new ArrayList(); for (Iterator i=server.getServices().iterator(); i.hasNext(); ) { Service s = (Service)i.next(); - AppdefEntityID eid = s.getEntityId(); - cache.put(eid, agent.getId()); + resources.add(s.getEntityId()); } - } catch (AgentNotFoundException anfe) { - if (cache.get(server.getEntityId()) == null) { - if (!server.getServerType().isVirtual()) { - log.warn("Unable to build AsyncDeleteAgentCache for server[id=" - + server.getId() - + ", name=" + server.getName() + "]: " - + anfe.getMessage()); - } - } else { - log.debug("Platform has been asynchronously deleted: " - + anfe.getMessage()); - } + cache.put(agent.getId(), resources); } catch (Exception e) { log.warn("Unable to build AsyncDeleteAgentCache for server[id=" + server.getId() + ", name=" + server.getName() + "]: " + e.getMessage()); } + + return cache; } - private void buildAsyncDeleteAgentCache(Platform platform) { - AsyncDeleteAgentCache cache = AsyncDeleteAgentCache.getInstance(); + /** + * @param platform The platform being deleted + * + * @return {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + private Map buildAsyncDeleteAgentCache(Platform platform) { + Map cache = new HashMap(); try { Agent agent = platform.getAgent(); + List resources = new ArrayList(); for (Iterator i=platform.getServers().iterator(); i.hasNext(); ) { Server s = (Server)i.next(); if (!s.getServerType().isVirtual()) { - AppdefEntityID eid = s.getEntityId(); - cache.put(eid, agent.getId()); + resources.add(s.getEntityId()); } - buildAsyncDeleteAgentCache(s); - } + List services = (List) buildAsyncDeleteAgentCache(s).get(agent.getId()); + resources.addAll(services); + } + cache.put(agent.getId(), resources); } catch (Exception e) { log.warn("Unable to build AsyncDeleteAgentCache for platform[id=" + platform.getId() + ", name=" + platform.getName() + "]: " - + e.getMessage()); } + + e.getMessage()); + } + + return cache; } + + /** + * @param zevents {@link List} of {@link ResourcesCleanupZevent} + * + * @return {@link Map} of {@link Integer} of agentIds + * to {@link List} of {@link AppdefEntityID}s + */ + private Map buildAsyncDeleteAgentCache(List zevents) { + Map masterCache = new HashMap(); + + for (Iterator i = zevents.iterator(); i.hasNext();) { + ResourcesCleanupZevent z = (ResourcesCleanupZevent) i.next(); + if (z.getAgents() != null) { + Map cache = z.getAgents(); + + for (Iterator j = cache.keySet().iterator(); j.hasNext(); ) { + Integer agentId = (Integer)j.next(); + List newResources = (List) cache.get(agentId); + List resources = (List) masterCache.get(agentId); + if (resources == null) { + resources = newResources; + 
} else { + resources.addAll(newResources); + } + masterCache.put(agentId, resources); + } + } + } + + return masterCache; + } /** * Used only during the asynchronous delete process. @@ -4085,15 +4158,14 @@ events.add (ResourcesCleanupZevent.class); ZeventManager.getInstance().addBufferedListener(events, new ZeventListener() { - public void processEvents(List events) { - for (Iterator i = events.iterator(); i.hasNext();) { + public void processEvents(List zevents) { + if (zevents != null && !zevents.isEmpty()) { try { - getOne().removeDeletedResources(); + Map agentCache = buildAsyncDeleteAgentCache(zevents); + getOne().removeDeletedResources(agentCache); } catch (Exception e) { log.error("removeDeletedResources() failed", e); - } - // Only need to run this once - break; + } } } Modified: trunk/src/org/hyperic/hq/measurement/server/session/MeasurementManagerEJBImpl.java =================================================================== --- trunk/src/org/hyperic/hq/measurement/server/session/MeasurementManagerEJBImpl.java 2009-12-20 09:15:15 UTC (rev 14108) +++ trunk/src/org/hyperic/hq/measurement/server/session/MeasurementManagerEJBImpl.java 2009-12-21 05:58:21 UTC (rev 14109) @@ -45,6 +45,7 @@ import org.apache.commons.logging.LogFactory; import org.hyperic.hq.appdef.Agent; import org.hyperic.hq.appdef.AppService; +import org.hyperic.hq.appdef.server.session.AgentManagerEJBImpl; import org.hyperic.hq.appdef.server.session.AppdefResource; import org.hyperic.hq.appdef.server.session.Application; import org.hyperic.hq.appdef.server.session.ApplicationManagerEJBImpl; @@ -55,6 +56,7 @@ import org.hyperic.hq.appdef.server.session.ResourceZevent; import org.hyperic.hq.appdef.server.session.Server; import org.hyperic.hq.appdef.server.session.Service; +import org.hyperic.hq.appdef.shared.AgentNotFoundException; import org.hyperic.hq.appdef.shared.AppdefEntityID; import org.hyperic.hq.appdef.shared.AppdefEntityNotFoundException; import org.hyperic.hq.appdef.shared.AppdefEntityValue; @@ -1135,14 +1137,52 @@ */ public void disableMeasurements(AuthzSubject subject, AppdefEntityID agentId, AppdefEntityID[] ids) + throws PermissionException, AgentNotFoundException { + + Agent agent = AgentManagerEJBImpl.getOne().getAgent(agentId); + + disableMeasurements(subject, agent, ids, false); + } + + /** + * Disable all measurements for the given resources. 
+ * + * @param agent The agent for the given resources + * @param ids The list of entitys to unschedule + * @param isAsyncDelete Indicates whether it is for async delete + * @ejb:interface-method + * + * NOTE: This method requires all entity ids to be monitored by the same + * agent as specified by the agent + */ + public void disableMeasurements(AuthzSubject subject, Agent agent, + AppdefEntityID[] ids, boolean isAsyncDelete) throws PermissionException { - + MeasurementDAO dao = getMeasurementDAO(); for (int i = 0; i < ids.length; i++) { checkModifyPermission(subject.getId(), ids[i]); - - List mcol = dao.findEnabledByResource(getResource(ids[i])); + List mcol = null; + Resource res = getResource(ids[i]); + if (isAsyncDelete) { + // For asynchronous deletes, we need to get all measurements + // because some disabled measurements are not unscheduled + // from the agent (like during the maintenance window) and + // we need to unschedule these measurements + mcol = findMeasurements(subject, res); + } else { + mcol = dao.findEnabledByResource(res); + } + if (mcol.isEmpty()) { + if (log.isDebugEnabled()) { + log.debug("No measurements to disable for resource[" + + ids[i] + + "], isAsyncDelete=" + isAsyncDelete); + } + continue; + } + Integer[] mids = new Integer[mcol.size()]; Iterator it = mcol.iterator(); for (int j = 0; it.hasNext(); j++) { @@ -1159,8 +1199,9 @@ // Unscheduling of all metrics for a resource could indicate that // the resource is getting removed. Send the unschedule synchronously // so that all the necessary plumbing is in place. - try { - MeasurementProcessorEJBImpl.getOne().unschedule(agentId, ids); + try { + MeasurementProcessorEJBImpl.getOne().unschedule( + agent.getAgentToken(), ids); } catch (MeasurementUnscheduleException e) { log.error("Unable to disable measurements", e); } Modified: trunk/src/org/hyperic/hq/measurement/server/session/MeasurementProcessorEJBImpl.java =================================================================== --- trunk/src/org/hyperic/hq/measurement/server/session/MeasurementProcessorEJBImpl.java 2009-12-20 09:15:15 UTC (rev 14108) +++ trunk/src/org/hyperic/hq/measurement/server/session/MeasurementProcessorEJBImpl.java 2009-12-21 05:58:21 UTC (rev 14109) @@ -41,7 +41,6 @@ import org.hyperic.hq.agent.AgentRemoteException; import org.hyperic.hq.appdef.Agent; import org.hyperic.hq.appdef.server.session.AgentManagerEJBImpl; -import org.hyperic.hq.appdef.server.session.AsyncDeleteAgentCache; import org.hyperic.hq.appdef.shared.AgentManagerLocal; import org.hyperic.hq.appdef.shared.AgentNotFoundException; import org.hyperic.hq.appdef.shared.AppdefEntityID; @@ -119,33 +118,20 @@ */ private Map getAgentMap(List aeids) { AgentManagerLocal aMan = AgentManagerEJBImpl.getOne(); - AsyncDeleteAgentCache cache = AsyncDeleteAgentCache.getInstance(); Map rtn = new HashMap(aeids.size()); List tmp; for (Iterator it=aeids.iterator(); it.hasNext(); ) { AppdefEntityID eid = (AppdefEntityID)it.next(); - Integer agentId = null; + Integer agentId; try { agentId = aMan.getAgent(eid).getId(); - } catch (AgentNotFoundException e) { - // HHQ-3585: It may be in an asynchronous - // delete state, so check the cache - agentId = cache.get(eid); - - if (agentId == null) { - log.warn(e.getMessage()); - } else { - // No longer needed, remove from cache - cache.remove(eid); - } - } - - if (agentId != null) { if (null == (tmp = (List)rtn.get(agentId))) { tmp = new ArrayList(); rtn.put(agentId, tmp); } tmp.add(eid); + } catch (AgentNotFoundException e) { + log.warn(e.getMessage()); 
} } return rtn; @@ -215,6 +201,12 @@ private void unschedule(Agent a, AppdefEntityID[] entIds) throws MeasurementUnscheduleException, MonitorAgentException { + + if (log.isDebugEnabled()) { + log.debug("unschedule agentId=" + a.getId() + + ", numOfResources=" + entIds.length); + } + SRNManagerLocal srnManager = getSRNManager(); for (int i = 0; i < entIds.length; i++) { try { |