From: <bo...@hy...> - 2009-12-14 21:43:54
|
Author: bob Date: 2009-12-14 13:43:44 -0800 (Mon, 14 Dec 2009) New Revision: 14089 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14089 Modified: branches/HQ_4_1_4_1/etc/version.properties Log: Release 4.1.4.2 build #1079 Modified: branches/HQ_4_1_4_1/etc/version.properties =================================================================== --- branches/HQ_4_1_4_1/etc/version.properties 2009-12-14 08:54:03 UTC (rev 14088) +++ branches/HQ_4_1_4_1/etc/version.properties 2009-12-14 21:43:44 UTC (rev 14089) @@ -1,3 +1,3 @@ -#Fri Dec 11 16:17:46 PST 2009 +#Mon Dec 14 13:13:16 PST 2009 version=4.1.4.2 -build=1078 +build=1079 |
From: SyRenity <sig...@hy...> - 2009-12-14 18:26:22
|
Hi. Any idea on the above? Thanks! |
From: Faisal <tur...@ho...> - 2009-12-14 18:26:20
|
Thanks @excowboy. So in this output, Rx means received and Tx means sent? Output: (ZD1211B)IEEE 802.11 b+g USB Adapter - Packet Scheduler Miniport eth0 Link encap:Ethernet HWaddr 00:02:72:55:88:5F inet addr:192.168.1.100 Bcast:192.168.1.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:0 RX packets:77466 errors:0 dropped:0 overruns:-1 frame:-1 TX packets:764076 errors:0 dropped:0 overruns:-1 carrier:-1 collisions:-1 RX bytes:77973680 ( 74M) TX bytes:1080024132 (1.0G) Over what duration (or from what starting time) are the RX/TX packet and byte counters calculated? And if I want to calculate throughput as a percentage, is the following formula correct? Throughput = (RX bytes/sec + TX bytes/sec) / Link Speed * 100 |
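For reference on the formula in the question above: the RX/TX byte counters in ifconfig-style output are cumulative since the interface came up, so a rate requires two samples taken a known interval apart, and link speed is conventionally in bits per second, so the byte deltas must be multiplied by 8. A minimal arithmetic sketch (the method name and the sample numbers are illustrative, not part of any API):

```java
public class ThroughputDemo {

    /**
     * Utilization as a percentage of link capacity.
     * rxDelta, txDelta: byte-counter differences between two samples,
     * intervalSec:      seconds between the two samples,
     * linkSpeedBits:    link speed in bits per second.
     */
    static double throughputPercent(long rxDelta, long txDelta,
                                    double intervalSec, long linkSpeedBits) {
        // Convert bytes/interval to bits/second before comparing to link speed.
        double bitsPerSec = (rxDelta + txDelta) * 8.0 / intervalSec;
        return bitsPerSec / linkSpeedBits * 100.0;
    }

    public static void main(String[] args) {
        // 1,250,000 bytes each way over 1 s on a 100 Mbit/s link:
        // (2,500,000 bytes * 8) / 100,000,000 bits = 0.2 -> 20%
        System.out.println(throughputPercent(1_250_000L, 1_250_000L, 1.0, 100_000_000L));
    }
}
```

So the posted formula is close but mixes units: with byte rates and a bits-per-second link speed it understates utilization by a factor of 8.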
From: Faisal <tur...@ho...> - 2009-12-14 18:26:20
|
Can anybody tell me how we can show the network utilization (bytes sent, bytes received, throughput, link speed, etc.) of our system through the Sigar API? |
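Sigar does expose these per-interface counters. A sketch, assuming sigar.jar is on the classpath and the matching native Sigar library is on java.library.path (it will not run without them); the calls used are from the org.hyperic.sigar API:

```java
import org.hyperic.sigar.NetInterfaceStat;
import org.hyperic.sigar.Sigar;
import org.hyperic.sigar.SigarException;

public class NetStats {
    public static void main(String[] args) throws SigarException {
        Sigar sigar = new Sigar();
        try {
            // Walk every interface and print its cumulative counters.
            for (String name : sigar.getNetInterfaceList()) {
                NetInterfaceStat stat = sigar.getNetInterfaceStat(name);
                System.out.println(name
                        + " rxBytes=" + stat.getRxBytes()
                        + " txBytes=" + stat.getTxBytes()
                        // Speed is in bits/s; may be reported as -1 if unknown.
                        + " speed=" + stat.getSpeed());
            }
        } finally {
            sigar.close();
        }
    }
}
```

Since these counters are cumulative, a throughput rate comes from sampling them twice and dividing the deltas by the sampling interval.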
From: SyRenity <sig...@hy...> - 2009-12-14 18:26:20
|
Hi. Here is the uname -a output: Linux test.local 2.6.18-128.7.1.el5xen #1 SMP Mon Aug 24 09:14:33 EDT 2009 x86_64 x86_64 x86_64 GNU/Linux And the error is: Exception in thread "main" java.lang.UnsatisfiedLinkError: org.hyperic.sigar.Mem.gather(Lorg/hyperic/sigar/Sigar;)V at org.hyperic.sigar.Mem.gather(Native Method) at org.hyperic.sigar.Mem.fetch(Mem.java:30) at org.hyperic.sigar.Sigar.getMem(Sigar.java:306) at app.helper.SigarTest.main(SigarTest.java:30) Thanks. |
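An UnsatisfiedLinkError on a native Sigar method like Mem.gather usually means the JVM found sigar.jar but not the matching native library for the platform (here, 64-bit Linux, i.e. libsigar-amd64-linux.so). A sketch of the usual fix, with hypothetical paths that must be adjusted to wherever the Sigar native libraries actually live:

```shell
# Hypothetical paths: the directory must contain the .so matching the
# JVM architecture, e.g. libsigar-amd64-linux.so for x86_64 Linux.
java -Djava.library.path=/path/to/sigar/lib \
     -cp /path/to/sigar.jar:. app.helper.SigarTest
```

Alternatively, placing the .so in the same directory as sigar.jar is commonly sufficient, since Sigar tries to locate its native library next to the jar.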
From: Scotty <sig...@hy...> - 2009-12-14 18:26:18
|
Go ahead and supply the following: 1) the output of uname -a, and 2) the exact error snippet you are getting. |
From: <bo...@hy...> - 2009-12-14 08:54:17
|
Author: bob Date: 2009-12-14 00:54:03 -0800 (Mon, 14 Dec 2009) New Revision: 14088 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14088 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1288 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-13 21:09:02 UTC (rev 14087) +++ trunk/etc/version.properties 2009-12-14 08:54:03 UTC (rev 14088) @@ -1,3 +1,3 @@ -#Sun Dec 13 00:15:53 PST 2009 +#Mon Dec 14 00:16:16 PST 2009 version=4.3.0 -build=1287 +build=1288 |
From: <no...@gi...> - 2009-12-14 04:33:07
|
Branch: refs/heads/evolution Home: http://github.com/hyperic/hqapi Commit: 82cccc8df40a9c184c06c623a2b188f0e95aec44 http://github.com/hyperic/hqapi/commit/82cccc8df40a9c184c06c623a2b188f0e95aec44 Author: Jennifer Hickey <jhickey@calculon.local> Date: 2009-12-13 (Sun, 13 Dec 2009) Changed paths: M .classpath Log Message: ----------- HE-251 Updated Eclipse project to depend on new hq-rendit project |
From: <rm...@hy...> - 2009-12-13 21:34:07
|
Author: rmoore Date: 2009-12-13 13:07:53 -0800 (Sun, 13 Dec 2009) New Revision: 14085 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14085 Modified: trunk/src/org/hyperic/hq/product/SNMPDetector.java Log: V3 Project commit Modified: trunk/src/org/hyperic/hq/product/SNMPDetector.java =================================================================== --- trunk/src/org/hyperic/hq/product/SNMPDetector.java 2009-12-13 21:06:58 UTC (rev 14084) +++ trunk/src/org/hyperic/hq/product/SNMPDetector.java 2009-12-13 21:07:53 UTC (rev 14085) @@ -1,4 +1,7 @@ /* + * 'SNMPDetector.java' + * + * * NOTE: This copyright does *not* cover user programs that use HQ * program services by normal system calls through the application * program interfaces provided as part of the Hyperic Plug-in Development @@ -6,7 +9,7 @@ * normal use of the program, and does *not* fall under the heading of * "derived work". * - * Copyright (C) [2004, 2005, 2006], Hyperic, Inc. + * Copyright (C) [2004, 2005, 2006, 2007, 2008, 2009], Hyperic, Inc. * This file is part of HQ. * * HQ is free software; you can redistribute it and/or modify @@ -38,264 +41,317 @@ import org.hyperic.snmp.SNMPSession; import org.hyperic.util.config.ConfigResponse; -/** +/* * Generic SNMP Detector intended for pure-xml plugins * that extend the Network Device platform or servers * with builtin SNMP management such as squid. 
*/ -public class SNMPDetector extends DaemonDetector { +public class SNMPDetector extends DaemonDetector +{ + private static final Log log = LogFactory.getLog ( SNMPDetector.class.getName ( ) ); - private static final Log log = - LogFactory.getLog(SNMPDetector.class.getName()); + static final String SNMP_INDEX_NAME = SNMPMeasurementPlugin.PROP_INDEX_NAME; + static final String SNMP_DESCRIPTION = "snmpDescription"; - static final String SNMP_INDEX_NAME = - SNMPMeasurementPlugin.PROP_INDEX_NAME; - static final String SNMP_DESCRIPTION = "snmpDescription"; + public List getServerResources ( ConfigResponse platformConfig ) throws PluginException + { + String indexName = getTypeProperty ( SNMP_INDEX_NAME ); - public List getServerResources(ConfigResponse platformConfig) - throws PluginException { + if ( indexName != null ) + { + log.debug ( "Looking for servers with " + indexName ); - String indexName = getTypeProperty(SNMP_INDEX_NAME); - if (indexName != null) { - log.debug("Looking for servers with " + indexName); - return discoverServices(platformConfig, - getTypeInfo().getName()); - } + return discoverServices ( platformConfig, getTypeInfo().getName ( ) ); + } - return super.getServerResources(platformConfig); - } + return super.getServerResources ( platformConfig ); + } - protected List discoverServices(ConfigResponse config) - throws PluginException { + protected List discoverServices ( ConfigResponse config ) throws PluginException + { + return discoverServices ( config, null ); + } - return discoverServices(config, null); - } + protected List discoverServices ( ConfigResponse config, + String type ) throws PluginException + { + log.debug ( "discoverServices(" + config + ")" ); - protected List discoverServices(ConfigResponse config, String type) - throws PluginException { + String[] keys = getCustomPropertiesSchema().getOptionNames ( ); - log.debug("discoverServices(" + config + ")"); - String[] keys = getCustomPropertiesSchema().getOptionNames(); - 
ConfigResponse cprops = new ConfigResponse(); - SNMPSession session; - try { - session = new SNMPClient().getSession(config); + ConfigResponse cprops = new ConfigResponse ( ); - //custom properties discovery for the server - for (int i=0; i<keys.length; i++) { - String key = keys[i]; - if (SNMPClient.getOID(key) == null) { - log.debug("Cannot resolve '" + key + "'"); - continue; - } - try { - cprops.setValue(key, - session.getSingleValue(key).toString()); - } catch (SNMPException e) { - log.warn("Error getting '" + key + "': " + - e.getMessage()); - } + SNMPSession session; + + try + { + session = new SNMPClient().getSession ( config ); + + // Custom properties discovery for the server... + for ( int i = 0; i < keys.length; i++ ) + { + String key = keys[i]; + + if ( SNMPClient.getOID ( key ) == null ) + { + log.debug ( "Cannot resolve '" + key + "'" ); + + continue; } - setCustomProperties(cprops); - if (type == null) { - //discover services for existings server - return discoverServices(this, config, session); + try + { + cprops.setValue ( key, session.getSingleValue(key).toString ( ) ); } - else { - //discover SNMP services as server types - return discoverServers(this, config, session, type); + catch ( SNMPException e ) + { + log.warn ( "Error getting '" + key + "': " + e.getMessage ( ) ); } - } catch (SNMPException e) { - String msg = - "Error discovering services for " + getTypeInfo() + - ": " + e; - log.error(msg, e); - return null; - } finally { - session = null; - } - } + } - public static List discoverServices(ServerDetector plugin, - ConfigResponse parentConfig, - SNMPSession session) - throws PluginException { + setCustomProperties ( cprops ); - List services = new ArrayList(); + if ( type == null ) + { + // Discover services for existings server... + return discoverServices ( this, config, session ); + } + else + { + // Discover SNMP services as server types... 
+ return discoverServers ( this, config, session, type ); + } + } + catch ( SNMPException e ) + { + String msg = "Error discovering services for " + getTypeInfo ( ) + ": " + e; - Map servicePlugins = plugin.getServiceInventoryPlugins(); - if (servicePlugins == null) { - return services; - } + log.error ( msg, e ); - for (Iterator it=servicePlugins.entrySet().iterator(); it.hasNext();) { - Map.Entry entry = (Map.Entry)it.next(); - String type = (String)entry.getKey(); - //String name = (String)entry.getValue(); - services.addAll(discoverServices(plugin, parentConfig, session, type)); + return null; + } + finally + { + session = null; + } + } + + public static List discoverServices ( ServerDetector plugin, + ConfigResponse parentConfig, + SNMPSession session ) throws PluginException + { + List services = new ArrayList ( ); + + Map servicePlugins = plugin.getServiceInventoryPlugins ( ); + + if ( servicePlugins == null ) + { + return services; + } + + for ( Iterator it = servicePlugins.entrySet().iterator ( ); it.hasNext ( ); ) + { + Map.Entry entry = (Map.Entry)it.next ( ); + + String type = (String)entry.getKey ( ); + + // String name = (String)entry.getValue ( ); + services.addAll ( discoverServices ( plugin, parentConfig, session, type ) ); } - return services; - } + return services; + } - public static List discoverServices(ServerDetector plugin, + public static List discoverServices ( ServerDetector plugin, + ConfigResponse parentConfig, + SNMPSession session, + String type ) throws PluginException + { + return discoverServices ( plugin, parentConfig, session, type, true ); + } + + public static List discoverServers ( ServerDetector plugin, ConfigResponse parentConfig, - SNMPSession session, - String type) - throws PluginException { + SNMPSession session, + String type ) throws PluginException + { + return discoverServices ( plugin, parentConfig, session, type, false ); + } + + private static List discoverServices ( ServerDetector plugin, + ConfigResponse 
parentConfig, + SNMPSession session, + String type, + boolean isServiceDiscovery) throws PluginException + { + List services = new ArrayList(); - return discoverServices(plugin, parentConfig, session, type, true); - } + String typeName = plugin.getTypeNameProperty ( type ); + String indexName = plugin.getTypeProperty ( type, SNMP_INDEX_NAME ); + String descrName = plugin.getTypeProperty ( type, SNMP_DESCRIPTION ); - public static List discoverServers(ServerDetector plugin, - ConfigResponse parentConfig, - SNMPSession session, - String type) - throws PluginException { + if ( indexName == null ) + { + String msg = "No " + SNMP_INDEX_NAME + " defined for service autoinventory of " + type; - return discoverServices(plugin, parentConfig, session, type, false); - } + log.error ( msg ); + + return services; + } - private static List discoverServices(ServerDetector plugin, - ConfigResponse parentConfig, - SNMPSession session, - String type, - boolean isServiceDiscovery) - throws PluginException { + List column; - List services = new ArrayList(); + try + { + column = session.getColumn ( indexName ); + } + catch ( SNMPException e ) + { + String msg = "Error getting " + SNMP_INDEX_NAME + "=" + indexName + ": " + e; - String typeName = plugin.getTypeNameProperty(type); - String indexName = plugin.getTypeProperty(type, SNMP_INDEX_NAME); - String descrName = plugin.getTypeProperty(type, SNMP_DESCRIPTION); + log.error ( msg ); - if (indexName == null) { - String msg = - "No " + SNMP_INDEX_NAME + - " defined for service autoinventory of " + type; - log.error(msg); - return services; - } + return services; + } + + log.debug ( "Found " + column.size() + " " + type + " services using " + indexName ); - List column; - try { - column = session.getColumn(indexName); - } catch (SNMPException e) { - String msg = - "Error getting " + SNMP_INDEX_NAME + "=" + indexName + - ": " + e; - log.error(msg); - return services; - } + boolean hasDescriptions = false; - log.debug("Found " + 
column.size() + " " + type + - " services using " + indexName); - - boolean hasDescriptions = false; - List descriptions = null; + List descriptions = null; - if (descrName != null) { - try { - descriptions = session.getColumn(descrName); - } catch (SNMPException e) { - String msg = - "Error getting " + SNMP_DESCRIPTION + "=" + descrName + - ": " + e; - log.warn(msg); - } - if ((descriptions != null) && (descriptions.size() == column.size())) { - hasDescriptions = true; - } - } + if ( descrName != null ) + { + try + { + descriptions = session.getColumn ( descrName ); + } + catch ( SNMPException e ) + { + String msg = "Error getting " + SNMP_DESCRIPTION + "=" + descrName + ": " + e; + + log.warn(msg); + } + + if ( ( descriptions != null ) && ( descriptions.size ( ) == column.size ( ) ) ) + { + hasDescriptions = true; + } + } - String[] keys = - plugin.getCustomPropertiesSchema(type).getOptionNames(); - HashMap cpropColumns = new HashMap(); + String[] keys = plugin.getCustomPropertiesSchema(type).getOptionNames ( ); + + HashMap cpropColumns = new HashMap ( ); - for (int i=0; i<keys.length; i++) { - String key = keys[i]; - try { - cpropColumns.put(key, session.getColumn(key)); - } catch (SNMPException e) { - log.warn("Error getting '" + key + "': " + - e.getMessage()); - } - } + for ( int i = 0; i < keys.length; i++ ) + { + String key = keys[i]; + + try + { + cpropColumns.put ( key, session.getColumn ( key ) ); + } + catch ( SNMPException e ) + { + log.warn ( "Error getting '" + key + "': " + e.getMessage ( ) ); + } + } - for (int i=0; i<column.size(); i++) { - ConfigResponse config = new ConfigResponse(); - ConfigResponse cprops = new ConfigResponse(); - String indexValue = column.get(i).toString().trim(); - String resourceDescr = null; + for ( int i = 0; i < column.size ( ); i++ ) + { + ConfigResponse config = new ConfigResponse ( ); + ConfigResponse cprops = new ConfigResponse ( ); - config.setValue(SNMPMeasurementPlugin.PROP_INDEX_VALUE, - indexValue); + String 
indexValue = column.get(i).toString().trim ( ); + String resourceDescr = null; - for (int j=0; j<keys.length; j++) { - String key = keys[j]; - List data = (List)cpropColumns.get(key); - if ((data == null) || data.isEmpty()) { - continue; - } - String val = data.get(i).toString().trim(); - cprops.setValue(key, val); + config.setValue ( SNMPMeasurementPlugin.PROP_INDEX_VALUE, indexValue ); + + for ( int j = 0; j < keys.length; j++ ) + { + String key = keys[j]; + + List data = (List)cpropColumns.get(key); + + if ( ( data == null ) || data.isEmpty ( ) ) + { + continue; } - if (hasDescriptions) { - resourceDescr = descriptions.get(i).toString(); + + String val = data.get(i).toString().trim ( ); + + cprops.setValue ( key, val ); + } + + if ( hasDescriptions ) + { + resourceDescr = descriptions.get(i).toString ( ); + } + + String resourceName = typeName + " " + indexValue; + String autoName = plugin.formatAutoInventoryName ( type, parentConfig, config, cprops ); + + if ( isServiceDiscovery ) + { + ServiceResource service = new ServiceResource ( ); + + service.setType(type); + + if ( autoName == null ) + { + service.setServiceName ( resourceName ); } + else + { + service.setName ( autoName ); + } - String resourceName = typeName + " " + indexValue; - String autoName = - plugin.formatAutoInventoryName(type, - parentConfig, - config, - cprops); + service.setProductConfig ( config ); - if (isServiceDiscovery) { - ServiceResource service = new ServiceResource(); - service.setType(type); - if (autoName == null) { - service.setServiceName(resourceName); - } - else { - service.setName(autoName); - } + // Required to auto-enable metric... 
+ service.setMeasurementConfig ( ); + service.setCustomProperties ( cprops ); - service.setProductConfig(config); - //required to auto-enable metric - service.setMeasurementConfig(); - service.setCustomProperties(cprops); - if (resourceDescr != null) { - service.setDescription(resourceDescr); - } + if ( resourceDescr != null ) + { + service.setDescription ( resourceDescr ); + } - services.add(service); + services.add ( service ); + } + else + { + ServerResource server = new ServerResource ( ); + + server.setType ( type ); + + if ( autoName == null ) + { + server.setName ( getPlatformName ( ) + " " + resourceName ); } - else { - ServerResource server = new ServerResource(); - server.setType(type); - if (autoName == null) { - server.setName(getPlatformName() + " " + resourceName); - } - else { - server.setName(autoName); - } - server.setInstallPath("/"); //XXX - server.setIdentifier(server.getName()); + else + { + server.setName ( autoName ); + } - server.setProductConfig(config); - //required to auto-enable metric - server.setMeasurementConfig(); - server.setCustomProperties(cprops); - if (resourceDescr != null) { - server.setDescription(resourceDescr); - } + server.setInstallPath ( "/" ); + server.setIdentifier ( server.getName ( ) ); - services.add(server); + server.setProductConfig ( config ); + + // Required to auto-enable metric... + server.setMeasurementConfig ( ); + server.setCustomProperties ( cprops ); + + if ( resourceDescr != null ) + { + server.setDescription ( resourceDescr ); } - } + + services.add ( server ); + } + } - return services; - } + return services; + } } |
From: <rm...@hy...> - 2009-12-13 21:34:07
|
Author: rmoore Date: 2009-12-13 13:08:09 -0800 (Sun, 13 Dec 2009) New Revision: 14086 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14086 Modified: trunk/src/org/hyperic/hq/product/SNMPMeasurementPlugin.java Log: V3 Project commit Modified: trunk/src/org/hyperic/hq/product/SNMPMeasurementPlugin.java =================================================================== --- trunk/src/org/hyperic/hq/product/SNMPMeasurementPlugin.java 2009-12-13 21:07:53 UTC (rev 14085) +++ trunk/src/org/hyperic/hq/product/SNMPMeasurementPlugin.java 2009-12-13 21:08:09 UTC (rev 14086) @@ -1,4 +1,7 @@ /* + * 'SNMPMeasurementPlugin.java' + * + * * NOTE: This copyright does *not* cover user programs that use HQ * program services by normal system calls through the application * program interfaces provided as part of the Hyperic Plug-in Development @@ -6,7 +9,7 @@ * normal use of the program, and does *not* fall under the heading of * "derived work". * - * Copyright (C) [2004, 2005, 2006], Hyperic, Inc. + * Copyright (C) [2004, 2005, 2006, 2007, 2008, 2009], Hyperic, Inc. * This file is part of HQ. 
* * HQ is free software; you can redistribute it and/or modify @@ -44,503 +47,651 @@ import org.hyperic.util.StringUtil; import org.hyperic.util.timer.StopWatch; -public class SNMPMeasurementPlugin - extends MeasurementPlugin { +public class SNMPMeasurementPlugin extends MeasurementPlugin +{ + public static final String DOMAIN = "snmp"; + public static final String PROP_INDEX_NAME = "snmpIndexName"; + public static final String PROP_INDEX_VALUE = "snmpIndexValue"; + private static final String PROP_OID = "snmpOID"; + private static final String PROP_VARTYPE = "snmpVarType"; - public static final String DOMAIN = "snmp"; + private static final int VARTYPE_SINGLE = 0; + private static final int VARTYPE_NEXT = 1; + private static final int VARTYPE_COLUMN = 2; + private static final int VARTYPE_INDEX = 3; + private static final int VARTYPE_OID = 4; - public static final String PROP_INDEX_NAME = "snmpIndexName"; - public static final String PROP_INDEX_VALUE = "snmpIndexValue"; + private static long ixTimestamp = 0; + private static long ixExpire = ( 60 * 1000 ) * 60; // 1 hour - private static final String PROP_OID = "snmpOID"; - private static final String PROP_VARTYPE = "snmpVarType"; + private static HashMap VARTYPES = new HashMap ( ); - private static final int VARTYPE_SINGLE = 0; - private static final int VARTYPE_NEXT = 1; - private static final int VARTYPE_COLUMN = 2; - private static final int VARTYPE_INDEX = 3; - private static final int VARTYPE_OID = 4; + private SNMPClient client = new SNMPClient ( ); - private static HashMap VARTYPES = new HashMap(); - private SNMPClient client = new SNMPClient(); - private static Map ixCache = new HashMap(); - private static Object ixLock = new Object(); - private static long ixTimestamp = 0; - private static long ixExpire = (60 * 1000) * 60; //1 hour + private static Map ixCache = new HashMap ( ); - static { - VARTYPES.put("single", new Integer(VARTYPE_SINGLE)); - VARTYPES.put("column", new Integer(VARTYPE_COLUMN)); - 
VARTYPES.put("index", new Integer(VARTYPE_INDEX)); - VARTYPES.put("oid", new Integer(VARTYPE_OID)); - VARTYPES.put("next", new Integer(VARTYPE_NEXT)); - } + private static Object ixLock = new Object ( ); + + static + { + VARTYPES.put ( "single", new Integer ( VARTYPE_SINGLE ) ); + VARTYPES.put ( "column", new Integer ( VARTYPE_COLUMN ) ); + VARTYPES.put ( "index", new Integer ( VARTYPE_INDEX ) ); + VARTYPES.put ( "oid", new Integer ( VARTYPE_OID ) ); + VARTYPES.put ( "next", new Integer ( VARTYPE_NEXT ) ); + } + + private Log log; - private Log log; - - /** - * @return The MIB names that should be loaded for this plugin. - */ - protected String[] getMIBs() { - String prop = getPluginProperty("MIBS"); - if (prop == null) { - return new String[0]; - } - else { - List mibs = StringUtil.explode(prop, ","); - return (String[])mibs.toArray(new String[0]); - } - } + /* + * @return The MIB names that should be loaded for this plugin. + */ + protected String[] getMIBs ( ) + { + String prop = getPluginProperty ( "MIBS" ); - private static int convertVarType(Properties props) { - String var = props.getProperty(PROP_VARTYPE); - if (var == null) { - if (props.getProperty(PROP_INDEX_NAME) != null) { - var = "index"; - } - else if (props.getProperty(PROP_OID) != null) { - var = "oid"; - } - else { - var = "single"; //default - } - } - Integer type = (Integer)VARTYPES.get(var); - if (type == null) { - String msg = - "Unsupported " + PROP_VARTYPE + ": '" + var + "'"; - throw new IllegalArgumentException(msg); - } - return type.intValue(); - } + if ( prop == null ) + { + return new String[0]; + } + else + { + List mibs = StringUtil.explode ( prop, "," ); - /** - * @see org.hyperic.hq.product.GenericPlugin#init - */ - public void init(PluginManager manager) - throws PluginException { + return (String[])mibs.toArray ( new String[0] ); + } + } - super.init(manager); - this.log = getLog(); + private static int convertVarType ( Properties props ) + { + String var = props.getProperty ( 
PROP_VARTYPE ); + + if ( var == null ) + { + if ( props.getProperty ( PROP_INDEX_NAME ) != null ) + { + var = "index"; + } + else if ( props.getProperty ( PROP_OID ) != null ) + { + var = "oid"; + } + else + { + var = "single"; // Default + } + } + + Integer type = (Integer)VARTYPES.get ( var ); + + if ( type == null ) + { + String msg = "Unsupported " + PROP_VARTYPE + ": '" + var + "'"; + + throw new IllegalArgumentException ( msg ); + } + + return type.intValue ( ); + } + + /* + * @see org.hyperic.hq.product.GenericPlugin#init + */ + public void init ( PluginManager manager ) throws PluginException + { + super.init ( manager ); + + this.log = getLog ( ); - String prop = "snmp.indexCacheExpire"; + String prop = "snmp.indexCacheExpire"; - String expire = - manager.getProperty(prop); + String expire = manager.getProperty ( prop ); - if (expire != null) { - ixExpire = Integer.parseInt(expire) * 1000; - } + if ( expire != null ) + { + ixExpire = Integer.parseInt ( expire ) * 1000; + } - final String pdkDir = ProductPluginManager.getPdkDir(); + final String pdkDir = ProductPluginManager.getPdkDir ( ); - if (pdkDir == null) { - return; //dont load MIBs in the server - } + if ( pdkDir == null ) + { + return; // Don't load MIBs in the server... + } - MIBTree.setMibDir(pdkDir + "/mibs"); + MIBTree.setMibDir ( pdkDir + "/mibs" ); - try { - if (this.client.init(manager.getProperties())) { - if (this.data != null) { //null in the case of proxies - String jar = this.data.getFile(); - String[] mibs = getMIBs(); + try + { + if ( this.client.init ( manager.getProperties ( ) ) ) + { + if ( this.data != null ) // 'null' in the case of proxies + { + String jar = this.data.getFile ( ); + String[] mibs = getMIBs ( ); - if (jar.endsWith(".xml")) { - //MIB files on local disk - this.client.addMIBs(mibs); - } - else { - //MIB files embedded within the jar - this.client.addMIBs(jar, mibs); - } - } + if ( jar.endsWith ( ".xml" ) ) + { + // MIB files on local disk... 
+ this.client.addMIBs ( mibs ); + } + else + { + // MIB files embedded within the jar... + this.client.addMIBs ( jar, mibs ); + } } - } catch (SNMPException e) { - throw new PluginException(e.getMessage(), e); - } - } + } + } + catch ( SNMPException e ) + { + throw new PluginException ( e.getMessage ( ), e ); + } + } - private double getDoubleValue(SNMPValue snmpValue) - throws PluginException { + private double getDoubleValue ( SNMPValue snmpValue ) throws PluginException + { + final String invalidType = "SNMP query returned a string which could not be handled: "; - final String invalidType = - "SNMP query returned a string which could not be handled: "; + // If toLong or toFloat throw an exception only if + // SNMPException.E_VARIABLE_IS_NOT_NUMERIC + // we swallow those exceptions because we've already checked the type + switch ( snmpValue.getType ( ) ) + { + case SNMPValue.TYPE_LONG: - //if toLong or toFloat throw an exception only if - //SNMPException.E_VARIABLE_IS_NOT_NUMERIC - //we swallow those exceptions because we've already checked the type - switch (snmpValue.getType()) { - case SNMPValue.TYPE_LONG: - case SNMPValue.TYPE_LONG_CONVERTABLE: - try { - return (double)snmpValue.toLong(); - } catch (SNMPException e) {} + case SNMPValue.TYPE_LONG_CONVERTABLE: - case SNMPValue.TYPE_STRING: - String value = snmpValue.toString(); + try + { + return (double)snmpValue.toLong ( ); + } + catch ( SNMPException e ) + { } - //e.g. iplanet iwsInstanceLoad1MinuteAverage - //on anything but solaris - if ("".equals(value)) { - this.log.debug("string value is empty, returning -1"); - return -1; + case SNMPValue.TYPE_STRING: + + String value = snmpValue.toString ( ); + + // e.g. iplanet iwsInstanceLoad1MinuteAverage + // on anything but solaris... + if ( "".equals ( value ) ) + { + this.log.debug ( "string value is empty, returning -1" ); + + return -1; } - //in general we should not be dealing with strings at all. 
- //however, iplanet for example stores cpu usage as a string. - //in which case we can convert to double. - try { - double val = Double.parseDouble(value); - this.log.debug("converted using Double.parseDouble"); - return val; - } catch (NumberFormatException e) {} + // In general, we should not be dealing with strings at all. + // However, iPlanet for example stores cpu usage as a string. + // In which case we can convert to double... + try + { + double val = Double.parseDouble ( value ); - //snmpValue.toLong() when value is TYPE_STRING - //converts to a date stamp, which is our last attempt. - try { - double val = (double)snmpValue.toLong(); - this.log.debug("converted using snmpValue.toLong"); - return val; - } catch (SNMPException e) {} + this.log.debug ( "converted using Double.parseDouble" ); - default: - throw new PluginException(invalidType + snmpValue.toString()); - } - } + return val; + } + catch ( NumberFormatException e ) + { } - private static boolean listIsEmpty(List list) { - if (list == null) { - return true; - } - return list.isEmpty(); - } + // snmpValue.toLong() when value is TYPE_STRING + // converts to a date stamp, which is our last attempt... + try + { + double val = (double)snmpValue.toLong ( ); - //XXX unless we change snmplib to throw a more informative message - //SNMPSession_v1.getNextValue throws SNMPException with the - //message "SNMPException #101" - //only seems to happen if it cannot talk to the snmp agent. 
- //otherwise getColumn and getNextValue just return a null List - private MetricUnreachableException snmpConnectException(Metric metric, - SNMPException e) { - Properties props = metric.getObjectProperties(); - String cfg = - props.getProperty(SNMPClient.PROP_IP) + - ":" + - props.getProperty(SNMPClient.PROP_PORT) + - " " + - props.getProperty(SNMPClient.PROP_VERSION) + "," + - props.getProperty(SNMPClient.PROP_COMMUNITY); - String msg = "Unable to connect to SNMP Agent (" + cfg + ")"; - return new MetricUnreachableException(msg, e); - } + this.log.debug ( "converted using snmpValue.toLong" ); - //e.g. ifOperStatus == 0 == AVAIL_DOWN - private boolean isAvail(Metric metric) { - return "true".equals(metric.getObjectProperty("Avail")); - } + return val; + } + catch ( SNMPException e ) + { } - /** - * @see org.hyperic.hq.product.MeasurementPlugin#getValue - */ - public MetricValue getValue (Metric metric) - throws MetricUnreachableException, - MetricNotFoundException, - PluginException { + default: - boolean isDebug = this.log.isDebugEnabled(); - SNMPSession session = getSession(metric); - if (session == null) { - throw new PluginException("SNMPSession was null!"); - } - Properties props = metric.getProperties(); + throw new PluginException ( invalidType + snmpValue.toString ( ) ); + } + } - double value = 0; - String varName = metric.getAttributeName(); + private static boolean listIsEmpty ( List list ) + { + if ( list == null ) + { + return true; + } - if ((varName == null) || - (varName.length() == 0) || - varName.equals("%oid%")) - { - //special case for optional netservices.SNMP.OID Value metric - return MetricValue.NONE; - } + return list.isEmpty(); + } - String varOID = getPluginProperty(varName); - if (varOID != null) { - if (isDebug) { - log.debug(getName() + " defined " + - varName + " to " + varOID); - } - varName = varOID; - } - int varType = convertVarType(props); - List columnOfValues; - int size; - StopWatch timer = null; + // Unless we change snmplib 
to throw a more informative message + // SNMPSession_v1.getNextValue throws SNMPException with the + // message "SNMPException #101" + // only seems to happen if it cannot talk to the snmp agent. + // Otherwise getColumn and getNextValue just return a null List... + private MetricUnreachableException snmpConnectException ( Metric metric, + SNMPException e ) + { + Properties props = metric.getObjectProperties ( ); - if (isDebug) { - timer = new StopWatch(); - } + String cfg = props.getProperty ( SNMPClient.PROP_IP) + + ":" + + props.getProperty(SNMPClient.PROP_PORT) + + " " + + props.getProperty(SNMPClient.PROP_VERSION) + + "," + + props.getProperty(SNMPClient.PROP_COMMUNITY); - if ((varType == VARTYPE_SINGLE) || (varType == VARTYPE_NEXT)) { - SNMPValue snmpValue; - try { - if (varType == VARTYPE_SINGLE) { - snmpValue = session.getSingleValue(varName); - } - else { - snmpValue = session.getNextValue(varName); - } - } catch (MIBLookupException e) { - throw new MetricInvalidException(e.getMessage()); - } catch (SNMPException e) { - if (isAvail(metric)) { - return new MetricValue(Metric.AVAIL_DOWN); - } - else { - throw snmpConnectException(metric, e); - } - } finally { - if (timer != null) { - this.log.debug("getValue took: " + timer); - } + String msg = "Unable to connect to SNMP Agent (" + cfg + ")"; + + return new MetricUnreachableException ( msg, e ); + } + + // e.g. ifOperStatus == 0 == AVAIL_DOWN... + private boolean isAvail ( Metric metric ) + { + return "true".equals ( metric.getObjectProperty ( "Avail" ) ); + } + + /* + * @see org.hyperic.hq.product.MeasurementPlugin#getValue + */ + public MetricValue getValue ( Metric metric ) throws MetricUnreachableException, + MetricNotFoundException, + PluginException + { + boolean isDebug = this.log.isDebugEnabled ( ); + + SNMPSession session = getSession ( metric ); + + if ( session == null ) + { + throw new PluginException ( "SNMPSession was null!" 
); + } + + Properties props = metric.getProperties ( ); + + double value = 0; + + String varName = metric.getAttributeName ( ); + + if ( ( varName == null ) || ( varName.length ( ) == 0 ) || varName.equals ( "%oid%" ) ) + { + // Special case for optional netservices.SNMP.OID Value metric... + return MetricValue.NONE; + } + + String varOID = getPluginProperty ( varName ); + + if ( varOID != null ) + { + if ( isDebug ) + { + log.debug ( getName ( ) + " defined " + varName + " to " + varOID ); + } + + varName = varOID; + } + + int varType = convertVarType ( props ); + int size; + + List columnOfValues; + + StopWatch timer = null; + + if ( isDebug ) + { + timer = new StopWatch ( ); + } + + if ( ( varType == VARTYPE_SINGLE ) || ( varType == VARTYPE_NEXT ) ) + { + SNMPValue snmpValue; + + try + { + if ( varType == VARTYPE_SINGLE ) + { + snmpValue = session.getSingleValue ( varName ); } - value = getDoubleValue(snmpValue); - } - else { - try { - switch (varType) { - case VARTYPE_INDEX: - columnOfValues = session.getBulk(varName); + else + { + snmpValue = session.getNextValue ( varName ); + } + } + catch ( MIBLookupException e ) + { + throw new MetricInvalidException ( e.getMessage ( ) ); + } + catch ( SNMPException e ) + { + if (isAvail ( metric ) ) + { + return new MetricValue ( Metric.AVAIL_DOWN ); + } + else + { + throw snmpConnectException ( metric, e ); + } + } + finally + { + if ( timer != null ) + { + this.log.debug ( "getValue took: " + timer ); + } + } + + value = getDoubleValue ( snmpValue ); + } + else + { + try + { + switch ( varType ) + { + case VARTYPE_INDEX: + + columnOfValues = session.getBulk ( varName ); - if (listIsEmpty(columnOfValues)) { - String msg = "Column data not found: " + varName; - throw new MetricNotFoundException(msg); - } + if ( listIsEmpty ( columnOfValues ) ) + { + String msg = "Column data not found: " + varName; - size = columnOfValues.size(); + throw new MetricNotFoundException ( msg ); + } - int index = getIndex(props, session); - if 
(index >= columnOfValues.size()) { - String ix = - props.getProperty(PROP_INDEX_NAME); - String val = - props.getProperty(PROP_INDEX_VALUE); - String msg = "No value found for SNMP index: " + - ix + "." + val; - throw new MetricNotFoundException(msg); - } - value = getDoubleValue((SNMPValue)columnOfValues.get(index)); - break; - case VARTYPE_OID: - int idx = -1; - if (props.getProperty(PROP_INDEX_NAME) != null) { - //lookup index if given and include in the oid match - idx = getIndex(props, session); - } - String oid = props.getProperty(PROP_OID); - boolean found = false; + size = columnOfValues.size ( ); - SNMPValue snmpValue = session.getTableValue(varName, idx+1, oid); - if (snmpValue != null) { - value = getDoubleValue(snmpValue); - found = true; - } + int index = getIndex ( props, session ); - if (!found) { - String msg = "OID not found: " + oid; - throw new MetricNotFoundException(msg); - } - break; - case VARTYPE_COLUMN: - columnOfValues = session.getBulk(varName); + if ( index >= columnOfValues.size ( ) ) + { + String ix = props.getProperty ( PROP_INDEX_NAME ); + String val = props.getProperty ( PROP_INDEX_VALUE ); + String msg = "No value found for SNMP index: " + ix + "." + val; + + throw new MetricNotFoundException ( msg ); + } + + value = getDoubleValue ( (SNMPValue)columnOfValues.get ( index ) ); + + break; + + case VARTYPE_OID: + + int idx = -1; + + if ( props.getProperty ( PROP_INDEX_NAME ) != null ) + { + // Lookup index if given and include in the oid match... 
+ idx = getIndex ( props, session ); + } + + String oid = props.getProperty ( PROP_OID ); + + boolean found = false; + + SNMPValue snmpValue = session.getTableValue ( varName, idx+1, oid ); + + if ( snmpValue != null ) + { + value = getDoubleValue ( snmpValue ); + found = true; + } + + if ( !found ) + { + String msg = "OID not found: " + oid; + + throw new MetricNotFoundException ( msg ); + } + + break; + + case VARTYPE_COLUMN: + + columnOfValues = session.getBulk ( varName ); - if (listIsEmpty(columnOfValues)) { - String msg = "Column data not found: " + varName; - throw new MetricNotFoundException(msg); - } + if ( listIsEmpty ( columnOfValues ) ) + { + String msg = "Column data not found: " + varName; - size = columnOfValues.size(); + throw new MetricNotFoundException ( msg ); + } - //XXX should support OID matching here too - for (int i=0; i<size; i++) { - value += getDoubleValue((SNMPValue)columnOfValues.get(i)); - } - default: - throw new MetricNotFoundException("Invalid vartype"); - } - } catch (SNMPException e) { - if (isAvail(metric)) { - return new MetricValue(Metric.AVAIL_DOWN); - } - else { - throw snmpConnectException(metric, e); - } - } finally { - if (timer != null) { - this.log.debug("getValue took: " + timer + - " (type=" + varType + ")"); - } - } - } + size = columnOfValues.size ( ); - if (isAvail(metric)) { - if (value <= 0) { - value = Metric.AVAIL_DOWN; + // Should support OID matching here too... 
+ for ( int i = 0; i < size; i++ ) + { + value += getDoubleValue ( (SNMPValue)columnOfValues.get ( i ) ); + } + + default: + + throw new MetricNotFoundException ( "Invalid vartype" ); } - else { - value = Metric.AVAIL_UP; + } + catch ( SNMPException e ) + { + if ( isAvail ( metric ) ) + { + return new MetricValue ( Metric.AVAIL_DOWN ); } - } + else + { + throw snmpConnectException ( metric, e ); + } + } + finally + { + if ( timer != null ) + { + this.log.debug ( "getValue took: " + timer + " (type=" + varType + ")" ); + } + } + } - return new MetricValue(value); - } + if ( isAvail ( metric ) ) + { + if ( value <= 0 ) + { + value = Metric.AVAIL_DOWN; + } + else + { + value = Metric.AVAIL_UP; + } + } - private int getIndex(Properties props, SNMPSession session) - throws MetricUnreachableException, - MetricNotFoundException { + return new MetricValue(value); + } - String indexName = props.getProperty(PROP_INDEX_NAME); - String indexValue = props.getProperty(PROP_INDEX_VALUE); + private int getIndex ( Properties props, + SNMPSession session ) throws MetricUnreachableException, + MetricNotFoundException + { + String indexName = props.getProperty(PROP_INDEX_NAME); + String indexValue = props.getProperty(PROP_INDEX_VALUE); - if (indexName == null) { - throw new MetricInvalidException("missing indexName"); - } - if (indexValue == null) { - throw new MetricInvalidException("missing indexValue"); - } + if ( indexName == null ) + { + throw new MetricInvalidException ( "missing indexName" ); + } - synchronized (ixLock) { - return getIndex(indexName, indexValue, session); - } - } + if ( indexValue == null ) + { + throw new MetricInvalidException ( "missing indexValue" ); + } - private int getIndex(String indexName, String indexValue, - SNMPSession session) - throws MetricUnreachableException, - MetricNotFoundException { + synchronized ( ixLock ) + { + return getIndex ( indexName, indexValue, session ); + } + } - long timeNow = System.currentTimeMillis(); + private int getIndex 
( String indexName, + String indexValue, + SNMPSession session ) throws MetricUnreachableException, + MetricNotFoundException + { + long timeNow = System.currentTimeMillis ( ); - Integer ix; - boolean expired = false; + Integer ix; - if ((timeNow - ixTimestamp) > ixExpire) { - if (ixTimestamp == 0) { - this.log.debug("initializing index cache"); - } - else { - this.log.debug("clearing index cache"); - } + boolean expired = false; - ixCache.clear(); - ixTimestamp = timeNow; - expired = true; - } - else { - if ((ix = (Integer)ixCache.get(indexValue)) != null) { - return ix.intValue(); - } - } + if ( ( timeNow - ixTimestamp ) > ixExpire ) + { + if ( ixTimestamp == 0 ) + { + this.log.debug ( "initializing index cache" ); + } + else + { + this.log.debug ( "clearing index cache" ); + } - //for multiple indices we iterate through indexNames and - //combine the values which we later attempt to match against - //indexValue - List indexNames = StringUtil.explode(indexName, "->"); - ArrayList data = new ArrayList(); + ixCache.clear(); - //XXX this can be optimized, esp. if indexNames.size() == 1 - for (int i=0; i<indexNames.size(); i++) { - String name = (String)indexNames.get(i); + ixTimestamp = timeNow; - List values = null; + expired = true; + } + else + { + if ( ( ix = (Integer)ixCache.get ( indexValue ) ) != null ) + { + return ix.intValue ( ); + } + } - try { - values = session.getBulk(name); + // For multiple indices we iterate through indexNames and + // combine the values which we later attempt to match against + // indexValue... 
+ List indexNames = StringUtil.explode ( indexName, "->" ); - for (int j=0; j<values.size(); j++) { - String value = values.get(j).toString(); - StringBuffer buf = null; + ArrayList data = new ArrayList ( ); - if (data.size()-1 >= j) { - buf = (StringBuffer)data.get(j); - buf.append("->").append(value); - } - else { - buf = new StringBuffer(value); - data.add(buf); - } - } - } catch (SNMPException e) { - throw new MetricInvalidException(e); + // This can be optimized, esp. if indexNames.size() == 1... + for ( int i = 0; i < indexNames.size ( ); i++ ) + { + String name = (String)indexNames.get ( i ); + + List values = null; + + try + { + values = session.getBulk ( name ); + + for ( int j = 0; j < values.size ( ); j++ ) + { + String value = values.get(j).toString ( ); + + StringBuffer buf = null; + + if ( data.size ( ) - 1 >= j ) + { + buf = (StringBuffer)data.get ( j ); + + buf.append("->").append ( value ); + } + else + { + buf = new StringBuffer ( value ); + + data.add ( buf ); + } } - } + } + catch ( SNMPException e ) + { + throw new MetricInvalidException ( e ); + } + } - //we go in reverse in the case of apache having - //two servername->80 entries, the first being the default (unused) - //second being the vhost on 80 which is actually handling the requests - //XXX we could/should? enforce uniqueness here. - for (int i=data.size()-1; i>=0; i--) { - StringBuffer buf = (StringBuffer)data.get(i); - String cur = buf.toString(); + // We go in reverse in the case of apache having + // two servername->80 entries, the first being the default (unused) + // second being the vhost on 80 which is actually handling the requests + // -- We could/should? enforce uniqueness here... + for ( int i = data.size ( ) - 1; i >= 0; i-- ) + { + StringBuffer buf = (StringBuffer)data.get ( i ); + String cur = buf.toString ( ); - //since we fetched all the data might as well build - //up the index cache for future reference. 
- Integer index = new Integer(i); - ixCache.put(cur, index); - //only seen w/ microsoft snmp server - //where interface name has a trailing null byte - ixCache.put(cur.trim(), index); - } + // Since we fetched all the data might as well build + // up the index cache for future reference... + Integer index = new Integer ( i ); - if (this.log.isDebugEnabled()) { - if (expired) { - this.log.debug("built index cache:"); - for (Iterator it = ixCache.entrySet().iterator(); - it.hasNext();) { - Map.Entry ent = (Map.Entry)it.next(); - this.log.debug(" " + ent.getKey() + - "=>" + ent.getValue()); - } + ixCache.put ( cur, index ); + + // Only seen w/ microsoft snmp server + // where interface name has a trailing null byte... + ixCache.put ( cur.trim ( ), index ); + } + + if ( this.log.isDebugEnabled ( ) ) + { + if ( expired ) + { + this.log.debug ( "built index cache:" ); + + for ( Iterator it = ixCache.entrySet().iterator ( ); it.hasNext ( ); ) + { + Map.Entry ent = (Map.Entry)it.next ( ); + + this.log.debug ( " " + ent.getKey ( ) + "=>" + ent.getValue ( ) ); } - else { - this.log.debug("forced to rebuild index cache " + - " looking for: " + indexValue); - } - } + } + else + { + this.log.debug ( "forced to rebuild index cache looking for: " + indexValue ); + } + } - if ((ix = (Integer)ixCache.get(indexValue)) != null) { - return ix.intValue(); - } + if ( ( ix = (Integer)ixCache.get ( indexValue ) ) != null ) + { + return ix.intValue ( ); + } - String possibleValues = ", possible values="; + String possibleValues = ", possible values="; - if (listIsEmpty(data)) { - possibleValues += "[NONE FOUND]"; - } - else { - possibleValues += data.toString(); - } + if ( listIsEmpty ( data ) ) + { + possibleValues += "[NONE FOUND]"; + } + else + { + possibleValues += data.toString ( ); + } - throw new MetricNotFoundException("could not find value '" + - indexValue + "' in column '" + - indexName + "'" + possibleValues); - } + throw new MetricNotFoundException ( "could not find value '" + 
+ indexValue + + "' in column '" + + indexName + "'" + + possibleValues ); + } - private SNMPSession getSession(Metric metric) - throws PluginException { + private SNMPSession getSession ( Metric metric ) throws PluginException + { + Properties props = metric.getObjectProperties ( ); - Properties props = metric.getObjectProperties(); - if (props.get(SNMPClient.PROP_IP) == null) { - //backcompat: shitty decision to make ip address the domain name - props.put(SNMPClient.PROP_IP, metric.getDomainName()); - } + if ( props.get ( SNMPClient.PROP_IP ) == null ) + { + // Backcompat: poor decision to make ip address the domain name + props.put ( SNMPClient.PROP_IP, metric.getDomainName ( ) ); + } - try { - return this.client.getSession(props); - } catch (SNMPException e) { - throw new PluginException(e.getMessage(), e); - } - } + try + { + return this.client.getSession ( props ); + } + catch ( SNMPException e ) + { + throw new PluginException ( e.getMessage ( ), e ); + } + } } |
From: <bo...@hy...> - 2009-12-13 08:54:46
|
Author: bob Date: 2009-12-13 00:54:34 -0800 (Sun, 13 Dec 2009) New Revision: 14083 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14083 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1287 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-12 08:52:13 UTC (rev 14082) +++ trunk/etc/version.properties 2009-12-13 08:54:34 UTC (rev 14083) @@ -1,3 +1,3 @@ -#Sat Dec 12 00:15:14 PST 2009 +#Sun Dec 13 00:15:53 PST 2009 version=4.3.0 -build=1286 +build=1287 |
From: <jko...@hy...> - 2009-12-12 19:13:44
|
Author: jkonicki Date: 2009-12-11 07:55:01 -0800 (Fri, 11 Dec 2009) New Revision: 14073 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14073 Modified: branches/HQ_4_2_0_PATCH/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java Log: HHQ-3569 No longer require the password to be specified. If the password is null, setting it to empty string. Modified: branches/HQ_4_2_0_PATCH/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java =================================================================== --- branches/HQ_4_2_0_PATCH/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java 2009-12-11 08:52:12 UTC (rev 14072) +++ branches/HQ_4_2_0_PATCH/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java 2009-12-11 15:55:01 UTC (rev 14073) @@ -36,6 +36,8 @@ import javax.net.ssl.SSLSocket; import javax.net.ssl.X509TrustManager; +import org.apache.commons.logging.Log; +import org.apache.commons.logging.LogFactory; import org.hyperic.hq.product.Collector; import org.hyperic.hq.product.PluginException; import org.hyperic.sigar.NetConnection; @@ -46,6 +48,8 @@ public abstract class NetServicesCollector extends Collector { + private static Log log = LogFactory.getLog(NetServicesCollector.class); + private int port = -1; private int defaultPort, defaultSSLPort; private boolean isSSL, enableNetstat; @@ -124,7 +128,10 @@ throw new PluginException("Missing " + PROP_USERNAME); } if (this.pass == null) { - throw new PluginException("Missing " + PROP_PASSWORD); + if (log.isDebugEnabled()){ + log.debug("Password was null, setting to empty string."); + } + this.pass = ""; } } @@ -189,6 +196,10 @@ return this.user; } + /** + * @return The password specified. If no password was specified in the + * properties, an empty string is returned. + */ public String getPassword() { return this.pass; } |
From: <jko...@hy...> - 2009-12-12 19:13:44
|
Author: jkonicki Date: 2009-12-10 11:01:22 -0800 (Thu, 10 Dec 2009) New Revision: 14064 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14064 Modified: trunk/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java Log: HHQ-3569 No longer require the password to be specified. If the password is null, setting it to empty string. Modified: trunk/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java =================================================================== --- trunk/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java 2009-12-10 18:59:26 UTC (rev 14063) +++ trunk/plugins/netservices/src/org/hyperic/hq/plugin/netservices/NetServicesCollector.java 2009-12-10 19:01:22 UTC (rev 14064) @@ -36,6 +36,8 @@ import javax.net.ssl.SSLSocket; import javax.net.ssl.X509TrustManager; +import org.apache.commons.logging.Log; +import org.apache.commons.logging.LogFactory; import org.hyperic.hq.product.Collector; import org.hyperic.hq.product.PluginException; import org.hyperic.sigar.NetConnection; @@ -46,6 +48,8 @@ public abstract class NetServicesCollector extends Collector { + private static Log log = LogFactory.getLog(NetServicesCollector.class); + private int port = -1; private int defaultPort, defaultSSLPort; private boolean isSSL, enableNetstat; @@ -124,7 +128,10 @@ throw new PluginException("Missing " + PROP_USERNAME); } if (this.pass == null) { - throw new PluginException("Missing " + PROP_PASSWORD); + if (log.isDebugEnabled()){ + log.debug("Password was null, setting to empty string."); + } + this.pass = ""; } } @@ -189,6 +196,10 @@ return this.user; } + /** + * @return The password specified. If no password was specified in the + * properties, an empty string is returned. + */ public String getPassword() { return this.pass; } |
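The HHQ-3569 change in both commits above reduces to a null-to-empty-string default for the password property. A minimal sketch of that pattern in isolation (the class and method names here are illustrative, not part of the plugin):

```java
public class PasswordDefault {

    // Mirrors the HHQ-3569 behavior: a missing password becomes ""
    // instead of triggering a "Missing <password property>" PluginException.
    static String defaultPassword(String pass) {
        return (pass == null) ? "" : pass;
    }

    public static void main(String[] args) {
        System.out.println("[" + defaultPassword(null) + "]"); // prints "[]"
    }
}
```

Callers such as getPassword() can then rely on a non-null return value, which is what the added Javadoc in the diff documents.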
From: <bo...@hy...> - 2009-12-12 09:16:13
|
Author: bob Date: 2009-12-12 00:52:13 -0800 (Sat, 12 Dec 2009) New Revision: 14082 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14082 Modified: trunk/etc/version.properties Log: Release 4.3.0 build #1286 Modified: trunk/etc/version.properties =================================================================== --- trunk/etc/version.properties 2009-12-12 01:09:28 UTC (rev 14081) +++ trunk/etc/version.properties 2009-12-12 08:52:13 UTC (rev 14082) @@ -1,3 +1,3 @@ -#Fri Dec 11 00:14:56 PST 2009 +#Sat Dec 12 00:15:14 PST 2009 version=4.3.0 -build=1285 +build=1286 |
From: Mirko P. <m.p...@gm...> - 2009-12-12 03:43:01
|
Hi, there are different bindings available. Check out: http://support.hyperic.com/display/SIGAR/Home If you want to use Java, have a look at the SIGAR sourcecode for the SIGAR shell, especially Ifconfig.java http://github.com/hyperic/sigar/blob/master/bindings/java/src/org/hyperic/sigar/cmd/Ifconfig.java Cheers, Mirko |
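Building on Mirko's pointer, a minimal sketch of what Ifconfig.java does with the SIGAR Java binding: enumerate the interfaces and print their address and RX/TX byte counters. This assumes sigar.jar and the matching native library are available at runtime; the class name is illustrative.

```java
import org.hyperic.sigar.NetInterfaceConfig;
import org.hyperic.sigar.NetInterfaceStat;
import org.hyperic.sigar.Sigar;
import org.hyperic.sigar.SigarException;

public class IfconfigSketch {
    public static void main(String[] args) throws SigarException {
        Sigar sigar = new Sigar();
        try {
            // One line per interface, roughly what ifconfig reports
            for (String name : sigar.getNetInterfaceList()) {
                NetInterfaceConfig cfg = sigar.getNetInterfaceConfig(name);
                NetInterfaceStat stat = sigar.getNetInterfaceStat(name);
                System.out.println(name + " " + cfg.getAddress()
                    + " RX bytes:" + stat.getRxBytes()
                    + " TX bytes:" + stat.getTxBytes());
            }
        } finally {
            sigar.close(); // release the native handle
        }
    }
}
```

The output values depend entirely on the host, so there is nothing portable to assert beyond the calls compiling and running.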
From: <bo...@hy...> - 2009-12-12 01:09:42
|
Author: bob Date: 2009-12-11 17:09:28 -0800 (Fri, 11 Dec 2009) New Revision: 14081 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14081 Modified: branches/HQ_4_1_4_1/etc/version.properties Log: Release 4.1.4.2 build #1078 Modified: branches/HQ_4_1_4_1/etc/version.properties =================================================================== --- branches/HQ_4_1_4_1/etc/version.properties 2009-12-12 00:17:05 UTC (rev 14080) +++ branches/HQ_4_1_4_1/etc/version.properties 2009-12-12 01:09:28 UTC (rev 14081) @@ -1,3 +1,3 @@ -#Mon Dec 07 13:52:22 PST 2009 +#Fri Dec 11 16:17:46 PST 2009 version=4.1.4.2 -build=1077 +build=1078 |
Author: trader Date: 2009-12-11 16:16:21 -0800 (Fri, 11 Dec 2009) New Revision: 14079 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14079 Added: trunk/unittest/src/org/hyperic/util/file/ Modified: trunk/src/org/hyperic/hq/agent/server/AgentDListProvider.java trunk/src/org/hyperic/hq/measurement/agent/server/SenderThread.java trunk/src/org/hyperic/util/file/DiskList.java Log: HHQ-3541, HHQ-3114: Integer overflow in DiskList, backlogged agent may throw OOM Clean up DiskList: better error logging, fix int overflow, add unittests. Modified: trunk/src/org/hyperic/hq/agent/server/AgentDListProvider.java =================================================================== --- trunk/src/org/hyperic/hq/agent/server/AgentDListProvider.java 2009-12-11 21:47:16 UTC (rev 14078) +++ trunk/src/org/hyperic/hq/agent/server/AgentDListProvider.java 2009-12-12 00:16:21 UTC (rev 14079) @@ -51,8 +51,8 @@ implements AgentStorageProvider { private static final int RECSIZE = 1024; - private static final long MAXSIZE = 100 * 1024 * 1024; // 100MB - private static final long CHKSIZE = 20 * 1024 * 1024; // 20MB + private static final long MAXSIZE = 50 * 1024 * 1024; // 50MB + private static final long CHKSIZE = 10 * 1024 * 1024; // 10MB private static final int CHKPERC = 50; // Only allow < 50% free private Log log; // da logger Modified: trunk/src/org/hyperic/hq/measurement/agent/server/SenderThread.java =================================================================== --- trunk/src/org/hyperic/hq/measurement/agent/server/SenderThread.java 2009-12-11 21:47:16 UTC (rev 14078) +++ trunk/src/org/hyperic/hq/measurement/agent/server/SenderThread.java 2009-12-12 00:16:21 UTC (rev 14079) @@ -607,6 +607,10 @@ return; } + if (log.isDebugEnabled()) { + log.debug("Woke up, sending batch of metrics."); + } + lastMetricTime = this.sendBatch(); if(lastMetricTime != null){ String backlogNum = ""; Modified: trunk/src/org/hyperic/util/file/DiskList.java 
=================================================================== --- trunk/src/org/hyperic/util/file/DiskList.java 2009-12-11 21:47:16 UTC (rev 14078) +++ trunk/src/org/hyperic/util/file/DiskList.java 2009-12-12 00:16:21 UTC (rev 14079) @@ -74,13 +74,14 @@ private static final Log log = LogFactory.getLog(DiskList.class.getName()); private String fileName; + private String idxFileName; private RandomAccessFile indexFile; - private RandomAccessFile dataFile; + protected RandomAccessFile dataFile; private int recordSize; // Size of each record private long firstRec; // IDX of first record private long lastRec; // IDX of last record private byte[] padBytes; // Utility array for padding - private SortedSet freeList; // Set(Long) of free rec idxs + protected SortedSet freeList; // Set(Long) of free rec idxs private int modNum; // Modification random number private long checkSize; // Start to check for unused blocks // when the datafile reaches this @@ -119,10 +120,11 @@ int checkPerc, long maxLength) throws IOException { - File idxFileName; + File idxFile; - idxFileName = new File(dataFile + ".idx"); + idxFile = new File(dataFile + ".idx"); this.fileName = dataFile.getName(); + this.idxFileName = idxFile.getName(); this.rand = new Random(); this.dataFile = new RandomAccessFile(dataFile, "rw"); this.recordSize = recordSize; @@ -135,9 +137,9 @@ " to " + maxLength + " bytes"); this.maxLength = maxLength; - this.genFreeList(idxFileName); + this.indexFile = new RandomAccessFile(idxFile, "rw"); + this.genFreeList(idxFile); - this.indexFile = new RandomAccessFile(idxFileName, "rw"); this.closed = false; } @@ -155,16 +157,16 @@ } /** - * Do maintinece on the data and index files. If the datafile size and - * the free block percentange exceed the defined thresholds, the extra + * Do maintenance on the data and index files. 
If the datafile size and + * the free block percentage exceed the defined thresholds, the extra * free blocks will be removed by truncating the data and index files. * * Since truncation is used, some times it will be possible that even - * though the criteria are met, we won't be albe to delete the free space. + * though the criteria are met, we won't be able to delete the free space. * This is a recoverable situation though, since new blocks will be * inserted at the beginning of the data file. */ - private void doMaintainence() + private void doMaintenence() throws IOException { long lastData = this.dataFile.length()/this.recordSize; @@ -211,7 +213,7 @@ * buffered input stream, which makes our initial startup much * faster, if there is a lot of data sitting in the list. */ - private void genFreeList(File idxFileName) + private void genFreeList(File idxFile) throws IOException { BufferedInputStream bIs; @@ -226,12 +228,12 @@ this.freeList = new TreeSet(); try { - fIs = new FileInputStream(idxFileName); + fIs = new FileInputStream(idxFile); bIs = new BufferedInputStream(fIs); dIs = new DataInputStream(bIs); - for(int idx=0; ; idx++){ + for(long idx=0; ; idx++){ boolean used; long prev, next; @@ -388,12 +390,20 @@ try { this.indexFile.setLength(0); } catch(IOException exc){ + this.log.error("IOException while truncating file " + idxFileName); + if (this.log.isDebugEnabled()) { + this.log.debug(exc); + } sExc = exc; } try { this.dataFile.setLength(0); } catch(IOException exc){ + this.log.error("IOException while truncating file " + fileName); + if (this.log.isDebugEnabled()) { + this.log.debug(exc); + } if(sExc != null){ sExc = exc; } @@ -469,9 +479,11 @@ this.freeList.add(new Long(recNo)); } - if ((this.dataFile.length() > this.checkSize) && - (this.getDataFileFreePercentage() > this.checkPerc)) { - this.doMaintainence(); + long length = this.dataFile.length(); + long percFree = this.getDataFileFreePercentage(); + if ((length > this.checkSize) && + (percFree > 
this.checkPerc)) { + this.doMaintenence(); } } @@ -492,13 +504,21 @@ try { this.dataFile.close(); - } catch(IOException exc){ + } catch(IOException exc){ + this.log.error("IOException while closing file " + fileName); + if (this.log.isDebugEnabled()) { + this.log.debug(exc); + } sExc = exc; } try { this.indexFile.close(); } catch(IOException exc){ + this.log.error("IOException while closing file " + idxFileName); + if (this.log.isDebugEnabled()) { + this.log.debug(exc); + } if(sExc == null){ sExc = exc; } @@ -550,6 +570,10 @@ try { rec = this.diskList.readRecord(this.curIdx); } catch(IOException exc){ + log.error("IOException while reading record"); + if (log.isDebugEnabled()) { + log.debug(exc); + } throw new NoSuchElementException("Error getting next " + "element: " + exc.getMessage()); @@ -577,6 +601,10 @@ try { this.diskList.removeRecord(this.curIdx); } catch(IOException exc){ + log.error("IOException while removing record"); + if (log.isDebugEnabled()) { + log.debug(exc); + } throw new IllegalStateException("Error removing record: " + exc.getMessage()); } @@ -597,7 +625,7 @@ return new DiskListIterator(this, this.firstRec, this.modNum); } } - + public static void main(String[] args) throws Exception { |
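The integer overflow fixed above (HHQ-3541) is the standard Java pitfall of multiplying two ints before widening to long: the product wraps in 32-bit arithmetic first and only then is widened. A self-contained illustration with hypothetical numbers (any product above Integer.MAX_VALUE shows the effect):

```java
public class OverflowDemo {
    public static void main(String[] args) {
        int recordSize = 1024;  // matches DiskList's typical RECSIZE
        int idx = 3000000;      // hypothetical record index in a backlogged list

        long wrong = idx * recordSize;        // multiplies in int, wraps, THEN widens
        long right = (long) idx * recordSize; // widens first, multiplies in 64 bits

        System.out.println(wrong); // -1222967296 (wrapped past Integer.MAX_VALUE)
        System.out.println(right); // 3072000000
    }
}
```

Declaring the loop counter as long in genFreeList, as the diff does, keeps every derived offset in 64-bit arithmetic from the start.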
From: <tr...@hy...> - 2009-12-12 00:39:34
|
Author: trader Date: 2009-12-11 16:17:05 -0800 (Fri, 11 Dec 2009) New Revision: 14080 URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14080 Added: trunk/unittest/src/org/hyperic/util/file/DiskListTest.java Log: HHQ-3541, HHQ-3114: Integer overflow in DiskList, backlogged agent may throw OOM Clean up DiskList: better error logging, fix int overflow, add unittests. Added: trunk/unittest/src/org/hyperic/util/file/DiskListTest.java =================================================================== --- trunk/unittest/src/org/hyperic/util/file/DiskListTest.java (rev 0) +++ trunk/unittest/src/org/hyperic/util/file/DiskListTest.java 2009-12-12 00:17:05 UTC (rev 14080) @@ -0,0 +1,388 @@ +package org.hyperic.util.file; + +import java.io.File; +import java.io.IOException; +import java.io.RandomAccessFile; +import java.util.Iterator; + +import junit.framework.TestCase; + +public class DiskListTest extends TestCase +{ + private static final int RECSIZE = 1024; + private static final long MAXSIZE = 50 * 1024 * 1024; // 50MB + private static final long CHKSIZE = 10 * 1024 * 1024; // 10MB + private static final int CHKPERC = 50; // Only allow < 50% free + private static final int MAXRECS = 2000; + + public DiskListTest() {} + + public void setUp() { + } + + public void testReadWriteFile() throws Exception { + + DiskListDataHolder holder = null; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + for (long i = 0; i < MAXRECS; ++i) { + String toPut = String.valueOf(i); + holder.list.addToList(toPut); + } + + Iterator it = holder.list.getListIterator(); + // Check that we can read the proper number of records back + long i = 0; + while (it.hasNext()) { + it.next(); + i++; + } + assertTrue(i == MAXRECS); + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new DiskList(holder.dataFile, +
RECSIZE, + CHKSIZE, + CHKPERC, + MAXSIZE); + it = holder.list.getListIterator(); + i = 0; + while (it.hasNext()) { + it.next(); + it.remove(); + + i++; + } + + assertTrue(i == MAXRECS); + + } finally { + + holder.dispose(); + + } + } + + public void testFreeListWithNoInserts() throws Exception { + + DiskListDataHolder holder = null; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new DiskList(holder.dataFile, + RECSIZE, + CHKSIZE, + CHKPERC, + MAXSIZE); + + assertTrue(holder.list.freeList.size() == 0); + + } finally { + + holder.dispose(); + + } + } + + public void testFreeListWithInsertsAndNoDeletes() throws Exception { + + DiskListDataHolder holder = null; + if (holder == null) return; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + for (long i = 0; i < MAXRECS; ++i) { + String toPut = String.valueOf(i); + holder.list.addToList(toPut); + } + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new DiskList(holder.dataFile, + RECSIZE, + CHKSIZE, + CHKPERC, + MAXSIZE); + + assertTrue(holder.list.freeList.size() == 0); + + } finally { + + holder.dispose(); + + } + } + + public void testFreeListWithInsertsAndDeletes() throws Exception { + + DiskListDataHolder holder = null; + + try { + + try { + holder = new DiskListDataHolder(); + } catch (Exception e) { + e.printStackTrace(); + fail(e.toString()); + } + + for (long i = 0; i < MAXRECS; ++i) { + String toPut = String.valueOf(i); + holder.list.addToList(toPut); + } + + holder.list.close(); + + // Check that we can read the proper number after close/reopen, and that they can be cleanly deleted + holder.list = new 
DiskList(holder.dataFile,
+                                        RECSIZE,
+                                        CHKSIZE,
+                                        CHKPERC,
+                                        MAXSIZE);
+            Iterator it = holder.list.getListIterator();
+            int nDeleted = 0;
+            while (it.hasNext()) {
+                for (int i = 0; i < 5; ++i) {
+                    it.next();
+                }
+                it.remove();
+                nDeleted++;
+            }
+
+            assertTrue(holder.list.freeList.size() == nDeleted);
+
+        } finally {
+
+            holder.dispose();
+
+        }
+    }
+
+    public void testFreeListAfterTruncation() throws Exception {
+
+        DiskListDataHolder holder = null;
+        long epsilon = 50;
+
+        try {
+
+            try {
+                holder = new DiskListDataHolder();
+            } catch (Exception e) {
+                e.printStackTrace();
+                fail(e.toString());
+            }
+
+            String toPut = String.valueOf("dummystring");
+
+            // Insert until we *almost* spill over
+            while (holder.list.dataFile.length() + epsilon < MAXSIZE) {
+                holder.list.addToList(toPut);
+            }
+
+            assertTrue(holder.list.freeList.size() == 0);
+
+            // Now insert until we spill over, expect a log message from this line
+            while (holder.list.dataFile.length() > MAXSIZE - epsilon) {
+                holder.list.addToList(toPut);
+            }
+
+            assertTrue(holder.list.freeList.size() == 0);
+
+            // After truncation, there should be no records, add one back and then delete it
+            holder.list.addToList(toPut);
+            Iterator it = holder.list.getListIterator();
+            int nIterated = 0;
+            while (it.hasNext()) {
+                it.next();
+
+                // Each remove should create one spot in the free list
+                it.remove();
+                nIterated++;
+            }
+
+            assertTrue(nIterated == 1);
+            assertTrue(holder.list.freeList.size() == 1);
+
+        } finally {
+
+            holder.dispose();
+
+        }
+    }
+
+    public void testFreeListAfterMaintenance() throws Exception {
+
+        DiskListDataHolder holder = null;
+        long epsilon = 50;
+
+        try {
+
+            try {
+                holder = new DiskListDataHolder();
+            } catch (Exception e) {
+                e.printStackTrace();
+                fail(e.toString());
+            }
+
+            String toPut = String.valueOf("dummystring");
+
+            // Insert until we *almost* spill over
+            long nInserted = 0;
+            while (holder.list.dataFile.length() + epsilon < MAXSIZE) {
+                holder.list.addToList(toPut);
+                nInserted++;
+            }
+
+            int freeListSize = holder.list.freeList.size();
+            assertTrue(freeListSize == 0);
+
+            // Delete every fourth record as we buzz through the list. Because CHKPERC is
+            // 50%, deleting every fourth should NOT trigger maintenance
+            int nIterated = 0;
+            int nDeleted = 0;
+            int counter = 0;
+            Iterator it = holder.list.getListIterator();
+            while (it.hasNext()) {
+                it.next();
+
+                if (++counter == 4) {
+                    // Each remove should create one spot in the free list
+                    it.remove();
+                    nDeleted++;
+                    counter = 0;
+                }
+
+                nIterated++;
+            }
+
+            assertTrue(nIterated == nInserted);
+            freeListSize = holder.list.freeList.size();
+            assertTrue(freeListSize == nDeleted);
+
+            // Now delete the same amount - 1: still should NOT trigger maintenance
+            int nToDelete = nDeleted - 20;
+            int nPreviouslyDeleted = nDeleted;
+            nDeleted = 0;
+            it = holder.list.getListIterator();
+            while (it.hasNext() && nDeleted < nToDelete) {
+                it.next();
+                it.remove();
+                nDeleted++;
+            }
+
+            assertTrue(nDeleted == nToDelete);
+            freeListSize = holder.list.freeList.size();
+            assertTrue(nDeleted + nPreviouslyDeleted == freeListSize);
+
+            // Now try to trigger maintenance. First: maintenance is only done if the last block is
+            // free (maintenance truncates off the end only, no internal compacting), so make sure that
+            // there is stuff to truncate
+            int offTheEnd = 20;
+            it = holder.list.getListIterator();
+            for (int i = nDeleted + nPreviouslyDeleted; i < nInserted - offTheEnd; ++i) {
+                it.next();
+            }
+            while (it.hasNext()) {
+                it.next();
+                it.remove();
+            }
+
+            // A few more deletes should trigger maintenance. The calculation uses an integer
+            // percentage rounded from doubles, so exactly (half - 1) deletes may not be enough.
+            freeListSize = holder.list.freeList.size();
+            int freeListPeakSize = freeListSize;
+            long oldLength = holder.list.dataFile.length();
+            int toTriggerMaintenance = ((int) (nInserted / 100)) / 2 + 1;
+            it = holder.list.getListIterator();
+            for (int i = 0; i < toTriggerMaintenance; ++i) {
+                it.next();
+                it.remove();
+                freeListPeakSize = Math.max(freeListPeakSize, holder.list.freeList.size());
+            }
+
+            freeListSize = holder.list.freeList.size();
+            long length = holder.list.dataFile.length();
+            assertTrue("Expected free list size < " + freeListPeakSize + ", actual value was " + freeListSize,
+                       freeListSize < freeListPeakSize);
+            assertTrue("Expected file length < " + oldLength + ", actual value was " + length,
+                       length < oldLength);
+
+        } finally {
+
+            holder.dispose();
+
+        }
+    }
+
+    private static class DiskListDataHolder {
+        DiskList list;
+        File dataFile;
+        File indexFile;
+
+        DiskListDataHolder() throws Exception {
+            File tmpDirFile;
+
+            String tmpDir = System.getProperty("java.io.tmpdir");
+            if (tmpDir == null) {
+                tmpDir = "/tmp";
+            }
+
+            tmpDirFile = new File(tmpDir);
+
+            File dataFile = null;
+            File indexFile = null;
+            if (tmpDirFile.isDirectory() && tmpDirFile.canWrite()) {
+                dataFile = new File(tmpDirFile, "datafile");
+                indexFile = new File(tmpDirFile, "datafile.idx");
+                dataFile.delete();
+                indexFile.delete();
+            } else {
+                throw new IllegalStateException("Non-writeable directory!");
+            }
+
+            DiskList list = new DiskList(dataFile,
+                                         RECSIZE,
+                                         CHKSIZE,
+                                         CHKPERC,
+                                         MAXSIZE);
+
+            this.list = list;
+            this.dataFile = dataFile;
+            this.indexFile = indexFile;
+        }
+
+        void dispose() throws Exception {
+            list.close();
+            dataFile.delete();
+            indexFile.delete();
+        }
+    }
+}
|
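The tests in the diff above all hinge on one policy: deleted records go onto a free list, and the data file is only truncated when the free fraction crosses the check percentage (CHKPERC) and the trailing records are free, because maintenance can truncate off the end but never compact interior holes. As a minimal sketch of that policy (the `FreeListSketch` class below is illustrative, not Hyperic's `DiskList`):

```java
import java.util.BitSet;

// Minimal sketch of a free-listed record file with end-only truncation.
public class FreeListSketch {
    private final double chkPerc;       // e.g. 0.5 for CHKPERC = 50%
    private final BitSet used = new BitSet();
    private int nRecords = 0;           // records currently occupying file space

    public FreeListSketch(double chkPerc) { this.chkPerc = chkPerc; }

    // Reuse the lowest free slot, growing the "file" only when none is free.
    public int add() {
        int slot = used.nextClearBit(0);
        used.set(slot);
        if (slot >= nRecords) nRecords = slot + 1;
        return slot;
    }

    public void remove(int slot) {
        used.clear(slot);
        maybeTruncate();
    }

    public int freeCount()   { return nRecords - used.cardinality(); }
    public int fileRecords() { return nRecords; }

    // Maintenance: only when free records reach chkPerc of the file, and only
    // by dropping trailing free slots; interior free slots stay on the list.
    private void maybeTruncate() {
        if (nRecords == 0 || (double) freeCount() / nRecords < chkPerc) {
            return; // not enough garbage yet
        }
        nRecords = used.length(); // index just past the highest used slot
    }

    public static void main(String[] args) {
        FreeListSketch f = new FreeListSketch(0.5);
        for (int i = 0; i < 10; i++) f.add();       // slots 0..9
        for (int i = 5; i < 10; i++) f.remove(i);   // free the tail
        // 5 of 10 free >= 50% and the tail is free, so the file shrinks to 5
        System.out.println(f.fileRecords());        // 5
        System.out.println(f.freeCount());          // 0
    }
}
```

Note that if the *interior* slots had been freed instead, the ratio check could pass but truncation would reclaim nothing past the last used slot, which is exactly why `testFreeListAfterMaintenance` deletes a run of records off the end before expecting the file length to drop.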
From: <bo...@hy...> - 2009-12-11 21:47:29
|
Author: bob
Date: 2009-12-11 13:47:16 -0800 (Fri, 11 Dec 2009)
New Revision: 14078
URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14078

Modified:
   branches/HQ_4_0_3_1/etc/version.properties
Log:
Release 4.0.3.2 build #956

Modified: branches/HQ_4_0_3_1/etc/version.properties
===================================================================
--- branches/HQ_4_0_3_1/etc/version.properties	2009-12-11 19:04:55 UTC (rev 14077)
+++ branches/HQ_4_0_3_1/etc/version.properties	2009-12-11 21:47:16 UTC (rev 14078)
@@ -1,3 +1,3 @@
-#Tue Sep 29 23:15:03 PDT 2009
+#Fri Dec 11 13:18:17 PST 2009
 version=4.0.3.2
-build=955
+build=956
|
From: <no...@gi...> - 2009-12-11 19:56:47
|
Branch: refs/heads/master
Home:   http://github.com/hyperic/hqapi

Commit: 202a70437b57e068c77e95522b75eaf2e77180c5
    http://github.com/hyperic/hqapi/commit/202a70437b57e068c77e95522b75eaf2e77180c5
Author: pnguyen <pnguyen@10.2.0.125>
Date:   2009-12-11 (Fri, 11 Dec 2009)

Changed paths:
  M src/org/hyperic/hq/hqapi1/test/HQApiTestBase.java
  M src/org/hyperic/hq/hqapi1/test/MaintenanceSchedule_test.java

Log Message:
-----------
Validate the enabled status of the resource's availability measurement before, during, and after the maintenance window.
|
From: <no...@gi...> - 2009-12-11 19:43:25
|
Branch: refs/heads/hqapi-2.x
Home:   http://github.com/hyperic/hqapi

Commit: b99b10d8774e574af2b716f55306d3da5903a91e
    http://github.com/hyperic/hqapi/commit/b99b10d8774e574af2b716f55306d3da5903a91e
Author: Ryan Morgan <rm...@hy...>
Date:   2009-12-11 (Fri, 11 Dec 2009)

Changed paths:
  M ChangeLog
  M hqu/hqapi1/app/ResourceController.groovy
  M xsd/HQApi1.xsd

Log Message:
-----------
[HHQ-3603] Add support for location field for Resources.
|
From: <no...@gi...> - 2009-12-11 19:43:24
|
Branch: refs/heads/master
Home:   http://github.com/hyperic/hqapi

Commit: b99b10d8774e574af2b716f55306d3da5903a91e
    http://github.com/hyperic/hqapi/commit/b99b10d8774e574af2b716f55306d3da5903a91e
Author: Ryan Morgan <rm...@hy...>
Date:   2009-12-11 (Fri, 11 Dec 2009)

Changed paths:
  M ChangeLog
  M hqu/hqapi1/app/ResourceController.groovy
  M xsd/HQApi1.xsd

Log Message:
-----------
[HHQ-3603] Add support for location field for Resources.

Commit: 9034d518653c344567ef2e8770ca34d090c4d062
    http://github.com/hyperic/hqapi/commit/9034d518653c344567ef2e8770ca34d090c4d062
Author: Ryan Morgan <rm...@hy...>
Date:   2009-12-11 (Fri, 11 Dec 2009)

Changed paths:
  M ChangeLog
  M hqu/hqapi1/app/ResourceController.groovy
  M xsd/HQApi1.xsd

Log Message:
-----------
Merge branch 'hqapi-2.x'
|
From: <tr...@hy...> - 2009-12-11 19:05:06
|
Author: trader
Date: 2009-12-11 11:04:55 -0800 (Fri, 11 Dec 2009)
New Revision: 14077
URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14077

Modified:
   trunk/src/org/hyperic/hq/product/jmx/MxUtil.java
Log:
Better diagnostics, reviewed during 4.2 convergence, decision was to hold off for that release and commit to trunk after 4.2 shipped.

Modified: trunk/src/org/hyperic/hq/product/jmx/MxUtil.java
===================================================================
--- trunk/src/org/hyperic/hq/product/jmx/MxUtil.java	2009-12-11 18:57:49 UTC (rev 14076)
+++ trunk/src/org/hyperic/hq/product/jmx/MxUtil.java	2009-12-11 19:04:55 UTC (rev 14077)
@@ -26,7 +26,9 @@
 package org.hyperic.hq.product.jmx;
 
 import java.io.IOException;
+import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
+import java.lang.reflect.UndeclaredThrowableException;
 import java.net.MalformedURLException;
 import java.rmi.RemoteException;
 import java.util.HashMap;
@@ -47,10 +49,9 @@
 import javax.management.ReflectionException;
 import javax.management.j2ee.statistics.CountStatistic;
 import javax.management.j2ee.statistics.RangeStatistic;
-import javax.management.j2ee.statistics.TimeStatistic;
-
 import javax.management.j2ee.statistics.Statistic;
 import javax.management.j2ee.statistics.Stats;
+import javax.management.j2ee.statistics.TimeStatistic;
 import javax.management.openmbean.CompositeData;
 import javax.management.remote.JMXConnector;
 import javax.management.remote.JMXConnectorFactory;
@@ -59,7 +60,6 @@
 
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
-
 import org.hyperic.hq.product.Metric;
 import org.hyperic.hq.product.MetricInvalidException;
 import org.hyperic.hq.product.MetricNotFoundException;
@@ -329,6 +329,16 @@
                 metric.getAttributeName(), e);
         } catch (InstanceNotFoundException e) {
             throw objectNotFound(objectName, e);
+        } catch (UndeclaredThrowableException e) {
+            Throwable cause1 = e.getCause();
+            if (cause1 instanceof InvocationTargetException) {
+                Throwable cause2 = cause1.getCause();
+                if (cause2 instanceof InstanceNotFoundException) {
+                    throw objectNotFound(objectName, (InstanceNotFoundException) cause2);
+                }
+            }
+
+            throw e;
         } catch (ReflectionException e) {
             throw error(metric.toString(), e);
|
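The patch above exists because a checked exception escaping a dynamic proxy whose interface does not declare it arrives wrapped, typically as `UndeclaredThrowableException` around an `InvocationTargetException` around the real cause, so a plain `catch (InstanceNotFoundException e)` never fires. A hedged illustration of that wrapping and of digging out the underlying cause (the `findCause` helper and `Quiet` interface are hypothetical, not part of Hyperic or JMX):

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Proxy;
import java.lang.reflect.UndeclaredThrowableException;

public class CauseUnwrap {
    // Walk the cause chain looking for an instance of the given type;
    // returns null if no cause in the chain matches.
    static <T extends Throwable> T findCause(Throwable t, Class<T> type) {
        for (Throwable c = t; c != null; c = c.getCause()) {
            if (type.isInstance(c)) {
                return type.cast(c);
            }
        }
        return null;
    }

    interface Quiet { void call(); } // declares no checked exceptions

    public static void main(String[] args) {
        // The handler throws a checked exception the interface doesn't declare,
        // so the proxy surfaces it as UndeclaredThrowableException.
        Quiet proxy = (Quiet) Proxy.newProxyInstance(
            Quiet.class.getClassLoader(), new Class<?>[] { Quiet.class },
            (p, m, a) -> { throw new java.io.IOException("boom"); });
        try {
            proxy.call();
        } catch (UndeclaredThrowableException e) {
            // Walk e -> (possibly InvocationTargetException) -> IOException
            System.out.println(findCause(e, java.io.IOException.class).getMessage());
        }
    }
}
```

Walking the whole cause chain is slightly more forgiving than the two-level check in the MxUtil patch, which targets the exact wrapping produced by its remote MBean proxy.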
From: <sc...@hy...> - 2009-12-11 18:58:05
|
Author: scottmf
Date: 2009-12-11 10:57:49 -0800 (Fri, 11 Dec 2009)
New Revision: 14076
URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14076

Modified:
   trunk/src/org/hyperic/hq/events/server/session/AlertManagerEJBImpl.java
Log:
[HHQ-3499] add getAlertById()

Modified: trunk/src/org/hyperic/hq/events/server/session/AlertManagerEJBImpl.java
===================================================================
--- trunk/src/org/hyperic/hq/events/server/session/AlertManagerEJBImpl.java	2009-12-11 18:31:06 UTC (rev 14075)
+++ trunk/src/org/hyperic/hq/events/server/session/AlertManagerEJBImpl.java	2009-12-11 18:57:49 UTC (rev 14076)
@@ -216,6 +216,13 @@
     }
 
     /**
+     * @ejb:interface-method
+     */
+    public Alert getAlertById(Integer id) {
+        return getAlertDAO().getById(id);
+    }
+
+    /**
      * Find an alert pojo by ID
      *
      * @ejb:interface-method
|
Author: scottmf
Date: 2009-12-11 10:31:06 -0800 (Fri, 11 Dec 2009)
New Revision: 14075
URL: http://svn.hyperic.org/?view=rev&root=Hyperic+HQ&revision=14075

Added:
   trunk/unittest/data/escalationManagerTests.README
   trunk/unittest/data/escalationManagerTests.xml.gz
   trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTest.java
   trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTestEJBImpl.java
Modified:
   trunk/build.xml
   trunk/src/org/hyperic/dao/DAOFactory.java
   trunk/src/org/hyperic/hq/dao/HibernateDAOFactory.java
   trunk/src/org/hyperic/hq/escalation/server/session/EscalationManagerEJBImpl.java
Log:
[HHQ-3499] added extra check to make sure the associated alertId is valid. added test to exercise that part of the code

Modified: trunk/build.xml
===================================================================
--- trunk/build.xml	2009-12-11 18:25:38 UTC (rev 14074)
+++ trunk/build.xml	2009-12-11 18:31:06 UTC (rev 14075)
@@ -1530,6 +1530,7 @@
     <!-- Package the test jars and deploy into the EAR -->
     <jar file="${ear.dir}/hq-test.jar" basedir="${build.dir}/classes" >
         <include name="org/hyperic/**/*_test*"/>
+        <include name="org/hyperic/**/*Test*"/>
     </jar>
 
     <mkdir dir="${ear.dir}/hq-session-test.jar/META-INF" />

Modified: trunk/src/org/hyperic/dao/DAOFactory.java
===================================================================
--- trunk/src/org/hyperic/dao/DAOFactory.java	2009-12-11 18:25:38 UTC (rev 14074)
+++ trunk/src/org/hyperic/dao/DAOFactory.java	2009-12-11 18:31:06 UTC (rev 14075)
@@ -53,6 +53,7 @@
 import org.hyperic.hq.events.server.session.ActionDAO;
 import org.hyperic.hq.events.server.session.AlertActionLogDAO;
 import org.hyperic.hq.events.server.session.AlertConditionLogDAO;
+import org.hyperic.hq.events.server.session.AlertDAO;
 import org.hyperic.hq.events.server.session.AlertDefinitionDAO;
 import org.hyperic.hq.events.server.session.TriggerDAO;
@@ -81,6 +82,7 @@
     // Event DAOs
     public abstract ActionDAO getActionDAO();
     public abstract AlertDefinitionDAO getAlertDefDAO();
+    public abstract AlertDAO getAlertDAO();
     public abstract TriggerDAO getTriggerDAO();
     public abstract AlertActionLogDAO getAlertActionLogDAO();
     public abstract AlertConditionLogDAO getAlertConditionLogDAO();

Modified: trunk/src/org/hyperic/hq/dao/HibernateDAOFactory.java
===================================================================
--- trunk/src/org/hyperic/hq/dao/HibernateDAOFactory.java	2009-12-11 18:25:38 UTC (rev 14074)
+++ trunk/src/org/hyperic/hq/dao/HibernateDAOFactory.java	2009-12-11 18:31:06 UTC (rev 14075)
@@ -49,6 +49,7 @@
 import org.hyperic.hq.events.server.session.ActionDAO;
 import org.hyperic.hq.events.server.session.AlertActionLogDAO;
 import org.hyperic.hq.events.server.session.AlertConditionLogDAO;
+import org.hyperic.hq.events.server.session.AlertDAO;
 import org.hyperic.hq.events.server.session.AlertDefinitionDAO;
 import org.hyperic.hq.events.server.session.TriggerDAO;
 import org.hyperic.hq.galerts.server.session.ExecutionStrategyTypeInfoDAO;
@@ -195,4 +196,8 @@
     public ExecutionStrategyTypeInfoDAO getExecutionStrategyTypeInfoDAO() {
         return new ExecutionStrategyTypeInfoDAO(this);
     }
+
+    public AlertDAO getAlertDAO() {
+        return new AlertDAO(this);
+    }
 }

Modified: trunk/src/org/hyperic/hq/escalation/server/session/EscalationManagerEJBImpl.java
===================================================================
--- trunk/src/org/hyperic/hq/escalation/server/session/EscalationManagerEJBImpl.java	2009-12-11 18:25:38 UTC (rev 14074)
+++ trunk/src/org/hyperic/hq/escalation/server/session/EscalationManagerEJBImpl.java	2009-12-11 18:31:06 UTC (rev 14075)
@@ -40,7 +40,6 @@
 import org.hyperic.dao.DAOFactory;
 import org.hyperic.hq.authz.server.session.AuthzSubject;
 import org.hyperic.hq.authz.server.session.AuthzSubjectManagerEJBImpl;
-import org.hyperic.hq.authz.server.session.Resource;
 import org.hyperic.hq.authz.server.shared.ResourceDeletedException;
 import org.hyperic.hq.authz.shared.PermissionException;
 import org.hyperic.hq.common.ApplicationException;
@@ -61,7 +60,8 @@
 import org.hyperic.hq.events.Notify;
 import org.hyperic.hq.events.server.session.Action;
 import org.hyperic.hq.events.server.session.ActionManagerEJBImpl;
-import org.hyperic.hq.events.server.session.AlertDefinitionManagerEJBImpl;
+import org.hyperic.hq.events.server.session.Alert;
+import org.hyperic.hq.events.server.session.AlertDAO;
 import org.hyperic.hq.events.server.session.AlertRegulator;
 import org.hyperic.hq.events.server.session.ClassicEscalationAlertType;
 import org.hyperic.hq.events.server.session.SessionBase;
@@ -437,6 +437,11 @@
 
     private void endEscalation(EscalationState state) {
         if (state != null) {
+            // make sure we have the updated state to avoid StaleStateExceptions
+            state = _stateDAO.findById(state.getId());
+            if (state == null) {
+                return;
+            }
             _stateDAO.remove(state);
             EscalationRuntime.getInstance().unscheduleEscalation(state);
         }
@@ -485,14 +490,15 @@
 
         // XXX -- Need to make sure the application is running before
         // we allow this to proceed
-        _log.debug("Executing state[" + s.getId() + "]");
+        final boolean debug = _log.isDebugEnabled();
+        if (debug) _log.debug("Executing state[" + s.getId() + "]");
 
         if (actionIdx >= e.getActions().size()) {
             if (e.isRepeat() && e.getActions().size() > 0) {
                 actionIdx = 0; // Loop back
             } else {
-                _log.debug("Reached the end of the escalation state[" +
-                           s.getId() + "]. Ending it");
+                if (debug) _log.debug("Reached the end of the escalation state[" +
+                                      s.getId() + "]. Ending it");
                 endEscalation(s);
                 return;
             }
@@ -501,6 +507,14 @@
 
         eAction = (EscalationAction)e.getActions().get(actionIdx);
         action = eAction.getAction();
 
+        AlertDAO dao = DAOFactory.getDAOFactory().getAlertDAO();
+        Alert alert = dao.getById(new Integer(s.getAlertId()));
+        // HHQ-3499, need to make sure that the alertId that is pointed to by
+        // the escalation still exists
+        if (alert == null) {
+            endEscalation(s);
+            return;
+        }
         Escalatable esc = getEscalatable(s);
 
         // HQ-1348: End escalation if alert is already fixed
@@ -516,8 +530,8 @@
         long nextTime = System.currentTimeMillis() +
             Math.max(offset, eAction.getWaitTime());
 
-        _log.debug("Moving onto next state of escalation, but chillin' for "
-                   + eAction.getWaitTime() + " ms");
+        if (debug) _log.debug("Moving onto next state of escalation, but chillin' for "
+                              + eAction.getWaitTime() + " ms");
         s.setNextAction(actionIdx + 1);
         s.setNextActionTime(nextTime);
         s.setAcknowledgedBy(null);

Added: trunk/unittest/data/escalationManagerTests.README
===================================================================
--- trunk/unittest/data/escalationManagerTests.README	                        (rev 0)
+++ trunk/unittest/data/escalationManagerTests.README	2009-12-11 18:31:06 UTC (rev 14075)
@@ -0,0 +1,39 @@
+bin/exportTable.sh -l jdbc:oracle:thin:@localhost:1522:hqdb -u hqadmin -p hqadmin \
+-t EAM_AGENT,\
+EAM_AUDIT,\
+EAM_CONFIG_RESPONSE,\
+EAM_CRISPO,\
+EAM_CRISPO_OPT,\
+EAM_CRITERIA,\
+EAM_DASH_CONFIG,\
+EAM_MEASUREMENT,\
+EAM_MEASUREMENT_CAT,\
+EAM_MEASUREMENT_TEMPL,\
+EAM_MONITORABLE_TYPE,\
+EAM_OPERATION,\
+EAM_PLATFORM,\
+EAM_PLATFORM_SERVER_TYPE_MAP,\
+EAM_PLATFORM_TYPE,\
+EAM_RES_GRP_RES_MAP,\
+EAM_RESOURCE,\
+EAM_RESOURCE_EDGE,\
+EAM_RESOURCE_GROUP,\
+EAM_RESOURCE_RELATION,\
+EAM_RESOURCE_TYPE,\
+EAM_ROLE,\
+EAM_ROLE_OPERATION_MAP,\
+EAM_ROLE_RESOURCE_GROUP_MAP,\
+EAM_SERVER,\
+EAM_SERVER_TYPE,\
+EAM_SERVICE,\
+EAM_SERVICE_TYPE,\
+EAM_SUBJECT,\
+EAM_SUBJECT_ROLE_MAP,\
+EAM_CONFIG_PROPS,\
+EAM_ESCALATION_STATE,\
+EAM_ESCALATION_ACTION,\
+EAM_ESCALATION,\
+EAM_ALERT_DEFINITION,\
+EAM_ALERT,\
+EAM_ALERT_CONDITION,\
+EAM_ACTION -d escalationManagerTest

Added: trunk/unittest/data/escalationManagerTests.xml.gz
===================================================================
(Binary files differ)

Property changes on: trunk/unittest/data/escalationManagerTests.xml.gz
___________________________________________________________________
Name: svn:mime-type
   + application/octet-stream

Added: trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTest.java
===================================================================
--- trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTest.java	                        (rev 0)
+++ trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTest.java	2009-12-11 18:31:06 UTC (rev 14075)
@@ -0,0 +1,37 @@
+package org.hyperic.hq.escalation.server.session;
+
+import org.hyperic.hq.escalation.shared.EscalationManagerTestLocal;
+import org.hyperic.util.unittest.server.BaseServerTestCase;
+import org.hyperic.util.unittest.server.LocalInterfaceRegistry;
+
+public class EscalationManagerTest extends BaseServerTestCase {
+
+    private static final String FILENAME = "escalationManagerTests.xml.gz";
+    private LocalInterfaceRegistry _registry;
+
+    public EscalationManagerTest(String name) {
+        super(name, true);
+    }
+
+    public void setUp() throws Exception {
+        super.setUp();
+        super.insertSchemaData(FILENAME);
+        _registry = deployHQ();
+    }
+
+    public void testExecuteStateWithInvalidAlertId() throws Exception {
+        EscalationManagerTestLocal eMan = (EscalationManagerTestLocal)
+            _registry.getLocalInterface(
+                EscalationManagerTestEJBImpl.class,
+                EscalationManagerTestLocal.class);
+        eMan.testExecuteStateWithInvalidAlertId();
+    }
+
+    public void tearDown() throws Exception {
+        super.tearDown();
+        super.deleteSchemaData(FILENAME);
+        super.tearDown();
+        undeployHQ();
+    }
+
+}

Added: trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTestEJBImpl.java
===================================================================
--- trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTestEJBImpl.java	                        (rev 0)
+++ trunk/unittest/src/org/hyperic/hq/escalation/server/session/EscalationManagerTestEJBImpl.java	2009-12-11 18:31:06 UTC (rev 14075)
@@ -0,0 +1,95 @@
+package org.hyperic.hq.escalation.server.session;
+
+import java.rmi.RemoteException;
+import java.util.Collection;
+
+import javax.ejb.CreateException;
+import javax.ejb.EJBException;
+import javax.ejb.SessionBean;
+import javax.ejb.SessionContext;
+
+import junit.framework.Assert;
+
+import org.hyperic.dao.DAOFactory;
+import org.hyperic.hq.common.SystemException;
+import org.hyperic.hq.escalation.shared.EscalationManagerLocal;
+import org.hyperic.hq.escalation.shared.EscalationManagerTestLocal;
+import org.hyperic.hq.escalation.shared.EscalationManagerTestUtil;
+import org.hyperic.hq.events.server.session.AlertManagerEJBImpl;
+import org.hyperic.hq.events.shared.AlertManagerLocal;
+import org.hyperic.hq.measurement.MeasurementConstants;
+
+/**
+ * The session bean implementing the in-container unit tests for the
+ * AuthzSubjectManager.
+ *
+ * @ejb:bean name="EscalationManagerTest"
+ *      jndi-name="ejb/authz/EscalationManagerTest"
+ *      local-jndi-name="LocalEscalationManagerTest"
+ *      view-type="local"
+ *      type="Stateless"
+ *
+ * @ejb:util generate="physical"
+ * @ejb:transaction type="NotSupported"
+ */
+public class EscalationManagerTestEJBImpl implements SessionBean {
+
+    public static EscalationManagerTestLocal getOne() {
+        try {
+            return EscalationManagerTestUtil.getLocalHome().create();
+        } catch (Exception e) {
+            throw new SystemException(e);
+        }
+    }
+
+    /**
+     * @ejb:interface-method
+     */
+    public void testExecuteStateWithInvalidAlertId() throws Exception {
+        EscalationManagerLocal eMan = EscalationManagerEJBImpl.getOne();
+        AlertManagerLocal aMan = AlertManagerEJBImpl.getOne();
+        EscalationState state = new EscalationState();
+        state.setAcknowledgedBy(null);
+        state.setNextAction(0);
+        state.setNextActionTime(System.currentTimeMillis()+MeasurementConstants.DAY);
+        int alertId = 10105;
+        Escalation esc = eMan.findById(new Integer(100));
+        state.setEscalation(esc);
+        state.setAlertId(alertId);
+        state.setAlertDefinitionId(10001);
+        state.setAlertTypeEnum(-559038737);
+        getOne().runEscalation(state);
+        aMan.deleteAlerts(new Integer[] {new Integer(alertId)});
+        Assert.assertNull(
+            "alert 10105 should not exist", aMan.getAlertById(new Integer(10105)));
+        try {
+            // should exit without any errors
+            eMan.executeState(state.getId());
+        } catch (Exception e) {
+            Assert.assertTrue(e.getMessage(), false);
+            throw e;
+        }
+        Assert.assertNull(
+            "escalationStateId " + state.getId() + " should not exist",
+            getEscalationStateDAO().get(state.getId()));
+    }
+
+    /**
+     * @ejb:transaction type="Required"
+     * @ejb:interface-method
+     */
+    public void runEscalation(EscalationState state) {
+        getEscalationStateDAO().save(state);
+        EscalationRuntime.getInstance().scheduleEscalation(state);
+    }
+
+    private EscalationStateDAO getEscalationStateDAO() {
+        return new EscalationStateDAO(DAOFactory.getDAOFactory());
+    }
+
+    public void ejbCreate() throws CreateException {}
+    public void ejbActivate() throws EJBException, RemoteException {}
+    public void ejbPassivate() throws EJBException, RemoteException {}
+    public void ejbRemove() throws EJBException, RemoteException {}
+    public void setSessionContext(SessionContext arg0) {}
+}
|
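The core of the HHQ-3499 fix above is a general pattern: a scheduled escalation step holds only an alert id, so before acting it must re-load the alert and end the escalation quietly if the alert was deleted in the meantime. Reduced to a sketch (class and field names below are illustrative stand-ins, not Hyperic's API):

```java
import java.util.HashMap;
import java.util.Map;

public class EscalationStepSketch {
    // Stand-in for AlertDAO.getById(): alerts keyed by id.
    static final Map<Integer, String> alerts = new HashMap<>();

    // Returns true if the step ran, false if it ended because the alert is gone.
    static boolean executeStep(int alertId) {
        String alert = alerts.get(alertId); // re-load by id, like dao.getById(...)
        if (alert == null) {
            // The alert was deleted out from under the escalation:
            // end the escalation instead of failing the scheduled job.
            return false;
        }
        // ... run the escalation action against 'alert' ...
        return true;
    }

    public static void main(String[] args) {
        alerts.put(10105, "availability alert");
        System.out.println(executeStep(10105)); // true
        alerts.remove(10105);                   // simulates deleteAlerts(...)
        System.out.println(executeStep(10105)); // false: ends without error
    }
}
```

This is why the unit test in the commit deletes the alert after scheduling the escalation and then asserts both that `executeState` completes without error and that the escalation state row is gone.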