1. Upload into swrep packages
2. change templates with cdbop: get prod/..., !vim prod/..., update prod/..., commit
3. run spma_wrapper.sh as root on each machine to install the packages
4. change configuration files (TODO: yaim)
5. service gridview-publisher start
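The steps above can be sketched as a small driver script. The package name, template path, and node are placeholders (the real prod/... paths are elided above), and the exact swrep/cdbop invocations are assumptions; step 4 (config via yaim) is still a TODO, so it is left out.

```python
import subprocess

# Ordered deployment commands; package, template, and host names below are
# hypothetical placeholders, not the real prod/... values from the notes.
DEPLOY_STEPS = [
    ["swrep-client", "upload", "gridview-publisher.rpm"],              # 1. upload into swrep
    ["cdbop", "update", "prod/example/template.tpl"],                  # 2. commit the edited template
    ["ssh", "root@node01", "spma_wrapper.sh"],                         # 3. install packages on the node
    ["ssh", "root@node01", "service", "gridview-publisher", "start"],  # 5. start the service
]

def run_steps(steps, dry_run=True):
    """Run each command in order; with dry_run=True only report what would run."""
    executed = []
    for cmd in steps:
        if not dry_run:
            subprocess.run(cmd, check=True)
        executed.append(" ".join(cmd))
    return executed
```

Calling run_steps(DEPLOY_STEPS) with dry_run=False would actually execute the commands in order, stopping at the first failure thanks to check=True.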
Tuesday, April 22, 2008
Friday, April 18, 2008
Using Google Web Toolkit (GWT) to send messages
RequestBuilder requestBuilder = new RequestBuilder(RequestBuilder.POST, url);
requestBuilder.setHeader("Content-Type", "application/x-www-form-urlencoded");
try {
    Request request = requestBuilder.sendRequest("destination=test.msg.gwt&type=topic&body=blahblah+agbf", requestCallback);
} catch (RequestException e) {
    // sendRequest throws if the request cannot be initiated
}
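The same form-encoded POST can be built outside the browser, e.g. from Python; the broker endpoint URL below is a hypothetical placeholder, and only the request construction is shown.

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_message_request(url, destination, msg_type, body):
    """Build a form-encoded POST matching the GWT RequestBuilder call above."""
    payload = urlencode({"destination": destination, "type": msg_type, "body": body})
    return Request(
        url,
        data=payload.encode("ascii"),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_message_request("http://broker.example/send",  # hypothetical endpoint
                            "test.msg.gwt", "topic", "blahblah agbf")
# urlopen(req) would actually send it; omitted here.
```

Note that urlencode produces exactly the "destination=...&type=...&body=blahblah+agbf" body that the GWT snippet passes by hand.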
Thursday, March 6, 2008
transpose on oracle
SELECT ENCODEDMIN,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-1', cnt, null )) lxplus240_1,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-0', cnt, null )) lxplus240_0,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-2', cnt, null )) lxplus240_2,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-3', cnt, null )) lxplus240_3,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-4', cnt, null )) lxplus240_4,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-5', cnt, null )) lxplus240_5,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-6', cnt, null )) lxplus240_6,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-7', cnt, null )) lxplus240_7,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-8', cnt, null )) lxplus240_8,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-9', cnt, null )) lxplus240_9
FROM (
SELECT CAST(MSGENCODEDTIME/60 AS INT)*60 AS ENCODEDMIN, SOURCEPUBLISHERID, COUNT(*) AS CNT FROM MSGPERFORMANCE
WHERE CAST(MSGENCODEDTIME/60 AS INT)*60 > 1204796640 --AND CAST(MSGENCODEDTIME/60 AS INT)*60
GROUP BY CAST(MSGENCODEDTIME/60 AS INT)*60, SOURCEPUBLISHERID)
GROUP BY ENCODEDMIN;
SELECT DECODEDMIN,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-1', cnt, null )) lxplus240_1,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-0', cnt, null )) lxplus240_0,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-2', cnt, null )) lxplus240_2,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-3', cnt, null )) lxplus240_3,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-4', cnt, null )) lxplus240_4,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-5', cnt, null )) lxplus240_5,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-6', cnt, null )) lxplus240_6,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-7', cnt, null )) lxplus240_7,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-8', cnt, null )) lxplus240_8,
MAX( DECODE( SOURCEPUBLISHERID, 'Plxplus240.cern.ch-9', cnt, null )) lxplus240_9
FROM (
SELECT CAST(MSGDECODEDTIME/60 AS INT)*60 AS DECODEDMIN, SOURCEPUBLISHERID, COUNT(*) AS CNT FROM MSGPERFORMANCE
WHERE CAST(MSGDECODEDTIME/60 AS INT)*60 > 1204796640 --AND CAST(MSGENCODEDTIME/60 AS INT)*60
GROUP BY CAST(MSGDECODEDTIME/60 AS INT)*60, SOURCEPUBLISHERID)
GROUP BY DECODEDMIN;
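The MAX(DECODE(...)) trick above is the classic pre-11g way of pivoting rows into columns on Oracle. The same transpose is easy to see in Python; the sample rows below are made-up data in the shape the inner SELECT produces (minute bucket, publisher id, count).

```python
from collections import defaultdict

def pivot(rows):
    """Pivot (minute, publisher, count) rows into one dict per minute,
    keyed by publisher -- the same effect as the MAX(DECODE(...)) trick."""
    table = defaultdict(dict)
    for minute, publisher, cnt in rows:
        table[minute][publisher] = cnt
    return dict(table)

# Hypothetical sample rows, one per (minute bucket, publisher).
rows = [
    (1204796700, "lxplus240.cern.ch-0", 12),
    (1204796700, "lxplus240.cern.ch-1", 9),
    (1204796760, "lxplus240.cern.ch-0", 15),
]
pivoted = pivot(rows)
```

Each minute bucket ends up as one row with one entry per publisher, missing publishers simply absent instead of NULL.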
Friday, February 8, 2008
too many connections made the broker stall
We hit a problem when the broker left too many sockets open in "CLOSE_WAIT" state.
gridmsg001 was rebooted as a quick fix, but a better way to keep it up needs to be found!
http://support.bea.com/application_content/product_portlets/support_patterns/wls/TooManyOpenFilesPattern.html
Update: the problem was caused by an excessive number of connections being opened from indiana.edu; the error was tracked down by checking the logs. Indiana corrected their algorithm, but we still need to find a way to block such connections, otherwise a typical denial of service may occur :S
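Instead of waiting for the box to stall, a watchdog could count CLOSE_WAIT sockets and alert past a threshold. A minimal sketch, assuming netstat-style output lines (the sample lines below are fabricated):

```python
def count_close_wait(netstat_lines):
    """Count sockets stuck in CLOSE_WAIT in `netstat -tan`-style output."""
    return sum(1 for line in netstat_lines if "CLOSE_WAIT" in line.split())

# Hypothetical netstat output for the broker port.
sample = [
    "tcp 0 0 137.138.0.1:6163 129.79.0.5:41522 CLOSE_WAIT",
    "tcp 0 0 137.138.0.1:6163 129.79.0.5:41523 ESTABLISHED",
    "tcp 0 0 137.138.0.1:6163 129.79.0.6:55100 CLOSE_WAIT",
]
```

In a real watchdog the lines would come from running netstat (or reading /proc/net/tcp) periodically, paging someone before the file-descriptor limit is hit.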
Tuesday, February 5, 2008
Using the analysis scripts:
Example of usage:
python rNetworkAnalysis.py fastConsumerSummary.csv comparingDataBufferDataDumping/Consumer*
python timeSeriesAnalysis.py fastConsumerTimes.csv > fastConsumerTimeSeries.csv
python mAnalysis.py fastConsumerTimeSeries.csv (results go to temp.csv)
python lagAnalysis.py fastConsumerTimeSeries.csv > fastConsumerLag.csv
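The scripts' internals aren't shown here; as an illustration of the kind of computation the last step performs, a minimal stand-in for a lag analysis over a one-timestamp-per-row series might look like this (the real lagAnalysis.py may do more):

```python
def lags(timestamps):
    """Differences between consecutive timestamps -- a minimal stand-in
    for what lagAnalysis.py computes over a time-series CSV column."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]
```

Feeding it the time column parsed out of fastConsumerTimeSeries.csv would give per-message inter-arrival lags, ready to summarize or plot.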
Wednesday, January 9, 2008
How to connect as root from anywhere
First log in to the gd01.cern.ch or gd02.cern.ch boxes.
From there, we can log in as root as usual.
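The two-hop login can be scripted by composing the ssh commands; the target host below is just an example, and the gateway choice is whatever box is reachable.

```python
def root_login_cmd(gateway, target):
    """Compose an ssh command that hops through a gateway box (gd01/gd02)
    and then logs in as root on the target; the target is a placeholder."""
    return ["ssh", "-t", gateway, "ssh", f"root@{target}"]

cmd = root_login_cmd("gd01.cern.ch", "gridmsg001.cern.ch")
# subprocess.run(cmd) would start the interactive session.
```

The -t flag forces a pseudo-terminal on the first hop so the inner interactive ssh session works.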
Monday, January 7, 2008