Jul 04

Who would have thought! I am starting to use RPG!

I have always said that I did not need to learn or use RPG on the IBM i, as I always found that ‘C’ could do everything I needed. Recently a friend asked me to help with some RPG code that calls Java and needs to clean up the objects it creates: Java would not automatically clean them up because they were effectively created by the RPG program, and since that program ran constantly, temp storage just kept growing until it blew up. Not knowing RPG or understanding how the layout worked (I jumped straight into ‘/Free’!), I found this very difficult, as ‘/Free’ is not really free format (there are still some column constraints) while ‘C’ really is. Still, after some research and a lot of head scratching I finally got some sample code working. We then built a service program to handle the Java clean up and added code to the existing RPG programs to call it. The solution works, and the client’s systems are no longer blowing up with memory issues caused by Java objects not being cleaned up.

I thought, OK, that’s the last time I will have to do that, and I was happy to get back to good old ‘C’ programming. Unfortunately, I came across another issue which required me to pick up the RPG manuals and code up a test application.

We have a client who was experiencing problems with an application that uses commitment control and constraints, which required us to build a test to emulate the problem on our systems. As usual, the first thing I did was attempt a ‘C’ based solution. I did find a commitment control test written by Paul Tuohy here. It was all written in RPG, so I thought I would just follow the program logic and write a ‘C’ version, which seemed the easiest option. While I could get the simple file update logic built, and the program worked without commitment control, I found that as soon as commitment control was started the program would freeze on receipt of data from STDIN (I will have to ask IBM why when I have time). So I decided my best option was to take the code Paul had provided and build my own interpretation of the program with some additional features I needed.

I wanted the program to accept multiple entries, plus allow deletes by key before the commit of the data, so I had to make a few changes to the logic and add a new delete option. While the program is very clunky, it does achieve what I needed it to do, and I found out a lot about commitment control and constraints as a result. I am also unsure if the program is as efficient as it could be, but it works, and for now that is all that’s needed.

Here is the code I ended up using.


H Option(*SrcStmt : *NoDebugIO) DftActGrp(*No) ActGrp('COMMITDEMO')
FHeader1   UF A E           K Disk    Commit(SomeTimes)
FDetails1  UF A E           K Disk    Commit(SomeTimes)

D Commit1         PR                  ExtPgm('COMMITRPG')
D  SomeTimes                      n

D Commit1         PI
D  SomeTimes                      n

D ToDo            S              1a
D ToDel           S              1a
D ToAdd           S              1a
 /Free
   ToAdd = *Blanks;
   Dow (ToAdd <> 'n');
     Dsply 'Enter a Key Value (2 long): ' ' ' Key;
     If (Key <> *Blanks);
       Text = 'Header for ' + Key;
       Write Header;
       For Sequence = 1 to 3;
         Text = 'Detail for ' + Key + ' ' + %Char(Sequence);
         Write Details;
         Chain Key Header1;
         NumRows += 1;
         Update Header;
       EndFor;
       NumRows = 0;
     EndIf;
     Key = *Blanks;
     Dsply 'Enter more Keys y/n : ' ' ' ToAdd;
   EndDo;

   ToDel = *Blanks;
   Dsply 'Do you want to delete entries y/n : ' ' ' ToDel;
   Dow (ToDel <> 'n');
     Key = *Blanks;
     Dsply 'Enter the Key to Delete : ' ' ' Key;
     If (Key <> *Blanks);
       Chain Key Header1;
       If %Found;
         Delete Header1;
       EndIf;
     EndIf;
     ToDel = *Blanks;
     Dsply 'Delete more entries y/n : ' ' ' ToDel;
   EndDo;

   ToDo = *Blanks;
   Dow (ToDo <> 'c' and ToDo <> 'r' and ToDo <> 'i');
     Dsply 'c - Commit, r - Rollback, i - Ignore ' ' ' ToDo;
   EndDo;

   If SomeTimes and (ToDo = 'c');
     Commit;
   ElseIf SomeTimes and (ToDo = 'r');
     RolBk;
   EndIf;

   *InLR = *On;
 /End-Free


The database was exactly the same as the one Paul had defined, including the cascading delete for the Details file (I liked that bit), so when we delete the Header record the matching records in the Details file are also deleted. That saved us having to chain (see, I can speak RPG) through the Details file and remove the entries ourselves. Now we can see the problem the client was experiencing and know how to resolve it.

As usual Google was our best friend; thanks to Paul Tuohy and IT Jungle for providing the sample code we based the test application on. I am now a little less resistant to RPG and may delve a little more into its capabilities and how I can use it effectively; who knows, I may even become good at it?? The point I am trying to make is that while I still do not want to use RPG, I did what I keep telling others to do: I used the best tool for the job. Using a language just because it is all you know is not always the best option; sometimes you have to jump outside of your comfort zone and try something new.

Chris…

May 31

Bob Cancilla’s off the mark!

I thought Bob Cancilla was actually changing his position on the need to pull away from the IBM i, but it looks like he has had yet another episode! You can find a copy of his latest rant here
http://planet-i.org/2013/05/29/continued-decline-of-the-ibm-i/

Here are my views on his comments.

1. Yes, the IBM i install base is dwindling, but that is not because the platform is unsupported by IBM. Companies merge, so the server technology changes and the count generally decreases through consolidation. Companies go bust and close their doors, meaning their servers are no longer needed; if you haven’t noticed, the last 5 – 10 years have not been growth years.

2. The fact that COMMON Europe cancelled its conference is not a sign that there are no IBM i installs out there; the economy in Europe is bad and budgets have been cut for everyone! He does not mention what conferences for his platforms of choice have seen in terms of attendance. Holding the conference in a very expensive, exclusive French resort was not the best decision COMMON Europe ever made. IBM pulled out because sending people to Europe is expensive, and the location chosen was obviously a major factor in their decision, especially when no one else was going!

3. The Nordic numbers are not backed up by the graphic in the link, so I assume the reduction in numbers comes from some other source? If there were 10,000 customers running IBM i, was that a count of systems or of actual customers? And why concentrate on the Nordics as an indicator for the rest of the world? As I have said, the numbers must be dwindling, but some of that has to do with the power of the newer systems. I personally had 3 systems running for our business until we purchased a new Power 6 system, all of them in the P05 tier group! I now have a single system running 3 partitions, each of which is probably 3 – 4 times faster than the previous Power 5+ i515 system alone, so I need far fewer systems to deliver better user experiences. If I went to a Power 7 this would increase dramatically again! Others have obviously done the same as I did and reduced the number of servers.

4. IBM has been getting out of hardware since I worked at IBM Havant (1975 – 1993); nothing has changed there. The fact that they are selling the x86 business is good for Power: if Power were the problem, they would be getting rid of it! Yes, IBM invested in Linux, but obviously not for x86 hardware (they are desperately trying to get out of that), so again it was probably for the Power hardware, and why would they do that if Power were being dropped? There are many other reasons too, such as services revenue and software licensing (Linux is not free at the Enterprise level), so it is a mix of everything above.

5. RPG locks you into the platform so it is bad? Hmmm, then why not use one of the other languages available on the platform? You have a choice of many languages on the IBM i, and my very personal opinion is that anyone who is using only RPG is cutting their own throat! RPG is just a tool in the toolbox, so pick the best tool for the job. If I am going to have to rebuild my entire application just to change the language, why would I add a new platform and all of the complexities of a new OS into the mix? I could train a ‘C’ developer on Linux to develop in ‘C’ on the IBM i a lot faster than I could train an RPG developer to develop in ‘C’ on Linux, and that goes for any language the IBM i supports (especially Java). Even though RPG is a key tool on the IBM i, we need to reduce the emphasis placed on it and start to push the other languages just as hard.

We are being told CLOUD is the next leap of faith for the IT community. If you believe the hype, it means you are not interested in how the result is delivered or what produced it, just that it is available all the time and at a lower cost. As usual there are lots of ideas on what this means in terms of application delivery, and many of them are a new set of acronyms for the same technologies that refused to fly years ago. I have doubts that the Cloud is the answer, and I am sure that before too long we will have a new word for it! Having said that, if the Cloud is the next evolution of IT delivery, why does it do anything but create the need for stable, dependable, highly available, flexible systems (oh, did I just describe the IBM i???). So while I respect Bob’s right to keep trying to build his business using scare tactics and bluff, I for one will keep an open mind about dumping the IBM i in favor of moving to something new.

Just to set the record straight, I run Windows, Mac, Linux, AIX and IBM i. I have spent a lot of time developing on Windows, Linux and IBM i (IBM i the most), all in a single language, ‘C’ (or the related object version). In my view the IBM i is the simplest for many reasons, not least the integration of everything you need to build a total solution. I use PHP for interface building (80 column screens just don’t hack it for me) and prefer to run the web services from Linux or Windows, but the IBM i can perform as a web server if needed.

So if you do as Bob says and take a deep and meaningful look at your IT infrastructure, consider changing just the development language before jumping to a new platform, OS and development tool set! Remember, with ILE you can build a solution out of many languages and they will all work in harmony, so you can steadily replace older programs with new ones.

Chris…

Sep 23

Re-test of the new XMLSERVICE after optimizing the RPG programs.

The discussion about the new XMLSERVICE performance on the iSeriesNetwork forums pointed us to a possible improvement which is not shipped as standard. The poster suggested that we change the compile options for the RPG programs to *FULL optimization and run the tests again. I would question why the CL programs which are shipped do not have this already set if it is a known improvement, but we dutifully went through the CL programs, set the optimization parameters and created the programs again.

Here is the result of the tests now. It should be noted that the initial call is a lot slower than subsequent calls, so we took the liberty of refreshing the screen a couple of times before copying the results.

New Zend Toolkit. Input value is changing on each call.

Time (loop=5000) total=22.32 sec (4.46 ms per call)

Parameter name Input value Output value
var1 Y C
var2 Z D
var3 001.0001 321.1234
var4 0000000003.04 1234567890.12
ds1 A E
ds2 B F
ds3 005.0007 333.3330
ds4 0000000006.08 4444444444.44

As you can see, it does show an improvement over the previous test, but it is still far slower than the EASYCOM for PHP process. The poster does mention that we should be running the latest version, 1.5, which we checked; the programs are not marked with the release level (maybe that would be a good idea), but the zip file we downloaded is xmlservice-rpg-1.5, which should indicate the contents are at the 1.5 code level? If not, maybe the above numbers will also improve with the correct level of code.

These tests were run on a fairly low end system; maybe a bigger system with lots of memory and CPU would change the gap between the 2 processes? If you do have a big system and want to run the same tests we did, we will be happy to publish your results.

We have also created a new group on LinkedIn which will discuss the use of EASYCOM for PHP, you can join the group and engage with us in making PHP for IBM i a great solution for application modernization.

Chris…

Sep 07

New XMLService is not what I expected.

As mentioned in the previous post, I was setting up Zend Server on our V6R1 system to give the new XMLSERVICE a try, to see if program calls would be faster using it rather than the old i5_toolkit APIs. Unfortunately we had to abandon the V6R1 install due to a number of problems with the install of the XMLSERVICE and the NewToolkit. After 5 hours of installing and configuring, we were ready to give up.

However, we decided to give it a go on the V7R1 system, as that did seem to compile the RPG XMLSERVICE programs OK. We installed the latest download of Zend Server but did not install the update Cum package. This seems to have resolved our initial problem with XMLSERVICE requests, which had ended in error when calling the Db2supp.php script? But that might be a total coincidence.

Once we had everything installed, we connected all of our old web configurations back up and started the main web server (we have 5 virtual hosts running below the main web server) to begin testing the setup. As far as the main PHP services went everything seemed to be running OK, so we decided to set up the XMLSERVICE.

The first program we compiled, CRTXML, worked perfectly and the XMLSERVICE programs were all created correctly. However, as we wanted to use the ZENDSVR install, we then tried to run the CRTXML2 program; this ended with an error creating the XMLSERVICE program due to some missing definitions. A quick compare of the 2 programs showed that CRTXML2 did not create or embed the PLUDB2 or PLUGSQL modules in the service programs. We amended the program, re-created it, and then everything worked as it should.

Next we had to link the NewToolkit directory to the web server we were going to use and add the relevant configs to make sure we could use the symbolic links we had created to the directory.

According to the documentation, the XMLSERVICE is started the first time it is called, but to be honest the documentation is a bit sparse and not easily understood the first time you go to use it. I am not an expert in the product, but I did get it working, so it can’t be that bad. All we had to do then was create a couple of scripts to test things out.

A program is shipped in the NewToolkit library which can be compiled and used for the tests. It is called ZZCALL, an RPG program that takes a few parameters and just adds some values to them. I am not an RPG expert, as I have said all along, but the program is pretty simple, which is all we needed for the test.

The next thing we needed was the scripts which would call the program and capture timing information as they run.

The first script calls the RPG program, sending in new parameter values on each call.


<?php
/*
  RPG program parameter definitions:
    INCHARA  S   1a
    INCHARB  S   1a
    INDEC1   S   7p 4
    INDEC2   S  12p 2
    INDS1    DS
     DSCHARA     1a
     DSCHARB     1a
     DSDEC1      7p 4
     DSDEC2     12p 2
*/
include_once 'authorization.php';
include_once '../API/ToolkitService.php';
include_once 'helpshow.php';

$loop = 5000;

$start_time = microtime();
echo "New Zend Toolkit. Input value is changing on each call.<br><br>";

try {
    $ToolkitServiceObj = ToolkitService::getInstance($db, $user, $pass);
}
catch (Exception $e) {
    echo $e->getMessage(), "\n";
    exit();
}

$ToolkitServiceObj->setToolkitServiceParams(array('InternalKey' => "/tmp/$user"));

$IOParam['var1'] = array("in" => "Y", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'INCHARA', 'var1', $IOParam['var1']['in']);

$IOParam['var2'] = array("in" => "Z", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'INCHARB', 'var2', $IOParam['var2']['in']);

$IOParam['var3'] = array("in" => "001.0001", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterPackDec('both', 7, 4, 'INDEC1', 'var3', '001.0001');

$IOParam['var4'] = array("in" => "0000000003.04", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterPackDec('both', 12, 2, 'INDEC2', 'var4', '0000000003.04');

$IOParam['ds1'] = array("in" => "A", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'DSCHARA', 'ds1', 'A');

$IOParam['ds2'] = array("in" => "B", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'DSCHARB', 'ds2', 'B');

$IOParam['ds3'] = array("in" => "005.0007", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterPackDec('both', 7, 4, 'DSDEC1', 'ds3', '005.0007');

$IOParam['ds4'] = array("in" => "0000000006.08", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterPackDec('both', 12, 2, 'DSDEC2', 'ds4', '0000000006.08');

//$param[] = array('ds'=>$ds);
$param[] = $ToolkitServiceObj->AddDataStruct($ds);
$param[2]->setParamValue(0);
for ($i = 0; $i < $loop; $i++) {
    $result = $ToolkitServiceObj->PgmCall('ZZCALL', "ZENDSVR", $param, null, null);
    $param[2]->setParamValue($result['io_param']['ds3']);
} // end loop

$end_time = microtime();
$wire_time = control_microtime_used($start_time, $end_time) * 1000000;
echo sprintf("<br><strong>Time (loop=$loop) total=%1.2f sec (%1.2f ms per call)</strong><br>",
    round($wire_time / 1000000, 2),
    round(($wire_time / $loop) / 1000, 2));

if ($result) {
    /* update the parameters array with the returned values */
    foreach ($IOParam as $key => &$element) {
        $element['out'] = $result['io_param'][$key];
    }
    echo "<br>";
    showTableWithHeader(array("Parameter name", "Input value", "Output value"), $IOParam);
}
else {
    echo "Execution failed.";
}
/* Do not use the disconnect() function for a "stateful" connection */
$ToolkitServiceObj->disconnect();

function control_microtime_used($before, $after) {
    return (substr($after, 11) - substr($before, 11)) + (substr($after, 0, 9) - substr($before, 0, 9));
}
?>

Running this script resulted in the following results.

New Zend Toolkit. Input value is changing on each call.

Time (loop=5000) total=26.11 sec (5.22 ms per call)

Parameter name Input value Output value
var1 Y C
var2 Z D
var3 001.0001 321.1234
var4 0000000003.04 1234567890.12
ds1 A E
ds2 B F
ds3 005.0007 333.3330
ds4 0000000006.08 4444444444.44

The next test we ran would just call the program without changing the parameters, to see what effect that had.


<?php
/*
  RPG program parameter definitions:
    INCHARA  S   1a
    INCHARB  S   1a
    INDEC1   S   7p 4
    INDEC2   S  12p 2
    INDS1    DS
     DSCHARA     1a
     DSCHARB     1a
     DSDEC1      7p 4
     DSDEC2     12p 2
*/
include_once 'authorization.php';
include_once '../API/ToolkitService.php';
include_once 'helpshow.php';

$loop = 5000;

$start_time = microtime();
echo "New Zend Toolkit. Input values never change on each call.<br><br>";

try {
    $ToolkitServiceObj = ToolkitService::getInstance($db, $user, $pass);
}
catch (Exception $e) {
    echo $e->getMessage(), "\n";
    exit();
}

$ToolkitServiceObj->setToolkitServiceParams(array('InternalKey' => "/tmp/$user"));

$IOParam['var1'] = array("in" => "Y", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'INCHARA', 'var1', $IOParam['var1']['in']);

$IOParam['var2'] = array("in" => "Z", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'INCHARB', 'var2', $IOParam['var2']['in']);

$IOParam['var3'] = array("in" => "001.0001", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterPackDec('both', 7, 4, 'INDEC1', 'var3', '001.0001');

$IOParam['var4'] = array("in" => "0000000003.04", "out" => "");
$param[] = $ToolkitServiceObj->AddParameterPackDec('both', 12, 2, 'INDEC2', 'var4', '0000000003.04');

$IOParam['ds1'] = array("in" => "A", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'DSCHARA', 'ds1', 'A');

$IOParam['ds2'] = array("in" => "B", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterChar('both', 1, 'DSCHARB', 'ds2', 'B');

$IOParam['ds3'] = array("in" => "005.0007", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterPackDec('both', 7, 4, 'DSDEC1', 'ds3', '005.0007');

$IOParam['ds4'] = array("in" => "0000000006.08", "out" => "");
$ds[] = $ToolkitServiceObj->AddParameterPackDec('both', 12, 2, 'DSDEC2', 'ds4', '0000000006.08');

//$param[] = array('ds'=>$ds);
$param[] = $ToolkitServiceObj->AddDataStruct($ds);
$param[2]->setParamValue(0);
for ($i = 0; $i < $loop; $i++) {
    $result = $ToolkitServiceObj->PgmCall('ZZCALL', "ZENDSVR", $param, null, null);
    //$param[2]->setParamValue($result['io_param']['ds3']);
} // end loop

$end_time = microtime();
$wire_time = control_microtime_used($start_time, $end_time) * 1000000;
echo sprintf("<br><strong>Time (loop=$loop) total=%1.2f sec (%1.2f ms per call)</strong><br>",
    round($wire_time / 1000000, 2),
    round(($wire_time / $loop) / 1000, 2));

if ($result) {
    /* update the parameters array with the returned values */
    foreach ($IOParam as $key => &$element) {
        $element['out'] = $result['io_param'][$key];
    }
    echo "<br>";
    showTableWithHeader(array("Parameter name", "Input value", "Output value"), $IOParam);
}
else {
    echo "Execution failed.";
}
/* Do not use the disconnect() function for a "stateful" connection */
$ToolkitServiceObj->disconnect();

function control_microtime_used($before, $after) {
    return (substr($after, 11) - substr($before, 11)) + (substr($after, 0, 9) - substr($before, 0, 9));
}
?>

This resulted in the following output.

New Zend Toolkit. Input values never change on each call.

Time (loop=5000) total=26.95 sec (5.39 ms per call)

Parameter name Input value Output value
var1 Y C
var2 Z D
var3 001.0001 321.1234
var4 0000000003.04 1234567890.12
ds1 A E
ds2 B F
ds3 005.0007 333.3330
ds4 0000000006.08 4444444444.44

After this we ran the same request using the old i5_toolkit, with the following code.


<?php
//require_once('connection.inc');

$loop = 5000;

$start_time = microtime();
echo "Original Easycom Toolkit.<br><br>";

$conn = i5_connect("", "", "");
if (!$conn) {
    $tab = i5_error();
    die("Connect: ".$tab[2]." "."$tab[3], $tab[0]");
}

/* prepare */
$description = array(
    // single parms
    array("Name" => "INCHARA", "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_CHAR,   "Length" => "1"),
    array("Name" => "INCHARB", "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_CHAR,   "Length" => "1"),
    array("Name" => "INDEC1",  "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_PACKED, "Length" => "7.4"),
    array("Name" => "INDEC2",  "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_PACKED, "Length" => "12.2"),
    // structure parm
    array(
        "DSName" => "INDS1",
        "Count"  => 1,
        "DSParm" => array(
            array("Name" => "DSCHARA", "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_CHAR,   "Length" => "1"),
            array("Name" => "DSCHARB", "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_CHAR,   "Length" => "1"),
            array("Name" => "DSDEC1",  "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_PACKED, "Length" => "7.4"),
            array("Name" => "DSDEC2",  "IO" => I5_IN|I5_OUT, "Type" => I5_TYPE_PACKED, "Length" => "12.2"),
        )
    )
);
$pgm = i5_program_prepare("ZENDSVR/ZZCALL", $description);
if (!$pgm) {
    $tab = i5_error();
    die("Prepare: ".$tab[2]." "."$tab[3], $tab[0]");
}

// *** parameter list allocation
$list = array(
    "DSCHARA" => "x",
    "DSCHARB" => "y",
    "DSDEC1"  => 66.6666,
    "DSDEC2"  => 77777.77,
);
// *** parameter values passed to the program
$in = array(
    "INCHARA" => "a",
    "INCHARB" => "b",
    "INDEC1"  => 0,
    "INDEC2"  => 222.22,
    "INDS1"   => $list,
);
// *** names of the variables created for the out parameters
$out = array(
    "INCHARA" => "INCHARA",
    "INCHARB" => "INCHARB",
    "INDEC1"  => "INDEC1",
    "INDEC2"  => "INDEC2",
    "INDS1"   => "OUTDS1",
);

for ($i = 0; $i < $loop; $i++) {
    $rc = i5_program_call($pgm, $in, $out);
    if ($rc == false) {
        $tab = i5_error();
        die("Call: ".$tab[2]." "."$tab[3], $tab[0]");
    }
    $in['INDEC1'] = $OUTDS1['DSDEC1'];
} // end loop

$end_time = microtime();
$wire_time = control_microtime_used($start_time, $end_time) * 1000000;
echo sprintf("<br><strong>Time (loop=$loop) total=%1.2f sec (%1.2f ms per call)</strong><br>",
    round($wire_time / 1000000, 2),
    round(($wire_time / $loop) / 1000, 2));

echo "<br>";
echo $INCHARA."<br>";
echo $INCHARB."<br>";
echo $INDEC1."<br>";
echo $INDEC2."<br>";
echo $OUTDS1['DSDEC1'].'<br>';
//var_dump($INDS1);

/* close */
$rc = i5_close($conn);

function control_microtime_used($before, $after) {
    return (substr($after, 11) - substr($before, 11)) + (substr($after, 0, 9) - substr($before, 0, 9));
}

This resulted in the following output.

Original Easycom Toolkit.

Time (loop=5000) total=0.66 sec (0.13 ms per call)

C
D
321.1234
1234567890.12
333.333

Next we looked at the new XML request in the new i5_toolkit; this is available for download from the Aura website and installs the required objects over an existing ZendCore or ZendServer install. This is good for those developers who wanted an XML type interface for program calls, especially as Zend regularly said it was something users wanted, as they found the old i5_toolkit method difficult to understand.

This is the code we ran.


<?php
//require_once('connection.inc');

$loop = 5000; //50000;

$start_time = microtime();
echo "Xml Easycom, using Associative Arrays Input/Output.<br><br>";

$conn = i5_connect( "", "", "");
if (!$conn)
{ $tab = i5_error();
die("Connect: ".$tab[2]." "."$tab[3], $tab[0]");
}

/* prepare */

$SRPG="DS1 DS;
DSCHARA 1a;
DSCHARB 1a;
DSDEC1 7p4;
DSDEC2 12p2;

ZZCALL PR extpgm(ZENDSVR/ZZCALL);
INCHARA 1a;
INCHARB 1a;
INDEC1 7p4;
INDEC2 12p2;
INDS1 likeds(DS1);
";
i5_XmlDefine ("s-rpg", $SRPG);

// *** parameter list allocation
$list=array(
"DSCHARA"=>"x",
"DSCHARB"=>"y",
"DSDEC1"=>66.6666,
"DSDEC2"=>77777.77,
);
$ArrayIn["INCHARA"] = "a";
$ArrayIn["INCHARB"] = "b";
$ArrayIn["INDEC1"] = 0;
$ArrayIn["INDEC2"] = 222.22;
$ArrayIn["INDS1"] = $list;

for ($i = 0; $i < $loop; $i++) {
    $ArrayIn["INDEC1"] = $i / 1000;
    $ArrayOut = i5_XmlCallProgram("ZZCALL", $ArrayIn);
    $ArrayIn["INDEC1"] = $ArrayOut['INDS1']['DSDEC1'];
} // end loop

$end_time = microtime();
$wire_time= control_microtime_used($start_time,$end_time)*1000000;
echo
sprintf("<br><strong>Time (loop=$loop) total=%1.2f sec (%1.2f ms per call)</strong><br>",
round($wire_time/1000000,2),
round(($wire_time/$loop)/1000,2));

echo '<UL><LI><PRE>';
print_r($ArrayOut);
echo '</PRE></LI></UL>';

/* close */
/*flush();
set_time_limit(60);
for(;;);
*/

$rc = i5_close($conn);

function control_microtime_used($before,$after) {
return (substr($after,11)-substr($before,11))+(substr($after,0,9)-substr($before,0,9));
}

This is the output.

Xml Easycom, using Associative Arrays Input/Output.

Time (loop=5000) total=4.27 sec (0.85 ms per call)

Array
(
[INCHARA] => C
[INCHARB] => D
[INDEC1] => 321.1234
[INDEC2] => 1234567890.12
[INDS1] => Array
(
[DSCHARA] => E
[DSCHARB] => F
[DSDEC1] => 333.333
[DSDEC2] => 4444444444.44
)

)

Finally we decided to look at the performance hit using our preferred installation: Easycom Server running on the IBM i, with Apache and PHP running on a PC or Linux server. The script runs on the Linux/PC box with the program being run on the IBM i. The code we ran is exactly the same code which ran for the test on the IBM i.

Here is the output.

Original Easycom Toolkit.

Time (loop=5000) total=1.23 sec (0.25 ms per call)

C
D
321.1234
1234567890.12
333.333

So it looks like the claims that the new toolkit is much faster than the Easycom i5_toolkit do not stand up in this situation? Is this representative? Maybe, maybe not. As you can see from the above, the old i5_toolkit is around 5 times faster (actually over 40, see the comment) than the new XMLSERVICE using the test programs. The new XML request provided in the new i5_toolkit is also slower than its predecessor, but nowhere near as slow as the new XMLSERVICE; even running the old i5_toolkit requests on a PC, with the added communication delay, is still faster than the new IBM/Zend toolkit.

So we have done what we set out to do and looked at a simple test to see if the new XMLSERVICE stands up to its marketing; our opinion is that it is not as fast, based on our very simple tests. If you have different results, share them with us.

If you would like to understand more about our PHP experiences let us know.

Chris…

Sep 07

Installing PHP again to try the XMLSERVICE toolkit


There is a lot of discussion around the new IBM/Zend toolkit, which is meant to replace the EasyCom toolkit currently shipped with the Zend product, so I thought I would give it a quick spin to see what all the fuss is about.

The first thing I had to do was remove the existing Zend PHP install and clean everything up; no point in testing the new toolkit with a down level server. As usual the uninstall takes a very long time and does not clean up after itself, so a few additional minutes were required to remove the remaining links and objects.

Next we looked for a copy of Zend Server for IBM i, not the easiest item to find on the Zend site, but we did find it and downloaded the zip file. We also noticed a Cumulative PTF package which seems to be needed as well, though to be honest we are not sure? The Cum package is actually larger than the base package? Anyhow, we decided to download both items and start the install.

Installing the base product took 40 minutes, with an additional 13 minutes to install the MySQL server. Because we had downloaded the Cum package we then installed it using the supplied instructions, which failed! The install process automatically starts all of the server processes, and the PTF install fell over because they were running. Once we had stopped all of the running processes, the PTF install took a further 36 minutes to complete. Your times may vary depending on the server and the number of active jobs etc.

The next step was to find the XMLSERVICE package; we have only ever found it using a link provided on the forum boards, so we are not even sure it is the right one to use? (If you know the correct link, let us know.) We downloaded the stable release and, as per the instructions, uploaded the save file to the IBM i and restored the XMLSERVICE library. The library has a number of source files and objects which require compiling before you can use the services. We did not have Option 31 of WDS (the RPG compiler) installed, so we had to install that first. However, we are now at a standstill because the compiler could not compile the PLUGXML module due to some parameters being passed incorrectly in the source code. We have posted to the forums asking for help, as we are not RPG programmers and have no idea why the compiler is complaining about the code.

That’s as far as we have got today. It has taken 5.5 hours so far, and while we do have the Zend Server installed and running, we do not yet have the XMLSERVICE up and running to give it a try.

Once we have the problems figured out and the programs created, we will create a couple of simple tests to try the new XML routines out against the old i5_toolkit routines. Keep watching for the results.

Chris…

May 11

RPG OA The debate continues (by a few)


Well, the few pundits who bother to put pen to paper about the IBM i platform and new technology are certainly putting some passion into this one! I have made my position known and do not see any reason to alter it, but we are interested in what others feel. As usual, only a few even bother to place any comment on the boards; having said that, those who do certainly add some spice and plenty of passion to the discussions! Unfortunately that is more than outweighed by the lack of responses. Most seem willing to just sit back and enjoy the ride, letting others do their bidding.

I do not agree with everything Trevor Perry has to say, but one thing I do agree with him on: unless you show some passion in your support of the IBM i platform, you will be adding to its demise! You should take the opportunity to post your comments on boards, blogs etc. which support the IBM i. Even a few words such as “hear hear” show you have a level of support for what we feel is the best computing platform bar none! One thing I do believe is that you need to be honest; I do not agree with the sentiment that negative answers should be shunned! If it’s true, it needs to be said! Posting only positive statements is for the marketing people; debate can take a negative comment and show where it could be positive, even if we end up agreeing to disagree.

For me the IBM i is a great solution; it’s not perfect in some respects, but it will suffice. I think IBM needs to do better, yet this is probably (I say probably because I have not even seen the technology yet) a good start down the right road. Will we ever get to a stage where we have a fully integrated XServer solution on the IBM i? I doubt it. Not that it would be impossible, but I do not think IBM sees that as a good move.

We have placed a poll on the site to see what you think. We have also opened up the comment section so you can post without having to register (not sure why, but registering seems to be an impediment to some) and voice your thoughts and concerns here. Not sure what effect it will have, but having a few more places to put your opinion down may help with IBM’s decisions on the future of the technology. Please take the time to add your check in the appropriate box on the poll, or if you would like to add another question, post it here; get involved and get your co-workers involved too.

Chris…

Apr 25

PHP and IBM ‘i’ without Zend or HTTP running on the IBM ‘i’

We have always built HTTP interfaces for the IBM ‘i’ with the view that they should run on the IBM ‘i’. Our main reason for this was performance: if a web server on a Linux box has to talk to a web server on the IBM ‘i’ to get related information, this would certainly slow things down significantly, plus the complexity would certainly take some managing.

We have said for a long time that the IBM ‘i’ HTTP server is very slow in comparison to the Linux server; we did try to install the Sugar CRM package on the IBM ‘i’ but had to give up and move it back to our Linux server because it was simply too slow. Add to this the complexity PHP brought to the IBM ‘i’ when Zend Core was first introduced, and we felt it was a non-starter where application modernization was concerned. Zend Server did change our view a bit: not having to use the PHP server as a proxy was the first improvement, and using FastCGI not only reduced the complexity involved in setting up the PHP environment but also lowered the overhead and improved response times. This made us look at using PHP-based interfaces for our products again.

Before we take this story further, I would also like to point out that we have looked at the Look Software products for providing an interface to our products; they are certainly first class and bring a lot of potential to the market in terms of application modernization. Our concern is the cost of the runtime, which we would have to impose on our customers if they wanted this kind of interface, plus we had to provide our own changes to the screens etc. to make it really worthwhile. I still firmly believe the Look Software products are the best way forward for application modernization where you start with refacing and then add more integration with other platforms and servers as you move forward. Their announced support for the new RPG Open Access and the availability of handlers is certainly something RPG shops need to consider. However, for our products we feel providing a new PHP interface is the better option; we don’t need to provide cross-product integration and we don’t use RPG for our display management. Starting to develop in RPG just to get access to the technology doesn’t seem the right thing to do.

We have been talking about our use of PHP for a long time; in fact our websites, and a number of others that we have developed, are all PHP based. Our concern has always been the setup and management of an environment to support PHP on the IBM ‘i’. We have also noticed a number of issues with the new Zend Server on the IBM ‘i’, as well as a number of IBM HTTP problems, particularly the slow response times from the *ADMIN servers and the constant need to manage the Java environment. So we need to make sure that whatever direction we take is maintainable.

We recently moved our development from a 520 system with 2GB of memory to a new system with 4GB of memory. The 2GB system performed more than adequately for the programming and testing of IBM ‘i’ based programs but really struggled when we turned on the HTTP server and tried to use it with any degree of simulated loading. The new system, with 280GB of DASD (15% utilized) and 4GB of memory, does start the HTTP servers a lot quicker, and the overall performance of the applications running under HTTP has improved, although to be honest it is not enough. Once you add the PHP server to the mix, which we use to interact with IBM ‘i’ programs/objects, the response times certainly leave something to be desired. I expect if I double the memory again things will change, but the cost of that is pretty steep, probably a lot more than a new PC server to run a Linux-based web server, even with IBM’s new memory pricing.

This had us thinking about how we could interact with the IBM ‘i’ from a Linux server running PHP: could we supply the programs and data from the IBM ‘i’ and leave the Linux server to manage the HTTP side of the house?

Our first thought was to write some kind of client-server tool which would pass requests from the Linux HTTP service through to the IBM ‘i’. This would require us to create a module for the HTTP server (something we have not done, so the impact would be quite heavy in terms of learning curve and time to market) which could be bolted in by our customers. It would not have to be too complex because it would only be for our programs. Next we thought about the PHP modules available from EasyCom; after all, this is how the i5_Toolkit works on the IBM ‘i’. As it turns out, this is the route we took. Our initial concern about having two HTTP servers talking to each other reared its ugly head, but after installing the EasyCom solution we found something which came as a big surprise: they do not require the HTTP or PHP server to be running on the IBM ‘i’!

If you are using the i5_Toolkit already, you will know that the I5_COMD server has to be running for PHP to service i5_Toolkit requests. It turns out this is the same service which is used for a remote i5_Toolkit request from a Windows or Linux server; it just needs an additional key to enable the functionality. We did ask how this is installed when Zend is removed from the IBM ‘i’, and it is simply an FTP of a few objects from the Linux server, as they are shipped in the Linux package.

Installation of the Linux modules did take some figuring out (the EasyCom manuals are not the best), but once we did, we moved the same code we had used directly on the IBM ‘i’ over to the Linux system to test it out, and it worked like a charm.

So what is the downside? Well, there will be a cost for the runtime! I am not sure what that cost will be, as EasyCom has yet to give me any indication. That could put this in the same position as the Look Software proposal; as we do not know either cost, we cannot make the comparison. You also lose the db2_ functions from the PHP stack, because these are IBM supplied and no Linux or Windows variants are available as far as I can find. Having said that, our tests using the i5_Toolkit functions performed just as well as, if not better than, the db2_ functions.

Upside? Well, first of all you get away from all the complexity of setting up the web server on the IBM ‘i’. I have run Zend PHP on a Linux server for over 10 years and it has never folded on me like the IBM ‘i’ installation did recently! It is faster: I ran the very same pages on the Linux system as I used for testing on the IBM ‘i’, and the responses with data were significantly better. A file with 30,000 records in it came back in probably half the time. I can remove the terrible *ADMIN servers; since I don’t need the HTTP servers for anything other than running PHP services, I get back a lot of the CPU and memory which was taken up by the *ADMIN servers. I can entirely remove the Zend Server from the IBM ‘i’; I am not saying the Zend Server is to blame, but removing it will simplify the management of the system. I will have access to more support by going with the open source servers than I do with Zend; our IBM ‘i’ Zend support ran out a long time ago, and the cost of reinstating it isn’t worth it in our case. Probably one of the biggest gains is that I do not have to expose the IBM ‘i’ to the internet if I want to provide a web service which accesses the IBM ‘i’ for data! If you already have a Windows or Linux web server which is controlled by a specialist group, they can integrate the i5 functionality very easily.

I am sure there are a lot more benefits and drawbacks we will encounter as we move forward and we do have a long way to go before we can be certain this is the right option for us, but the initial responses are favorable.

As we find out more information we will post it, if you are interested in looking at how you can build a similar solution let us know and we will be happy to engage.

Call us if you have any questions or want to know more about what we have done so far.

Chris…

Dec 17

New release of Crc Builder in progress, should be faster!

When we started to build the Crc Builder we used technology we had built into the RAP Auditing product as the base. While the Auditing functions work, we felt the speed needed to be improved, so we set about looking for better ways to manage the process.

A forum post from Chris Edmonds got us looking at exactly what we were doing and why. He was looking to implement a file checker using the ZLIB source and wondered if the process could be used against two files on different systems. He had already built an RPG program which would check a single defined file and was looking for clarification of the results. We then developed the first program for Crc Builder to see what we could offer using the IBM QC3CALHA API.

The initial results did not look promising, because calling the API for every record really crawled along. We then looked at streaming the data into a memory block and passing that to the API. While this did improve the situation, it did not run as fast as his implementation, so we decided to look at the Adler-32 CRC.

We took the same approach of reading the file a byte at a time into a buffer and passing it to the function supplied by ZLIB once the buffer was full. The results were certainly much faster than the IBM API, but not as fast as Chris was seeing. So we had to look at how we read the file. Using the record functions seemed to be the best way to read the data, but experience had shown us that using a record-level read and passing each record into the IBM API really sucked! We saw a maximum throughput of 471,000 records, against 30 seconds for 1.2 million records using the blocked memory.
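For readers who have not looked inside ZLIB, part of why Adler-32 is so much cheaper per byte than the IBM hash APIs is that the whole algorithm is two running sums. Here is a minimal sketch in C of the computation zlib's `adler32()` performs (the real zlib code adds deferred-modulo optimizations; the function name here is our own, not zlib's):

```c
#include <stdint.h>
#include <stddef.h>

#define ADLER_MOD 65521U  /* largest prime below 2^16 */

/* Update a running Adler-32 value with one buffer of data.
   Seed with adler = 1 to start a new checksum; to continue,
   pass the value returned by the previous call. */
uint32_t adler32_update(uint32_t adler, const unsigned char *buf, size_t len)
{
    uint32_t a = adler & 0xFFFFU;         /* low word: running byte sum */
    uint32_t b = (adler >> 16) & 0xFFFFU; /* high word: sum of the sums */
    size_t i;

    for (i = 0; i < len; i++) {
        a = (a + buf[i]) % ADLER_MOD;
        b = (b + a) % ADLER_MOD;
    }
    return (b << 16) | a;
}
```

Seeding with 1 and feeding the nine bytes of "Wikipedia" gives 0x11E60398, the commonly quoted test value. Hashing the same bytes in two chunks, carrying the returned value between calls, gives the identical result, which is what makes the blocked reads described above safe to use.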

We played with programs that simply read the records to see if the C programs were the problem, and I have to confess that RPG’s file processing is far superior to the C implementation. But if you look at where IBM spends its compiler money, it’s not surprising. Come on, IBM, get the C functions to perform as fast as the RPG DB functions!

We also had to implement a context token for the IBM APIs to allow us to make many calls to the API; our original process simply created a HASH list and generated a HASH over it for a total HASH value. We think this has improved the CRC strength, as the context token allows multiple calls to the CRC generation API, using internal registers to hold the intermediate value between calls.

We also ran a lot of tests to find the best way to use the file read functions and the calls to the APIs. We tried using blocking and setting the type to record for the stream functions, and we also experimented with using the internal buffers from the file read instead of copying the data to our own buffers, but that seemed to be a total failure. We didn’t get much more out of the process, but if this is going to be used over very large files, a few seconds on our systems could end up as hours on yours.
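To make the blocked-read approach concrete, here is the general shape of the loop, sketched in C against a stream file rather than the actual record-level interfaces we tested. The 64KB block size, the `demo_adler32` stand-in and the callback type are illustrative assumptions, not our production values; the point is that the intermediate value is carried between calls, which is the same role the context token plays for the IBM Qc3 APIs.

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define BLOCK_SIZE (64 * 1024)  /* illustrative block size, not a tuned value */

/* Any incremental checksum fits this shape: take the running state,
   fold in one buffer of bytes, return the new state. */
typedef uint32_t (*checksum_update)(uint32_t state,
                                    const unsigned char *buf, size_t len);

/* Illustrative stand-in for the real update routine (here, Adler-32). */
uint32_t demo_adler32(uint32_t state, const unsigned char *buf, size_t len)
{
    uint32_t a = state & 0xFFFFU, b = (state >> 16) & 0xFFFFU;
    size_t i;
    for (i = 0; i < len; i++) {
        a = (a + buf[i]) % 65521U;
        b = (b + a) % 65521U;
    }
    return (b << 16) | a;
}

/* Read the whole file in large blocks, feeding each block to the update
   function and carrying the intermediate value between calls.
   Returns 0 on success, -1 on any I/O failure. */
int checksum_file(const char *path, checksum_update update,
                  uint32_t seed, uint32_t *out)
{
    FILE *fp = fopen(path, "rb");
    unsigned char *buf;
    uint32_t state = seed;
    size_t n;
    int err;

    if (fp == NULL)
        return -1;
    buf = malloc(BLOCK_SIZE);
    if (buf == NULL) {
        fclose(fp);
        return -1;
    }
    while ((n = fread(buf, 1, BLOCK_SIZE, fp)) > 0)
        state = update(state, buf, n);  /* one call per block, not per record */
    err = ferror(fp);
    free(buf);
    fclose(fp);
    if (err)
        return -1;
    *out = state;
    return 0;
}
```

Swapping the update callback is also how the two routes mentioned below can share one read loop: the blocked path stays the same whether the state is an Adler-32 register pair or an opaque context token handed to a hash API.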

In the end we had to take two separate routes: for the IBM APIs we will stick with blocked memory, but for the adler32 function we have the option of reading the data a record at a time or sticking with the blocking. Our preference for simplicity would be to go with the blocking, but the benefits of record-level checking seem to outweigh simplicity!

If you are in need of a simple CRC for data checking, adler32 certainly performs the best, but reading through the notes it does have some problems. The IBM HASH process definitely gives better CRC strength, but it comes at a price!

We should have a new version available for download later this week.

Chris…

Jun 20

Source replication without Journalling

As part of testing the next release of RAP we came across a very nice feature which will help us manage our source code across systems. We have always seen the journalling of PF-SRC files as a problem, especially with any replication solution such as MiMiX, iTera or Orion. The problem is the way the journalling of the files is managed: the file is essentially cleared and recreated every time the editor is finished with it. If the file is not journalled, this can be accomplished in one of three ways: row by row, block copy or cloning. However, as soon as you journal the file you are automatically forced into a row-by-row rebuild (thanks to Larry Youngren for providing the information on this). This adds lots of overhead on the source system, and it is compounded because, while the editor has an idea of what it needs to do and can pre-empt memory allocation etc., the journal replication products have no such information, so they just have to plod through the records one at a time! We had been working with a customer on a problem which also seems to cause all file members to be replicated due to hidden entries in the journal, but we need to get IBM involved to sort that one out!

This is the main reason the HA products are never used to replicate source files in this manner. They will adopt member-by-member replication or simply do a file-level copy; the methods of capturing this are not pretty either, even where they do actually provide the capability.

We have come up with an alternative solution which allows the actual changes to be copied to the target system at the row level. One simple command can bring the entire source file up to date very effectively and simply. We will add more functionality to the solution so you can schedule a request through your normal job scheduler that takes a library name and goes out and syncs up the file rows which are different.

If you need a method of keeping your source files in sync across systems give us a call and we can discuss how it can be achieved.

Oct 06

RPG information not forthcoming

I am sorry to say that Terry seems to have stopped his RPG course! I have not heard from Terry for some time, so I have to believe that he is no longer willing to post RPG materials to this site. I don’t have the RPG skills, or the time to pick them up, that would allow me to continue where he left off. So if you have the skills and would like to share them with the readers of this blog, I would be happy for you to take up the challenge. If you have a blog dedicated to providing those same skills, I would be more than happy to add you to the blogroll, so send me your details. I will of course check the link for accuracy and content before posting it.

Chris…