Jul 22

Receiver Apply Program & JobQGenie receive IBM Power Systems Software approval

Both JobQGenie and the Receiver Apply Program have been awarded IBM Power Systems Software approval by IBM. This allows us to add the IBM Power Systems Software logo to any literature we produce for the products.

[Image: IBM Power Systems Software logo]
While it may seem a small benefit, it does show how committed we are to the IBM Power Systems brand.

Chris…

Jul 21

JobQGenie continues to improve as customer satisfaction grows.

Recovery of jobs on a job queue is something all of the HA products lack the ability to do. They can replicate the attributes of a job queue but not its content. In our view this is a major crack in anyone's recovery process, because without it you have no idea what state your application database is going to be in. A number of customers now see this requirement as key to recovery and have been using JobQGenie as their application of choice. Even if you don't have an HA solution, being able to reload jobs that failed, with the same attributes as when they first ran and without lots of re-keying, is a handy option.

We have now brought JobQGenie to a stage in its development where any problems we get are becoming easier to understand and fix. A case in point was a customer who reported lots of jobs showing in the JobQGenie logs as being on the job queue when in fact they had run and ended. This was confusing for the customer, and they asked us to investigate the problem with them.

Our first thought was that this could be a timing issue. Reviews of the logs showed some problems during the IPL, where JobQGenie jobs were submitted but did not actually start collecting data until a full three minutes after the log said they had been submitted. During this time the other subsystems had been started and all of the jobs had run before JobQGenie had time to collect any information.

The puzzling part was why the end messages had not been processed, as these are not time dependent. So we took a closer look and could see that JobQGenie had seen the messages but simply skipped past them. We then started to look more closely at why this could occur. The code seemed to show that any entry would be processed, even if only to set the job state to ended.

After a couple of hours we found out what the issue was. When we capture the data from the exit points, we use the internal job identifier to tag each and every record in the data files. Because the system had been IPL'd with the jobs still sitting on the job queue waiting to run, the internal job identifiers for those jobs changed during the IPL. So when we looked for the records in the files to update, they would not be found, because the internal job identifiers had changed!

We had to create a program which resets all of the affected internal job identifiers within the data queues and files, so that when the jobs run we have the new internal job identifiers ready to be linked.

The process takes a few seconds to run and should be run before starting any job activity after an IPL. You can safely start JobQGenie before the process is run, but allowing the jobs in the monitored job queues to run first will result in missed data.
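
For anyone wanting to experiment, here is a minimal sketch of the kind of lookup such a reset program has to perform; it is our illustration, not the JobQGenie code. The standard QUSRJOBI API, given a qualified job name, returns the job's current internal job identifier, which can then be written back over the stale value. The helper name is hypothetical.

#include <string.h>
#include <qusrjobi.h>               /* QUSRJOBI(), Qwc_JOBI0100_t (QSYSINC) */

/* Hypothetical helper: fetch the current internal job identifier for a
   qualified job name (10-char job + 10-char user + 6-char number).     */
static void get_internal_job_id(char qual_job[26], char new_id[16]) {
   Qwc_JOBI0100_t info;

   /* The internal-job-ID parameter must be blank when looking the job
      up by qualified name rather than by identifier.                   */
   QUSRJOBI(&info, sizeof(info), "JOBI0100", qual_job,
            "                ");
   memcpy(new_id, info.Internal_Job_ID, 16);
}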

The new programs will be available once we have done more testing.

Other enhancements have also been added to this fix, such as the ability to filter by job queue when reviewing the job list, reduced job-data search overhead through tighter search criteria, and the collection of additional data.

Chris…

Jul 17

QDBRPLAY API Rename Exit Program

In a previous post we published code showing how to code up a call to the QDBRPLAY API. That code did not use the capability to call an exit program in which you can rename the object to be addressed when calling the API. While the manual does provide information about the call to the exit program, it does not provide any information about how you should change the information passed.

The program is passed a structure as the first parameter, plus a second pointer to data which your calling program can pass through the API call to the exit program (if that sounds confusing: basically you can pass a character pointer to the API which holds data you want the exit program to see). The first parameter holds information about the object as described in the journal entry; this is the data you have to change to address a new object. You can change the object name, the object library, or both. The one mistake we made was to expect the object lengths to be fixed at 10, the size of standard IBM i names, when in fact they hold the actual lengths of the names! We simply changed the data and did not set the length variables to the correct values!

We were testing the API using libraries RAPDTA5 and RAPDTA5_T1. In the initial testing we found the API was failing to update the correct object; it kept saying the object in RAPDTA5 was not available! We had changed the structure to reflect the new library RAPDTA5_T1 but failed to update the length of the name in the structure. Our assumption was that it would be 10, as this is the length of a library name, right? WRONG! IBM passes in the actual length of the library name, which in this case was 7. When the information was read back in the API it would only read the first 7 characters: 'RAPDTA5'!!! If we had changed the name to RAPDTA6 all would have been fine, but that would only have masked the problem, because as soon as the length of the library name we wanted differed from the original name we would have started to see API errors again.
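
To see the failure mode in isolation, here is a tiny standalone snippet (ours, not from the exit program) showing how a stale length turns RAPDTA5_T1 back into RAPDTA5 when the name is read back:

#include <stdio.h>
#include <string.h>

int main(void) {
   char lib[10];
   int  len = 7;                    /* stale length left over from RAPDTA5 */

   memcpy(lib, "RAPDTA5_T1", 10);   /* new library name written in full    */
   printf("%.*s\n", len, lib);      /* prints "RAPDTA5" - truncated read   */
   return 0;
}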

So the lesson here is: don't take anything for granted! The manuals are pretty poor in terms of giving you the information required to call the API and explaining how it will format the data before passing it on to your program.

Here is the code we used in the rename program which worked. We pass in, via the scratchpad, the library name we want to use.

#include <string.h>
#include <qdbrplay.h>   /* Qdb_Rename_Exit_Parm_t (QSYSINC member for QDBRPLAY) */

int main(int argc, char **argv) {
   Qdb_Rename_Exit_Parm_t *Parms;
   int i;
   /* Parm 1 is the rename structure; parm 2 is the scratchpad holding
      the blank-padded library name we want to substitute.              */
   Parms = (Qdb_Rename_Exit_Parm_t *)argv[1];
   /* Find the last non-blank character of the 10-byte library name.    */
   for(i = 9; i >= 0; i--) {
      if(argv[2][i] != ' ')
         break;
      }
   memcpy(Parms->Object_Lib,argv[2],10);
   /* Set the real name length - forgetting this was our original bug.  */
   Parms->Length_Object_Lib = i+1;
   return 0;
}

Jul 06

LookSoftware and application modernization

We have been thinking of adding a new interface to the Shield Products for some time now. Originally we had hoped to add a PHP-based service which would allow the data and product menu options to be managed through an HTTP-based interface. However, after some trials and research we found the PHP approach had a couple of glaring issues, such as connection concurrency. Not that these were insurmountable, but they would make the development process a lot harder than we wanted.

Recently we had been corresponding with a long-time friend, Marinus Van Sandwyck, and he mentioned that he was getting involved in the application modernization market and would be going to market with an offering built on LookSoftware's products. He offered to use the product to build new interfaces for the Shield Products, thereby providing his team with some skills development in the technology and us with a newly interfaced product.

Things were going well until we tried to give Marinus access to our systems; unfortunately the VPN device we use would not allow a connection to his system, as the ping times seemed to stop the VPN from responding. We are going to work on getting the server to allow longer connection timeouts, but in the meantime Marinus kindly offered to introduce us to the company to see if we had any synergy and could work together.

The initial communications with the company have been very good; they were very responsive to our questions and have provided us with links to discover just what their software is capable of achieving. I was very surprised at just what the product can achieve; my initial impression of it as just a front-end screen scraper could not have been further from reality. It has many more capabilities, such as allowing additional data sources to be integrated with the 5250 data stream's data, giving a single view over all of the data. The fact that you can create a web service from the 5250 data stream without any additional IBM i code is really cool! I watched a demo of just how easy it was to take a 5250 application and create the transaction data required to carry this out as a single event. No more signing on and going through 10 screens to get to the data you need: just a single click on a web page button which calls the service, and you get the results back on the same web page…

I am very excited about what this can do for us internally, let alone what it offers the IBM i marketplace. Having tried a number of solutions to date which offered some level of beautification of the interface to our products, I feel this offers us the ability to do that far more easily, plus the opportunity to add a lot more functionality should we wish.

We will certainly be banging the drum about this product! If you are having challenges with your management about the IBM i, this product should be able to help you win a few of those arguments! I can't wait to get started on a couple of test projects just to see what it can really do…

Chris…