Friday, December 01, 2017

Revisiting Continuous Integration with PowerBuilder 2017, Bonobo Git and Jenkins

In a previous blog post I ran through an example of how to use the new command line compiler feature of PowerBuilder 2017 to perform continuous integration using Jenkins and a Git repository.  One of the "next steps" I referenced in the close of that post was the creation of a plug-in for Jenkins that would make creating the build steps for the project much simpler.

Well, I've created such a plugin.  Or more specifically, what I did was take the existing MSBuild plugin and customize it to create a number of plugins that do PowerBuilder compiles instead.

In this post I'm going to look at how we can use those plugins, plus some additional changes I've made to the project in Jenkins to make better use of the capabilities of Jenkins and other Jenkins plugins.

Download and install the OrcaScript and PBCompile plugins


The first thing you'll need to do is grab a copy of the OrcaScript and PBCompile plugins.  What you'll need are the orcascript.hpi and pbcompile.hpi files.

Go into Jenkins and select the "Manage Jenkins" option, and then the "Manage Plugins" link on that page.


On the Plugin Manager page select "Advanced" and then look for the "Upload Plugin" section. Click the "Browse" button there and navigate to one of the HPI files you downloaded.



Do it again for the second HPI file.  You'll need to restart the Jenkins server after installing the plugins before you can use them.
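
As an aside, if you prefer scripting to the UI, the Jenkins CLI can upload plugin files as well.  This is just a hedged sketch; it assumes the jenkins-cli.jar downloaded from your own server and a server address of http://localhost:8080, so adjust both for your environment:

    java -jar jenkins-cli.jar -s http://localhost:8080/ install-plugin orcascript.hpi
    java -jar jenkins-cli.jar -s http://localhost:8080/ install-plugin pbcompile.hpi
    java -jar jenkins-cli.jar -s http://localhost:8080/ safe-restart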

Configure the plugin installations


After the server has been restarted go back into "Manage Jenkins" and then "Global Tool Configuration".


Scroll down that page and you'll see a new "Orcascript" section.  Select the "Add OrcaScript" button.


In the form that appears, provide the installation with a name and the path to the OrcaScript executable (orcascript170.exe).




Scroll down the page a bit more and you will see a new "PBCompile" section.  Select the "Add PBCompile" button as before.  In the form that appears, provide the installation with a name and the path to the PowerBuilder command line compiler (pbc170.exe).



Create OrcaScript and PBCompile build steps


Go back into your PowerBuilder project and select the "Configure" link.




Delete all the previous build steps that called batch files.  Now add a new one and notice a new option entitled "Create PowerBuilder PBLs from source code".  Select that one.


In the form that appears, select the OrcaScript installation you just created and provide the relative location of the PowerBuilder project file.  If you have PBDs in the library path that you want OrcaScript to ignore, provide them in the Exclude Liblist option.  The open source PFC contains a reference to the PBDOM PBD, so we're excluding that here.
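
Under the hood this step drives OrcaScript the same way the old batch file did.  As a hedged sketch only (the file names are illustrative, and the scc command set should be verified against the OrcaScript documentation before use), a batch file doing the equivalent might look roughly like this:

    rem Generate an OrcaScript bootstrap script and run it.
    rem All names/paths here are assumptions from my setup - adjust for yours.
    echo start session > build_pbls.orca
    echo scc set connect property logfile "orca_build.log" >> build_pbls.orca
    echo scc connect offline >> build_pbls.orca
    echo scc set target "pfc_ci.pbt" "importonly" >> build_pbls.orca
    echo scc exclude liblist "pbdom170.pbd" >> build_pbls.orca
    echo scc refresh target full >> build_pbls.orca
    echo scc close >> build_pbls.orca
    echo end session >> build_pbls.orca
    rem Assumes the OrcaScript executable is on the PATH.
    orcascript170.exe build_pbls.orca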


Add another build step and this time select the "Build a PowerBuilder target using PBC" option.


In the form that appears, select the PBCompile installation that you just created.



You'll also need to add the command line arguments to pass to the PowerBuilder AutoCompile utility.  You can get those from the first page of the project object in your PowerBuilder target.
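
As a hedged illustration only - the switches shown here (/d for the target, /o for the output executable, /f for a full rebuild) are assumptions from my notes, so verify them against the PBC documentation and copy the actual values from your project painter - a full command line might look something like:

    rem Illustrative only; switch names and values are assumptions.
    pbc170.exe /d "pfc_ci.pbt" /o "pfc_ci.exe" /f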



These plugins provide the same capability as the first two batch files from the previous blog article, namely running OrcaScript to create PBLs from source control and then running the AutoCompile utility (PBC170.exe) to create the executables and PBDs from the PBLs.

Archive the Project Artifacts


What we're going to do now is replace the third batch file, the one that copies the EXE and PBDs into the build folder.  Jenkins has some built-in capabilities that make this simpler, which I didn't demonstrate in the previous article.

The first thing we're going to do is to tell Jenkins to archive off the EXE and PBDs (Jenkins calls them "artifacts").  To do this, create a Post Build step and select "Archive the Artifacts".


In the new step that is added, enter the following for the "Files to Archive" field.

       **/*.exe, **/*.pbd

The syntax is that of an Apache Ant include fileset.  This particular setting tells Jenkins to find all of the EXE and PBD files in the workspace - including in any subdirectories - and copy them into the archive location.  When it does so it preserves the subdirectory structure, something we'll deal with in a later step.



Install the Copy Artifacts plugin


Now that we have the PBDs and EXE in the archive location, we want to use the Copy Artifacts plugin to copy them to the build folder.  First we'll need to install the plugin.  Go back to the Manage Jenkins -> Manage Plugins page, but this time select the "Available" tab.  Because there are a lot of plugins for Jenkins, you might want to use the Filter option in the upper right (searching for "copy artifact" works) to pare down the list.


Once again you'll need to restart the server before you can use the plugin.
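
If you took the CLI route earlier, plugins from the update center can also be installed by their short name.  A hedged sketch (I believe the ID is copyartifact, but verify it on the plugin's page):

    java -jar jenkins-cli.jar -s http://localhost:8080/ install-plugin copyartifact -restart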

Create a Copy Artifacts project


After the server has been restarted, create a new project.  We'll fire this project off after the build is complete from our original PowerBuilder project.  You might simply name the new project after the original with an " - Output" suffix, as I've done here.


The only thing we're going to do with this new project is create one build step.  Do that, and choose the new "Copy artifacts from another project" option.



  • Project to Copy From:  Select your original PowerBuilder project.
      
  • Artifacts to Copy:  You can leave this blank, as it defaults to copying all artifacts.
  • Target Directory:  The location you want the files copied to.  It can reference the Jenkins environment variables using $VARIABLENAME (as opposed to the %VARIABLENAME% syntax we used in the batch files).  For example, I'm using:

    c:\builds\$JOB_NAME\$BUILD_NUMBER

    In the previous article I was concatenating the job_name and build_number for the directory.  I like this better as I'm now putting the builds in subdirectories under the job name.
  • Flatten Directories:  The one other option you will want to change is to check the "Flatten Directories" option.  That tells the plugin to just copy over the EXE and PBD files and ignore the subdirectory structure.
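
To make that concrete: with the settings above, build number 17 of an output project named "PFC_CI - Output" would land the flattened files in a folder like the following (the job name, build number, and file names here are purely illustrative):

    c:\builds\PFC_CI - Output\17\pfc_ci.exe
    c:\builds\PFC_CI - Output\17\pfc.pbd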




Create a Post Build step to fire the Copy Artifacts project


Go back into the original PowerBuilder project.  We need to add another Post Build step. This time select "Build other projects".


And in the new post build step specify your "output" project as the project to run after the PowerBuilder project completes.



Test the new configuration


Run a build of your PowerBuilder project just to verify that it's all working.  You'll notice some differences.  In particular, the build information now shows the artifacts that were created and the "output" project as a "Downstream Project".


The build information from the "output" project isn't particularly interesting, but we can check our build folder and see that the build files are being created and stored correctly.



Caveats


This all works great.  The only issue is that (currently) there are a number of options available in the PowerBuilder target that aren't supported by the command line arguments for the PowerBuilder AutoCompile utility.  Well, we have some other options.  One is to use PowerGen, and the other is to use the command line options of the PowerBuilder IDE itself.  Fortunately, I've created a plugin for each of those as well.

PowerGen plugin


If you'd like to use PowerGen instead, grab a copy of the PowerGen plugin.  You'll need the powergencompile.hpi file.  Install it as you did the other two plugins above.  Similarly, go to the Global Tool Configuration and create a PowergenCompile installation.



Like the other two plugins, the only thing you need to do here is give the installation a name and provide a path to the PowerGen executable (Pwrgn17.exe).

Go back into your project, delete the PBCompile step, and add a PowerGen step instead.


And in the form that appears, select the installation you just created and provide the relative location of the PowerGen project file you've created for the project.


You'll need to save or copy that PowerGen project file into the workspace.

Caveats


There are a couple of caveats with this approach as well, though, both of which I've shared with the folks who make PowerGen.

  • PowerGen doesn't output to the console.  The other utilities we've used so far do, and it's useful for following the progress of a build.  Instead, PowerGen (if the project file is configured for it) writes its output to a log file.  What I've done is add *.log to the archived artifacts, so the "Files to Archive" pattern becomes **/*.exe, **/*.pbd, **/*.log.  As a result, the log file PowerGen generates appears on the project summary page and I can click on it to display it after the build.



  • PowerGen doesn't indicate build errors by returning a non-zero exit code.  Instead, when there are errors it writes out a pgerror.log file.  However, a non-zero exit code (also often referred to as an ERRORLEVEL code) is what Jenkins looks for from such tools to determine when their task has failed.  It's certainly possible that from within the plugin I could look for the existence of a pgerror.log and flag the task as failing; you can also do the same thing yourself with a batch step, as sketched below.  However, in the long run I think it would be cleaner if PowerGen updated its behavior to return a non-zero exit code.
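
Here's a hedged sketch of that workaround as an "Execute Windows batch command" build step placed right after the PowerGen step.  It assumes pgerror.log is written to the root of the Jenkins workspace; adjust the path if your PowerGen project writes it elsewhere:

    rem Fail the build if PowerGen reported errors.
    if exist "%WORKSPACE%\pgerror.log" (
        type "%WORKSPACE%\pgerror.log"
        exit /b 1
    )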

Note that PowerGen also has the capability to generate PBLs from source code like OrcaScript does.  I haven't looked into making use of that capability as I already have OrcaScript doing that for me.

PowerBuilder IDE Plugin


The last option is to use the PowerBuilder IDE itself to do the compile using command line arguments.  To do that, grab a copy of the PowerBuilder IDE plugin.  You'll need the pbidecompile.hpi file.  Install it as you did the others, and then go into Global Tool Configuration and give it a name and the location of the PowerBuilder IDE.


Go back and replace that last build step with the new "Build a PowerBuilder target with the PowerBuilder IDE command line" option that appears.


And in the form that appears, select the installation you just created and indicate the relative locations of both the workspace and project files.


Note that the workspace file is a new requirement, and something that isn't part of the source pulled over from the source control server.  You'll have to copy or create a workspace file in the Jenkins workspace in order to use the PowerBuilder IDE command line approach.
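
One way to handle that is an "Execute Windows batch command" build step, placed before the IDE step, that copies a PBW file kept somewhere outside of source control.  A minimal sketch, with both paths purely illustrative:

    rem Copy the workspace file into the Jenkins workspace; fail if it's missing.
    copy /Y "c:\build_support\pfc_ci.pbw" "%WORKSPACE%\" || exit /b 1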

Caveats


  • You have to purchase an additional license of the PowerBuilder IDE and install it on the build machine for this option.
  • Like PowerGen, the PowerBuilder IDE doesn't output to the console.  So, like PowerGen, we write the output from the IDE to a log file.
  • You need to copy or create a PBW file for the workspace, as the PBW file isn't one of the source controlled objects.
  • It appears that problems generating a build don't cause the IDE to exit with an error code.  In fact, they seem to prevent the IDE from exiting entirely: the build appears to hang as the IDE waits for user interaction.







Monday, November 06, 2017

ISUG-TECH Board of Directors Nominations Open

The nomination form for the 2018-2019 elections is now available through November 13th at:


You can also access it through the Member's Blog article or by going to Polls and Surveys from the home page. I will also post an article on the Facebook page.

These are the positions available for the 2018-2019 term:

   President
   VP of Business Relations
   VP of Technologies
   Benefits Director
   Content Director
   Regional Director - AP
   Regional Director - SLA

Please verify your nominee is qualified and willing to serve. You may nominate yourself. Nominations for President carry an additional requirement: the nominee must be a member of the executive committee in order to run.

ISUG-TECH Annual General Meeting

ISUG-TECH will be holding its annual member meeting on November 8, 2017 at 12:00pm (noon) EST.  This year the meeting will be conducted via webcast. To register, please visit:


You must register to attend the meeting.  All members are invited to attend, although Associate Members will not be permitted to vote on any pending action required by the membership.

We hope you will participate in this important aspect of your ISUG membership.  If you are unable to attend, minutes of the meeting will be available afterwards, as well as (we hope) a recording of the meeting.

Saturday, September 30, 2017

Elevate 2017


Summary

Given that this was the first conference that Appeon had hosted, I thought it went extremely well.  There were a few areas for improvement, which I'll address at the end of this article.  Attendance was good and diverse; it seemed like there were a significant number of people attending from outside of the United States.  (Appeon later indicated that 24% of the attendees were from outside North America: 4% from Asia and 10% each from Europe and Latin America.)  There was a lot of energy and excitement on the attendees' part, and the sessions overall appeared to be high quality and well attended.  The facilities were great, if a bit small, and the services provided by Appeon to facilitate travel between the downtown hotels and the conference location were a great touch.  If you didn't attend this year I would highly recommend it next year, particularly given that we should have some exciting new 2018 features to see.

Facility

The conference was held Sept 25th through 27th at the Harris Conference Center in Charlotte, NC.  The facility is located on the campus of Central Piedmont Community College, just a few miles from Charlotte Douglas International Airport.  Most of the attendees stayed at the Holiday Inn in downtown Charlotte or other downtown hotels.  Appeon provided shuttle buses that ran every 30 minutes between the Holiday Inn and the conference center at the beginning and end of each day of the conference.

The keynote, lunches and some sessions were held in the Full Conference Hall.  Other sessions were run simultaneously in the Ash, Birch, Cypress and Maple conference rooms.


Registration

I'm used to conferences that allow registration the day before the conference begins.  Such was not the case here.  Instead, registration opened at 8:00 on the first day of the conference and the keynote didn't begin until 9:30.  That said, registration went smoothly.

Breakfasts

Breakfast was offered during registration and each morning of the next two days.  It was the standard carbs-and-fruit offering typical for such events.  It doesn't quite fit my diet so I made other arrangements, but most other attendees seemed content.


Keynote




Armeen Mazda started the opening keynote.  He discussed "what we have accomplished":


  • PowerBuilder 2017 delivered, which offers
    • Core features
    • Easy migration
    • More stable
  • AppeonU
    • Free training
    • Can be completed in 1 week
    • Based on the old fast track to PowerBuilder Course
  • Community delivered
    • Appeon MVP
    • Real tech Q&A
    • How to videos and articles
    • CodeXchange
  • Elevate 2017
    • 60 hours of tech sessions
    • Special product previews for
      • PowerBuilder 2017 R2
      • PowerBuilder 2018
    • Impressive attendees

Georg Brodbeck spoke next with a customer success story.



  • Company has 80 employees, 45 of which are PB developers
  • Their main product is an ERP system
  • Bigger teams require different development processes
    • Continuous integration
  • What they need to see from future versions of PowerBuilder
    • Address the old-fashioned IDE and the lack of new features
    • Support for new operating systems

Chris Pollach spoke next.

  • Release plan and schedule      
    • Revision approach
      • R2 12/31
      • R3 6/2018
        • RESTful app
        • Source control integration with TFS
      • 2018 12/2018
        • C# development
        • C# web API
        • 64-bit application enhancements, UI modernization
  • Support mechanism
    • Community
    • Standard
    • Paid
  • EBF vs MR vs Revisions
    • MRs once a quarter
      • MR1 09/2017
    • LTS support
      • Supported for 3 to 5 years
      • 1 year notice before dropping support
      • 2017 R2 will be first LTS version
  • R2 highlights
    • TLS 1.2
    • Consumption of REST/JSON
    • Git and SVN
    • No longer need separate directory for each PBL
    • Native PDF enhancements
      • PDF/A-1 and PDF/A-3
      • Improved font and graphics rendering
      • More page sizes
    • PostgreSQL support
    • Improved standalone compiler
      • More options from project object
  • PB 2018  
    • REST web APIs and NVO.NET (uses Roslyn within the PowerBuilder IDE)


Chris then demoed some of the C# development capabilities they were planning for 2018.




Mike Croyle of CBIZ spoke next with another customer success story.

  • Mid 90s Client/Server
  • 2003 PB/EAServer
  • 2009 PB WebForms
  • 2012 Appeon Web



Armeen then spoke again on Appeon 2018 and beyond.

  • .NET stack
  • Deploy anywhere
  • N-tier (cloud)
  • Leverage existing investments
  • 2018-2019 focus
    • .NET desktop cloud apps
    • R3      
      • JSON DataWindow
      • TFS integration
    • 2018
      • C# server projects
      • PB native IDE enhancements
    • 2019
      • Desktop cloud target
      • UI framework
  • Cloud app benefits
    • Simplified deployment
    • Code reuse and extensible (more code to server)
    • Integration ready (web APIs)
    • .NET interoperability
      • BUT your app has to be fully partitioned
  • Rapidly evolving existing apps
    • PowerServer Web add-on
    • ~80% automated conversion
  • Hybrid n-tier
    • JSON DW
    • C# web APIs
  • Optimize UI
    • Optimize existing
    • Rebuild UI layer

Filiberto Sosa of Sizes and Colors then spoke with another customer success story.

  • 21 Years ago POS system
  • Grew to full ERP system
  • Now handles 30% of Mexico's shoe sales
  • Web and mobile clients
  • 6 Appeon Web servers handling 5000 concurrent users
  • 4 PB developers


Keynote Takeaways

  • Release schedule - I think most attendees were quite pleased to see an even higher-paced release schedule (multiple minor releases per year with incremental new features).  It means new features end up in our hands sooner rather than waiting for the next major release.
  • C# development - Here I think most of the attendees left somewhat undecided about this particular feature.  It appears the enhancement has several purposes:
    • Allow developers who are not familiar with PowerBuilder but know C# to use PowerBuilder
    • Assist in the migration of PowerBuilder apps to n-tier
  • Consumption of REST/JSON - I think many attendees were pleased with this feature.  PowerBuilder has fallen far behind the curve in web services support.  Personally, I'd like to see improvements in SOAP as well, but REST seems to be where the most growth and demand is in web services support.
  • Git and SVN - I know some customers are excited to see this coming, particularly me.  My company moved to SVN some time ago.  However, the vendor that provided the bridge product that we use to get PowerBuilder to talk to SVN apparently ceased development about 5 years ago.  In addition, the open source PowerBuilder projects I support have moved to GitHub, making Git integration important.  I'm currently using another bridge product, but it has a number of limitations.
  • PostgreSQL support - Another enhancement I'm particularly interested in.  The developer edition of SQL Anywhere is no longer provided with the product.  Support for PostgreSQL provides a low (no) cost option for initial development work.  In addition, unlike SQL Anywhere, support for PostgreSQL allows for low (no) cost production deployment.
  • Improved standalone compiler - Yet another one I'm particularly interested in.  I normally compile my applications as a small executable (basically just the application object) from the first PBL and PBDs for the remaining PBLs.  That method of compilation isn't supported by the current standalone compiler.
  • New UI framework - Many of the attendees were excited about this enhancement, but a few expressed some concern about the timing of its introduction.  That is, some attendees were more interested in seeing this feature introduced sooner rather than some of the other proposed features.  Others thought that existing methods of improving the UI of PowerBuilder applications were adequate for the immediate future and were willing to wait for that and have the other features first.

Conference App

During the keynote they announced the availability of a mobile application (based of course on PowerBuilder/Appeon) for accessing the conference schedule.



The app showed the time and duration of the sessions, where they were located, and gave a synopsis of them.  There were also printed session lists posted throughout the conference area.  The rooms weren't very far apart, and there was an electronic notice outside each room with the session name, so it was relatively easy to find them.

Lunch

The lunches had a local flair to them, as a result of which I ate more macaroni and cheese in the three days of the conference than I have in the last three years!


Sessions

I had three sessions I presented on Monday afternoon:

  • Migrating to PowerBuilder 2017
  • Migrating to 64-bit
  • Calling advanced web services from PowerBuilder using a COM Callable Wrapper (CCW) client

Quite a number of the sessions became standing room only, such as this presentation by Ronnie Po on Tuesday:


Breaks

Snacks, soft drinks, and water were provided mid-morning and mid-afternoon.

Special Event

The special event was held Monday evening at the NASCAR Hall of Fame.  It's normally only open from 10 AM to 6 PM, so we had the facility to ourselves for our private evening event.



Food (hamburgers or bratwurst with sauerkraut) was provided, as well as an open bar.


The main attractions were simulators for pit crews and races.  Many of the attendees seemed to be having a great deal of fun squaring off against their fellow attendees.





Tuesday

I had four sessions I was presenting on Tuesday:

  • Preparing applications for the Windows App Store
  • Accessing any data from PowerBuilder through OData
  • Using .NET visual controls from PowerBuilder
  • Continuous Integration with PowerBuilder

Because of my own schedule I wasn't able to attend any other sessions.

Wednesday

I had two sessions I was presenting on Wednesday:

  • Using .NET nonvisual assemblies from PowerBuilder
  • Calling advanced web services from PowerBuilder using a proxy web service

That allowed me to attend the only session by another presenter that I was able to get to during the entire conference.  On Wednesday afternoon, John Hnat of Foundation Software did an "ISV Discussion on PowerBuilder's Roadmap".



His discussion points were:

  • Are you staying client/server or moving to the web?
  • Are you doing any new development?
  • Where are you finding / training new developers?
  • What features do you want to see in PowerBuilder?
  • Are you satisfied with the proposed direction of PB?
  • Open mic

Thoughts for next year

  • Location
    • A different location.  With the NCPBUG meetings in Charlotte in 2014 and 2015 and Elevate in Charlotte in 2017, we've seen a lot of Charlotte.  I'm ready for a change, as long as it's not Las Vegas.
  • Sessions
    • Do some pre-planning to ensure that sessions aren't overcrowded.  I'd suggest allowing for session selection during online registration so the organizers can get some feel for the size of room needed for each session and which sessions should perhaps be repeated.
    • Schedule specific break times and provide coffee as well as the soda and snacks during the breaks.
    • Build in a bit of overlap between sessions in the same room (e.g., 15 minutes) so that the incoming presenter has time to set up.
    • Modify the conference app for next year so that it includes the name of the speaker for each session.
    • If printed session lists are going to be posted, make sure they include the room number.
  • Workshops
    • For hands-on workshops, consider charging a small fee and using that money to offset the cost of doing the workshop via Amazon Workspaces or something similar.  The problem with hands-on workshops where virtual machines are provided on USB sticks is that you can end up spending most of the workshop trying to get the environment set up correctly.  With Amazon Workspaces the instructor can get the student machines all configured in advance.  The students then use remote desktop to access the machines during the class and off hours.  I participated in a hands-on workshop like that back in the SAP TechEd days, and it allowed us to get pretty much straight to the training with only a small amount of time needed for setting up the environment.