Friday, December 01, 2017

Revisiting Continuous Integration with PowerBuilder 2017, Bonobo Git and Jenkins

In a previous blog post I ran through an example of how to use the new command line compiler feature of PowerBuilder 2017 to perform continuous integration using Jenkins and a Git repository.  One of the "next steps" I referenced in the close of that post was the creation of a plug-in for Jenkins that would make creating the build steps for the project much simpler.

Well, I've created such a plugin.  More specifically, I took the existing MSBuild plugin and customized it to do PowerBuilder compiles instead.

In this post I'm going to look at how we can use that plugin, plus some additional changes I've made to the project in Jenkins to make better use of the capabilities of Jenkins and other Jenkins plugins.

Download and install the PBCompile plugin


The first thing you'll need to do is grab a copy of the pbcompile plugin.  What you'll need is the PBCOMPILE.HPI file.

Go into Jenkins and select the "Manage Jenkins" option, and then the "Manage Plugins" link on that page.


On the Plugin Manager page select "Advanced" and then look for the "Upload Plugin" section. Click the "Browse" button there and navigate to the HPI file you downloaded.



You'll need to restart the Jenkins server after installing the plugin before you can use it.

Configure the PBCompile plugin installation


After the server has been restarted go back into "Manage Jenkins" and then "Global Tool Configuration".


Scroll down that page and you'll see a new "PBCompile" section.  Select the "PBCompile installations..." button.


The only two items you'll need to provide are a name for the installation and the directory where the orcascript170 and pbc170 executables are located.
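
For a default PowerBuilder 2017 install, that directory is:

 C:\Program Files (x86)\Appeon\PowerBuilder 17.0\AutoCompiler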



Create a new PBCompile build step


Go back into your PowerBuilder project and select the "Configure" link.




Delete all the previous build steps.  Now add a new one and notice that a new "Build a PowerBuilder target using PBC" option has been added.  Select that one.


You'll now see a build step that is custom to PowerBuilder 2017.


  • PBCompile Version:  Select the name you defined in the "PBCompile installations" section above.  It should be the only available entry.
  • PBT File:  Provide the PBT file for your project, including any relative directory path.
  • Exclude Liblist:  Optional.  It only needs to be populated if there are PBD files in the library list.  It tells OrcaScript to take those as-is rather than attempting to generate them.
  • PBC Command Line Arguments:  The arguments you pulled from the project painter in PowerBuilder that we previously put in the batch file (see the sketch below).
  • Pass build variables as properties:  This checkbox currently isn't used.  MSBuild has this feature, and the option was left in just in case a later version of the PowerBuilder AutoCompile utility adds similar capability.

The plugin provides the same capability as the first two batch files from the previous blog article, namely running OrcaScript to create PBLs from source control and then running the AutoCompile utility (PBC170.exe) to create the executables and PBDs from the PBLs.
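
Under the hood, that means the build step ends up running something roughly equivalent to the two commands below (a sketch based on the batch files from the previous article; the target, output name and arguments shown are illustrative):

 orcascr170 createpbls.orca
 pbc170 /d "pfc_ci\pfc_ci.pbt" /o "pfc_ci\pfc_ci.exe" /w n /m n /x 32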

Archive the Project Artifacts


What we're going to do now is replace the third batch file, the one that copies the EXE and PBDs into the build folder.  Jenkins has some built-in capabilities, which I didn't demonstrate in the previous article, that make this simpler.

The first thing we're going to do is tell Jenkins to archive off the EXE and PBDs (Jenkins calls them "artifacts").  To do this, create a Post Build step and select "Archive the Artifacts".


In the new step that is added, enter the following for "Files to Archive":

       **/*.exe, **/*.pbd

The syntax is that of an Apache Ant include fileset.  This particular setting tells Jenkins to find all of the EXE and PBD files in the workspace - including in any subdirectories - and copy them into the archive location.  When it does so it preserves the subdirectory structure, something we'll deal with in the next step.
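
For example, if the build output lives in a pfc_ci subdirectory of the workspace, the archive would end up looking something like this (a hypothetical layout):

 pfc_ci\pfc_ci.exe
 pfc_ci\pfcapsrv.pbd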



Install the Copy Artifacts plugin


Now that we have the PBDs and EXE in the archive location, we want to use the Copy Artifacts plugin to copy them to the build folder.  First we'll need to install the plugin.  Go back to the Manage Jenkins -> Manage Plugins page, but this time select the "Available" tab.  Because there are a lot of plugins for Jenkins, you might want to use the Filter option in the upper right to pare down the list.


Once again you'll need to restart the server before you can use the plugin.

Create a Copy Artifacts project


After the server has been restarted, create a new project.  We'll fire this project off after the build is complete from our original PowerBuilder project.  You might simply name the new project after the original with an " - Output" suffix, as I've done here.


The only thing we're going to do with this new project is create one build step.  Do that, and choose the new "Copy artifacts from another project" option.



  • Project to Copy From:  Select your original PowerBuilder project.
      
  • Artifacts to Copy:  You can leave this blank, as it defaults to copying all artifacts.
  • Target Directory:  The location you want the files copied to.  It can reference the Jenkins environment variables using $VARIABLENAME (as opposed to the %VARIABLENAME% we used in the batch files).  For example, I'm using:

    c:\builds\$JOB_NAME\$BUILD_NUMBER

    In the previous article I was concatenating the job_name and build_number for the directory.  I like this better as I'm now putting the builds in subdirectories under the job name.
  • Flatten Directories:  The one other option you'll want to change is to check "Flatten Directories".  That tells the plugin to just copy over the EXE and PBD files and ignore the subdirectory structure, as illustrated below.
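
With those settings, a hypothetical build number 17 of a job named "pfc_ci" would have its files flattened into a single folder, something like:

 c:\builds\pfc_ci\17\pfc_ci.exe
 c:\builds\pfc_ci\17\pfcapsrv.pbd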




Create a Post Build step to fire the Copy Artifacts project


Go back into the original PowerBuilder project.  We need to add another Post Build step. This time select "Build other projects".


And in the new post build step specify your "output" project as the project to run after the PowerBuilder project completes.



Test the new configuration


Run a build of your PowerBuilder project just to verify that it's all working.  You'll notice some differences.  In particular, the build information now shows the artifacts that were created and the "output" project as a "Downstream Project".


The build information from the "output" project isn't particularly interesting, but we can check our build folder and see that the build files are being created and stored correctly.


Monday, November 06, 2017

ISUG-TECH Board of Directors Nominations Open

The nomination form for the 2018-2019 elections is now available through November 13th at:


You can also access it through the Member's Blog article or by going to Polls and Surveys from the home page. I will also post an article on the Facebook page.

These are the positions available for the 2018-2019 term:

   President
   VP of Business Relations
   VP of Technologies
   Benefits Director
   Content Director
   Regional Director - AP
   Regional Director - SLA

Please verify your nominee is qualified and willing to serve.  You may nominate yourself.  Nominations for President carry an additional requirement: the nominee must be a member of the executive committee to run.

ISUG-TECH Annual General Meeting

ISUG-TECH will be holding their annual member meeting on November 8, 2017 at 12:00pm (noon) EST.  This year the meeting will be conducted via webcast. To register, please visit:


You must register to attend the meeting.  All members are invited to attend, although Associate Members will not be permitted to vote on any pending action required by the membership.

We hope you will participate in this important aspect of your ISUG membership.  If you are unable to attend, minutes of the meeting will be available afterwards, as well as (we hope) a recording of the meeting.

Saturday, September 30, 2017

Elevate 2017


Summary

Given that this was the first conference that Appeon had hosted, I thought it went extremely well.  There were a few areas for improvement, which I'll address at the end of this article.  Attendance was good and diverse; it seemed like there were a significant number of people attending from outside of the United States.  (Appeon later indicated that 24% of the attendees were from outside North America: 4% from Asia and 10% each from Europe and Latin America.)  There was a lot of energy and excitement on the attendees' part, and the sessions overall appeared to have been high quality and well attended.  The facilities were great, if a bit small, and the services provided by Appeon to facilitate travel between the downtown hotels and the conference location were a great touch.  If you didn't attend this year I would highly recommend it next year, particularly since we should have some exciting new 2018 features to see.

Facility

The conference was held Sept 25th through 27th at the Harris Conference Center in Charlotte, NC.  The facility is located on the campus of Central Piedmont Community College, just a few miles from Charlotte Douglas International Airport.  Most of the attendees stayed at the Holiday Inn in downtown Charlotte or other downtown hotels.  Appeon provided shuttle buses that ran every 30 minutes between the Holiday Inn and the conference center at the beginning and end of each day of the conference.

The keynote, lunches and some sessions were held in the Full Conference Hall.  Other sessions were run simultaneously in the Ash, Birch, Cypress and Maple conference rooms.


Registration

I'm used to conferences that allow registration the day before the conference begins.  Such was not the case here; registration opened at 8:00 the first day of the conference and the keynote didn't begin until 9:30.  Still, registration went smoothly.

Breakfasts

Breakfast was offered during registration and on each of the following two mornings.  It was the standard carbs-and-fruit offering typical of such events.  It doesn't quite fit my diet, so I made other arrangements, but most other attendees seemed content.


Keynote




Armeen Mazda started the opening keynote.  He discussed "what we have accomplished":


  • PowerBuilder 2017 delivered, which offers
    • Core features
    • Easy migration
    • More stable
  • AppeonU
    • Free training
    • Can be completed in 1 week
    • Based on the old Fast Track to PowerBuilder course
  • Community delivered
    • Appeon MVP
    • Real tech Q&A
    • How to videos and articles
    • CodeXchange
  • Elevate 2017
    • 60 hours of tech sessions
    • Special product previews for
      • PowerBuilder 2017 R2
      • PowerBuilder 2018
    • Impressive attendees

Georg Brodbeck spoke next with a customer success story.



  • Company has 80 employees, 45 of which are PB developers
  • Their main product is an ERP system
  • Bigger teams require different development processes
    • Continuous integration
  • What they need to see from future versions of PowerBuilder
    • Address old fashioned IDE and few new features
    • Support for new operating systems

Chris Pollach spoke next

  • Release plan and schedule
    • Revision approach
      • R2 - 12/31
      • R3 - 6/2018
        • RESTful app
        • Source control integration with TFS
      • 2018 - 12/2018
        • C# development
        • C# web API
        • 64-bit application enhancements, UI modernization
  • Support mechanism
    • Community
    • Standard
    • Paid
  • EBF vs MR vs Revisions
    • MRs once a quarter
      • MR1 09/2017
    • LTS support
      • Supported for 3 to 5 years
      • 1 year notice before dropping support
      • 2017 R2 will be first LTS version
  • R2 highlights
    • TLS 1.2
    • Consumption of REST/JSON
    • Git and SVN
    • No longer need separate directory for each PBL
    • Native PDF enhancements
      • PDF A1 and A3
      • Improved font and graphics rendering
      • More page sizes
    • PostgreSQL support
    • Improved standalone compiler
      • More options from project object
  • PB 2018  
    • REST web APIs and NVO.NET (uses Roslyn within the PowerBuilder IDE)


Chris then demoed some of the C# development capabilities they were planning for 2018.




Mike Croyle of CBIZ spoke next, another customer success story

  • Mid 90s Client/Server
  • 2003 PB/EAServer
  • 2009 PB WebForms
  • 2012 Appeon Web



Armeen then spoke again on Appeon 2018 and beyond

  • .NET stack
  • Deploy anywhere
  • N-tier (cloud)
  • Leverage existing investments
  • 2018-2019 focus
    • .NET desktop cloud apps
    • R3      
      • JSON DataWindow
      • TFS integration
    • 2018
      • C# server projects
      • PB native IDE enhancements
    • 2019
      • Desktop cloud target
      • UI framework
  • Cloud app benefits
    • Simplified deployment
    • Code reuse and extensibility (more code to the server)
    • Integration ready (web APIs)
    • .NET interoperability
      • BUT your app has to be fully partitioned
  • Rapidly evolving existing apps
    • PowerServer Web add-on
    • ~80% automated conversion
  • Hybrid n-tier
    • JSON DW
    • C# web APIs
  • Optimize UI
    • Optimize existing
    • Rebuild UI layer

Filiberto Sosa of Sizes and Colors then spoke with another customer success story

  • 21 years ago: POS system
  • Grew to full ERP system
  • Now handles 30% of Mexico's shoe sales
  • Web and mobile clients
  • 6 Appeon Web servers handling 5000 concurrent users
  • 4 PB developers


Keynote Takeaways

  • Release schedule - I think most attendees were quite pleased to see an even higher-paced release schedule (multiple minor releases per year with incremental new features).  It means new features end up in our hands sooner rather than waiting for the next major release.
  • C# development - Here I think most of the attendees left somewhat undecided about this particular feature.  It appears the enhancement serves several purposes:
    • Allow developers who are not familiar with PowerBuilder but know C# to use PowerBuilder
    • Assist in the migration of PowerBuilder apps to n-tier
  • Consumption of REST/JSON - I think many attendees were pleased with this feature.  PowerBuilder has fallen far behind the curve in web services support.  Personally, I'd like to see improvements in SOAP as well, but REST seems to be where the most growth and demand is in web services support.
  • Git and SVN - I know some customers are excited to see this coming, myself included.  My company moved to SVN some time ago.  However, the vendor that provided the bridge product we use to get PowerBuilder to talk to SVN apparently ceased development about 5 years ago.  In addition, the open source PowerBuilder projects I support have moved to GitHub, making Git integration important.  I'm currently using another bridge product, but it has a number of limitations.
  • PostgreSQL support - Another enhancement I'm particularly interested in.  The developer edition of SQL Anywhere is no longer provided with the product, and support for PostgreSQL provides a low- (or no-) cost option for initial development work.  In addition, unlike SQL Anywhere, PostgreSQL support allows for low- (or no-) cost production deployment.
  • Improved standalone compiler - Yet another one I'm particularly interested in.  I normally compile my applications as a small executable (basically just the application object) from the first PBL, plus PBDs for the remaining PBLs.  That method of compilation isn't supported by the current standalone compiler.
  • New UI framework - Many of the attendees were excited about this enhancement, but a few expressed some concern about the timing of its introduction.  That is, some attendees were more interested in seeing this feature introduced sooner rather than some of the other proposed features.  Others thought that existing methods of improving the UI of PowerBuilder applications were adequate for the immediate future and were willing to wait for that and have the other features first.

Conference App

During the keynote they announced the availability of a mobile application (based of course on PowerBuilder/Appeon) for accessing the conference schedule.



The app showed the time and duration of the sessions, where they were located, and gave a synopsis of them.  There were also printed session lists posted throughout the conference area.  The rooms weren't very far apart and there was an electronic notice outside each room with the session name so it was relatively easy to find them.

Lunch

The lunches had a local flair to them, as a result of which I ate more macaroni and cheese in the three days of the conference than I have in the last three years!


Sessions

I had three sessions I presented on Monday afternoon:

  • Migrating to PowerBuilder 2017
  • Migrating to 64-bit
  • Calling advanced web services from PowerBuilder using a COM Callable Wrapper (CCW) client

Quite a number of the sessions became standing room only, such as this presentation by Ronnie Po on Tuesday:


Breaks

Snacks, soft drinks, and water were provided mid-morning and mid-afternoon.

Special Event

The special event was held Monday evening at the NASCAR Hall of Fame.  The facility is normally only open from 10 AM to 6 PM, so we had it to ourselves for a private event.



Food (hamburgers or bratwurst with sauerkraut) was provided, as well as an open bar.


The main attractions were simulators for pit crews and races.  Many of the attendees seemed to be having a great deal of fun squaring off against their fellow attendees.





Tuesday

I had four sessions I was presenting on Tuesday:

  • Preparing applications for the Windows App Store
  • Accessing any data from PowerBuilder through OData
  • Using .NET visual controls from PowerBuilder
  • Continuous Integration with PowerBuilder

Because of my own schedule I wasn't able to attend any other sessions.

Wednesday

I had two sessions I was presenting on Wednesday:

  • Using .NET nonvisual assemblies from PowerBuilder
  • Calling advanced web services from PowerBuilder using a proxy web service

That allowed me to attend the only session by another presenter that I was able to get to during the entire conference.  On Wednesday afternoon John Hnat of Foundation Software did an "ISV Discussion on PowerBuilder's Roadmap".



His discussion points were:

  • Are you staying client/server or moving to the web
  • Are you doing any new development
  • Where are you finding / training new developers
  • What features do you want to see in PowerBuilder
  • Are you satisfied with the proposed direction of PB
  • Open mic

Thoughts for next year

  • Location
    • A different location. With the NCPBUG meetings in Charlotte in 2014 and 2015 and Elevate in Charlotte in 2017 we've seen a lot of Charlotte.  I'm ready for a change, as long as it's not Las Vegas.
  • Sessions
    • Do some pre-planning to ensure that sessions aren't overcrowded.  I'd suggest allowing for session selection during online registration so the organizers can get a feel for the size of room needed for each session and which sessions should perhaps be repeated.
    • Schedule specific break times and provide coffee as well as the soda and snacks during the breaks.
    • Build in a bit of a gap between sessions in the same room (e.g., 15 minutes) so that the incoming presenter has time to set up.
    • Modify the conference app for next year so that it includes the name of the speaker for each session.
    • If printed session lists are going to be posted, make sure they include the room number.
  • Workshops
    • For hands-on workshops, consider charging a small fee and using that money to offset the cost of doing the workshop via Amazon WorkSpaces or something similar.  The problem with hands-on workshops where virtual machines are provided on USB sticks is that you can end up spending most of the workshop trying to get the environment set up correctly.  With Amazon WorkSpaces the instructor can get the student machines all configured in advance.  The students then use remote desktop to access the machines during the class and off hours.  I participated in a hands-on workshop like that back in the SAP TechEd days, and it allowed us to get pretty much straight to the training with only a little time needed for setting up the environment.







Tuesday, August 08, 2017

Spell checking in the new PowerBuilder 2017 Rich Text Edit Control

The original Rich Text Editing control shipped with PowerBuilder was based on an OEM of a popular third party control at the time called HighEdit.  By the time 10.5 came out, though, that control was quite dated and no longer supported by the vendor.  As a result, in 2006 Sybase replaced that control with an OEM of another popular third party control called TX Text Control.  There were licensing issues with that control, though, so with the release of PowerBuilder 2017 Appeon updated the control again, replacing the OEM of the TX Text Control with an OEM of the TE Edit Control.  Note that if you encounter regressions with the new control you can switch back to the TX Text Control through a new RichTextEdit option in the Additional Properties dialog for the application object.  If you do, however, you will have to obtain your own license for the TX Text Control.


One feature that became possible when Sybase adopted the OEM of the TX Text Control was in-place spell checking of the text through third party utilities like JRSpell, WSpell or VSSpell.  With the previous control, about the only thing you could do was copy out the unformatted text, spell check that using Word through OLE Automation, and then paste back in the corrected, but unformatted, text.  Many people found losing the formatting of the text an unpleasant side effect of spell checking that way.  The new control allowed spell checking to be done without losing formatting.


In order for those third party utilities to perform in-place spell checking, they need to be provided with the handle of the control containing the rich text.  In an article I did for the PowerBuilder Developers Journal when the control was first introduced, I provided sample code for obtaining the control handle to pass to those utilities.  The sample code was essentially this, where a_rte is a reference to the PowerBuilder rich text edit control:

 ulong        hWin  
 hWin = Handle ( a_rte )  
 hWin = FindWindowEx ( hWin, 0, "PBTxTextControl", 0 )  
 hWin = FindWindowEx ( hWin, 0, "AfxOleControl42u", 0 )  
 hWin = FindWindowEx ( hWin, 0, "TX11P", 0 )  

Starting with the PowerBuilder RTE control, we had to drill down three levels until we reached the part of the OEM TX Text Control that contained the text.  FindWindowEx is a Windows API call that gets a control handle based on a class name.  The text values included here are the class names that PowerBuilder used for the various controls.  Note that each time a new version of PowerBuilder was released that referenced a newer version of the TX Text Control, the last class name used here had to be updated to the class name for that version.
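
Since FindWindowEx isn't a built-in PowerScript function, the code above assumes a local external function declaration along these lines (a sketch - the last argument is declared as long so that 0 can be passed for the window name):

 // Hypothetical declaration; adjust to your own conventions
 FUNCTION ulong FindWindowEx ( ulong hParent, ulong hChildAfter, string lpClassName, long lpWindowName ) LIBRARY "user32.dll" ALIAS FOR "FindWindowExW"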

With the new rich text edit control in PowerBuilder 2017 we use the same technique, though we don't have to drill down quite so deeply.  The code I'm using now in my migrated PowerBuilder 2017 application looks like this:

 ulong        hWin  
 hWin = Handle ( a_rte )  
 hWin = FindWindowEx ( hWin, 0, "Ter24Class", 0 )  


Saturday, July 15, 2017

The return of browser plugins?

Apparently there is a W3C open standard currently in development called WebAssembly that provides "a memory-safe, sandboxed execution environment" for browsers (and non-web use).  The binary format used for WebAssembly deployment can be parsed up to 20 times faster than JavaScript.  While the working group has been focused on providing support for C/C++, a project called Blazor has been developed that provides support for C#.

InfoQ article


Thursday, July 13, 2017

Continuous Integration with PowerBuilder 2017, Bonobo Git and Jenkins

In a previous blog post I examined how we could use the Git MSSCCI provider from PB Software in order to use GitHub as a source code provider for PowerBuilder.

In this blog post we're going to take that to the next step: we're going to create a build machine separate from our PowerBuilder development machine and then set it up to perform continuous integration.  The term "continuous integration" has a somewhat ambiguous definition, but generally it means that:
  • developers check in changes frequently (at least daily), and
  • builds are done on a regular basis (at least daily, but possibly as frequently as after each check-in)
Ideally, automated testing routines would be run on each build so that feedback on any functionality broken by the latest code changes is returned to the developers as soon as possible.  Automated testing is outside the scope of this particular article.

One of the new features added in PowerBuilder 2017 is a license-free standalone compiler, and we're going to use that for this article.  If you are using an older version of PowerBuilder you could take the same approach using the command line argument feature of the PowerBuilder IDE, but it would require installing the full PowerBuilder IDE (including a license) on the build machine.  Alternatively, regardless of which version of PowerBuilder you're using, you could use PowerGen in scripted mode.

Prerequisites

  • A Windows machine (can be virtual) with the .Net Framework 4.6 installed and IIS configured for ASP.Net
  • Bonobo Git Server
  • Jenkins
  • AutoCompile.exe from the C:\Program Files (x86)\Appeon\PowerBuilder 17.0\AutoCompiler directory from your PowerBuilder 2017 install

Install Bonobo Git Server

Bonobo is an ASP.Net application, hence the need for the machine we're installing it on to have the .Net Framework installed and IIS configured for ASP.Net.  Installation is fairly straightforward, as all you need to do is the following (a scripted equivalent appears after the list):

  • Copy the main folder from the unzipped download into the wwwroot folder for IIS
  • Give the IIS User modify and write permissions to the App_Data folder of the app
  • Convert the folder into an Application in IIS Manager
  • Ensure that the app pool that the application uses is based on .Net 4.0 rather than 2.0
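
If you'd rather script those steps, the equivalent commands look roughly like this (a sketch assuming the default web site and folder names; your paths and app pool name may differ):

 REM Copy the unzipped app into wwwroot (source path is illustrative)
 xcopy /e /i "Bonobo.Git.Server" "C:\inetpub\wwwroot\Bonobo.Git.Server"
 REM Give the IIS user modify/write rights on App_Data
 icacls "C:\inetpub\wwwroot\Bonobo.Git.Server\App_Data" /grant "IIS_IUSRS:(OI)(CI)M"
 REM Convert the folder into an IIS application
 %windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/Bonobo.Git.Server /physicalPath:"C:\inetpub\wwwroot\Bonobo.Git.Server"
 REM Point it at a .Net 4.0-based app pool (the pool name varies by system)
 %windir%\system32\inetsrv\appcmd set app "Default Web Site/Bonobo.Git.Server" /applicationPool:".NET v4.5"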

Install Jenkins

The Jenkins install is also fairly straightforward, as it uses a standard Windows installer.  After the install is complete it will generate a random administrator password, write it to a file within the install directory, and tell you where to find it.  You will need that password to do the initial login and configuration of Jenkins.

Once you have started Jenkins it will prompt you as to the plugins you wish to install.  There are 1000+ plugins available for Jenkins.  The primary one we're interested in is the Git plugin, which is part of the default set of plugins that Jenkins recommends.  You can just accept the recommendations.

Install AutoCompile

AutoCompile has a fairly simple install too, and it automatically adds the install directory to the system path.
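
Since the install directory ends up on the system path, a quick sanity check from a new command prompt is (illustrative):

 where orcascr170 pbc170

Both should resolve to the AutoCompiler directory.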

Configure Jenkins

Change the Jenkins URL from localhost to the actual name of the server that it's running on.


Create a Repository in Bonobo

You need to be logged in as the admin user to create a repository.  The only thing you have to provide is the name.


Once you're back at the list of repositories you can use the option there to copy the URL for the new repository to the clipboard.


Configure the PowerBuilder IDE to use the new repository

Technically what you will be doing is configuring the PowerBuilder IDE to use a local repository and then configuring that local repository to use the new repository as a remote.  The detailed steps are outlined in my earlier article about using GitHub.  In summary, the steps are as follows (a command line equivalent appears after the list):


  1. Use TortoiseGit to create a new 'bare' repository in the directory where the PowerBuilder source is located.
  2. Create a small text file (e.g., readme.md) in the directory.
  3. Use TortoiseGit to add and commit the file.
  4. In PowerBuilder, select PGS Git MSSCCI as the source control system for the workspace.  Make sure that "Suppress prompts to overwrite read-only files" is checked.
  5. In PowerBuilder, add the application target(s) to source control.
  6. In PowerBuilder, add the remaining PowerBuilder objects to source control.
  7. In TortoiseGit settings for the directory, configure the Bonobo repository you created above as a remote repository for the local repository.
  8. In TortoiseGit, do a push from the local repository to the remote repository.
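
If you prefer the Git command line to TortoiseGit, steps 1, 3, 7 and 8 correspond roughly to the following (the repository URL is illustrative):

 git init
 git add readme.md
 git commit -m "Initial commit"
 git remote add origin http://yourserver/Bonobo.Git.Server/pfc_ci.git
 git push -u origin master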

Create a new 'FreeStyle Project' in Jenkins



Under the Source Code Management section of the new project, select Git and then provide the repository URL and credentials to connect to the Git server.


Under the Build Triggers, specify any automatic build triggers you want to use.  For this example, I'm going to use "Poll SCM" and configure it to poll every 15 minutes.
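
The Poll SCM schedule uses cron-style syntax; a 15 minute poll looks like this (the H token tells Jenkins to spread polling across the hour rather than firing every job at once):

 H/15 * * * *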


Do a "Build Now" on the project


Because we haven't created any build steps, this won't build anything yet.  What it will do is create a workspace under C:\Program Files (x86)\Jenkins\workspace and populate it with the current source code from the Git repository.

Create ORCAscript and batch files

Create an ORCAScript file to create PBLs from the source code (createpbls.orca):

 start session
 set debug true
 scc set connect property logfile "createpbls.log"
 scc connect offline  
 scc set target "pfc_ci\pfc_ci.pbt" importonly
 scc exclude liblist "pbdom\pbdom170.pbd"  
 scc refresh target 3pass
 scc close  
 end session  

Because Jenkins pulled the source code for us already we don't have to provide source control settings in the ORCAScript file and can use scc connect offline.  "ImportOnly" tells ORCAScript to build PBLs from source.  "Refresh Target 3pass" tells ORCAScript to do a multi-pass compile.  I'm using a sample app based on a recent version of the open source PFC, which now includes a reference to PBDOM.  Therefore I'm using "scc exclude liblist" to tell ORCAScript to ignore that library during the source code import.

Create a batch file that will run the ORCAScript executable on the ORCAScript file (run_orcascript.cmd).

 orcascr170 createpbls.orca  

Create a batch file to call the PowerBuilder stand alone compiler.  We're going to need to pass a number of arguments to the compiler.  Fortunately, the application project object in PowerBuilder shows you the arguments you would need to pass to match the settings in the project object.



Using those arguments - modified slightly because we're deploying in a different location - we should have something like this (run_pbc.cmd):

 pbc170 /d "pfc_ci\pfc_ci.pbt" /o "pfc_ci\pfc_ci.exe" /w n /m n /x 32 /p "PowerBuilder Enterprise Series" /cp "Appeon" /de "Appeon Product File" /v "1.0.0.1" /fv "1.0.0.1"   

Finally, create a batch file that will copy the generated exe and pbd files into another directory when the build is complete (copyfiles.cmd).

 REM Create a folder for this build using the Jenkins-provided variables
 md c:\builds\%JOB_NAME%_%BUILD_NUMBER%\  
 REM Copy the EXE and PBDs from each workspace subdirectory into it
 FOR /d %%a in (*) do copy %%a\*.exe c:\builds\%JOB_NAME%_%BUILD_NUMBER%\  
 FOR /d %%a in (*) do copy %%a\*.pbd c:\builds\%JOB_NAME%_%BUILD_NUMBER%\  

This batch file uses environment variables that Jenkins makes available when the batch file is run to create a separate directory for each build.
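
For example, a hypothetical build number 17 of the pfc_ci job would have its output copied to:

 c:\builds\pfc_ci_17\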

Create Build Steps

Go back into the Jenkins project and under Build add a Build Step that executes a batch command.


Specify the run_orcascript.cmd file as the first batch file to run.


Add another build step after the orcascript step and point this one at the run_pbc.cmd file.  Finally, create one more build step after the run_pbc one and have it run the copyfiles.cmd file.

Do a Build Now on the project

We're going to test out our scripts to make sure that Jenkins can do the build.  Once you've scheduled a build you should see it being processed in the list of builds.


If you click on the build, you'll be taken to another page with more details, including the ability to view the console output from the running build.


Test the SCM poll

Now that we know the build works, let's see if we actually have continuous integration.  Go back into PowerBuilder and check out an object.  Make a trivial change and commit it.  Then, using TortoiseGit, push that change to the Bonobo repository.  Now watch Jenkins: some time after Bonobo has been updated (depending on what you set the SCM polling time to in Jenkins), Jenkins will automatically launch a build of the project.

Next Steps

That gives us the basics of a continuous integration setup.  I'm going to be looking at taking it a bit further.  In particular:

  • There is a Jenkins plugin for the Visual Studio command line build utility (MSBuild).  I'm looking at creating a similar plugin for the PowerBuilder build utility.
  • Integration with JIRA.  Rather than firing a build on every check-in, the check-ins would be tagged with a JIRA issue and the build would only fire when the JIRA issue is moved to the JIRA "Waiting for Deployment" status.