Tuesday, December 23, 2014

Haiti Missions Trip 2014

Monday, December 15th

Arrived in Haiti at 9 AM after 9 hours of flying (including layover).  Went directly to the compound at Blanchard and rested for a bit.  One of the first things I notice is that they've sealed the windows in the guest rooms with plastic and have installed mini-split (ductless) air conditioners.


A bit later that day I start on my first project.  One of the guest rooms is rather large.  It has two A/C units, each cooling half the room.  That's a bit of a waste when a smaller team comes down, or when a mixed-gender team comes down and only one gender can stay in the room.  So, we're adding a wall to split the room in half.  If a smaller team comes down, they can stay in one of the smaller rooms and the other half doesn't have to have the air conditioner running.  If a mixed-gender team comes down, each gender has one of the smaller rooms to themselves.


This year they have a small LED TV set and a DVD player in the break area, so that night a few of us watched a movie and one of the staff popped some popcorn.

Tuesday, December 16th

Continuing to work on the wall in the large guest room.  We got half the framing up Monday (this is what it looked like at the beginning of the day).


And this is what it looked like at the end of the day.  We got plywood and trim up on one side.  We need to get insulation from one of the other sites and install it before we can complete the other side.



Wednesday, December 17th

While waiting for the insulation to be available I'm starting on my second, third and fourth projects.

  • Redo the wifi setup for the compound.  They have several wifi access points in order to provide coverage throughout the property.  When they switched internet providers, the provider came in and set up the wifi routers so that each one was directly connected to the internet and operated on a separate LAN segment.  Some time later somebody with networking experience came down from the US and set the network up as a single LAN with the wifi access points acting as repeaters.  That is, each uses the same SSID, so you can move throughout the property without having to manually reconnect to a new access point.  From the client's perspective, it stays connected to the same access point, but it's actually being handed off between the different access points.  Initially that was good, but eventually too many people had learned what the wifi access code was, and it needed to be changed.  However, the person who set it up either didn't leave them the password for the access points' admin console or they lost it, and they no longer remembered who had done the setup so they could contact them for the password.  Over the next several days I reset them all and then continued the configuration as a single LAN, but split the access points up based on who would be using them (church, school, office staff, short term teams or long term volunteers), with different passwords for the different groups.  I then gave them some printed documentation (including the passwords for the admin consoles).
  • Show vocational instructors how to use ID badging software and hardware.  One of the pastors had acquired a badge printer, signature panel, digital camera and software for a professional grade badging setup.  On Wednesday and again on Sunday afternoon I showed the staff how to use the equipment, so they could then train their students.  Sort of a train-the-trainer approach.
  • Do some maintenance work on one of the pastor's computers.  Something I get a lot in the US as well.  Folks report that their computers are running a lot slower than usual.  I generally find Trojan software installed, often masquerading as anti-virus software.  That was the case here as well.  Took a couple of days to completely clean it up.

Meanwhile, a couple of the local staff have painted the one side of the wall we've completed.  We get the insulation we need and finish the other side as well.  This is what it looks like after the local staff have finished painting it.


Thursday, December 18th

Continue working on my second (wifi reconfigure) and fourth (computer maintenance) projects.

On a side note, on an earlier trip I noted that playground equipment had been installed at the three different sites.  In the last couple of years I haven't been around any of the campuses when the children were using it though.  For example, last year I spent almost all of the trip at a remote site building a house.  Well, since this year I spent almost all my time at the Blanchard campus, I did get to see it in use, and it is well used.


Friday, December 19th

A bit of a break.  Took a tour of the Rhum Barbancourt factory.


We also stopped by the high school that HOM has under development.  When I was here last year they were working on the first floor.  By the time I visited this year they had finished the second floor and were about to start on the third.


They expect to have it finished in time to start holding classes in it for the 2015-2016 school year.  This is just the first of what will be numerous buildings at the site.

Then back to the campus to finish up the wifi setup.

I'd noted last year that they had started working on a new kitchen area with gas fired stoves rather than the charcoal fires they had been using.  Well, it's done and in use.


I also noted last year that the government was well on the way to having the road in front of the high school done.  At that time they had leveled it and were working on the storm drains on either side of the road.  This year, the storm drains are covered, and the road has been paved.



Saturday, December 20th

I've completed the wifi setup except for one access point that is in a room I can't get access to right away (the school), so I'm looking for a fifth project.  I find one: painting some of the walls in one of the guest rooms.  This is what it looked like before I started:


And what it looked like when I was done:


A bit later in the day I finally got access to the last access point and completed the wifi configuration.

Sunday, December 21st

Generally we don't do any project work on Sundays.  First thing is church.  Here I am with one of the children from the orphanage, while one of the staff tries to help me sing in French.  Haitian hymnals just give the French or Creole words; they don't show the tune at all.


After church they usually take the visiting team out for some sight-seeing.  Given that this is my fourth trip, I've already seen most of the local sights.  Also, given that I'm a one-person team this trip, I asked if I could do something a bit different: specifically, visiting Lake Azuei (Etang Saumatre) near the border with the Dominican Republic.  Like its twin, Lake Enriquillo in the Dominican Republic, or the better known Dead Sea in Israel, the lake has no outlet and as a result is a salt lake.  While Lake Enriquillo is saltier than the ocean, Lake Azuei has fresh water to the east and reaches only about 11% salinity to the west.  Unlike the Dead Sea though, swimming is not recommended, particularly because it is a habitat for American crocodiles.  That's one of the biggest reasons the area hasn't been developed, although I did find one company indicating that they were going to be building timeshares near it.


Since we were so close, we decided to head over to the Dominican Republic border crossing.


And then walked across the border to a market that included a restaurant where we had lunch.


Although the island was originally colonized by Spain, France later took over the part that became Haiti, which is why Creole and French are spoken there, whereas Spanish is the primary language in the Dominican Republic.  Because I live in an area that has strong Spanish influences, it felt rather like being at home to be in a restaurant that had Spanish-language music playing in the background and offered Spanish dishes.

As I mentioned earlier, one of the last things I did was some additional "train-the-trainer" computer training.

Monday, December 22nd

Not much to report here.  Headed to the airport for a 10:30 flight and 15 hours of flying (including layover).

Some other observations

A couple of changes that have been in the works for the last few years:

1.  The government has placed an emphasis on providing public schools.

2.  There is now a second international airport, located at Cap-Haitien.  While it technically reopened in February of 2013 after a two-year reconstruction effort, it wasn't until October of 2014 that a major carrier (American) started operating flights to it.

Also, there have been quite a lot of protests against corruption in the government lately, including one just before I arrived that resulted in one death and the resignation of the prime minister.






Tuesday, December 02, 2014

The results from the 2014 PowerBuilder Survey are available


2014 Survey Results

Some of my own observations:

The adoption rate for PowerBuilder 12 seems to have been more rapid than for previous versions of PowerBuilder (at least compared with versions 9 and later).


And it kills me that 6 of the 8 most desired new features for the PowerBuilder Classic IDE were provided in the PowerBuilder.Net IDE.  I think most PB developers don't know what they're missing out on.


There are also a lot of comments included at the end of the survey.  I haven't gone through them all yet myself.


Wednesday, November 19, 2014

PowerBuilder product manager leaves SAP

Sue Dunnell, a Director of Product Management for SAP (primarily for PowerBuilder), has left to join Infosys.  Sue originally worked as a Technical Support engineer with Sybase.  After four years, she took over as Director of Product Management in 2000.  She continued in that role when SAP acquired Sybase.  Sue was primarily known for her work shepherding the PowerBuilder product line.  At this time, a replacement has not been named.


Monday, November 17, 2014

Highlights from VSCONNECT - Day Two

One other announcement from Day One got overshadowed by some of the major announcements.

Visual Studio 2015 Preview is available, which includes:
  •    a preview of .Net 2015
  •    a preview of ASP.Net 5
  •    an Android emulator for Visual Studio.
A pretty good summary of the preview is available here.

This slide was shown in a discussion of ASP.Net 5, but it's applicable to the broader .Net development environment.  The middle column represents the layers.  The left column represents .Net development prior to the VSCONNECT announcements.  The right column represents .Net development after the VSCONNECT announcements.


Note that platform-specific sections of .Net (e.g., WPF) are not being open sourced.

Here's another important diagram from the announcements.


Which is a lot more comforting to .Net developers than their last rather famous diagram from Build in 2011.


There's some pretty good analysis of the announcements on iProgrammer here and here.










Wednesday, November 12, 2014

Highlights from VSCONNECT - Day One

While most people were watching the ESA land a probe on a comet, I was watching the first day of VSCONNECT.  It was a bit of information overload, but here are the main things I got:

  • Future versions of the .Net Framework will be named based on the year they are released.  For example, the next version will be called .Net 2015.
  • The source code for the .Net Framework will be open sourced.  The core portions of it are already available on Microsoft's GitHub server.
  • The core .Net Framework has been ported to Linux and OSX.
  • A new edition of VS.Net, called the Community edition, is now available free of charge to individual developers and small development groups.  It is not an Express version of VS, but a full-featured version that can be used for production development.  Licensing is only required for "enterprise use".

More details on the announcements in day one can be found here and here.  Recordings of the webcast can be found here.

During the "halftime show" there were a lot of questions about the future of WPF.  That has been responded to in a blog post here.

Wednesday, September 10, 2014

Deploying PowerBuilder apps to desktops and mobile devices using Microsoft RemoteApp

Microsoft has recently announced a new feature of Windows Azure called RemoteApp that allows Windows applications to be hosted on Azure and then run via RDP on the desktop as well as on mobile devices, including iOS and Android devices.  We're going to see how well that works for a PowerBuilder application.

The RemoteApp feature, at the time this blog was written, is in preview, and is available free of charge for developers to try out.  Azure itself also has a 30 day free trial program going.  However, there can be a rather long delay between the time you sign up for the RemoteApp preview and when it is approved.  In my case, my 30 day free trial of Azure was expiring by the time I got approval for RemoteApp, and so I had to obtain a pay-as-you-go subscription for Azure and apply for RemoteApp access again using the new subscription.  Note also that the notice they send letting you know that your RemoteApp access was approved is sent to the Microsoft account email address (e.g., hotmail, outlook.com, live.com) that you used to register, so you need to monitor that email address.

RemoteApp comes in two flavors:  Cloud and Hybrid.  The Cloud version is for hosting applications that can run entirely within Azure, including Microsoft Office applications and a few other Microsoft programs provided on the default image, as well as your own custom applications that do not require access to remote resources.  The Hybrid version is for applications that need to access a database or other on-premise resources, and involves the creation of a site-to-site VPN link to allow the Azure cloud to securely access your on-premise resources.  This is a significant feature, as one of the stumbling blocks for many companies in moving to a public cloud offering has been security and control over their database resources.  By providing a Hybrid option that leaves the database resources on-premise, Microsoft has taken great strides toward overcoming those objections.

For the purpose of this walkthrough, we're going to create a custom template image which includes a PowerBuilder application that does not need to access a database (I decided to use the FreeCell sample app from the Sybase CodeXchange site).  We'll then do a straight Cloud deployment. We're going to follow the instructions at How to create a custom template image for RemoteApp and How to create a cloud deployment of RemoteApp to do so.

Windows Azure and Remote App Access
The first step, obviously, is to create an account on Windows Azure and then obtain access to the RemoteApp service.  You can sign up for the free 30 day Azure account at Microsoft Azure Free Trial: Try Azure | Azure Free Trial or the Pay as You Go plan at Pay-As-You-Go.  Once you have one of those, you can request access to RemoteApp via this page: RemoteApp.

Install Azure PowerShell
You'll need to have Azure PowerShell installed on your local machine in order to upload the custom template image to Azure.  Instructions on obtaining and installing Azure PowerShell can be found at How to install and configure Azure PowerShell.

Obtain a Windows Server 2012 R2 Image
The template image we're going to create will be a customization of a Windows Server 2012 R2 image.  Therefore, you will need to obtain a DVD or ISO file to install from.  I used my MSDN Operating Systems account to obtain an ISO image.  Note that images obtained through MSDN are only for development purposes; you would need to obtain a retail license for Windows Server 2012 R2 to move to production.

Create a VHD file
There's a great walk-through on how to create a VHD file on Windows 7 at: Create and Use a Virtual Hard Disk on Windows 7.  The guide on creating the custom template image also has some pretty good instructions.  First, access the Disk Management tool via Control Panel -> System and Security -> Administrative Tools -> Computer Management:

dsikmanager.PNG

1. Right-click Disk Management and then click Create VHD. Follow the prompts that appear.
2. Right-click the new disk and then click Initialize Disk. Click OK.
3. Right-click the new disk and then click New Simple Volume (or select a different volume type, if available). Follow the prompts that appear.

Obtain a Microsoft Hyper-V Product
You'll need to install Windows Server 2012 R2 on the VHD file, install your application, and then configure the image using one of Microsoft's Hyper-V offerings.  Two options are Microsoft Hyper-V Server 2008 R2 and Microsoft Hyper-V Server 2012 R2.  Both are command line only server products.  The 2008 version is available free of charge.  The 2012 version is available as an unlimited evaluation.  I tried both of them, but found that setting them up and configuring them was too much effort to justify.  To use the server products you need to run the Hyper-V manager from another machine and deal with a huge number of configuration issues to get the two machines to talk to one another.  If that's something you feel comfortable with, more power to you.  Otherwise, you might want to consider the third option.

I ended up with the third option, which is to use the Hyper-V manager built into Windows 8.x.  Since I didn't want to dedicate a physical machine for this purpose, I created a virtual Windows 8 machine using VMPlayer, which is available free of charge for non-commercial purposes.  Once again, I used my MSDN Operating Systems account to obtain a Windows 8.1 image.  One thing you need to make sure you do when you create the Windows 8 virtual machine is change the Guest Operating System Version from "Windows 8" to "Hyper-V (unsupported)".  That's because Windows 8's Hyper-V feature needs to see hardware virtualization support inside the virtual machine, and that setting tells VMPlayer to expose it to the guest so the nested Hyper-V capability will work.

hyperv.PNG
What's interesting is that once you get Windows 8 Hyper-V running, install the Windows Server 2012 image on the VHD and start that, you'll actually have *three* different virtualization products running, one inside another, in a sort of Matryoshka doll configuration:  VMPlayer -> Windows 8 -> Windows Server 2012.

Install Windows 2012 R2 on the VHD image file

Once you have Windows 8 (or one of the other Hyper-V products), open the Hyper-V manager, connect to the server (in the case of Windows 8 the manager just connects to the process running on the Windows 8 machine), right click and select "New -> Virtual Machine".  At the "Specify Generation" page of the wizard, leave it set to Generation 1.  On the "Connect Virtual Hard Disk" page, choose the "Use an existing virtual hard disk" option and select the VHD file you created earlier.  On the "Installation Options" page, you can point to the DVD or ISO image that has the Windows Server 2012 R2 setup on it.

selectdrive.PNG
Start and configure the Windows Server 2012 R2 image

Once Windows Server 2012 R2 is installed on the VHD file, start up the image and perform the customization steps listed in the custom template image guide.  In particular:

  • Enable the Remote Desktop Services role and Desktop Experience.
  • Install your application and ensure that it runs on the custom template image.
  • In order to expose your application as a RemoteApp, it needs to appear on the Start Menu.  To do that, create a shortcut and then copy that shortcut to the %SYSTEMDRIVE%\ProgramData\Microsoft\Windows\Start Menu\Programs directory.  (You can also publish by path once the service is provisioned, but adding the shortcut to the Start Menu is easier.)
  • Disable the Encrypting File System (EFS)
  • Sysprep the image

The image will shut down as the final step of the sysprep.  At this point you need to copy the modified VHD file from the Windows 8 image back to the computer that has Azure PowerShell installed on it.

Upload the custom template image

Go back into the Azure Management Console, and under the "RemoteApp" tab select "Template Images" and then the option to "Upload a Template Image".

uploadtemplate1.PNG

Once you give the template image a name, the management console will provide you with a script and a command that uses it, which you will need to run from an Azure PowerShell command prompt with elevated privileges.

uploadtemplate2.PNG
Do that by searching for Azure PowerShell, then right-clicking on it and running it as Administrator.

powershell.PNG
Then navigate to the directory where the script file downloaded from the Azure site is located, and paste in the command to run.

powershell2.PNG
Once you grant it access to run, it will prompt you for the location of the custom template image.  The script will then calculate an MD5 hash for the file and perform the upload.  On my machine, with an Intel i7 processor and a 75/75 FIOS line, it takes approximately 30 minutes to calculate the hash and another 30 minutes to do the upload for an 8GB VHD file.

Create a RemoteApp Service

Now that you have loaded a custom template image, go back into the Azure Management Console -> RemoteApp -> RemoteApp Services and click the option to create a new service.  In the dialog that appears, choose "Quick Create", give it a name, and then select your custom image from the drop down for template images.

createremoteapp.PNG

At that point, Azure will start provisioning the RemoteApp service, which can take 30 minutes or more.

Publish RemoteApp Applications

Once the RemoteApp service has been provisioned, click on it to open up the Quick Start menu.  From that menu, select the "Publishing" option.

pub2.PNG

You have a choice between selecting programs that appear on the Start Menu or just specifying the path to the application.

pub3.PNG

Since we added the program to the Start Menu, we'll choose that option.  The system will then provide us with a list of the programs on the image's Start Menu.  From that, select the application(s) you want the users to have access to.  In this case, we'll make the custom app we added to the image available.

publishremoteapp.PNG
Download the RemoteApp client to your desktop and/or mobile device(s) and access the app

We're ready to run the application.  For the desktop, you can obtain the RemoteApp client from the following location: Microsoft Azure RemoteApp.  For mobile devices (i.e., iOS, Android) you can download them from the respective app stores (iTunes or Play Store).  I don't have (or want) a Windows based mobile device, so I can't address how to install the client there.

Once the client is downloaded, run it and log in to Azure.  For the mobile device apps, this may involve selecting an "Add RemoteApp" option on the client menu.  The application you published should appear, and you can run it from the device.

Here's what the client and application look like when running RemoteApp from my Windows desktop:

windowsmenu.PNG

windowsapp.PNG

Here's the client menu and application running on my iPad mini:

IMG_0614.PNG

IMG_0613.PNG

And finally, here is the client menu and application running on my Samsung Galaxy S5 (Android):

Screenshot_2014-09-09-23-47-31.png


Screenshot_2014-09-09-23-48-02.png
It appears that changes in orientation are not supported on mobile devices (or at least I couldn't figure out how to get them to work); I was restricted to landscape mode.

Also note that since this is based on Remote Desktop, signing in on a second device while the app was running on the initial device did not start up a new instance of the application.  It just transferred control of the original instance of the application to the more recent device login.  Of course, this only applies to logins from the same user, but it is an interesting and potentially useful feature of Microsoft's implementation.

Finally, it may be possible to create the custom image entirely within Azure using the virtual machine options they provide.  I'll leave that as an exercise for the reader.

Friday, July 18, 2014

Creating a REST web service using PowerBuilder.Net 12.5

One of the new features introduced with PowerBuilder.Net 12.5 was the ability to create WCF web services.  That version of the product also introduced a client for REST web services, and a WCF client had been introduced in an earlier version.  One frequent question I heard when presenting the new features in conference or online sessions was when PowerBuilder.Net would provide the capability to create REST services, not just consume them.

Perhaps what few people realized (myself included at the time) is that WCF isn't just for creating SOAP services.  Since .Net 3.5, it has been capable of exposing REST services as well.  So we've actually had the capability to create REST web services with PowerBuilder.Net since 12.5 was released.  In this blog post we'll look at how we do that.

The first thing we need to do is go ahead and create a WCF soap web service.  We're going to use pretty much the same approach that is demonstrated in this SAP D&T Academy video.   One difference is that I'm just going to use an ODBC data source for this sample.  In the video I used an ADO.Net datasource, which is the better approach for anything more than a demo when using .Net targets.

As in that video, I have a datawindow that selects the employees from the EAS demo database.  I also have a structure that has the same configuration as the result set of the datawindow, so transferring the data to an array of that structure can be accomplished through a one-line dot notation call.  The code for the function that retrieves the employees looks like this:

DataStore     lds
long          ll_rc
s_structure   emps[]
Transaction   ltrans

ltrans = create Transaction

// Profile EAS Demo DB V125
ltrans.DBMS = "ODBC"
ltrans.AutoCommit = False
ltrans.DBParm = "ConnectString='DSN=EAS Demo DB V125 Network Server;UID=dba;PWD=sql'"
//ltrans.DBParm = "ConnectString='DSN=EAS Demo DB V125 - 64 bit;UID=dba;PWD=sql'"

connect using ltrans ;

if ltrans.SQLCode <> 0 then
     emps[1].emp_fname = "Connect failed: " + ltrans.SQLErrText
else
     lds = create DataStore
     lds.DataObject = 'd_grid'
     ll_rc = lds.SetTransObject ( ltrans )
     if ll_rc <> 1 then
          emps[1].emp_fname = "SetTransObject failed: " + string ( ll_rc )
     else
          ll_rc = lds.Retrieve()
          if ll_rc < 0 then
               emps[1].emp_fname = "Retrieve failed: " + string ( ll_rc )
          else
               emps = lds.Object.Data
          end if
          disconnect using ltrans ;
     end if ;
     destroy ltrans
end if ;

return emps

I'm running the EAS Demo database in network server mode with TCP enabled as a communications method so that the web service can connect to an already running database.

At this point we can go into the project painter for the service, select that function to be exposed in the service, and run the project (having specified the wcfservice_host.exe file in the webservice.out/bin/Debug directory as what is run when we run the project).  We'll see a command prompt window display, and we should be able to access the WSDL for the SOAP service and create a WCF client for it.
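As a quick check that the SOAP service works end to end, the generated client proxy can be exercised with a few lines of PowerScript.  This is only a rough sketch; the proxy class name (n_employeeservice_client) and function name (f_getemployees) are placeholders for whatever your WCF Client Proxy project actually generates:

// Hypothetical names - substitute the proxy class and function
// generated by your own WCF Client Proxy project.
n_employeeservice_client  lproxy
s_structure               emps[]
long                      ll_count

lproxy = create n_employeeservice_client
emps = lproxy.f_getemployees ( )
ll_count = UpperBound ( emps )
MessageBox ( "Employees", string ( ll_count ) + " rows returned" )
destroy lproxy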

Once we know that part is working, we're going to make a few modifications to turn it into a REST service instead.  The first thing we're going to do is go back into the project painter, select the method we're exposing, and then click on the operational attributes button.  Note that the button won't be enabled until you select a method whose attributes you want to adjust.

operationalattribute.PNG

Within the dialog that appears, select the WebInvoke Attribute category.  Within that category, set the Method to Get and provide a UriTemplate.  In a REST web service, the method is called by adding the UriTemplate onto the end of the root URL for the service.  So in this case, since the service URL is:


The method for retrieving the employees becomes:


Normally in REST services, a GET method is mapped to a retrieve, PUT to an insert, POST to an update, and DELETE to a delete.  Arguments to the method are additional entries on the URL.  For example, we could create a method that returns a single employee record and takes the employee id as an argument.  If the employee id was 123, then the method URL might look like this:


Unless we specify a specific ResponseFormat and RequestFormat, XML is assumed (same as the SOAP service).  REST services are more likely to return JSON though, as it is not as verbose.  We can tell WCF that we want JSON returned instead by specifying that for the ResponseFormat.

attributes.PNG
We're done with the service project.  What we need to do now is make some changes to the <projectname>.config file in the root directory.  First, we need to find the endpoint address entry for the service and change the binding from basicHttpBinding to webHttpBinding.  We're also going to add a behaviorConfiguration attribute to the endpoint and give it a name; we'll define that behavior a bit later in the same file.

Original File:
  <system.serviceModel>
    <services>
      <service name="Sybase.PowerBuilder.WCFNVO.n_customnonvisual"
               behaviorConfiguration="ServiceNameBehavior">
        <endpoint address=""
                  binding="basicHttpBinding"
                  contract="Sybase.PowerBuilder.WCFNVO.n_customnonvisual"
                  bindingNamespace="http://tempurl.org" />

Revised File:
  <system.serviceModel>
    <services>
      <service name="Sybase.PowerBuilder.WCFNVO.n_customnonvisual"
               behaviorConfiguration="ServiceNameBehavior">
        <endpoint address=""
                  binding="webHttpBinding"
                  contract="Sybase.PowerBuilder.WCFNVO.n_customnonvisual"
                  bindingNamespace="http://tempurl.org"
                  behaviorConfiguration="EndpointNameBehavior" />

There should already be a serviceBehaviors section in the file within the behaviors section.  What we're going to do is add an endpointBehaviors section below the serviceBehaviors section.  Give the behavior the same name as you referenced in the new attribute for the endpoint earlier.  The only thing we need to include in it is the webHttp attribute:

    </serviceBehaviors>
      <endpointBehaviors>
        <behavior name="EndpointNameBehavior">
          <webHttp />
        </behavior>
      </endpointBehaviors>
    </behaviors>

With that, we're done.  Redeploy the project so that all of the new settings apply.  You should now be able to open a browser and give it the base URL plus the UriTemplate.  If all is working correctly, you should see JSON being returned.

json.PNG
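For illustration only (the field names follow the structure members, and the exact shape depends on how WCF serializes the result), the returned JSON looks something like this:

     [
       { "emp_id": 102, "emp_fname": "Fran", "emp_lname": "Whitney" },
       { "emp_id": 105, "emp_fname": "Matthew", "emp_lname": "Cobb" }
     ]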

One of the downsides of REST services is that there really isn't a way to automatically generate documentation for how the service operates, like the WSDL for a SOAP service.  You're going to have to develop documentation by hand to let users know how to consume the service.

Now let's look at a method that has arguments.  We're going to create a method that takes a single argument, the emp_id, and returns that employee.  If you have more than one argument, you'll just extend the technique we use here for a single argument.

The code that retrieves a single employee is a slight modification of the code that returns them all:

DataStore     lds
long          ll_rc
s_structure   emp
Transaction   ltrans

ltrans = create Transaction

// Profile EAS Demo DB V125
ltrans.DBMS = "ODBC"
ltrans.AutoCommit = False
ltrans.DBParm = "ConnectString='DSN=EAS Demo DB V125 Network Server;UID=dba;PWD=sql'"
//ltrans.DBParm = "ConnectString='DSN=EAS Demo DB V125 - 64 bit;UID=dba;PWD=sql'"

connect using ltrans ;

if ltrans.SQLCode <> 0 then
     emp.emp_fname = "Connect failed: " + ltrans.SQLErrText
else
     lds = create DataStore
     lds.DataObject = 'd_grid2'
     ll_rc = lds.SetTransObject ( ltrans )
     if ll_rc <> 1 then
          emp.emp_fname = "SetTransObject failed: " + string ( ll_rc )
     else
          ll_rc = lds.Retrieve( Integer ( empid ) )
          if ll_rc < 0 then
               emp.emp_fname = "Retrieve failed: " + string ( ll_rc )
          else
               emp = lds.Object.Data[1]
          end if
          disconnect using ltrans ;
     end if ;
     destroy ltrans
end if ;

return emp

What's not obvious from the code is that the emp_id we're taking as an argument is of type string, and we're converting it to an integer within the code.  We have to pass all of the arguments to the function as string values and handle converting them to the appropriate data types within the method, because all of the arguments passed in on a URL are treated as strings.
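Since the argument arrives as a string, it's worth guarding the conversion inside the method.  A minimal sketch of what that might look like (the error handling here is just illustrative):

// Guard the conversion of the URL argument before using it
integer li_empid

if IsNumber ( empid ) then
     li_empid = Integer ( empid )
else
     emp.emp_fname = "Invalid employee id: " + empid
     return emp
end if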

Let's look at the way we set up the WebInvoke configuration for this operation; you'll see another difference:

Capture.PNG

Note that the UriTemplate is now:

     employee/{empid}

That means the method expects to be called in the form we mentioned earlier, where the argument is obtained from part of the URL itself.  In particular:


The {empid} indicates where an argument will occur and what name it has in the underlying method.  If you've created REST web service clients in PowerBuilder.Net, you should be somewhat familiar with that type of approach.

While this works well when there is only one argument, it's not ideal when a number of arguments need to be passed.  In that case, we can use an alternative UriTemplate approach that uses variable name/value pairs:
Capture.PNG
And the URL used to retrieve a single employee would then be:


Multiple arguments simply follow along, with an & between them.
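To make the two styles concrete, here's roughly how the UriTemplates and the resulting request URLs compare (the host, port and second argument are placeholders, not values from my service):

     Path-style argument:
          UriTemplate:   employee/{empid}
          Request URL:   http://<host>:<port>/employee/123

     Query-string arguments:
          UriTemplate:   employee?empid={empid}&detail={detail}
          Request URL:   http://<host>:<port>/employee?empid=123&detail=full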

For a detailed discussion of how UriTemplates are used, see the "Designing the UriTemplates" section of this guidance from Microsoft:


One last note.  ODBC profiles are separate for 32 bit and 64 bit apps.  PowerBuilder.Net is 32 bit, and the WCF service will (if run on a 64 bit machine) be running as a 64 bit app.  That means the service would need to use a different ODBC profile than the one I used to develop the app.  Further, PowerBuilder.Net has an easier time debugging the WCF service if it's running as 32 bit rather than 64 bit.  Therefore, I actually wanted the WCF service host to run as 32 bit rather than 64 bit for development and debugging.

To accomplish that, I copied the wcfservices_host.exe file in the output folder and renamed the copy to wcfservices_host32.exe.  I then ran corflags on it to change it to a 32 bit application.  I copied the wcfservices_host.exe.config file that PowerBuilder generated and renamed it to match the new executable name.  Next, I marked them both read-only so PowerBuilder wouldn't delete them the next time I deployed the app.  Finally, I modified the service project so it ran the 32 bit executable whenever I wanted to run or debug the service.
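For reference, the corflags step is a single command run from a Visual Studio or Windows SDK command prompt.  Depending on the SDK version the switch is /32BIT+ or /32BITREQ+, and /Force is only needed if the assembly is strong-named:

     corflags wcfservices_host32.exe /32BIT+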