MOSS Integration with IBM Systems using
Microsoft® Host Integration Server 2009
White Paper
Published: January 2010
Applies to: Host Integration Server 2009
Summary: Microsoft Office SharePoint® Server 2007 (MOSS) enables enterprise IT groups to
facilitate collaboration, provide content management, implement business processes, and
supply access to information. Many enterprises have investments in existing IBM programs,
messages and data upon which they rely for mission-critical operations. Microsoft Host
Integration Server 2009 offers technologies and tools that assist IT groups in integrating their
existing systems with new solutions based on MOSS. In this white paper, we will examine how
to apply the new HIS 2009 features, such as the WCF Channel for WebSphere MQ and Entity
Provider for DB2, to building customized MOSS adapters. Along the way, we will compare
MOSS to Windows® SharePoint Services (WSS) where necessary, as well as look at how to
apply compelling technologies in HIS 2009 to building new enterprise integration solutions.
About the Author: Jon Fancey is a co-founder of Affinus, a UK-based consultancy specializing
in Microsoft technologies with .NET Framework, BizTalk® Server, Host Integration Server and
MOSS. In addition, Jon is a member of the Pluralsight technical staff, a Microsoft .NET
Framework training provider, where Jon owns and teaches the SharePoint curriculum. Contact
Jon via his blog at www.pluralsight.com/blogs/jfancey or at [email protected].
Look for information on Microsoft Host Integration Server 2009 on the Microsoft BizTalk Server
2009 Web site (http://www.microsoft.com/biztalk/en/us/host-integration.aspx).
Copyright
The information contained in this document represents the current view of Microsoft Corporation
on the issues discussed as of the date of publication. Because Microsoft must respond to
changing market conditions, it should not be interpreted to be a commitment on the part of
Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the
date of publication.
This white paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES,
EXPRESS, IMPLIED, OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Complying with all applicable copyright laws is the responsibility of the user. Without limiting the
rights under copyright, no part of this document may be reproduced, stored in, or introduced into
a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written
permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual
property rights covering subject matter in this document. Except as expressly provided in any
written license agreement from Microsoft, the furnishing of this document does not give you any
license to these patents, trademarks, copyrights, or other intellectual property.
Unless otherwise noted, the example companies, organizations, products, domain names, e-
mail addresses, logos, people, places, and events depicted herein are fictitious, and no
association with any real company, organization, product, domain name, e-mail address, logo,
person, place, or event is intended or should be inferred.
© 2010 Microsoft Corporation. All rights reserved.
Microsoft, BizTalk, Excel, Hyper-V, InfoPath, Internet Explorer, MSDN, PowerPoint, SharePoint,
SQL Server, Visual C#, Visual Studio, Windows, Windows Server, and Windows Vista are
trademarks of the Microsoft group of companies.
All other trademarks are property of their respective owners.
Contents
Introduction ................................................................................................................................ 4
Business Data as a Commodity ................................................................................................. 6
Integrating with Web Services .................................................................................................... 7
Creating BDC Applications ........................................................................................................13
Working Directly with Host Data ................................................................................................26
Integrating with WebSphere MQ ...............................................................................................36
Using the ADO.NET Data Provider for Host Files ......................................................................47
Applying the Entity Framework for use with DB2 .......................................................................51
Appendix A – Building a Web service using HIS 2009...............................................................60
Appendix B – TI Banking Web Service Application Definition file ..............................................72
Appendix C – Sample Source for a TI Web Part .............................................................73
Appendix D – Example of BDC Associations ............................................................................76
Appendix E – MQ Web Part Code ..................................................................................79
Appendix F – Sample Workflow for use with HIS ......................................................................84
Appendix G – Prerequisite Software for Samples ......................................................98
Conclusion .............................................................................................................................. 100
Introduction
A brief introduction to Microsoft SharePoint Services and Microsoft Office SharePoint Server
2007 is essential before we delve into how to leverage the new features in Host Integration
Server 2009 (HIS) to build new collaboration and content solutions that extend investments in
existing IBM systems.
Windows SharePoint Services 3.0 (WSS) is a technology included in Microsoft Windows
Server® 2003 that enables organizations to increase the efficiency of business processes and
improve team productivity by enabling collaboration and by providing access to documents and
information. Windows SharePoint Services works with ASP.NET 2.0 and Internet Information
Services 6.0 (IIS) to provide IT organizations with a cost-effective foundation for building Web-
based business applications.
Microsoft Office SharePoint Server 2007 (MOSS) is a new server program that is part of the
2007 Microsoft Office system. With MOSS, enterprise IT groups can facilitate collaboration,
manage content, support business processes, and provide access to vital information. MOSS
relies on WSS to provide a consistent framework for building lists, libraries, and sites. MOSS
extends WSS by offering enhanced capabilities, templates, and tools to enable enterprise
search, enterprise content management, portals, business process, enterprise security, and
business intelligence. Two key MOSS technologies are Form Templates and Web Parts, which
we will utilize extensively in the samples discussed in this white paper. MOSS has a
dependency on Microsoft SQL Server® 2005 (SQL) for content storage.
Finally, MOSS 2007 adds a number of other compelling features, such as Microsoft Office
System 2007 integration (Excel®, PowerPoint® and InfoPath®), personalization, and the
Business Data Catalog (BDC). I hope that it is becoming clear that Microsoft designed
SharePoint as an extensible platform and architecture for the enterprise.
HIS 2009 New Features
ADO.NET Entity Framework support for DB2
WCF Channel for WebSphere MQ with new project templates
Transaction Integrator managed runtime with improved performance
Transaction Integrator deployment as WCF services
Visual Studio® 2008 Server Explorer support for DB2 and Host File data sources
ADO.NET Provider for Host Files support for off-line data reader
Support for Visual Studio 2008, .NET Framework 3.5, SQL Server 2008, BizTalk Server 2009,
Windows Vista®, and Windows Server 2008 including virtualization with Hyper-V™
Table 1. New features in HIS 2009.
Yet what exactly is SharePoint? The goal of SharePoint is to provide portal technology with
which people can share and collaborate as easily and efficiently as possible. In this sense,
SharePoint provides a core feature set that acts as an enabler. MOSS takes this a large step
forward with many essential additions that help manage the enterprise and empower people to
share information more easily than ever.
In this paper, I will show how you, the HIS developer, can use SharePoint to power your
applications. Much of what I cover applies to both WSS and MOSS, and I will call out
where this is not the case. One obvious area that I will discuss in depth, and have already
mentioned, is the BDC, a MOSS-only feature.
SharePoint has several central concepts that I will return to and expand on during this paper:
Features
Web Parts
Applications
Documents and lists
Features are the deployment and packaging mechanism in SharePoint. They provide
encapsulation, enabling a particular piece of functionality to be deployed and versioned as a unit.
In addition, you can enable or disable a feature for any site programmatically, via the
SharePoint Web-based UI, or with the command-line tool STSADM. A feature can consist of almost
anything, from a custom list to a Windows Workflow.
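As an illustrative sketch, a feature is described by a Feature.xml manifest; the element names below follow the WSS 3.0 feature schema, but the Id, titles, and file names are hypothetical placeholders, not values from this paper's samples:

```xml
<!-- Hedged sketch of a Feature.xml manifest. The Id and Location values
     are placeholders; each real feature needs its own GUID. -->
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Id="11111111-2222-3333-4444-555555555555"
         Title="Sample Feature"
         Scope="Web"
         Version="1.0.0.0">
  <ElementManifests>
    <!-- elements.xml describes what the feature deploys
         (a list, a Web Part, a workflow, and so on). -->
    <ElementManifest Location="elements.xml" />
  </ElementManifests>
</Feature>
```

Once installed, such a feature could then be activated from the command line along the lines of `stsadm -o activatefeature -name SampleFeature -url http://server/site` (the feature name and URL here are, again, placeholders).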
Web Parts provide discrete pieces of functionality; you can connect them to share data and
combine them to create applications out of reusable components. I will cover what Web Parts
are and how they work extensively in this paper.
A SharePoint application is a top-level container used for holding a collection of Web sites. The
application defines the SQL Server database used to store site content and the URL used to
access it. Within an application, you can create sites based on a provided template or on one of
your own. In this way, a site for team collaboration can be quickly and easily provisioned.
Because the content, including web pages, is maintained at the site level, the term application is
often used to refer to an application/site combination.
SharePoint manages documents as versionable artifacts in the content database. A document
can be almost anything; once added to a document library, you can track, share, or manage it.
With SharePoint lists, you can organize information. You define lists based on metadata that
you can use to display or hide items based on content attributes. A document library is actually
a type of list that holds documents as its items.
Business Data as a Commodity
MOSS offers new ways to build Web-enabled applications rapidly and to surface data and
functionality like never before. One of the cornerstones is the Business Data Catalog (BDC).
The BDC provides a way to abstract data from the mechanism used to obtain it. This allows you
to treat data in standard ways and access it more easily. You can use any Web
Part to display or manipulate data without needing to know what the source of the data is. This
is very powerful. You can declare and reuse the definitions of the data across applications. In
contrast, when pursuing the traditional approach to data access, you must create an application-
specific data access layer (or DAL). With the BDC, less is more. The less code you have to
create, the fewer bugs you will have and the more time you have to work on application features
rather than boilerplate.
Before moving on, it is worth mentioning that the BDC is one of a number of shared services.
Shared services are a scalability and configuration mechanism in SharePoint allowing a
particular service to run on a number of nodes. The Shared Services Provider (SSP)
infrastructure is responsible for sharing services, enabling you to access them from any
SharePoint Web application. Shared Services provided include search, indexing, Excel Services
and the BDC.
The BDC offers two primary mechanisms to access data: as Web services or via ADO.NET. We
will look at the detail of both of these later, but the core idea here is that you can either leverage
your existing Web services investments or connect directly to a data source using any of the
standard mechanisms for doing so. It is worth pointing out now too that the BDC allows not just
the fetching of data, but updating and manipulation of it as well.
One of the first things to address though is how to work with the BDC. The Business Data
Catalog Definition Editor, which is included in the MOSS SDK (see references), provides you
with a means to create the BDC definitions required to declare Application Definition Files
(ADFs). It is probably no surprise that the file format for these is XML-based. The grammar is
defined by a supplied schema that can be used to author (or edit existing) ADFs in Visual Studio
so that you get the IntelliSense® validation to make things easier. The schema, (by default) is
located at C:\Program Files\Microsoft Office Servers\12.0\bin\bdcmetadata.xsd. As you become
more familiar with the BDC, you will find it is sometimes easier to change the raw XML than use
the editor to do so. You can add this support when you add a new XML file to your project: in
the properties window, under Schemas, add this schema to the list.
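For orientation, the root element of an ADF authored against that schema looks something like the following sketch. The Name is illustrative; verify the target namespace against your copy of bdcmetadata.xsd:

```xml
<!-- Hedged sketch of an ADF root element for a Web service LOB system.
     The Name value is hypothetical. -->
<LobSystem xmlns="http://schemas.microsoft.com/office/2006/03/BusinessDataCatalog"
           Type="WebService"
           Version="1.0.0.0"
           Name="BankingService">
  <!-- LobSystemInstances, Entities, and Associations go here. -->
</LobSystem>
```

With the xmlns attribute and the schema registered in the properties window, Visual Studio validates the child elements as you type.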
Integrating with Web Services
To continue this discussion we need a Web service to invoke. See Appendix A for instructions
on creating a sample Web service. Once you create this Web service, you will have exposed a
Host program as a service that we can call via the BDC.
Let us now turn our attention to how we can consume it from MOSS. The first stage of this is to
use the BDC to create a definition of it, generically termed a Line of Business (or LOB) System.
The terminology used by MOSS is significant because its purpose is to take existing business
systems and functionality and open them up via the creation of SharePoint applications as
easily as possible. In this context, a line of business system could consist of a set of Web
services or a backend data source (such as SAP) that we wish to expose in our application.
There are four steps to creating a new LOB System definition:
1. Add a new LOB system
2. Add system connection details (URL, connection string etc.) to it
3. Select entities (tables, service operations)
4. Define any associations between entities (more on that later)
Figure 1. Authoring and executing BDC definitions.
After you create an ADF, you must import the ADF into MOSS. You can use these imported
definitions from the ADF in any MOSS application either by using the out of the box (OOB) Web
parts or by using custom Web parts that you create. See Figure 1 for a depiction of this process.
Figure 1 shows the process of creating and importing an ADF (1). You import the ADF through
the SharePoint admin site (2) and store the ADF in the SQL Server-based content database (6).
When a user (7) accesses a Web application (2) configured to use the LOB system (4) defined
in the BDC, SharePoint retrieves the definition from the SQL store and the Shared
Services Provider host (3) executes the mechanism to invoke the LOB system and return the
data required (5). For now, I will gloss over the details. Do not worry though; we will go into
much more detail on how all this works later.
Figure 2. Add Web Service.
Let us create our first BDC definition. Although the MOSS SDK provides an editor, you must
manually install it. If you have not already installed the BDC editor, run setup.exe from the
following folder:
C:\Program Files\2007 Office System Developer Resources\Tools\BDC Definition Editor
We can start the BDC definition editor (see Figure 2) to create the necessary definitions and
even test them within the editor.
Clicking 'Add LOB System' followed by 'Connect to Webservice' opens a dialog to enter the
URL of the service. Clicking 'Connect' will retrieve the service's details and close the dialog.
Figure 3 shows the results of this, with the right-hand pane showing the available operations
(Web methods) found in the service's WSDL.
Figure 3. Add Entity.
The editor divides the design surface into two areas: a workspace on the left, and a list of
entities on the right. You can drag entities onto the design surface to create their corresponding
BDC definition. In this context, you will use the entity as a resource with which to retrieve data
from the LOB system. Figure 3 shows the GetAccts Web method added to the design surface.
When we are finished adding entities, clicking OK will close this view. The editor will prompt you
to enter a new name for your LOB system. In the sample, I have called it BankingService.
Figure 4 shows the results where I have tidied up the generated definitions, renaming Entity0 as
Accounts.
Name Description
Finder Find all instances of something
SpecificFinder Find a specific instance of something
ViewAccessor Similar to SpecificFinder but can occur multiple times with the BDC
providing an implicit key
Association Relationship between two entities
IdEnumerator Returns the IDs for a set of entity instances
Scalar Returns a single value, usually an aggregate
AccessChecker Provides the rights the current user has over the entity
GenericInvoker Execute arbitrary request such as UPDATE, INSERT etc
Table 2. Entity Method Instance Types.
The final thing we are going to cover now is methods. Once we have an entity definition we can
add method instances to it. A method instance is the definition of one of the BDC-provided
method types. You use each method type for a particular purpose (I will provide more detail on
this later; Table 2 lists the full set). In addition, like any other method you have come across,
BDC methods may also have parameters.
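In the ADF itself, such a method instance boils down to a small XML element. A hedged sketch follows; the attribute values are illustrative of the BDC metadata format rather than taken verbatim from this paper's export:

```xml
<!-- Illustrative: declares a Finder instance over a method already
     defined on the entity, identifying which parameter carries the
     returned rows. -->
<MethodInstances>
  <MethodInstance Name="GetAccounts"
                  Type="Finder"
                  ReturnParameterName="Return" />
</MethodInstances>
```

The Type attribute selects one of the instance types from Table 2, and ReturnParameterName must match a Parameter declared on the enclosing method.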
Figure 4
One of the most useful types, and the first we will look at, is the finder method. Finders retrieve
data from a data source. We can add an instance of a finder by expanding the Entities node and
right-clicking the Instances leaf (Figure 4). Clicking "Add Method Instance" will show a dialog
(Figure 5) where we can select the method instance type. In this case, I have picked the Finder.
Note the Return Type Descriptor box, which details all the items in each data row retrieved by
the Finder. As you can see, there are a number of fields listed, which we have extracted from
the Web service operation's response message contract in its metadata. Clicking OK will close
the dialog and add a new method instance to the tree. I have renamed this to GetAccounts.
When clicked, an Execute button appears on the toolbar. If we click the Execute button now,
and then Execute again on the dialog that appears, we see the results shown in Figure 6.
Figure 5
Now that we have created and tested our LOB, let us look at what the editor created for us. You
can do this by selecting the LOB in the tree view and then clicking Export in the toolbar. You will
find the full source in Appendix B. You can divide the ADF into a few constituents. The
structure of the ADF takes the form shown below. You should recognize this from the steps I
outlined earlier in the creation of a new LOB system.
<LobSystem>
<LobSystemInstances>
{...}
</LobSystemInstances>
<Entities>
{...}
</Entities>
<Associations>
{...}
</Associations>
</LobSystem>
First, you must specify how to connect to the data source, by using the LobSystem /
LobSystemInstance elements. An ADF may contain any number of LobSystemInstance
elements describing different sources, but in practice, it is best to create a separate ADF for
each LOB (there are limitations otherwise; for example, you cannot define a Web service and a
database in the same ADF). This is the basic information required to invoke the LOB System.
After defining the LOB system, you establish the data entities with
which to work.
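As a hedged sketch of what a Web service LobSystemInstance can look like: the URL below is a placeholder, and the property names are the kind the BDC Definition Editor emits, so compare against the real export in Appendix B rather than treating this as definitive:

```xml
<LobSystemInstances>
  <LobSystemInstance Name="BankingServiceInstance">
    <Properties>
      <!-- Illustrative properties: where to fetch the service's WSDL
           and how to authenticate when the service is invoked. -->
      <Property Name="WsdlFetchUrl" Type="System.String">http://localhost/Banking/BankingService.asmx?WSDL</Property>
      <Property Name="WebServiceAuthenticationMode" Type="System.String">PassThrough</Property>
    </Properties>
  </LobSystemInstance>
</LobSystemInstances>
```

The point to take away is that the instance element carries nothing but named connection properties; everything about the data itself lives under Entities.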
Figure 6
The Entities element defines all the data entities. In this example, a single entity models the
service operation I dragged onto the design surface in the BDC Definition Editor (GetAccounts).
You should think of an entity as just that, a named thing that represents the definition of the data
that you will retrieve or manipulate. My distinction between retrieval and manipulation here is
important. Bear this in mind as I will come back to it later; the BDC is heavily geared towards the
retrieval scenario. The entity defines two things: the method to act on the data, and the data
itself. The 'acting' part in our case here is very simple: a Web service operation is the action,
and you express the data as parameters to this operation. Of course, parameters always need
to have a direction, and as you can see from our example in the extract in Figure 7, there are
two input parameters ("name" and "PIN") and one output parameter containing the results of the
call. A parameter has
a Type Descriptor, which is a formal description of the parameter type passed. For the return
parameter, this consists of the values passed back. One other noteworthy item at this point is
the IsCollection attribute on the TypeDescriptor element. This specifies whether the group of
items repeats or not, which in our case, it does.
<Parameters>
<Parameter Direction="In" Name="name">
<TypeDescriptor TypeName="System.String" Name="name" />
</Parameter>
<Parameter Direction="In" Name="PIN">
<TypeDescriptor TypeName="System.String" Name="PIN" />
</Parameter>
<Parameter Direction="Return" Name="Return">
<TypeDescriptor TypeName="BDC.ACCTINFO[],TestBanking" IsCollection="true"
Name="Return">
<TypeDescriptors>
<TypeDescriptor TypeName="BDC.ACCTINFO,TestBanking" Name="Item">
<TypeDescriptors>
<TypeDescriptor TypeName="System.String" Name="ACCOUNTNUMBER" />
<TypeDescriptor TypeName="System.String" Name="ACCOUNTTYPE" />
<TypeDescriptor TypeName="System.Decimal" Name="CURRENTBALANCE"/>
<TypeDescriptor TypeName="System.Int16" Name="INTERESTBEARING" />
<TypeDescriptor TypeName="System.Single" Name="INTERESTRATE" />
<TypeDescriptor TypeName="System.Decimal" Name="MONTHLYSVCCHG" />
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
Figure 7
Finally, the Associations element allows us to relate one entity to another. Entities may declare
an identifier (which can be a composite, and not necessarily a primary key) from which an
association can relate the pair. In this way, the selection of an item from one list in the
SharePoint UI can drive the display of a set of related items in another to create master-detail
pages. Never fear; I will dive into much more detail as we go through this paper, with examples of
how each can be used.
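To make this concrete, an association in the ADF takes roughly the following shape. The entity and method names here are hypothetical; Appendix D contains worked examples:

```xml
<!-- Illustrative association relating a Customer entity to its Accounts.
     The association names a method on the source entity that returns the
     related destination instances. -->
<Associations>
  <Association Name="CustomerToAccounts"
               AssociationMethodEntityName="Customer"
               AssociationMethodName="GetAccountsForCustomer"
               AssociationMethodReturnParameterName="Return">
    <SourceEntity Name="Customer" />
    <DestinationEntity Name="Accounts" />
  </Association>
</Associations>
```

It is this declaration that the Business Data Related List Web Part uses to build master-detail pages without any code.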
Creating BDC Applications
Modeling the Web service as an LOB system in the BDC is only the first part of the story. The
next step is to consume the definitions of the entities we have created from a Web page to start
to create an application. In order to make our ADF available for this use, however, it must be
imported. I have already discussed the Shared Services infrastructure in WSS and that the BDC
makes use of this extensibility. To import the ADF we exported from the BDC Definition Tool, we
need to turn to the Shared Services Administration site. You can access it by clicking Start,
Programs, and then Microsoft Office Server SharePoint 3.0 Central Administration. This shortcut
will launch an Internet Explorer® browser window similar to that shown in Figure 8, from where
you can click 'SharedServices1'.
Figure 8
Under the Business Data Catalog section is a link, 'Import application definition', which will allow
us to import our ADF into the BDC. Click the link and then, on the next page, click the Browse
button. Navigate to the ADF you saved before and OK the File dialog. Then click Import to start
the process. After a few seconds, Internet Explorer will display a results page. You can safely
ignore the warning that IE displays; I will explain what it means later.
Now it is time to create a page to display the data fetched from our LOB System. Before I do
that though, I want to discuss one of the core features of portal development – Web Parts.
What is a Web Part?
Web Parts are controls, similar to other technologies based on the ASP.NET 2.0 architecture.
However, Microsoft designed Web Parts to work together and interact with one another through
a prescriptive, cooperative framework. With Web Parts, you can develop rich functionality while
common concerns, such as security and persistence, are shared and reused in the
infrastructure itself rather than baked into the application. The WebPartManager control
manages Web Parts. There must be exactly one instance of this class on each Web Part page
in SharePoint. The WebPartManager loads and unloads Web Parts, managing their state and
serializing and de-serializing them as required.
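In plain ASP.NET 2.0 terms, the arrangement just described looks something like this sketch of a Web Part page; SharePoint's own templates generate the equivalent for you, so treat this as orientation rather than something you would hand-author in MOSS:

```aspx
<%@ Page Language="C#" %>
<html>
<body>
  <form runat="server">
    <!-- Exactly one WebPartManager per page; it tracks every part's
         state and the page's display mode. -->
    <asp:WebPartManager ID="manager" runat="server" />
    <!-- A zone hosts one or more Web Parts and controls their layout. -->
    <asp:WebPartZone ID="mainZone" runat="server" HeaderText="Main">
      <ZoneTemplate>
        <!-- Web Part controls are declared here, or added at run time. -->
      </ZoneTemplate>
    </asp:WebPartZone>
  </form>
</body>
</html>
```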
Zone Name Purpose
WebPartZone Host control for Web Parts that you add to page.
CatalogZone Used to display a list of WebParts that can be added to a page.
EditorZone Container for editor parts; shown when the page mode is switched to edit.
Editor Web Parts are used to configure other Web Parts on the page and
the layout of the page itself.
ConnectionsZone Used to connect one WebPart to another to enable them to share data.
Table 3. SharePoint WebPart Zones.
SharePoint divides a Web Part page into a number of zones. You can define a collection of
zones using a template. You can think of a zone as an ASP.NET Web control acting as a Web
part container while the template is the page itself. Zones are a formatting mechanism used to
control a particular area of the page to lay out its appearance. There are several types of zone,
listed in Table 3. MOSS provides a number of templates for you as shown in Figure 10.
SharePoint implements templates as ASP.NET master pages. You can create your own custom
templates. You can pre-define master pages, using these as convenient templates for creating
additional content. The base page is a 'boilerplate' that can provide layout, formatting and, in this
case the page management, simplifying the amount of work you need to do on each page. The
templates provided already have pre-defined zones as well as a WebPartManager added to
them.
A Web Part page at its simplest is an ASPX page with a single WebPartManager control
instance and one or more WebPartZones containing one or more Web Part controls. Other
aspects of page processing that the WebPartManager is in charge of are the display modes of
the page, listed in Table 4. Display modes are one of the most powerful features of this model
as they allow not just developers to create and modify pages and configuration at design time,
but also end users at 'run time', i.e. once an application has been deployed. In fact, as far as
SharePoint is concerned, everyone is a user. Users have different permissions modeled as
roles where different people belong to different (or overlapping) roles. Display modes work in
conjunction with the zones to enable or show different zones depending on the mode. For
example, when you switch the page to edit mode, the Web Parts defined in the EditorZone will
become visible, enabling you to configure all aspects of the page and its Web Parts.
Display Mode Purpose
Browse Default mode – show page content
Design Used to add Web Parts to page
Edit Enables ability to edit page content
Catalog Used for picking Web Parts to add from a list
Connect For connecting one Web Part to another
Table 4. Web Part page display modes.
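In code, switching between the modes in Table 4 is a single assignment on the WebPartManager. A hedged ASP.NET 2.0 sketch follows; the page class and handler names are hypothetical:

```csharp
using System;
using System.Web.UI.WebControls.WebParts;

public partial class PartPage : System.Web.UI.Page
{
    // Hypothetical button handler: flips the page from Browse into Edit
    // mode, which makes any EditorZone on the page become visible.
    protected void EditLink_Click(object sender, EventArgs e)
    {
        WebPartManager mgr = WebPartManager.GetCurrentWebPartManager(Page);
        if (mgr != null &&
            mgr.SupportedDisplayModes.Contains(WebPartManager.EditDisplayMode))
        {
            mgr.DisplayMode = WebPartManager.EditDisplayMode;
        }
    }
}
```

Checking SupportedDisplayModes first matters because not every mode is available to every user; the security model can restrict which modes a role may enter.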
MOSS Web parts are very straightforward to create and have the additional benefit that they
can become a reusable asset for future applications. The more time you invest in creating
custom Web parts the more useful you can make them.
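To give a flavor of how little code a basic custom Web Part requires, here is a minimal sketch; the class name and output are hypothetical, and Appendix C contains a complete TI Web Part:

```csharp
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;

// Minimal custom Web Part: derives from the ASP.NET 2.0 WebPart base
// class and emits its output in Render.
public class HelloWebPart : WebPart
{
    public HelloWebPart()
    {
        // Title appears in the part's chrome and in the catalog.
        Title = "Hello Web Part";
    }

    protected override void Render(HtmlTextWriter writer)
    {
        writer.Write("Hello from a custom Web Part.");
    }
}
```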
Figure 9. Create New Page.
Building your first Web Part Page
Let us now see how to create a Web Part page in MOSS. For that, we need a site to host it.
From the navigation bar, click the "My Site" link at the top. This will create a new personal site to
which we can add content. After a minute or two, the site will be ready. Under the Site Actions
menu (see Figure 9) click the Create link and on the displayed page, click Web Part Page under
Web Pages. This will display a page similar to Figure 10. We must give the page a name and
select a layout and a document library. The document library specifies the container in which
our new page will be stored. Clicking the Create button will open the new page as shown in
Figure 11.
Figure 10
We can add a Web Part to the page simply by clicking on one of the Add a Web Part links in
one of the zones displayed. In the page shown, SharePoint depicts three WebPart zones laid
out in a header, navigation, and body style. By clicking the "Add a Web Part" link, you can open
and show the hidden Catalog Zone, a list of available Web Parts in the WebPart catalog. By
default, this will contain all of the Web Parts provided out of the box, but you can also add your
own and categorize them, as we shall see later. SharePoint provides a number of BDC Web
Parts (listed in Table 5) that allow the retrieval of data from a BDC data source (the ADF we
created earlier, for example). Pick the Business Data List Web Part and click Add, as shown in Figure
11. This will close the CatalogZone and add the Web Part to the page.
Figure 11
We now need to configure it to specify what to show from the BDC. Clicking the 'Open the Tool
Pane' link on the Web part will switch the display mode of the page to Edit and display the
EditorZone. Clicking the browse icon next to the 'Type' field at the top of the editor will open the
Business Data Type Picker as shown in Figure 13.
Figure 12
The picker is showing all entities from all imported LOB System definitions from ADF files. There
is one displayed, which is from the ADF we imported earlier. The picker displays each in LOB
System instances/Entity Instance format. Recall that an ADF may contain any number of LOB
system instance declarations so there needs to be a way to disambiguate the entity instances.
Name Description
Business Data List Retrieve and display data from entity defined in BDC
Business Data Related List Retrieve and display data from an entity that is associated
with another
Business Data Actions Displays a list of actions defined for an entity
Business Data Item Retrieve a single entity instance (for a database, this could be
a table row)
Business Data Catalog Filter Provides a picker that allows users to filter by specified
criteria
Table 5. BDC Web Parts.
Pick the one item displayed (see Figure 13) and click OK. Ensure that you have SimHost
running (refer to Appendix A for more details) and click Apply in the Editor. This will apply the
configuration to the WebPart, which will pull the data back from the BDC. The BDC will invoke
the Web service we created earlier which will in turn invoke the TI component and the Host
program to return the account list. When successfully processed, as seen in Figure 14,
SharePoint will display the data from the LOB system.
Figure 13
You can verify that SharePoint is actually calling the Web service by looking at the counts in
SimHost. The count should increment by one each time you refresh the page.
Figure 14
Where are we?
As I hope you can now appreciate, creating Web pages in this way, with no code, is very
powerful. What is also interesting is that it starts to blur the lines between developer and user,
design-time and run-time – and instead empowers all types of user to create, edit and change
pages directly in the browser. Of course, a particular user's ability to do this is strictly
controllable through the security model, which allows the granting of particular permissions as
appropriate, for example to limit the ability to remove Web Parts from pages.
Using the BDC in this way is not without its limitations, however. For starters, the approach of
invoking a Web service has its own drawbacks. It is an additional piece that requires packaging
and deployment. In addition, you must consider the performance and scalability of the service,
for example by sizing Web servers appropriately and load balancing them. One further area of
concern is securing the Web service, which is harder to do when the BDC acts as the bridge
than when you implement your own code to make the call.
It may also have occurred to you that calling a Host program in this way introduces an
additional, arguably unnecessary step; that of calling the Web service compared to going
straight to the data source (e.g. a database or application). We can achieve a more efficient
solution through the creation of a custom Web part that will invoke the Host program directly
and thus avoid the HTTP-based Web service call, which is relatively expensive in terms of
introducing additional latency.
One reason you may have to resort to creating a custom Web part rather than using the BDC
directly with a Web service is that the BDC only supports Web service operations that take
simple types as their parameters. This is not how people build Web services today. Instead,
developers use XML schema to define the message types (as complex types in XSD) for the
request and response, which are exchanged as a single input parameter and a single return
parameter on the operation. WCF takes this approach by default, whereas ASMX (ASP.NET
Web services) takes the alternative, simple-type approach by default (although it too supports
both). Unfortunately, there is currently no way with the BDC to specify the schema to use for a
particular Type Descriptor.
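To make the distinction concrete, here is a minimal sketch contrasting the two parameter styles. The types and names here are hypothetical, not from the Web service built earlier; only the simple-typed form is something the BDC can bind Type Descriptors to.

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical message type for the complex-typed (document) style:
// all inputs travel in one XSD-defined complex type.
public class GetAccountsRequest
{
    public string Name;
    public string PIN;
}

public static class ParameterStyles
{
    // Simple-typed style: one scalar parameter per input.
    // This is the shape the BDC can consume directly.
    public static string GetAccountsSimple(string name, string pin)
    {
        return name + "/" + pin;
    }

    // Complex-typed style (the WCF default): a single message parameter.
    public static string GetAccountsComplex(GetAccountsRequest request)
    {
        return request.Name + "/" + request.PIN;
    }

    // Serializing the request shows the complex type as it appears on the wire.
    public static string ToXml(GetAccountsRequest request)
    {
        var serializer = new XmlSerializer(typeof(GetAccountsRequest));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, request);
            return writer.ToString();
        }
    }
}
```

Both operations do the same work; the difference is purely in how the inputs are shaped, and only the first shape is visible to the BDC.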
Creating Custom Web Parts

The principal alternative, given the issues and limitations discussed above, is to create your own
Web parts. MOSS (and therefore Windows SharePoint Services 3.0 and ASP.NET 2.0) provides
a framework for doing this. A Web part must inherit from the WebPart base class. There are a
number of key methods you can implement that the Web Part infrastructure invokes during the
page and part lifecycle. See Figure 15.
Figure 15 shows the interaction between the .aspx page, the Web Part manager and the Web
parts on the page. Although this is an event-centric model, most of the methods you implement
are actually overrides of the base classes rather than delegates (which you would typically use
for such things as event handlers on controls, OnClick for example). Moreover, some of the
methods you will implement are neither overrides nor delegates, but are looked for by the
SharePoint plumbing, with calls dispatched to them as appropriate in the page's lifecycle. We will
look at an example of this later; the method name is actually unimportant, it is the fact that it
carries a particular attribute which causes it to be invoked.
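As a sketch of that mechanism, reflection can find and invoke any method carrying a given attribute, regardless of what the method is called. The attribute name below is made up for illustration; it is not the real SharePoint attribute.

```csharp
using System;
using System.Linq;
using System.Reflection;

// Stand-in attribute; SharePoint's real connection attributes differ.
[AttributeUsage(AttributeTargets.Method)]
public class PlumbingCallbackAttribute : Attribute { }

public class SamplePart
{
    public string Received;

    // The name of this method is irrelevant to the dispatcher below;
    // only the attribute matters.
    [PlumbingCallback]
    public void AnyNameAtAll(string data)
    {
        Received = data;
    }
}

public static class Dispatcher
{
    // Invoke every public method on the target that carries the attribute.
    public static void Dispatch(object target, string data)
    {
        var methods = target.GetType().GetMethods()
            .Where(m => m.GetCustomAttribute<PlumbingCallbackAttribute>() != null);
        foreach (var method in methods)
            method.Invoke(target, new object[] { data });
    }
}
```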
Figure 15. Interaction between page, Web Part Manager and Web Part (simplified).
SharePoint bases Web parts on the same inheritance hierarchy as regular ASP.NET 2.0 Web
parts but there are some important differences, summarized in Table 6.
Feature                ASP.NET                                          WSS 3.0/MOSS
Persistence            Any store through API                            SQL Server only
Database access        Custom code                                      BDC (MOSS only)
Re-hosting             User controls can be auto-rehosted as Web Parts  Not possible; see SmartPart (ref) for one solution
Web part management    WebPartManager                                   SPWebPartManager
Table 6
To start, we need to create a class library project. Here, I have called the class TIWebPart. The
code below shows the class, inheriting from WebPart. You will need to add a reference to the
System.Web assembly first in order to use this namespace. As we saw earlier, the
CreateChildControls method is called after the OnInit event and this is the place to instantiate
any controls our WebPart is hosting. Of course, by taking over the WebPart development
ourselves, we also take on the responsibility of how to render the data in the UI. To keep things
simple, I have used a SharePoint control, SPGridView, which is simply bound to the data
returned from the Web service. SPGridView provides the SharePoint standard 'look and feel' of
the built-in Web Parts. Since we are using SharePoint components, you will also need to add a
reference to Microsoft.SharePoint.dll, which can be found at \Program Files\Common
Files\Microsoft Shared\Web Server Extensions\12\ISAPI\Microsoft.SharePoint.dll.
namespace Microsoft.HIS2009.Samples
{
    using Microsoft.SharePoint.WebControls;  // for SPGridView

    public class TIWebPart : System.Web.UI.WebControls.WebParts.WebPart
    {
        // UI controls
        private SPGridView grid;

        protected override void CreateChildControls()
        {
            base.CreateChildControls();
            grid = new SPGridView();
            grid.AutoGenerateColumns = false;
            this.Controls.Add(grid);
        }
    }
}
As our goal is to call the Host program as directly as possible, we'll do this by referencing the TI
assembly directly in the Web part. This will enable us to invoke the method on the interface we
created in the TI project, which in turn will call the host program and return the data we want.
We'll then allow the data we return to be connected to another Web part that will display it on
the page. The WebPartManager calls OnPreRender prior to asking the WebPart to render itself.
Here you can see that we are creating an instance of the TI class in the TI assembly. The
GETACCTS method will invoke the Host program we saw earlier, with the ACCTINFO structure
that we defined in the TI Designer used to hold the results.
We create an ADO.NET DataTable so that we can populate it with the results of the call and
bind the grid to it. This makes displaying the results as straightforward as possible for the
purposes of this example.
protected override void OnPreRender(EventArgs e)
{
    // m_Name and m_PIN are fields on the Web Part holding the user's credentials
    Banking bank = new Banking();
    ACCTINFO[] accounts = bank.GETACCTS(m_Name, m_PIN);

    DataTable table = new DataTable();
    table.Columns.Add("ACCOUNTNUMBER");
    table.Columns.Add("ACCOUNTTYPE");
    table.Columns.Add("CURRENTBALANCE");
    table.Columns.Add("INTERESTBEARING");
    table.Columns.Add("INTERESTRATE");
    table.Columns.Add("MONTHLYSVCCHG");

    for (int i = 0; i < accounts.Length; i++)
    {
        table.Rows.Add(accounts[i].ACCOUNTNUMBER,
                       accounts[i].ACCOUNTTYPE,
                       accounts[i].CURRENTBALANCE,
                       accounts[i].INTERESTBEARING,
                       accounts[i].INTERESTRATE,
                       accounts[i].MONTHLYSVCCHG);
    }

    DataSet ds = new DataSet();
    ds.Tables.Add(table);

    foreach (DataColumn col in table.Columns)
    {
        SPBoundField fld = new SPBoundField();
        fld.HeaderText = col.ColumnName;
        fld.DataField = col.ColumnName;
        grid.Columns.Add(fld);
    }

    grid.DataSource = ds;
    grid.DataBind();
}
Finally, to implement the method, you must "draw" the control. When performing the actual
render you must choose between two overrides, Render or RenderContents, both declared in
the WebControl class in the inheritance hierarchy. The difference between them is that
RenderContents renders only the inner contents of the control, whereas Render gives you full
control over its appearance (see code below). You should not override Render, because the
burden of rendering not just the Web Part's contents but its container, including all the
SharePoint "chrome" (such as the surrounding top-level HTML table), then falls to you. An
important feature of SharePoint is the consistent look and feel it provides, which is intended to
give users a comfortable, familiar experience when working with any SharePoint application.
protected override void RenderContents(System.Web.UI.HtmlTextWriter writer)
{
    grid.RenderControl(writer);
}
The finished result is shown in Figure 16. As you can see, it looks almost the same as the BDC
example. The SPGridView is the basis for many of the MOSS-provided Web Parts and we could
actually go further with a bit more customization and make it look the same as Figure 14 earlier,
with sorting, filtering etc. It also has the additional benefit that it works in WSS 3.0 and not just
MOSS.
Figure 16
Now that we have created our Web Part, we need to deploy it. There are a few options here, so
we will start with the quickest. Later we will look at using Features, which I mentioned in the
introduction, to package and deploy a Web Part; that technique is more appropriate for
deploying to a server farm or to environments other than your own development one. At a
minimum, in order to use our shiny new Web Part we must create a .webpart file and add a
SafeControl element to web.config for it. The .webpart file is the Web Part description file and
provides details about the Web Part such as the type name, the name to display and a
description. This is shown below. Add this to your project and call it TIWebPart.webpart.
<webParts>
  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
    <metaData>
      <type name="Microsoft.HIS2009.Samples.TIWebPart" />
      <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
    </metaData>
    <data>
      <properties>
        <property name="Title" type="string">Transaction Integrator Web Part</property>
        <property name="Description" type="string">Provide access to Transaction Integrator</property>
      </properties>
    </data>
  </webPart>
</webParts>
Next, copy the project binary and the corresponding TI assembly to the bin folder under the site
that you provisioned. For example, if you created a new site collection, it will have created a new
site in IIS with a new port. If you are using the default, it will be port 80 and you will need to copy
the files to C:\inetpub\wwwroot\wss\VirtualDirectories\80\bin. Then add the SafeControl element
to the SafeControls section in web.config. The format of the SafeControl element is:

<SafeControl Assembly="[Assembly Name]" Namespace="[Namespace]" TypeName="*"
             Safe="True" />
To use the Web Part, we will create a new page as before. Enter a name for the page and
choose a template. When IE shows the page, click the 'Add a Web Part' link, but this time
when the Web Part catalog opens, click 'Advanced Web Part gallery and options' as shown
in Figure 17. The Add Web Parts editor will now open. Click the Browse bar at the top, and
select Import. This will show the Import page (see Figure 18). Next, click the Browse button, and
locate the .webpart file you just created.
Figure 17
Click the Upload button, and the Transaction Integrator Web Part will now appear (Figure 18,
right-hand picture). You can now drag and drop this onto one of the Web Part zones on your
page. The result should look like Figure 14 shown earlier. This quickly becomes a bit laborious.
Later, we will look at how to add your Web Parts permanently to the Catalog shown in Figure 17
rather than having to upload them each time.
Figure 18
Working Directly with Host Data
As I have already mentioned, the BDC is not just for consuming Web services, but direct data
content as well. Although the BDC provides special support for working with SQL Server and
Oracle databases, it supports any database provider that ADO.NET does (except managed
providers, currently). This of course includes the providers shipped with HIS 2009, including
DB2 and Host Files. SharePoint supports both ODBC and OLE DB providers. This means that
we can access data from a wide array of sources in a uniform way. It is worth thinking about
that for a second. What the BDC enables is the ability to abstract where the data comes from,
how it is accessed and how it is delivered to you, through an XML-based grammar. As a
developer you are able to deal with just the data; the fact that it came from Oracle or DB2 is
largely irrelevant (although not completely, as I will explain later). This makes the BDC a direct
replacement for the traditional data access layer (DAL), which is often project- and
database-specific. The amount of effort this can save, and the opportunity for reuse, should not
be underestimated. Do not forget that the less code you have to write, the fewer bugs you will
have to find and fix.
The first thing to do is to create the connection details for the data source. For this, we can use
the Data Access Tool (DAT), provided with HIS. The DAT walks you through the creation of a
data source (see references for full details). Once we have a connection string, we can use it to
connect to the data source through the BDC Definition Editor.
In this example, I am using the Microsoft DB2 OLEDB provider and have configured it against
the local IBM DB2 SAMPLE database that is created as part of the install if selected (see the
prerequisites section for full details on what you will need to try this for yourself).
The next obvious step is to open the BDC Editor and author the application definition. However,
at the current time this will not work, due to a compatibility issue between the HIS providers and
the editor. The product group is looking into the issue; for now, the alternative is to create the
ADF manually.
Remember that the ADF consists of three main things: the connection details, entity definitions,
and associations between entities (if required). We have the connection string from the DAT, so
let us turn it into a <LOBSystemInstance>.
The snippet in Figure 20 shows the connection details derived from the DAT. The first thing to
note is that any property prefixed with "rdbconnection" is passed straight through to the provider
itself. This means that you can control any of the provider-specific configuration, as required.
Put another way, Figure 20 is equivalent to the following connection string:
Provider=DB2OLEDB;User ID=<user>;Password=<password>;Host CCSID=37;PC Code
Page=1252;Units of Work=RUW;DBMS Platform=DB2/NT;
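The mapping is mechanical: strip the "rdbconnection " prefix and emit each remaining name/value pair. A small sketch of that transformation follows; the helper is illustrative, not part of the BDC.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class RdbConnection
{
    const string Prefix = "rdbconnection ";

    // Turn rdbconnection-prefixed ADF properties into a connection string,
    // ignoring any property without the prefix.
    public static string ToConnectionString(
        IEnumerable<KeyValuePair<string, string>> properties)
    {
        var parts = properties
            .Where(p => p.Key.StartsWith(Prefix))
            .Select(p => p.Key.Substring(Prefix.Length) + "=" + p.Value);
        return string.Join(";", parts) + ";";
    }
}
```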
As can be seen in the BDC Editor in Figure 19, there are four types of data connection, listed in
Table 7. The DatabaseAccessProvider <Property> represents this choice in the corresponding
ADF XML and takes one of these values as its contents.

DatabaseAccessProvider   Data Provider
SQLServer                Microsoft SQL Server version 7 onwards
Oracle                   Oracle Database version 9i onwards
OleDb                    Use an OLE DB provider
ODBC                     Use an ODBC driver
Table 7. BDC Data Sources.
Figure 19
<LobSystemInstances>
  <LobSystemInstance Name="LOBSampleInstance">
    <Properties>
      <Property Name="rdbconnection Provider" Type="System.String">DB2OLEDB</Property>
      <Property Name="rdbconnection DBMS Platform" Type="System.String">DB2/NT</Property>
      <Property Name="rdbconnection Host CCSID" Type="System.String">37</Property>
      <Property Name="rdbconnection PC Code Page" Type="System.String">1252</Property>
      <Property Name="rdbconnection Units of Work" Type="System.String">RUW</Property>
      <Property Name="rdbconnection User ID" Type="System.String">{user}</Property>
      <Property Name="rdbconnection PWD" Type="System.String">{password}</Property>
      <!-- additional properties removed -->
    </Properties>
  </LobSystemInstance>
</LobSystemInstances>
Figure 20. LOB Connection Details.
Now that we've defined the connection properties, let's continue by adding an entity definition. In
the DB2 Sample database there is a table called EMPLOYEE. Consider the XML shown below.
<Entity Name="EmployeeEntity" EstimatedInstanceCount="100">
  <Methods>
    <Method Name="FindEmployees">
      <Properties>
        <Property Name="RdbCommandType"
            Type="System.Data.CommandType, System.Data, Version=2.0.0.0,
            Culture=neutral, PublicKeyToken=b77a5c561934e089">Text</Property>
        <Property Name="RdbCommandText" Type="System.String">
          SELECT EMPNO, LASTNAME FROM EMPLOYEE WHERE LASTNAME LIKE ?
        </Property>
      </Properties>
      <Parameters>
        <Parameter Direction="In" Name="LAST_NAME">
          <TypeDescriptor TypeName="System.String" Name="LAST"/>
        </Parameter>
        <Parameter Direction="Return" Name="EmployeeTypes">
          <TypeDescriptor TypeName="System.Data.IDataReader, System.Data,
              Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
              IsCollection="true" Name="EmployeeTypesDataReader">
            <TypeDescriptors>
              <TypeDescriptor Name="EmployeeTypesDataRecord"
                  TypeName="System.Data.IDataRecord, System.Data, Version=2.0.0.0,
                  Culture=neutral, PublicKeyToken=b77a5c561934e089">
                <TypeDescriptors>
                  <TypeDescriptor Name="EMPNO" TypeName="System.String" />
                  <TypeDescriptor Name="LASTNAME" TypeName="System.String" />
                </TypeDescriptors>
              </TypeDescriptor>
            </TypeDescriptors>
          </TypeDescriptor>
        </Parameter>
      </Parameters>
      <MethodInstances>
        <MethodInstance Name="EmployeeTypesInstance" Type="Finder"
            ReturnParameterName="EmployeeTypes"
            ReturnTypeDescriptorName="EmployeeTypesDataReader"
            ReturnTypeDescriptorLevel="0" />
      </MethodInstances>
    </Method>
  </Methods>
</Entity>
This contains a lot of information, but it breaks down quite simply. An entity is a representation of
some data in the LOB system. It has a name and a collection of methods that define how the
data is retrieved. In this example I have a single method with two parameters.
In contrast to the Web service example earlier, there are two new <Property> elements on the
entity: RdbCommandType and RdbCommandText. RdbCommandType specifies whether a raw
SQL string or the name of a stored procedure is provided, taking the value "Text" or
"StoredProcedure" respectively. RdbCommandText then contains either the SQL itself or the
name of the stored procedure to call. Here, I have specified a SQL SELECT statement.
As we saw earlier, a parameter has a name and direction (such as In or Return), while the
<Parameter> element contains a <TypeDescriptor> element defining the parameter type. The
parameter name must match the name of a replaceable parameter in the command text SQL, or
the name of a parameter for the supplied stored procedure. Note that this is one area where the
database implementations differ. While SQL Server allows the names to be specified in the SQL,
as in WHERE name = @name, the Microsoft DB2 provider does not. Instead, it requires that
positional parameter markers are specified with a "?" and cannot be named. The value of the
LAST_NAME parameter will be used to replace the ? at evaluation time. Note that this makes
the order of the parameters important, as care must be taken to ensure they match the SQL or
stored procedure. This raises the question "where does the value come from?", as usually you
would want to supply a dynamic value, or you might as well hard-code it in the SQL expression
itself. This is answered by Filters.
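To illustrate why the order of positional markers matters, here is a small sketch that substitutes values into "?" markers strictly in the order supplied. A real provider binds parameters rather than splicing text into the SQL; this helper exists only to make the ordering visible.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

public static class PositionalSql
{
    // Substitute values for "?" markers in order; position is the only binding.
    public static string Show(string sql, IList<string> values)
    {
        var result = new StringBuilder();
        int next = 0;
        foreach (char c in sql)
        {
            if (c == '?')
            {
                if (next >= values.Count)
                    throw new ArgumentException("not enough parameter values");
                // Quote and escape only so the result reads as SQL.
                result.Append('\'')
                      .Append(values[next++].Replace("'", "''"))
                      .Append('\'');
            }
            else
            {
                result.Append(c);
            }
        }
        return result.ToString();
    }
}
```

Swap two values in the list and they silently land in the wrong markers, which is exactly the mistake the text warns about.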
Filters provide a mechanism to dynamically supply values to parameters from a variety of
sources. For example, the provided BDC Data List Web Part can render filters as input boxes to
take a value from the user which is then plugged into the Filter. This is illustrated in the XML
shown below. In addition, default values can be specified for both parameters and filters. The
two are associated by the specification of the AssociatedFilter attribute on the <Parameter>.
<FilterDescriptors>
  <FilterDescriptor Name="LASTNAME" DefaultDisplayName="SURNAME"
      Type="Comparison"/>
</FilterDescriptors>
<Parameter Direction="In" Name="name">
  <TypeDescriptor AssociatedFilter="LASTNAME"
      TypeName="System.String, mscorlib, Version=2.0.0.0, Culture=neutral,
      PublicKeyToken=b77a5c561934e089"
      Name="name" />
</Parameter>
However, any Web Part that is a connection point provider can be wired up to a Web Part filter.
A good illustration of this is the Query String Web Part. Consider a Web page that has a
Business Data List based on our EMPLOYEE entity, as shown in Figure 21. We would like to
specify that the LAST_NAME parameter can come from one of two sources, depending on how
the user got to the page. For this we can specify a filter as well as a Query String Web Part.
What this Web part allows is the mapping of a query string parameter in the browser's URL to a
Filter. This means that a URL can be used to navigate to a page with specific data on it, passed
in the query string.
Filters are specified on methods in the editor. Expanding a method (under the Methods node)
will show a number of additional nodes, including Filters. Right-clicking the Filters node and
selecting "Add Filter" will display the filter property sheet. The Filter Type property will display
the dropdown shown in Figure 21. The Type in the FilterDescriptor determines how the filter's UI
is rendered and controls the operator that is shown on the filter. Figure 23 shows an example. A
filter has three constituents: the item to filter, the operator and a value. The value chosen for the
Filter Type actually affects two attributes in the ADF: the Type and an optional <Property>
element that can appear under the <FilterDescriptor>. Each of the options affects the UI of the
filter in different ways. A Wildcard filter is intended to be used in conjunction with the SQL LIKE
clause. It provides four operators: "is equal to", "starts with", "ends with" and "contains".
Figure 21. BDC Definition Editor Filter Types.
The operator chosen at runtime actually constructs the value passed to the SQL query, by
prefixing or suffixing it with the SQL "%" wildcard, or in the case of "contains", both. Figure 22
shows the different valid values for the Type attribute. As you can see, this differs from the list in
Figure 21. This is because the editor combines some of the options available through the two
values, Type and Property. For example, the Comparison type can specify the optional
<Property> element (see below), which defines which comparison to use (from the list in
Figure 21). Thus the first six items in the list all map to the Comparison type with varying
property values.
<Properties>
  <Property Name="Comparator" Type="System.String">LessThanEquals</Property>
</Properties>
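The wildcard behavior described above can be sketched as a simple mapping from the chosen operator to the value actually passed to LIKE. The operator names are as listed; the helper itself is illustrative, not BDC code.

```csharp
using System;

public static class WildcardFilter
{
    // Map a Wildcard filter operator to the value passed to SQL LIKE.
    public static string ToLikeValue(string op, string value)
    {
        switch (op)
        {
            case "is equal to": return value;
            case "starts with": return value + "%";
            case "ends with":   return "%" + value;
            case "contains":    return "%" + value + "%";
            default: throw new ArgumentException("unknown operator: " + op);
        }
    }
}
```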
Recall from earlier that Visual Studio IntelliSense for the BDC schema can be enabled by adding
a reference to the schema in the Schemas property of the ADF XML document's property sheet.
Figure 22
The full XML for the ADF file, including the filter definitions, is provided in Appendix D.
Figure 23
Associations can be viewed as an alternative to Filters, allowing entities to be related to one
another by their identifiers, similar to foreign keys in a relational database. This compares to
filters, which allow a value from one list to be made available to another. Although this probably
won't make sense right now, I will come back to it at the end of this section to consider it more
fully. Associations make it possible to navigate a relationship such that one Web Part can drive
what is queried by another, to create a master-detail style page, for example. MOSS provides
the Business Data List Web part we've already used as well as the Business Data Related List.
This Web part enables the picking of an entity that has a defined association to another in the
BDC (if an entity has no associations, it won't appear in the Web Part picker at all). Figure 24
shows an example of a related list with the Employee and Department tables of the DB2
SAMPLE database.
Figure 24
Once the associated entity has been set on the related list, the list can be connected to the
source BDC Data List. This is achieved by clicking the edit dropdown on the source Web Part
and clicking Connections, which brings up a submenu with "Send Selected Item To"
highlighted. Clicking on this will bring up a list of compatible Web Parts; in our case it will
show one item, the BDC Related List Web Part (see Figure 25). Under the covers, this uses
connection points, something that we will look at more deeply later.
Figure 25
As you can see, MOSS renders a radio button in the first column of the business data list which,
when selected, will automatically refresh the related DepartmentEntity list. Selecting an item
passes the identifier value, as defined in the BDC, to the related part, which then performs a
query as defined in its SpecificFinder method. This raises two important points. Firstly, your
SQL needs to be written with this in mind; it's very likely that even if you use a tool to create
the ADF it will require hand-tweaking. Secondly, if you want to use this functionality you must
provide both Finder and SpecificFinder methods. Remember the warning we saw earlier when
importing an ADF file? The text of that warning is similar to the one we get for this ADF:
No method instance of type SpecificFinder defined for for application
'TechEdDemo', entity 'EmployeeEntity'. Profile page creation skipped.
We can now understand what this means. What it's telling us is that there's no SpecificFinder
method on our entity. So what? SpecificFinder allows the retrieval of a single entity instance
(and will error if the specified query returns more than one). When an ADF is imported into the
BDC, it will automatically create a Web page, called a profile page, which is preconfigured to
display a single entity instance (such as a database table row). To do this, the SpecificFinder
must also specify at least one identifier, as that is what is used to look up a single instance. It is
then possible to pull up any entity instance using a URL to the page with a query string in a Web
browser.
Let us fix this up. There is currently an issue with my previous example. If the source entity
identifier is not unique, when a non-unique row's radio button is selected, all the rows with the
same value for the identifier are selected. This does not affect the destination entity-bound
related list, provided that it *is* unique when the retrieval operation is executed; remember, a
SpecificFinder must only return a single instance. To correct this, we must specify another
identifier in the EmployeeEntity, as shown below.
<Identifiers>
  <Identifier Name="DEPTNO" TypeName="System.String" />
  <Identifier Name="LASTNAME" TypeName="System.String" />
</Identifiers>
Together, this pair is unique. Next, we must ensure the value for this identifier is set, so it must
be referenced from the return parameter's type descriptor. We do this as before, by adding the
IdentifierName attribute to the TypeDescriptor element, with a value that matches the
LASTNAME entry in the <Identifiers> node. This fixes the first problem, where multiple items are
selected in the source list. However, we will now get an error in the related list when an item is
selected:
Could not find fields to insert all the Identifier Values to correctly execute an Association with
Name 'EmployeeToDepartment'. Ensure input Parameters have TypeDescriptors associated
with every Identifier defined for all the Source Entities in this Association.
The reason for this is that we must actually provide an input parameter specifying the same
identifier, to ensure the identifier is always set. However, we do not have a value, and therefore
must provide a default value as follows:
<Parameter Direction="In" Name="DEPTNO">
  <TypeDescriptor TypeName="System.String" Name="DEPTNO"
      IdentifierName="DEPTNO">
    <DefaultValues>
      <DefaultValue Type="System.String"
          MethodInstanceName="EmployeeTypesInstance">%</DefaultValue>
    </DefaultValues>
  </TypeDescriptor>
</Parameter>
Note that although this 'dummy' parameter will be passed into the query execution, it will be
ignored. The final change we need to make is to add the same dummy parameter to the
destination entity. This is analogous to saying: for any given entity instance selected in the BDC
list, the related list should retrieve the row with the specified identifier value.
<Parameter Name="LASTNAME" Direction="In">
  <TypeDescriptor Name="LASTNAME" TypeName="System.String"
      IdentifierName="LASTNAME" IdentifierEntityName="EmployeeEntity" />
</Parameter>
Notice that for the destination entity parameters we use the additional IdentifierEntityName
attribute to reference back to the identifier declared on the source entity. Now we can turn our
attention to getting the profile page created for EmployeeEntity. If we add a SpecificFinder
method to it, as shown below, the ADF import error will disappear.

<MethodInstance Name="EmployeeTypesSpecificInstance"
    Type="SpecificFinder"
    ReturnParameterName="EmployeeTypes"
    ReturnTypeDescriptorName="EmployeeTypesDataReader"
    ReturnTypeDescriptorLevel="0" />
Figure 26
We can see what entity profile pages are created for us by looking at the BDC application
definition. Figure 26 shows the entities listed. Clicking on the EmployeeEntity link will show
Figure 27. We can then use the URL shown (highlighted in red) to retrieve a single entity
instance. The URL shown is:
http://bts01:2126/ssp/admin/Content/EmployeeEntity.aspx?WORKDEPT={0}&LASTNAME={1}
Figure 27
The {0} and {1} must be replaced with actual values. Figure 28 shows an example of this, with
each replaced by A00 and HAAS respectively. The profile page shows all the fields defined in
the return parameter's type descriptor that have been retrieved by the executed SQL query. The
full ADF is provided in Appendix D.
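Substituting the identifier values is just a string-format operation over the URL template shown above (the host and port are from the example environment; the escaping call is an addition for safety, not something the profile page requires):

```csharp
using System;

public static class ProfileUrl
{
    // Profile page URL template with {0}/{1} identifier placeholders.
    const string Template =
        "http://bts01:2126/ssp/admin/Content/EmployeeEntity.aspx" +
        "?WORKDEPT={0}&LASTNAME={1}";

    public static string For(string workDept, string lastName)
    {
        return string.Format(Template,
            Uri.EscapeDataString(workDept),
            Uri.EscapeDataString(lastName));
    }
}
```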
Figure 28
Integrating with WebSphere MQ
One of the most exciting features of HIS 2009 is the new IBM WebSphere MQ WCF channel
stack. This should be seen as the natural successor to the BizTalk MQ adapter, allowing unified
cross-platform messaging and integration. It is owned by the same product group, benefiting
from many years of practical experience of working with customers and listening to their
requirements. The MQ channel supports the following features:

- Transactional sends and receives
- MQ header support that's extensible
- Ordered delivery (singleton pattern)
- Once-and-only-once delivery
- Fully configurable poll/wait intervals and retries
- Poison message (via dead letter queue) configuration
- Perfmon counters
First, a quick overview. WebSphere MQ is an asynchronous message queuing technology akin
to Microsoft's MSMQ. The principal difference between them is that MQ is implemented on over
18 different platforms (so not just Windows), making it very popular in large and medium-sized
enterprises for software integration. Due to its asynchronous behavior, it suits disconnected or
temporary-connection scenarios where sender and receiver do not have to be online at the
same point in time and where the receiver has no knowledge of the sender. The queue is the
only shared resource and is itself the rendezvous point for integration. It allows the passing of
messages from one platform to another, transactionally if required, enabling once-and-only-once
delivery. MQ allows higher-order message exchanges such as request/reply (essentially
correlated responses) to be easily built on top to create RPC-style semantics. MQ also supports
carrying both a message payload and headers, where the headers can be a mixture of
IBM-defined and your own, enabling extensibility. Typically, headers are used to control the
behavior of the message handling process and are set on a per-message basis. MQ handles
large messages through segmentation: the breaking up of a large message (up to a maximum
size of 100MB) into a number of smaller ones, which are then automatically (or manually)
reassembled at the other end. To use the channel you will need to install MQ 7.0 (a 90-day free
trial is available). Both derivatives of MQ are supported: the Base and Extended Transactional
clients. The channel has no dependency on IBM's .NET managed wrapper.
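The request/reply pattern mentioned above is easy to sketch: the reply carries the request's correlation identifier so the caller can match it to the request it sent. Queue&lt;T&gt; stands in here for real MQ queues; none of this is the channel's actual API.

```csharp
using System;
using System.Collections.Generic;

public class MqMessage
{
    public string CorrelationId;
    public string Body;
}

public static class RequestReply
{
    // Server side: drain requests, post replies carrying the same id.
    public static void Serve(Queue<MqMessage> requests,
                             Queue<MqMessage> replies,
                             Func<string, string> handler)
    {
        while (requests.Count > 0)
        {
            MqMessage request = requests.Dequeue();
            replies.Enqueue(new MqMessage
            {
                CorrelationId = request.CorrelationId, // echoed for matching
                Body = handler(request.Body)
            });
        }
    }

    // Client side: find the reply correlated with our request, if present.
    public static string AwaitReply(Queue<MqMessage> replies, string correlationId)
    {
        foreach (MqMessage reply in replies)
            if (reply.CorrelationId == correlationId)
                return reply.Body;
        return null;
    }
}
```

Because the only shared resource is the pair of queues, neither side needs to know where the other runs, which is precisely the disconnected behavior described above.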
Using the MQ channel is no different to using any other channel with WCF, so if you are already
familiar with WCF you are off to a flying start. No host is provided (that is coming in the Dublin
timeframe), so the options are the same as usual: create your own (a Windows service, Forms
or console app, for example) or use IIS (through .svc configuration). Windows Process
Activation Service (WAS) support is planned, requiring an MQ listener adapter, and this will be
available in a later release. For more details on WAS and WCF hosting, see the references
section at the end of this paper. Here we will look at hosting in MOSS through the use of
another custom Web part. We can make this generic, such that the Web part can take the
message contents from another Web part and put them into a queue/queue manager configured
on the Web part itself. Let us see how this is done.
Web Part for MQ

Consider the scenario where we wish to place a message on a queue as a result of an action in
the UI of our MOSS application. We need to create a Web part that can be connected to
another that will provide the data we will place in the message sent to an MQ queue. We'd also
like the queue and queue manager that are used to be configurable on the Web part itself, so
that we can reuse this Web part in the future. For brevity (and because you are an expert now),
I will cover only those aspects of Web part creation we haven't looked at yet.
In this next example, we‘re going to create a BDC-aware Web part. With MOSS, as we‘ve seen,
there are a number of out of the box (OOB) BDC Web parts that can pull data from the entities
you (or others) create (from imported ADF files). These Web parts provide functionality such as
displaying items or lists of items from the entity. Indeed, this is the primary advantage over
traditional approaches as both the Web parts and BDC definitions are so readily reusable and
consumable across different Web pages and applications. To use the BDC from your own Web
parts requires some understanding of the BDC assemblies and APIs within MOSS so let us start
there. The main assembly we will use is Microsoft.SharePoint.Portal and, despite its slightly misleading name, it is only present in MOSS. By default, you will find it located at:
C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI
There are two namespaces of interest to us here that contain a number of interesting classes that we'll use for calling the BDC and retrieving data. These are:
Microsoft.Office.Server.ApplicationRegistry.Runtime
Microsoft.Office.Server.ApplicationRegistry.MetadataModel
The MetadataModel namespace provides an object-based representation of the BDC. For example, the Entity class represents an <entity> defined in the BDC, while the View class represents the type of method required for the entity (such as SpecificFinder). The Runtime namespace provides access to the runtime objects of the BDC. The principal type here is the IEntityInstance interface. This is used to represent an instance of an entity, which can be queried, filtered, etc. to harvest the data provided by the BDC.
Connecting Web parts

SharePoint allows the creation of connectable Web parts that can interact with one another and exchange data. A Web part can act as a provider, a consumer, or even both. The overriding goal of the Web part architecture is to encapsulate discrete functionality such that an application's complexity remains manageable as it grows. To indicate that a method in a Web part is to act as the receiver or consumer of something is as simple as decorating it with the following attribute:
[System.Web.UI.WebControls.WebParts.ConnectionConsumer("Message")]
On a Web part page, this will then be selectable as shown in Figure 29.
Figure 25. Connecting Web Parts.
The method itself is defined as follows:
public void SetConnectionPoint(IEntityInstanceProvider provider)
The ‗something to consume‘ is represented by the single parameter of type
IEntityInstanceProvider. This represents the mechanism to obtain an instance of an entity in the
BDC and is defined in Microsoft.SharePoint.Portal.WebControls. Although this interface is
documented as ‗Internal use only‘ it is the BDC architecture‘s mechanism used by all the out of
the box BDC Web Parts. This warning really refers to the fact that it should not be implemented
by your code. The method itself will be called exactly once by the WebPartManager every time
any entity instance it is associated with (via the Connection in the UI) changes, which generally
forces a page repaint.
The mechanism of passing an interface is based on the approach handed down from ASP.NET
2.0 Web parts whereby an interface is declared to describe the data to be passed from the
provider to the consumer. The provider will create and populate the object implementing the
interface and the consumer then receives the reference and can extract the data. In MOSS, the
BDC provides this infrastructure, using the ADF entity definitions to define what is passed.
Let us consider how this whole process bootstraps from a page-loading perspective. When a Web Part acts as both a provider and a consumer to perhaps many other Web Parts, it's obvious that something is needed to control and coordinate this process, otherwise problems could occur. For example, a circular reference (between two Web parts acting as provider and consumer to one another) could cause an endless processing loop. Again, this task falls to the WebPartManager. Figure 15, shown earlier, shows the event model of the Web Part, the WebPartManager and the hosting Web page (.aspx). You should note that it is unsafe to use a provider's passed interface instance until the PreRender event. This is because it is only at this point that all Web Parts have been loaded, initialized and connected. It is still possible, however, to create a circular reference, where two Web Parts are both providers and consumers to one another; the WebPartManager takes care of ensuring that each is initialized only once. Some thought is therefore required on how Web Parts obtain their data, as they must have obtained it prior to the manager requesting it and passing it to the Web Part's consumers. In this way, the WebPartManager can ensure that traditional referencing issues are avoided. By default, a Web Part can act as a provider connection to many consumers but can have only one consumer connection to any given provider.
I mentioned earlier that associations and filters are similar in nature. The reason is that both allow the passing of a data item from one Web Part to another. As we've seen, in the case of associations this must be defined statically in the ADF, but with filters we can specify as many as we want and write the SQL queries (or stored procedures) to base their logic on whatever values they receive. This makes filters a more dynamic option, as filter values from one Web part can also be used as a provider connection to another without the need for associations.
WCF Channel for WebSphere MQ

To work with the MQ Channel we need to add the following project references:
System.ServiceModel.dll
System.ServiceModel.Channels.dll
System.ServiceModel.Channels.WebSphereMQ.Channel.dll
In addition, we need to reference the following assemblies. These contain the BDC API and the standard SharePoint libraries:
Microsoft.Sharepoint.dll
Microsoft.Sharepoint.Portal.dll
These can be found in
C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI
The following using clauses are also required in our class file's source:
using System;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.Office.Server.ApplicationRegistry.Runtime;
using Microsoft.SharePoint.Portal.WebControls;
using Microsoft.Office.Server.ApplicationRegistry.MetadataModel;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Channels.WebSphereMQ;
using System.ComponentModel;
Let us take a closer look at our consumer code. The IEntityInstanceProvider reference passed in contains a reference to an entity instance. Recall from our earlier look at the BDC and ADF definitions that an entity is a formal declaration that represents a database query (via either SQL or a stored procedure call) or a Web service operation call. An entity instance is therefore a runtime instantiation of the entity. Our method will be passed this after it has been evaluated by the BDC infrastructure, such that it is populated with the data it represents. What we need to get hold of is the data that the entity instance is holding.
To do this, consider the following code:
[ConnectionConsumer("Message")]
public void SetConnectionPoint(IEntityInstanceProvider provider)
{
    Entity entity = provider.SelectedConsumerEntity;
    View view = entity.GetSpecificFinderView();
    if (view == null)
    {
        return;
    }
    IEntityInstance inst = provider.GetEntityInstance(view);
    if (inst == null)
    {
        return;
    }
    System.Data.DataTable dt = inst.EntityAsDataTable;
    if (dt == null)
    {
        return;
    }
    if (dt.Rows.Count == 0)
    {
        return;
    }
    m_Message = dt.Rows[0][0].ToString();
}
This code is quite defensive (and I have removed the exception handling) because this method is called more than once, as noted earlier. In fact, the first time it's called the provider won't have been evaluated or contain data, hence the null checks. When it does contain data, you can see that we simply work with standard ADO.NET types (DataTable, DataRow, etc.) to process it. Here we are pulling the first column of the first row and storing it in m_Message, a class-level member variable of type string, for use by our Web Part later. We will use this data item as our message.
To make our Web Part configurable, we want to be able to specify details such as the MQ Queue Manager and Queue name at design time. We can easily achieve this by creating public properties and decorating them with the [WebBrowsable] attribute defined in the System.Web.UI.WebControls.WebParts namespace. This attribute takes the name and type of the property and is used to render the 'configure Web Part' UI (see Figure 30). The code takes the following form:
string m_QueueManager;

[WebBrowsable(true),
 Category("MQ Configuration"),
 Personalizable(PersonalizationScope.Shared),
 DefaultValue(""),
 WebDisplayName("Queue Manager"),
 WebDescription("The name of the MQ Queue Manager to use.")]
public string QueueManager
{
    get { return m_QueueManager; }
    set { m_QueueManager = value; }
}
Next, let us look at the WCF channel code that will actually put the message received from
another Web Part in the Queue specified in the UI configuration. This is shown below:
void PutMsg()
{
    WebSphereMQBinding binding = new WebSphereMQBinding();
    binding.ConnectionType = m_ConnectionType;
    binding.MqcdChannelName = m_Channel;
    binding.MqcdTransportType = m_Transport;
    binding.MqcdConnectionName = m_ConnectionName;
    ChannelFactory<ITestNet> factory = new ChannelFactory<ITestNet>(binding);
    factory.Open();
    string epa = "net.mqs://" + m_QueueManager + "/" + m_Queue;
    EndpointAddress to = new EndpointAddress(epa);
    ITestNet proxy = factory.CreateChannel(to);
    proxy.TestMethod1(ViewState["message"] as string);
    ((IChannel)proxy).Close();
}
This code first creates an instance of the MQ binding and sets a number of binding-specific properties on it. The URI for the endpoint address is structured as net.mqs://queuemanager/queue, broken down as follows:
Scheme        Queue Manager         Queue
net.mqs://    QM or QM{cluster}     MyQueue
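A quick way to internalize the scheme is a small parser that splits an address back into its parts (a plain-string sketch; the real channel performs its own validation):

```csharp
using System;

class MqAddressSketch
{
    // Split a net.mqs address into queue manager and queue,
    // following the scheme/queue-manager/queue layout above.
    static void Parse(string address, out string queueManager, out string queue)
    {
        const string scheme = "net.mqs://";
        if (!address.StartsWith(scheme))
            throw new ArgumentException("Expected a net.mqs:// address");

        string[] parts = address.Substring(scheme.Length).Split('/');
        queueManager = parts[0];
        queue = parts[1];
    }

    static void Main()
    {
        string qm, q;
        Parse("net.mqs://QM/MyQueue", out qm, out q);
        Console.WriteLine(qm + " " + q); // QM MyQueue
    }
}
```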
Finally, a ChannelFactory is created. The ChannelFactory is used to instantiate the channel for the URI-based address, so it is given our MQ binding and EndpointAddress. The factory is generic; its type parameter specifies the service's contract. The contract is an interface that specifies the service operations. The operations in turn specify their exchange pattern (one-way, etc.) and types (what is passed into and returned from the operation). The contract type is shown in the following code. For more background reading on WCF, see the references section. In our example, we have one operation, "TestMethod1", that simply takes a string. This string will be the payload of our MQ message. We can now call the method, passing the message, which will create a message on the queue. The last thing we do is close the channel. As a side note, it's worth bearing in mind that creating a ChannelFactory is an expensive operation, and in production code it's advisable to keep hold of it across requests and reuse it to improve performance.
[ServiceContract]
interface IMQActions
{
    [OperationContract(IsOneWay = true, Action = "PutMsg")]
    void PutMsg(string x);
}

class MqChannel
{
    static int numberOfMessagesReceived = 0;

    // Server class which implements the interface.
    [ServiceBehavior]
    class MQActions : IMQActions
    {
        string m_Message = "";

        [OperationBehavior]
        public void PutMsg(string x)
        {
            m_Message = x;
            numberOfMessagesReceived++;
        }
    }
}
A Web Part renders HTML to the browser. There are two events important to this: OnPreRender and Render. In our example, the Web Part will render its state (queue/queue manager) and a button that, when clicked, will place a message on the queue.
protected void PutButton_Click(object sender, EventArgs e)
{
    PutMsg();
}
The complete code is provided in Appendix E.
Putting it all Together

As we saw before, to test our Web part we first need to deploy it, add it to a new Web page and configure it. Then we can use our new page to start putting messages on the queue. To verify this is working we can use the WebSphere MQ Explorer.
This time, instead of manually importing it, we will deploy it using a Feature. Features were briefly mentioned at the beginning of this article and are a way to package and deploy a piece of functionality. Features can be enabled or disabled for a given site, as well as encouraging the packaging of a number of components as a unit.
There are several steps involved in this process:
1. Create a manifest.xml file
2. Create a feature xml file
3. Create an elements xml file
4. Create a webpart file
5. Create a WSP file
6. Run STSADM.EXE
The Visual Studio extensions for WSS (VSeWSS) can make this process significantly easier, as they automate nearly all of it. Note that I am using the 1.3 CTP version of VSeWSS, so some aspects may change prior to final release. A number of SharePoint project types are provided, including a Web Part one. Three files will be visible in the project: a .cs (or .vb) file, a .webpart file, and an elements file. The source file is where we will put our implementation shown above. The .webpart file is an XML document that contains a basic description of the Web Part; this is the file we created manually earlier. What you enter in this file is what will show up in the Web Part Catalog used to add Web Parts to a page. The final file, the elements file, is another XML document that contains a reference to the .webpart file.
The next step is to press F5 (Debug), which will complete the file creation and deployment process. The following files are then created: feature.xml, manifest.xml, the .wsp file and, of course, the Web Part assembly itself. These are all added to the debug folder under the project. The feature file contains references to the .webpart and elements files. The manifest file holds the details of the features being packaged and the Web Part assembly. It also specifies the configuration shown below, which adds the Web Part to the <SafeControls> section of the target site's web.config file. This is necessary in order for the Web Part to be trusted. Finally, the .wsp file is the Windows SharePoint Package file that can be used to deploy the Web Part to another machine. This file contains everything necessary to deploy the Web Part.
<SafeControls>
  <SafeControl Assembly="mqwebpart, Version=1.0.0.0, Culture=neutral,
    PublicKeyToken=9f4da00116c38ec5" Namespace="mqwebpart"
    TypeName="WebPart1" Safe="True" />
</SafeControls>
To deploy the .wsp to another server the STSADM admin tool is used as follows:
stsadm.exe -o deploysolution -name Samples.wsp -url http://localhost
STSADM.EXE is located by default in C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN. Two parameters are required: the name of the .wsp file and the URL to deploy to (this could be an existing site, for example).
Once the Web Part is deployed, we can add it to a page. As before, under site settings, click 'New Web Part page' to add a new .aspx page to the associated document library. After giving the page a name and choosing its layout, we can add our new Web Part by clicking one of the "Add a Web Part" links. Our new Web Part will appear under the miscellaneous section. Click the edit dropdown and select "Modify Shared Web Part". Scroll down in the right-hand editor pane that now shows and, under MQ Settings, expand the "+". This displays our two MQ properties, Queue Manager and Queue name. Enter the values shown in Figure 30.
Figure 30. Configure MQ Web Part
Now we need to add a Web Part to connect to ours that will supply the data for the message we are going to PUT to MQ. For this we'll employ the BDC Data List we used earlier that pulls data from our DB2 Employee BDC definition. Follow the same steps as before to add and configure this Web Part. The page now looks like Figure 31. Once done, we need to connect them. The BDC List will be the provider and the MQ Web Part will be the consumer. We can actually specify the connection on either the provider or the consumer; Figure 32 illustrates this. An important point to bear in mind is that to enable a BDC Data List to act as a provider to another Web part, it must provide both a Finder and a SpecificFinder. The Finder is used to display the list, while the SpecificFinder is used to retrieve the individual selected instance (remember that a radio button is added to the UI) and pass it to the consumer(s).
Figure 26
Now that we have completed the deployment and page creation, we can test it and see the messages appearing in MQ. For this, open the IBM WebSphere MQ Explorer. Select an item in the BDC List and then click the button on the Web Part. This will write the selected item to a message on the queue, as shown in Figure 33.
Figure 27. Connecting Web Parts.
Figure 28. WebSphere MQ.
Review
Let us recap. The MQ Channel stack for WCF allows you to get and put messages from and to MQ queues through the standard WCF architecture and APIs. The primary usage of MQ is application integration. The ability to send and receive messages asynchronously across platforms is very powerful. As you have seen, we can host this in SharePoint to enable the sending of messages from our MOSS applications. There are currently a few limitations, however. The channel only supports one-way operations. This means that creating request/reply or solicit/response message patterns will require additional work on your part. As I have mentioned, these scenarios are typically modeled by correlating one message to another, either on the same queue or via a reply queue that contains the responses; MQ supports this with the MQMD header to provide simple correlation. Another obvious limitation is that you must provide your own host. A WAS listener adapter is coming that will provide hosting via the Activation Services infrastructure, which will mostly alleviate this.
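The correlation approach mentioned above can be sketched in isolation: remember each outstanding request under an identifier and match replies against it. This is illustrative only; with real MQ the identifier would travel in the message's MQMD CorrelId field, the queuing itself is not shown, and a production version would need thread safety:

```csharp
using System;
using System.Collections.Generic;

class CorrelationSketch
{
    // Outstanding requests keyed by correlation id.
    static readonly Dictionary<Guid, string> pending = new Dictionary<Guid, string>();

    static Guid SendRequest(string payload)
    {
        Guid correlationId = Guid.NewGuid();
        pending[correlationId] = payload;   // remember what we asked
        // ...the payload would be put on the request queue here,
        // carrying correlationId so the responder can echo it back...
        return correlationId;
    }

    static string OnReply(Guid correlationId, string replyPayload)
    {
        // Match the reply to the original request and complete it.
        string request;
        if (!pending.TryGetValue(correlationId, out request))
            throw new InvalidOperationException("Unknown correlation id");
        pending.Remove(correlationId);
        return request + " -> " + replyPayload;
    }

    static void Main()
    {
        Guid id = SendRequest("balance?");
        Console.WriteLine(OnReply(id, "42.00")); // balance? -> 42.00
    }
}
```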
Using the ADO.NET Data Provider for Host Files
Now that we are Web part and BDC experts let us consider the need to surface host file data in
our SharePoint Web application. A new feature of the HIS 2009 Host File Managed Provider is
the ability to read files in offline mode. This makes it possible to download a file from the host
environment and use it locally without the need to have a host connection, which is a great
productivity boost. If you are thinking that you just have to create an LOB system in the BDC using the Host File provider, hook it up to a BDC Web part and you are done, you are going to be disappointed. One limitation of the BDC is that it currently does not support .NET managed providers. This leaves us no option but to create a custom Web Part to retrieve the data. Let us look at how easy this is now that we have mastered the basics. To save space and time, I will base this Web part on the TI Web part I showed you earlier.
To use the Host file provider we need to add a reference to the
Microsoft.HostIntegration.MsHostFileClient assembly and a using statement to our class:
using Microsoft.HostIntegration.MsHostFileClient;
The bulk of the work will be performed in the OnPreRender method. This will be where we read
the host file data and bind it to an SPGridView as we did with the TI data. The code for this is
shown below. This method handles the host file query and resultant dataset, which can then be
bound to the SPGridView.
protected override void OnPreRender(EventArgs e)
{
    DataSet result = new DataSet();
    HostFileCommand cmd = new HostFileCommand(
        new HostFileConnection(ConnectionString));
    cmd.CommandText = Query;
    HostFileDataAdapter adapter = new HostFileDataAdapter();
    adapter.SelectCommand = cmd;
    adapter.Fill(result);
    cmd.Dispose();

    grid.Columns.Clear();
    foreach (DataColumn dc in result.Tables[0].Columns)
    {
        if (dc.ColumnMapping == MappingType.Element)
        {
            SPBoundField fld = new SPBoundField();
            fld.HeaderText = dc.ColumnName;
            fld.DataField = dc.ColumnName;
            grid.Columns.Add(fld);
        }
    }
    grid.DataSource = result;
    grid.DataBind();
}
The final task to complete is to add a couple of getters and setters to allow the configuration of
our Web part. The code is given below. These follow the same pattern as we have already seen
and allow the connection string configuration and the query to execute to be specified in the
editor zone (see Figure 34).
Figure 29. Host File Web Part Configuration.
string m_ConnectionString;

[WebBrowsable(true), Category("Data Connection"),
 Personalizable(PersonalizationScope.Shared), DefaultValue(""),
 WebDisplayName("Connection details"),
 WebDescription("Details of the connection to the host file.")]
public string ConnectionString
{
    get { return m_ConnectionString; }
    set { m_ConnectionString = value; }
}

string m_Query;

[WebBrowsable(true),
 Category("Data Connection"),
 Personalizable(PersonalizationScope.Shared),
 DefaultValue(""),
 WebDisplayName("SQL Query"),
 WebDescription("Query to execute against the specified connection details.")]
public string Query
{
    get { return m_Query; }
    set { m_Query = value; }
}
An example connection string is shown below.
Provider=SNAOLEDB;Metadata="C:\WoodgroveBankFiles.DLL";Local Folder="C:\"
Only three pieces of information are required: the provider name, the metadata assembly and the folder in which the file is located. The metadata assembly is created using the Host File project type in Visual Studio. In this case, I am using the SDK sample, which is in the SDK\Samples\EndToEndScenarios\WoodgroveBank\Account Management folder under C:\Program Files\Microsoft Host Integration Server 2009 by default. If you open the solution, you will find the WoodgroveBankFiles project, and double-clicking the Host file library (WoodgroveBankFiles.DLL) will open the designer window shown in Figure 35.
Figure 30. Host File Designer.
The principle here is simple, as you will know if you have ever opened a data file in Notepad: until you understand the structure of the contents, it is meaningless. The layout of a file must be defined to enable the fields and records in it to be identified, and that requires an understanding of the data types and field lengths contained within. In Figure 35 you can see that the WoodgroveBankFiles assembly contains the definitions of three files: Customers, Accounts and Transactions. The right-hand pane is open on the Host column definitions for the Customers table. The most important information here is the column definitions, showing where each column starts and ends and its datatype. These fields are shown under the CUSTOMER_RECORD type in the left-hand pane. Host File definitions may be created in several ways: manually in the designer or, more commonly, by importing a COBOL copybook that defines the layout or by retrieving the file details for AS/400 files. Both of the latter expedite the process of creating the file definition.
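To get a feel for what those column definitions describe, here is a hand-rolled fixed-width parser. This is a sketch only: the record layout, offsets and field names are invented for illustration, and in practice the Host File provider does this work for you from the metadata assembly:

```csharp
using System;

class FixedWidthSketch
{
    // A fictional fixed-width record: name in columns 0-9,
    // account number in 10-15, balance (2 implied decimals) in 16-23.
    static void Main()
    {
        string record = "SMITH     00123400001999";

        string name     = record.Substring(0, 10).TrimEnd();
        string account  = record.Substring(10, 6);
        decimal balance = decimal.Parse(record.Substring(16, 8)) / 100m;

        Console.WriteLine(name);     // SMITH
        Console.WriteLine(account);  // 001234
        Console.WriteLine(balance);  // 19.99
    }
}
```

Without the offsets and types, the record above is just an opaque run of characters, which is exactly the problem the metadata assembly solves.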
The query specified in Figure 36 is
SELECT * FROM CUSTOMERS
Notice that no file name is actually given in the connection string. This works because the file name is present in the metadata definition, in the properties for the Customers table: a property called Host File Name. Its value is used in conjunction with the Local Folder in the connection string to find a file with that name at that location. The SQL syntax is very convenient, as it is familiar to most developers. In our example, all columns will be returned as part of the query. The final result is shown in Figure 36.
Figure 31
As before, the SPGridView used here renders a SharePoint-style grid with the usual chrome, making it fit in with the Microsoft-provided Web parts. There are plenty of improvements that could be made. For example, making this a connection provider would enable another Web part, such as a BDC Data List, to receive data from our Web part and retrieve related data from another source. We could also implement filters and sorting to mimic the OOB Web parts and increase the utility of our Web part in the future. I'll leave that to the reader to have fun with.
Applying the Entity Framework for use with DB2
The final new feature in HIS 2009 we will look at is the DB2 Entity Framework (EF) provider. The Entity Framework is a relatively new technology that shipped with .NET 3.5 SP1 in the fall of 2008. It enables the modeling and separation of a physical database schema from its logical representation. This abstraction enables one to change independently of the other, isolating clients from the impact. A complementary technology, ADO.NET Data Services (originally code-named Astoria), is implemented on top of the EF, enabling the projection of the logical model as a RESTful Web service implementation. A primary principle of REST is that resources are addressable. This typically translates to using REST in data-oriented scenarios, where the URI syntax and HTTP verbs (such as GET and PUT) can be used together to retrieve, insert and update data.
The new EF DB2 support allows the creation of models using the EF visual designer in Visual Studio. The purpose of the provider is twofold: first, it allows the retrieval of database schema information to build the model; second, it generates the appropriate SQL at runtime, enabling you to focus on application functionality rather than the specifics of the database access implementation. Microsoft currently provides SQL Server support (and Oracle via CodePlex), but third parties have already provided support for other data sources as well. As a developer, this is a boon, as you are shielded from the many differences between the SQL dialects each database uses. Let us look at an example.
Figure 32. Entity Framework Model.
Figure 37 above shows two entities in the designer, which renders the EDMX, the underlying XML-based definition. The designer allows round-tripping between the visual design surface and the EDMX. The EDMX contains a number of things: a storage model, a conceptual model and the mapping between the two. It is this separation that gives the EF the ability to manage changes in a compatible way so as not to impact clients of the model. The storage model is defined in terms of the underlying data source. For DB2, this contains such details as the ADO.NET provider, the database tables, column names and the relationships between them. For example, the cardinality between the EMPLOYEE and DEPARTMENT entities is added automatically based on the underlying table constraints and relationships. The conceptual model drives the entity definitions from a client's perspective, such as the names of entities and their properties, and finally, the mapping describes the relationship between the two. For more information on the structure of the EDMX, see the references section. When the EDMX is compiled, the output is a .NET assembly that can be referenced from your code.
SharePoint (WSS and therefore MOSS) supports the use of Windows Workflow (WF). WF is a natural complement to the document collaboration features of SharePoint because documents often have a lifecycle or process they must follow, such as an invoice approval process. MOSS augments WSS with a number of predefined workflow templates (including approval), but you are free to create your own custom workflow templates as well. However, there is no way to do this in the browser; you must use Visual Studio to create them. A workflow template is simply a workflow that has been deployed to SharePoint, generally as a Feature.
Here I am going to create a workflow and two custom activities that use the DB2 EDMX created earlier and update the database based on data passed from a Web page in MOSS. The workflow will be triggered by the addition of a new document to a document library. The first activity we will look at is the DB2Activity. The data access code in the activity is shown below; I am employing LINQ to Entities to enable the querying of the DB2 SAMPLE database using our model from the assembly created from the EDMX.
using System.Data.Entity;
using System.Data.Objects;
using System.Data;

protected override ActivityExecutionStatus Execute(ActivityExecutionContext context)
{
    IEnumerable<EMPLOYEE> employee =
        (from emp in testEntities.EMPLOYEE.Include("DEPARTMENT")
         where emp.LASTNAME == Path.GetFileNameWithoutExtension(FileName)
         orderby emp.HIREDATE descending
         select emp).AsEnumerable();
    ReturnDataSet = employee.ToList<EMPLOYEE>();
    return ActivityExecutionStatus.Closed;
}
Even if you are not familiar with LINQ, hopefully this is easy to understand. Here, I am querying the EMPLOYEE table for all rows where the LASTNAME column is equal to the name of the file (minus its extension) that triggered the workflow. The results are ordered by HIREDATE and assigned to an IEnumerable<T>, in this case IEnumerable<EMPLOYEE>. One interesting feature of LINQ is that this style of querying results in deferred execution; that is, the database will not be queried until we request its enumerator. However, we can force execution by converting to a List of EMPLOYEEs. This keeps things simpler, as we can then make the list available to subsequent activities, so the second custom activity we'll create, which consumes the data returned, does not need the connection details.
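Deferred execution is easy to demonstrate with LINQ to Objects alone, independent of the Entity Framework code above (a self-contained sketch):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DeferredExecutionSketch
{
    static void Main()
    {
        var source = new List<int> { 1, 2, 3 };

        // The query is only a description at this point; nothing has run yet.
        IEnumerable<int> query = from n in source where n > 1 select n * 10;

        // Mutating the source before enumeration affects the results...
        source.Add(4);
        List<int> forced = query.ToList();           // execution happens here
        Console.WriteLine(string.Join(",", forced)); // 20,30,40

        // ...whereas the forced list is now a stable snapshot.
        source.Add(5);
        Console.WriteLine(forced.Count);             // 3
    }
}
```

The call to ToList plays the same role here as the ToList<EMPLOYEE> call in the activity: it forces the query to run once and captures the results.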
The above code implements the Execute method overridden from SequenceActivity (in the System.Workflow.Activities namespace) in our custom activity. Additionally, we can provide properties for an activity, allowing the provision of values in the UI and enabling the activity to be configured differently for different contexts. The code below shows a property I have added to allow the connection details to be specified.
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
[BrowsableAttribute(true)]
[DescriptionAttribute("Database connection string")]
[CategoryAttribute("DB2")]
public string ConnectionString
{
    get
    {
        return (string)base.GetValue(DataActivity.ConnectionStringProperty);
    }
    set
    {
        base.SetValue(DataActivity.ConnectionStringProperty, value);
    }
}
Figure 39 shows the entire workflow. The first activity is added for us when we create a new SharePoint sequential workflow from the project templates added by VSeWSS. This activity, OnWorkflowActivated, gives us access to the SharePoint context details populated when SharePoint starts a workflow instance. To do this we must use a binding, a mechanism to associate a variable (property or field) declared in a workflow with a property of an activity. This is very powerful, as it is the primary mechanism for passing data between activities via the workflow. The binding dialog is shown in Figure 38.
Figure 33. Property Binding.
OnWorkflowActivated is contained in the Microsoft.SharePoint.WorkflowActions assembly. There are several other activities there, one of which is the LogToHistory activity. This enables us to write out messages that are viewable in SharePoint against the workflow instance history that it tracks.
Figure 34. SharePoint Workflow.
Figure 39 shows the configuration of the workflow activities (a workflow is itself an activity) through the use of the property sheet to specify absolute values or bindings to other properties. Our first custom activity, the DB2 activity, is next in the sequence. This retrieves the data from DB2 and makes it available to other activities. The While activity allows looping until a condition is met. Here, I am iterating over each row returned by the DB2 activity. I have done this by maintaining an index variable, which is incremented inside the loop and compared to the number of EMPLOYEEs we have. You can see that the code (which is in the workflow definition) references the DB2 activity's ReturnDataSet property, which was assigned the results of the query earlier. The ConditionalEventArgs parameter has a Boolean Result property, which is assigned the result of the comparison. By returning true, the test is satisfied and the contained activities will be executed.
private void testCondition(object sender, ConditionalEventArgs e)
{
IEnumerable<DB2Entities.EMPLOYEE> employees =
Activity1.ReturnDataSet.OfType<DB2Entities.EMPLOYEE>();
e.Result = index < employees.Count<DB2Entities.EMPLOYEE>();
}
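The body of the While loop must also advance the index after processing each row; a minimal sketch of the matching CodeActivity handler (the handler name is mine, not from the original project) looks like this:

```csharp
// Hypothetical CodeActivity handler placed inside the While loop.
// 'index' is the same workflow field compared in testCondition above;
// incrementing it here lets the loop terminate once every row is processed.
private void incrementIndex(object sender, EventArgs e)
{
    index++;
}
```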
The next activity is another custom activity I have created, the BDCInvoker. I alluded earlier to
the fact that the BDC allows both the retrieval and manipulation of data. We have covered the
retrieval side in depth, so let's consider manipulation: the BDC is not constrained to just fetching
data but allows updates too. In the context of a database, this essentially means inserts,
updates and deletes. The GenericInvoker method type allows the specification of any SQL in
the same way as the (Specific)Finders, but without the same expectations or constraints with
regard to output parameters. Replaceable parameters can be specified in the ADF in the same
way as we have already seen, to pass data in and out, and can also be bound to filters. The
downside is that there are no provided Web Parts capable of consuming GenericInvoker
methods, so we have to write our own. However, I have chosen to implement this capability as
an activity so we can trigger BDC updates via WF.
The bulk of the work is performed in the Execute method as before, so let's look at that first.
protected override ActivityExecutionStatus Execute
    (ActivityExecutionContext context)
{
    object[] parameters = new object[2];
    List<EntityFramework.EMPLOYEE> l = DataSet as
        List<EntityFramework.EMPLOYEE>;
    parameters[0] = l[Position].EMPNO;
    parameters[1] = l[Position].EMPNO;
    SqlSessionProvider.Instance().SetSharedResourceProviderToUse
        (SharedServiceProvider);
    NamedLobSystemInstanceDictionary instances =
        ApplicationRegistry.GetLobSystemInstances();
    LobSystemInstance instance = instances[LOBSystemInstance];
    Entity entity = instance.GetEntities()[Entity];
    MethodInstance methInst = entity.GetMethodInstances()[MethodName];
    SPSecurity.RunWithElevatedPrivileges(delegate()
    {
        entity.Execute(methInst, instance, ref parameters);
    });
    return ActivityExecutionStatus.Closed;
}
This code uses the values of several properties set at design time. LOBSystemInstance, Entity
and MethodName we have seen before, and you should recognize them straight away. They
refer to the BDC LOB system name that we imported from our ADF, the Entity we wish to use
and the method instance name to invoke. We also need to specify the name of the Shared
Service Provider (SSP), which by default is SharedService1. The reason for this is that the
activity is not necessarily tied to a MOSS context; because of this, it cannot determine the SSP
to use and we must specify it by name. Finally, the DataSet property is bound to a workflow field
to allow access to the data previously retrieved by our DB2Activity. The method in the ADF
looks like this:
<Method Name="UpdateEmployee">
<Properties>
<Property Name="RdbCommandText" Type="System.String">
UPDATE EMPLOYEE SET SALARY = SALARY * 1.1 WHERE EMPNO = ?
</Property>
<Property Name="RdbCommandType" Type="System.String">Text</Property>
</Properties>
<Parameters>
<Parameter Direction="In" Name="@EMPNO">
<TypeDescriptor TypeName="System.String" Name="EMPNO" />
</Parameter>
<Parameter Direction="Return" Name="UpdateStatus">
<TypeDescriptor TypeName="System.String"
Name="SomethingToReturn" />
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Name="UpdateEmployee" Type="GenericInvoker"
ReturnParameterName="UpdateStatus" />
</MethodInstances>
</Method>
Here you can see the SQL provided, together with the parameters on the GenericInvoker
method that will be supplied by the custom activity to update the entity. The input parameter is
taken from our previous activity, which provides the list of EMPLOYEEs. I hope you can also
see that each lucky employee receives a 10% pay rise when the matching file is added to the
document library. The final activity in the workflow is another LogToHistory activity, which logs
that an update has been performed successfully.
The actual execution of the BDC method needs some consideration. Because the workflow is
scheduled by MOSS, it doesn't run under the account of the user who initiated it. Instead, the
workflow runs under the SharePoint system account, which by default won't have access. To
address this, SharePoint provides the ability to execute code under the site administrator
account using the SPSecurity.RunWithElevatedPrivileges method in the Microsoft.SharePoint
assembly. The BDC has several levels of security. The ability to access the BDC is controlled
through the SharePoint Central Administration site, as shown in Figure 40. In addition, the
account under which the Execute method runs must have access to the data source being
accessed. So far the examples have shown either anonymous or database credentials
(username/password) specified in the ADF itself. However, the BDC also integrates with Single
Sign-On (SSO), which allows the Windows principal calling into the BDC either to be resolved to
database credentials in the SSO store or to be passed through in integrated mode. Either way,
you must ensure that the user accessing the BDC API and the data the BDC method accesses
have sufficient privileges.
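To illustrate the SSO option, the LobSystemInstance properties in the ADF would change along these lines. This is only a sketch: the property names follow the BDC metadata schema, but the SSO enterprise application definition name shown is hypothetical.

```xml
<!-- Sketch: resolve credentials from the SSO store rather than embedding
     them in the ADF. "DB2Credentials" is a hypothetical SSO application ID. -->
<Property Name="AuthenticationMode"
    Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.Db.DbAuthenticationMode">RdbCredentials</Property>
<Property Name="SsoApplicationId" Type="System.String">DB2Credentials</Property>
```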
Figure 40
To get this up and running in SharePoint, everything must be deployed and configured.
VSeWSS simplifies this greatly at the expense of some compromise in control. For example, the
necessary files required to package the workflow as a feature and deploy it as part of the build
are provided for you, including a WSP package that can be used to deploy to other servers too.
Our referenced custom activities can be added to the feature and deployed with the workflow as
well. Once deployed, a workflow association must be created in SharePoint. This is shown in
Figure 41. Here I am associating a workflow template with a document library. We are given a
couple of options: to start a new workflow when a document is added, or when it is updated. In
this case, we only want to start a workflow when a new document is added to the library.
This example is perhaps a little contrived, simply to reduce the complexity and the amount to
take in, because there is a lot going on here. Essentially, we have a SharePoint document
library with a workflow configured such that when a new document is added to the library, a
workflow instance is spun up. The workflow retrieves rows from DB2 using LINQ to Entities
where the surname matches the first part of the file name of the document added. Thus, adding
a file called JOHN.TXT will return all rows from the EMPLOYEE table where LASTNAME =
'JOHN'. The filename is accessible in the workflow via the passed-in SharePoint properties. The
workflow then, for each row, invokes a method in the BDC to increase the employee's salary by
10%.
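The filename lookup described above can be sketched in the OnWorkflowActivated handler; the field names here (workflowProperties, lastName) are assumptions, not taken from the original project:

```csharp
// Sketch: derive the LASTNAME query value from the added document's
// file name when the workflow starts. 'workflowProperties' is bound to
// the OnWorkflowActivated activity; 'lastName' is a workflow field.
private void onWorkflowActivated_Invoked(object sender, ExternalDataEventArgs e)
{
    string fileName = workflowProperties.Item.File.Name;             // e.g. JOHN.TXT
    lastName = System.IO.Path.GetFileNameWithoutExtension(fileName); // e.g. JOHN
}
```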
Figure 41. SharePoint Workflow Association.
One reason this example is artificial is that there are other ways to capture data in SharePoint
and pass it to a workflow template that provide much more control and power. The two
mechanisms are to provide a custom .aspx page that is integrated with the template deployment
or to use InfoPath. Unfortunately, space precludes me from covering these, but both provide the
ability to capture and collect information in SharePoint that can be passed and consumed by
WF. This is of course much more flexible than the simple approach I took where I based the
query on the name of the file.
Summary

In this example, I used a LOBSystemInstance configured against DB2 but the real power with
this activity and the BDC is that it can be easily configured to call another method defined in
another ADF to perhaps write to a host file using the HIS 2009 Host File Provider for example.
This ability to reuse componentry quickly and easily is a real productivity boost when combined
with the collaborative framework that SharePoint provides. Of course, we could extend these
activities further, allowing parameterized binding to the BDC for even greater flexibility. In
addition, control over the query formulated against DB2 in the DataActivity would enable more
scenarios to be realized. I encourage you to have fun and experiment with this.
I hope you are starting to see how powerful SharePoint and MOSS can be in enterprise
integration scenarios when coupled with the new features of Host Integration Server 2009. Both
Web Parts and Windows Workflow can be leveraged to create compelling applications powered
by the data and functionality you already have in backend systems and data stores. SharePoint
adds to this rich Web-based collaboration features to bring these assets to the forefront.
Appendix A – Building a Web service using HIS 2009
Rather than describing each element and attribute of the BDC file format, let us jump straight in
with an example. Here, we will use one of the HIS features enhanced in 2009 that allows the
easy creation of a WCF Web service interface from a host program or copybook (COBOL or
RPG) or from scratch. This is achieved right within Visual Studio, and HIS 2009 allows the use
of either Visual Studio 2008 or 2005 for this. The screenshots show the use of 2008 for this
example.
To begin, open Visual Studio, create a new project, and under Host Integration Projects, select
Host Application (see Figure 42). A Host Application project is used to create the Host program
definitions and mappings to enable invocation from .NET or COM. In the Name textbox, enter
Woodgrove Bank and click OK.
Figure 42. Add HIS Project.
You now need to specify the details of the library being added. TI invokes Host functionality via
COM or .NET through the creation of libraries. A library is a container for the type definitions you
will create; it is either a .NET assembly or a COM type library (TLB), and it is also what loads the
TI runtime.
Right-click the project you just created, and choose Add, then Add .NET Client Library.
In the dialog that appears (see Figure 43), select .NET Client Library, enter
BankingLibrary.dll in the Name textbox, and click Add.
Figure 43. Add new library.
The library defines your 'interfaces' to the Host programs. These interfaces are manifested as
actual COM or .NET interfaces with the ability to define methods and their respective
parameters. A method is analogous to a Host program, with the parameters (in/out/inout or
return value) representing the storage area (e.g. the COMMAREA for CICS) that the program
uses to communicate back and forth. In this way a natural, integrated programming experience
is created for .NET developers without requiring detailed knowledge of the Host environment,
languages, or communication. Figure 44 shows the creation of the default interface, IBanking.
This will be added as an interface type to our TI assembly. The Type Restrictions refer to the
ability to limit the schema produced from the definition; other products (e.g. BizTalk) can also be
used for integration via the BizTalk Adapters for Host Systems (see references). As we are
going to create a WCF service, select the "WCF and WebServices" option from the dropdown.
Figure 44. Setting schema type restrictions.
Clicking Next displays the dialog shown in Figure 45. This details the remote environment (i.e.
the Host). Enter the details as shown and click Next.
Figure 45. Remote Environment Settings.
On the confirmation page, click Create to add the library to the TI Project. In the Solution
Explorer, double-click the Banking library. This will open up the library contents in the TI
Designer. To simplify the process of creating the Host interfaces, the designer allows the
importing of a program or copybook from which the interface can be extracted. This is illustrated
in Figure 46. We will use one of the SDK-provided Host programs for Woodgrove Bank - a Host
implementation of a simple banking application. Source is provided for this in both COBOL and
RPG. We will use the COBOL source in our example. For this, right-click the library in the
designer and click Import followed by Host Definition. In the dialog that displays, click the
Browse button and point to the following COBOL program deployed as part of the HIS SDK:
C:\Program files\Microsoft Host Integration Server 2009\SDK\Samples\ApplicationIntegration\
WindowsInitiated\SampleMainframeCode\TCP CICS MSLink\CICSGetAcctsLink.cbl
The dialog in Figure 47 will then load the COBOL program and display it. Check that it shows
the GETACCTS program. This is the part of the Woodgrove Bank application that will return a
list of customer bank accounts.
Figure 47
Click Next to display the dialog in Figure 48 – the Item Options. This dialog specifies the type of
item that is being defined. There are two principal options: as a method on the interface, or as a
type. The type option allows the specification of the data types exchanged with the Host
program. As we will be calling the program, the Method option is required. The program name is
defaulted into the Name text box. This will add the GETACCTS method to our previously
created interface. The next step is to define the method‘s signature. Enter the details as shown
and click the Next button.
Figure 48
Figure 49
Figure 49 shows the extracted DFHCOMMAREA of the COBOL/CICS program we imported.
The COMMunication AREA is the area of Working Storage (i.e. memory) that holds the input
and output data. As you can see, the import process examines the DFHCOMMAREA and has
split it into a repeating group of account details (based on the COBOL OCCURS clause) and
two additional items, NAME and PIN. The checkboxes next to the items allow the selection of a
subset of the variables if you only need to map some of them. Click the top-level
DFHCOMMAREA to select everything and click Next.
Figure 50
Figure 50 adds additional metadata to this process by allowing the definition of whether the
items identified are input parameters, output parameters, or both. Everything defaults to In/Out,
but we will change this to indicate that NAME and PIN are input parameters and the ACCTINFO
structure is the return value. To do this, simply click the icon next to each item and select the
direction required. Once complete, click Next.
The final step in the Interface Wizard, shown in Figure 51, is to define any Data Tables,
Structures or Unions. The Wizard has already marked the ACCTINFO group as a structure
which is what we want. Clicking Next will add the program call definition to the library as shown
in Figure 52.
Figure 51
The designer provides several views over this definition. We can view the generated .NET code
(in C#) by clicking the "Definition" tab in the designer (to see all the code, press ALT+F12). You
can also see the Host code (in COBOL or RPG, depending on what you selected) that was
used to generate it. You can even start from scratch by creating the interface first, in which case
the "Host Data Definition" tab will show the generated Host code required to match it. This can
simplify new development, as the Host code can be exported and then inserted into a new Host
program. Lastly, an XML Schema (XSD) is also created for you. This is useful when working with
typed messages and BizTalk, especially when using the Host Adapters. It makes it
straightforward to create messages of the correct type in BizTalk to pass to the adapter.
Remember, if you use this ability, to select the correct type restrictions to ensure the schema
created is compatible with the technology you plan to use (WCF, ASMX, or BizTalk).
Figure 52
Before finishing the definition, I would like to mention client context. Each method defined on the
interface has an "Include Context Parameter" property, as shown in the property sheet in Figure
53. This allows the addition of a context parameter that can specify additional, per-call
information. For example, the host program to call can be overridden from the defaults specified
earlier (which was GETACCTS, if you recall). This gives a high degree of flexibility in dynamic
scenarios where decisions in code may require routing to different endpoints. Ensure this
property is set to false, as we won't need it for this example. Changing it to true would add a
parameter of type ClientContext to your method's signature, which could then be set
programmatically during the setup of the call parameters and passed by reference.
Figure 45
Figure 53
Saving the library will automatically create the assembly.
The final action we need to take is to deploy the assembly created for us by the designer in the
TI Manager. To do this, launch TI Manager from Start -> All Programs-> Microsoft Host
Integration Server 2009 -> TI Manager. This starts the Microsoft Management Console (MMC)
with three Snap-Ins pre-loaded, TI Manager itself, Internet Information Services (IIS) and
Component Services (COM+).
Figure 54
First, we need to create a remote environment (or "RE") that describes the Host we are
connecting to. For this, we will make use of a host simulator application.
HIS 2009 ships with an offline host simulator called SimHost, shown in Figure 54. You will find it
in C:\Program Files\Microsoft Host Integration Server 2009\System. SimHost allows us to run
our samples without Host connectivity through the use of recorded 'playback' files of Host
interaction. Several of these are provided in the SDK.
A Remote Environment (RE) is defined in the TI Manager MMC snap-in and includes the type of
Host, the connectivity to it, and other details. Figure 55 shows the snap-in.
Figure 55. TI Manager
Expand the Windows-Initiated Processing node and right-click Remote Environments and select
"New" then Remote Environment. Give it the name SimHost, ensure the network type is set to
TCP and click Next. On the next dialog (see Figure 56) ensure the Target Host is CICS and the
Programming Model is ELM Link, then click Next. Enter localhost in the IP/DNS Address and
click the Edit button next to the Port list text box. Enter 7511 in the New Port text box and click
Add. Click OK and on the following dialog click Next. Finally click Finish to complete the
process.
Figure 56. Remote Environment.
The TI assembly will be hosted in IIS as a WCF service. Whether you are using IIS 6 or 7
affects the steps needed to accomplish this.
IIS 6 (Windows XP or Windows Server 2003)
For IIS 6, we need to create a new Web Site and Virtual Directory that will act as the endpoint
for our service. Expand the Internet Information Services Node in the left pane of MMC, then the
local computer. Right-click Web Sites and select New -> Web Site. Enter TI DEMO in the
Description field and click Next. Leave the IP address as "(All Unassigned)" and enter 8080 for
the port. Click Next. For the path, click Browse and point to the project created earlier. Click OK
and then Next. Click Next again and then click Finish. Right-click the new site TI DEMO and
select New -> Virtual Directory. Click Next and in the alias text box enter BankingService. Click
Next and for the path, browse again to the project created earlier. Click Next and then ensure
the Read and Run Scripts checkboxes are ticked. Click Next and then Finish. Add the account
for the Application pool associated with the TI Demo Virtual Directory to the HISRuntime group.
We will now configure the TI assembly to use this Virtual Directory.
IIS 7 (Windows Vista or Windows Server 2008)
For IIS 7, expand the <computer name>/Sites folder in the left-hand pane and right-click
"Default Web Site". Select "Add Application" and in the Alias field enter "Banking". Set the
Physical path to the location of the TI Library we just created (click Browse and point to the
project created earlier). Click Ok to close the dialog and then in the Features view, double-click
"Directory Browsing" and click the "Enable" link in the right-hand pane. Also in the Features
view, double-click "Handler Mappings" and click "Edit Feature Permissions". Ensure the Read
and Run Scripts checkboxes are ticked, then click OK. Add the account for the Application
pool associated with the Banking Application Directory to the HISRuntime group. We will now
configure the TI assembly to use this Application Directory.
Figure 57
Now we need to register the library we created in Visual Studio with TI. Under the Windows
Initiated Processing node under Transaction Integrator, right-click the Objects node and select
New -> Object. Click Next on the opening dialog of the Wizard that appears, browse to the
assembly we just created, and click Next. Enter the details as shown in Figure 57. Note that
the Application/Virtual Directory set up earlier should appear in the dropdown list and the
service type is WCF. Click Next and select "Basic HTTP Bindings" in the Bindings dropdown
and "metaDataEnabledWithExceptions" in the behaviours dropdown. Click Next and then
ensure on the following dialog that the Remote Environment shows SimHost. Clicking Next will
start the creation of the WIP Object. Wait for this to finish and ensure the following appears in
the Messages area:
Progress
BankingLibrary.Banking.1 - Created
Finished
Click Next and then Finish.
Go back to the IIS node in the snap-in and do one of the following depending on whether you
are using IIS6 or 7.
For IIS 7, click the Banking application you created and then click the Content View tab. There
should now be an .svc file in this location. Right-click the file and select "Browse". You
should see something similar to Figure 58 in your browser window.
For IIS 6, select the Banking virtual directory and right-click the .svc file in the right-hand pane
and click "Browse". You should see something similar to Figure 58 in your browser window.
Figure 58
We're now ready to call the WCF service from the BDC Editor.
Appendix B – TI Banking Web Service Application
Definition file
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<LobSystem xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://schemas.microsoft.com/office/2006/03/BusinessDataC
atalog BDCMetadata.xsd" Type="WebService" Version="1.0.0.0"
Name="BankingLOBSystem"
xmlns="http://schemas.microsoft.com/office/2006/03/BusinessDataCatalog">
<Properties>
<Property Name="WsdlFetchUrl"
Type="System.String">http://localhost:8090/Banking/BankingLibrary.Banking.svc
</Property>
<Property Name="WebServiceProxyNamespace"
Type="System.String">BDC</Property>
</Properties>
<LobSystemInstances>
<LobSystemInstance Name="BankingLOBSystem_Instance">
<Properties>
<Property Name="LobSystemName"
Type="System.String">BankingLOBSystem</Property>
<Property Name="WebServiceAuthenticationMode"
Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.WebService.H
ttpAuthenticationMode">PassThrough</Property>
</Properties>
</LobSystemInstance>
</LobSystemInstances>
<Entities>
<Entity EstimatedInstanceCount="10000" Name="Accounts">
<Methods>
<Method Name="GETACCTS">
<FilterDescriptors>
<FilterDescriptor Type="Wildcard" Name="Name" />
<FilterDescriptor Type="Wildcard" Name="PIN" />
</FilterDescriptors>
<Parameters>
<Parameter Direction="In" Name="name">
<TypeDescriptor TypeName="System.String, mscorlib, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" AssociatedFilter="Name"
Name="name">
<DefaultValues>
<DefaultValue MethodInstanceName="GetAccounts"
Type="System.String"></DefaultValue>
</DefaultValues>
</TypeDescriptor>
</Parameter>
<Parameter Direction="In" Name="PIN">
<TypeDescriptor TypeName="System.String, mscorlib, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" AssociatedFilter="PIN"
Name="PIN">
<DefaultValues>
<DefaultValue MethodInstanceName="GetAccounts"
Type="System.String"></DefaultValue>
</DefaultValues>
</TypeDescriptor>
</Parameter>
<Parameter Direction="Return" Name="Return">
<TypeDescriptor TypeName="BDC.ACCTINFO[],BankingLOBSystem"
IsCollection="true" Name="Return">
<TypeDescriptors>
<TypeDescriptor TypeName="BDC.ACCTINFO,BankingLOBSystem"
Name="Item">
<TypeDescriptors>
<TypeDescriptor TypeName="System.String, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="ACCOUNTNUMBER" />
<TypeDescriptor TypeName="System.String, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="ACCOUNTTYPE" />
<TypeDescriptor TypeName="System.Decimal, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="CURRENTBALANCE" />
<TypeDescriptor TypeName="System.Boolean, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="CURRENTBALANCESpecified" />
<TypeDescriptor TypeName="System.Int16, mscorlib, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" Name="INTERESTBEARING" />
<TypeDescriptor TypeName="System.Boolean, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="INTERESTBEARINGSpecified" />
<TypeDescriptor TypeName="System.Single, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="INTERESTRATE" />
<TypeDescriptor TypeName="System.Boolean, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="INTERESTRATESpecified" />
<TypeDescriptor TypeName="System.Decimal, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="MONTHLYSVCCHG" />
<TypeDescriptor TypeName="System.Boolean, mscorlib,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
Name="MONTHLYSVCCHGSpecified" />
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Type="Finder" ReturnParameterName="Return"
ReturnTypeDescriptorName="Return" ReturnTypeDescriptorLevel="0"
Name="GetAccounts" />
</MethodInstances>
</Method>
</Methods>
</Entity>
</Entities>
</LobSystem>
Appendix C – Sample Source for a TI Web Part
The complete source for the TI Web Part is shown below. You must add the references detailed in the TI section to compile this code.
using System;
using System.Web.UI.WebControls;
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;
using Microsoft.Office.Server.ApplicationRegistry.Runtime;
using Microsoft.SharePoint.Portal.WebControls;
using System.Data;
using Microsoft.HostIntegration.SNA.Session;
using BankingLibrary;
using System.ComponentModel;
using Microsoft.SharePoint.WebControls;
namespace Microsoft.HIS.Samples.WebParts
{
public class TIWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
private SPGridView grid;
#region Properties
string m_Name;
[WebBrowsable(true),
Category("Method Parameters"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Name"),
WebDescription("Account name to retrieve.")]
public string Name
{
get { return m_Name; }
set { m_Name = value; }
}
string m_PIN;
[WebBrowsable(true),
Category("Method Parameters"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("PIN"),
WebDescription("PIN code for account.")]
public string PIN
{
get { return m_PIN; }
set { m_PIN = value; }
}
#endregion
protected override void CreateChildControls()
{
base.CreateChildControls();
grid = new SPGridView();
grid.AutoGenerateColumns = false;
this.Controls.Add(grid);
}
protected override void OnPreRender(EventArgs e)
{
Banking bank = new Banking();
ACCTINFO[] accounts = bank.GETACCTS(m_Name, m_PIN);
DataTable table = new DataTable();
table.Columns.Add("ACCOUNTNUMBER");
table.Columns.Add("ACCOUNTTYPE");
table.Columns.Add("CURRENTBALANCE");
table.Columns.Add("INTERESTBEARING");
table.Columns.Add("INTERESTRATE");
table.Columns.Add("MONTHLYSVCCHG");
for (int i = 0; i < accounts.Length; i++)
{
table.Rows.Add(accounts[i].ACCOUNTNUMBER,
accounts[i].ACCOUNTTYPE,
accounts[i].CURRENTBALANCE,
accounts[i].INTERESTBEARING,
accounts[i].INTERESTRATE,
accounts[i].MONTHLYSVCCHG);
}
DataSet ds = new DataSet();
ds.Tables.Add(table);
foreach (DataColumn col in table.Columns)
{
SPBoundField fld = new SPBoundField();
fld.HeaderText = col.ColumnName;
fld.DataField = col.ColumnName;
grid.Columns.Add(fld);
}
grid.DataSource = ds;
grid.DataBind();
}
protected override void RenderContents(System.Web.UI.HtmlTextWriter
writer)
{
grid.RenderControl(writer);
}
}
}
Appendix D – Example of BDC Associations
<?xml version="1.0" encoding="utf-8" ?>
<LobSystem Name="LOBDB2System" Version="1.0.0.0" Type="Database"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://schemas.microsoft.com/office/2006/03/BusinessDataCatalog">
<Properties>
<Property Name="WildcardCharacter" Type="System.String">%</Property>
</Properties>
<LobSystemInstances>
<LobSystemInstance Name="LOBSample">
<Properties>
<Property Name="DatabaseAccessProvider"
Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.Db.DbAccessProvider">
OleDb</Property>
<Property Name="rdbconnection Provider" Type="System.String">DB2OLEDB</Property>
<Property Name="rdbconnection Integrated Security"
Type="System.String">True</Property>
<Property Name="rdbconnection Trusted_Connection"
Type="System.String">yes</Property>
<Property Name="rdbconnection Initial Catalog"
Type="System.String">SAMPLE</Property>
<Property Name="rdbconnection Network Address"
Type="System.String">localhost</Property>
<Property Name="rdbconnection Network Port" Type="System.String">50000</Property>
<Property Name="rdbconnection Network Transport Library"
Type="System.String">TCP</Property>
<Property Name="rdbconnection Package Collection"
Type="System.String">SYSIBM</Property>
<Property Name="rdbconnection DBMS Platform"
Type="System.String">DB2/NT</Property>
<Property Name="rdbconnection Process Binary as Character"
Type="System.String">0</Property>
<Property Name="rdbconnection Host CCSID" Type="System.String">37</Property>
<Property Name="rdbconnection PC Code Page" Type="System.String">1252</Property>
<Property Name="rdbconnection Units of Work" Type="System.String">RUW</Property>
<Property Name="rdbconnection APPC Security Type"
Type="System.String">PROGRAM</Property>
<Property Name="rdbconnection Defer Prepare" Type="System.String">0</Property>
<Property Name="rdbconnection User ID"
Type="System.String">Administrator</Property>
<Property Name="rdbconnection PWD" Type="System.String">@ff1nus08</Property>
<Property Name="AuthenticationMode"
Type="Microsoft.Office.Server.ApplicationRegistry.SystemSpecific.Db.DbAuthenticationMo
de">PassThrough</Property>
</Properties>
</LobSystemInstance>
</LobSystemInstances>
<Entities>
<Entity Name="EmployeeEntity" EstimatedInstanceCount="100">
<Identifiers>
<Identifier Name="DEPTNO" TypeName="System.String" />
<Identifier Name="LASTNAME" TypeName="System.String" />
</Identifiers>
<Methods>
<Method Name="FindEmployees">
<Properties>
<Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">Text</Property>
<Property Name="RdbCommandText" Type="System.String">SELECT EMPNO, FIRSTNME,
LASTNAME, WORKDEPT FROM EMPLOYEE WHERE LASTNAME LIKE ?</Property>
</Properties>
<FilterDescriptors>
<FilterDescriptor Type="Wildcard" Name="LastName" DefaultDisplayName="Last
Name"/>
</FilterDescriptors>
<Parameters>
<Parameter Direction="In" Name="LAST_NAME">
<TypeDescriptor TypeName="System.String" Name="LAST"
AssociatedFilter="LastName" IdentifierName="LASTNAME" />
</Parameter>
<Parameter Direction="In" Name="DEPTNO">
<TypeDescriptor TypeName="System.String" Name="DEPTNO" IdentifierName="DEPTNO"
>
<DefaultValues>
<DefaultValue Type="System.String" MethodInstanceName
="EmployeeTypesInstance">%</DefaultValue>
</DefaultValues>
</TypeDescriptor>
</Parameter>
<Parameter Direction="Return" Name="EmployeeTypes">
<TypeDescriptor TypeName="System.Data.IDataReader,System.Data, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" IsCollection="true"
Name="EmployeeTypesDataReader">
<TypeDescriptors>
<TypeDescriptor Name="EmployeeTypesDataRecord"
TypeName="System.Data.IDataRecord, System.Data, Version=2.0.0.0,
Culture=neutral,PublicKeyToken=b77a5c561934e089">
<TypeDescriptors>
<TypeDescriptor Name="EMPNO" TypeName="System.String" />
<TypeDescriptor Name="FIRSTNME" TypeName="System.String" />
<TypeDescriptor Name="LASTNAME" TypeName="System.String"
IdentifierName="LASTNAME" />
<TypeDescriptor Name="WORKDEPT" TypeName="System.String"
IdentifierName="DEPTNO"/>
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Name="EmployeeTypesInstance" Type="Finder"
ReturnParameterName="EmployeeTypes" ReturnTypeDescriptorName="EmployeeTypesDataReader"
ReturnTypeDescriptorLevel="0" />
<MethodInstance Name="EmployeeTypesSpecificInstance" Type="SpecificFinder"
ReturnParameterName="EmployeeTypes" ReturnTypeDescriptorName="EmployeeTypesDataReader"
ReturnTypeDescriptorLevel="0" />
</MethodInstances>
</Method>
<Method Name="UpdateEmployee">
<Properties>
<Property Name="RdbCommandText" Type="System.String">
UPDATE EMPLOYEE SET SALARY = SALARY * 1.1 WHERE EMPNO = ?
</Property>
<Property Name="RdbCommandType" Type="System.String">Text</Property>
</Properties>
<Parameters>
<Parameter Direction="In" Name="@EMPNO">
<TypeDescriptor TypeName="System.String" Name="EMPNO" />
</Parameter>
<Parameter Direction="Return" Name="UpdateStatus">
<TypeDescriptor TypeName="System.String" Name="SomethingToReturn" />
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Name="UpdateEmployee" Type="GenericInvoker"
ReturnParameterName="UpdateStatus" />
</MethodInstances>
</Method>
</Methods>
</Entity>
<Entity Name="DepartmentEntity" EstimatedInstanceCount="10">
<Identifiers>
<Identifier Name="DEPTNO" TypeName="System.String"/>
</Identifiers>
<Methods>
<Method Name="FindDepartment">
<Properties>
<Property Name="RdbCommandType" Type="System.Data.CommandType, System.Data,
Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">Text</Property>
<Property Name="RdbCommandText" Type="System.String">SELECT DEPTNO, DEPTNAME,
MGRNO FROM DEPARTMENT WHERE DEPTNO = ?</Property>
</Properties>
<Parameters>
<Parameter Name="DEPTNO" Direction="In">
<TypeDescriptor Name="DEPTNO" TypeName="System.String" IdentifierName="DEPTNO"
IdentifierEntityName="EmployeeEntity" />
</Parameter>
<Parameter Name="LASTNAME" Direction="In">
<TypeDescriptor Name="LASTNAME" TypeName="System.String"
IdentifierName="LASTNAME" IdentifierEntityName="EmployeeEntity" />
</Parameter>
<Parameter Direction="Return" Name="DepartmentType">
<TypeDescriptor TypeName="System.Data.IDataReader,System.Data, Version=2.0.0.0,
Culture=neutral, PublicKeyToken=b77a5c561934e089" IsCollection="true"
Name="DepartmentTypeDataReader">
<TypeDescriptors>
<TypeDescriptor Name="DepartmentTypeDataRecord"
TypeName="System.Data.IDataRecord, System.Data, Version=2.0.0.0,
Culture=neutral,PublicKeyToken=b77a5c561934e089">
<TypeDescriptors>
<TypeDescriptor Name="DEPTNO" TypeName="System.String"
IdentifierName="DEPTNO" />
<TypeDescriptor Name="DEPTNAME" TypeName="System.String" />
<TypeDescriptor Name="MGRNO" TypeName="System.String" />
</TypeDescriptors>
</TypeDescriptor>
</TypeDescriptors>
</TypeDescriptor>
</Parameter>
</Parameters>
<MethodInstances>
<MethodInstance Name="DepartmentTypeInstance" Type="SpecificFinder"
ReturnParameterName="DepartmentType"
ReturnTypeDescriptorName="DepartmentTypeDataReader" ReturnTypeDescriptorLevel="0">
</MethodInstance>
</MethodInstances>
</Method>
</Methods>
</Entity>
</Entities>
<Associations>
<Association Name="EmployeeToDepartment"
AssociationMethodEntityName="DepartmentEntity"
AssociationMethodName="FindDepartment"
AssociationMethodReturnParameterName="DepartmentType" >
<SourceEntity Name="EmployeeEntity" />
<DestinationEntity Name="DepartmentEntity" />
</Association>
</Associations>
</LobSystem>
Appendix E - MQ Web Part Code
using System;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using Microsoft.Office.Server.ApplicationRegistry.Runtime;
using Microsoft.SharePoint.Portal.WebControls;
using Microsoft.Office.Server.ApplicationRegistry.MetadataModel;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Channels.WebSphereMQ;
using System.ComponentModel;
namespace Microsoft.TechEd.Europe
{
public class MQWebPart : System.Web.UI.WebControls.WebParts.WebPart
{
// UI controls
#region UI controls
private Button PutButton;
#endregion
public string m_Message = null;
#region Properties
string m_QueueManager;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Queue Manager"),
WebDescription("The name of the MQ Queue Manager to use.")]
public string QueueManager
{
get { return m_QueueManager; }
set { m_QueueManager = value; }
}
string m_Queue;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Queue Name"),
WebDescription("The name of the MQ queue to use.")]
public string QueueName
{
get { return m_Queue; }
set { m_Queue = value; }
}
string m_ConnectionType;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Connection Type"),
WebDescription("The name of the connection")]
public string ConnectionType
{
get { return m_ConnectionType; }
set { m_ConnectionType = value; }
}
string m_Channel;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Channel Name"),
WebDescription("The name of the MQ channel to use.")]
public string Channel
{
get { return m_Channel; }
set { m_Channel = value; }
}
string m_Transport;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Transport"),
WebDescription("The name of the transport to use.")]
public string Transport
{
get { return m_Transport; }
set { m_Transport = value; }
}
string m_ConnectionName;
[WebBrowsable(true),
Category("MQ Configuration"),
Personalizable(PersonalizationScope.Shared),
DefaultValue(""),
WebDisplayName("Connection Name"),
WebDescription("The name of the MQ Connection to use.")]
public string ConnectionName
{
get { return m_ConnectionName; }
set { m_ConnectionName = value; }
}
#endregion Properties
public MQWebPart()
{
this.ExportMode = WebPartExportMode.All;
}
protected override void CreateChildControls()
{
base.CreateChildControls();
PutButton = new Button();
PutButton.Text = "Put Message";
// wire up event handlers
PutButton.Click += new EventHandler(PutButton_Click);
this.Controls.Add(PutButton);
}
protected override void Render(System.Web.UI.HtmlTextWriter writer)
{
PutButton.RenderControl(writer);
}
protected void PutButton_Click(object sender, EventArgs e)
{
PutMsg();
}
#region BDC Connection Point
[ConnectionConsumer("Message",AllowsMultipleConnections=true)]
public void SetConnectionPoint(IEntityInstanceProvider provider)
{
Entity entity = provider.SelectedConsumerEntity;
IdentifierCollection col = entity.GetIdentifiers();
// Get the view we need in order to query the selected entity.
Microsoft.Office.Server.ApplicationRegistry.MetadataModel.View view =
entity.GetSpecificFinderView();
if (view == null)
{
return;
}
IEntityInstance inst = provider.GetEntityInstance(view);
if (inst == null)
{
return;
}
System.Data.DataTable dt = inst.EntityAsDataTable;
if (dt == null)
{
return;
}
// Check if data table has rows.
if (dt.Rows.Count == 0)
{
return;
}
// Everything is okay; save the filter value.
for (int i = 0; i < dt.Columns.Count; i++)
{
m_Message = m_Message + dt.Rows[0][i].ToString() + "\r\n";
}
ViewState["message"] = m_Message;
}
#endregion
void PutMsg()
{
WebSphereMQBinding binding = new WebSphereMQBinding();
binding.ConnectionType = m_ConnectionType;
binding.MqcdChannelName = m_Channel;
binding.MqcdTransportType = m_Transport;
binding.MqcdConnectionName = m_ConnectionName;
ChannelFactory<IMQActions> factory = new
ChannelFactory<IMQActions>(binding);
factory.Open();
string epa = "net.mqs://" + m_QueueManager + "/" + m_Queue;
EndpointAddress to = new EndpointAddress(epa);
IMQActions proxy = factory.CreateChannel(to);
proxy.PutMsg(ViewState["message"] as string);
((IChannel)proxy).Close();
}
}
[ServiceContract]
interface IMQActions
{
[OperationContract(IsOneWay = true, Action = "PutMsg")]
void PutMsg(string x);
}
class MqChannel
{
static int numberOfMessagesReceived = 0;
// server class which implements the interface
[ServiceBehavior]
class MQActions : IMQActions
{
[OperationBehavior]
public void PutMsg(string x)
{
numberOfMessagesReceived++;
}
}
}
}
Appendix F – Sample Workflow for use with HIS
Prerequisites
Make sure you have either of the following installed:
1. VSeWSS 1.1 for VS2005, or
2. VSeWSS 1.2 for VS2008 (as I write this, 1.3 is available as a CTP)
The first thing we will do is create the activities that we'll later use in a custom workflow. This
workflow will then be deployed to SharePoint as a feature, enabling the invocation of the
resultant workflow template via WSS or MOSS. In this walkthrough, I am using Visual Studio
2008. Note that I will be using LINQ to Entities here, and that Visual Studio 2005 does not
provide the necessary support.
Step-by-step instructions
1. Start Visual Studio 2008
Create a new Visual C#® Class Library project and call it DB2Entities (see Figure 59). Right-click
the project that is created in Solution Explorer and select "Add New Item". In the dialog that
appears, click Data in the Categories list and then select ADO.NET Entity Data Model. Give the
file the name SAMPLE_Model.EDMX and click OK. In the wizard that appears, leave "Generate
from database" selected and click Next. Click the New Connection button. Here we'll connect to
a DB2 instance to retrieve the physical schema details. The first thing to do is choose the data
source. Select DB2 Database (see Figure 60) and click Continue.
Figure 59. Entity Framework Project.
Now we need to set up the connection to the database. If you have installed the Windows
version of DB2 (for example, DB2 Express-C), you can use the settings I provide here. If you are
connecting to an existing DB2 database, say on AS/400, then you must provide your site-specific
connection details as required.
Figure 60. Select Data Source.
Assuming you are using DB2 on Windows, give the data source the name DB2 and click the
Configure button. On the dialog that displays, select DB2/NT and click Next. On the following
Network Connection dialog (Figure 61), enter localhost or the IP address of the DB2 server. For
a default installation, the port number is 50000. Click Next.
Figure 61. Network Connection.
The next step is to enter the database details. For this example, we will be using the DB2
SAMPLE database which can be installed during the DB2 installation and configuration process.
Enter the details as shown in Figure 62 on the DB2 Database dialog and click Next.
On the Locale dialog, leave the defaults as they are and click Next. For the following Security
dialog, enter the security details you used when you set up DB2 and click Next. Finally, on the
Advanced Options dialog accept the defaults displayed and click Next. To test the connection,
click the Connect button on the Validation dialog now shown. If all is well, click Next to finish.
This will take you back to the starting dialog, which should now look similar to Figure 63. As you
can see the Entity Connection String area is now populated with the necessary details to
connect to the SAMPLE database.
Figure 63.
For the purposes of this example, I have chosen to include the userid/password details in the
connection string; this is of course a potential risk in a production environment, where you would
most likely elect to encrypt the connection-string section of the config file containing it, or provide
the details through SSO. Clicking Next will take you to a dialog where you can choose the
entities to add to the model. Expand the tables and select the EMPLOYEE and DEPARTMENT
tables, enter DB2DemoModel as the Model Namespace and click Finish. The dialog windows
should now close and an EDMX file should have been added to your project. The model
designer should look similar to Figure 64.
Figure 64.
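For reference, the generated entity connection string combines the EF model metadata with the underlying DB2 provider settings. The sketch below is illustrative only: the provider name (Microsoft.HostIntegration.MsDb2Client) and the keyword values are assumptions based on the HIS 2009 DB2 provider and the walkthrough's settings, and your generated string will differ.

```
metadata=res://*/SAMPLE_Model.csdl|res://*/SAMPLE_Model.ssdl|res://*/SAMPLE_Model.msl;
provider=Microsoft.HostIntegration.MsDb2Client;
provider connection string="Network Address=localhost;Network Port=50000;
Initial Catalog=SAMPLE;User ID=Administrator;Password=<your password>"
```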
Here you can see the two selected tables have been added to the design surface as well as the
relationships represented between them.
Figure 49. Model Designer.
Build the model and verify that it is successful. This completes the creation of the Entity
Framework model that we will use later.
2. Create the Workflow Activities
Add a new workflow activity type project to the solution – call it HIS2009Activities as shown in
Figure 65.
Figure 65. Workflow Activity Project.
Rename the Activity1 class to DB2Activity and add a reference to the DB2Entities project we just
created above. In addition, we need to add a reference to the Entity Framework assembly
System.Data.Entity. After this, add the following using statements to the top of the source file:
using DB2Entities;
using System.Data.Entity;
using System.Collections.Generic;
using System.Linq;
Creating a workflow activity is typically a two-stage process. The first is to define the various
properties that the activity exposes, while the second is to implement the functionality the
activity provides. Let us start by adding several properties to the DB2Activity class as shown
below:
public static DependencyProperty ConnectionStringProperty =
DependencyProperty.Register("ConnectionString", typeof(String),
typeof(DB2Activity));
public static DependencyProperty ReturnDataSetProperty =
DependencyProperty.Register("ReturnDataSet", typeof(IEnumerable),
typeof(DB2Activity));
public static DependencyProperty KeyValueProperty =
DependencyProperty.Register("KeyValue", typeof(string), typeof(DB2Activity));
Name              Description          Category  Type         Field
KeyValue          Value to query       DB2       string       KeyValueProperty
ConnectionString  Database connection  DB2       string       ConnectionStringProperty
ReturnDataSet     Data returned        DB2       IEnumerable  ReturnDataSetProperty
Table 8. Properties.
[DesignerSerializationVisibility (DesignerSerializationVisibility.Visible)]
[BrowsableAttribute(true)]
[DescriptionAttribute("Database connection details")]
[CategoryAttribute("WebTearActivity Property")]
public string KeyValue
{
get
{
return ((string)(base.GetValue(DB2Activity.KeyValueProperty)));
}
set
{
base.SetValue(DB2Activity.KeyValueProperty, value);
}
}
[DesignerSerializationVisibility (DesignerSerializationVisibility.Visible)]
[BrowsableAttribute(true)]
[DescriptionAttribute("DataSet")]
[CategoryAttribute("DataSet")]
public IEnumerable ReturnDataSet
{
get
{
return
((System.Collections.IEnumerable)(base.GetValue(DB2Activity.ReturnDataSetProperty)));
}
set
{
base.SetValue(DB2Activity.ReturnDataSetProperty, value);
}
}
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
[BrowsableAttribute(true)]
[DescriptionAttribute("Database connection details")]
[CategoryAttribute("WebTearActivity Property")]
public string ConnectionString
{
get
{
return ((string)(base.GetValue(DB2Activity.ConnectionStringProperty)));
}
set
{
base.SetValue(DB2Activity.ConnectionStringProperty, value);
}
}
These will appear in the workflow designer when the activity is dropped onto a workflow, and
allow configuration either by entering literal values in the activity's property sheet or by using
binding. Binding is more flexible, as it allows an activity's properties to be set from another
property or code expression in the workflow. Figure 34 earlier shows the property sheet in
action. The next step is to implement the required functionality. This activity is going to allow
us to fetch data from DB2. This will be done using the previously created Entity Framework
assembly that we added as a reference above. With this reference, we can easily retrieve data
from DB2 using the new HIS 2009 EF support. For this, we will use LINQ to Entities. The code
of the Execute method is given below:
protected override ActivityExecutionStatus Execute(ActivityExecutionContext ctx)
{
using (Entities employees = new Entities(ConnectionString))
{
string name = System.IO.Path.GetFileNameWithoutExtension(KeyValue);
IEnumerable<EMPLOYEE> employee = (from emp in
employees.EMPLOYEE
where emp.LASTNAME == name
orderby emp.HIREDATE descending
select emp).AsEnumerable();
ReturnDataSet = employee.ToList<EMPLOYEE>();
}
return ActivityExecutionStatus.Closed;
}
I have already covered what this does and how it works, so let's move on to the next step and
create our second activity.
Right-click the HIS2009Activities project and select New->Activity. Rename the activity class
that gets added to BDCActivity.cs. This is the activity that will call the BDC via the MOSS API to
enable us to create a very flexible and reusable activity capable of returning data from any entity
defined against any LOB system. We need to add some assembly references to allow us to call
out to the BDC. Add the following, all located in C:\Program Files\Common Files\Microsoft
Shared\web server extensions\12\ISAPI:
Microsoft.Office.Server.dll
Microsoft.SharePoint.Portal.dll
Microsoft.SharePoint.dll
Add the same using statements as before in addition to the ones below. These are required to
reference the BDC APIs and types.
using Microsoft.Office.Server.ApplicationRegistry.MetadataModel;
using Microsoft.Office.Server.ApplicationRegistry.Runtime;
using Microsoft.Office.Server.ApplicationRegistry.SystemSpecific;
using Microsoft.Office.Server.ApplicationRegistry.Infrastructure;
using Microsoft.SharePoint;
We now need to add some properties to allow configuration of the activity. Add the following
code:
public static DependencyProperty SharedServiceProviderProperty =
DependencyProperty.Register("SharedServiceProvider", typeof(String),
typeof(BDCActivity));
public static DependencyProperty LOBSystemInstanceProperty =
DependencyProperty.Register("LOBSystemInstance", typeof(String), typeof(BDCActivity));
public static DependencyProperty EntityProperty =
DependencyProperty.Register("Entity", typeof(String), typeof(BDCActivity));
public static DependencyProperty MethodNameProperty =
DependencyProperty.Register("MethodName", typeof(String), typeof(BDCActivity));
public static DependencyProperty DataSetProperty =
DependencyProperty.Register("DataSet", typeof(IEnumerable), typeof(BDCActivity));
public static DependencyProperty PositionProperty =
DependencyProperty.Register("Position", typeof(Int32), typeof(BDCActivity));
public static DependencyProperty ReturnValueProperty =
DependencyProperty.Register("ReturnValue", typeof(String), typeof(BDCActivity));
These allow the specification of the SSP and LOB system (defined in the ADF). In addition, the
entity and method to invoke on it can be specified. Finally, the dataset provides a way of
passing in parameters to the method from a previous activity and the position will be used to
specify which row in the dataset is used.
The code pattern for the property getters and setters is shown below. Each of these allows
access to the DependencyProperty fields specified above. Repeat this for the properties in
Table 8. Note that the Position and ReturnValue properties should have the Browsable attribute
set to false.
Name                   Description   Category  Type         Field
DataSet                Data          Data      IEnumerable  DataSetProperty
MethodName             Method Name   BDC       string       MethodNameProperty
Position               Position      BDC       int          PositionProperty
Entity                 Entity Name   BDC       string       EntityProperty
SharedServiceProvider  SSP           BDC       string       SharedServiceProviderProperty
LOBSystemInstance      LOB System    BDC       string       LOBSystemInstanceProperty
ReturnValue            Return Value  Data      string       ReturnValueProperty
Table 8. Properties.
[DesignerSerializationVisibility
(DesignerSerializationVisibility.Visible)]
[Browsable (true)]
[Description ("DataSet")]
[Category ("BDC")]
public System.Collections.IEnumerable DataSet
{
get
{
return
((IEnumerable)(base.GetValue(BDCActivity.DataSetProperty)));
}
set
{
base.SetValue(BDCActivity.DataSetProperty, value);
}
}
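The same getter/setter pattern applies to the remaining properties in Table 8. As a sketch, the Position property might look like the following (note the Browsable attribute set to false, as mentioned above, so the property is set only through binding):

```csharp
[DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)]
[Browsable(false)] // hidden from the property grid; set via binding instead
[Description("Row in the dataset to use")]
[Category("BDC")]
public int Position
{
    get
    {
        return ((int)(base.GetValue(BDCActivity.PositionProperty)));
    }
    set
    {
        base.SetValue(BDCActivity.PositionProperty, value);
    }
}
```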
The final step is to override the Execute method which will be where we call out to the BDC
Shared Services Provider to execute the specified method. The code for the execute method is
given below. Add this to your class and then build.
protected override ActivityExecutionStatus Execute(ActivityExecutionContext ctx)
{
object[] parameters = new object[1];
parameters[0] = DataSet.OfType<EMPLOYEE>().ElementAt<EMPLOYEE>(Position-1).EMPNO;
SqlSessionProvider ssp = SqlSessionProvider.Instance();
ssp.SetSharedResourceProviderToUse(SharedServiceProvider);
NamedLobSystemInstanceDictionary ins = ApplicationRegistry.GetLobSystemInstances();
LobSystemInstance instance = ins[LOBSystemInstance];
Entity entity = instance.GetEntities()[Entity];
MethodInstance methInst = entity.GetMethodInstances()[MethodName];
SPSecurity.RunWithElevatedPrivileges(delegate()
{
entity.Execute(methInst, instance, ref parameters);
}
);
return ActivityExecutionStatus.Closed;
}
3. Define the Workflow
The final step is to create the workflow that will perform a process instigated by SharePoint. As
discussed earlier, VSeWSS greatly simplifies the work we need to do. I am using version 1.2,
which provides a number of improvements over its predecessor including a set of dialogs to
define the workflow association. This then authors the various required XML configuration files
for deployment.
Add a new workflow project to the solution. As shown in Figure 66, under Workflow projects,
select the SharePoint Sequential Workflow template and call it HIS2009DEMO. Click OK. The
Workflow Designer will now open with a single activity at the top of the workflow,
OnWorkflowActivated. At runtime, this activity acts as the bridge between SharePoint and
Workflow, correlating the workflow instance with the list or document in SharePoint that
triggered its creation. Activation context details are passed to the workflow, including the name
of the object that triggered it. This will come in useful shortly, as you will see. Let us now add
the activities we need, in the order listed in Table 9.
Figure 66.
Make sure the workflow looks the same as Figure 67:
Activity     Group
DB2Activity  HIS2009Activities
While        Windows Workflow v3.0
BDCActivity  HIS2009Activities
Table 9.
The next task is to configure each of the activities. The DB2Activity has three properties that
must be set. The first is the ConnectionString. To set an activity's properties, click the activity to
highlight it and use the property sheet displayed (if you cannot see the property grid, right-click
the activity and click Properties). The ConnectionString contains the details needed to connect
to DB2 to retrieve data. You can get this from the ADO.NET Entity Framework project we
created earlier. Go to the Model Designer, right-click anywhere on the design surface and click
"Update Model from Database". In the Update Wizard that appears, cut and paste the contents
of the Entity Connection String box into the DB2Activity's ConnectionString property.
For the second property, ReturnDataSet, we need to bind the property to a workflow variable.
As we do not have one yet, double-click the binding icon next to the property. The Bind dialog
appears. Click the Bind to a new member tab, accept the defaults and click OK. You can now
see that a binding expression has been placed in the property value. As previously discussed,
binding is the principal mechanism for passing data between activities in a workflow. Finally,
the third property is the KeyValue. This must be bound to one of the workflow properties. Enter
the binding expression shown below:
Activity=onWorkflowActivated1, Path=WorkflowProperties.Item.File.Name
This references the name of the file that was added to a SharePoint list and triggered the
creation of the workflow instance.
Figure 67. Unconfigured Workflow.
The next activity to configure is the While activity. While is a composite activity, meaning that it
can contain other activities, in this case an instance of our BDCActivity. The While activity
implements a loop in which the contained activities are iterated over while the specified
condition is true. There are two types of condition, Rule and Code. For this example, we will use
a simple code condition. In the property sheet, select Code Condition for the Condition property
and then expand the + next to it. In the text field, type testCondition and hit Enter. This will add
the condition method to the workflow's code-behind file. Add the following to the condition:
private void testCondition(object sender, ConditionalEventArgs e)
{
IEnumerable<DB2Entities.EMPLOYEE> employees =
Activity1.ReturnDataSet.OfType<DB2Entities.EMPLOYEE>();
e.Result = index++ < employees.Count<DB2Entities.EMPLOYEE>();
}
Now let us configure the final activity. This is going to update DB2 using the data retrieved from
the DB2Activity. Set the properties on this activity as shown in Table 10. The activity will invoke
a method specified in the ADF on the EmployeeEntity entity. This takes one parameter, which
we must now assign a value to. This value will come from the ReturnDataSet property on the
DB2Activity we configured earlier. Again, we do this by binding this activity's property to the
other activity's. Double-click the binding icon and, in the dialog shown in Figure 68, select the
ReturnDataSet property of Activity1. There is one final binding we need, to the index property
that must be declared on the workflow. This is the property updated in the while loop. Its
purpose is to process each row of the dataset provided by the DB2 activity in turn. Double-click
the Position property binding icon on the activity and click the "Bind to a new member" tab.
Enter index as the name and change the type to field.
Property               Value
SharedServiceProvider  SharedServices1
LOBSystemInstance      SAMPLE_Instance
Entity                 EmployeeEntity
MethodName             UpdateEmployee
Table 10.
We are all set. Check that the solution builds without errors. Now we need to deploy everything
and associate a document library with our new workflow. Right-click the workflow project in
Solution Explorer and click Deploy. This will deploy the workflow to SharePoint. In addition, the
activities must be added to the GAC. Ensure they are strong-named and add them. We can now
perform the final step: create a workflow association to a list and test our solution.
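One way to perform the GAC step above is with the gacutil tool from a Visual Studio command prompt. The assembly paths below are illustrative and depend on your build output location:

```
gacutil /i HIS2009Activities\bin\Debug\HIS2009Activities.dll
gacutil /i DB2Entities\bin\Debug\DB2Entities.dll
```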
Figure 68
The following steps must be completed:
1. Enable our new workflow 'feature' in SharePoint
2. Associate the workflow with a list
The SharePoint Workflow project created a feature.xml file describing our workflow to
SharePoint. Now that we've deployed it, we must activate it for the site we wish to use it in.
Open the Admin page (under Microsoft Office Server -> SharePoint 3.0 Central Administration)
and click on "My Site". Click Site Actions and select "Site Settings". Under Site Collection
Administration, click the Site collection features link. You should see our new workflow feature,
DB2Workflow. Click the button next to it to activate it (see Figure 69).
Figure 69
Next, we must associate our workflow with something to trigger it. Click on My Site and click the
Shared Documents link in the left-hand navigation bar. In the list, click the Settings link and
select "Document Library Settings". Under Permissions and Management, click the "Workflow
Settings" link. The new workflow should be in the list. Select it and give the workflow template
the name PayRise. Under Start Options, check the "Start this workflow when a new item is
created." option. Make sure the page looks like Figure 70. This will cause a workflow instance to
be created for each new document added to the library. Click the OK button to save the
changes.
Figure 70
Now it is time to test our workflow. Open the DB2 Control Center and execute the following
query:
SELECT * FROM EMPLOYEE WHERE EMPNO = '000110'
Make a note of the SALARY column value. The BDC activity in the workflow is designed to pick
up the name of the file uploaded to the library and use it to update an employee with the same
name in the EMPLOYEE table. This is as easy as creating a file in Notepad and saving it as
LUCCHESSI.txt. It must not be empty, but its content is unimportant. In the SharePoint library,
click the Upload action and browse to the file. Click OK to upload. A new column should appear
in the Document Library, PayRise, showing the status of the workflow that has been started.
You can click on this to find out its status and progress, or even to terminate it. Wait a few
seconds and refresh the page. The status should change to Completed. Re-run the DB2 query
and observe that the salary has increased by 10%.
Appendix G – Prerequisite Software for Samples
The following pre-requisite software is needed to re-create the samples described in this white
paper.
HIS 2009 Product Software
Host Integration Server 2009 Evaluation Edition (package named "BizTalk2009Eval_HIS_EN.exe")
o http://www.microsoft.com/downloads/details.aspx?FamilyID=1e545278-0d5b-
4bbd-b073-18111ce86995&displaylang=en
HIS 2009 Product Installation
HIS 2009 requires one of the following operating systems (32-bit x86 or 64-bit x64) when
recreating these MOSS samples.
Windows Server 2003 SP2
Windows Server 2003 R2 SP2
Windows Server 2008
Additionally, HIS 2009 requires the following other software products, which are considered
product installation prerequisites.
.NET Framework 3.5 SP1
o http://www.microsoft.com/downloads/details.aspx?FamilyID=AB99342F-5D1A-
413D-8319-81DA479AB0D7&displaylang=en
Core XML Services (MSXML) 6.0 SP1
o http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=d21
c292c-368b-4ce1-9dab-3e9827b70604
HIS 2009 Feature Installation
HIS 2009 requires one of the following developer tools products, which are considered feature
installation prerequisites.
Visual Studio 2005 SP1 (with C++ and C#); or
Visual Studio 2008 (with C++ and C#)
o http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=83c
3a1ec-ed72-4a79-8961-25635db0192b
HIS 2009 Feature Configuration
HIS 2009 requires the following software products.
SQL Server 2005 SP2 or SQL Server 2008
o http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=265
f08bc-1874-4c81-83d8-0d48dbce6297
IBM WebSphere MQ Server 7.0 with Fix Pack 7.0.0.1
o http://www.ibm.com/developerworks/downloads/ws/wmq/?S_TACT=105AGX01&
S_CMP=DLTAB
Additional Prerequisite Product Installation for Samples
The MOSS samples require the following prerequisite software, in addition to HIS 2009.
Microsoft Office SharePoint Server 2007 Trial Version
o http://www.microsoft.com/downloads/details.aspx?FamilyId=2E6E5A9C-EBF6-
4F7F-8467-F4DE6BD6B831&displaylang=en
Microsoft Office Servers 2007 Service Pack 1
o http://www.microsoft.com/downloads/details.aspx?FamilyId=AD59175C-AD6A-
4027-8C2F-DB25322F791B&displaylang=en
SharePoint Server 2007 SDK: Software Development Kit (includes the BDC editor)
o http://www.microsoft.com/downloads/details.aspx?familyid=6D94E307-67D9-41AC-B2D6-0074D6286FA9&displaylang=en
o To install the Microsoft Business Data Catalog Definition Editor tool:
After the SDK is installed, navigate to SDK installation path\Tools\BDC Definition Editor\readme.html. The default installation path for the MOSS SDK install is <%Program Files%>\2007 Office System Developer Resources\.
Follow the instructions in the Readme file to install and use the Business Data Catalog Definition Editor tool.
Visual Studio 2008 extensions for Windows SharePoint Services 3.0, v1.3 - Mar 2009
CTP
o http://www.microsoft.com/downloads/details.aspx?familyid=FB9D4B85-DA2A-
432E-91FB-D505199C49F6&displaylang=en
IBM DB2 for Windows V9.5 (with SAMPLE database installed)
o http://www.ibm.com/developerworks/downloads/im/udb/?S_TACT=105AGX28&S
_CMP=TRIALS
Conclusion
Microsoft Office SharePoint Server 2007 (MOSS) enables enterprise IT groups to improve
collaboration and content management while deploying new web applications. Microsoft Host
Integration Server 2009 enables enterprise IT developers and administrators to integrate
MOSS solutions with existing IBM systems, programs, messages and data. The combination of
these products, technologies and tools enables enterprise IT to improve productivity while
returning value to the enterprise organization.
Look for information on Microsoft Host Integration Server 2009 on the Microsoft BizTalk Server
2009 Web site (http://www.microsoft.com/biztalk/en/us/host-integration.aspx). Participate in the
Microsoft Host Integration Server community.
Microsoft Connect (https://connect.microsoft.com/site/sitehome.aspx?SiteID=66).
Microsoft Newsgroups (http://www.microsoft.com/communities/newsgroups/en-
us/default.aspx?dg=microsoft.public.hiserver.general).
Microsoft TechNet (http://technet.microsoft.com/en-us/his/default.aspx).
Microsoft Developer Network (MSDN®) (http://msdn.microsoft.com/en-
us/library/aa286574.aspx).