Monday, September 15, 2008

CV references and certificates

Cranswick Computing reference






Cranswick Colleague 1





Cranswick Colleague 2





J2EE Web Site Development reference




Van Zyl & Pritchard certificate




MCSD course completion certificate 2524



MCSD course completion certificate 2073



MCSD course completion certificate 2090



MCSD course completion certificate 2310



MCSD course completion certificate 2373



MCSD course completion certificate 2389



Copy of my passport

Sunday, September 14, 2008

Shipment Tracking System User Manual

KUEHNE & NAGEL SHIPPING TRACKING SYSTEM

USER MANUAL

SYSTEM OUTLINE

This system is intended to simplify and improve on the existing Excel-based system currently in place, making it quicker and easier to perform the basic functions required. The existing system can be broken down into the following key steps:
  1. A list of OPEN PURCHASE ORDERS, in the form of an Excel spreadsheet, is extracted from Syspro using an existing Visual Basic program based on a user-entered date range. This list includes local suppliers, so these suppliers are manually deleted from the list. The list is then emailed to K & N.
  2. Certain suppliers in the list generated in step 1 are to be sent to LOCTON. Those suppliers that do not need to be sent to LOCTON are deleted from the list, and the list is then emailed to LOCTON.
  3. Open orders generated in step 1 and sent to K & N are taken as being in a status of 0. To avoid confusion, only the first week's open orders in the list generated in step 1 are placed in a second Excel spreadsheet called the K & N status report and emailed to K & N. K & N will update this data with details on the status of each purchase order line and send the list back to ZF. This process is repeated until a purchase order line is complete and it is placed in a status of 99.
  4. A third Excel spreadsheet called the container requests list is updated daily with details on all containers that are expected to arrive. The items in these containers are placed IN TRANSIT in Syspro and a GRN is created.
  5. When items physically arrive, a warehouse transfer is performed in Syspro and the items are placed in the AC warehouse.

This entire process will be replaced by the new XML-based system, and this guide provides detailed instructions on how to perform the steps above using the new system.

1. EXPORTING THE XML FILES

The logic in the existing Visual Basic program, used to generate the initial list of open orders, was used as a base for the XML export function in the new system. This function can be accessed from the Export XML File button on the Import XML File tab as shown below:

The following screen will be displayed:

The user input required for this function is exactly the same as was required for the existing Visual Basic program. The user must supply the start date (or week) and the number of weeks after the start date, giving the end date, to be used as the range of LATEST DUE DATES when extracting open orders from Syspro.

The RETRIEVE ORDERS function extracts the K & N open orders list, the Locton open orders list and the K & N status report, and displays these lists to the user. The K & N open orders list is exactly the same list that was generated by the existing Visual Basic program, except that it excludes all suppliers that have a currency of ‘R’ (thereby excluding local suppliers) and it excludes Locton suppliers. The Locton open orders list is a subset of the K & N open orders list generated by the existing Visual Basic program.

The system has a function for maintaining the list of Locton suppliers (please see the Query Database functions for details on the Locton supplier list maintenance function). The K & N status report is a combination of the first week's K & N open orders (the start date entered plus 6 days) and the existing orders in the new system's SQL database that are in a certain status. A status list table in the new system's SQL database defines the status values to include when generating the K & N status report (at present all lines that have a status of 10 or less will be included). This status list does not have a maintenance function, but a system administrator could easily maintain this list if required.
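
As a minimal sketch of that selection, assuming the existing orders are held in memory and the status values have been read from the status list table (the OrderLine type and names here are illustrative only, not the actual program code):

    using System.Collections.Generic;

    class OrderLine
    {
        public int Status; // 0 = open ... 99 = complete
    }

    class StatusReportSketch
    {
        // reportStatuses comes from the status list table in the SQL database
        // (at present that table would contain the statuses 0 through 10).
        static List<OrderLine> LinesForStatusReport(
            List<OrderLine> existingOrders, List<int> reportStatuses)
        {
            List<OrderLine> result = new List<OrderLine>();
            foreach (OrderLine line in existingOrders)
                if (reportStatuses.Contains(line.Status))
                    result.Add(line);
            return result;
        }
    }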

If the lists displayed to the user are correct, the user can export them to XML files by clicking the Export XML Files button. The exported XML files can be found in the C:\KUEHNE & NAGEL SHIPPING TRACKING SYSTEM\XML\EXPORT folder on the user's PC. The file names are based on the current date and time so that a history of exported files can be kept, as shown below:

· 2007-01-30_03-39 PM_KN_OPEN_ORDERS.XML
· 2007-01-30_03-39 PM_KN_STATUS_REPORT.XML
· 2007-01-30_03-39 PM_LOCTON_OPEN_ORDERS.XML

These files can then be emailed to the required people.
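
For illustration, a date-and-time naming convention like this could be produced with something along the following lines (a sketch only; the helper and folder handling are not taken from the actual program):

    using System;
    using System.IO;

    class ExportNaming
    {
        // Builds a name such as "2007-01-30_03-39 PM_KN_OPEN_ORDERS.XML".
        static string ExportFileName(string baseName)
        {
            string stamp = DateTime.Now.ToString("yyyy-MM-dd_hh-mm tt");
            return stamp + "_" + baseName + ".XML";
        }

        static void Main()
        {
            string folder = @"C:\KUEHNE & NAGEL SHIPPING TRACKING SYSTEM\XML\EXPORT";
            Console.WriteLine(Path.Combine(folder, ExportFileName("KN_OPEN_ORDERS")));
        }
    }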

2. IMPORTING THE K & N STATUS REPORT

The 2007-01-30_03-39 PM_KN_STATUS_REPORT.XML file generated with the export XML files function would have been emailed to K & N, who would have updated the information in it as required and then sent it back to ZF. In the existing system this would have been done using the K & N status report. The K & N Status Report and the Container Requests List, in the form of Excel spreadsheets, are used by the existing system to track and maintain the status of containers and items, but these lists must be manually maintained.

The new system stores this information in a more efficient and reliable SQL database and uses the standard XML file format to import and export the data into and out of the SQL database. The new system also provides advanced functions to query, sort, filter and maintain the data in the SQL database replacing the need for the K & N Status Report and the Container Requests List.

Importing the K & N status report XML file into the SQL database is performed using the function provided on the Import XML File tab as shown below:

Because the K & N status report XML file can contain a fairly large number of records, it may be difficult to validate the file, correct its errors and import it in a single sitting. In most cases the user will have to perform other tasks or leave the office during the validation and import process. To avoid starting from scratch each time, the system automatically stores the XML files that are imported in the C:\KUEHNE & NAGEL SHIPPING TRACKING SYSTEM\XML folder on the user's PC, so the user can either import a new K & N status report XML file or continue working on a previous one.

The first step is to import a new K & N status report XML file. This is done by clicking the Select File to Import button as shown above. The standard Windows file open dialog will be displayed and the user must browse to the location where the file attached to the returned K & N email was saved.

If the user opens the file, its contents will be validated for basic data format errors. The XML file generated by the export function contains a set of rules, called a schema, defining the required structure of the data in the file. If K & N use the XML file and its schema in the correct manner, as demonstrated by the XMLEditor.exe program provided by Cranswick Computing, no basic data format errors should ever occur. In the unlikely event that such a data format error does occur, the following type of error message will be displayed:

In the above example a date field in the file contained the string “WRONG_DATE_FORMAT” and not a valid date.
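
The kind of schema check involved can be sketched as follows (the file and schema names here are examples, not the actual ones used by the system):

    using System;
    using System.Xml;
    using System.Xml.Schema;

    class SchemaValidationSketch
    {
        static void Main()
        {
            XmlReaderSettings settings = new XmlReaderSettings();
            settings.ValidationType = ValidationType.Schema;
            settings.Schemas.Add(null, "KN_STATUS_REPORT.XSD"); // the schema shipped with the file
            settings.ValidationEventHandler += delegate(object sender, ValidationEventArgs e)
            {
                Console.WriteLine("Data format error: " + e.Message);
            };

            using (XmlReader reader = XmlReader.Create("2007-01-31_01-53 PM.XML", settings))
            {
                // Reading the document triggers validation; a date field containing
                // "WRONG_DATE_FORMAT" would be reported by the handler above.
                while (reader.Read()) { }
            }
        }
    }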

In most cases the file will contain no data format errors and its contents will just be displayed to the user as shown below:

At this point a copy of the selected XML file has been created in the C:\KUEHNE & NAGEL SHIPPING TRACKING SYSTEM\XML folder on the user's PC, and this copy, not the original file saved from the returned K & N email, should always be used from this point onwards. The file is named based on the date and time when it was first imported, as shown below:

· 2007-01-31_01-53 PM.XML

If the user closes the program at this point and re-opens it the file can be loaded by clicking on the Load Previous Import button. The following screen will then be displayed:

Clicking on the dropdown list displays all previous import files. In most cases there should only be one file per date making it easy to load the latest file.

At this point the user is ready to attempt to import the data into the SQL database for the first time. To do this the Import Data button must be clicked.

The program will attempt to validate all lines to import. If any error occurs while importing a line, it will be displayed in red. If the line passes all validations, it will be imported into the SQL database and will be displayed as a successfully imported line in green. Clicking on the grey area to the left of a line will display information on why it failed the validations, if any, at the bottom of the screen as shown below:

First, data format validations will be performed; for example, if a line has no container number it cannot be imported. The bottom of the screen contains 3 tabs: Data format errors, Import errors and Linked lines, as shown below:

If the line passes all data format validations the import validations will then be performed.

The validation criteria are as follows:

  • If no matching line exists (based on purchase order number, line and container) but there are lines that have the same purchase order number and line number then those lines will be displayed to the user for checking in the linked lines tab. The line will be displayed in purple indicating that it needs to be evaluated. The user can then decide to accept or reject the new line. This process is discussed in more detail further on in this section.
  • If the line does exist but the vessel, voyage or ETS date has changed the new value will be assumed to be correct and the program will automatically import the line and overwrite the existing values.
  • If the line does exist but values other than the vessel, voyage or ETS date have changed the new values will be displayed to the user for checking. The line will be displayed in purple indicating that it needs to be evaluated. The user can decide to discard or accept the new values. This process is discussed in more detail further on in this section.
  • The supplier code must be a valid Syspro supplier.
  • The purchase order must exist and be in a valid status.
  • The purchase order line must exist.
  • The part must be a valid Syspro stock code.
  • The part number for the line must be correct.
  • The Invoice Line value will be validated. First the price of the part is extracted as the newest valid contract price for the supplier and part number set up in Syspro’s contract pricing. The line's invoice date must be between the valid from and valid to dates set up for the contract price. The price extracted, multiplied by the order quantity for the line, must match the invoice line value to 5 decimal places (see the sketch after this list).
  • The status will be validated against a status table in the SQL database.
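
The invoice line value check described above could look something like this (a sketch only; the ContractPrice type and the list of contract prices are hypothetical stand-ins for Syspro's contract pricing data):

    using System;
    using System.Collections.Generic;

    class ContractPrice
    {
        public DateTime ValidFrom, ValidTo;
        public decimal Price;
    }

    class InvoiceValidationSketch
    {
        static bool InvoiceValueIsValid(IList<ContractPrice> prices,
            DateTime invoiceDate, decimal orderQty, decimal invoiceValue)
        {
            // Pick the newest contract price whose validity range covers the invoice date.
            ContractPrice newest = null;
            foreach (ContractPrice p in prices)
            {
                if (invoiceDate >= p.ValidFrom && invoiceDate <= p.ValidTo
                    && (newest == null || p.ValidFrom > newest.ValidFrom))
                {
                    newest = p;
                }
            }
            if (newest == null)
                return false; // no valid contract price for the invoice date

            // The extracted price times the order quantity must match the
            // invoice line value to 5 decimal places.
            return Math.Round(newest.Price * orderQty, 5) == Math.Round(invoiceValue, 5);
        }
    }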

If the line fails one or more of the import validations the errors will be listed in the import errors tab as shown below:

LINKED LINES (Imported from XML file)

If the line has linked lines as discussed in one of the validations above the line will be displayed in purple indicating it requires evaluation. A message will be displayed in the import errors tab indicating this as shown below:

The existing lines in the database that have the same purchase order number and line number will be displayed under the linked lines tab as shown below:

At this stage the user must accept or reject the line. To reject the line the user must first select the line by clicking the grey area to the left of the line and then click the Reject Line button as shown below:

The line will then change to an orange colour indicating it has been rejected.

Any rejected line can be restored by first selecting the line by clicking the grey area to the left of the line and then clicking the Restore Line button. The Restore Line button will only be visible if a rejected line is currently selected.

To accept the line the user must first check the checkbox in the import errors tab, then select the line by clicking the grey area to the left of the line and then click the Accept Line button. The line will then change to a yellow colour indicating it has been accepted.

If the import data function is now run, all accepted lines will be imported into the database and will change to green.

CHANGED LINES (Changed by K & N and imported from XML file)

If line values other than the vessel, voyage or ETS date have changed the line will be displayed as purple and new values will be displayed to the user for checking in the import errors tab as shown below:

The user can either reject all changes, by selecting the line by clicking the grey area to the left of the line and then clicking the Reject Line button, or accept one or more of the suggested changes. To do this the user must first check the checkboxes of the desired changes in the import errors tab, then select the line by clicking the grey area to the left of the line and then click the Accept Line button. The line will then change to a yellow colour indicating it has been accepted. Any changes that were not checked will be set back to the original values.

If the import data function is now run, all accepted line changes will be imported into the database and the line will change to green.



CHANGING LINE VALUES MANUALLY

Any value on a line can be changed manually in the Import XML File screen by simply double clicking inside the field to change and typing in the new value as shown below:

Once a value in the line has been changed it will change to a blue colour as shown below:

If the import data function is now run, all changed lines will be validated again to allow the user to correct minor errors immediately.

NB: If the user makes a change manually that would normally result in the changes being evaluated, such as changing the container number (which would usually cause the linked lines to be displayed), the system will take it for granted that the user wants these changes. It will therefore not ask the user to evaluate them but will simply import the line.


FILTER BUTTONS

The colour coded buttons at the top of the import XML file screen are used to display only the lines of interest such as error lines. Simply click the desired filter button and the lines displayed on screen will be reduced.


ERROR REPORT

At any time after the import data function has been run, if one or more lines have errors, an exception report can be printed by clicking on the Print Exception Report button. An example of the exception report is shown below:

The report lists all fields for the line and then the import errors for the line. Data format errors, such as blank container numbers, are not printed in this report as these errors are considered obvious and do not require any special attention. If the invoice line value does not match the order quantity multiplied by the part's contract price, however, the error may require special attention.

Lines that require evaluation are not considered to be error lines although they make use of the Import Errors tab at the bottom of the screen. As such they are not included in the error report.



SECURITY MAINTENANCE

The system has a security facility to enable an administrator to disable certain functionality to certain users if required. This facility is also used to store parameters used by the system that may change from time to time such as the network path to the database server and web services. The security facility can be accessed by clicking on the Maintain Security button. The following screen will be displayed:

The program is designed to be launched from within the user's My Menu in Syspro. It can also be launched from a shortcut or an MS-DOS batch script saved in a text file ending in a .BAT extension, e.g. TEST.BAT. The user's Syspro operator code is passed into the program and is used to record operator activity, such as changing data in the database. The Syspro operator code is also used to control access to certain functionality in the system. It may be desirable to limit certain operators to query access only and not allow them to change data in the database.
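
As a sketch (the argument layout is an assumption for illustration, not the program's actual command line), the operator code might be picked up like this:

    using System;

    class RunnerEntryPoint
    {
        static void Main(string[] args)
        {
            // A launching script such as TEST.BAT could pass the operator
            // code as the first argument (hypothetical layout).
            string operatorCode = args.Length > 0 ? args[0] : "DEFAULT";
            Console.WriteLine("Running as Syspro operator: " + operatorCode);

            // The operator code is then used to load the matching UserSettings
            // file and to stamp any changes made to the database.
        }
    }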

The program is set to look for a UserSettings folder that can be either on the client's PC or on the server. This folder will contain an XML file for each operator that will use the program, as well as a DEFAULT.XML file. If a new operator must be added, the operator code is simply typed in the dropdown list's text area (ADMIN in the above left image) and the Ok button pressed. This will create a new security file for the new operator using the default settings contained in the DEFAULT.XML file. The DEFAULT.XML file can be maintained like any other operator file.

To maintain an operator file select the operator code from the dropdown list and press the Ok button. The following screen will be displayed:

The programID must always be ZFKN_ImportQuery. The setting names must never be changed by anyone other than a Cranswick Computing developer, as this may render the program unusable. The setting value is used by the system administrator to control security. If the user must be denied Change Access, the SettingValue for the HasChangeAccess parameter must be set to false.
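
A sketch of how such a check might be read from an operator file follows; the element names are guesses based on the screen described above, not the file's actual layout:

    using System.Xml;

    class SecurityCheckSketch
    {
        static bool HasChangeAccess(string settingsFile)
        {
            XmlDocument doc = new XmlDocument();
            doc.Load(settingsFile); // e.g. ADMIN.XML in the UserSettings folder

            // Look up the HasChangeAccess setting for programID ZFKN_ImportQuery
            // and read its SettingValue ("true" or "false").
            XmlNode node = doc.SelectSingleNode(
                "//Setting[ProgramID='ZFKN_ImportQuery' and SettingName='HasChangeAccess']/SettingValue");
            return node != null && node.InnerText.Trim().ToLower() == "true";
        }
    }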

In most cases these parameters must not be changed. It is advised that a developer at Cranswick Computing is consulted before changing these settings.


MANUALLY ADDING IMPORT LINES

In some cases, such as for BMW suppliers, it may be required to add an import line manually that would not be in the K & N Status Report XML file. To add a line manually click the Add Import Line button. The following screen will be displayed:

The information required must be entered in the fields provided and when the Save button is pressed the line will be added. After the line has been added the Import Data function must be run to validate and import the added line. The Line Type can be K & N Sea Freight, K & N Air Freight or BMW Freight.

Only lines with a line type of K & N Sea Freight will be considered when exporting the K & N Status Report. All other line types are simply stored in the system and can be queried using the sort, filter and query functions. These lines can also be receipted into Syspro using the system functions.


3. QUERYING THE DATABASE

Once the data is in the system's database it can be changed, queried, deleted or receipted into Syspro. All of this functionality is provided on the Query Database screen.

To make the system more efficient, a default of 2 months of data is retrieved (this default can be changed in the security settings). The date range is based on the ETS Date. To retrieve the data from the database, click the Refresh Data button. This will fetch 2 months of data and display it as shown below:

If the user needs to see older records the Modify Query Batch Criteria function can be used to change the range of dates fetched from the database. Clicking on the Modify Query Batch Criteria will display the following screen:

Any range of dates can be selected; however, as the database grows in size, larger query sets will take longer to fetch from the database, making the program less efficient. The user must use his/her discretion when selecting date ranges.



FILTERING THE DATA

Once a batch of data has been fetched from the database it can be filtered based on a number of filter criteria. Clicking on the Modify Filter Criteria button will display the following screen:

The filtering process searches for substrings in the selected field. For example, if the user wishes to see all suppliers that deal with fasteners the text Fastener can be typed in the supplier name field and the Apply Filters button pressed. Only lines that have the string Fastener within the supplier name will then be displayed, such as Fasteners are us, ABC Fasteners etc.
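
One way a substring filter like this could be applied to the batch of records held on the client is with a DataView row filter (a sketch only; it assumes the batch is cached in a DataTable with a SupplierName column):

    using System.Data;

    class FilterSketch
    {
        static DataView ApplySupplierFilter(DataTable batch, string text)
        {
            DataView view = new DataView(batch);
            // LIKE '%Fastener%' matches the string anywhere in the supplier name,
            // so both "Fasteners are us" and "ABC Fasteners" are displayed.
            view.RowFilter = "SupplierName LIKE '%" + text.Replace("'", "''") + "%'";
            return view;
        }
    }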

An exception to this is the date fields and the Status field. For date fields a single date is selected and all lines with matching dates will be displayed. The status field dropdown list allows the user to filter by a single status at a time.

Filters can be concatenated. For example if the user wants to see all lines for a single container CONTAINER_1 but only those for part ABC, then CONTAINER_1 would be entered in the Container Number field and ABC would be entered in the Part Number field.

It must be noted that the filters only apply to the batch of data that was extracted from the database when the Refresh Data or Modify Query Batch Criteria functions were executed. Filtering by a particular container number, for example, does not guarantee that all lines for that container number will be displayed. If one line for the container number has an ETS Date within the selected range at the time of the batch query but another has an ETS Date outside the selected range, then only one line for the container number will be displayed even if the container number filter is used.



PURGING OLD DATA

At some stage data will become too old to be of any interest. In this case the data can be purged using the Purge Old Data function. Clicking the Purge Old Data button will display the following screen:

All records with an ETS date older than the selected date will be deleted.

This function can NOT be undone so care must be taken when using it.

CHANGING DATA MANUALLY

Changing data in the system will be a constant requirement, for example changing the status of a line, and this is done using the Change Selected Line function. First a line must be selected by clicking the grey area to the left of the line and then clicking on the Change Selected Line button. The following screen will be displayed:

Because each line is uniquely identified by the purchase order number - line number - container number combination, none of these fields can be changed. The value of all other fields can be manually changed. The same validations performed during the IMPORT DATA procedure will be applied here except for the EVALUATION of changes to the line as it is assumed that the user changed the line on purpose.

If an error is found a message will be displayed indicating this as shown below:

To make it easier to find a supplier or part number, BROWSE functions are included, indicated by the magnifying glass buttons on the right of the part number and supplier code fields. Clicking one of these buttons will display a screen like this:

The list will contain all suppliers or part numbers in Syspro. The filters can be used to make finding a supplier or part number easier.

The filtering process searches for substrings in the selected field. For example, if the user wishes to see all suppliers that deal with fasteners the text Fastener can be typed in the supplier name field and then the TAB key pressed. Only lines that have the string Fastener within the supplier name will then be displayed, such as Fasteners are us, ABC Fasteners etc.



DELETING LINES

In some cases a line can be found to be invalid or incorrect and it must be deleted. To do this a line must be selected by clicking the grey area to the left of the line and then clicking on the Delete Selected Line button. A confirmation message will be displayed and if the user accepts, the line will be permanently deleted.

This function can NOT be undone so care must be taken when using it.

RECEIPTING LINES INTO SYSPRO

At some stage during the process the items must be receipted into inspection and a GRN created. This can be done at a line level or at a container level.

To receipt a single line, select the line by clicking the grey area to the left of the line and then click the Receipt Selected Line button. The system will then perform a purchase order receipt in Syspro. The fields used are as follows (a sketch of the mapping follows the list):

  • Purchase Order Number – Purchase Order Number
  • Purchase Order Line – Purchase Order Line
  • Order Quantity – Order Quantity
  • Container Number – Notation
  • Bin – Bin
  • Delivery Note – Lot Number
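
As a sketch of that mapping (the ISysproClient wrapper below is a hypothetical stand-in for the call to Syspro's purchase order receipt business object, not the real e.net API):

    class ImportLine
    {
        public string POnumber, ContainerNumber, Bin, DeliveryNote;
        public int POline;
        public decimal OrderQty;
    }

    interface ISysproClient
    {
        // Hypothetical helper; the real system goes through Syspro's
        // business objects over a web service.
        void PurchaseOrderReceipt(string poNumber, int poLine, decimal qty,
            string notation, string bin, string lotNumber);
    }

    class ReceiptSketch
    {
        static void ReceiptLine(ImportLine line, ISysproClient syspro)
        {
            syspro.PurchaseOrderReceipt(
                line.POnumber,        // Purchase Order Number
                line.POline,          // Purchase Order Line
                line.OrderQty,        // Order Quantity
                line.ContainerNumber, // receipted as the Notation
                line.Bin,             // Bin
                line.DeliveryNote);   // receipted as the Lot Number
        }
    }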

If an error occurs during the receipt process an error message will be displayed:

Once a line has been successfully receipted it will change to green. Green lines cannot be receipted again.

To receipt a container, the filters must first be used to ensure that only a single container's lines are displayed. If lines for more than one container are displayed, an error message will indicate this. Only the container number filter must be active at the time of receipting a container, to avoid missing lines. If other filters are active at the time of receipting a container, an error message will indicate this.

As noted above, the filters only apply to the batch of data that was extracted from the database when the Refresh Data or Modify Query Batch Criteria functions were executed. Filtering by a particular container number does not guarantee that all lines for that container number will be displayed: if one line for the container number has an ETS Date within the selected range at the time of the batch query but another has an ETS Date outside the selected range, then only one line will be displayed even if the container number filter is used.

Once all the container's lines are displayed correctly, the container can be receipted by clicking the Receipt Container button. The program will attempt to receipt each line for the container separately. If a line has already been receipted it will simply be skipped. If an error occurs while receipting a line, the program will move on to the next line until all the lines have been attempted. At the end of the process any errors that occurred will be displayed in a message box. Once the errors have been corrected, the user can try to receipt the container again, or just the line or lines that failed.



COMMENTS

Comments can be maintained against each line in the database. This can be useful for providing extra information about the status of a particular line. To maintain comments for a line select the line by clicking in the grey area to the left of the line and then click the Comments button. The following screen will be displayed:

Edit the comments and click the Ok button to save your changes.

STATUS TABLE MAINTENANCE

The status values used in the system are stored in a status table. These values can be maintained using the Maintain Status Table function. Clicking on the Maintain Status Table button will display the following screen:

This function works like an Excel spreadsheet. To add a new line, simply type in the fields next to the * on the bottom left of the screen. To change a value, click inside the field and change the cell's value. To delete a line, select the line by clicking on the grey area to the left of the line and hit the DELETE key on your keyboard.



BIN TABLE MAINTENANCE

The bin values used in the system are automatically generated each time a new container number is added or imported: the last used bin number is incremented by 1, giving the next bin number to use. Because this process is automated these bin values should never need to be maintained, but if for some reason the user must maintain a bin number, the values are stored in a bin table. These values can be maintained using the Maintain Bin Table function. Clicking on the Maintain Bin Table button will display the following screen:

This function works like an Excel spreadsheet. To add a new line, simply type in the fields next to the * on the bottom left of the screen. To change a value, click inside the field and change the cell's value. To delete a line, select the line by clicking on the grey area to the left of the line and hit the DELETE key on your keyboard.



LOCTON SUPPLIER TABLE MAINTENANCE

The list of Locton supplier values used in the system is stored in a table. These values can be maintained using the Maintain Locton Suppliers function. Clicking on the Maintain Locton Suppliers button will display the following screen:

This function works like an Excel spreadsheet. To add a new line, simply type in the fields next to the * on the bottom left of the screen. To change a value, click inside the field and change the cell's value. To delete a line, select the line by clicking on the grey area to the left of the line and hit the DELETE key on your keyboard.


Saturday, September 13, 2008

Web Site Development Template

The main thing I am trying to do is simplify the process of developing small personal or company web sites. I think that there are four main concerns:

  • Online content management.
  • CSS style and theme selection and editing.
  • Application functionality.
  • Security.

I first started out using Stringbeans Portal (http://www.nabh.com/projects/sbportal) because I liked the idea of being able to build a site up in a grid-like manner, by either manually editing the content online or by including application functionality in the form of portlets.

I had a number of problems with Stringbeans, such as the lack of communication between portlets, and its layout is heavily table-based.

I started to use Java Studio Creator (JSC), which has since been replaced by NetBeans 6.1, to develop simple web applications. I made use of the drag-and-drop JSF web components that make creating web applications very easy.

I created a portal like project template that uses CSS to control its layout and styling.

The basic idea is to use web pages created in JSC or NetBeans to control the layout, navigation, security, login and registration of the site, but to store all actual site content as HTML in a MySQL database. There is only one actual JSF web page per page layout type (e.g. two-column wide/narrow) and all of the site's actual content pages are simply records in MySQL database tables.

So to navigate to the About Us page in the site the URL would be something like this:

…/Faces/Layout_2_col_NW.jsp?PageName=AboutUs

where the application root is www.sastudent/SAstudent/

When the Layout_2_col_NW.jsp page loads the PageName URL parameter is used to fetch the actual page content from the MySQL database. The Layout_2_col_NW.jsp page also contains logic to check if the user needs to be logged in to view the page to be loaded and if so they will first be redirected to the login page as shown below. Security could be implemented using Acegi and Spring.

http://www.springframework.org/

http://www.acegisecurity.org/





Also, if the user is a site administrator, edit links will be displayed for each grid section of the page. If the administrator clicks one of these links they will be redirected to the content editor page, as shown below, where they can edit the HTML content for that grid section.

Even the tab menu at the top of the home page is a simple HTML and CSS based menu that can be edited online using this HTML editor.

The TinyMCE editor is very versatile and images, tables and even multimedia can be added to the page content.





TinyMCE is used as the HTML editor. http://tinymce.moxiecode.com/

Using this method a new page can be added to the site by simply inserting a record into a table in the MySQL database and the page’s content can then be edited and styled online.

This means that a JSF web page can be created and used to add, change and delete the site's pages online, providing portal-like functionality.

Because the page layout and content is styled using CSS (NO TABLES) it is very easy to change the look and feel of the site.

TinyMCE also allows HTML templates to be created so inserting commonly used content such as a quick links box with rounded corners etc is very easy.

I am planning on using IFrames to include actual web application GUIs within standard site pages. This means the business logic could be handled by a completely separate web application that has access to the same MySQL database as the presentation MVC application, i.e. this project template.

An example of a page in a site using this project template is shown below. It does not look like much at this stage but I have been focusing more on getting the project template working and not so much on the actual site content or styling.




So basically what I want to be able to do is load the project template, deployed as a simple .war file, and from then on edit the site's contents online. New pages can be added online and the CSS style rules can also be changed online. The simple tab-based menu and the side menu can both be maintained online.

So once the site is loaded, graphic designers can then carry on styling the site and administrators can carry on adding the content, while the development team focuses more on the application development side.

This project template is the MVC part and I am using the Spring application framework as the J2EE backbone for the application business logic etc and possibly Hibernate as the data access layer.

This will mean a J2EE web application based on an n-tier architecture.

I do know of some existing content management systems and portals, but they are all a bit more complex than what I need and I think they would overcomplicate things unnecessarily.

Friday, September 12, 2008

Shipment Tracking System Feasibility Document

FILE IMPORT AND VALIDATION

The file currently returned by Kuehne & Nagel is a Microsoft Excel file containing, among others, the following columns:

Supplier name
Supplier code
Purchase order number
Purchase order line number
Ex Works date
Supplier delivery note number
Supplier invoice date
Supplier invoice number
Invoice value in Euros
Part value in Euros
Part number
Order quantity
Container size
Container number
Vessel
Voyage
Estimated ship date
Estimated arrival date in Cape Town
Status

A program will be created to validate and import the data in this file into the system. The user will be required to save the file attachment from the email message onto their computer. The program will then ask the user for the location of the file to import. This will allow the user to overwrite the existing file each time when saving the attachment or save each file received with a new file name in order to keep a history of files received.

In order for the program to import the file the format of the file must be fixed and must be agreed upon by all parties. The following column names are suggested:

SupplierName
SupplierCode
POnumber
POline
ExWorksdate
DeliveryNote
InvoiceDate
InvoiceNumber
InvoiceValue
PartValue
PartNumber
OrderQty
ContainerSize
ContainerNumber
Vessel
Voyage
ETSdate
ETAdate
Status

All date fields must be in the CCYY/MM/DD format for example 2006/08/28. An example of this file in the required format will be made available.
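
A check of that date format could be sketched as follows ("yyyy/MM/dd" is the .NET pattern corresponding to CCYY/MM/DD; the helper name is illustrative):

    using System;
    using System.Globalization;

    class DateFormatSketch
    {
        static bool IsValidImportDate(string text, out DateTime value)
        {
            return DateTime.TryParseExact(text, "yyyy/MM/dd",
                CultureInfo.InvariantCulture, DateTimeStyles.None, out value);
        }

        static void Main()
        {
            DateTime d;
            Console.WriteLine(IsValidImportDate("2006/08/28", out d)); // True
            Console.WriteLine(IsValidImportDate("28/08/2006", out d)); // False
        }
    }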

The first validation performed by the program will be to make sure that the format of the file is acceptable: all required columns, identified by their column names, must exist in the file, and the data contained in each column must be in the correct format, such as numeric, alphanumeric or date (CCYY/MM/DD). For example, a common problem with importing Excel files is that one or more blank lines exist at the end of the file or one or more blank columns exist. This problem can easily be corrected by simply deleting the blank information and saving the file. If the file is found to be in an incompatible format an error message will be displayed to inform the user.

Once the format of the file is correct the program will display the lines to be imported. The user will then select to import the displayed lines by means of an IMPORT BUTTON.

The program will attempt to validate all imported lines. It will be assumed that the purchase order, line number and container combination will uniquely identify each record. If the imported line does not pass one or more validation criteria, a descriptive error message will be displayed for each criterion that failed.

The validation criteria are as follows:

If no matching line exists (based on purchase order number, line and container) but there are lines that have the same purchase order number and line number, then those lines will be displayed to the user for checking. The user can then decide to accept or reject the new line. This validation criterion applies to the import function only.
If the line does exist but the vessel, voyage or ETS date has changed, the new value will be assumed to be correct and the program will automatically overwrite the existing values. This validation criterion applies to the import function only.
If the line does exist but values other than the vessel, voyage or ETS date have changed, the new values will be displayed to the user for checking. The user can decide to discard or accept the new values. This validation criterion applies to the import function only.
The supplier code must be a valid Syspro supplier.
The purchase order must exist and be in a valid status.
The purchase order line must exist.
The part must be a valid Syspro stock code.
The part number for the line must be correct.
The part value will be validated based on the rand value of the purchase order line in Syspro and the exchange rate for the invoice date. The exchange rate for the invoice date could be maintained in an external table or possibly using Syspro’s contract pricing.
The status will be validated against a status table.

The program will display valid and invalid lines differently (possibly using different colors or by means of a check box) and for invalid lines will display one or more error messages linked to the line. A printable exception report will display all invalid lines. Each validation could possibly be given a code so that exceptions displayed or printed can be filtered by type to allow specific users to resolve only errors of interest to them.

The routines that perform the above validations can also be run after corrections have been made by means of a VALIDATE button. The same routines will also be performed as part of the PURCHASE ORDER RECEIPT function.

If any changes made, such as expected due date, require modification of the purchase order details in Syspro the PORTOI business object will be used to make the required changes in Syspro.

Work to be done:

Design of tables and database structure…………………. 6 hours
Initial import of Excel file……………………………….. 6 hours
Validation functions…………………………………….. 20 hours
Exception screen display………………………………... 5 hours
Exception reports………………………………………… 10 hours
Syspro purchase order changes PORTOI………………. 20 hours


QUERY FUNCTIONS

Once the data has been imported into the system the data can be queried using the same program. To make the query more functional datasets will be used to temporarily store batches of data on the client side. Using this approach trips to the database to retrieve data are reduced decreasing network traffic.

Client side filters will then select the data actually displayed in the program to help the user find the desired records. This process should be quite efficient because the batch of records to filter is cached in memory on the client computer. The size of the record batch should be large enough to limit network traffic but small enough to remain efficient. To achieve this, the actual query to the database should include some less specific selection criteria, such as an ETS date range of two months defaulting to the current month and last month.

The following fields will be used to filter data on the client side:

Supplier name
Supplier code
Purchase order number
Delivery note number
Invoice number
Invoice date
Part number
Container number
Vessel
Voyage
ETA date

Each filter will apply a LIKE selection to the dataset. For example, if the user was looking for a part number that started with 6760330- but the part could be 6760330-01,6760330-02 or 6760330-03 the user would simply type 6760330- into the stock code filter text box and only lines that have part numbers that contain the character string 6760330- would be displayed. The character string can appear anywhere in the part number. If the user typed in 760330- into the stock code filter text box then lines containing the following part numbers would be displayed:

· 1760330-01
· 1760330-02
· 2760330-03
· 2760330-04
· 3760330-05
· 3760330-06
· 4760330-07
· 4760330-08

Combining more than one filter selection at a time will provide the user with a very fast and efficient way to find the desired records.

This program can be set up to run from an icon on the user's desktop or an icon in Syspro’s user defined menu.

Work to be done:

Query of record batch and required screen options………. 4 hours
Filter options screen………………………………………. 8 hours


LOGIN AND SECURITY

The program will limit user functionality based on command line arguments passed into the program to allow some users to edit the data but only allow others to query the data. The program will be run from within Syspro, so this will provide some level of security because in order to run the program the user would need to be logged into Syspro.

Using command line arguments such as MODE=EDIT or MODE=QUERY when setting up the user menu icons in Syspro would be the simplest and most cost effective way of providing security and limiting user functionality. With detailed installation manuals for the program, it should be quite simple for an administrator to manage user access and security for the system.

The Syspro operator code could be passed into the program, so that when any records in the database are changed the system could record which operator made the last change, as well as the date and time the change was made.
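
A sketch of that audit stamping idea follows; the ImportLines table and its column names are assumptions for illustration, not the final design:

    using System;
    using System.Data.SqlClient;

    class AuditStampSketch
    {
        // conn is assumed to be an open connection to the system's SQL database.
        static void StampChange(SqlConnection conn, int lineId, string operatorCode)
        {
            // Record who changed the record and when, using the Syspro
            // operator code passed into the program.
            SqlCommand cmd = new SqlCommand(
                "UPDATE ImportLines SET LastChangedBy = @op, LastChangedOn = @at " +
                "WHERE LineId = @id", conn);
            cmd.Parameters.AddWithValue("@op", operatorCode);
            cmd.Parameters.AddWithValue("@at", DateTime.Now);
            cmd.Parameters.AddWithValue("@id", lineId);
            cmd.ExecuteNonQuery();
        }
    }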

Work to be done:

Command line arguments passed into program……………. 3 hours



PURCHASE ORDER RECEIPTS

The receipt of goods into the AC warehouse will be done from within this program by clicking on the CONTAINER RECEIPT button. The user will make use of the container filter to ensure that only records for a single container are displayed. All other filters will automatically be cleared to ensure that all records for the container are included. If records for more than one container are included in the selection this function will display an error message indicating this.

Before continuing this function will run the validation routines detailed above and will only attempt the container receipt if there are no errors found. The function will use Syspro’s purchase order receipt business object (PORTOR) to receive all lines for the container into the AC warehouse.

Once a line has been receipted it will be flagged as such and will no longer be visible in the program to avoid duplicate receipts.

Work to be done:

Function validations…………………………………………. 5 hours
Purchase order receipt PORTOR……………………………. 20 hours


PURGING OF HISTORY DATA

A function will be provided in the program to allow the size of the database to be controlled by purging unwanted records from the database. The user will input the purge date and all records with a supplier invoice date older than the purge date will be permanently deleted.
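
At its core the purge could be a single parameterised delete, along these lines (the ImportLines table and InvoiceDate column are assumed names):

    using System;
    using System.Data.SqlClient;

    class PurgeSketch
    {
        static int PurgeOldData(string connectionString, DateTime purgeDate)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "DELETE FROM ImportLines WHERE InvoiceDate < @purgeDate", conn))
            {
                cmd.Parameters.AddWithValue("@purgeDate", purgeDate);
                conn.Open();
                return cmd.ExecuteNonQuery(); // number of records permanently deleted
            }
        }
    }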

Work to be done:

Create purge function………………………………………… 5 hours


SYSTEM DESIGN

The system will be written in Microsoft Visual Studio 2005 using C# and will make use of web services over the intranet as the interface between client and server. This will provide greater scalability and should reduce connectivity problems.



TOTAL HOURS…………………………………………………….. 112 hours

Shipment Tracking System


Quick links
Project Summary
Project Overview
My Roles
Feasibility Study
Detailed Requirements Analysis
System Design
First Release Prototype
Design Problems
Phase 2
Documentation
APPENDIX A - Feasibility Document
APPENDIX B - Cranswick Feasibility Document
APPENDIX C - Program design & implementation examples
Export XML Sequence Diagram
APPENDIX D - The Domain Model
APPENDIX E - Simple Browse Utility


Project Summary

Name: Shipment Tracking System
Dates: 2006/09/01 until 2007/02/01
Employer: SELF EMPLOYED – Working for Cranswick on a contract basis
Client: Cranswick Computing / ZF Lemforder S.A

Cranswick Computing – Web site: http://www.cranswick.co.za/
Email Adrian the project leader: adrian@cranswick.co.za

ZF Lemforder S.A - http://www.zf.com/f/content/en/import/2/startseite/standorte_1/afrika/Rosslyn.html


Syspro ERP – http://www.syspro.com/


Project Overview

This is my main project report. I have tried to keep it short and to the point in the main body of the report and have included all extra details in APPENDIX sections at the end of this document.

ZF Lemforder South Africa (Rosslyn branch) manufactures front & rear axles for various passenger and commercial vehicles, including certain BMW models. The existing system that was in place at ZF, and was to be replaced by this system, was intended to track the status of parts being shipped from suppliers in Germany to South Africa and then transported to the Rosslyn warehouse.

ZF runs Syspro ERP and purchase orders would be created in Syspro for parts required from various suppliers used to assemble the final manufactured items. Kuehne + Nagel, a global logistics company http://www.kuehne-nagel.com/ , was responsible for transporting the parts from the supplier in Germany to ZF.

Certain parts require special packaging before transport, and in these cases the items would need to go via LOCTON (a packaging company) before being placed in a container. Because a container's size is fixed, certain purchase order line quantities may have to be split across many containers, or a single purchase order may be split and have detail line items in many containers. The supplier invoices would also not always tie up exactly with the ZF purchase orders.

In order to ensure that the manufacturing lines at ZF function efficiently they need to know what parts will be arriving at the ZF warehouse each day and how many.

This system was intended to replace an existing system that had a VB6 program used to export data from the Syspro ERP SQL database. Various Microsoft Excel spreadsheets were used to query and maintain this data. The existing system relied heavily on manual processes and Excel and had very few data validation procedures. One of the spreadsheets was emailed to a person at Kuehne + Nagel, manually updated by this person, and then emailed back to ZF for further processing.

ZF contacted Adrian at Cranswick Computing and asked him to present them with a proposal for a new improved system aimed at addressing these core problems:

  • There are too many data sources (SQL, Excel) and user interfaces (VB6 program, Excel, Syspro) in various locations making the existing system difficult to use, support and maintain.

  • There are not enough validation procedures and too much room for human error.

  • Since the data is spread across many Excel spreadsheets and the Syspro SQL database,
    querying the data is difficult.

  • The required Syspro functions such as a purchase order receipt need to be manually run
    from within Syspro.

  • Kuehne + Nagel has an XML based EDI interface that ZF could make use of to automate
    certain data exchange procedures currently performed using Excel spreadsheets and
    email.

Adrian then contacted me and asked me if I would like to take on this project and I accepted.


My Roles

I took on the roles of systems analyst, programmer, technical architect and project manager and Adrian took on the roles of business analyst and project leader. The project was small enough that I did not have to make use of any of Cranswick’s in house programmers.


Feasibility Study

Adrian and I went to ZF for an initial meeting with the primary user of the existing system, ZF’s IT Manager and a ZF systems administrator to establish the base requirements for the proposed system.

With these in hand I had to produce estimates of cost and delivery timescales and suggested solutions to meet the basic requirements.

I compiled a document and sent it to Adrian for review. Please see APPENDIX A for a copy of the feasibility document.

Adrian then compiled another document containing any amendments and Cranswick’s markup on the proposed systems cost and emailed it to ZF for review. Please see APPENDIX B for an extract from this document.


Detailed Requirements Analysis

The development methodology that I used while working at Cranswick and during this project was similar to other known agile methods such as XP (Extreme Programming). Agile methods are a good fit with the type of projects Cranswick typically takes on. Most projects were fairly small to start with and in time new requirements were identified and incorporated into the projects. Ultimately most projects would evolve into fairly large complex systems.

To manage the evolution of systems better I have found that phasing projects can offer some advantages. It makes it easier for the customers to absorb the costs of software development.

Requirements can be classified in order of importance and projects can be phased to address crucial requirements first. The cost of the first phase of development will be lower and it is then more likely that the customer will be willing to go ahead with the project because of the reduced risk. If the benefits of the initial phase of development are actually realized by the customer they can decide to move on to the next phase. Alternatively the project can be terminated with less concern about the loss of the initial investment.

It also becomes more difficult for large, complex and expensive systems to pass user acceptance testing and I have found that identifying core requirements and addressing those in the first release while leaving less important requirements for implementation in a later release can be beneficial for all parties concerned.

In this case the project was split into 2 parts:

  1. The core functionality used internally at ZF.

  2. The automation of data transfer to and from Kuehne + Nagel using their EDI
    interface.

I visited ZF many times and held several fact finding interviews with the various stakeholders such as the primary users, managers and administrators.

One thing I have noticed is that initially people seem to be too busy to set aside time for the fact finding interviews and usually want to get them over and done with as quickly as possible. Once they begin to realize the time and effort they could potentially save as a result of the system being developed, they usually start to want to contribute more, and this is when most of the detailed requirements are identified. This is another reason why agile methods work well in these types of projects: users will usually want to contribute more after having seen the first release prototype.

In this case the main user of the existing system was a lady who was quite difficult to deal with at the start of the project. It was a huge inconvenience for her to sit in on meetings and fact finding interviews and she seemed to think it was all a waste of time. She did finally come round though but it did make the requirements analysis process more difficult than usual and as a result I did miss certain requirements to start with.

I try to identify as many of the requirements as I can in the timeframe allocated for requirements analysis, but in most cases this will not be ALL possible requirements. This goes against structured systems analysis and design methods (SSADM). I do always try to identify the main requirements driving the system first and then move on to the secondary requirements.

It is difficult to justify the cost of requirements analysis to users and the customer early on in the project and they start to get edgy until a prototype is produced and they can see the results of their efforts and those of the development team as well as the full potential of the software under development.

The requirements analysis phase was repeated several times during the development iterations of this project.

Once the detailed requirements were known I compiled a project plan and identified tasks to be completed. I also prioritized certain tasks and identified dependent tasks and their estimated development times based on the feasibility document. I then provided Adrian with the information for review, and we agreed on an estimated first prototype release date and what functionality it would include.


System Design

The first step was to produce a rough use case diagram. I don’t usually use CASE tools to do this, especially for smaller projects, as I see modeling more as a design tool than as a way to record the requirements. Most models produced while working at Cranswick were sketched on paper or a white board and discarded after use. In some cases if the concepts were crucial we would make use of CASE tools or a white board that can print what is on it.

For larger more complex projects with bigger budgets and more resources such as time, developers and analysts I would do things more by the book. I would then use CASE tools to record the requirements using models, a requirements catalogue and a data dictionary etc.

I think that models have 2 main uses:

  1. As a design tool used to help understand complex systems through abstraction and increasing levels of detail.

  2. To transfer understanding of the requirements from one person to another.

When there are many analysts and programmers involved in the development process and where the programmers do not interact directly with the users or the customer they must then rely on the ability of the analysts to gain a full understanding of the requirements. The analysts must then transfer this understanding on to the programmers and for this purpose models are very useful.

In my case I was usually present at meetings, prototype demos and fact finding interviews, or at the very least was in direct contact with another member of my development team who was present. As a result I think there was less need for the use of models to record the requirements.

The methods taught in books and training courses such as the MCSD (modeling in the MCSD course helps to produce a good database design but does not help with the actual system design) usually follow a sequence of steps. Each step produces a deliverable that is used as input into the next step, as is the case with OO design. So the steps could be: first produce use cases and the use case diagram, then a system sequence diagram, then produce a domain model, then interaction diagrams, then the class diagram and finally some state transition diagrams and so on.

Following a fixed process worked well for me when I didn’t have the experience to make decisions on my own and needed more guidance. Now I try and understand what these steps are trying to achieve rather than just blindly following them.

A good example of this is the comparison of the business requirements modeling course using Visio in the MCSD track and the use of entity relationship modeling and normalization. Both methods can arrive at a database in fifth normal form but Visio hides a good deal of the decision making process from the user. The user just needs to know what to do next and not so much why they are doing it.

Although this can be an advantage, especially for junior developers, I think it is better to understand why a method works well. This way it can be adapted to better suit your needs, and it also helps in choosing between similar methods. Certain methods are not always suitable for the task at hand and in some cases the benefits gained from their use may not outweigh the cost of using them in terms of time and effort.

I think that ER modeling and normalization can be a much faster way to generate a reasonable database design than using Visio and Object Role Modeling. The process of normalization is quite complex and can take a good deal of time itself especially if you want a database in fifth normal form.

However if views, stored procedures, Datasets and Data Adapters are used to provide good database independence to application code then a database in third normal form may be fine to start with. At a later stage the database design can be improved upon and because of the built in data independence the changes in the database design will most likely have very little impact on the application code.

From what I understand, the point of OO design is to arrive at a good responsibility driven design where each object is responsible for a clearly defined task and performs only that task. The three main principles are expert doer, high cohesion and low coupling. By making use of well designed interfaces the implementation of an object can be changed without affecting the other objects it interacts with. Also, if there is an error in the system it is easy to trace which object was responsible for the task that resulted in the error.

Basically, good OO design results in a system that is easier to test, debug and maintain, and one that will respond to changes in the requirements more gracefully. Things like Dependency Injection (DI) and Aspect Oriented Programming (AOP), both used in the Spring application framework, can take this idea even further.
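
To illustrate the DI idea, here is a minimal constructor-injection sketch. The interface and class names are illustrative only, not the actual system’s classes:

using System;
using System.Data;

public interface IShippingService
{
    DataSet GetOpenOrders(DateTime fromDate, DateTime toDate);
}

public class OrderController
{
    private readonly IShippingService service;

    //The dependency is injected rather than constructed internally,
    //so a test stub implementing IShippingService can be swapped in
    //without changing this class
    public OrderController(IShippingService service)
    {
        this.service = service;
    }

    public DataSet FetchOrders(DateTime fromDate, DateTime toDate)
    {
        return service.GetOpenOrders(fromDate, toDate);
    }
}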

I try to identify areas in the design that are unclear to me and I use whatever model would help me to understand the problem better.

For the use cases & use case diagram of this system and some example code please see APPENDIX C.

Entity Relationship modeling would normally be used to produce a good database design: functional dependencies, join dependencies and multi-valued dependencies would be identified and used to normalize the database by decomposing relations into smaller relations. In this case, however, the database is quite simple, so I will not give details of the database design process.

Most of the systems I have worked on since moving from COBOL client-server development to .NET (n-tier) have made use of web services and have been partitioned into 3 layers:

  1. The data layer, which consists of one or more CISAM file data stores (linked to via ODBC), SQL databases, stored procedures, views and functions.

  2. The business logic layer, which consists of web services, related classes and interfaces to integrated systems such as Syspro’s E.net business objects.

  3. The presentation layer, which could be a desktop application, a web front end or a mobile device.

This architecture has worked well in the past for a number of reasons:

  • It offers better scalability if the system grows larger in a short space of time, in that the processing load can be spread horizontally over more than one server while the interface to the presentation layer remains unchanged.

  • It offers automated and remote deployment options.

  • It is easier to integrate multiple data sources and to provide a higher level of data independence to application code.

  • It is easy to change from local access to the business logic layer, to remote access such as a web interface or mobile device.

  • There are fewer communication problems between client and server, and better security, because port 80 is usually open on most firewalls.

Using this architecture for very small applications may seem excessive, but at most customer sites I have worked at, smaller systems usually evolved into more complex ones over time, and in those cases the extra effort and complexity paid off.

For the domain model for this system please see APPENDIX D .

Because most of the systems we developed at Cranswick were once-off systems and could not be resold to other customers, the only opportunities for software reuse were in the form of utilities and components. We were constantly on the lookout for repetition of functionality across systems. I developed a number of utilities and components that could be reused, and some of these were used in this system.

One example is the automated deployment service. A runner application is installed on the client machine and is passed command line parameters relating to:

  • The program to be run

  • The location of the UserSetting.xml file

  • The Syspro operator code to identify who is running the program.

The UserSetting.xml file contains:

  • The path to the deployment web service

  • The username & password for security checks

  • Other details about the user.

The runner application will fetch the assembly details from the deployment web service and check the version of the assembly on the client’s machine against the latest version on the server. If the server version is newer than the client version, the entire assembly, including all of its referenced DLLs, will be streamed from the server to the client.

Once the latest version exists on the client the runner application will load the assembly, create an instance of it and then run it to start the actual application. The entire process is seamless to the user and all they need to do is click an icon in their Syspro menu.
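
A minimal sketch of the runner’s core flow is shown below. It assumes a hypothetical deployment service interface with GetLatestVersion() and GetAssemblyBytes() methods, and an entry class named ClientApp.MainForm with a public Run() method; none of these names are from the actual system:

using System;
using System.IO;
using System.Reflection;

public interface IDeploymentService
{
    string GetLatestVersion();
    byte[] GetAssemblyBytes();
}

public class Runner
{
    public void LaunchApplication(string localPath, IDeploymentService ws)
    {
        Version serverVersion = new Version(ws.GetLatestVersion());
        Version localVersion = File.Exists(localPath)
            ? AssemblyName.GetAssemblyName(localPath).Version
            : new Version(0, 0);

        //Stream the newer build down from the server if required
        if (serverVersion > localVersion)
        {
            File.WriteAllBytes(localPath, ws.GetAssemblyBytes());
        }

        //Load the assembly, create an instance of the entry class
        //and invoke its Run() method to start the actual application
        Assembly app = Assembly.LoadFrom(localPath);
        object instance = app.CreateInstance("ClientApp.MainForm");
        app.GetType("ClientApp.MainForm").GetMethod("Run").Invoke(instance, null);
    }
}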

The automated deployment service was created for a mobile sales order application and was used to deploy applications to mobile phones.

Another example of the use of previously developed components in this system is the simple browse utility. Browsing a list of items such as supplier codes or stock codes is a very common operation. With a few lines of code and a simple SQL view, a fully functional browse form with filter selections is generated. Please see APPENDIX E for more information.


First Release Prototype

After about 3 or 4 weeks Adrian and I went to ZF to present the first release prototype at a meeting attended by the IT manager, 2 of the main system users and an accounts manager who had become interested in the system. I ran the presentation using a projector connected to my laptop, and Adrian was on hand to clarify certain points if required.

During the meeting there were a few requirements identified that were not in the original list of detailed requirements. Most of them were fairly simple such as:

  • A facility to manually add a line to import rather than importing an XML file.

  • The conversion rate used to convert Euro values into Rand values was based on the date of the supplier invoice and the exchange rate for that date. Because Syspro’s internal reports (some standard and some written using Syspro’s report writer) had nowhere in the standard Syspro tables to find the conversion rate used, we had to populate one of the user-defined fields on a standard Syspro table with the conversion rate so that it could be displayed on management reports.

Some new requirements were more complex such as:

  • The process of receipting purchase orders in Syspro was changed. Purchase order items were now being receipted into inspection and, once inspection had passed, they would be receipted from inspection into the actual warehouse. I had originally designed the system to perform a single purchase order receipt directly into the warehouse.

This is a good example of a change in the requirements that would have been unavoidable regardless of how much effort was put into the requirements analysis process. This type of change is quite common when developing in a Syspro environment and is another reason agile methods are well suited.

In this case the change did not have a huge impact on the system code because all required changes were in the web service classes and the interface to the client application remained unchanged. This was a direct result of using the 3 tier architecture and using a good OO design process.

A good change control process is required to manage these types of changes. There is always a debate as to whether the functionality was part of the original requirements and cost estimate or constitutes an addition to the original requirements. Good documentation of the decision-making process is essential: who requested the change, when, why and who approved it. Vague requirements specifications leave loopholes, and the customer can then claim that they were clear about what they wanted but the analyst did not understand the requirements fully. Making sure that the customer, and in fact all stakeholders involved, sign off on the requirements specification before development commences helps to avoid these types of problems.

Although I believe there is a limit to how much time and effort should be allocated to the requirements analysis process, once the list of requirements has been identified it must be documented in a very clear, unambiguous manner. In some cases it may be necessary to split a single requirement into sub-requirements and specify which of these fall under the scope of the current project phase. If this is unclear, more fact finding interviews may be required.

We held many such prototype demo meetings during the course of the project and many new requirements were introduced.


Design Problems

Looking back at the system design and code I can now see many problems, as is usually the case when revisiting systems you developed some time ago. One of the first things I noticed is that the controller form frmMain became quite cluttered. It has too many attributes and methods, which suggests that other objects could have been identified and assigned responsibilities. This is exactly what the high cohesion design principle tries to avoid.

The same can be said for the shipping web service. It may have been a good idea to split the web service into two or more separate services, each offering an interface relating to a more specialized group of functions.

The convenience of double clicking a button and simply adding code can lead to this situation if not monitored. Constant redesign and refactoring of code is required as the system and its requirements evolve.

The use of components and utilities can also help with this, as common functionality is removed from the main application code and placed in the components. This allows the application code (i.e. the controller object) to focus more on what it needs to do and less on how to do it.


Phase 2

The second phase of the project involved automating the process of communication with the Kuehne + Nagel EDI interface. At first Kuehne + Nagel was going to handle the conversion from the ZF file format to their EDI file format and then once they had processed the file they would convert the result back into the ZF file format. This changed for some reason and Cranswick was then required to produce an XML file in the Kuehne + Nagel EDI format. They sent me an example XML file and an XSD schema. I started creating routines to convert the ZF file to the Kuehne + Nagel EDI format by reading a template XML file into a dataset and then working with the dataset in my program code.
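
A sketch of the template-based approach is shown below: read the Kuehne + Nagel example XML into a DataSet (shaped by the supplied XSD), populate it from the ZF data, then write the result back out. The file, table and column names here are placeholders, not the actual EDI format:

using System.Data;

public class EdiConverter
{
    public void ConvertToKnFormat(DataSet zfData, string outputPath)
    {
        //Shape the dataset from the schema, then load the template rows
        DataSet template = new DataSet();
        template.ReadXmlSchema("KN_EDI.xsd");
        template.ReadXml("KN_Template.xml");

        //Replace the template rows with rows mapped from the ZF data
        template.Tables["ShipmentLine"].Clear();
        foreach (DataRow row in zfData.Tables["ImportFile"].Rows)
        {
            DataRow target = template.Tables["ShipmentLine"].NewRow();
            target["PONumber"] = row["POnumber"];
            template.Tables["ShipmentLine"].Rows.Add(target);
        }

        template.WriteXml(outputPath);
    }
}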

At this stage I had to go to Dubai for my brother’s wedding and I had to hand over the project to an in-house developer at Cranswick. I spent a few days setting up a development environment with him on his machine, including all related databases, stored procedures, views, test data, etc. We also went through the system in detail so that he could understand the requirements and the design.

I made sure that he could contact me via email in case of an emergency, but the handover went smoothly and without any problems.


Documentation

Once a system nears completion, the user manual and technical installation instructions need to be produced. This is a very time-consuming and tedious task for most developers, but it is important and should not be neglected. I usually just create a Word document or a PDF file, but at Cranswick we did start experimenting with a program that records sound and a succession of screenshots to produce a useful image-based instructional video with voice instructions and explanations. It is much quicker and easier than typing out a Word document, and I think it could be more easily understood by some users.

For this system I only had time to produce a user manual, and because I handed the project over to a developer at Cranswick while the second phase of development was still in full swing, he agreed to take on the task of producing the technical installation manual.




<-----End of Report----->






APPENDIX A - Feasibility Document - Please see here.


APPENDIX B - Cranswick Feasibility Document



APPENDIX C - Program design & implementation examples

Some use cases identified



Use case UC-1: Fetch data to export


NOTE: Looking back at this use case I can see a problem. The process of separating the datasets or lists fetched from the web service is being done in the presentation layer. This goes against the design idea of keeping all business logic in the web service; it makes the code difficult to find and, as a result, makes the system more difficult to maintain. A better option would have been to fetch each list from the web service in turn using a separate web method. On the downside, this would mean that each web method that uses the open orders list would need to retrieve the list, and this would duplicate effort and place more load on the database and network (unless datasets are cached in the web service).
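
A sketch of that alternative is shown below: one web method per list, with the split performed in the business layer. The method names, the private helper and the column set are illustrative only:

using System;
using System.Data;
using System.Web.Services;

public class ShippingService : WebService
{
    //Hypothetical helper that fetches (or returns a cached copy of)
    //the full open orders dataset
    private DataSet GetAllOpenOrders(DateTime fromDate, DateTime toDate)
    {
        //Real version: execute ZF_GetOpenOrders and cache the result
        DataSet ds = new DataSet();
        DataTable t = ds.Tables.Add("Table");
        t.Columns.Add("POnumber", typeof(string));
        t.Columns.Add("LoctonSupplier", typeof(string));
        return ds;
    }

    [WebMethod(Description = "Returns only the K&N open orders.")]
    public DataSet GetKNOpenOrders(DateTime fromDate, DateTime toDate)
    {
        DataSet all = GetAllOpenOrders(fromDate, toDate);
        //The Locton split now happens here: rows with a null
        //LoctonSupplier column belong to the K&N list
        DataView view = new DataView(all.Tables["Table"],
            "LoctonSupplier IS NULL", "", DataViewRowState.CurrentRows);
        DataSet result = new DataSet();
        result.Tables.Add(view.ToTable("Table"));
        return result;
    }
}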

The use case diagram



NOTE: The Syspro ERP actor participates in the Receipt Purchase Order use case but not in any other use cases such as Fetch Open Orders. This is because Receipt Purchase Order is the only use case that makes use of Syspro’s E.NET business objects. The other use cases will fetch data from the Syspro SQL database directly so these functions are under the control of this system.

The table below shows some of the responsibilities identified that could be handled by various objects.



When a web reference to a web service is added in the client application the Visual Studio .NET IDE automatically generates a proxy class for you that is used as the interface to the web service in the presentation layer. In this case this proxy class is called ZFSS (ZF Shipping Service).

The sequence diagram shown below is for the Export XML use case. There are 2 operations:

  1. Fetch Export Data for review by the user
  2. Export XML Files

It is not very clear so I will provide a few details to indicate what is going on. There are 6 classes shown from left to right:

  1. frmMain: The main controller form of the application
  2. frmExport: The form controlling the Export functionality
  3. ProcessExportData: This is a dummy class created because the UML modeling tool I was using (ArgoUML) has a few problems. This class does not actually exist in the application and represents processing that happens within frmExport.
  4. ZFSS: The ZF Shipping Web Service interface.
  5. dsOpenOrders: An instance of the dsImportFile dataset class.
  6. dsExistingOrders: An instance of the dsImportFile dataset class.

The Fetch Export Data operation starts by calling the GetOpenOrders() method of the shipping web service and an instance of the dsImportFile dataset is returned. This dataset is split into 2 separate datasets, one for the K&N open orders and one for the Locton orders. Then the GetExistingOrders() method of the shipping web service is called and another instance of the dsImportFile dataset is returned. The K&N open orders list is combined with the existing orders list to produce the K&N status list. The three lists generated are shown to the user for review. The user can then decide to export the displayed lists out to XML files. That is the second operation shown in the sequence diagram.

Below the sequence diagram is the domain model diagram. This model would usually display the actors and how they associate with the objects in the domain model but they are not shown in this case.

To help understand the sequence diagram I include a screenshot of the Export XML form below:



I will now include code showing the process of retrieving the orders. This is a good example of the interaction between the presentation layer, the business layer and the data layer. This is how all the functionality in this application (and most other applications I have developed) is implemented.

Shown below is the code behind the Retrieve Orders button:

private void btnRetrieve_Click(object sender, EventArgs e)
{
    //Get open orders and all lines found in the status list
    try
    {
        this.Cursor = Cursors.WaitCursor;
        Application.DoEvents();
        dsOpenOrders = new dsImportFile();
        dsLoctonOrders = new dsImportFile();
        ZFSS.Service ZFSS_WS = new ZFKN_ImportQuery.ZFSS.Service();
        ZFSS_WS.Url = URL;
        int INDEX = cbStartDate.SelectedIndex;
        DateTime FromDate = Convert.ToDateTime(cbStartDate.SelectedItem);
        cbStartDate.SelectedIndex = cbStartDate.SelectedIndex +
            Convert.ToInt32(txtWeeks.Text);
        DateTime ToDate = Convert.ToDateTime(cbStartDate.SelectedItem);

        cbStartDate.SelectedIndex = INDEX;

        //FOR TESTING ONLY
        FromDate = Convert.ToDateTime("2005/05/14");
        ToDate = Convert.ToDateTime("2007/01/01");
        //FOR TESTING ONLY

        DataSet DS = ZFSS_WS.GetOpenOrders(FromDate, ToDate);

        //This dataset will contain all suppliers so it must be
        //split between K&N and LOCTON
        DataTable DT_LOCTON = new DataTable();
        DataTable DT_KN = new DataTable();
        DT_LOCTON.Merge(DS.Tables["Table"]);
        DT_KN.Merge(DS.Tables["Table"]);
        //NOTE: The query returning the rows of this table
        //used a left outer join against the Locton Supplier List
        //table so if it is not a Locton supplier the value will be null
        foreach (DataRow LR in DT_LOCTON.Rows)
        {
            if (LR.IsNull("LoctonSupplier"))
            {
                LR.Delete();
            }
        }
        //foreach (DataRow KNR in DT_KN.Rows)
        //{
        //    if (!KNR.IsNull("LoctonSupplier"))
        //    {
        //        KNR.Delete();
        //    }
        //}
        //Get rid of the LoctonSupplier columns
        DT_LOCTON.Columns.Remove("LoctonSupplier");
        DT_KN.Columns.Remove("LoctonSupplier");

        dsOpenOrders.ImportFile.Merge(DT_KN);
        dgvOpenOrders.DataSource = dsOpenOrders.ImportFile;

        dsLoctonOrders.ImportFile.Merge(DT_LOCTON);
        dgvLoctonOrders.DataSource = dsLoctonOrders.ImportFile;

        dsExistingOrders = new dsImportFile();
        ZFSS.dsShipping dsSP = ZFSS_WS.GetExistingOrders();
        ZFSS.dsShipping.ZF_ImportDataTable TABLE =
            new ZFKN_ImportQuery.ZFSS.dsShipping.ZF_ImportDataTable();
        TABLE.Merge(dsSP.ZF_Import);
        TABLE.Columns.Remove("Status");
        TABLE.Columns.Remove("PartValue");
        TABLE.Columns.Remove("Bin");
        TABLE.Columns.Remove("Receipted");
        TABLE.Columns.Remove("Operator");
        TABLE.Columns.Remove("Record_ID");
        TABLE.Columns.Remove("LineStatus");
        TABLE.Columns.Remove("LineType");
        TABLE.Columns.Remove("ImportFileDate");

        dsExistingOrders.ImportFile.Merge(TABLE);

        dsEXPORT = new dsImportFile();

        //To resolve duplicates where the line is still an open
        //order in Syspro but it has already been imported into the system,
        //delete all lines from dsEXPORT.ImportFile that are also
        //in dsExistingOrders.ImportFile based on P/O and Line No
        dsEXPORT.ImportFile.Merge(dsOpenOrders.ImportFile);
        foreach (dsImportFile.ImportFileRow R in dsExistingOrders.ImportFile.Rows)
        {
            string FILTER = "POnumber = '" + R.POnumber + "' and POline = " + R.POline;
            dsImportFile.ImportFileRow[] to_delete =
                (dsImportFile.ImportFileRow[])dsEXPORT.ImportFile.Select(FILTER);
            foreach (dsImportFile.ImportFileRow r in to_delete)
            {
                r.Delete();
            }
        }
        dsEXPORT.ImportFile.AcceptChanges();

        //Now ONLY the first week's open orders must be merged
        //into the existing orders list, so delete any row from
        //dsEXPORT.ImportFile that does not fall in the first week.
        //Add eight days to the from date to cover the first week inclusively.
        DateTime CheckDate = FromDate.AddDays(8);
        foreach (dsImportFile.ImportFileRow R in dsEXPORT.ImportFile.Rows)
        {
            if (R.ExWorksDate >= CheckDate)
            {
                R.Delete();
            }
        }
        dsEXPORT.ImportFile.AcceptChanges();

        dsEXPORT.ImportFile.Merge(dsExistingOrders.ImportFile);

        dgvExistingOrders.DataSource = dsEXPORT.ImportFile;
        dgvExistingOrders.AutoResizeColumns(
            DataGridViewAutoSizeColumnsMode.AllCells);
        this.Cursor = Cursors.Default;
    }
    catch (Exception exc)
    {
        this.Cursor = Cursors.Default;
        string MES = "The Export Function failed: " + exc.Message;
        string TITLE = "Export XML File Error";
        MessageBox.Show(MES, TITLE, MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}

First an instance of the web service proxy, ZFKN_ImportQuery.ZFSS.Service, is created. Note that the URL of the web service is set at run time. It is obtained from the UserSetting.xml file that is maintained by the user settings component in the utilities.dll. This makes the application easier to configure and support. The ZFSS_WS.GetOpenOrders(FromDate, ToDate) web method is then executed and it returns a DataSet object. The dataset returned contains both normal orders and LOCTON orders, so it is split into two separate lists. The lists are then bound to controls so that the user can review the information.

Next the ZFSS_WS.GetExistingOrders() is executed and the dataset returned is processed. The existing orders list is then bound to a control and the user is then given the opportunity to review the three lists generated and can decide to export the lists to XML.

The GetOpenOrders() web method is shown below:

[WebMethod(MessageName = "GetOpenOrders",
    Description = "This method returns all Open Purchase Orders. "
    + "It returns null indicating failure.")]
public DataSet GetOpenOrders(DateTime FromDate, DateTime ToDate)
{
    try
    {
        string sCONN = GetConnectionString();
        DataSet dsRET = null;
        object[] sp_params = new object[2];
        sp_params[0] = FromDate;
        sp_params[1] = ToDate;

        dsRET = SqlHelp.DAL.SqlHelper.ExecuteDataset(sCONN, "ZF_GetOpenOrders",
            sp_params);

        //If the dataset is null then throw an error
        if (dsRET == null)
        {
            throw new Exception("ZF_GetOpenOrders result DataSet was returned empty.");
        }

        return dsRET;
    }
    catch (Exception exc)
    {
        Service_EventLog.WriteEntry("Error in GetOpenOrders(): " + exc.Message,
            System.Diagnostics.EventLogEntryType.Error);
        return null;
    }
}

The method returns NULL to indicate failure and writes a log message to the web service event log if an error occurs. The method returns a dataset object that will be merged into an instance of the dsImportFile dataset in the calling procedure. The SqlHelper class hides the code that executes the stored procedure and populates a dataset from the result set returned by the stored procedure.
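
The SqlHelper call above goes to an existing data access helper. A minimal sketch of what an equivalent ExecuteDataset method might do internally is shown below; the actual SqlHelp.DAL.SqlHelper implementation may differ:

using System.Data;
using System.Data.SqlClient;

public static class SimpleSqlHelper
{
    public static DataSet ExecuteDataset(string connectionString,
        string storedProcName, params object[] parameterValues)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(storedProcName, conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();

            //Ask SQL Server for the procedure's parameter list, then
            //assign the supplied values in order (index 0 is @RETURN_VALUE)
            SqlCommandBuilder.DeriveParameters(cmd);
            for (int i = 0; i < parameterValues.Length; i++)
            {
                cmd.Parameters[i + 1].Value = parameterValues[i];
            }

            //Fill and return a dataset from the procedure's result set
            DataSet ds = new DataSet();
            new SqlDataAdapter(cmd).Fill(ds);
            return ds;
        }
    }
}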

The "ZF_GetOpenOrders" stored procedure is shown below:

CREATE PROCEDURE ZF_GetOpenOrders
(
    @FromDate datetime,
    @ToDate datetime
)
AS

SELECT
    SysproOutdoors.dbo.ApSupplier.SupplierName,
    SysproOutdoors.dbo.PorMasterDetail.MStockCode AS PartNumber,
    CAST(ROUND(SysproOutdoors.dbo.PorMasterDetail.MOrderQty, 0) AS int) AS OrderQty,
    SysproOutdoors.dbo.PorMasterDetail.MLatestDueDate AS ExWorksDate,
    SysproOutdoors.dbo.PorMasterDetail.PurchaseOrder AS POnumber,
    CAST(ROUND(SysproOutdoors.dbo.PorMasterDetail.Line, 0) AS int) AS POline,
    SysproOutdoors.dbo.PorMasterHdr.Supplier AS SupplierCode,
    ZF_LoctonSuppliers.SupplierCode AS LoctonSupplier
FROM
    SysproOutdoors.dbo.PorMasterDetail
    INNER JOIN SysproOutdoors.dbo.PorMasterHdr
        ON SysproOutdoors.dbo.PorMasterDetail.PurchaseOrder =
           SysproOutdoors.dbo.PorMasterHdr.PurchaseOrder
    INNER JOIN SysproOutdoors.dbo.ApSupplier
        ON SysproOutdoors.dbo.PorMasterHdr.Supplier =
           SysproOutdoors.dbo.ApSupplier.Supplier
    LEFT OUTER JOIN ZF_LoctonSuppliers
        ON SysproOutdoors.dbo.ApSupplier.Supplier =
           ZF_LoctonSuppliers.SupplierCode
WHERE
    (SysproOutdoors.dbo.PorMasterDetail.MLatestDueDate >= @FromDate)
    AND (SysproOutdoors.dbo.PorMasterDetail.MLatestDueDate < @ToDate)
    AND (SysproOutdoors.dbo.ApSupplier.Currency <> '$')

RETURN

Note that a left outer join is used on the LOCTON suppliers table so that if no LOCTON supplier is available, NULL will be returned. This fact is used in the logic that separates the returned dataset into normal suppliers and LOCTON suppliers. This stored procedure is located in the application’s database, called ZF_CRANSWICK. The Syspro ERP database is referenced by means of the SysproOutdoors.dbo prefix. I have seen developers place their stored procedures directly in Syspro’s database, but this is a bad practice as it clutters up the Syspro database and makes the system more difficult to maintain. If all related stored procedures and functions for an application are contained within its own database, the application is easier to maintain and support.

This code is not ideal because there is still too much processing performed in the presentation layer.

Export XML Sequence Diagram





APPENDIX D - The Domain Model

Domain Model






NOTE: In most cases the application’s database would be created on the same SQL server as the Syspro database. In the domain model only the Syspro database is shown, but the application would have its own database.

APPENDIX E - Simple Browse Utility

An example of the simple browse form is shown below:



The filters use a LIKE selection on the Code and Name strings to make finding records easier. For instance, if “Hardware” was typed into the name filter, only suppliers that contain the phrase “Hardware” in their name would be displayed. Because a view is used as the source of the data, it could filter suppliers by region or by the range of parts they supply to reduce the size of the returned dataset. Data from multiple tables can be joined to create the dataset. This can be very versatile.
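
As a sketch of how this filtering could be implemented on the client, the snippet below applies a LIKE row filter to the in-memory browse data. The control names txtNameFilter and dgvResults are hypothetical, and this method would live in the browse form’s code-behind:

private void ApplyNameFilter(DataTable browseData)
{
    //Escape single quotes so user input cannot break the filter string
    string phrase = txtNameFilter.Text.Replace("'", "''");

    //A DataView row filter performs the LIKE match in memory
    DataView view = new DataView(browseData);
    view.RowFilter = "SupplierName LIKE '%" + phrase + "%'";
    dgvResults.DataSource = view;
}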

The SQL statement in the vw_ApSupplier view, used as the source for the form data, is shown below:

SELECT Supplier, SupplierName
FROM SysproOutdoors.dbo.ApSupplier


The client side code used to invoke the browse form is shown below:

private void btnBrowseSupplier_Click(object sender, EventArgs e)
{
    try
    {
        Utilities.frmSimpleBrowse SB = new Utilities.frmSimpleBrowse();
        string CODE_NAME = "Supplier";
        string DESC_NAME = "SupplierName";
        string CODE_TXT = "Supplier Code";
        string DESC_TXT = "Supplier Name";
        string SOURCE_OBJECT_NAME = "vw_ApSupplier";
        string FORM_HEADING_TEXT = "Supplier Browse";

        SB.CODE_NAME = CODE_NAME;
        SB.CODE_TXT = CODE_TXT;
        SB.DESC_NAME = DESC_NAME;
        SB.DESC_TXT = DESC_TXT;
        SB.SOURCE_OBJECT_NAME = SOURCE_OBJECT_NAME;
        SB.FORM_HEADING_TEXT = FORM_HEADING_TEXT;

        //Now get the data
        ZFSS.Service ZFSS_WS = new ZFKN_ImportQuery.ZFSS.Service();
        ZFSS.dsSimpleBrowseData.BrowseDataDataTable SBD =
            ZFSS_WS.GetSimpleBrowseData(CODE_NAME, DESC_NAME, SOURCE_OBJECT_NAME);
        SB.SIMPLE_BROWSE_DATA = new Utilities.dsSimpleBrowseData.BrowseDataDataTable();
        SB.SIMPLE_BROWSE_DATA.Merge(SBD);

        if (SB.ShowDialog() == DialogResult.OK)
        {
            txtSupplierCode.Text = SB.RETURN_CODE;
            txtSupplierName.Text = SB.RETURN_DESC;
        }
    }
    catch (Exception exc)
    {
        string MES = "The following error occurred while browsing for a supplier: " +
            exc.Message;
        string TITLE = "Supplier Browse Error";
        MessageBox.Show(MES, TITLE, MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}


The code simply executes the GetSimpleBrowseData() web method to obtain a simple browse dataset, dsSimpleBrowseData, that contains the view data. It then creates an instance of the simple browse form, sets a few parameters, passes it the dataset containing the browse data and then shows the form.

The web method code is shown below:

[WebMethod(MessageName = "GetSimpleBrowseData",
    Description = "This method returns the data required for a simple browse. "
    + "It returns null indicating failure.")]
public Utilities.dsSimpleBrowseData.BrowseDataDataTable GetSimpleBrowseData(
    string CODE_NAME, string DESC_NAME, string SOURCE_OBJECT_NAME)
{
    try
    {
        Utilities.Utility U = new Utilities.Utility();
        string sCONN = GetConnectionString();
        return U.GetSimpleBrowseData(sCONN, CODE_NAME, DESC_NAME, SOURCE_OBJECT_NAME);
    }
    catch (Exception exc)
    {
        Service_EventLog.WriteEntry("Error in GetSimpleBrowseData(): " +
            exc.Message, System.Diagnostics.EventLogEntryType.Error);
        return null;
    }
}


The web method is currently in the shipping web service but could easily be moved into a utilities web service.

To use this component each application will need a reference to the Utilities.dll and the web service that contains this web method.