MQ and FTP versus SOA
- Mark Skilton
- Jul 28, 2006
- 5 min read
The Vehicle Direct to Dealer Invoice (DDI) service is a combination of technologies and service-oriented architecture (SOA) design practices.
Legacy mainframe systems, with functionality coded in COBOL and other proprietary standards, also need to be considered.
The question is which technologies are able to deliver these services.
Options being considered
Option 1
DataPower – network-based routing and services management
Ability to execute web service transactions, with built-in security and significant cost-of-ownership benefits. DataPower boasts that it has no memory overhead, so it is unclear whether it supports the publish-subscribe model: it has no table storage capability, which would be needed for persistence of published messages.
Option 2
eHub – publish-subscribe hub. Pub-sub is the eHub solution model. The limits of eHub are well known, notably the 2MB maximum message size.
Option 3
PM4Data – MQ Broker. Looking at the option to use this new technology.
Requirements
The requirements for the DDI solution include the following:
Request-reply messaging
This assumes that a number of the messages and functions will be interactive and near real time, either two-phase asynchronous or synchronous. The current transaction response times of these systems may not support this service and need to be assessed, given that the majority are said to function as batch messaging. Internal functions that are wrapped may offer transactions that operate in real time. The issue here would be latency effects on the remaining legacy systems.
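To illustrate the asynchronous request-reply pattern in the abstract (not tied to any specific product), each request carries a correlation ID that is echoed on the reply so the caller can match them up; the queue and field names below are hypothetical stand-ins for MQ queues:

```python
import queue
import uuid

# In-memory stand-ins for request and reply queues (e.g. MQ queues).
request_q: "queue.Queue[dict]" = queue.Queue()
reply_q: "queue.Queue[dict]" = queue.Queue()

def send_request(payload: dict) -> str:
    """Send a request and return the correlation ID used to match the reply."""
    correlation_id = str(uuid.uuid4())
    request_q.put({"correlation_id": correlation_id, "payload": payload})
    return correlation_id

def service_worker() -> None:
    """Legacy-side worker: consume one request and publish a reply."""
    msg = request_q.get()
    reply_q.put({"correlation_id": msg["correlation_id"],
                 "payload": {"status": "ok", "echo": msg["payload"]}})

def receive_reply(correlation_id: str, timeout: float = 1.0) -> dict:
    """Block until the reply with the matching correlation ID arrives."""
    while True:
        msg = reply_q.get(timeout=timeout)
        if msg["correlation_id"] == correlation_id:
            return msg["payload"]

cid = send_request({"vehicle_id": "V123"})
service_worker()
result = receive_reply(cid)
print(result["status"])  # prints "ok"
```

The point of the sketch is that the caller never blocks on the legacy system directly; it only waits on the reply queue, which is where the latency concern above surfaces.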
Publish-subscribe
Conversion of some master data messaging into a common information hub could be appropriate for some of the services. Notably, there are two modes of operation:
Pushing out events for changes that need a push operation to target systems
An update replication action, where a changed or newly created piece of master data needs to be presented to all systems that need to replicate the event or data.
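Both modes can be sketched with a minimal hub that pushes events to current subscribers and also retains published messages so a late-joining system can replay them – the retention being exactly the persistence a pass-through appliance lacks. The class and topic names are hypothetical:

```python
from collections import defaultdict

class Hub:
    """Minimal pub-sub hub. Retains published messages so that late
    subscribers can replay them (the persistence of published messages
    that a pure pass-through device would not provide)."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> callbacks
        self.retained = defaultdict(list)     # topic -> message history

    def subscribe(self, topic, callback, replay=False):
        self.subscribers[topic].append(callback)
        if replay:  # replication mode: catch up on past master-data changes
            for msg in self.retained[topic]:
                callback(msg)

    def publish(self, topic, msg):
        self.retained[topic].append(msg)    # persist for later replication
        for cb in self.subscribers[topic]:  # push the event to targets
            cb(msg)

hub = Hub()
received = []
hub.publish("master-data/dealer", {"dealer_id": 42, "name": "Acme"})
hub.subscribe("master-data/dealer", received.append, replay=True)
print(len(received))  # prints 1
```

Here the subscriber joins after the publish but still sees the change, which is the behaviour master-data replication needs.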
Some elements of front-end service choreography
The current wisdom is to design a portal-like facility (e4 style) to establish the user interaction of services; that is, to start at the MMS end and consolidate into a set of services. While this is possible, it presents challenges in agreeing country-based standards, and then of course the need to link to back-end systems arises.
Data translation, protocol translation and time-frame transformation services
Some of the data will need to be translated from one format to another
Some of the data will need to be converted from FTP file formats to an XML-style format
Service enablement
Legacy systems will have functionality that will need to be wrapped – this will create one form of services. This will mean deconstructing an application domain, or constructing additional services within it. In the worst case this is a complete rewrite of the application to decouple components and logic.
Messages will need to be converted to web services – this is another form of services
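The wrapping idea above can be sketched as a thin service facade in front of a legacy routine; everything here is hypothetical, with a stub standing in for a COBOL function that would really be reached through a screen or batch interface:

```python
def legacy_price_lookup(part_code):
    """Stub standing in for a legacy COBOL pricing routine."""
    return {"P100": 250.0}.get(part_code)

def price_service(request: dict) -> dict:
    """Service wrapper: translate a message-style request into a legacy
    call and shape the result as a service response, without changing
    the legacy code itself."""
    price = legacy_price_lookup(request["part_code"])
    if price is None:
        return {"status": "not_found"}
    return {"status": "ok", "price": price}

resp = price_service({"part_code": "P100"})
print(resp)  # prints {'status': 'ok', 'price': 250.0}
```

The facade is the cheap end of the spectrum described above; the expensive end is the full rewrite that decouples the logic properly.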
Development of new services and database sources
The pricing service is potentially a complex logic service
The Data warehouse for reporting on DDI
Legacy system migration
The existing systems will need to be maintained while the new services are introduced alongside them.
Solution options
The requirements impose a number of conditions on the integration strategy:
Performance issues
Security, tracking, auditing, in-order delivery, bandwidth throttling, etc.
With the publish-subscribe model and the FTP legacy, there is a strong argument to support file transfer with a specific solution infrastructure. Changing this to web services needs to be based on a clear move from batch to a real-time business process, rather than change for its own sake. The data semantics would follow the SOA model.
Recommendations
Keep it simple
The use of simple asynchronous web service calls would be the preferred model to reduce the impact of service latency.
Polarization
This suggests there will be two service infrastructure models:
The development of simple web services to support calls
This would be the preferred route to avoid a bus overhead structure
An eHub-like service bus that supports pub-sub and translation services
This would provide event and data synchronization management
This would be used to manage system-system transparent automation services
Performance
Performance is critical; a large number of the systems currently send large files in batch.
The messages will not be optimized for web services (note the eHub restriction of 2MB).
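One common workaround for a hub message-size ceiling, sketched here under the assumption of a 2MB limit, is to segment a large batch payload into numbered chunks and reassemble them on the receiving side; the field names are illustrative:

```python
MAX_MSG = 2 * 1024 * 1024  # assumed 2 MB hub message-size limit

def chunk(payload: bytes, max_size: int = MAX_MSG):
    """Split a large batch payload into hub-sized segments, tagging
    each with its sequence number so the receiver can reassemble."""
    total = (len(payload) + max_size - 1) // max_size
    for i in range(total):
        yield {"seq": i, "of": total,
               "data": payload[i * max_size:(i + 1) * max_size]}

def reassemble(segments) -> bytes:
    """Rebuild the original payload from (possibly reordered) segments."""
    ordered = sorted(segments, key=lambda s: s["seq"])
    return b"".join(s["data"] for s in ordered)

blob = b"x" * (5 * 1024 * 1024)  # a 5 MB batch file
segments = list(chunk(blob))
print(len(segments), reassemble(segments) == blob)  # prints "3 True"
```

Segmentation keeps the hub usable for occasional large messages, but it does not make the hub an MFT product – which is the argument the recommendation below makes.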
The recommendation is to use a file manager broker. This thinking is not specifically SOA, but it supports managed file transfer (MFT) for specific patterns such as MI and end-of-month processing – it is not real time.
Current knowledge is around:
Sterling Commerce
Axway File Broker (XFB), which is BT's preferred method of transferring batch data.
The key thing with Axway is the ability to handle web services. They can transfer files but then provide access to the data as web services. This is SOA, and a route we should consider.
An option could be to create a data hub to manage master data synchronization and to build the data warehouse. I don't like this option, as it is more complex than using an ESB-type data translation service. We need to push the use of web services wherever possible to make connections easy.
Creating real-time services with web services. This is a significant move to a new system solution. My gut feel is that this potentially means a rewrite and a reframing of the DDI solution, e.g. to move to a portal-based app with web services.
Use network-based accelerators. This is the Gartner-recommended approach for the future – the one to watch.
A potential solution could be:
Message data broker + Network accelerators for MFT
Use web services to move to network accelerators for request-response services
IBM's MQ Broker product is a good fit for a couple of our current issues – the broker itself for ESB legacy access, and, combined with PM4Data (which basically puts FTP over an encrypted MQ connection), a solution for some of our issues with unsecured links for critical data.
...
It does sound here as if we are trying to shoehorn two requirements into one.
Managed File Transfer is a legitimate part of any integration strategy, because the whole world does not necessarily have to run in real time (e.g. MI feeds, end-of-month close down etc). It just sounds a bit concerning that someone would try to use MQ for this.
MQ is pretty good with large document sizes (they used to max out at 2MB, but I think it is almost unlimited now), but it's the old "if all you have is a hammer, everything looks like a nail" problem.
Managed File Transfer, I would argue, is a specific problem space with specific tool requirements – and this is certainly a view backed up by Gartner, who have a paper dedicated to it – with players such as Sterling Commerce, Axway, IBM etc. It also has specific requirements for e.g. security, tracking, auditing, in-order delivery, bandwidth throttling, etc.
With respect to SOA principles, I would say that the alignment would not be at the messaging level, but at the data level; i.e. even if the message is a batch file, it should still use the same data structures and semantics of data descriptions as any other message, and be governed in the same way – it just happens to be very asynchronous, with very large file sizes.
It just so happens, that I've written an integration strategy for BP which tries to pull together ESB, B2B, ETL, Managed File Transfer (MFT) and BPMS - but that's another story...
Never come across FTP over IBM MQ; however, MQ is obviously a very mature product. For secure, reliable batch transfers there are products which do this, such as Axway's File Broker (XFB), which is BT's preferred method of transferring batch information.
http://www.axway.com/products/synchrony_transfer.php