
Serving Up SOAP

Take advantage of the tuning suggestions made by your vendor of choice. Changes to Java Virtual Machine parameters, such as memory or thread settings, along with modifications to activation policies, log levels and connection-level settings, can drastically improve an application server's performance, so make tuning part of the deployment process.
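As a sketch of the kind of JVM-level tuning vendors recommend, the flags below set heap and thread-stack sizes on a Sun 1.3.x JVM. The values and the main class name are illustrative, not recommendations from any vendor:

```shell
# Illustrative only: heap and thread-stack settings for a Sun 1.3.x JVM.
# -Xms/-Xmx fix the initial and maximum heap; -Xss sets the per-thread stack.
# com.example.AppServerMain is a hypothetical app-server entry point.
java -server -Xms512m -Xmx1024m -Xss256k com.example.AppServerMain
```

The right numbers depend on the workload, which is why vendors publish per-product tuning guides in the first place.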

We set up a Hewlett-Packard Compaq ProLiant DL750, rolling in beefiness with its eight Xeon processors and 2 GB of RAM. Then we imaged eight SCSI 18-GB drives with a copy of Microsoft Windows 2000 Advanced Server, hooked up the Gigabit Ethernet NIC and prepared to test.

We installed each product on one of the prepared drives, then evaluated its management capabilities and its monitoring and configuration options. On each platform, we used the vendor's development environment to build and deploy two Web services, each based on a simple case from the WS-I's interoperability tests. Both services, called echoInt, took a single integer argument and returned that argument, plus one; the only difference between them was the encoding model, Doc/Literal for one and RPC/Encoded for the other. We also implemented a getName service that connected to our NWC Inc. customer database, taking a customer's user name and returning that customer's first and last name. Each product was tested on the same version of Sun Microsystems' Java Virtual Machine, 1.3.1.
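The services' logic was deliberately trivial. Stripped of the SOAP plumbing, it amounts to the sketch below; the class and method bodies are our illustration (the real getName issued a database query), not any vendor's generated code:

```java
// Illustrative sketch of the two test services' core logic (no SOAP plumbing).
public class TestServices {

    // echoInt: takes a single integer and returns that argument, plus one.
    public int echoInt(int value) {
        return value + 1;
    }

    // getName: in the real test this queried the NWC Inc. customer database;
    // here a hard-coded lookup stands in for the JDBC call.
    public String getName(String userName) {
        if ("jdoe".equals(userName)) {  // hypothetical sample customer
            return "John Doe";
        }
        return null;  // unknown customer
    }
}
```

Keeping the payload this small meant the benchmark measured each platform's SOAP stack, not our application code.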

Performance testing was accomplished by harnessing five Dell Optiplex machines, all running Red Hat Linux 8.0, and sending SOAP requests via ApacheBench to the product under test. Each test was configured to run for one minute or 50,000 requests (reaching either threshold completed the test) with a concurrency level of 10 on each machine, and was run three times against each Web service. Additional tests were run from a single client machine with concurrency levels of 10, 20 and 30, under the same time- and request-limit constraints.
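A single-client ApacheBench run under those constraints would look something like this; the service URL and the name of the canned SOAP request file are illustrative:

```shell
# Illustrative: stop at 50,000 requests or 60 seconds, whichever comes first,
# with 10 concurrent connections, POSTing a canned SOAP envelope.
ab -n 50000 -t 60 -c 10 -p echoint-request.xml -T "text/xml" \
   http://appserver:8080/services/echoInt
```

Raising -c to 20 or 30 reproduces our higher-concurrency single-client runs.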

Each client machine was time-synchronized to a public NTP (Network Time Protocol) server, and a script scheduled via the Unix at utility kicked off the tests simultaneously on the five load-generating clients.
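In outline, the per-client setup looks like the fragment below; the server name, script path and start time are illustrative:

```shell
# Illustrative: sync this client's clock, then queue the load script
# to fire at the same wall-clock time on all five clients.
ntpdate pool.ntp.org
at 14:00 <<'EOF'
/home/tester/run_ab_tests.sh
EOF
```

With synchronized clocks, identical at times are enough to start all five load generators within a fraction of a second of one another.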

We tested interoperability by building MindElectric GLUE (Java) and Microsoft .Net (C#) clients from the WSDL (Web Services Description Language) served up by each product and then running the respective clients against the service. ApacheBench, though used for our performance tests, also served as an indicator of each product's interoperability, as the service was required to interact with the tool to complete our tests. We did encounter some problems, particularly when trying to access the Web services from a .Net client. Minor tweaks to the WSDL solved them, but proved that concerns over interoperability are not completely unfounded.
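On the .Net side, client generation is a one-line affair with the wsdl.exe tool from the .Net Framework SDK; the host and output file names here are illustrative:

```shell
# Illustrative: generate a C# proxy class from the WSDL a product publishes.
wsdl /language:CS /out:EchoIntProxy.cs http://appserver:8080/services/echoInt?wsdl
```

It was proxies generated this way that tripped over some products' WSDL until we made the minor tweaks noted above.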