The processing power required to parse XML documents, as well as to validate data elements and enforce security, has slowed the adoption of emerging technologies that depend on XML, such as web services. Many experts expect web services to become the next-generation standard for integrating business applications over the Internet.
"Companies want to do these things, but the performance issue can be a real deal stopper," Ronald Schmelzer, analyst for ZapThink LLC, said. "If you can't figure out how to make it work better, than you just won't do it."
XML overhead has also pushed some companies toward questionable practices, such as not validating the data elements within an arriving XML document, Schmelzer said. Instead, companies will often validate documents from business partners during the testing stage, and then assume that, once the system goes into production, everything received will contain good data.
The danger, of course, is that the incoming data may be bad, which means it will later need to be corrected manually.
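For illustration, a minimal Java sketch of the validation step Schmelzer describes — checking an arriving document against an agreed-upon schema before business logic touches it — might look like the following. The file names are placeholders and the example is not tied to any vendor's product.

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.File;

public class IncomingDocumentCheck {
    public static void main(String[] args) throws Exception {
        // Load the schema agreed on with the business partner (placeholder file name).
        SchemaFactory factory =
                SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new File("purchase-order.xsd"));

        // Validate the arriving document before handing it to business logic.
        Validator validator = schema.newValidator();
        try {
            validator.validate(new StreamSource(new File("incoming-order.xml")));
            System.out.println("Document is valid; safe to process.");
        } catch (org.xml.sax.SAXException e) {
            // Rejecting bad documents here is the step some shops skip in production.
            System.out.println("Document rejected: " + e.getMessage());
        }
    }
}
```

The validation call is exactly the kind of CPU-intensive work that companies are tempted to drop once a system is live, trading processing overhead for the risk of bad data slipping through.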
Today, a lot of XML processing is handled by the application server, which can cut into the resources available for running business logic. "What (DataPower) is saying is if people are dedicating an application server just to do XML processing, then it may make a lot more sense just to offload that task to a separate box," Schmelzer said.
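A rough sketch of that offloading pattern, in Java, is shown below. It assumes a hypothetical appliance endpoint that accepts raw XML over HTTP and returns a pass/fail verdict; the URL and protocol are illustrative only, not DataPower's actual interface.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class OffloadedXmlCheck {
    // Hypothetical address of a dedicated XML-processing appliance.
    private static final String APPLIANCE_URL =
            "http://xml-appliance.example.com/validate";

    /** Sends the raw XML to the appliance and returns true if it passes. */
    public static boolean validateRemotely(String xmlDocument) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(APPLIANCE_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(xmlDocument.getBytes(StandardCharsets.UTF_8));
        }
        // The appliance, not the application server, spends the CPU cycles
        // parsing and validating; the caller only checks the verdict.
        return conn.getResponseCode() == 200;
    }
}
```

The point of the pattern is simply that the application server's threads spend their time on business logic while the parsing and validation burn cycles on the dedicated box.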