OSTI.GOV title logo U.S. Department of Energy
Office of Scientific and Technical Information

Title: An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests

Abstract

Testing is a necessary step in systems integration. Testing in the context of inter-enterprise, business-to-business (B2B) integration is more difficult and expensive than intra-enterprise integration. Traditionally, the difficulty is alleviated by conducting the testing in two stages: conformance testing and then interoperability testing. In conformance testing, systems are tested independently against a reference system. In interoperability testing, they are tested simultaneously against one another. In the traditional approach, these two stages are performed sequentially with little feedback between them. In addition, test results and test traces are left to human analysis alone, or even discarded if the solution passes the test. This paper proposes an approach in which test results and traces from both the conformance and interoperability tests are analyzed for potential interoperability issues; conformance test cases are then derived from that analysis. The result is that more interoperability issues can be resolved in the lower-cost conformance testing mode; consequently, the time and cost required to achieve interoperable solutions are reduced.
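The feedback loop the abstract describes can be sketched in a few lines. This is a toy illustration, not the paper's method: the message model (plain dicts), the field-equality rules, and all function names are assumptions introduced here for illustration only.

```python
# Toy sketch of the iterative procedure from the abstract: interoperability
# issues found when systems are tested against one another are turned into
# conformance rules against the reference system, so that future failures
# surface in the cheaper conformance-testing stage. All names are hypothetical.

def conformance_trace(system, reference, suite):
    """Stage 1: test one system independently against the reference.
    Returns the rules the system's message violates."""
    msg = system()
    return [rule for rule in suite if not rule["check"](msg, reference)]

def interoperability_trace(systems):
    """Stage 2: test systems simultaneously against one another.
    Returns the message fields on which the systems disagree."""
    msgs = [s() for s in systems]
    issues = []
    for field in set().union(*msgs):
        if len({m.get(field) for m in msgs}) > 1:
            issues.append(field)
    return issues

def derive_rules(issue_fields):
    """Derive conformance test cases from the analyzed interoperability issues."""
    return [{"name": f"match-{f}",
             "check": (lambda f: lambda m, ref: m.get(f) == ref.get(f))(f)}
            for f in issue_fields]

def iterate(systems, reference, suite, rounds=3):
    """Alternate the two stages, feeding interoperability findings back
    into the conformance suite until no new rules can be derived."""
    for _ in range(rounds):
        issues = interoperability_trace(systems)
        existing = {r["name"] for r in suite}
        new_rules = [r for r in derive_rules(issues) if r["name"] not in existing]
        if not new_rules:
            break
        suite = suite + new_rules
    return suite
```

In this sketch, a disagreement uncovered during interoperability testing (say, two systems emitting different currency codes) becomes a reusable rule that each system can thereafter be checked against in isolation, without scheduling another joint test.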

Authors:
Kulvatunyou, Boonserm [1]
  1. ORNL
Publication Date:
2007
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
Work for Others (WFO)
OSTI Identifier:
931547
DOE Contract Number:
DE-AC05-00OR22725
Resource Type:
Conference
Resource Relation:
Conference: International Conference on Interoperability for Enterprise Software and Applications, Madeira, Portugal, March 28-30, 2007
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; TESTING; JOINT VENTURES; COMPATIBILITY; SYSTEMS ANALYSIS

Citation Formats

Kulvatunyou, Boonserm. An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests. United States: N. p., 2007. Web.
Kulvatunyou, Boonserm. An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests. United States.
Kulvatunyou, Boonserm. 2007. "An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests". United States.
@article{osti_931547,
title = {An Iterative Procedure for Efficient Testing of B2B: A Case in Messaging Service Tests},
author = {Kulvatunyou, Boonserm},
abstractNote = {Testing is a necessary step in systems integration. Testing in the context of inter-enterprise, business-to-business (B2B) integration is more difficult and expensive than intra-enterprise integration. Traditionally, the difficulty is alleviated by conducting the testing in two stages: conformance testing and then interoperability testing. In conformance testing, systems are tested independently against a reference system. In interoperability testing, they are tested simultaneously against one another. In the traditional approach, these two stages are performed sequentially with little feedback between them. In addition, test results and test traces are left to human analysis alone, or even discarded if the solution passes the test. This paper proposes an approach in which test results and traces from both the conformance and interoperability tests are analyzed for potential interoperability issues; conformance test cases are then derived from that analysis. The result is that more interoperability issues can be resolved in the lower-cost conformance testing mode; consequently, the time and cost required to achieve interoperable solutions are reduced.},
place = {United States},
year = {2007},
month = {3}
}

Other availability:
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.
