WEB SERVICES TESTING CHALLENGES AND APPROACHES


Sana Azzam
CIS department, IT faculty
Yarmouk University
Irbid, Jordan
sana_azzam@yahoo.com

Mohammed Naji Al-Kabi
CIS department, IT faculty
Yarmouk University
Irbid, Jordan
mohammedk@yu.edu.jo

Izzat Alsmadi
CIS department, IT faculty
Yarmouk University
Irbid, Jordan
ialsmadi@yu.edu.jo




Abstract— The web is evolving and expanding continuously, and its services are becoming more complex, more public, and more widely used. One of the major benefits of, and challenges for, web services is offering those services in a useful, flexible, effective, and secure way. In this paper, we investigate web services, present their generic structure, and discuss examples of and challenges in testing web services.
Keywords— software testing, web services, e-transactions, e-services, service security, performance, reliability.
I. INTRODUCTION

Web Services technology is one of the most significant Web technologies today, and it is also an important topic in the field of software testing. Different tools have therefore been developed and designed to work with different types of Web Services, which has motivated Web experts to propose different methods and tools for testing new Web Services efficiently. In this study, we compare three important tools for testing Web Services: SoapUI, PushToTest, and WebInject. A Web Service (WS) can be defined as a service-oriented architecture, or a software system, designed to provide interoperable application-to-application interaction over a network, where systems interact with each other using the Simple Object Access Protocol (SOAP).











SOAP is used to exchange XML-based messages over HTTP or HTTPS, so a WS is accessed through its programmatic interface rather than through a GUI. The user requests a service from a specific Web Service through its Application Programming Interface (API), and the Web Service then returns the response to the user in a formal XML form [1][2][3][4]. In a WS, the client communicates with the service on the server through a program.
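To make this interaction concrete, the following minimal sketch (not taken from the paper) posts a SOAP request over HTTP and prints the XML response. The endpoint URL, the SOAPAction header, and the contents of the envelope are illustrative placeholders rather than a real service.

import urllib.request

# Hypothetical endpoint and SOAP body; real values come from the service's WSDL.
ENDPOINT = "http://example.com/WeatherService.asmx"
SOAP_BODY = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetWeather xmlns="http://example.com/">
      <City>Irbid</City>
    </GetWeather>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=SOAP_BODY.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.com/GetWeather",  # operation-specific, assumed
    },
)

# The service replies with an XML document that the client (or a testing tool) parses.
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))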
Testing Web Services is a significant problem that should be studied carefully; the testing has to be extensive and comprehensive at all important levels (unit, component, and system). Some recent studies show that the error rate of new software programs ranges from 3 to 6 errors per 1,000 statements [8].
"Web Services" (as a term) is an important topic in the field of software testing; therefore, different tools have been developed and designed to enhance the use of Web Services. In this research, we test a number of such tools: SoapUI, PushToTest, and WebInject. The aforementioned tools are used primarily for collecting, creating, and testing Web Services. The major question of this research is: "Which is the best tool for testing Web Services?" This paper determines the efficiency of each tool (such as SoapUI) on a group of available and newly created Web Services.
The proposed process is as follows:
• Understanding the tools that have been selected for the comparison.
• Applying the created and collected Web Services to each selected tool.
• Comparing the selected tools based on a set of factors (such as testing coverage).
The results of this paper answer the main question and identify the most efficient tools for testing different Web Services.
II. BACKGROUND

Web Services provide a means of interoperation between different software applications and can work across many frameworks and platforms [4].
The WS architecture is a logical evolution of component architecture and of object-oriented analysis and design. It is designed to achieve three main purposes: first, providing a conceptual model and a context and clarifying the relationships among them; second, describing a number of characteristics common to Web Services; and third, supporting interoperability with older applications [5][6][7].
Many tools have been implemented for testing Web Services, but few approaches have been proposed to study the quality of these tools. This study is based on three open source tools that are considered effective software solutions for testing Web Services. The next subsections briefly describe the three selected tools.
• SoapUI Tool
SoapUI is a Java-based open source tool. It can run on any platform that provides a Java Virtual Machine (JVM). The tool is implemented mainly to test SOAP, REST, HTTP, JMS, and other types of services. Although SoapUI concentrates on functionality, it also covers performance, interoperability, and regression testing [11].
• PushToTest Tool
One of the objectives of this open source tool is to support reusability and sharing among the people involved in software development by providing a robust testing environment. PushToTest is implemented primarily for testing Service Oriented Architecture (SOA), Ajax, Web applications, Web Services, and many other applications.
The tool adopts a methodology used in many reputable companies, consisting of four steps: planning, functional testing, load testing, and result analysis. PushToTest can determine the performance of Web Services and report the broken ones. It can also recommend solutions to performance problems [12].
• WebInject Tool
WebInject is used to test Web applications and services. It can report testing results in real time and monitor applications efficiently. Furthermore, the tool supports multiple test cases and can analyze them in reasonable time. The tool is written in Perl and runs on any platform that has a Perl interpreter.
The architecture of WebInject consists of the WebInject Engine and a Graphical User Interface (GUI); test cases are written in XML files, and the results are reported in HTML and XML files [13].
III. LITERATURE REVIEW

Several studies have been conducted to improve the testing of Web Services. This section describes the work related to the domain of testing Web Services, starting with some techniques that have been proposed for testing Web Services.
Kumar et al. proposed an Automated Regression Suite model. The model parses the WSDL file and generates SOAP requests in order to automate the testing of any Web Service in any environment by comparing the actual SOAP responses against golden (expected) responses. The number of methods that passed or failed is then identified from the generated response report [9].
Sneed et al. developed WSDLTest, a tool for testing Web Services. The technique is based on generating a request and adjusting it according to a pre-condition; the tool then dispatches the requests and captures the responses [8].
Siblini et al. proposed a technique that relies on mutation analysis for testing Web Services. The technique applies mutation operators to the WSDL document in order to generate mutated Web Service interfaces, and it tests Web Services by discovering errors in the WSDL interface as well as in the Web Services themselves. Nine mutation operators, mutation groups (switch, special, occurrence), and the WSDL document are used in the proposed technique [2].
Automatic test generation from GUI applications for testing Web Services is another approach, proposed by Conroy et al. Their approach generates test cases from reference legacy GUI applications (GAPs) by extracting data from the GUI, and then applies these tests to Web Services in order to reduce the time spent on testing [10].
IV. GOALS AND APPROACHES

To use the selected tools to test the collected Web Services, we have to apply these Web Services in each tool; in other words, the collected Web Services have to be tested by each tool. Each testing tool has its own method for testing Web Services: importing the Web Services into the tool, configuring the tool options involved in the testing process, and finally presenting the results of the analysis in the supported format. This means that we have to understand, in depth, the whole process required to apply the Web Services and to obtain the results report from each tool.

1. Results Collection and Analysis
Testing tools generate their results based on testing criteria such as functionality, performance (loading), interoperability, and others. Performance testing itself has several types, and discussing these types helps to clarify how the performance of a specific Web Service is tested. Table 1 shows the different types of performance testing.

Table 1: Types of Performance Testing

Type                  Definition
Baseline Testing      Normal-mode performance testing.
Load Testing          High-load performance testing.
Stress Testing        Exceptional-load testing.
Soak Testing          Long-period testing.
Scalability Testing   Large-volume testing.
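Beyond these load profiles, the functionality criterion ultimately reduces to checking that a service returns the expected content for a given request. A minimal sketch of such a check follows; the endpoint, request body, SOAPAction, and expected fragment are all hypothetical, and tools such as SoapUI express the same idea as declarative assertions attached to a test step.

import urllib.request

def functional_check(endpoint, soap_body, soap_action, expected_fragment):
    """Send one SOAP request and report whether the response contains the expected text."""
    request = urllib.request.Request(
        endpoint,
        data=soap_body.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": soap_action},
    )
    with urllib.request.urlopen(request) as response:
        payload = response.read().decode("utf-8")
    return expected_fragment in payload  # pass/fail verdict for this test case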


2. Evaluation and Comparison
Comparing different Web Service testing tools is a complex task. The testing criteria are not the same in all the selected tools, which means that one tool may be able to test interoperability (for example) while another does not test this criterion at all. Furthermore, some criteria may be affected by factors unrelated to the testing tools or the Web Services; for example, the quality of the Internet connection may affect the loading test of a Web Service and therefore have a negative impact on the comparison between tools. To compare the selected testing tools, some factors were chosen as the basis of the comparison process. Table 2 shows the comparison factors.
Table 2: Comparison Factors

Factor             Details
Ease of Use        Is the tool user friendly or not?
Testing Coverage   How many criteria can the tool test?
Functionality      What is the testing tool's response time?
Performance        How does the tool deal with loading when testing Web Services?
Throughput         How many Web Services can the tool handle in a period of time?
Our method for comparing the testing tools includes the following steps:
• Testing each of the collected Web Services with the three selected tools, taking the testing criteria into consideration.
• Analyzing the testing results of each tool based on the comparison factors.
• Reporting the results across the testing tools.
The results of the comparison help us identify the more efficient testing tools. Note that a tool may be the best at testing one criterion and the worst at testing another.

3. Discussing the Results
The tools used to test the Web Services vary in the time they need to respond; in other words, some tools take less time than others to test the Web Services. In the evaluation of the three selected tools, the response time of each tool for each Web Service was calculated. Table 3 shows the response time of each tool for testing the Web Services, and Figure 1 shows the average response time of each tool over all Web Services. The results show that SoapUI is the fastest tool in responding to and testing the Web Services, PushToTest is the second fastest, and WebInject is the slowest tool.
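To illustrate how such a figure can be obtained, the sketch below times a single service invocation and averages the result over several runs; invoke() is a hypothetical stand-in for one request/response round trip issued by whichever tool is being measured.

import time

def average_response_time_ms(invoke, runs=5):
    """Call invoke() several times and return the mean wall-clock time in milliseconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        invoke()  # one request/response round trip against the Web Service
        timings.append((time.perf_counter() - start) * 1000.0)
    return sum(timings) / len(timings)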

Table 3: Response Time for the Web Services (ms)

Web Service               WebInject   PushToTest   SoapUI
Weather Forecast             1237         832        1120
Global Weather               1198         943         428
Send Fax                     1264        1412         755
Currency Convertor           1603         692         776
Country Details              1869        1364         606
Bible Web Service            1401        1542         373
Text to Braille               958        1526         631
Weight Unit Convertor         977         601         369
Computer Unit Convertor       981         635         619
Validate Credit Card          929         394         639
Statistic                     932         895         804
Stock Quote                   953         742         682
Fedach                       1329         683         565
Send SMS World               1272         842         478
Validate Email Address        966         732         596
Overall average           1029.14      813.761        519

4. The Performance Comparison Factor
The performance (loading test) of the tools is a critical factor in the comparison process. The tests conducted in this study examine the normal load behavior of each of the three tools under consideration: several requests are sent per minute in order to determine how well each tool withstands the load.
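A rough sketch of such a load run is given below; the request rate, duration, and worker count are illustrative assumptions, and invoke() again stands in for a single call to the Web Service under test.

import time
from concurrent.futures import ThreadPoolExecutor

def load_test(invoke, requests_per_minute=60, duration_minutes=1, workers=10):
    """Send requests at a fixed rate and collect each latency (ms) under load."""
    latencies = []

    def timed_call():
        start = time.perf_counter()
        invoke()
        latencies.append((time.perf_counter() - start) * 1000.0)

    interval = 60.0 / requests_per_minute   # spacing between submissions
    total = requests_per_minute * duration_minutes
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(total):
            pool.submit(timed_call)
            time.sleep(interval)

    return sum(latencies) / len(latencies) if latencies else float("nan")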















Figure 1: Average Response Time.

Table 4 shows the performance of each tool for testing the Web Services, and Figure 2 shows the average performance of each tool over all Web Services. The results show that SoapUI performs better than PushToTest and WebInject, while PushToTest performs better than WebInject.

Table 4: Tools Performance (Loading Test, ms)

Web Service               WebInject   PushToTest   SoapUI
Weather Forecast            1271.8        860       399.98
Global Weather              1115         1040       447.81
Send Fax                    1215.8       1286      1459.35
Currency Convertor          1119.6        960      1382.46
Country Details             1028.6       1534         6
Bible Web Service           1544.6       1260       379.46
Text to Braille              977.8        850       650
Weight Unit Convertor        951.2        700       367.69
Computer Unit Convertor     1152.4        800       940
Validate Credit Card        1066.8        360       353.09
Statistic                   1082.4        553       377.72
Stock Quote                  884.8        462       459.62
Fedach                      1028.4       1046      1635.55
Send SMS World              1388          930       337.71
Validate Email Address      1164          560       321.58
Overall average              983.70      730.047    556.751




 











Figure 2. Average Performance of each Tool.

5. The Throughput Comparison Factor
Throughput can be defined as the number of requests handled per second; a high number of requests handled under low load indicates high performance. This research studies the throughput of each tool on the selected Web Services, but only SoapUI and PushToTest support this kind of testing. Table 5 shows the throughput of SoapUI and PushToTest for the tested Web Services; SoapUI has better throughput than PushToTest.
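As a sketch of how a throughput figure of this kind can be computed, the code below fires a batch of concurrent calls and divides the number of completed requests by the elapsed time; the batch size and concurrency level are assumptions, and invoke() is again a stand-in for one call to the service.

import time
from concurrent.futures import ThreadPoolExecutor

def measure_throughput(invoke, total_requests=100, workers=10):
    """Return completed requests per second for a batch of concurrent calls."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(invoke) for _ in range(total_requests)]
        for future in futures:
            future.result()  # wait for every request to finish
    elapsed = time.perf_counter() - start
    return total_requests / elapsed  # requests per second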
Table 5: The Throughput of SoapUI and PushToTest (requests per second)

Web Service               SoapUI   PushToTest
USA Weather Forecast        4.13       1.6
Global Weather              4.02       0.86
Send Fax                    0.84       0.81
Currency Convertor          0.82       1.26
Country Details             6.8        0.78
Bible Web Service           4.21       0.88
Text to Braille             6.04       1.68
Weight Unit Convertor       4.4        5.5
Computer Unit Convertor     1.32       4.9
Validate Credit Card        4.49       4.4
Statistic                   4.42       6.5
Stock Quote                 4.09       4.8
Fedach                      2.04       0.82
Send SMS World              4.54       3.9
Validate Email Address      4.64       5.1
Average                     4.169      3.9947

6. The Ease of Use Comparison Factor
The study of the testing tools shows that SoapUI is the most user-friendly of the selected tools, thanks to the excellent graphical user interface it provides for testing Web Services. PushToTest can be used with a little effort, but it is still more complex than SoapUI. The last tool, WebInject, is the most complex to use: it requires considerable effort because it needs a configuration process, does not provide a graphical user interface, and forces the user to work from the command line.

7. The Testing Coverage Comparison Factor
The study of the three tools shows that SoapUI and PushToTest can cover more cases within the same testing criterion than WebInject. For example, on the loading side, SoapUI and PushToTest can test a Web Service in multiple ways: baseline testing, load testing, stress testing, soak testing, and scalability testing. SoapUI also has a big advantage over the other two tools since it supports a high level of security.
Clearly, the results show that, across all the comparison factors, the three testing tools rank from best to worst as follows: SoapUI, PushToTest, and WebInject. SoapUI outperforms PushToTest and WebInject in all the comparison factors, and by a large margin: it has excellent response time, performance, and throughput, it is easy to use, and it covers multiple cases within the same testing criterion. PushToTest is a good testing tool, even though it could not outperform SoapUI in any factor; it achieves good values in response time and performance. WebInject has the lowest values in the comparison factors; it cannot test the Web Services with the same quality as the other tools, and it is complex and difficult to use.
V. CONCLUSION AND FUTURE WORK

In this paper, several selected Web Services are evaluated with the three tools based on several well-known quality factors. Examples of the quality factors used in evaluating the Web Services include performance,
reliability, security, throughput, and others. The results obtained from the evaluation process show that SoapUI outperforms PushToTest and WebInject in all the comparison factors; likewise, PushToTest outperforms WebInject in the compared factors. In general, this work suggests using SoapUI to test different Web Services because of its good performance and the high quality of its Web Service tests.
References
[1] "Web Service (WS)", visited 11 October 2010, http://en.wikipedia.org/wiki/Web_service.
[2] Siblini, R.; Mansour, N. (2005), "Testing Web Services", ACS/IEEE 2005 International Conference on Computer Systems and Applications (AICCSA), pp. 135-vii.
[3] Martin, E.; Basu, S.; Xie, T. (2006), "Automated Robustness Testing of Web Services", Proceedings of the 4th International Workshop on SOA and Web Services Best Practices.
[4] Bertolino, A.; Polini, A. (2005), "The Audition Framework for Testing Web Services Interoperability", Proceedings of the 31st EUROMICRO Conference on Software Engineering and Advanced Applications.
[5] "What's Web Service", visited 21 October 2010, http://www.w3.org/TR/ws-arch/#gengag.
[6] "Web Service Architecture", visited 22 October 2010, https://www.ibm.com/developerworks/Webservices/library/w-ovr/.
[7] Judith, M., "Web Service Architectures", published by Tect, 29 South LaSalle St., Suite 520, Chicago, Illinois 60603, USA.
[8] Sneed, H. M.; Huang, S. (2006), "WSDLTest – A Tool for Testing Web Services", Eighth IEEE International Symposium on Web Site Evolution.
[9] Kumar, A. S.; Kumar, P. G.; Dhawan, A. (2009), "Automated Regression Suite for Testing Web Services", International Conference on Advances in Recent Technologies in Communication and Computing, pp. 590-592.
[10] Conroy, K. M.; Grechanik, M.; Hellige, M.; Liongosari, E. S.; Xie, Q. (2007), "Automatic Test Generation From GUI Applications For Testing Web Services", IEEE International Conference on Software Maintenance (ICSM 2007), 2-5 October 2007, pp. 345-354.
[11] "SoapUI tool", visited 14 November, http://www.SoapUI.org.
[12] "PushToTest tool", visited 16 November, http://www.PushToTest.com.
[13] "WebInject", visited 30 October 2010, http://www.WebInject.org/.





