Lecturer: DR. THI LIP SAM

Data Management

Oct 31, 2013

Benchmarking, Bench Marketing or Bench Baloney (p. 202)




MANAGEMENT OF INFORMATION SYSTEMS (MIS)






Which DBMS product is the fastest?


Which product yields the lowest price/performance ratio?


Which computer equipment works best for each DBMS product?


To answer such questions, vendors and third parties have defined benchmarks. To compare performance, analysts run competing DBMS products on the same benchmark and measure the results.


Typical measures are the number of transactions processed per second


Number of Web pages served per second


Average response time per user
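
The measures above can be sketched with a minimal timing harness. This is an illustrative stand-alone example, not from the article; `run_transaction` is a hypothetical placeholder for a real database call.

```python
import time

def run_transaction():
    # Hypothetical stand-in for a real DBMS transaction.
    time.sleep(0.001)

def measure_tps(n_transactions=200):
    """Measure transactions per second and average response time."""
    latencies = []
    start = time.perf_counter()
    for _ in range(n_transactions):
        t0 = time.perf_counter()
        run_transaction()
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    tps = n_transactions / elapsed
    avg_response = sum(latencies) / len(latencies)
    return tps, avg_response

tps, avg = measure_tps()
print(f"{tps:.0f} transactions/second, {avg * 1000:.2f} ms average response")
```

The same loop structure applies to "pages served per second": replace the transaction with a page request and count pages instead of transactions.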


DBMS vendors set up their own benchmark tests and then published the results. Of course, each vendor claimed that its product was superior to all others, so competitors did not want to believe the results.


In response, third parties defined standard benchmarks. Even that led to problems.



According to The Benchmark Handbook (www.benchmarkresources.com/handbook):



“When comparative numbers were published by third parties or competitors, the loser generally cried foul and tried to discredit the benchmark. Such events often caused benchmark wars. The loser would rerun the benchmark using regional specialists and get new and winning numbers. Then the opponent would rerun it using his regional specialists, and of course get even better numbers. The loser would then rerun it using some one-star gurus. This progression continued all the way to five-star gurus.”


The discussion focuses on a PC Magazine article, published in July 2002, that ran a standard benchmark called the Nile benchmark.



The test compared five DBMS products:


DB2 (IBM)


MySQL (a free, open-source DBMS)


Oracle (Oracle Corporation)


SQL Server (Microsoft)


ASE (Sybase Corporation)




Overall, Oracle9i and MySQL had the best performance and scalability, with Oracle9i just very slightly ahead of MySQL for most of the run. ASE, DB2, Oracle9i and MySQL finished in a dead heat up to about 550 Web users. At this point, ASE's performance leveled off at 500 pages per second, about 100 pages per second less than Oracle9i's and MySQL's leveling-off point of about 600 pages per second. DB2's performance dropped substantially, leveling off at 200 pages per second under high loads. Due to its significant JDBC (Java Database Connectivity) driver problems, SQL Server was limited to about 200 pages per second for the entire test.




Drivers, memory tuning and database design issues were
the three factors that had the most impact on
performance tests.



The Oracle and MySQL drivers had the best combination of a complete JDBC feature set and stability. (MySQL staff chose to use the MySQL JDBC driver written by Mark Matthews because the company does not have its own JDBC driver.)



SQL Server and MySQL were the easiest to tune, and Oracle9i was the most difficult because it has so many separate memory caches that can be adjusted. This issue was even more nettlesome with Oracle9i because it required the most memory per concurrent connection to the database (about 400KB of RAM).




By comparison, DB2 required 177KB of RAM per connection, and SQL Server, MySQL and ASE all required about 50KB of RAM per connection. As a result, Oracle9i's data and query plan caches had to be smaller than those of the other databases because of the memory taken by user connections.
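
The trade-off described above can be made concrete with a toy calculation. The per-connection figures are the ones quoted in the article; the 512 MB memory budget and the 550-user load point are illustrative assumptions.

```python
# Per-connection memory from the article, in KB.
PER_CONNECTION_KB = {
    "Oracle9i": 400,
    "DB2": 177,
    "SQL Server": 50,
    "MySQL": 50,
    "ASE": 50,
}

def cache_memory_left(budget_mb, connections, dbms):
    """KB of a fixed memory budget left over for data and query plan
    caches after per-connection overhead is subtracted."""
    used = PER_CONNECTION_KB[dbms] * connections
    return budget_mb * 1024 - used

# With 550 concurrent users and a hypothetical 512 MB budget:
for dbms in PER_CONNECTION_KB:
    left = cache_memory_left(512, 550, dbms)
    print(f"{dbms:10s}: {left} KB left for caches")
```

At 550 connections, Oracle9i's higher per-connection cost leaves roughly 190 MB less for its caches than MySQL has, which is exactly the effect the article describes.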



MySQL's great performance was due mostly to its use of an in-memory query results cache that is new in MySQL 4.0.1. When tested without this cache, MySQL's performance fell by two-thirds.
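
The effect of a query results cache can be sketched as memoization keyed on the query text. This is a simplified model: here any write invalidates the whole cache, whereas the real MySQL cache tracks which tables each cached result depends on.

```python
class TinyQueryCache:
    """Simplified model of an in-memory query results cache."""

    def __init__(self, execute):
        self.execute = execute  # function that actually runs a query
        self.cache = {}
        self.hits = 0
        self.misses = 0

    def query(self, sql):
        # Return a cached result when the identical query was seen before.
        if sql in self.cache:
            self.hits += 1
            return self.cache[sql]
        self.misses += 1
        result = self.execute(sql)
        self.cache[sql] = result
        return result

    def write(self, sql):
        # Writes invalidate cached results (coarser than MySQL's
        # per-table invalidation, but it shows the idea).
        self.cache.clear()
        return self.execute(sql)

# Demo with a stand-in executor that just labels the query.
db = TinyQueryCache(lambda sql: f"rows for: {sql}")
db.query("SELECT * FROM books")        # miss: executed and cached
db.query("SELECT * FROM books")        # hit: served from memory
db.write("UPDATE books SET price = 1") # invalidates the cache
db.query("SELECT * FROM books")        # miss again
print(db.hits, db.misses)              # → 1 2
```

When a read-heavy workload repeats the same queries, most of them become cache hits that skip query execution entirely, which is why disabling the cache cost MySQL two-thirds of its throughput.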



MySQL staff took advantage of a feature unique to MySQL among the databases tested: the ability to use different database engines on a table-by-table basis.



Of the five databases tested, only Oracle9i and MySQL were able to run the Nile application as originally written for 8 hours without problems.



a. What are TPC-C, TPC-R and TPC-W?



TPC-C - A benchmark that measures overall transaction processing performance. It is an OLTP workload: a mixture of read-only and update-intensive transactions that simulates the activities found in complex OLTP application environments.
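
The "mixture of read-only and update-intensive transactions" can be illustrated by sampling transaction types with weights similar to TPC-C's minimum transaction mix. The percentages below are approximate and for illustration only; the exact requirements are in the TPC-C specification.

```python
import random

# Approximate TPC-C transaction mix (illustrative weights, in %).
MIX = {
    "New-Order": 45,      # update-intensive
    "Payment": 43,        # update-intensive
    "Order-Status": 4,    # read-only
    "Delivery": 4,        # update-intensive
    "Stock-Level": 4,     # read-only
}

def sample_workload(n, seed=0):
    """Draw n transactions according to the weighted mix."""
    rng = random.Random(seed)
    names = list(MIX)
    weights = [MIX[t] for t in names]
    return rng.choices(names, weights=weights, k=n)

workload = sample_workload(10_000)
share = workload.count("New-Order") / len(workload)
print(f"New-Order share: {share:.1%}")  # close to 45%
```

The tpmC metric counts only the New-Order transactions completed per minute, even though the driver must submit the full mix, which is why the mix matters to the reported number.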



TPC-R - A decision support benchmark from the Transaction Processing Performance Council. It consists of a suite of business-oriented queries and concurrent data modifications. The queries and the data populating the database have been chosen to have broad industry-wide relevance while maintaining a sufficient degree of ease of implementation.



TPC-W - An e-commerce workload that simulates the activities of a retail store website. Emulated users can browse and order products from the website. In this case the products are books.


b. As Oracle's marketing department, how would you use the results of the TPC-C benchmark?


Referring to the picture, the TPC-C results show that using an HP ProLiant ML350 G6, the tpmC is 290,040 at 0.39 USD per tpmC.


c. The dangers to Oracle of using the TPC-C benchmark.


For tuning, Oracle9i was the most difficult because it has so many separate memory caches that can be adjusted.









d. As the DB2 marketing department at IBM, how would you use the results of the TPC-C benchmark?


Referring to the picture, the TPC-C results show that using an IBM System x3850 X5, the tpmC is 2,308,099 at 0.64 USD per tpmC.


e. Do the results for TPC-C change the answer to question 1?


Based on the answer to question 1, the best DBMSs are Oracle and MySQL. The TPC-C results show that on price/performance Oracle is cheaper at 0.39 USD per tpmC, while the best raw performance goes to IBM's DB2 with a tpmC of 2,308,099.
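
TPC-C's price/performance figure is the total system price divided by the tpmC rating, so the two results quoted above imply the following approximate total system prices (a back-of-the-envelope check, assuming the quoted USD figures are $/tpmC):

```python
def system_price(tpmC, usd_per_tpmC):
    """TPC-C price/performance = total system price / tpmC,
    so total price = tpmC * ($/tpmC)."""
    return tpmC * usd_per_tpmC

# Figures quoted above.
oracle_hp = system_price(290_040, 0.39)   # HP ProLiant ML350 G6
db2_ibm = system_price(2_308_099, 0.64)   # IBM System x3850 X5

print(f"Oracle/HP total price: ${oracle_hp:,.2f}")
print(f"DB2/IBM total price:   ${db2_ibm:,.2f}")
```

This makes the trade-off concrete: the IBM system delivers roughly eight times the throughput, but at roughly thirteen times the total system price.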


f. As a DBMS vendor, can we ignore benchmarks?


No, benchmarks are still needed. Their benefits include:


reminding everybody in the company of the need to be competitive


making the company's relative performance very clear


providing clear quantitative targets to management


providing targets that are not just visions of the future, but reality in other companies


providing the impetus for management to start behaving proactively, and to look for
ways of working which will bring significant improvements.



In the quest for increased competitiveness, companies often ask themselves the question, “How are we doing?” Asking this question leads logically to the next question, “Compared to what?” Fully answering this second question involves an examination of a company’s own operations, and subsequently comparing those operations with those of other organizations identified to be leaders in the field. Such comparisons are at the heart of benchmarking.


There are three major reasons for benchmarking.


Benchmarking provides an objective evaluation of a company’s business processes against those of other organizations.


Benchmarking serves as a vehicle for sourcing improvement ideas from other organizations.


Benchmarking broadens an organization’s experience
base by providing insights into systems and methods
that work and those that don’t. It therefore supports the
notion of a learning organization.


tpmC - The transactions-per-minute-C rating from the TPC-C benchmark, which measures overall transaction processing performance.


OLTP - OnLine Transaction Processing