Ten Tips for Change Data Capture

How to Maximize Data Integration Value in Analytic and Operational Systems

January 2010














Contents

Executive Summary
Background
CDC Tip #1 - Using CDC to Increase Business Acumen
CDC Tip #2 - Non-homogenous systems that need to share data
CDC Tip #3 - Distributing data for Line of Business information management
CDC Tip #4 - Managing business data flow
CDC Tip #5 - Establishing a Master Data Management environment means having a data “common denominator”
CDC Tip #6 - Supporting applications that cannot run on certain databases
CDC Tip #7 - Reducing stress on operational databases
CDC Tip #8 - Supporting a Disaster Recovery or Backup/Restore Plan
CDC Tip #9 - Reducing the cost of database systems
CDC Tip #10 - Distributed environments with multiple databases
CASE STUDY: TriActive Supports SaaS Business Model Using DBMoto™ to Synchronize Customer Data Availability Among Oracle Servers
    Business Problem
    Selection Criteria
    Problem Solved
    Major Benefits of using DBMoto for Data Integration
Summary
About HiT Software, Inc.






Executive Summary

Many organizations think of data integration as simply merging or copying data from one system to another. However, data integration can be the underlying linchpin to a host of critical business processes and applications. Using data integration strategies and specialized functions makes the job of accessing fresh data cost-effective, quick to achieve and easy to manage. This paper will describe how Change Data Capture, a specialized function of data integration, can optimize value in both analytical and operational systems within a data-driven environment.

Background

Change Data Capture (CDC) is a term used to describe a methodology for updating a data set to bring it current. Prior to CDC, businesses updated their data sets by copying entire sets from one system to another, replacing any stale data. While this methodology, called snapshot or refresh, works just fine to bring a data set current, it is time-consuming and resource-intensive. With CDC, only the delta (the changes made within the data set) is updated, leaving unchanged data untouched and making the process much faster, with far less intrusion and stress on the systems involved.


Variations of CDC have been popular for many years. The foundation of CDC is reading the changes and activities on the underlying source database system, and articulating the changes that must then be applied to the target database while negotiating conflict resolution.

Common methodologies for CDC have included customized programs (often built in-house) to detect changes; the use of triggers (event alerters) on the source database, which are designed to “fire” when a change has occurred; and accessing source database logs or journals to passively capture changes once they have been applied, in order to transfer those changes to the target.
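For illustration, the sketch below shows trigger-based capture in miniature, using SQLite and invented table names (the paper does not prescribe any particular database): triggers on the source table fire on every insert, update and delete, recording each delta in a capture table.

    # A minimal sketch of trigger-based change capture, using SQLite purely
    # for illustration; all table and column names are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);

    -- Every captured change lands here with its operation type.
    CREATE TABLE customers_changes (
        change_id  INTEGER PRIMARY KEY AUTOINCREMENT,
        op         TEXT NOT NULL,          -- 'I', 'U' or 'D'
        id         INTEGER, name TEXT, city TEXT,
        changed_at TEXT DEFAULT CURRENT_TIMESTAMP
    );

    -- Triggers "fire" when a change occurs and record the delta.
    CREATE TRIGGER trg_ins AFTER INSERT ON customers BEGIN
        INSERT INTO customers_changes (op, id, name, city)
        VALUES ('I', NEW.id, NEW.name, NEW.city);
    END;
    CREATE TRIGGER trg_upd AFTER UPDATE ON customers BEGIN
        INSERT INTO customers_changes (op, id, name, city)
        VALUES ('U', NEW.id, NEW.name, NEW.city);
    END;
    CREATE TRIGGER trg_del AFTER DELETE ON customers BEGIN
        INSERT INTO customers_changes (op, id, name, city)
        VALUES ('D', OLD.id, OLD.name, OLD.city);
    END;
    """)

    conn.execute("INSERT INTO customers VALUES (1, 'Acme', 'Austin')")
    conn.execute("UPDATE customers SET city = 'Dallas' WHERE id = 1")
    for row in conn.execute("SELECT op, id, name, city FROM customers_changes"):
        print(row)  # ('I', 1, 'Acme', 'Austin') then ('U', 1, 'Acme', 'Dallas')

A log-reading tool accomplishes the same capture without adding per-statement trigger work to the source system, which is one reason log reading is generally preferred.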


The most effective, and certainly the least intrusive, methodology is reading source database logs and transferring changes to the target database. Most databases produce a transaction log that records changes to data and metadata. Reading the log is an unobtrusive way to discover and act on changes. However, no standards exist across databases; each database has its own specialized log format, making it very difficult for a DBA to manage without the assistance of software designed specifically for that purpose.
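On the transfer side, a replication process reads the captured changes in order and replays only that delta at the target. The sketch below continues the invented schema above, with the capture table standing in for a transaction log (real log formats are proprietary, which is exactly why purpose-built software is needed); the target is assumed to hold an identical customers table.

    # A minimal sketch of the transfer step: read captured changes in order
    # and replay only the delta against a target database, remembering a
    # checkpoint between runs. Continues the invented schema above.
    def apply_changes(source, target, last_id):
        rows = source.execute(
            "SELECT change_id, op, id, name, city FROM customers_changes "
            "WHERE change_id > ? ORDER BY change_id", (last_id,)).fetchall()
        for change_id, op, cid, name, city in rows:
            if op == 'D':
                target.execute("DELETE FROM customers WHERE id = ?", (cid,))
            else:  # insert or update: upsert the new row image
                target.execute(
                    "INSERT INTO customers (id, name, city) VALUES (?, ?, ?) "
                    "ON CONFLICT(id) DO UPDATE SET name = excluded.name, "
                    "city = excluded.city", (cid, name, city))
            last_id = change_id
        target.commit()
        return last_id  # persist this checkpoint between runs

    # Scheduled (time-based) polling; a real-time tool would stream instead:
    #     last = 0
    #     while True:
    #         last = apply_changes(src_conn, tgt_conn, last)
    #         time.sleep(5)   # requires: import time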

Few software solutions are capable of providing a combination of features such as CDC, log-reading, real-time updates and bi-directional synchronization, all of which clearly improve the velocity of data availability. Many database providers claim this functionality, but for environments where multiple database systems are used, it is best to evaluate each solution on its ability to support heterogeneous databases, allowing better vendor neutrality and transparency.

This paper provides ten examples where CDC can be effectively applied to improve analytic and operational velocity, and includes a case study of a Software-as-a-Service provider that uses CDC to solve an operational data availability problem and improve business performance.

CDC Tip #1 - Using CDC to Increase Business Acumen

In many businesses, the value of corporate data is realized through analysis. Most data is produced for an operational reason: invoices are produced on sales; customer data is produced on new wins or marketing activities; financial data is generated through inventory and sales. Inevitably, there needs to be a systematic way to access various parts of disparate databases in order to assess conditions and take appropriate actions.

Using CDC to support analytics, reporting and business intelligence means you can increase the speed of decision-making thanks to an elevated throughput of fresh information to your analysis systems. For example, reducing the time between reporting on customer purchase history and a sales decision to offer a special promotion could help an organization achieve shorter time-to-revenue goals, shrinking analysis windows from quarterly to monthly, weekly, daily or even to the point of purchase. Understanding which manufacturing line is slower than the others, and why, can help the organization streamline supplier ordering, improve production processes, and better manage staffing.

The value of corporate dashboards is relative to the value of the underlying data. Dashboards can be implemented through individual connections to each data store, but in a scenario with multiple databases and multiple dashboard applications, is it wise to allow unlimited connections to critical operational systems at the risk of reducing performance? Unless data organized across multiple databases can be updated quickly, analysis can take days or even weeks, throttling any short-term efforts to reduce cost and time.

When replicating critical information to a secondary system used specifically for reporting and dashboarding, CDC is the methodology of choice for keeping that secondary system as fresh as the original.

CDC Tip #2 - Non-homogenous systems that need to share data

In most mid-to-large-sized organizations there are multiple systems at play, and a host of disparate databases installed to support the data production of those systems. It isn’t often that an organization of $500M-$1B in revenue is dependent on a single database management system. Where multiple databases are installed, it is likely that portions of data will eventually need to be shared with other parts of the organization.

For example, the customer care part of the business (customer/technical support) typically needs access to information on products purchased by customers; the quality assurance and testing groups may need access to engineering data; and the marketing group regularly makes use of sales data. The systems used by customer care, QA or marketing may reside on completely different databases, but in order to progress quickly these groups need direct access to that data (affecting the system performance of the host system). Alternatively, they could access a staging database where data is updated in real time from the host systems. Using CDC to populate a staging database (or data mart) for these purposes makes information readily available, while reducing the access time and detrimental effects on the performance of the host systems.

CDC can also play an important role in increasing business velocity in non-homogenous database environments when an organization acquires or merges with another. Many times the corporate systems are very different: different database systems, different data descriptors, and so on. Using CDC, disparate data can be merged into a useful repository for reporting and dashboarding, as well as into a system that can feed new corporate systems. The original systems can remain intact and continue to run as long as necessary, while changes are propagated to the corporate repository, maintaining a fresh and updated version of the remote company’s data, accessible by any designated user.

CDC Tip #3 - Distributing data for Line of Business information management

One of the visionary ways to apply CDC in an organization is to proactively push information to lines of business (LOBs) where there may not yet be a systematic or developed way to manage information about the department. For instance, a large services arm of an organization might benefit from a repository of customer information or sales data. In most cases, services professionals would connect (often remotely) to the customer or sales application to access that information, possibly degrading the performance of the remote application with heavy user traffic.

It can be more efficient to use CDC to make current information available in a separate system accessible to these LOBs, either in an application they are already using or in a generic web application accessible to all LOBs in the organization that need the same data.

CDC can also be used to directly improve business processes by providing the ability to generate alerts and warnings. In manufacturing, it can be critical to know if customers have changed or canceled large orders, subsequently altering a manufacturing deadline. On the financial side, it would be helpful to know if customers changed or added new orders prior to the close of the month. Using CDC, updates or changes to data can be reflected quickly, and project leads can be alerted to these changes in order to respond in a timely and appropriate way.
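As a rough sketch of how such alerting can sit on top of a change stream, the code below scans captured changes for canceled large orders and notifies a project lead; the change format, field names and threshold are illustrative assumptions.

    # An illustrative alert scan over a change stream: flag canceled large
    # orders so a project lead can react. The change format, field names and
    # threshold are assumptions for the sketch.
    LARGE_ORDER = 100_000

    def scan_for_alerts(changes, notify):
        for change in changes:
            row = change["row"]
            if (change["op"] == "U" and row.get("status") == "canceled"
                    and row.get("amount", 0) >= LARGE_ORDER):
                notify(f"Order {row['order_id']}: {row['amount']:,} canceled")

    scan_for_alerts(
        [{"op": "U", "row": {"order_id": 42, "status": "canceled",
                             "amount": 250_000}}],
        notify=print)  # -> Order 42: 250,000 canceled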




CDC Tip #4 - Managing business data flow

In some business models, there is a need to both collect and share key data. Think about the insurance industry, where agents consume data from insured individuals and businesses, while the corporate office pushes data out to the agents on programs, offers and service.

Businesses need to benefit from information coming from either direction. Information about client responses to programs and pricing should have a bearing on new offers. Information about policy changes and rates should be related to consumer buying patterns in specific geographies.

Using CDC, businesses can continuously update this data, based on rules and requirements, as it changes from user input or corporate rules. Information changes can be transparent to the applications and customer-facing portals, as only the underlying databases are affected, synchronizing changed data and surfacing specific information under governance, exception-handling and security rules. Data can be updated on a scheduled (time-based) basis, or in real time, based on the organization’s requirements. Creating this type of data architecture allows the organization to apply business models and rules at the point of transformation, controlling data accessibility while assuring fresh updates to its clients and suppliers.
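The sketch below illustrates rules applied at the point of transformation: each captured change passes through hypothetical governance functions (masking a sensitive field, filtering restricted rows) before it is allowed to reach the target. All names and rules are invented for the example.

    # A sketch of rules applied at the point of transformation: each captured
    # change passes through governance functions before reaching the target.
    # Field names and rules here are hypothetical.
    def mask_ssn(row):
        if "ssn" in row:  # security: never replicate full identifiers
            row["ssn"] = "***-**-" + row["ssn"][-4:]
        return row

    def permitted(row):
        # Governance/exception handling: e.g. keep restricted rows at home.
        return row.get("region") != "restricted"

    RULES = [mask_ssn]

    def transform(change):
        row = dict(change["row"])
        if not permitted(row):
            return None  # filtered out; never sent to the target
        for rule in RULES:
            row = rule(row)
        return {**change, "row": row}

    print(transform({"op": "U", "row": {"policy_id": 7, "region": "TX",
                                        "ssn": "123-45-6789"}}))
    # {'op': 'U', 'row': {'policy_id': 7, 'region': 'TX', 'ssn': '***-**-6789'}}

The same transform step runs whether changes are applied on a schedule or streamed in real time.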

CDC Tip #5 - Establishing a Master Data Management environment means having a data “common denominator”

Master Data Management (MDM) provides valuable guidance in defining and managing fundamental data principles in an organization. In smaller organizations with singular data systems, this can be handled easily by technology and a smart DBA to organize it. However, in larger, distributed operations, MDM becomes complex and requires more than technology to establish, maintain and nurture.




According to Gartner, “MDM is a technology
-
enabled discipline in which business and IT work together
to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the
enterprise’s
official, shared master data assets.”
1

In order to support such an endeavor, organizations
need to scrutinize their data process and flow.

Using CDC, it is possible t
o
enable a “common
denominator”
data set that
coordinates source data
from disparate sys
tems
into a specialized location
or system, while
establishing continuous
updates to that system
from each of the source
systems, ensuring fresh
data at the target system.
Technology can provide
any necessary
transformations and rules
to adjust data as it

passes
to the target system. Resulting target systems can either be accessible for reporting purposes, or can
stand as the MDM default system from which other systems draw their particular data requests. CDC
can establish a reliable mechanism to keep da
ta refreshed at the target system, as well as enable
data
availability at unlimited downstream systems.

CDC Tip #6 - Supporting applications that cannot run on certain databases

There are countless business software applications in use today, for operational as well as analytic purposes. Every business has its reasons for selecting or developing its particular choice of software applications. However, there are occasions when software selected for valid business reasons is not designed to be deployed on the organization’s established platforms or systems. Many organizations then direct their project leaders to re-think the software selection and come up with another product that suits the in-house system.

However, using CDC, organizations have the benefit of selecting whichever software application is best suited to their business objectives, without concern for the underlying platform. Establishing a real-time CDC solution between a database supported by the application and the in-house system database provides an easy and effective solution for data transparency and availability.

CDC Tip #7 - Reducing stress on operational databases

Many organizations rely on one or more critical database systems for production activities, and in some cases these activities run 24x7. Any data request that requires a new connection to that production system results in a performance hit, relative to the processing power and size of the production system. When any number of users is allowed access to the production system, there is a high likelihood of unanticipated downtime or poor performance. User access can be controlled by system administration; however, users may not always be able to dictate exactly when they will need access to that production data.

In the case of analytics, users may need to run extensive queries that consume an extremely large amount of processing power, and each of these requests adds additional stress to the production system. Businesses need to evaluate the number of connections they allow to these highly critical production systems. CDC is invaluable in this scenario: using CDC, organizations can synchronize a copy of key data with a secondary database of choice and keep that replicated data refreshed in real time, using a fraction of the connection time and consumed resources on the production system.
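Some back-of-the-envelope arithmetic shows why the delta is so much cheaper than a full refresh; the row counts and sizes below are illustrative assumptions, not benchmarks.

    # Back-of-the-envelope arithmetic on why replicating only the delta is
    # cheaper than a full refresh. The figures are illustrative assumptions.
    TOTAL_ROWS   = 50_000_000   # rows in the production table
    CHANGED_ROWS = 25_000       # rows modified since the last sync
    ROW_BYTES    = 200          # average row size in bytes

    full_refresh = TOTAL_ROWS * ROW_BYTES
    delta_sync   = CHANGED_ROWS * ROW_BYTES
    print(f"full refresh: {full_refresh / 1e9:.1f} GB")   # 10.0 GB
    print(f"delta sync:   {delta_sync / 1e6:.1f} MB")     # 5.0 MB
    print(f"reduction:    {full_refresh // delta_sync}x") # 2000x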




CDC Tip #8 - Supporting a Disaster Recovery or Backup/Restore Plan

Establishing a disaster recovery plan, or a backup and restore plan, is now mandatory in some industries and is recommended wherever failures cause incalculable disruption to business. However, many disaster recovery and backup/restore solutions are based on replacement of data in block-based fashion, meaning that portions of data are replaced as they are stored, not by their contextual meaning. Using CDC, businesses can have the best of both worlds: they can keep a hot standby copy of their data, while maintaining the most accurate (and freshest) version of the data available in the standby copy, for immediate access in the event of a catastrophe.

CDC Tip #9 - Reducing the cost of database systems

In the case suggested in Tip #7 above, a production server may be a critical component of business data flow, and it is imperative to keep that system operational. When evaluating the costs associated with downtime on that server, it quickly becomes apparent how valuable a secondary system is, and how much it could be worth in business terms. In cases like this, it is usually easy to see that the investment cost of a secondary server, operating system and database (using CDC) may be minor compared to lost business. Another consideration is the option of using open source systems and database software, which can reduce costs enormously. In either of these cases, however, businesses need data integration technology that not only supports CDC, but also supports many different databases, both commercial and open source.

Database system cost can also be reduced through CDC in environments where a legacy, mainframe, or other expensive-to-manage system is the production system. In this case, it makes sense to synchronize data to a less expensive system, reducing the stress on the legacy system and giving it more breathing room and lifetime.

One way of measuring cost reduction in this type of environment is evaluating the expense involved in IT resources: legacy systems often require expensive and hard-to-find expertise. Using CDC to synchronize with a Microsoft SQL Server or MySQL database, for instance, reduces IT cost and offers a substantially larger pool of expertise and resource availability at a lower price point.

CDC Tip #10 - Distributed environments with multiple databases

In some cases, for very valid business reasons, organizations want to maintain a farm of disparate systems. It could be that sensitive customer data must be partitioned, or that, for organizational reasons, it is easier to track and maintain data managed in separate places. In some cases, data may be managed in a virtualized way and the underlying system is a farm of storage devices.

In each of these cases, businesses will still need to correlate data for reporting and analysis, as well as to make operational decisions across the full data set. A top consideration is the speed and accessibility of the data. Some systems apply individual reporting and analysis to each source data set, and then send aggregated information to a corporate reporting site. However, if the information is defined individually by data set, how is the aggregated data going to be identifiable? For example, if each data set is comprised of an individual database and its applications, it will have a unique way of identifying customer attributes. Information such as “reason for buying” or “preferences” may not be easily mapped to the same data in a different stack. Therefore, if each data set provides its own aggregation, reporting on the consolidated data could be misleading. It makes better sense to rationalize the data at a lower level, for instance between the databases, before aggregating data for reporting. This removes the possibility of mismatched detailed data and allows the organization to apply rules or corporate definitions to the data in a uniform way.
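As a small illustration of rationalizing data before aggregation, the sketch below maps each source system’s local “reason for buying” vocabulary onto a shared one, so the consolidated counts are comparable; the mappings and field names are invented.

    # A sketch of rationalizing attributes from two source systems onto a
    # shared vocabulary before aggregation. Mappings and names are invented.
    from collections import Counter

    REASON_MAP = {
        "crm_a": {"ad": "marketing", "promo": "marketing",
                  "referral": "word-of-mouth"},
        "crm_b": {"campaign": "marketing", "friend": "word-of-mouth"},
    }

    def rationalize(source, row):
        row = dict(row)
        row["reason_for_buying"] = REASON_MAP[source].get(
            row.pop("reason"), "other")
        return row

    rows = [rationalize("crm_a", {"customer": 1, "reason": "promo"}),
            rationalize("crm_b", {"customer": 2, "reason": "friend"})]
    print(Counter(r["reason_for_buying"] for r in rows))
    # Counter({'marketing': 1, 'word-of-mouth': 1})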

The secondary advantage of integrating data at the database level is the option for real-time updates. Using CDC at the database level provides the opportunity to populate a data mart of aggregated results with up-to-the-minute updates.

Additionally, CDC can benefit distributed environments in the Software-as-a-Service (SaaS) field. SaaS provides savings in cost and resources to organizations where there might be limited time, budget or expertise allocated for in-house IT management.

In a SaaS environment, the data integration challenges are removed from the business organization and transferred to the SaaS vendor, whose role it is to manage incoming data from multiple client accounts. In addition, the SaaS vendor must understand how to gain meaningful insights from information across these client accounts, in order to increase and substantiate its software business.




CASE STUDY: TriActive Supports SaaS Business Model Using DBMoto™ to Synchronize Customer Data Availability Among Oracle Servers

TriActive, Inc., founded in 1997, is a pioneer in SaaS (software-as-a-service) Systems Management solutions for IT and Managed Service Providers. With offices in Austin (Texas), Asia and Europe, TriActive services a wide range of customers and IT asset configurations.

TriActive offers a fully integrated suite including comprehensive asset management to track client systems and software; reporting and auditing for user change history; software delivery; automated patch management; a 500,000+ title software catalog; Web remote control and diagnostic tools; a comprehensive knowledge base; an end-user self-service center; and more, through a Web-based global help desk. TriActive provides everything IT departments need to identify and manage IT assets, and delivers first-rate service to reduce the complexity and total cost of ownership.

While the TriActive SaaS solution helps customers securely manage their distributed and increasingly mobile IT assets from anywhere, it also gives companies the opportunity to leverage best practices from a growing TriActive community of IT specialists and solution partners.

Business Problem

TriActive depends heavily on the capabilities and scalability of its own internal IT infrastructure to support its SaaS environment. Originally, its IT infrastructure was built entirely on an Oracle 10g database. However, with Web deployment to its clients, whose users ranged from 200 to 20,000 PCs, it quickly became clear that data performance, growth and reliability were critical.

TriActive’s solutions are designed to provide services such as asset identification and matching; i.e., when a new user logs into his/her PC, associated hardware and software are matched to the asset list governed by their organization. With assets stored in dozens of tables per client, the task of identifying and matching them quickly becomes a data performance nightmare.

TriActive moved to an Oracle RAC (clustering) database environment (Oracle RAC configurations provide distributed processing but look just like an Oracle database on a single server to database applications); however, performance was still not acceptable for the query response times required by the clients.

In order to improve data scalability and reliability, TriActive decided it would be better to divide the database into a group of smaller databases, coordinating customer information in a way that made data organization and quick access more realistic. However, once the decision was made to proliferate into 4 to 5 databases, a new requirement surfaced: how to synchronize the data between these databases so that TriActive’s Web applications are always served, and clients are always getting updated and consolidated information?




Selection Criteria

TriActive went on a search for a cross-database replication tool that could handle replication between various relational databases, including Oracle, PostgreSQL and potentially MySQL, in view of a future growth plan toward open source databases. TriActive did some research and discovered both open source and commercial software products for data replication.

“We evaluated a couple of open source tools for this, but experienced a great deal of configuration trouble,” said Steve Sinnott, IT Director at TriActive. “Then we downloaded a trial version of DBMoto, configured it as proof of concept for our replication needs, and verified that it worked properly in our test environment. Although not originally identified as a requirement, the graphical user interface in DBMoto was much easier to use than the text-based configuration used by other tools.”

TriActive implemented DBMoto in a few days, deploying DBMoto for Oracle-to-Oracle replication in order to make data transparently available across its customer databases and web applications. “Although the setup was not as easy as we would have hoped, the DBMoto support team did a good job of pointing us in the right direction,” added Steve. TriActive is planning on using the DBMoto software for Oracle-MySQL replications next.

Problem Solved

One of the services TriActive provides is matching software packages that reside on customer-remote PCs to tables of application identification and authentication on the TriActive system. TriActive has aggregated a large amount of application description information from its many customers and is able to use that definition knowledge base to identify packages and versions on new customer systems. This requires static and mildly variable data replications for functions such as software signature matching, where data must be updated among two or more databases in order to accurately compare signatures.

Says Steve, “We have encountered various challenges, such as how to manage foreign key constraints and replication groupings, and DBMoto just powers through it!”

Serving organizations with large numbers of users is no easy task when they are constantly upgrading PCs, adding new software, and moving positions in the company. Each change requires a connection to the database to identify and authenticate the changed applications and hardware. DBMoto’s advanced technology reduces the stress on systems when updating these large databases: through Change Data Capture technology, DBMoto updates only the changes (inserts, updates, deletes), eliminating the need for intensive full-system copies that other products would require even when there are only small changes. As TriActive’s customers grow in size, TriActive expects to be able to service increased PC and application activity, and is very pleased with DBMoto’s scalability. “DBMoto runs on a separate system than our Oracle servers, so there isn’t any issue in the application’s scalability,” Steve explains.

“Data replication to numerous global datacenters is core to making both the software and customer data available on-demand. This replication must happen quickly and accurately. DBMoto does this for us.” (Steve Sinnott, IT Director, TriActive, Inc.)




“As a non-DBA expert, I am thrilled to have a software package that was easy to understand and deploy, and that manages my data synchronization for me. It’s low maintenance and seems to be logically set up. All of my mirroring replications and groupings are easy, and the software just works!”

TriActive’s average database size is 100GB and growing. According to Steve, “One of our databases contains 800 customers with more than 161,000 desktops. The nice part about DBMoto is that once you’ve learned the concept, it’s really easy to set it up and then forget about it. We haven’t touched our replications in a couple of weeks now. DBMoto works extremely well and very fast, in my opinion. I would recommend this product to anyone who wants a cross-database replication product but doesn’t have data replication expertise.”

Major Benefits of using DBMoto for Data Integration

TriActive links its asset management data with help ticket data in order to quickly resolve and fix missing applications, out-of-date packages and user authorizations. This ability to manage assets remotely is a huge help for the customer, freeing up valuable IT time, and is enabled through the use of DBMoto’s replication technology to match and associate data between database tables.

TriActive also supports a growing set of management consultants who depend on TriActive’s systems management applications for their clients. The ability of DBMoto to constantly update and synchronize data across multiple databases gives TriActive the ability to confidently support any number of consultants and end-customers. “DBMoto has reduced the number of man-hours in a process that we've used for over 10 years,” said Barry Meyer, CTO at TriActive.

Steve Sinnott reports that he has been very impressed with DBMoto’s error management system. “Its graphical interface allowed me to quickly spot errors in logs, and made it easy to troubleshoot after I had entered the wrong credentials.” He added, “DBMoto has cut the time of our software roll-outs by 20%. Since our system updates occur in scheduled maintenance windows, or during downtime for our customers, this means that we can reduce the impact on our customers’ business and be more predictable as we scale out our business.”




Summary

Change Data Capture is a fundamental methodology in data integration, well-suited to improving the speed, accuracy and ease with which critical data is made available to both business analytics and operations. CDC is most efficient when it can be applied through transaction-log reading. If you are searching for CDC solutions, look for options that support log reading for many different databases, and that can support bi-directional synchronization in real time, in order to take advantage of the Ten Tips provided above.

When implementing CDC, evaluate critical factors such as:

• How much time the organization can afford to spend on implementing and maintaining data integration projects in order to reach business goals
• How many and which databases will be involved (some data integration solutions only support certain databases)
• How many in-house staff are available to support the project, and how much training they have
• Important functionality (versus bloatware) in software products: purchase only what is needed

Each of these factors can affect the cost and time-to-completion of a CDC project, which in turn reflects on the organization’s ability to respond quickly and successfully in making important business data available to departments, lines of business and executive management.

About HiT Software, Inc.

For more than a decade, HiT Software products have been providing access to critical data, enabling data availability and offering programming-free data integration across enterprise systems. HiT Software’s standards-based products perform real-time, bi-directional replication between all major databases; execute real-time, bi-directional transformations between XML and all major databases; and connect applications to IBM DB2 databases via .NET, OLE DB, ODBC and JDBC standards. Founded in 1994 and based in San Jose, California, HiT Software is relied upon by thousands of organizations in virtually all vertical markets around the globe. Additional information is available at www.hitsw.com, through e-mail at info@hitsw.com, or by telephone at +1 (408) 345-4001.




©2010, HiT Software, Inc. All Rights Reserved. All trademarks or registered trademarks are the property of their respective owners.

1015-11300-002_a