An electric utility with an energy crisis
- By Kathleen Ohlson
- July 1, 2005
American Electric Power faced a different type of energy problem a few years
ago when it found one of its primary databases was overflowing with customer
information, throttling performance.
AEP supports more than 5 million customers linked across 11 states, all connected
to its electricity transmission and distribution grid. The wholesale marketer
of energy commodities owns more than 36,000 megawatts of generating capacity
in the U.S., taking in more than $1 billion in annual revenue.
Data generated by the utility’s MACSS systems (shorthand for marketing, customer
service, order processing and other business-critical apps) flowed in a constant
stream into an IBM DB2 database. “We were struggling and cramming
it in a bag,” says Gary Girard, AEP CIS Application Lead. “We realized
the performance issues through the morass of data.”
The times for batch processes crept steadily upward in sync with the increasing
volume of data. With 2,000 batch jobs to run per night, the process became excruciatingly
slow. AEP selected Princeton Softech’s Archive for DB2, in part because
it was already using Princeton’s Relational Tools for test data management.
Before initiating the project, AEP needed the approval of a slew of
authorities—internal and external auditors, regulatory commissions and
sponsors of the business project—to show that archived customer information
would still be available online immediately.
“There were millions of rows to get rid of and they [needed to] realize
they could get it back,” Girard says. “We’re saying: ‘We’re
deleting this data, so trust us, we’ll get it back.’ I understand
their reluctance. We had to gain their trust and [show] the process does work.”
The other issue was justifying the cost, but it was difficult to show savings.
“The bean counters want dollars and cents, and this [project] particularly
couldn’t really say that,” Girard says. The tangible benefits would
be improved response times, faster batch runs and getting data into users’ hands.
AEP installed Archive for DB2 into its test environment for a few months, testing
the archiving capabilities in a product simulation environment. After getting
the go-ahead, the IT staff implemented the tool in two phases, initially concentrating
on full archiving and then implementing it for partial archiving.
The full archiving phase focused on historical data for AEP’s inactive
customers. Girard and his team developed rules to identify how much
historical data for these accounts needed to be maintained online, and how much
data could be archived and purged from the production database.
“It wasn’t a substantial volume, [but] it was a good way to get
the ball rolling and very low risk,” he says. AEP now archives inactive
customer accounts every quarter.
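The article doesn’t publish AEP’s actual retention rules, but the approach Girard describes—decide how much history an inactive account keeps online, then archive and purge everything older—can be sketched roughly like this. All names, thresholds and record layouts below are illustrative assumptions, not AEP’s implementation.

```python
from datetime import date, timedelta

# Assumed retention window for inactive accounts; AEP's real figure is not stated.
RETENTION_DAYS = 365

def partition_rows(rows, today, retention_days=RETENTION_DAYS):
    """Split history rows into (keep_online, archive_and_purge) by age.

    Rows older than the retention cutoff are candidates for the quarterly
    archive-and-purge run; newer rows stay in the production database.
    """
    cutoff = today - timedelta(days=retention_days)
    keep, archive = [], []
    for row in rows:
        (archive if row["activity_date"] < cutoff else keep).append(row)
    return keep, archive

# Hypothetical history rows for one inactive account.
rows = [
    {"account": "A1", "activity_date": date(2003, 5, 1)},   # outside window
    {"account": "A1", "activity_date": date(2005, 6, 15)},  # inside window
]
keep, archive = partition_rows(rows, today=date(2005, 7, 1))
```

In practice the same rule would run as SQL against the DB2 tables; the point of the sketch is only the low-risk shape of the job: classify by age first, archive, verify, then purge.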
Though the team couldn’t quantify immediate performance gains, Girard says
the performance degradation “stopped in its tracks.”
“We can’t put our fingers on it and say we saved 2 billion
rows of data, the response time was X number, and the batch number was X number,”
Girard says. “It’s safe to say performance was markedly improved.”
The partial archive phase was recently implemented, focusing on archiving years
of historical data for active customers. AEP archived the oldest data until
reaching a point where there was only 3 or 4 years of historical data available
for immediate access. AEP now runs the partial archive process monthly.
The utility’s goal is to reach a plateau where it can archive and remove
rows from the MACSS database at approximately the same rate data is being added.
Information added to the database currently increases by about 125 million rows
per week, and AEP is next contemplating keeping only 2 years of online
data available for real-time access.
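The plateau goal implies a concrete target for the monthly partial-archive run. A back-of-the-envelope check, using only the ~125 million rows per week figure stated above:

```python
# To hold database size flat, the monthly archive run must remove rows at
# roughly the rate they arrive. Growth figure is the one cited in the article.
ROWS_PER_WEEK = 125_000_000
WEEKS_PER_MONTH = 52 / 12  # average weeks per calendar month

rows_per_month = ROWS_PER_WEEK * WEEKS_PER_MONTH
print(f"monthly archive target: {rows_per_month:,.0f} rows")
```

That works out to roughly 542 million rows that each monthly run must archive and purge just to tread water, which is consistent with Girard’s closing remark below.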
Even with the successful archiving project, AEP still has more work to do.
“We get rid of what’s there, and we have more coming in,
we’re pretty much treading water,” Girard says.
Kathleen Ohlson is senior editor at Application Development Trends magazine.