Pick two different machines, each with an Oracle database: one is the TARGET,
whose activity we will sample, and one is the REPOSITORY, where the samples
will be stored.
On the TARGET, connect in sqlplus as the Oracle user SYS and
create a database link that connects to the REPOSITORY database as
any user there; call that user REPO.
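A minimal sketch of the link creation; the user name REPO, the password, and the TNS alias REPO_DB are all assumptions to substitute with your own:

```sql
-- Run on the TARGET as SYS. The link connects to the REPOSITORY
-- database as the user REPO (names here are placeholders).
CREATE DATABASE LINK repo_link
  CONNECT TO repo IDENTIFIED BY repo_password
  USING 'REPO_DB';

-- Quick sanity check that the link resolves and connects:
SELECT * FROM dual@repo_link;
```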
On the REPOSITORY, create the repository table that will hold the samples.
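A sketch of such a table, with a hypothetical column set modeled on V$SESSION / V$ACTIVE_SESSION_HISTORY; adjust it to whatever your collection package actually samples:

```sql
-- Run on the REPOSITORY as the user REPO. Columns are assumptions.
CREATE TABLE ash_data (
  sample_time     DATE,
  session_id      NUMBER,
  session_serial# NUMBER,
  sql_id          VARCHAR2(13),
  event           VARCHAR2(64),
  wait_class      VARCHAR2(64),
  session_state   VARCHAR2(7)   -- 'ON CPU' or 'WAITING'
);
```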
On the TARGET, connect in sqlplus as the user SYS and
create the package that collects the ASH data.
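A minimal sketch of what such a package can look like (the package and procedure names, and the column list, are assumptions): it samples the active user sessions from V$SESSION once per second, inserting locally for now.

```sql
-- Run on the TARGET as SYS. DBMS_LOCK.SLEEP needs EXECUTE on DBMS_LOCK.
CREATE OR REPLACE PACKAGE ashpkg AS
  PROCEDURE collect(v_loops NUMBER, v_sleep NUMBER);
END ashpkg;
/
CREATE OR REPLACE PACKAGE BODY ashpkg AS
  PROCEDURE collect(v_loops NUMBER, v_sleep NUMBER) IS
  BEGIN
    FOR i IN 1 .. v_loops LOOP
      -- One sample: every active user session right now.
      INSERT INTO ash_data
        SELECT SYSDATE, sid, serial#, sql_id, event, wait_class,
               CASE WHEN state = 'WAITING' THEN 'WAITING'
                    ELSE 'ON CPU' END
        FROM   v$session
        WHERE  status = 'ACTIVE' AND type = 'USER';
      COMMIT;
      DBMS_LOCK.SLEEP(v_sleep);
    END LOOP;
  END collect;
END ashpkg;
/
```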
You can test the package with commands like the following in sqlplus
set serveroutput on
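For example, assuming the package and procedure names sketched above:

```sql
set serveroutput on
-- Collect 5 samples, one second apart (names are assumptions):
exec ashpkg.collect(5, 1)
-- Confirm rows arrived:
select count(*) from ash_data;
```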
Now we need to point the package at the REPOSITORY machine:
edit the package so that it inserts over the database link into the
REPOSITORY table ASH_DATA.
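Inside the package body, the local insert becomes an insert over the link; the link name repo_link and the column list are the assumptions used earlier:

```sql
-- The only change is the @repo_link suffix on the table name.
INSERT INTO ash_data@repo_link
  SELECT SYSDATE, sid, serial#, sql_id, event, wait_class,
         CASE WHEN state = 'WAITING' THEN 'WAITING'
              ELSE 'ON CPU' END
  FROM   v$session
  WHERE  status = 'ACTIVE' AND type = 'USER';
COMMIT;
```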
Now we are ready to run our ASH collection: basically you just run
the collection commands in a loop for as long as you want to
collect ASH data.
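A hypothetical driver loop using the names sketched above (interrupt the session to stop it):

```sql
-- Each pass collects one minute of 1-second samples, forever.
BEGIN
  LOOP
    ashpkg.collect(60, 1);
  END LOOP;
END;
/
```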
Eventually you will run out of room. You can delete old data,
but deleting is costly; you probably want to partition the ASH data
by day and then drop the old days that you no longer need.
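A sketch of a daily-partitioned version of the repository table, using interval partitioning (11g and later; on older versions you would pre-create explicit range partitions). This would replace the plain CREATE TABLE shown earlier; the names and the boundary date are placeholders:

```sql
CREATE TABLE ash_data (
  sample_time     DATE,
  session_id      NUMBER,
  session_serial# NUMBER,
  sql_id          VARCHAR2(13),
  event           VARCHAR2(64),
  wait_class      VARCHAR2(64),
  session_state   VARCHAR2(7)
)
PARTITION BY RANGE (sample_time)
INTERVAL (NUMTODSINTERVAL(1,'DAY'))
(PARTITION p0 VALUES LESS THAN (TO_DATE('2012-01-01','YYYY-MM-DD')));

-- Dropping an old day is a cheap metadata-only operation:
ALTER TABLE ash_data
  DROP PARTITION FOR (TO_DATE('2012-01-01','YYYY-MM-DD'));
```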
Before dropping a day's worth of ASH it might be worth
extracting some data first. The most obvious thing to extract
is the count of each of I/O, WAIT and CPU over fixed-size
buckets, say one-minute buckets. This data extraction will be
pretty small, one row per minute. With this data it's easy
to chart the performance of the database over days, weeks
or months without using much storage space.
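One way to sketch that extraction, using the column names assumed above (the summary table name is also an assumption): one row per minute per activity class, counting sampled sessions that were on CPU, in an I/O wait, or in some other wait.

```sql
CREATE TABLE ash_by_minute AS
SELECT TRUNC(sample_time, 'MI') AS minute,
       CASE WHEN session_state = 'ON CPU'   THEN 'CPU'
            WHEN wait_class LIKE '%I/O%'    THEN 'IO'
            ELSE 'WAIT' END                 AS activity,
       COUNT(*)                             AS cnt
FROM   ash_data
GROUP  BY TRUNC(sample_time, 'MI'),
       CASE WHEN session_state = 'ON CPU'   THEN 'CPU'
            WHEN wait_class LIKE '%I/O%'    THEN 'IO'
            ELSE 'WAIT' END;
```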