Validating databases after migration (Oracle 8i to 10g)


Block media recovery is only intended for use where a known and limited number of blocks is affected.

This results in a reduced mean time to recovery (MTTR) and higher availability, as only the affected blocks are offline during the operation.
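A minimal RMAN sketch of block media recovery, assuming the damaged blocks have already been identified; the datafile and block numbers below are placeholder example values, not from the thread. In 10g the command is BLOCKRECOVER; from 11g onward the syntax is RECOVER ... BLOCK:

```
RMAN> -- 10g syntax: repair two specific corrupt blocks (example file/block numbers)
RMAN> BLOCKRECOVER DATAFILE 7 BLOCK 3, 5;

RMAN> -- 11g syntax: repair every block listed in V$DATABASE_BLOCK_CORRUPTION
RMAN> RECOVER CORRUPTION LIST;
```

Only the named blocks are taken through media recovery; the rest of the datafile stays online throughout.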

You mean if I am using Streams for this scenario, I do not need a pre-created copy of the database, and instantiating the tables will also create them at the destination?

As per the note below, it seems we need to have a copy of the object at the destination before instantiating it - I thought we needed to use exp/imp (or RMAN, if the platforms are the same) to create the database copy first, and then use Streams to sync the source and destination.
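A hedged sketch of how instantiation and object creation can happen in one step with Data Pump, which is one common way to instantiate Streams tables (the connect strings, directory, and table name are placeholders, not from the thread). Exporting with FLASHBACK_SCN gives a consistent copy, and the import both creates the table at the destination and records the instantiation SCN:

```
-- On the source: capture a consistent SCN for the export
SELECT DBMS_FLASHBACK.GET_SYSTEM_CHANGE_NUMBER FROM dual;

$ expdp system@source TABLES=scott.emp FLASHBACK_SCN=<scn> \
    DIRECTORY=dpump_dir DUMPFILE=emp.dmp

$ impdp system@dest DIRECTORY=dpump_dir DUMPFILE=emp.dmp
-- The import creates SCOTT.EMP at the destination; the apply process
-- then starts applying changes from after the instantiation SCN.
```

If the object already exists at the destination, the instantiation SCN can instead be set explicitly with DBMS_APPLY_ADM.SET_TABLE_INSTANTIATION_SCN.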

Please advise. Thanks, Rohit

January 12, 2011 - am UTC

1) Logically, UTF8 is a superset of WE8ISO8859P1 - every character in WE8ISO8859P1 has a corresponding character in UTF8.
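Before planning the conversion, the current database character set can be confirmed with a query like the following (a standard data-dictionary lookup, not specific to this thread):

```sql
SELECT value
  FROM nls_database_parameters
 WHERE parameter = 'NLS_CHARACTERSET';
```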

At this point the table can be used again, but you will have to take steps to correct any data loss associated with the missing blocks.

Hey Tom, I need to migrate my 9i database on Solaris SPARC to 11g on the Solaris x86 platform. I think the best approach to achieve the downtime requirement is Data Guard, Streams, or GoldenGate, right? Please suggest a good approach to perform such a migration, because we need to resolve the endian incompatibility, change the character set, and avoid downtime as part of this task. Also, as part of the migration, the character set needs to be changed from WE8ISO8859P1 to UTF. Any pointers will be really helpful, Sir. Thanks, Rohit

January 07, 2011 - am UTC

1) Data Guard would not apply - you cannot do Data Guard in a heterogeneous environment. Streams would apply; GoldenGate would apply.

2) You will be rebuilding the database from scratch, which can be accomplished with replication (Streams/GoldenGate). Replication would also minimize downtime to as small as you want it to be.

What is the best (recommended) way to migrate them?

For these small ones, probably Data Pump. If it was really large, a mixture of Data Pump to get the code over and cross-platform transports to get the large bits of data over.
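For the small-database case, the Data Pump route above can be sketched roughly as follows (directory object, file names, and credentials are placeholders). Because the dump file format is platform-independent and character data is converted on import, a full export/import also takes care of both the SPARC-to-x86 endian difference and the WE8ISO8859P1-to-UTF8 character set change:

```
-- On the 9i/WE8ISO8859P1 source (exp for 9i; expdp from 10g onward):
$ expdp system FULL=Y DIRECTORY=dpump_dir DUMPFILE=full.dmp LOGFILE=exp.log

-- Copy full.dmp to the target host, then on the pre-created
-- 11g/UTF8 database on Solaris x86:
$ impdp system FULL=Y DIRECTORY=dpump_dir DUMPFILE=full.dmp LOGFILE=imp.log
```

After the import, check the log for characters that could not be converted, and compare object counts and row counts between source and target to validate the migration.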
