Streams working with 1 and 800 rows but not 90000 [message #353424] Mon, 13 October 2008 17:39
rmantha
Messages: 4
Registered: October 2008
Location: Rochester, NY, USA
Junior Member
I had successfully implemented a one-way Streams replication process between two new databases that I created... for a POC...

I used a script from the Streams section on www.oracle.com...

It worked fine with 1 row... split second copy...

It worked fine with 800 rows... split second copy...

90000 rows... NEVER Copied...

1. Why not?

2. I have tried to correlate the data from the dba_% and v$stream% views for all three stages (CAPTURE, PROPAGATE and APPLY). Is there a quick checklist for finding which stage the stream is stuck in... and which data from which views to look at for each of the three stages...
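
For reference, the sort of checks I have been trying look roughly like this (view names as documented for 10.2; a sketch, not my exact script):

-- Capture side: is the process enabled, and what is it doing?
SELECT capture_name, status, error_message FROM dba_capture;
SELECT capture_name, state, total_messages_captured, total_messages_enqueued
  FROM v$streams_capture;

-- Propagation: status and any errors
SELECT propagation_name, status, error_message FROM dba_propagation;

-- Apply side: enabled? any errors queued?
SELECT apply_name, status, error_message FROM dba_apply;
SELECT apply_name, local_transaction_id, error_message FROM dba_apply_error;

-- Backlog in the buffered queue on the source
SELECT queue_name, num_msgs, spill_msgs FROM v$buffered_queues;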

Please help... trying to perform the POC, then configure a BETA system with the true tables, and implement on Production in January...

Not very much time to do this, so I can't be stuck on the POC for long...

Thanks Ahead!!!

Ron.
Re: Streams working with 1 and 800 rows but not 90000 [message #353669 is a reply to message #353424] Tue, 14 October 2008 09:16
varunvir
Messages: 389
Registered: November 2007
Senior Member
Streams replication is efficient up to 30000 records, so it is not a good idea to replicate more than 30000 records at one time.
Re: Streams working with 1 and 800 rows but not 90000 [message #353685 is a reply to message #353669] Tue, 14 October 2008 11:10
Michel Cadot
Messages: 68625
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
varunvir wrote on Tue, 14 October 2008 16:16
Streams replication is efficient up to 30000 records, so it is not a good idea to replicate more than 30000 records at one time.

Where did you get that?

Regards
Michel

Re: Streams working with 1 and 800 rows but not 90000 [message #353696 is a reply to message #353685] Tue, 14 October 2008 12:42
rmantha
Messages: 4
Registered: October 2008
Location: Rochester, NY, USA
Junior Member
I did do a test yesterday for 30000 rows and it worked fine, but again not for 90000... I didn't try anything in between...

Once I got the message from varunvir, I thought I would try testing commit intervals of 10000... I thought it would probably work...

I performed tests committing every 10000 rows, once for 30000 rows and once for 90000. I first tried to insert 30000 rows with interval commits of 10000... WORKED GREAT... Then tried 90000 with 10000-row interval commits...
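
(For what it's worth, the test loop was shaped roughly like this; a sketch only, and poc_table and its columns are made-up names, not my real table:)

BEGIN
  FOR i IN 1 .. 90000 LOOP
    INSERT INTO poc_table (id, payload) VALUES (i, 'row ' || i);
    IF MOD(i, 10000) = 0 THEN
      COMMIT;  -- commit every 10000 rows
    END IF;
  END LOOP;
  COMMIT;  -- pick up any remainder
END;
/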

FUNNY thing happened... I saw it commit on the destination at 10000, 20000, 30000, THEN...

NO More.... something seems to stop it at 30000 rows no matter what...

From my point of view, it doesn't just work poorly... it doesn't work at all past 30000 rows... something must be stopping it... at 30000... If anybody has insight, that would be great...
Re: Streams working with 1 and 800 rows but not 90000 [message #353701 is a reply to message #353696] Tue, 14 October 2008 13:47
rmantha
Messages: 4
Registered: October 2008
Location: Rochester, NY, USA
Junior Member
Another test and note for you....

I decided to run ADDM on the origin database and found that it mentioned Streams pool memory was insufficient and enqueues were stopping...

I thought this might be it....

So I shut everything down and increased streams_pool_size from 256M to 500M...
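
The change was along these lines (assuming an spfile; adjust SCOPE as needed):

-- increase the Streams pool
ALTER SYSTEM SET streams_pool_size = 500M SCOPE = BOTH;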

Remember that before, it could never get past 30000 rows...

Once I increased the memory, it allowed me to perform a 60000-row insert (10000-row interval commits). Success!!!

So I deleted the 60000 rows (success again, in 30000-row increments)...

Then I tried again, inserting 90000 (with 10000-row interval commits)... I thought at least it would insert 60000, because that had worked...

It wasn't doing it... I waited about 5 to 6 minutes while checking other things... went back to see if it had inserted 90000 on the destination... it had inserted 10000 and then 20000 (after 5 to 10 minutes it started)... and NOW IT IS STUCK at 20000 rows on the destination...

streams_pool_size seemed to be the problem, and the increase seemed to fix things temporarily...

It seems like streams pool memory is not being released quickly enough after interval commits...
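
To watch whether the pool actually drains after each commit, I have been running queries roughly like these (standard 10.2 views, so treat it as a sketch):

-- current streams pool usage
SELECT pool, name, bytes FROM v$sgastat WHERE pool = 'streams pool';

-- buffered queue backlog and how much has spilled to disk
SELECT queue_name, num_msgs, spill_msgs FROM v$buffered_queues;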

Any thoughts....
Re: Streams working with 1 and 800 rows but not 90000 [message #353727 is a reply to message #353701] Tue, 14 October 2008 21:45
varunvir
Messages: 389
Registered: November 2007
Senior Member
Even Oracle recommends not bulk-loading more than 30000 records in a Streams environment in Oracle 10g. You can commit after every 30000 records, or you can set a tag and run the scripts on both sides.
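
For the tagging approach, something roughly like this in the loading session (a sketch only; the tag value is arbitrary and your capture/apply rules must be set up to ignore tagged changes):

-- mark changes from this session so the Streams rules can skip them
BEGIN
  DBMS_STREAMS.SET_TAG(tag => HEXTORAW('1D'));
END;
/
-- ... run the bulk load here ...
BEGIN
  DBMS_STREAMS.SET_TAG(tag => NULL);  -- back to normal replication
END;
/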
Re: Streams working with 1 and 800 rows but not 90000 [message #353741 is a reply to message #353727] Tue, 14 October 2008 23:34
Michel Cadot
Messages: 68625
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator
varunvir wrote on Wed, 15 October 2008 04:45
Even Oracle recommends not bulk-loading more than 30000 records in a Streams environment in Oracle 10g...

Where? Post your reference.

Regards
Michel

Re: Streams working with 1 and 800 rows but not 90000 [message #353910 is a reply to message #353741] Wed, 15 October 2008 10:11
varunvir
Messages: 389
Registered: November 2007
Senior Member
Hi Michel,
I read an article about that. Anyway, I am taking my words back because I don't have any reference for it.
Regards,
Varun Punj
Re: Streams working with 1 and 800 rows but not 90000 [message #353934 is a reply to message #353424] Wed, 15 October 2008 13:03
rmantha
Messages: 4
Registered: October 2008
Location: Rochester, NY, USA
Junior Member
Increased streams_pool_size from 33M to 250M (trying a 90000-row insert but only able to insert 30000 rows)...
Increased streams_pool_size from 250M to 500M (trying a 90000-row insert but only able to insert 60000 rows)...
Increased streams_pool_size from 500M to 2G (trying a 500000-row insert but only able to insert 200000 rows)...

Seems like streams pool memory is not getting released...

These inserts were done with 10000-row commit intervals... that memory should be getting released at every commit... I could understand if I were trying to insert 500000 rows in one shot that 2G might not be enough memory, but this was all done in 10000-row commit increments...
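
One thing that might be worth checking (a standard 10.2 view, so just a suggestion) is whether the capture process is being throttled rather than running out of memory:

-- a state of 'PAUSED FOR FLOW CONTROL' would mean the downstream side is not keeping up
SELECT capture_name, state FROM v$streams_capture;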

Any thoughts on this...
Re: Streams working with 1 and 800 rows but not 90000 [message #355549 is a reply to message #353934] Sat, 25 October 2008 02:37
Dipali Vithalani
Messages: 278
Registered: March 2007
Location: India
Senior Member
Hi Rmantha,

The performance of your propagation or apply process may also cause problems for your capture process.

For Streams replication to work properly, the capture, propagation and apply processes must all work together. Conceptually, the capture process enqueues LCRs into a queue; the propagation process propagates them, and the apply process dequeues and applies them.

If you have tuned your capture site as well as possible by giving it more resources (in your case, allocating more and more streams buffer), but the apply process at your apply site is not properly tuned and does not have enough resources, you will not get much benefit from the capture-site enhancements. The capture process will be fast at capturing and enqueuing changes (LCRs), but at the other end the apply process is slow, out of sync with the capture process, and not dequeuing LCRs at the expected speed. After some time the queue fills up and the capture process cannot enqueue more LCRs until the apply process frees space. In that case you will see the problem "ORA-25307: enqueue rate too high, flow control enabled".
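
As a rough illustration (standard 10.2 views and packages; the apply process name below is just a placeholder), you can check the backlog and, if the apply side turns out to be the bottleneck, try giving it more parallel servers:

-- how far behind is the apply side?
SELECT apply_name, state, total_messages_dequeued FROM v$streams_apply_reader;
SELECT queue_name, num_msgs, spill_msgs FROM v$buffered_queues;

-- example only: increase apply parallelism
BEGIN
  DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'MY_APPLY',   -- placeholder name
    parameter  => 'parallelism',
    value      => '4');
END;
/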


Moral of the story: have a look at the propagation and apply processes too... they may also be the reason your capture process is not performing according to your expectations.


Regards,
Dipali..
Re: Streams working with 1 and 800 rows but not 90000 [message #361541 is a reply to message #355549] Wed, 26 November 2008 21:18
webjimmy
Messages: 7
Registered: December 2005
Location: MD
Junior Member
Hi Dipali,
You made a good analysis. We have a bi-directional Streams setup and it frequently breaks under heavy data loading (not from the application, just bulk data loads). The two database servers seem to have different parameter values. Based on your analysis, I need to set all the parameters (like stg, streams pool) to the same values.
Also, how do I know which parameters are involved in the propagation and apply processes?

Any suggestions for data loading in a Streams environment?

Nice holiday
webjimmy