After spending a number of years with Ab Initio Software Corporation, this practice leader at a leading consulting firm now helps customers establish Ab Initio policies and development environments. Follow along as he shares his knowledge of enterprise problem solving and appropriate approaches, the pitfalls associated with various dimensions of the solution domain, and, as a centerpiece of the discussion, how the Ab Initio product suite applies to any or all of it.

One group actually pulls "all" fifty million records from a table into a graph, preps them with update/delete/insert logic, truncates the original table, and bulk-loads the result. For them, this is far faster than attempting updates, or even delete/inserts, against a table of this size.

We are designing a batch process that will read around 10 million records from various tables and then gather their associated details from other related tables.
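The truncate-and-reload pattern described above can be sketched against a toy table. This is a minimal illustration using Python's sqlite3 module; the table name, columns, and in-memory transform are hypothetical, and sqlite spells TRUNCATE as an unqualified DELETE:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, 100.0) for i in range(5)])

# Step 1: pull all rows out of the table.
rows = conn.execute("SELECT id, balance FROM accounts").fetchall()

# Step 2: apply the update/delete/insert prep in memory
# (a hypothetical transform: drop row 0, add 10% to the rest).
prepped = [(rid, bal * 1.1) for rid, bal in rows if rid != 0]

# Step 3: truncate the original table and bulk-load the prepped
# rows back in a single transaction.
with conn:
    conn.execute("DELETE FROM accounts")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", prepped)

print(conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0])  # 4
```

The point of the pattern is that a sequential bulk load avoids per-row index maintenance and logging; whether it beats in-place updates depends on the table, indexes, and load utility in question.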
Next time we might select SAMPLE to reduce the time spent updating statistics.
Currently we are using batched INSERTs. We can't switch to LOAD DATA because we rely on INSERT ... ON DUPLICATE KEY UPDATE, and LOAD DATA has no equivalent ON DUPLICATE KEY functionality.
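A batched upsert of the shape described above might look like the following sketch, which builds a multi-row MySQL INSERT ... ON DUPLICATE KEY UPDATE statement; the table and column names are hypothetical:

```python
def build_batch_upsert(table, columns, rows):
    """Build a multi-row MySQL INSERT ... ON DUPLICATE KEY UPDATE.

    All rows go into one statement; on a duplicate key, the non-key
    columns are overwritten with the incoming values.
    """
    placeholders = "(" + ", ".join(["%s"] * len(columns)) + ")"
    values_clause = ", ".join([placeholders] * len(rows))
    # VALUES(col) refers to the value that would have been inserted.
    update_clause = ", ".join(f"{c} = VALUES({c})" for c in columns[1:])
    sql = (f"INSERT INTO {table} ({', '.join(columns)}) "
           f"VALUES {values_clause} "
           f"ON DUPLICATE KEY UPDATE {update_clause}")
    params = [v for row in rows for v in row]
    return sql, params

sql, params = build_batch_upsert(
    "details", ["id", "name", "qty"],
    [(1, "a", 10), (2, "b", 20)])
```

The statement and flattened parameter list would then be handed to the driver's execute call; batching many rows per statement is what makes this competitive with LOAD DATA while keeping the upsert semantics.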
These are the troubleshooting steps and the resolution we used to help a customer explore faster ways to update a 100-million-row table.
The customer was running an UPDATE query affecting 100 million rows.
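One common resolution for an UPDATE of this size, sketched here as an assumption rather than a record of what this customer ultimately did, is to break the statement into primary-key ranges so that no single transaction touches all 100 million rows. A minimal sqlite3 illustration, with a toy table and chunk size standing in for the real ones:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, flag INTEGER)")
conn.executemany("INSERT INTO big VALUES (?, 0)",
                 [(i,) for i in range(1, 1001)])

CHUNK = 100  # in practice this might be tens of thousands of rows
low = conn.execute("SELECT MIN(id) FROM big").fetchone()[0]
high = conn.execute("SELECT MAX(id) FROM big").fetchone()[0]

# Update one key range at a time, committing after each chunk,
# which keeps the transaction log and lock footprint small.
for start in range(low, high + 1, CHUNK):
    with conn:
        conn.execute("UPDATE big SET flag = 1 WHERE id BETWEEN ? AND ?",
                     (start, start + CHUNK - 1))

print(conn.execute(
    "SELECT COUNT(*) FROM big WHERE flag = 1").fetchone()[0])  # 1000
```

The trade-off is that the table passes through intermediate states between chunks, so this only applies when the update is idempotent and partial progress is acceptable.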
The author has no business relationship with Ab Initio Software Corporation, nor does his consulting firm.