Need to generate some test data. This insert is 800,000 x 1,000 rows. I know that's a lot, but this is for a real application where the random value will be a calculated number.
How can I break this up so the transaction log does not fill up?
insert into sIDcrossMatch
select
docSVsys1.sID, docSVsys2.sID, Abs(Checksum(NewId())) % 100 As RandomInteger
from docSVsys as docSVsys1
join docSVsys as docSVsys2
on docSVsys1.sID <> docSVsys2.sID
where docSVsys1.sID < 1000
order by docSVsys1.sID, docSVsys2.sID
Ideally it would insert one docSVsys1.sID at a time without filling up the transaction log, something like the sketch below.
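A minimal sketch of what I have in mind (assuming sID values are positive integers below 1000; the @sID counter is just for illustration):

declare @sID int = 1;
while @sID < 1000
begin
    -- insert the cross-match rows for a single docSVsys1.sID per iteration
    insert into sIDcrossMatch
    select docSVsys1.sID, docSVsys2.sID, Abs(Checksum(NewId())) % 100 As RandomInteger
    from docSVsys as docSVsys1
    join docSVsys as docSVsys2
        on docSVsys1.sID <> docSVsys2.sID
    where docSVsys1.sID = @sID;

    set @sID = @sID + 1;
end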
Since that is your test database, set the recovery model to SIMPLE first and then let the log grow as much as you can give it space for (add more files if needed). And be sure that you understand the consequences of these settings.
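For example (a sketch only; the database name TestDb and logical log file name TestDb_log are assumptions, substitute your own):

ALTER DATABASE TestDb SET RECOVERY SIMPLE;

-- optionally let the existing log file grow as far as the disk allows
ALTER DATABASE TestDb
MODIFY FILE (NAME = TestDb_log, MAXSIZE = UNLIMITED, FILEGROWTH = 512MB);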
Next step, or first step if you can't change the recovery model and allow the log to grow: split your insert statement into multiple insert statements by adding a where clause, like in the sketch below.
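A minimal sketch of the split, assuming sID is an integer column; each statement inserts only the outer rows whose sID has a given remainder when divided by 2:

insert into sIDcrossMatch
select docSVsys1.sID, docSVsys2.sID, Abs(Checksum(NewId())) % 100 As RandomInteger
from docSVsys as docSVsys1
join docSVsys as docSVsys2
    on docSVsys1.sID <> docSVsys2.sID
where docSVsys1.sID < 1000
  and docSVsys1.sID % 2 = 0;

insert into sIDcrossMatch
select docSVsys1.sID, docSVsys2.sID, Abs(Checksum(NewId())) % 100 As RandomInteger
from docSVsys as docSVsys1
join docSVsys as docSVsys2
    on docSVsys1.sID <> docSVsys2.sID
where docSVsys1.sID < 1000
  and docSVsys1.sID % 2 = 1;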
If that is not enough, increase the divider (2) and add more insert statements, one per remainder value. The idea behind multiple inserts is that each statement writes less log, and under the SIMPLE recovery model the log space can be reused between statements.
Or, if it is possible for you, use SSIS with one source component running your select query and one destination component (don't forget to set the batch size on the destination).