
Re: [Cado-nfs-discuss] First steps towards testing "multi-thread merge" is just getting cado-nfs ?


  • From: Dennis Clarke <dclarke@blastwave.org>
  • To: cado-nfs-discuss@lists.gforge.inria.fr
  • Subject: Re: [Cado-nfs-discuss] First steps towards testing "multi-thread merge" is just getting cado-nfs ?
  • Date: Fri, 19 Apr 2019 12:33:42 -0400

On 4/19/19 12:09 PM, Dennis Clarke wrote:

> As described by Paul Zimmermann in:
>
> https://lists.gforge.inria.fr/pipermail/cado-nfs-discuss/2019-April/001012.html
>
> Seems reasonable to at least get cado-nfs-2.3.0 running and tested first.
>
> However, the README file is not clear.



Reply to myself ... I have moved on to the git repo, which seems to
instantly begin compiling out of the box without much complaint.

I will ignore the tarball and focus on the git repo.

As an aside, Pollard rho will find a factor of the test number
90377629292003121684002147101760858109247336549001090677693 in 1509 secs
with a very, very non-optimized bit of MPFR code:

nix$
nix$ /usr/bin/time -p ./pr_mpfr_quiet 90377629292003121684002147101760858109247336549001090677693 256 1
INFO : will assume debug is requested.

We shall use 256 bits of precision.
INFO : chars_formatted = 67
INFO : buf = '90377629292003121684002147101760858109247336549001090677693'
INFO : perfect match on data input.

We shall find a factor of 90377629292003121684002147101760858109247336549001090677693.000000
------------------------------------------------------
Pollard Rho shall proceed with 256 bits of precision.
------------------------------------------------------
loop 1 count = 3 x = 26.000000 factor = 1.000000
loop 2 count = 5 x = 44127887745906175987802.000000 factor = 1.000000
WARN : mpfr_inexflag_flag is set.
WARN : mpfr_mul() raised a flag
loop 3 All variables are re-initialized with 384 bits of precision.
WARN : mpfr_inexflag_flag is set.
WARN : mpfr_mul() raised a flag
loop 3 All variables are re-initialized with 512 bits of precision.
loop 3 count = 9 x = 89345867934367081695842937468485262216619525981276833552560.000000 factor = 1.000000
loop 4 count = 17 x = 50327363847198913163673390430277771075336745528481605121256.000000 factor = 1.000000
loop 5 count = 33 x = 10637116863020719574506881939454624946320970442470469737542.000000 factor = 1.000000
loop 6 count = 65 x = 42252486238197318513237739384306097331454983139591478381949.000000 factor = 1.000000
loop 7 count = 129 x = 69003710910694564959067691326725079987683288584909121354767.000000 factor = 1.000000
loop 8 count = 257 x = 47685415569767766903103138798832813846717550595622314111318.000000 factor = 1.000000
loop 9 count = 513 x = 11698760607548175189464802426102532243670566422993496057032.000000 factor = 1.000000
loop 10 count = 1025 x = 56671656009952591199431172532702404347232151915954684917404.000000 factor = 1.000000
loop 11 count = 2049 x = 6900954712039672122650181568225115490175707273059762267053.000000 factor = 1.000000
loop 12 count = 4097 x = 72842514216465191669381867874270001414502037632800868121458.000000 factor = 1.000000
loop 13 count = 8193 x = 81015949431655567632391930053543529569449828293909653554489.000000 factor = 1.000000
loop 14 count = 16385 x = 53004322494515190856771488738830993207526086299989787094322.000000 factor = 1.000000
loop 15 count = 32769 x = 28206591016280240087776550181896947245595393328633083663437.000000 factor = 1.000000
loop 16 count = 65537 x = 29724043511501651743029628293290274820546557305732103896404.000000 factor = 1.000000
loop 17 count = 131073 x = 79073461794582885283023073236712159751773462580094145630267.000000 factor = 1.000000
loop 18 count = 262145 x = 43313989103343824055558790711529053527610443358175528059790.000000 factor = 1.000000
loop 19 count = 524289 x = 54963833940218294160787786969406753017306430399280646006556.000000 factor = 1.000000
loop 20 count = 1048577 x = 53752323125081661807493982114112307317492554855619836309418.000000 factor = 1.000000
loop 21 count = 2097153 x = 46353242444574099471380678907766579590718385541425278800225.000000 factor = 1.000000
loop 22 count = 4194305 x = 6278321732476436672083117288727418670625564766756602475941.000000 factor = 1.000000
loop 23 count = 8388609 x = 15920677854179921985609883812237387525308466679024372237089.000000 factor = 1.000000
loop 24 count = 12027312 x = 37961300933121756447396047231390555772723153771771264235646.000000 factor = 760926063870977.000000
DONE : used 512 bits of precision.
: factor of 90377629292003121684002147101760858109247336549001090677693.000000 is 760926063870977.000000
real 1509.45
user 1508.66
sys 0.25
nix$
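
For reference, the algorithm behind that loop is just Pollard rho with
Floyd cycle detection and f(x) = x^2 + 1. A bare integer sketch using
GMP's mpz type (not the MPFR code timed above) looks like:

/* Pollard rho, Floyd variant, f(x) = x^2 + 1 mod n.
 * Bare sketch: no restart logic for the rare case gcd == n. */
#include <stdio.h>
#include <gmp.h>

int main(void)
{
    mpz_t n, x, y, d, diff;
    mpz_init_set_str(n,
        "90377629292003121684002147101760858109247336549001090677693", 10);
    mpz_init_set_ui(x, 2);          /* tortoise */
    mpz_init_set_ui(y, 2);          /* hare */
    mpz_init_set_ui(d, 1);
    mpz_init(diff);

    while (mpz_cmp_ui(d, 1) == 0) {
        /* tortoise: one step of f */
        mpz_mul(x, x, x); mpz_add_ui(x, x, 1); mpz_mod(x, x, n);
        /* hare: two steps of f */
        mpz_mul(y, y, y); mpz_add_ui(y, y, 1); mpz_mod(y, y, n);
        mpz_mul(y, y, y); mpz_add_ui(y, y, 1); mpz_mod(y, y, n);
        /* d = gcd(|x - y|, n) exceeds 1 once a cycle is detected */
        mpz_sub(diff, x, y);
        mpz_abs(diff, diff);
        mpz_gcd(d, diff, n);
    }
    gmp_printf("factor = %Zd\n", d);
    mpz_clears(n, x, y, d, diff, NULL);
    return 0;
}

The WARN lines and the re-initializations at 384 and then 512 bits come
from watching MPFR's global inexact flag after each multiply and starting
over wider whenever it fires. Roughly this pattern, much simplified:

/* Widen precision until squaring the 60-digit test number is exact.
 * The number is ~197 bits, so its square needs ~394: inexact at 256
 * and 384 bits, exact at 512, matching the transcript above. */
#include <stdio.h>
#include <mpfr.h>

int main(void)
{
    const char *n60 =
        "90377629292003121684002147101760858109247336549001090677693";
    mpfr_prec_t prec = 256;
    mpfr_t x;
    mpfr_init2(x, prec);
    for (;;) {
        mpfr_set_str(x, n60, 10, MPFR_RNDN); /* exact: fits in 256 bits */
        mpfr_clear_inexflag();
        mpfr_sqr(x, x, MPFR_RNDN);           /* stand-in for one rho step */
        if (!mpfr_inexflag_p())
            break;                           /* square was exact: done */
        prec += 128;                         /* 256 -> 384 -> 512 */
        mpfr_set_prec(x, prec);              /* note: this discards x */
        printf("re-initialized with %ld bits of precision\n", (long) prec);
    }
    printf("exact at %ld bits\n", (long) prec);
    mpfr_clear(x);
    return 0;
}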

So my hope is that CADO-NFS is far, far faster:

nix$
nix$ /usr/bin/time -p ./cado-nfs.py 90377629292003121684002147101760858109247336549001090677693 -t 1
Info:root: Using default parameter file ./parameters/factor/params.c60
Info:root: No database exists yet
Info:root: Created temporary directory /tmp/cado.83ies_nv
Info:Database: Opened connection to database /tmp/cado.83ies_nv/c60.db
Info:root: Set tasks.threads=1 based on --server-threads 1
Info:root: tasks.threads = 1 [via tasks.threads]
Info:root: tasks.polyselect.threads = 1 [via tasks.polyselect.threads]
Info:root: tasks.sieve.las.threads = 1 [via tasks.sieve.las.threads]
Info:root: tasks.linalg.bwc.threads = 1 [via tasks.threads]
Info:root: tasks.sqrt.threads = 1 [via tasks.threads]
Info:root: slaves.scriptpath is /home/dclarke/local/build/cado-nfs
Info:root: Command line parameters: ./cado-nfs.py 90377629292003121684002147101760858109247336549001090677693 -t 1
.
.
.
Info:Complete Factorization: Total cpu/elapsed time for entire factorization: 324.6/52.7252
Info:root: Cleaning up computation data in /tmp/cado.83ies_nv
760926063870977 588120598053661 260938498861057 773951836515617
real 55.71
user 5.16
sys 0.94
nix$

Brilliant!
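
A quick way to sanity-check that output is to multiply the four reported
factors back together, e.g. with a throwaway bit of GMP:

/* Multiply the four factors cado-nfs reported and compare with
 * the original c60 input. */
#include <stdio.h>
#include <gmp.h>

int main(void)
{
    const char *f[] = { "760926063870977", "588120598053661",
                        "260938498861057", "773951836515617" };
    mpz_t n, prod, t;
    mpz_init_set_str(n,
        "90377629292003121684002147101760858109247336549001090677693", 10);
    mpz_init_set_ui(prod, 1);
    mpz_init(t);
    for (int i = 0; i < 4; i++) {
        mpz_set_str(t, f[i], 10);
        mpz_mul(prod, prod, t);
    }
    printf("product %s n\n",
           mpz_cmp(prod, n) == 0 ? "matches" : "does NOT match");
    mpz_clears(n, prod, t, NULL);
    return 0;
}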

Also Fermat8 = 2^(2^8) + 1, i.e. 2^256 + 1, was amazingly fast:

.
.
.
Info:Complete Factorization: Total cpu/elapsed time for entire factorization: 1753.03/194.232
Info:root: Cleaning up computation data in /tmp/cado.sxu4wjjr
93461639715357977769163558199606896584051237541638188580280321 1238926361552897
real 197.25
user 33.24
sys 1.70
nix$

Beautiful. Much faster than my 1328 secs for Pollard rho.
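
For the record, the decimal expansion of F8 handed to cado-nfs.py, and a
check that the two reported factors multiply back to it, are a few more
lines of the same sort:

/* F8 = 2^(2^8) + 1 = 2^256 + 1: print it in decimal and check
 * the two factors cado-nfs reported. */
#include <stdio.h>
#include <gmp.h>

int main(void)
{
    mpz_t f8, p, q, prod;
    mpz_inits(f8, p, q, prod, NULL);
    mpz_ui_pow_ui(f8, 2, 256);           /* 2^(2^8) = 2^256 */
    mpz_add_ui(f8, f8, 1);
    gmp_printf("F8 = %Zd\n", f8);
    mpz_set_str(p, "1238926361552897", 10);
    mpz_set_str(q,
        "93461639715357977769163558199606896584051237541638188580280321",
        10);
    mpz_mul(prod, p, q);
    printf("factors %s\n",
           mpz_cmp(prod, f8) == 0 ? "check out" : "do NOT check out");
    mpz_clears(f8, p, q, prod, NULL);
    return 0;
}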

Okay ... moving onwards.

--
Dennis



