22:32, 30 September 2022 · 192.114.91.212 (talk) · 7,525 bytes (−47) · Addition doesn't really make much sense if we also consider general groups, which have only a "multiplication" operation. I would also suggest writing the time complexity as O*(√n), since the group operation may take any polylog factor, including constant time.
18:40, 26 September 2022 · 2a0d:6fc7:20c:5141:178:5634:1232:5476 (talk) · 7,573 bytes (+438) · It previously seemed as if each repeated multiplication were free; however, the values to be stored aren't known ahead of time and take time to compute. There are √n of them, and each one takes about linear time (in bits) to compute.
20:54, 1 March 2020 · Frobitz (talk | contribs) · 7,368 bytes (+411) · Cleaned up references and added credit to Shanks (the fact that Nechaev claimed in 1994 that Gelfond knew of the algorithm in 1962 is worth noting, but it doesn't change the fact that Shanks was the first to publish it in 1971 and is credited as its source in the literature).
04:30, 1 May 2019 · 2001:5b0:49cb:1c08:41ce:9d7a:ffac:e7a7 (talk) · 6,703 bytes (−1) · Note that, for example, 2^10 ≡ 34 (mod 99), and in this case m = ceil(sqrt(99)) = 10. Thus we need to write 10 = i·10 + j; however, this is not possible if we restrict j < m. Since i ≥ 0, we need 10 = 0·10 + 10, where i = 0 and j = 10. Thus j may be equal to m.
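The numbers in this summary (2^10 ≡ 34 mod 99, m = ⌈√99⌉ = 10) can be checked mechanically. Below is a hedged sketch of the textbook baby-step giant-step formulation; the function name `bsgs` and the decision to search for x in the range [0, n) are assumptions for illustration, not taken from the article being edited:

```python
import math

def bsgs(alpha, beta, n, mod):
    """Find x in [0, n) with alpha^x = beta (mod mod), or None.
    Sketch of the standard baby-step giant-step formulation."""
    m = math.isqrt(n - 1) + 1          # m = ceil(sqrt(n))
    baby = {}
    e = 1
    for j in range(m):                  # baby steps: store alpha^j -> j
        baby.setdefault(e, j)
        e = (e * alpha) % mod
    factor = pow(alpha, -m, mod)        # alpha^(-m); needs gcd(alpha, mod) = 1
    gamma = beta % mod
    for i in range(m):                  # giant steps: gamma = beta * alpha^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]  # x = i*m + j
        gamma = (gamma * factor) % mod
    return None

print(pow(2, 10, 99))      # -> 34, the congruence from the summary
print(bsgs(2, 34, 99, 99)) # -> 10
```

Note that the standard search decomposes x = i·m + j with both i and j strictly below m, which still covers every x up to m² − 1 ≥ n − 1; the summary's concern applies to the particular i/j decomposition used in the article text at the time.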