The big assumption in that paper isn't about whether jobs will be replaced; it's this:
Why exactly would people not be writing code that writes code? (See the toy sketch at the end of this comment.) If that's the factor that determines the scale of computerization, it's easily one of the most valuable things you could work on, yet the paper you've cited assumes it won't happen. As someone who works in similar research, I can say it's already happening and will continue to happen. Virtual AIs set to specific tasks will massively reduce the inefficiency of writing code by hand and solve problems at considerably faster rates.

The industrial revolution happened as a result of a few major changes, and I think it's safe to assume that a few major changes in current computer technology will have similarly large effects, just as the microprocessor, fuzzy logic, and the like did.

As for the paper itself, the standard deviations in its own computerization estimates are also pretty massive.
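To make "code that writes code" concrete, here's a minimal toy sketch of my own (Python; the names are made up for illustration, and none of this comes from the paper): a function that generates the source of another function from a small spec, compiles it, and hands back the callable. Production systems are vastly more capable, but the principle is the same.

    # Toy illustration of "code that writes code": generate, compile,
    # and use a new function from a declarative spec.
    def make_function(name, params, expression):
        """Emit Python source for a function, then compile it with exec."""
        source = f"def {name}({', '.join(params)}):\n    return {expression}\n"
        namespace = {}
        exec(source, namespace)           # compile and bind the new function
        return source, namespace[name]    # the generated code and the callable

    # The spec could just as easily come from another program or a model.
    source, area = make_function("area", ["width", "height"], "width * height")
    print(source)      # the machine-written source
    print(area(3, 4))  # -> 12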