gregvp:

This suffers from three fallacies at least.

First, current AI, in its large language model form, is a lossy information summariser. Yes, it increases the productivity of basic legal research, and may appear to replace human researchers in the short term, but whether or not jobs are lost in the medium term depends on the price elasticity of demand for legal research. I note there have already been several cases where lawyers have been embarrassed and censured for relying on AI, which often makes up legal precedents out of whole cloth. The fashion for dispensing with junior staff may yet die out.

A few jobs that are basically the lossy summarisation of information--journalism, legal research, helpdesk--may be at risk, but no job that requires correctness or precision is at stake. Nor is any job that requires judgement, nor any job that deals with physical goods.

Second, it conflates AI and robotics. Self-driving vehicles have been arriving "next year" since 2012. And truck drivers do far more than just drive trucks, just as accountants do much more than write journal entries and calculate depreciation. Warehouse robots are still very much at the single-robot trial stage, and they have been at that stage for at least three years. The absence of mainstream media coverage of major warehouse-robotics rollouts suggests that nearly all of these trials have quietly failed.

Robotics is a whole bunch of difficult engineering problems that have to be solved individually--not least of them, safety in all situations--and then traded off against each other. Virtually none of the problems still to be fully solved in robotics have to do with AI.

Third, it suffers from out-of-date ideas about women. Women hold the majority of tertiary qualifications these days. They are not a vulnerable minority. Also, ethnic minorities tend to work in occupations that are not at risk.
