More people working more


As unemployment rates go down, economists begin to explore what it would take to get more people working for longer.

This research concludes that flexibility in working hours, controlled by the worker, would significantly increase the total effort available for work.

More flexibility in working hours requires management and scheduling: people who need to collaborate and interrupt each other must agree on shared times and days for that, leaving other work time when they work alone. Some jobs are mainly customer-facing, so the majority of the hours must remain interruptible.

People who have managed international teams across multiple time zones have a head start on understanding how to do this.
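As a toy illustration of the time-zone side of that scheduling problem (the team members, cities, and working hours below are invented for the example), a short sketch that converts each person's local working window to UTC and intersects them:

```python
# Find a shared "interruptible" window for a distributed team by converting
# each person's local working hours to UTC and intersecting the intervals.
from datetime import datetime
from zoneinfo import ZoneInfo

# Each person's local working window: (IANA time zone, start hour, end hour).
# Invented example team -- not from the original article.
team = {
    "London":    ("Europe/London",    9, 17),
    "New York":  ("America/New_York", 9, 17),
    "Singapore": ("Asia/Singapore",   9, 17),
}

day = datetime(2019, 11, 4)  # an arbitrary weekday
utc = ZoneInfo("UTC")

overlap_start, overlap_end = None, None
for tz_name, start_h, end_h in team.values():
    tz = ZoneInfo(tz_name)
    s = datetime(day.year, day.month, day.day, start_h, tzinfo=tz).astimezone(utc)
    e = datetime(day.year, day.month, day.day, end_h, tzinfo=tz).astimezone(utc)
    overlap_start = s if overlap_start is None else max(overlap_start, s)
    overlap_end = e if overlap_end is None else min(overlap_end, e)

if overlap_start < overlap_end:
    print("shared window (UTC):", overlap_start.time(), "-", overlap_end.time())
else:
    print("no common window; schedule asynchronous handoffs instead")
```

For this particular trio there is no common 9-to-5 overlap at all, which is exactly why the article's point about agreed collaboration windows (and protected solo time) matters.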

Vanguard-sponsored research published by the American Economic Association

Quantum computing

Snapshot, November 2019

State-of-the-art summary, references, and curated comments from Scott Aaronson

Small companies

  • IonQ – trapped-ion computing
  • PsiQ – silicon photonics
  • QCWare – software services
  • Rigetti – quantum cloud services
  • Xanadu – quantum photonic processors

Large companies
  • Honeywell – trapped-ion qubits

Upcoming conference: San Jose, 10–12 December 2019

AI and new jobs

Last year, when we were preparing for the AI and ML panel at the Markets Group meeting, we spent a lot of effort preparing for questions on potential and actual adverse effects – but no one asked. The audience was institutional investors, many of them managing pension funds for employees, so we had really expected pointed questions about the potential removal of existing jobs and about how new occupations might arise.

Prompted by a blog post from Timothy Taylor, and quoting from a paper titled ‘The Wrong Kind of AI?’, it seems useful to think “about the future of work as a race between automation and new, labor-intensive tasks. Labor demand has not increased steadily over the last two centuries because of technologies that have made labor more productive in everything. Rather, many new technologies have sought to eliminate labor from tasks in which it previously specialized. All the same, labor has benefited from advances in technology, because other technologies have simultaneously enabled the introduction of new labor-intensive tasks. These new tasks have done more than just reinstate labor as a central input into the production process; they have also played a vital role in productivity growth.”


Daron Acemoglu (MIT and IZA) and Pascual Restrepo (Boston University), ‘The Wrong Kind of AI? Artificial Intelligence and the Future of Labor Demand’, IZA DP No. 12292, Institute of Labor Economics, April 2019

Consolidation in the high-capacity network interface business

First Nvidia announced it was to acquire Mellanox. Now Xilinx has announced the acquisition of Solarflare. These were the two big sources of expertise in the high-capability, high-throughput network interface card (NIC) market.

When this sort of consolidation happens, it’s a signal to watch for one or more smaller players to emerge – potentially with expertise and money from the acquired companies – to develop the next step change in one of the often overlooked but critical enablers of the very large datacenters behind cloud-scale operations.

The competition is AWS, which has built its own ASIC, used in the NICs of the Nitro System. James Hamilton describes the system, which provides I/O acceleration, security functions, and the hypervisor implementation.


More machine learning – ScaledML

27 – 28 March 2019

The ScaledML conference is growing up: from a Saturday at Stanford to a two-day event at the Computer History Museum, with sponsors.

Two big new themes emerged:

  • Concern for power efficiency (Simon Knowles of Graphcore talked about megawatts; Pete Warden of TensorFlow talked about milliwatts and energy harvesting)
  • Development platforms – Adam D’Angelo of Quora was particularly clear on how Quora organizes development to efficiently support a small number of good developers

David Patterson gave the first talk, on domain-specific architectures for neural networks – an updated version of this talk.

The roofline performance model is a useful way to visualize comparative performance. For future performance improvements, domain-specific architectures are the way forward; this requires both hardware updates (what Google is doing with the TPUs) and improved compiler front and back ends.
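The core of the roofline model fits in one line: a kernel's attainable throughput is the minimum of the machine's peak compute rate and its memory bandwidth times the kernel's arithmetic intensity. A minimal sketch (the peak and bandwidth figures below are invented round numbers, not those of any particular chip):

```python
# Roofline model: attainable performance is capped either by peak compute
# or by memory bandwidth times arithmetic intensity (FLOPs per byte moved),
# whichever is lower.
def roofline(peak_gflops, bandwidth_gb_s, arithmetic_intensity):
    """Attainable GFLOP/s at a given arithmetic intensity (FLOPs/byte)."""
    return min(peak_gflops, bandwidth_gb_s * arithmetic_intensity)

# Invented numbers for illustration only.
peak = 90_000.0   # peak compute, GFLOP/s
bw = 900.0        # memory bandwidth, GB/s

# Below the "ridge point" (peak / bw = 100 FLOPs/byte here) a kernel is
# memory-bound; above it, compute-bound.
for ai in (1, 10, 100, 1000):
    print(f"AI={ai:>4} FLOPs/byte -> {roofline(peak, bw, ai):,.0f} GFLOP/s")
```

Domain-specific hardware raises the compute roof for a narrow set of kernels; the matching compiler work is largely about keeping those kernels at high enough arithmetic intensity to stay out of the memory-bound region.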

Fig 3 from the Domain Specific Architectures paper linked above.

Intel recognizes this trend – Wei Li described the work his team is doing to incorporate domain specific support into Xeon processors. This blog post has the gist of what he presented.

Most of the talks are here on YouTube