Measuring Ourselves: How the IETF performs at producing documents
In this episode of PING, Christian Huitema discusses how looking into the IETF data tracker allowed him to assess "how well we are doing" at document production.
As the IETF has grown, and as the process of developing standards has become more complex, it's understandable that it takes a bit longer to produce a viable RFC. But questions have been raised about exactly where in the process the delays come from. Are we really doing better or worse than we used to? And why might that be?
Christian took an interesting approach to the problem, initially taking a random sample of 20 documents from 2018 and manually collating the issues, and then applying the same methodology to documents from 2008 and 1998. His approach to measurement was rigorous and careful, separating his own opinions from the underlying data to aid reproducibility.
Christian has a long history of network development and research, with experience in industry and at the French national computing research institute INRIA, before joining Bell Communications Research and then Microsoft. He has worked on OSI systems, X.500 directories, satellite communications, and latterly the IPv6 stack, including the "Teredo" transition technology, the H/D ratio used in determining IPv6 allocations and assignments in the RIR model, and the QUIC transport layer protocol.
The views expressed by the featured speakers are their own and do not necessarily reflect the views of APNIC.