I dislike the fact that the library is forever expanding its feature set while a lot of the core code is badly written or buggy (see the method for calculating the angle between two TVector3s, for example). This is crazy for a project that is not under commercial competitive pressure. I dislike the use of global pointers to control the behaviour of the library, and I find some of the design decisions bewildering, particularly around histograms and the lack of separation between data and presentation: a graph is a subtype of a histogram, when a histogram ought to contain one so it can be plotted; a plot frame is a type of histogram object, which makes no sense that I can understand; and so on. The library makes no use of a lot of core C++ features; it really is C-with-classes, and the use of exceptions in particular could greatly improve the code base and its potential usage. I feel quite strongly that a numeric calculation should not return a number if something went wrong, for example.
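To make the last point concrete, here is a minimal sketch of the alternative, assuming nothing about ROOT's internals: `Vec3` and its `Angle()` are hypothetical names, not ROOT's `TVector3`. The idea is that an angle calculation on a degenerate input should throw rather than silently return some number.

```cpp
#include <cmath>
#include <stdexcept>

// Hypothetical sketch, not ROOT code: an Angle() that throws on
// degenerate input instead of silently returning a number.
struct Vec3 {
    double x, y, z;
    double Mag() const { return std::sqrt(x * x + y * y + z * z); }
    double Dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    double Angle(const Vec3& o) const {
        const double m = Mag() * o.Mag();
        if (m == 0.0)
            throw std::domain_error("Angle() undefined for a zero-length vector");
        // Clamp to [-1, 1] so rounding error cannot push acos out of its domain.
        double c = Dot(o) / m;
        if (c > 1.0) c = 1.0;
        if (c < -1.0) c = -1.0;
        return std::acos(c);
    }
};
```

The caller either gets a meaningful angle or an exception it must handle; there is no third state where a plausible-looking number propagates through an analysis.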
I dislike that it has a strong feeling of not-invented-here syndrome: it could depend on well-tested code such as the GNU Scientific Library, but instead rewrites everything. And up until about six months ago the default plot styles were damned ugly, although this has gotten a lot better. There seem to be three different math libraries that each independently implement some subset of mathematical functions, and a lot of the graphics code makes little sense to me (take a look at how to ensure a platform-independent typeface size in a plot, for example). The CINT interface almost seemed designed to ensure that students learned ROOT rather than C or C++, which is great for productivity in the short term and incredibly sucky in the medium to long term.
I understand that a lot of this is legacy cruft (and even made sense at the time!) and that I could contribute patches, but I'm busy doing my actual job. I'm also impressed that a lot of my objections are being worked on - I was heavily dependent on the framework for my PhD about five years ago, and a lot of my frustration stems from then. It often feels like the ROOT team learns by doing: they want to understand something, so they build it into ROOT. That's a fine way to learn, but not the best way to develop stable code that will be used by thousands and thousands of people. In some senses ROOT is an amazing achievement, and I still find myself using it on occasion. But I've now mostly replaced what I used it for with matplotlib and the GSL, and my life is easier.
breathes
I also dislike the way that it seems partially to have enabled physics to go in a direction where we pump out PhDs who don't understand code or physics, but act as worker-drones for the large collaborations. But that's not really ROOT's fault, and is a whole 'nother topic.
I second that.
The aim of ROOT is certainly good. But the implementation of that idea, especially the overall design, is absolutely awful. It's understandable: it has grown over a long time and was written by physics experts, not software-design experts - people who were used to Fortran and PAW. The actual code is bad, but not hopelessly so. The interface is broken, and because so many people are used to it, it will be hard to replace with anything new.
For the Mainz experiment, we have our own code base. Not pretty, but because it's a lot more specialist, it's less confusing. I am now working on OLYMPUS, and we are using ROOT for that. I created a framework for the analysis based on ROOT, trying to hide the most problematic areas and make it easier for the other collaboration members to use. I'm also trying to make them write programs, not ROOT macros. Every time I look up a new feature, I'm surprised that they managed to find a non-standard way of doing it.
My pet peeve? TH1D is a 1-d histogram class. What does TH1D::Clear do?
Wrong! It clears the histogram's title and name, not the histogram contents. For that, you need Reset. It makes a kind of sense if you know the class hierarchy - TNamed and all - but who remembers that? I saw this mistake in the wild a lot.
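A minimal mock of the hierarchy shows why this bites people. This is not ROOT's actual code; `Named` and `Hist1D` are stand-ins for the TNamed/TH1D split, sketched from the behaviour described above: `Clear()` lives on the name-holding base class and is inherited by the histogram, while `Reset()` is the method that actually zeroes the bins.

```cpp
#include <string>
#include <vector>

// Mock of the ROOT-style hierarchy (not real ROOT code). Clear() comes
// from the TNamed-like base and wipes the name/title; Reset() on the
// histogram is the one that actually zeroes the bin contents.
struct Named {
    std::string name, title;
    void Clear() { name.clear(); title.clear(); }  // inherited by Hist1D
};

struct Hist1D : Named {
    std::vector<double> bins;
    Hist1D(std::string n, std::string t, std::size_t nbins)
        : Named{std::move(n), std::move(t)}, bins(nbins, 0.0) {}
    void Fill(std::size_t i) { if (i < bins.size()) bins[i] += 1.0; }
    void Reset() { for (double& b : bins) b = 0.0; }  // what you usually want
};
```

Calling `Clear()` on such an object empties the name and title but leaves every bin untouched, which is exactly the surprise described above.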
Protip: Gnuplot. While its command language is also a little arcane, for me it's the best tool for producing paper-ready plots. With the tikz terminal you can include it in your LaTeX flow, and with some Makefile trickery you can have e.g. \cite references resolve correctly in plot labels, with the numbering correctly reflecting the position of the plot in the paper!
I don't think ROOT is very common in the atomic-physics community. For experimental control, LabVIEW is quite common, and it seems to have its pitfalls too. But from what I know, the experiment itself is very straightforward; there are not many places errors can hide, especially since they can calibrate and cross-check everything against known H2O lines.
On the electron-scattering side, ROOT is more common, but the experiments most relevant for the radius either predate ROOT or are known not to use it. That doesn't mean the software used has no errors - quite the contrary - but since all the results from different measurements with different software agree, I don't think a programming error is the culprit.
Sadly, ROOT is, generally speaking, the best part of physics code... If it were up to me, I would ask that all software used to derive a result be made available. It's far too easy for scientists to make stupid errors in their code, and it cannot hurt to be open.