sRGB has bugged me from the start, since it's not even clear which matrix you're actually supposed to use to convert between linear sRGB colors and XYZ colors. I count at least three different matrices in IEC 61966-2-1, each of which I have seen different people endorse as the true version:

1. The matrix implied by the reference primaries in Table 1 (rows give the X, Y, and Z coefficients of R, G, B): [X;Y;Z] = [506752/1228815, 87881/245763, 12673/70218; 87098/409605, 175762/245763, 12673/175545; 7918/409605, 87881/737289, 1001167/1053270]*[R;G;B].

2. The four-decimal matrix in section 5.2: [X;Y;Z] = [1031/2500, 447/1250, 361/2000; 1063/5000, 447/625, 361/5000; 193/10000, 149/1250, 1901/2000]*[R;G;B].

3. The exact inverse of the four-decimal XYZ-to-RGB matrix in section 5.3: [X;Y;Z] = [248898325000/603542646087, 71938950000/201180882029, 36311670000/201180882029; 128304856250/603542646087, 143878592500/201180882029, 14525360000/201180882029; 11646692500/603542646087, 23977515000/201180882029, 191221850000/201180882029]*[R;G;B].

The distinction starts to matter for 16-bit color; the sketch below shows the three matrices already disagreeing at white. The CSS people seem to take the position that the matrix implied by the primaries is the true version, but the same document's Annex F (added in Amd. 1) seems to suggest that the 5.2 matrix is the true one, and that the 5.3 matrix should simply be rederived from it at higher precision. There's no easy way to decide, as far as I can tell.

Meanwhile, I agree with the author that the ICC's black-point finagling in its published profiles has not helped the confusion over what exactly sRGB colors are supposed to map to.
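To make the disagreement concrete, here's a minimal Python sketch (mine, not from the spec) that evaluates all three matrices exactly using fractions.Fraction and then quantizes to 16 bits. The round(x * 65535) encoding is an assumption on my part, just one plausible 16-bit mapping:

    from fractions import Fraction as F

    # The three candidate linear-sRGB -> XYZ matrices, entered exactly as
    # rationals. Rows hold the X, Y, Z coefficients of (R, G, B).
    M_TABLE1 = [  # implied by the Table 1 primaries
        [F(506752, 1228815), F(87881, 245763), F(12673, 70218)],
        [F(87098, 409605), F(175762, 245763), F(12673, 175545)],
        [F(7918, 409605), F(87881, 737289), F(1001167, 1053270)],
    ]
    M_5_2 = [  # the four-decimal matrix printed in section 5.2
        [F(1031, 2500), F(447, 1250), F(361, 2000)],
        [F(1063, 5000), F(447, 625), F(361, 5000)],
        [F(193, 10000), F(149, 1250), F(1901, 2000)],
    ]
    M_5_3_INV = [  # exact inverse of the four-decimal matrix in section 5.3
        [F(248898325000, 603542646087), F(71938950000, 201180882029), F(36311670000, 201180882029)],
        [F(128304856250, 603542646087), F(143878592500, 201180882029), F(14525360000, 201180882029)],
        [F(11646692500, 603542646087), F(23977515000, 201180882029), F(191221850000, 201180882029)],
    ]

    def to_xyz(m, rgb):
        # Exact matrix-vector product; no floating point anywhere.
        return [sum(m[i][j] * rgb[j] for j in range(3)) for i in range(3)]

    def q16(x):
        # One plausible 16-bit encoding: round to the nearest 1/65535 step.
        return round(x * 65535)

    white = [F(1), F(1), F(1)]  # linear sRGB white
    for name, m in [("Table 1", M_TABLE1), ("5.2", M_5_2), ("inv 5.3", M_5_3_INV)]:
        print(name, [q16(c) for c in to_xyz(m, white)])

With this particular encoding, the Table 1 and 5.2 matrices both send white to Y = 1 exactly (quantized 65535), while the inverted 5.3 matrix gives Y = 603516713750/603542646087 ≈ 0.999957, which lands on 65532; the X channel splits three ways (62288, 62291, 62289). So at 16 bits the three candidates really do name different colors.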