Any light source is more tolerable if its color rendition is improved, and this is especially true of high-CCT lamps. Even though some fluorescents may have CRIs in the 90s, their spectra still look different from daylight. The problem is that CRI isn't a great way to compare color quality. It measures how eight standard color patches appear under the test source relative to how they appear under a reference illuminant, essentially a blackbody at the same CCT (a daylight illuminant is used at higher CCTs). Fluorescent lamp makers can match most of those patches well enough to earn a high CRI score while the lamps still render colors such as skin tones poorly. In tests with LEDs, CRI hasn't necessarily correlated with how pleasing people find a light source. Some combinations of red, green, and blue LEDs had very low CRIs, in the 20s, yet they were preferred over other sources. Conversely, some sources score a CRI of 100 simply by virtue of being blackbodies, yet they render colors poorly; candlelight is one example. Alternative measures, such as the color quality scale, are now being developed to overcome these deficiencies.
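To make the averaging problem concrete, here is a minimal sketch of how the general CRI (Ra) is computed from the eight special indices, assuming the CIE 1964 color differences (ΔE) between the test source and the reference illuminant have already been calculated for each sample; the ΔE values below are made-up placeholders, not measured data.

```python
def special_cri(delta_e: float) -> float:
    """Special color rendering index for one sample: R_i = 100 - 4.6 * ΔE_i."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es: list[float]) -> float:
    """General CRI (Ra): arithmetic mean of R_1 through R_8."""
    if len(delta_es) != 8:
        raise ValueError("CRI uses exactly eight standard test color samples")
    return sum(special_cri(d) for d in delta_es) / 8.0

# Hypothetical ΔE values for a lamp that matches most patches closely but
# misses badly on two of them -- the average still comes out high.
delta_es = [1.0, 1.5, 1.2, 0.8, 1.1, 1.3, 6.0, 5.5]
print(f"Ra = {general_cri(delta_es):.1f}")  # about 89, despite the two poor patches
```

Because Ra is just an average over eight moderately saturated samples, a lamp can post a score near 90 while still rendering particular colors badly, which is exactly the weakness the passage describes.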