Method to convert position uncertainties/errors between coordinate systems

Is there a built-in astropy method (or way that uses built-in methods) to convert uncertainties/errors in one coordinate system to another?

My specific goal is to find the mean and standard deviation of the positions of a sample of points (originally in RA/Dec) in galactic coordinates (l, b). This doesn't seem like an uncommon goal, but I haven't been able to find any built-in methods for it (at least looking at SkyCoord, the uncertainty classes, and related methods).

I know that one valid way to do this would be to compute the statistics in RA/Dec and then use the Jacobian of the RA/Dec → (l, b) transformation to convert the standard deviations. I haven't done this yet, since I figured it's a common enough problem to have been implemented somewhere already, but I haven't had any luck finding it.
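For concreteness, this is roughly what I have in mind (a minimal sketch only: the Jacobian is estimated numerically by finite differences rather than taken from the analytic transformation equations, and the position and covariance values are placeholders):

```python
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

def icrs_to_galactic_jacobian(ra0, dec0, eps=1e-6):
    """Finite-difference estimate of d(l, b)/d(ra, dec) at (ra0, dec0), all in degrees."""
    def lb(ra, dec):
        g = SkyCoord(ra=ra * u.deg, dec=dec * u.deg, frame="icrs").galactic
        return np.array([g.l.deg, g.b.deg])

    base = lb(ra0, dec0)
    # Columns of the Jacobian; note that l wraps at 0/360 deg, which would
    # need extra care if the perturbed point crosses that boundary
    d_ra = (lb(ra0 + eps, dec0) - base) / eps
    d_dec = (lb(ra0, dec0 + eps) - base) / eps
    return np.column_stack([d_ra, d_dec])

# Placeholder mean position and (diagonal) RA/Dec covariance in deg^2
ra_mean, dec_mean = 150.0, 2.0
cov_radec = np.diag([0.05**2, 0.04**2])

J = icrs_to_galactic_jacobian(ra_mean, dec_mean)
cov_gal = J @ cov_radec @ J.T              # first-order error propagation
sigma_l, sigma_b = np.sqrt(np.diag(cov_gal))
```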

What I have done instead is to convert the sample to (l, b) using SkyCoord and then compute the statistics. However, this doesn't seem to be accurate for points near the galactic poles (even after accounting for circular statistics in l).
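This is roughly what that looks like, using a synthetic sample near the north galactic pole to show the problem (I'm using scipy's circular statistics here; the sample values are placeholders):

```python
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord
from scipy.stats import circmean, circstd

# Synthetic sample: 0.05 deg scatter around a point near the north
# galactic pole (the NGP is at roughly RA 192.86, Dec +27.13)
rng = np.random.default_rng(0)
ra = rng.normal(192.86, 0.05, 10_000)
dec = rng.normal(27.13, 0.05, 10_000)

gal = SkyCoord(ra=ra * u.deg, dec=dec * u.deg, frame="icrs").galactic

# Circular statistics for l (it wraps at 360 deg), ordinary statistics for b
l_mean = circmean(gal.l.deg, high=360.0, low=0.0)
l_std = circstd(gal.l.deg, high=360.0, low=0.0)
b_mean, b_std = gal.b.deg.mean(), gal.b.deg.std()

# Near the pole, l_std comes out far larger than the 0.05 deg on-sky
# scatter, because a fixed on-sky offset spans a huge range of l there
print(l_mean, l_std, b_mean, b_std)
```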

As a workaround, I tried approximating the (l, b) uncertainty as simply the on-sky separation of two points generated from the RA/Dec means and uncertainties, but this doesn't seem to be valid or consistent with my other method. I've also considered using departures in galactic longitude to compute uncertainties near the poles, but wanted to ask here before trying to work that out.
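Concretely, that workaround looked roughly like this (placeholder values; the separation is a single, frame-independent angle rather than separate sigma_l and sigma_b components):

```python
import astropy.units as u
from astropy.coordinates import SkyCoord

# Placeholder RA/Dec mean and 1-sigma values (degrees)
ra_mean, dec_mean = 192.86, 27.13
sigma_ra, sigma_dec = 0.05, 0.04

center = SkyCoord(ra=ra_mean * u.deg, dec=dec_mean * u.deg, frame="icrs")
shifted = SkyCoord(ra=(ra_mean + sigma_ra) * u.deg,
                   dec=(dec_mean + sigma_dec) * u.deg, frame="icrs")

# Treat the on-sky separation as a "total" positional uncertainty; it is
# the same angle in ICRS and galactic coordinates
approx_uncertainty = center.separation(shifted)
```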

Is there a way to do the first method more directly with astropy, or a way to correctly compute statistics near the galactic poles along the lines of the second method?