Looking at Fedora CoreOS, Cloud and Server Editions countme stats

I occasionally keep an eye on the Fedora CoreOS countme stats to get a sense of its adoption.
But today I got curious about how it compares to other Fedora Editions that live in the same space, so I gathered the countme stats[1] for Fedora CoreOS, Cloud and Server editions.

  • Number of instances that have been running for longer than 1 week

The Server Edition has more long-lived instances than CoreOS and Cloud, which is not surprising since those two target platforms and infrastructure that make it easy to re-provision machines. Another interesting takeaway is the growth of CoreOS, which is seeing good adoption.

  • Number of instances per edition that have been running for longer than 1 week per Fedora release.

This graph shows the distribution of each edition across Fedora releases. With my Fedora CoreOS filter on, this tells me that FCOS automatic updates and the stream release model do a great job of keeping users on a supported version.

One thing to note is that the OKD project stabilizes each minor version on a specific version of Fedora. For example, OKD 4.10 uses FCOS based on Fedora 35, OKD 4.11 uses FCOS based on Fedora 36, and so on.

  • Number of instances per edition that have been running for longer than 1 week per Fedora architecture.

Finally, this is the distribution of each edition across the supported architectures (I removed armhfp, which was only present in the Server edition). There are no big surprises in seeing a low number of FCOS instances on IBM Z and PowerPC, since we only enabled s390x recently and are still working towards consistently building ppc64le for FCOS.
However, I was really surprised by how popular Fedora CoreOS is on aarch64. There might be a few reasons for it:

  • It is possible that FCOS is quite a popular choice for the Raspberry Pi and other Arm64 small form factor computers.
  • FCOS is easily accessible on public cloud providers that offer Arm64 instances.

It is good to note that the second point is also true for the Cloud edition, though.

Anyway, I just wanted to share these statistics with you and start a discussion. What do you make of these? Anything else that you are curious about?

[1] - You can access the weekly countme stats dump from https://data-analysis.fedoraproject.org/csv-reports/countme/ and see https://www.pagure.io/mirrors-countme to learn more about how to parse the data.
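For anyone who wants to play with the dump, here is a minimal sketch of how the totals can be aggregated with just the standard library. The column names and the age-bucket semantics below are assumptions based on the mirrors-countme project (bucket 1 being "first week", so `sys_age >= 2` approximates "running longer than 1 week") — check the header of the real CSV before relying on them, and the sample rows are made up:

```python
import csv
import io

# Made-up sample rows in the rough shape of the weekly countme totals CSV.
# Column names are assumptions; check the real dump for the exact header.
SAMPLE = """\
week_start,week_end,hits,os_variant,os_version,os_arch,sys_age
2022-08-01,2022-08-07,1200,coreos,36,x86_64,2
2022-08-01,2022-08-07,300,coreos,36,aarch64,3
2022-08-01,2022-08-07,2500,server,36,x86_64,4
2022-08-01,2022-08-07,900,cloud,36,x86_64,1
"""

def long_lived_counts(csv_text):
    """Sum hits per variant for systems up longer than 1 week.

    countme reports a system age bucket per hit; assuming bucket 1 is
    "first week", sys_age >= 2 approximates "longer than 1 week".
    """
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        if int(row["sys_age"]) >= 2:
            variant = row["os_variant"]
            totals[variant] = totals.get(variant, 0) + int(row["hits"])
    return totals

# The cloud row has sys_age 1, so it is excluded from the long-lived totals.
print(long_lived_counts(SAMPLE))  # {'coreos': 1500, 'server': 2500}
```

For the real dump you would read the downloaded file instead of the inline string and group by week to reproduce the graphs.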


I noticed the same thing with ARM — in fact, CoreOS is our most popular variant on aarch64.


Thanks for putting this together @cverna!

Thanks for sharing @cverna!

  1. Number of FCOS instances that have been up for more than 1 week => growing,
  2. Number of Server instances that have been up for more than 1 week => growing,
  3. Number of Cloud instances that have been up for more than 1 week => dropping.

Can we already call it successful adoption?

I’m sure there is some cannibalization (Fedora eating Fedora) going on, but IMO we want to try to grow all variants of Fedora (i.e. pull in new non-Fedora users). One thing Fedora CoreOS does well is try to be available on a lot of platforms. This is something the Cloud working group is working on too, so hopefully we see Cloud AND FCOS numbers go up in the future!


Another thing to consider is: “what are the <= 1 week numbers?”

Since cloud workloads often spin up and spin down, a lot of nodes don’t even make it to 1 week. I bet the Cloud Edition looks better there compared to the > 1 week metric.

That’s something I did not consider @dustymabe - valid point.
Still, I love how charts can tell more than one story, leaving some areas open to interpretation.

Yes, the Cloud Edition has a lot of instances in the up for 1 week or less category. I would not be surprised if it was the most popular variant in that category.

My “brontosaurusifier” script had some heuristics to separate transient systems from those which are new in a given week.[1] But I was messing around with how to do the graphs… I don’t like my current approach, and I didn’t have the sense to do it on a branch. I’ll spend some time soon getting it in shape to share…


  1. By considering the increase in the higher categories the *next* week. There is some uncertainty because there is no tracking of individual systems, so I probably over-count “transient” systems and slightly under-count fresh installs. This only affects age group 1: I don’t carry any adjustments to groups 2, 3, or 4, but group 1 is split into 1 and “0”. ↩︎
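If it helps the discussion, here is a toy sketch of that heuristic — not the actual script, and all numbers and names below are made up. The idea, as I understand it: systems that are genuinely new in week w and stick around should surface as growth in the older age buckets the following week, so that growth can be used to split group 1 into "fresh" and "transient":

```python
def split_age_group_1(group1, higher_total):
    """Split the "first week" bucket into fresh installs vs transient systems.

    group1[w]       -- count of systems reporting age group 1 in week w
    higher_total[w] -- combined count of age groups 2-4 in week w

    Fresh installs in week w are estimated as the increase of the higher
    groups in week w+1; the remainder of group 1 is treated as transient
    ("group 0"). Without per-system tracking this over-counts transient
    systems and slightly under-counts fresh installs, as noted above.
    """
    fresh, transient = [], []
    for w in range(len(group1) - 1):  # last week has no "next week" yet
        gained = max(0, higher_total[w + 1] - higher_total[w])
        f = min(group1[w], gained)
        fresh.append(f)
        transient.append(group1[w] - f)
    return fresh, transient

# Toy data: 100 "new" systems each week, higher buckets grow by 40/week,
# so roughly 40 of each week's newcomers persist and 60 are transient.
fresh, transient = split_age_group_1([100, 100, 100], [500, 540, 580])
print(fresh, transient)  # [40, 40] [60, 60]
```

Systems churning out of the higher buckets would also shrink `gained`, which is one source of the over-count of transients mentioned in the footnote.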