Looking back at our historical data, we realized how much the Internet and Cloudflare have grown. With more than 150 datacenters, around 10 percent of web-based applications, and customers all around the world, from tiny islands in the Pacific to the largest metropolises, we have a view of the Internet landscape of almost every country and continent.
Cloudflare’s mission is to help build a better Internet. To do that, we operate datacenters across the globe. By having datacenters close to end users, we provide a fast, secure experience for everyone. Today I’d like to talk about our datacenters in Africa and our plans to serve a population of 1.2 billion people across 58 countries.
While Internet penetration in developed countries skyrocketed in the 2000s, Internet usage is now growing rapidly across Africa: we are seeing a 4% to 7% increase in traffic month over month. As of July 2018, we have 8 datacenters on the African continent:
While we see changes on the horizon, the majority of Internet content providers are located in North America and Western Europe. This means that only the billion or so people living in those regions are close to the content they are trying to reach. When it comes to Africa, submarine cables usually carry the packets back to European hubs like Marseille, Paris, London and Lisbon, and sometimes Frankfurt or Amsterdam, adding precious milliseconds and slowing down communications.
By setting up datacenters on the African continent, Cloudflare is able to serve content locally, increasing download speed in the region, improving the end-user experience, and ultimately increasing Internet usage.
Growth of a continent
We wondered whether Internet usage increased when we set up a datacenter in a region that was previously not well served.
It was surprising to see how quickly traffic grew in the months after we turned on our equipment in each of these countries. We looked at both bandwidth and the quantity of information exchanged. An increase in bandwidth usually leads to an increase in usage.
Why is bandwidth increasing with lower latency?
Bandwidth is the maximum rate information can be transferred over a link. The interpretation of maximum depends on the type of transmission.
The explanation for why the rate depends on latency comes from the TCP protocol, on which most web applications run. Every transmission ends with an acknowledgement, confirming that the data was correctly received. The next transmission only begins after the sender receives that acknowledgement. This means that while the acknowledgement is in flight, nothing new is actually being received. The amount of data received over time is the transfer rate.
This is summarized in the following diagram:
With the most common congestion-control algorithms, the amount of data sent before an acknowledgement increases until an error appears. Any link will drop packets, whether due to a transmission issue (wireless interference) or a processing issue (a router dropping packets to reduce load). This is part of why a link never reaches its maximum bandwidth, and why above roughly 80 milliseconds of latency, performance starts to worsen drastically. Coast to coast in the USA is around 70 milliseconds. Paris to London is 10 milliseconds. Satellite connections implement proxies to sustain throughput despite a 600 millisecond round-trip time.
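As a rough illustration, assume a fixed 64 KB window (a simplification; real TCP stacks scale their windows). The best a sender can then do is one window per round trip:

```python
# Rough sketch: window-limited TCP throughput at different round-trip times.
# The fixed 64 KB window is an illustrative simplification, not a real-world value.

WINDOW_BYTES = 64 * 1024  # data the sender may have in flight before an ACK

def max_throughput_mbps(rtt_ms: float) -> float:
    """Upper bound on throughput: one window per round trip."""
    rtt_s = rtt_ms / 1000.0
    return WINDOW_BYTES * 8 / rtt_s / 1_000_000

for rtt in (10, 70, 100, 250, 600):  # Paris-London, US coast-to-coast, ..., satellite
    print(f"RTT {rtt:>3} ms -> at most {max_throughput_mbps(rtt):6.1f} Mbps")
```

At 10 ms this already caps out above 50 Mbps, while the same window over a 600 ms satellite link yields less than 1 Mbps.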
Why is the number of requests increasing with higher bandwidth?
The explanation is behavioral. If an Internet connection is slow and frustrating, chances are we will only use it when we have to. When content loads immediately, people are likely to fetch more of it and click on more links.
The following chart shows the growth of requests from a country after each datacenter turn-up. Traffic is normalized to the latest value (July 2018).
Djibouti shows the biggest increase in traffic after the launch.
For the rest of the continent, we are seeing a steady increase in delivered traffic over the last four years.
Overall, Sierra Leone is the country that grew the most, at around 8% per month. At the other end of the scale, Algeria only increased its traffic by 2% per month. The mean across all these countries is 6.2% per month, which is also the Internet traffic growth rate of South Africa.
Compared to Europe and North America, it will take, at the current rate, 4 years and 3 months for Africa to reach today’s traffic levels of those two continents. However, if Europe and North America keep their current 4% growth rate, the African continent will take approximately eight to twelve years to catch up.
Please note these estimations are not perfectly representative, as Cloudflare only sees a part of the Internet and the numbers also include our growing base of customers.
The units on the vertical axis represent the growth based on the initial ratio between Europe/USA/Canada and Africa.
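For the curious, here is a sketch of the arithmetic behind those estimates. The starting ratio between Europe/North America and African traffic is a placeholder (roughly 20x, picked only so the output lines up with the figures above); the growth rates are the ones quoted in this section.

```python
import math

# Placeholder: the real traffic ratio is not published here; 20x is illustrative.
initial_ratio = 20.0

africa_monthly = 0.062  # ~6.2% per month (mean across African countries)
eu_na_monthly = 0.04    # ~4% per month for Europe / North America

# Months for Africa to reach *today's* EU/NA traffic level (EU/NA frozen):
months_static = math.log(initial_ratio) / math.log(1 + africa_monthly)

# Months for Africa to catch up while EU/NA keeps growing:
months_moving = math.log(initial_ratio) / math.log((1 + africa_monthly) / (1 + eu_na_monthly))

print(f"Reach today's EU/NA traffic: {months_static / 12:.1f} years")
print(f"Catch up with a growing EU/NA: {months_moving / 12:.1f} years")
```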
Latencies and inter-African routing
We analyzed the latencies between Europe and African IP addresses that hit our edge. On the following Demer map from Vasco Asturiano, the area of the country is exponentially proportional to the response time in milliseconds.
All of coastal Africa is well connected with submarine cables. The only exception is Eritrea, where the main provider (EriTel) uses two satellite providers for its outbound links.
The island of Saint Helena is on the path of the future SAex cable, but the only connection at the moment for its 5,000 inhabitants is through satellite, causing a latency of 600ms. The Central African Republic also relies on satellite or on connections through Cameroon.
On average a packet round trip from Northern Africa to Europe will take around 50-100ms while a round trip from the South will take 250ms.
Regarding inter-African routing, only a limited number of cities on the continent offer successful interconnection: Johannesburg in South Africa, Nairobi in Kenya, and Djibouti, all with datacenters and Internet exchange points. The rest of the continent is mostly split between providers which interconnect in Europe or Asia. A user in Cameroon talking to a user in Ghana will likely go through Paris.
Using RIPE Atlas, we are able to show that inter-provider routing usually happens in Europe. Only one traceroute showed packets being exchanged in South Africa.
Traceroute to two Ghana Atlas probes from other Atlas Probes
To Ghana only via Europe
To Ghana with one through South Africa
Cloudflare and many network operators use RIPE Atlas extensively to measure performance and troubleshoot issues. If you want to help us improve Internet quality in Africa, become a probe host here.
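If you want to run similar checks yourself, the results of public RIPE Atlas measurements can be fetched over its REST API. Below is a small sketch that lists the hops of a traceroute measurement; the measurement ID is a placeholder to replace with your own, and the parsing assumes the standard Atlas traceroute result format.

```python
import requests

MEASUREMENT_ID = 12345678  # placeholder: use the ID of a real traceroute measurement

url = f"https://atlas.ripe.net/api/v2/measurements/{MEASUREMENT_ID}/results/"
results = requests.get(url, timeout=30).json()

for result in results:
    print(f"Probe {result['prb_id']} -> {result.get('dst_addr')}")
    for hop in result.get("result", []):
        # Each hop holds up to three reply attempts; take the first one that answered.
        replies = [r for r in hop.get("result", []) if "from" in r]
        if replies:
            print(f"  hop {hop['hop']:>2}: {replies[0]['from']}  {replies[0].get('rtt')} ms")
        else:
            print(f"  hop {hop['hop']:>2}: *")
```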
What you may be wondering is: which countries in Europe route the African traffic? We use an anycast network; when a user fetches a website on Cloudflare, the packets take the shortest path seen by the routers to reach their destination. In telecommunications, the shortest path does not necessarily mean geographical proximity. The selection metric of a path usually involves cost and performance.
An anycast address will have the same metric everywhere on our side. This means a service provider with links to both London and Paris will see our IP addresses originating from both the London and Paris datacenters. The choice between the two depends only on metrics set by the provider: will it cost more to reach London?
Cloudflare has a significant number of points of presence, enough to reveal small metric differences. Due to the density of its network in Europe, it is common to have datacenters one hop away from each other (London and Paris, Paris and Amsterdam). Our analysis of the next hop therefore shows a provider’s preference.
In the case of Africa, a provider can rent capacity on a submarine cable to a city, for instance Lisbon, Paris or London. Ideally, the provider wants to maximize the resources it can obtain without adding more hops (which is costlier).
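To illustrate how a provider ends up preferring one landing point over another, here is a toy sketch of BGP-style path selection. The local preference values and AS-path lengths are invented for the example; the point is only that the decision comes from the provider's own metrics, not from geography.

```python
from dataclasses import dataclass

@dataclass
class Route:
    via: str         # city where the provider hands traffic to Cloudflare
    local_pref: int  # set by the provider, typically reflecting cost
    as_path_len: int

# Two candidate paths to the same anycast prefix (values are invented).
candidates = [
    Route(via="Paris", local_pref=200, as_path_len=2),
    Route(via="London", local_pref=100, as_path_len=2),
]

# Simplified BGP decision: highest local preference wins, then shortest AS path.
best = max(candidates, key=lambda r: (r.local_pref, -r.as_path_len))
print(f"Traffic to the anycast prefix exits via {best.via}")
```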
Paris, London, Amsterdam and Frankfurt all have plenty of content providers and interconnection options, so the differences come down to the content itself. One major bias is language: France will have more francophone content hosted in Parisian datacenters, and it also helps day-to-day operations if both parties speak French.
As an example: if Paris is chosen, then for content that can only be found in London, another provider will carry the packets from Paris to London, adding one hop to the path and reducing its preference. We will see the landing point in Paris.
Why isn’t the provider getting a link to London as well? One thing to know is that any Internet link has a flat-rate price plus a variable rate based on consumption.
Bandwidth within major European cities is among the cheapest in the world, but the flat rate for submarine capacity is high. As a result, the quantity of content that is cheaper to reach from London than from Paris has to make up the difference before a second link pays off.
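A toy cost comparison makes the trade-off concrete. All figures below are invented placeholders, not real prices; the structure of the comparison, a fixed monthly fee per landing point plus a per-Mbps charge, is the part that matters.

```python
# Invented placeholder prices; only the structure of the comparison matters.
FLAT_FEE_PER_LANDING = 50_000    # monthly cost of capacity to one European city
PRICE_PER_MBPS_DIRECT = 1.0      # traffic delivered at the landing point
PRICE_PER_MBPS_TRANSIT = 3.0     # traffic carried onward by another provider

def monthly_cost(landings: int, direct_mbps: float, transit_mbps: float) -> float:
    return (landings * FLAT_FEE_PER_LANDING
            + direct_mbps * PRICE_PER_MBPS_DIRECT
            + transit_mbps * PRICE_PER_MBPS_TRANSIT)

# One landing point (Paris), with London-only content fetched via transit...
paris_only = monthly_cost(landings=1, direct_mbps=8_000, transit_mbps=2_000)
# ...versus paying the flat fee for a second landing point in London.
paris_and_london = monthly_cost(landings=2, direct_mbps=10_000, transit_mbps=0)

print(f"Paris only:     {paris_only:>9,.0f}")
print(f"Paris + London: {paris_and_london:>9,.0f}")
```

With these made-up numbers, a single landing point stays cheaper until the volume of London-only content grows large enough to justify the second flat fee.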
If you are interested in learning more about the cost of bandwidth around the world, check out this article by Nitin Rao.
The following map represents, for each country, the most common Cloudflare datacenter reached. The countries filled with color are those where Cloudflare datacenters are located. The border color indicates the destination country.
As mentioned earlier, most of the French-speaking countries tend to go towards our datacenter in Paris. South Africa remains well connected with its neighbors.
Where is the content hosted?
Previously, we mentioned there are some content providers in South Africa and Djibouti. We took a look at popular news and banking websites.
Among the top 200 website origins (Alexa ranking) for African countries in 2017, only 42 were attached to an African country, and only 10 of those were serving non-education/non-government content (news websites, banks). They were located mostly in South Africa and Zimbabwe.
We also took a look at the 10 million zones behind Cloudflare. The ones using Afrinic IPs are hosted in the following countries:
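One rough way to make that classification is to test whether an origin address falls inside the /8 blocks AFRINIC administers (listed further down in this post). The sketch below does exactly that; it is approximate, since AFRINIC also manages smaller legacy ranges, and the example IPs are made up.

```python
import ipaddress

# The /8 blocks AFRINIC administers, as listed later in this post. AFRINIC also
# holds smaller legacy ranges, so this membership test is only approximate.
AFRINIC_BLOCKS = [ipaddress.ip_network(p) for p in
                  ("41.0.0.0/8", "102.0.0.0/8", "105.0.0.0/8",
                   "154.0.0.0/8", "196.0.0.0/8", "197.0.0.0/8")]

def looks_afrinic(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in block for block in AFRINIC_BLOCKS)

# Made-up origin IPs, purely for illustration.
for origin in ("41.79.64.1", "105.22.3.4", "192.0.2.10"):
    print(origin, "->", "AFRINIC range" if looks_afrinic(origin) else "other RIR")
```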
What about IPv6?
Only four countries out of sixty-four are using IPv6 in a measurable way: Egypt (1%), Kenya (2%), Gabon (2%) and Zimbabwe (9%). In each of these countries, we found only one provider that deployed IPv6 at any significant scale. Since the beginning of the year, IPv6 usage in Egypt has doubled. In Gabon, a major provider rolled out IPv6 at the end of December 2017.
As Mathieu Paonessa from Group Vivendi Afrique told us, “Gabon went from 0% to 2% IPv6 in only 9 months, following the opening of our CanalBox Gabon FTTH service.”
The timeline is similar for a small ISP in Kenya, where usage has been increasing steadily. We can hope that their IPv6 usage doubles by the end of 2018.
Compared to the rest of the world, Belgium is still ahead with 37%, followed by India at 35%. Most European countries are above 10%. The USA is at 22% and Canada at 15%.
One explanation for the small number of deployments is the size of the remaining IPv4 pool of addresses.
As of July 2018, there were 36,245 /24s available (approximately 9.2 million IPs). The total Afrinic space is six /8s (approximately 100 million IPs), so about 9% is left.
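The percentage is easy to verify with a couple of lines:

```python
available_24s = 36_245
available_ips = available_24s * 256  # a /24 holds 256 addresses
total_ips = 6 * 2**24                # six /8s of 16,777,216 addresses each

print(f"{available_ips:,} IPs available out of {total_ips:,}"
      f" ({available_ips / total_ips:.1%} left)")
```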
The following graphs show the allocation and usage of each /24 in the Afrinic space.
This is a Hilbert graph of Afrinic IPv4 exhaustion. It was inspired by Ben Cartwright-Cox’s blog post and built with masscan and ipv4-heatmap.
Each row below shows a prefix alongside two maps: its allocation status (announced in yellow, available in green, reserved in red) and its response to ICMP (blue where few addresses reply, red where all of them reply).
41.0.0.0/8
102.0.0.0/8
105.0.0.0/8
154.0.0.0/8
196.0.0.0/8
197.0.0.0/8
We notice that even when allocated, some blocks remain dark. 102.0.0.0/8 remains the range with the most IP space left in the world.
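For readers who want to rebuild this kind of map, the core of it is the Hilbert ordering: each /24 of a /8 is placed on a 256 x 256 grid so that numerically adjacent prefixes stay visually adjacent, the same kind of layout ipv4-heatmap produces. The sketch below implements only that mapping, using the standard iterative distance-to-coordinates conversion; plotting responsive /24s (for example from a masscan ping sweep) on top of it is left as an exercise.

```python
def hilbert_d2xy(order: int, d: int) -> tuple[int, int]:
    """Map a distance d along a Hilbert curve covering a 2**order x 2**order grid
    to (x, y), using the standard iterative quadrant-rotation algorithm."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:  # rotate the quadrant so the curve stays continuous
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# A /8 contains 65,536 /24s, which fills a 256 x 256 grid (order 8).
ORDER = 8

def slash24_to_pixel(prefix_index: int) -> tuple[int, int]:
    """prefix_index is the /24's position inside the /8 (0..65535)."""
    return hilbert_d2xy(ORDER, prefix_index)

# Example: where the first and last /24 of 102.0.0.0/8 land on the grid.
print(slash24_to_pixel(0))
print(slash24_to_pixel(65535))
```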
More deployments coming soon
Africa is the second-largest continent on Earth, yet it still lags behind on Internet connectivity and traffic levels. Historically, it has relied entirely on its European interconnections. However, we predict that its traffic will outgrow that of the EU and US by 2027. This will be enabled by the fast deployment of local content, IPv6, and alternative submarine cables.
We are working on deploying even more points of presence in Africa and increasing our current capacity in the existing ones. Fifteen new locations on the African continent are part of our global expansion plans. These include: Algeria (Algiers), Cameroon (Yaoundé), Congo (Kinshasa), Côte d’Ivoire (Abidjan), Egypt (Alexandria), Ghana (Accra), Kenya (Nairobi), La Réunion (Sainte-Marie), Madagascar (Antananarivo), Morocco (Casablanca), Nigeria (Lagos), Tanzania (Dar es Salaam), Tunisia (Tunis), Uganda (Kampala), and Zimbabwe (Harare).
Header image source