Jackie King

Methodology Behind the Ranking

How the comprehensive ranking was developed and evaluated.

The ranking of European cities shown here is meant to reveal the best places to live on the continent right now, based on twelve key factors. The list compiles 54 major European cities: every country has at least one city featured, and the ten most populated countries each feature two. Cities were selected by population alone, meaning only the most populated city (or cities) in each country was included. Using data from numerous sources, including Numbeo, World Population Review, and the US News and World Report, each city was ranked from 1 to 54 based on how strong it was in each category, 1 being the best and 54 the worst. Once every city was scored in all 12 categories, with each category weighted equally, its average placement on the 1-to-54 scale was calculated. That average placement produced the overall ranking of the best places to live, putting the average score closest to 1 at the top and the average score closest to 54 at the bottom. This is the overall idea of the rankings, but each individual metric had its own calculation method, described below. 
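
As a minimal sketch of that aggregation step, assuming equal weights and using invented example ranks (the city names and numbers below are purely illustrative, not the real data):

```python
from statistics import mean

# Hypothetical per-category ranks (1 = best, 54 = worst) for a few cities;
# in the real list every city has a rank in each of the 12 categories.
category_ranks = {
    "Vienna":    [3, 5, 12, 4, 2, 6, 1, 20, 9, 7, 10, 30],
    "Lisbon":    [8, 10, 25, 15, 14, 11, 13, 5, 6, 22, 18, 2],
    "Reykjavik": [1, 40, 38, 30, 3, 1, 9, 35, 33, 25, 2, 50],
}

# Equal weighting: a city's overall score is simply the mean of its 12
# ranks, and the city with the mean closest to 1 tops the list.
overall = sorted(category_ranks, key=lambda city: mean(category_ranks[city]))

for place, city in enumerate(overall, start=1):
    print(place, city, round(mean(category_ranks[city]), 2))
```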

  1. The first metric was the happiness of the people in each city. Two data sources were used: the World Happiness Report's rating scale for happiness in cities around the world, and a European Commission survey asking city residents how happy they were with their general lifestyle. Using two sources ensured that cities missing from one list could still be represented by the other. Even so, some cities appeared on neither list, so general research and educated guesses were needed to place them. This was true for a good portion of the metrics, as some cities in smaller countries were left out of the main sources, requiring extra research beyond them. Once both sources were consulted, each city's average across the two lists was calculated and the cities were placed into the ranking, with the unrepresented cities slotted into the 1-54 order using those informed estimates, roughly as sketched below. 
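
A small sketch of that combination step, with invented scores (a None marks a city missing from a source; in the real methodology, a city on neither list is placed by manual research):

```python
# Illustrative happiness scores from the two sources (values invented);
# None marks a city absent from that source.
world_happiness_report = {"Vienna": 7.4, "Lisbon": 6.3, "Valletta": None}
eu_commission_survey = {"Vienna": 7.1, "Lisbon": 6.8, "Valletta": 6.0}

def combined(city):
    # Average whichever scores exist, so a city missing from one
    # list is still represented by the other.
    scores = [s for s in (world_happiness_report[city],
                          eu_commission_survey[city]) if s is not None]
    return sum(scores) / len(scores)

ranking = sorted(world_happiness_report, key=combined, reverse=True)
print(ranking)  # happiest city first: ['Vienna', 'Lisbon', 'Valletta']
```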

  2. The next metric was the number of historical landmarks in and around each city. The only data set used here was the number of UNESCO World Heritage Sites near each city. The sites didn't have to be within the city limits; the general rule was that a site within an hour-and-fifteen-minute drive of the city counted toward its total. This rule was discarded for the smaller countries and cities, since in some countries a drive of that length could take you halfway across the country, which is not an accurate representation of a city's sites. This data set produced many ties, so in the final calculation multiple cities were given the same placement, such as seven cities sharing 2nd place. Once a group of ties was complete, the next placement dropped by the size of the tie (seven cities tied for 2nd means the next-highest placement is a tie for 9th), as sketched below. 
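
That is standard competition ranking. A minimal sketch with invented site counts:

```python
from itertools import groupby

# Illustrative UNESCO World Heritage Site counts (numbers invented).
site_counts = {"Rome": 9, "Paris": 5, "Madrid": 5, "Berlin": 5, "Oslo": 2}

# Standard competition ranking: tied cities share a placement, and the
# next placement skips ahead by the size of the tie.
ordered = sorted(site_counts.items(), key=lambda kv: kv[1], reverse=True)
placements, rank = {}, 1
for count, group in groupby(ordered, key=lambda kv: kv[1]):
    tied = list(group)
    for city, _ in tied:
        placements[city] = rank
    rank += len(tied)

print(placements)  # {'Rome': 1, 'Paris': 2, 'Madrid': 2, 'Berlin': 2, 'Oslo': 5}
```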

  3. The third metric was the quality of sports within each city. Here, the caliber of the football (or soccer) clubs playing in the city was the main factor, as football is by far the most popular sport in Europe. To rank this, the standard of each country's league was first taken into account using the UEFA club coefficient ranking. This mattered because one city might have 5 clubs that are all much worse than those of a different city with 2 clubs in a far more prestigious league; the two higher-quality clubs will be more interesting to watch, so that city should rank higher despite having fewer clubs. After this data was applied, the number of clubs in each city was counted. To compile the final ranking for this metric, the leagues were grouped into blocks of 5, and the cities within each block of 5 leagues were ranked by how many clubs they had in the top division of their domestic football pyramid. Once all the cities in the strongest 5 leagues were placed, the cities in the next 5 leagues followed, and the cycle continued until a full 1-54 ranking was established (see the sketch after this item). 
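
A rough sketch of that two-level ordering, with an invented league order and invented club counts:

```python
# Illustrative data: leagues listed in coefficient order (strongest first),
# and each city's league plus its count of top-division clubs.
leagues_by_coefficient = ["England", "Spain", "Italy", "Germany", "France",
                          "Netherlands", "Portugal", "Belgium", "Turkey",
                          "Scotland"]
city_league_clubs = {
    "London": ("England", 7), "Madrid": ("Spain", 3), "Milan": ("Italy", 2),
    "Berlin": ("Germany", 1), "Paris": ("France", 1),
    "Amsterdam": ("Netherlands", 1), "Lisbon": ("Portugal", 2),
}

def sort_key(city):
    league, clubs = city_league_clubs[city]
    block = leagues_by_coefficient.index(league) // 5  # blocks of 5 leagues
    return (block, -clubs)  # stronger block first, then more clubs first

print(sorted(city_league_clubs, key=sort_key))
# ['London', 'Madrid', 'Milan', 'Berlin', 'Paris', 'Lisbon', 'Amsterdam']
```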

  4. The fourth metric was the schools and education in each city. To rank the cities, the US News and World Report ranking of the best colleges in the world was used. The cities were ordered by how many entries they had on this list, with the cities holding the most top-rated colleges ranked highest and the cities with none ranked lowest. For cities with the same number of top colleges, the tiebreaker was which city had the highest-placing college. For cities with 0 entries, the order was determined by how many of the top 5 colleges in each city's country were located in that city; a sketch of this multi-level sort follows. 
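
A minimal sketch of that sort, with invented college ranks and counts (a lower world rank is better):

```python
# Illustrative data: each city's world ranks for its listed colleges, plus
# how many of its country's top-5 colleges it hosts (the fallback
# tiebreaker for cities with no entries on the list).
cities = {
    "London": {"college_ranks": [6, 10, 45], "national_top5_hosted": 3},
    "Zurich": {"college_ranks": [12, 70],    "national_top5_hosted": 2},
    "Geneva": {"college_ranks": [70, 95],    "national_top5_hosted": 1},
    "Tirana": {"college_ranks": [],          "national_top5_hosted": 2},
    "Skopje": {"college_ranks": [],          "national_top5_hosted": 1},
}

def sort_key(city):
    info = cities[city]
    ranks = info["college_ranks"]
    return (
        -len(ranks),                             # more listed colleges first
        min(ranks) if ranks else float("inf"),   # then best-placing college
        -info["national_top5_hosted"],           # fallback for 0-entry cities
    )

print(sorted(cities, key=sort_key))
# ['London', 'Zurich', 'Geneva', 'Tirana', 'Skopje']
```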

  5. For the fifth metric, each city's governmental stability was ranked. The ranking was based on data from Transparency International's Corruption Perceptions Index, which gives each country a rating related to governmental stability. Because the data is country-based rather than city-based, the two cities from any country with two entries were placed next to each other. Despite not being ranked by city, the country data is still a good indication, because the people at the national level ultimately have a large say in how government operates at the city level. Once the data was applied, the cities were put into a ranking, with the highest stability scores ranked highest and the lowest ranked lowest, as sketched below. 
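
A tiny sketch of broadcasting country-level scores down to cities (all numbers invented):

```python
# Illustrative country-level scores (higher = more stable) and a
# city-to-country map covering a two-city country.
country_score = {"Denmark": 88, "Spain": 60, "Hungary": 42}
city_country = {"Copenhagen": "Denmark", "Madrid": "Spain",
                "Barcelona": "Spain", "Budapest": "Hungary"}

# Every city inherits its country's score, so cities from the same
# country land back to back when sorted.
ranking = sorted(city_country,
                 key=lambda c: country_score[city_country[c]],
                 reverse=True)
print(ranking)  # ['Copenhagen', 'Madrid', 'Barcelona', 'Budapest']
```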

  6. The sixth metric focused on crime levels in each city, using Numbeo's European City Crime Index. This data assigns each city a score based on the overall amount and severity of the crime occurring there. Once the data was collected, the ranking was made simply by putting the lowest scores, representing the least crime, at the top of the list and the highest scores at the bottom. As mentioned before, some of the smaller cities weren't included in this data set, so extra research and approximation were needed to find their proper placements. 

  7. The seventh metric looked at healthcare in each city, using data from both World Population Review and Numbeo. With two sources, more cities could be covered, and for the cities included in both data sets, a more reliable average score could be used. Both sources focus more on country data than city data, but the data still matters at the city level, because healthcare is an important part of deciding between two cities; so, as with stability, cities from the same country were placed back to back in this list. Once the two lists' rankings were averaged, and the cities lacking data were approximated, an overall ranking for this metric was created, with the cities that have the best healthcare at the top and the worst cities for healthcare at the bottom. 

  8. The eighth metric was the affordability of each city, using Numbeo's cost of living ratings as its data set. Each city was given a score based on how affordable it is for the average person, and that score was then used to rank the cities. Cities with the lowest scores placed highest, as they were the most affordable, while the more expensive, higher-scoring cities placed lower. Once again, not all the cities were represented in the data set, so some extra research was needed to find where those fit in. 

  9. The ninth metric looked at the quality of food in each city. Here, the number of Michelin-starred restaurants in each city was the main data set: cities with the most Michelin-starred restaurants placed highest, while those with 0 placed lowest. It's worth mentioning that many cities had no Michelin-starred restaurants simply because the Michelin critics hadn't been there to review the food, so no stars could have been awarded. While this is unfortunate for the data, it still works: although there might be a few restaurants of Michelin quality in the unvisited cities, those places probably wouldn't rank very high anyway, since the critics have been in no rush to try the food there, likely for good reason. The tiebreaker in this data set put less populous cities ahead, as a much bigger city with 0 Michelin-starred restaurants is less impressive than a tiny city with none; a sketch follows. 
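
A small sketch of that sort, with invented star counts and populations:

```python
# Illustrative Michelin star counts and populations (numbers invented).
cities = {
    "Paris":     {"stars": 120, "population": 2_100_000},
    "Oslo":      {"stars": 10,  "population": 700_000},
    "Skopje":    {"stars": 0,   "population": 550_000},
    "Podgorica": {"stars": 0,   "population": 180_000},
}

# More stars ranks higher; among ties (notably the zero-star cities),
# the smaller city ranks ahead, since having no starred restaurants is
# less damning for a small city than a large one.
ranking = sorted(cities, key=lambda c: (-cities[c]["stars"],
                                        cities[c]["population"]))
print(ranking)  # ['Paris', 'Oslo', 'Podgorica', 'Skopje']
```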

  10. The tenth metric was the level of innovation in each city. This ranking used 2ThinkNow's global innovation ranking, which indicates how well cities are currently doing at creating new ideas and following through on them. The ranking that 2ThinkNow published was simply transferred over to this metric, although the list was missing a few cities; as ever, the missing cities were researched separately and then placed into the ranking with educated guesses. 

  11. The eleventh metric focused on pollution in all of the cities. This ranking was based on Numbeo's Pollution Ratings, which assign each city a score reflecting its air quality. Once each city's score was obtained, the cities were ordered by pollution: the cities with the lowest scores, indicating the least pollution, placed highest, while the high-score, high-pollution cities placed lower. 

  12. The twelfth and final metric was the climate of each city, measured by the average yearly high temperature in degrees Fahrenheit. Once the data was collected, the cities with the hottest average yearly high placed highest on the ranking, while the coldest cities placed lowest. If two cities were tied in this department, the city with the hotter average in its single hottest month was put ahead. 

So, after all the metrics were examined and the cities were ranked in each department, the complete list as it now stands was compiled from how well the cities placed on average. That is how this list of the best European cities to live in was created, and those were the metrics behind this important ranking. 
