A recent global trend has seen cities openly embrace technology in a bid to become more substantial, more impressive, and altogether more intelligent. Cities now need to be ‘smart’.
Is all this talk of using technology to make our cities ‘smarter’ superfluous? Perhaps not, but the current rhetoric surrounding the benefits of adopting a smart city model should be approached with a certain degree of caution.
Before deliberating over whether such an approach would, as a whole, be beneficial or detrimental to society, it is important to first consider what being a ‘smart city’ should entail. These discussions need to be approached with a certain degree of impartiality; they should not be driven solely by private sector technology firms. We need to ensure that the benefits of using technology to make our cities ‘smart’ are not a novelty, but rather something more enduring.
So, what exactly makes a city ‘smart’? It may not necessarily be the use of technology, at least not in the conventional ways that this term is often interpreted. Indeed, the concept of adopting new approaches to developing a city’s infrastructure is not entirely new. There have been several documented instances in the past where new technology has been used for the intended benefit of a society. Notable examples include the implementation of aqueducts during the Roman Empire for the purposes of hydration and sanitation, and the construction of the Autobahn in early 20th century Germany – an ambitious means of physically connecting an entire nation. These examples have much in common: they required a certain degree of ingenuity; they were ambitious in terms of the scale at which they attempted to effect change; and lastly, they were well ahead of their time, usually having to overcome considerable skepticism regarding both their feasibility and whether they were borne out of any real necessity. Most importantly, however, they are examples of initiatives that, as a whole, made everyday life easier and more efficient.
The above examples do not rule out the use of modern-day technology to further develop a city; rather, they illustrate that in order to be useful, technology must serve a clear purpose and should not be the sole focus of a city’s ambitions. We must acknowledge the prominence of technology in the everyday life of a typical citizen as a tool – one of many – for establishing Dubai as a smart city, rather than as the ultimate goal. There needs to be a clear plan in place to determine exactly how technology will be used. Otherwise, this city – like any other – will be at risk of being used as a platform for competing technology companies to promote their respective visions of what a smart city should be.
A good way of gauging how a city can use technology to develop further is to ask its people. There are several aspects of any city’s infrastructure that can be used to enhance the quality of life of its inhabitants; education, healthcare, transport and security are all notable examples. Leveraging public opinion through citizen engagement initiatives to determine which of these should take precedence is a great place to start. Once this is established, technology should be used not only to improve each of these sectors in isolation; it should also provide greater cohesion between them, allowing services to be centralized and run more efficiently. This is already being done to good effect through a smart city strategy announced by the Government of Dubai. This approach aims to use technology responsibly, both to improve some of the sectors mentioned above and to enable a greater degree of connectivity between them.
So, is this push for smart cities altogether redundant? If approached responsibly and with a certain degree of caution, aspiring to make a city ‘smart’ can be hugely beneficial. Doing so, especially while Dubai is still in its infancy